
The myth that “everyone leaves for residency” is statistically false. The data say most people stay put—or at least stay regionally close.
For residency, geography is not random noise. It is one of the strongest, most consistent predictors of where you will match. Program directors know this. Applicants often ignore it until rank list season, when it is too late to change the pattern that has already been baked in over four years.
Let us walk through what the numbers actually show about geographic matching patterns—how often applicants stay near med school, how this varies by region and specialty, and what you should infer (and not infer) from that if you are planning your own application strategy.
1. What the big datasets actually show
Most of what we know on this topic comes from:
- NRMP’s Charting Outcomes and Match reports
- AAMC’s “Matriculating Student Questionnaire” and “Graduate Medical Education” reports
- Regional analyses from state and consortium GME offices
You rarely see it neatly summarized in one place, so I will do that.
Overall “stay close” rates
If you take all US MD seniors in the National Resident Matching Program and ask: “Did this person match in the same census region as their medical school?” you see a strikingly stable pattern year after year—roughly half or slightly more stay in-region.
Averages across recent years (MD seniors only, excluding IMGs and DOs for clarity):
- Same census region as medical school: about 52–58%
- Same state as medical school: usually 35–45%
- Same city or immediate metro area as medical school: more variable, around 15–25%, heavily dependent on whether the school’s city is residency-dense (e.g., Boston, NYC, Chicago, Houston)
To make this concrete, imagine 100 graduating MD seniors from a typical US allopathic school:
- Around 55 will match somewhere in the same census region
- Around 40 will match in the same state
- Around 20 will not even leave the metro footprint
That is the high-level pattern. Regional detail matters a lot, though.
| Match location (relative to med school) | Approx. share of MD seniors (%) |
|---|---|
| Same city/metro | 20 |
| Same state | 40 |
| Same region | 55 |
| Different region | 45 |
You will notice those numbers sum to more than 100 because the categories overlap (same city is a subset of same state, which is a subset of same region). The point is not the arithmetic; the point is that "leave your region entirely" is a minority behavior.
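To make the nesting explicit, here is a minimal sketch (using the illustrative round numbers above, not real NRMP data) that converts the overlapping categories into a mutually exclusive breakdown:

```python
# Illustrative overlapping tallies for a hypothetical class of 100 MD seniors.
# Nested: same_city is a subset of same_state, which is a subset of same_region.
cohort = {"same_city": 20, "same_state": 40, "same_region": 55}

# Convert to a non-overlapping (exclusive) breakdown by subtracting each
# inner category from the one that contains it.
exclusive = {
    "same_city": cohort["same_city"],
    "same_state_only": cohort["same_state"] - cohort["same_city"],
    "same_region_only": cohort["same_region"] - cohort["same_state"],
    "different_region": 100 - cohort["same_region"],
}

# The exclusive categories partition the class, so they sum to exactly 100.
assert sum(exclusive.values()) == 100
print(exclusive)
```

The exclusive view is what a pie chart would need; the overlapping view is how the reports above actually quote the numbers.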
2. Regional loyalty: some med schools are stickier than others
Not all regions behave the same. Large, training-dense regions retain their own graduates at very high rates. More rural or under-dense regions leak talent outward.
If you look at census regions—Northeast, Midwest, South, West—you see consistent patterns:
| Medical School Region | Match in Same Region |
|---|---|
| Northeast | ~65–70% |
| Midwest | ~55–60% |
| South | ~50–55% |
| West | ~60–65% |
Interpretation:
- Northeast and West are “sticky”
- Midwest is moderately sticky
- South is the most “exporting” region (many grads leave, especially for big coastal metros)
Why the Northeast and West hang on to their grads
The numbers make sense when you remember where the jobs are.
- Northeast: dense clusters (Boston, NYC, Philly, DC–Baltimore). A lot of residency slots within driving distance of any major med school. Staying regional still gives you dozens of program types and prestige levels.
- West: fewer states, but huge anchors—California especially. If you trained in California or the Pacific Northwest, most of your realistic “good fit” options are also in-region.
By contrast, a student at a public med school in a smaller Southern or Midwestern state will often have:
- A few in-state academic centers
- Several community programs
- But fewer “next tier” options without going to another region altogether
So they scatter more, and the data reflect that.
Institution-level variation: extremes
When you drill down to individual schools, the spread is enormous.
Based on published institutional match lists and regional GME analyses, you will find:
- Some state schools in large metros: 60–75% of grads stay in-state
- Some private coastal schools (e.g., top-tier Northeast): 70–80% stay in the Northeast, even if not in-state
- Some new or smaller state schools in export states: only 25–35% of grads remain in-state; many go to bigger neighboring states or coasts
If you are being analytical about your own odds, look at your specific school’s last 3–5 match lists. Do not guess. Count how many grads stayed in-state and how many left regionally. You will see a pattern that is remarkably stable year over year.
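The counting exercise itself is trivial once you transcribe the match list. A minimal sketch, assuming you have reduced each graduate to a (specialty, match state) record; the records, state-to-region mapping, and school location below are all invented for illustration:

```python
# Invented example: transcribed match-list records as (specialty, match_state).
# In practice, build this list from your school's published match lists.
SCHOOL_STATE = "OH"
SCHOOL_REGION = "Midwest"

# Simplified, partial state-to-census-region mapping (illustration only).
STATE_TO_REGION = {"OH": "Midwest", "MI": "Midwest", "IL": "Midwest",
                   "NY": "Northeast", "CA": "West", "TX": "South"}

matches = [("IM", "OH"), ("IM", "OH"), ("IM", "MI"), ("FM", "OH"),
           ("Derm", "NY"), ("Peds", "IL"), ("Surg", "CA"), ("IM", "TX")]

n = len(matches)
in_state = sum(1 for _, state in matches if state == SCHOOL_STATE)
in_region = sum(1 for _, state in matches if STATE_TO_REGION[state] == SCHOOL_REGION)

print(f"in-state: {in_state}/{n} ({in_state / n:.0%})")
print(f"in-region: {in_region}/{n} ({in_region / n:.0%})")
```

Run this over 3-5 years of lists, filter by your intended specialty, and you have your school's baseline instead of a guess.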
3. Specialty differences: primary care vs competitive fields
“Do people stay near med school?” is the wrong single question. The more accurate version is: “For my specialty, and with my profile, how do geographic patterns shift?”
Primary care and “core” specialties
For Internal Medicine, Family Medicine, Pediatrics, Psychiatry, OB/GYN, and to a large extent General Surgery, the data show stronger geographic ties.
Why? Three quantifiable reasons:
- Program supply is high everywhere. Most states have multiple programs.
- Local pipelines. Many state-funded hospitals have explicit missions to train and retain local graduates.
- Applicant numbers. These specialties have larger applicant pools and more slots, which structurally favors regionally matched candidates who signal local commitment.
You often see:
- 60–70% of Family Med residents in a program’s roster are from the same region (many from the same state)
- 50–65% of Internal Medicine residents have some prior tie (med school, undergrad, hometown) to the region
If you want to stay near your med school in these specialties and you are not a catastrophic outlier on scores, the structure of the system quietly tilts in your favor.
Competitive specialties: more national, but not random
For specialties like Dermatology, Orthopedic Surgery, ENT, Plastic Surgery, and Neurosurgery (and, downstream, competitive fellowship spots like Cardiology and GI), the distribution looks more national.
However, “more national” does not mean “geography irrelevant.”
Patterns you see repeatedly when you scrape and code resident bios:
- 40–55% of residents in competitive programs have either med school, undergrad, or home ties to the same region as the program
- In many top-tier programs, 15–30% are from the institution’s own medical school (internal pipeline)
- A large fraction of out-of-region residents come from a small cluster of similarly prestigious institutions on other coasts or large metros
In other words, the applicant pool becomes national, but the acceptance pattern still shows regional and institutional clustering. Connections and familiarity matter a lot when the noise-to-signal ratio is high.
The data-backed takeaway: if you are chasing a highly competitive field, you can move regions, but you must offer either (a) top-tier metrics / pedigree, or (b) a strong, documentable regional tie that offsets being “geographically foreign.”
4. Institutional loyalty vs local loyalty
Another nuance the numbers reveal: there are two different “stay close” effects operating in parallel.
- Institutional loyalty – staying at the same institution or health system for residency
- Local/regional loyalty – staying in the same city/region but at a different institution
These are not the same thing, and they behave differently.
Staying at your own institution
Across many academic medical centers, a non-trivial fraction of residents trained at the same med school:
- Internal Medicine at a major academic center: 15–30% “home” MDs
- Surgery: often 10–25% home MDs
- OB/GYN, Pediatrics, Psych: 10–30% depending on region and program culture
Some institutions are notorious for keeping “their own.” Others deliberately diversify, capping this around 10%.
So if you attend a med school embedded in a large health system with many residencies (think big university hospitals), your odds of staying in the same city are meaningfully higher than a student at a freestanding med school with only a thin residency presence.
Staying local but changing institutions
Then there is the second pattern: staying within the city or region, but not at your own academic center.
Example I have seen repeatedly:
- Student at State University Med Center in a mid-sized city
- Matches to a community Internal Medicine program 20 minutes away
- Or to another academic or hybrid hospital across town
From the 10,000-foot level, this person “stayed local.” From the viewpoint of each program’s internal stats, they are an external candidate with a local tie.
When you read program websites or GME reports listing where residents trained, and you categorize them, the share of residents with any local tie (med school, undergrad, high school) routinely exceeds 50%. Sometimes much more.
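If you want to replicate that roster exercise, the categorization logic is simple. A sketch with invented resident records and a deliberately simplified tie rule (any listed institution or hometown in the program's region counts):

```python
# Invented records transcribed from a program's "meet our residents" page.
PROGRAM_REGION = "Midwest"

residents = [
    {"med_school_region": "Midwest", "undergrad_region": "Midwest", "home_region": "Midwest"},
    {"med_school_region": "Northeast", "undergrad_region": "Midwest", "home_region": "Northeast"},
    {"med_school_region": "West", "undergrad_region": "West", "home_region": "West"},
    {"med_school_region": "South", "undergrad_region": "South", "home_region": "Midwest"},
]

def has_local_tie(resident, region=PROGRAM_REGION):
    """True if ANY of the resident's listed geographies matches the program's region."""
    return region in (resident["med_school_region"],
                      resident["undergrad_region"],
                      resident["home_region"])

tied = sum(has_local_tie(r) for r in residents)
print(f"{tied}/{len(residents)} residents have some {PROGRAM_REGION} tie")
```

Note that the "any tie" definition is generous by design; that is exactly why the over-50% figures above are larger than the med-school-only figures.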
This matters strategically. You can absolutely use your medical school as a launchpad into nearby systems, not only your home institution.
5. US MD vs DO vs IMG: how geography tightens as you move away from MD
The further you are from the US MD category, the tighter geography usually becomes.
If you look at ACGME and NRMP data split by applicant type:
US MD seniors
We already covered this: ~50–60% same region, ~35–45% same state. Moderate freedom to migrate.
US DO seniors
When you look at DO graduates post-single accreditation, you see:
- Higher concentration in certain regions (Midwest, South, parts of the Northeast)
- More matches within the same state or adjacent states
- A stronger pull to community and regional academic programs
Rough comparative picture from regional GME analyses and COM match lists:
- Same state: often 45–55%
- Same region: 60–70%
Not because DO graduates are incapable of moving, but because the opportunity structure—the programs that are historically DO-friendly—tends to be regional and often tied to specific osteopathic schools and hospitals.
IMGs (US citizen and non-US)
For IMGs, geography becomes even more constrained:
- Heavy concentration in certain states (New York, New Jersey, Florida, Texas, Michigan, Illinois) that traditionally accept large IMG cohorts
- Many IMGs cluster in urban safety-net hospitals, community programs, and certain specialties (Internal Medicine, Family Medicine, Pediatrics, Psychiatry)
It is not uncommon to see:
- 70–80% of a given IMG-heavy program’s roster trained at med schools from similar regions abroad
- Very tight pipelines (e.g., several residents from the same Caribbean or international school year after year)
So the data-driven rule of thumb:
- US MD: most mobile group; region matters, but you can cross it
- DO: more regional; particular states and systems are disproportionately represented
- IMG: most geographically constrained, with clustering in IMG-supportive states and institutions
6. What actually drives geographic matching: the variables that matter
A lot of people talk about “geographic bias” like it is mysterious. It is not. When you look at program director survey data and cross-reference actual resident rosters, a clear pattern of drivers emerges.
I would weight them roughly like this (conceptually, not literal regression coefficients):
- Documented regional ties (med school, undergrad, family, hometown)
- Program mission (state-funded, local retention goals, rural pipeline)
- Applicant strength (scores, grades, research)
- Specialty competitiveness
- Applicant’s own preference signaling (how they build their list, away rotations, personal statement, signals where applicable)
Regional ties are not just med school
Programs look for any signal that you are likely to stay and be happy there. That can be:
- You grew up in the state
- Your extended family lives within driving distance
- You went to undergrad there
- You have a spouse with a job anchored in that city
I have watched mediocre applicants with strong local ties beat objectively stronger but geographically random candidates in rank decisions. Not every time. Enough times to treat it as a real variable.
State-funded programs are not neutral
If a program is funded by a state legislature that cares about keeping physicians in-state, it will bake this into its decisions:
- Preference for in-state med schools
- Preference for applicants with stated long-term intent to practice in the state
- Dedicated rural tracks targeting local or regional candidates
They almost never say “we only want locals.” But run the numbers on where their residents went to med school. The pattern will be there.
Applicant strength bends geography, but does not erase it
Exceptional applicants can move across the map freely. The top of the score and research distribution has far more geographic flexibility.
However, once you move into the large middle group—strong but not elite—geography becomes a tiebreaker. A program director will not say it outright, but it is obvious in how the lists come out.
7. How you should use this data for your own strategy
Let us translate all of this into actual decisions.
Step 1: Quantify your school’s pattern
Do not guess. Pull your med school’s last 3–5 publicly available match lists.
- Count how many grads stay in-state
- Count how many stay in-region
- For your intended specialty, see how many matched locally vs elsewhere
If you see:
- 70% of last year’s IM class stayed in-state → there is a strong local pipeline
- 20% of recent Derm matches went to a single neighboring coastal city → that is your likely target zone if you are competitive
This gives you a baseline probability map instead of wishful thinking.
Step 2: Map your own ties
List your actual geographic advantages:
- Home state(s)
- Undergrad state
- Med school state
- Long-term partner or family anchors
Any region where you can legitimately claim 2+ strong ties should be treated as a target region. Programs in those areas will read you differently from a random out-of-state applicant.
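That tie-mapping step can be sketched in a few lines; the ties and regions below are invented, and the "2+ ties" threshold is the rule of thumb from above, not a formal cutoff:

```python
from collections import Counter

# Invented personal geography: list every region where you hold a real tie.
ties = {
    "home": "South",
    "undergrad": "Midwest",
    "med_school": "Midwest",
    "partner_job": "South",
}

# Count ties per region; regions with 2+ documented ties become target regions.
tie_counts = Counter(ties.values())
target_regions = [region for region, count in tie_counts.items() if count >= 2]
print(target_regions)
```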
Step 3: Build a list that respects the numbers
If you are average-to-strong in a “normal” specialty:
- Expect at least half your realistic outcomes to be within your med school’s region
- If you truly want to leave that region, you must over-apply to your target regions and probably under-apply to your “comfortable” region, or you will default back there
For a highly competitive specialty:
- Assume the market is national but will still cluster around your med school region and your previous institutions
- If you want West Coast from an East Coast school, your file has to look like an outlier: top scores, strong research, or explicit West ties
Step 4: Stop pretending geography is neutral
The data deny the fantasy that you can sprinkle applications evenly across the country and then pick a coastal city from a buffet of options. That is not how match lists behave.
You are much more likely to end up:
- In your med school state, or
- In a neighboring or familiar region, unless
- You deliberately construct a strategy that counterweights that inertia with both numbers (where you apply) and narrative (how you present your ties)
8. A quick reality check: where do residents actually come from?
To make this visual: consider a typical Internal Medicine residency at a major midwestern academic center. If you scrape their resident bios and categorize med school origins, you often get something like:
| Resident med school origin | Approx. share (%) |
|---|---|
| Same state | 45 |
| Neighboring states | 25 |
| Same census region (non-neighbor) | 15 |
| Other regions | 15 |
Interpretation:
- Nearly half from the same state
- Another big chunk from immediately adjacent states
- A minority from coasts or far regions, often with research ties, couples match, or special circumstances
This is not universal, but it is not unusual. If you are from that med school, your odds of staying in that geographic orbit are very high unless you push hard in another direction.
9. Where this leaves you
You cannot control where every program is. You cannot change that New York and California are magnets or that some regions bleed trainees. But you can stop pretending geography is random.
The data are blunt:
- A majority of applicants match in the same region as their medical school.
- A substantial fraction match in the same state.
- Local and regional ties are quietly powerful selection variables, especially outside the tiny group of truly elite applicants.
So if you want to stay near your med school, the system is already leaning your way. You still need to apply broadly and sensibly, but your default gravity vector is home.
If you want to leave—really leave—you need a plan that respects the inertia. Build ties to your target region, load your application list toward it, and make your narrative match what the numbers suggest will actually persuade programs.
You are entering a process that looks chaotic from the ground and very patterned from above. The geography piece is one of the clearest patterns we have. Use it.
Once you have your geographic strategy aligned with the data, the next real lever is program tier: which programs are actually in range for your metrics and profile. That is another set of numbers, and another conversation.