
New residency programs almost never grow “organically.” They grow by formula. Whether leadership admits it or not, the data shows a remarkably predictable growth curve in the first 5 years.
If you understand that curve, you stop guessing. You can forecast faculty needs, clinic volume, call schedules, and even ACGME site visit risk with far more precision than most GME offices currently use.
I am going to walk through what the numbers actually look like, why some programs stagnate at year 3, and which levers (FTEs, funding, accreditation rules) truly control position expansion. Not the marketing materials, not the mission statements. The constraints that show up in Excel.
The Typical 5‑Year Growth Pattern
Strip away the institutional mythology and you see three dominant patterns for new residency programs over their first five resident cohorts:
- Flat launch and then stepwise growth
- Linear annual expansion
- “Target capacity” ramp (accelerated growth to a preset ceiling)
Most programs cluster around the first two. The third happens in systems planning rapid service expansion or responding to acute workforce gaps (rural FM, community EM, certain psych programs).
Let’s quantify this.
Imagine a new internal medicine (IM) program launched with 8 categorical PGY‑1s. Here are three plausible 5‑year growth curves of approved PGY‑1 positions:
| Year of Program | Flat-then-Step (IM) | Linear (FM) | Target Ramp (Psych) |
|---|---|---|---|
| Year 1 | 8 | 6 | 4 |
| Year 2 | 8 | 8 | 6 |
| Year 3 | 8 | 10 | 8 |
| Year 4 | 12 | 12 | 10 |
| Year 5 | 12 | 12 | 10 |
Three points jump out:
- Year 1 is conservative. No one launches at their “dream” size.
- Many programs bake in a pre‑planned expansion at Year 3–4, contingent on surviving early ACGME reviews.
- Very few programs expand every year beyond Year 3. The cost and infrastructure hit is too high.
You can think of “positions over time” as a constrained growth problem. The constraints: accreditation limits, clinical volume, faculty FTE, and funding (including the Medicare cap).
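The head counts implied by each intake curve can be computed directly. A minimal sketch using the illustrative numbers from the table above, assuming 3‑year IM/FM programs and a 4‑year psychiatry program (so total residents in any year is the rolling sum of the last 3 or 4 PGY‑1 classes):

```python
# Sketch: total residents implied by each PGY-1 intake curve from the table
# above. All numbers are the illustrative figures from the table, not real data.

def total_residents(intakes, program_length):
    """Rolling head count: sum of the last `program_length` PGY-1 classes."""
    totals = []
    for year in range(len(intakes)):
        start = max(0, year - program_length + 1)
        totals.append(sum(intakes[start:year + 1]))
    return totals

flat_then_step = [8, 8, 8, 12, 12]   # IM, 3-year program
linear         = [6, 8, 10, 12, 12]  # FM, 3-year program
target_ramp    = [4, 6, 8, 10, 10]   # Psych, 4-year program

print(total_residents(flat_then_step, 3))  # [8, 16, 24, 28, 32]
print(total_residents(linear, 3))          # [6, 14, 24, 30, 34]
print(total_residents(target_ramp, 4))     # [4, 10, 18, 28, 34]
```

Notice that even a "flat" PGY‑1 intake produces a growing total head count for the first few years, because classes stack up until the first cohort graduates.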
Year 0–1: The Launch Constraint Model
Year 1 numbers are rarely random. They are the intersection of five hard ceilings:
- Faculty FTE and ACGME supervision ratios
- Clinic or case volume per resident
- Space (clinic rooms, ORs, call rooms)
- Call schedule feasibility
- Funding / GME cap strategy
Programs typically back into Year 1 PGY‑1 slots using ratios that look like this (varies by specialty but the pattern holds):
- IM / FM: ~1 core faculty FTE per 6–8 residents total
- EM: driven by ED annual visits and procedure volume
- Surgical fields: cases per resident, room availability, and call
A common IM example I have seen repeatedly:
- 5 core faculty at 0.7–1.0 FTE each → roughly 4 core faculty FTE in aggregate
- Using a conservative 1:6 faculty‑to‑resident ratio early on → 24 total residents
- To avoid overextending in the first cycle, they cap at 8 PGY‑1s → steady state plan of 8 x 3 classes = 24
But that is the “end state” calculation. Year 1 they only have PGY‑1s. Which means total residents = PGY‑1 intake. There is latent capacity built in.
So Year 1 often looks artificially small relative to long‑term faculty capacity. That is why the Year 3 or 4 expansion becomes almost inevitable—leadership wants to use the faculty they are already paying for.
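The back‑calculation above can be written out explicitly. The FTE figures and the 1:6 ratio are the illustrative numbers from the example, not a standard:

```python
# Sketch of the launch back-calculation: faculty FTE -> steady-state capacity
# -> cautious Year 1 intake. Numbers mirror the IM example above.

def steady_state_capacity(core_faculty_fte, residents_per_faculty_fte):
    """Total residents supportable at steady state under a supervision ratio."""
    return int(core_faculty_fte * residents_per_faculty_fte)

def year1_intake(total_capacity, program_length_years):
    """Even split of capacity across classes, rounded down for a cautious launch."""
    return total_capacity // program_length_years

core_fte = 5 * 0.8                              # 5 core faculty averaging 0.8 FTE = 4.0
capacity = steady_state_capacity(core_fte, 6)   # 1:6 ratio -> 24 residents
intake = year1_intake(capacity, 3)              # 3-year IM -> 8 PGY-1s per class

print(capacity, intake)  # 24 8
```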
Years 2–3: The First Inflection Point
By Year 2, the program director and DIO have three data streams that directly drive expansion decisions:
- ACGME feedback (initial site visit, citations, warning letters)
- Clinical service strain (resident workload, duty hour flags, faculty burnout complaints)
- Recruitment performance (fill rate, applicant quality, match outcomes)
If those three are green enough, they push the accelerator. If they are yellow or red, everything freezes.
A very typical “conservative to moderate” pattern in medicine, family medicine, and psychiatry looks like this:
| Program Year | Approved PGY‑1 Positions |
|---|---|
| Year 1 | 8 |
| Year 2 | 8 |
| Year 3 | 10 |
| Year 4 | 12 |
| Year 5 | 12 |
Why that curve?
- Years 1–2: No change. Leadership focuses on survival—policies, Milestones, the CCC, resident feedback, fixing the chaos.
- Year 3: Small bump (e.g., 8→10). Enough to relieve some call and service pressure, not enough to break infrastructure.
- Year 4: Final bump to intended capacity (e.g., 12), often supported by adding 1–2 additional core faculty.
- Year 5: Plateau. Stabilization phase.
From a numbers standpoint, you can model Year 3 as the “evidence‑based expansion checkpoint.” Program growth is granted only if:
- Duty hour violation rate stays low
- Case logs / encounters per resident exceed required minima
- Attrition is low (1–2 residents lost for real reasons is tolerable; a pattern is not)
- Fill rate ≥ 90–95% in the Match or SOAP
If any of those fail, the curve flattens. I have watched more than one new program plan a Year 3 expansion and then shelve it after a rough ACGME follow‑up visit.
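That checkpoint can be expressed as a simple gate function. The exact cutoffs used here (a 2% violation rate, attrition of 2, 90% fill) are illustrative assumptions matched to the criteria listed above:

```python
# A minimal version of the Year-3 "evidence-based expansion checkpoint."
# Thresholds are illustrative, not regulatory numbers.

def expansion_approved(duty_hour_violation_rate, encounters_ratio,
                       residents_lost, fill_rate):
    """Return True only if every expansion criterion is green."""
    return (duty_hour_violation_rate < 0.02      # assumed "low" threshold
            and encounters_ratio >= 1.0          # actual encounters / required minima
            and residents_lost <= 2              # isolated attrition only
            and fill_rate >= 0.90)               # Match + SOAP fill rate

# A program with clean numbers gets the green light:
print(expansion_approved(0.01, 1.15, 1, 0.95))  # True
# A single red metric freezes the curve:
print(expansion_approved(0.01, 1.15, 1, 0.80))  # False
```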
Linear vs. Stepwise Growth: Which Is Smarter?
Some programs expand 2 positions every year for 3–4 years. Others stay flat and then jump 4–6 positions in one move. The data and operational consequences are very different.
Let’s compare a linear plan (+2 per year) vs. a step plan (+0, +0, then +4), assuming a 3‑year program in which each class graduates after three years:
| Program Year | Linear Plan - PGY1 Slots | Step Plan - PGY1 Slots | Total Residents Linear | Total Residents Step |
|---|---|---|---|---|
| Year 1 | 8 | 8 | 8 | 8 |
| Year 2 | 10 | 8 | 18 | 16 |
| Year 3 | 12 | 8 | 30 | 24 |
| Year 4 | 14 | 12 | 36 | 28 |
| Year 5 | 16 | 12 | 42 | 32 |
By Year 5:
- Linear approach: 16 PGY‑1s, 42 total residents
- Step approach: 12 PGY‑1s, 32 total residents
Same initial class size. Very different long‑term service footprint.
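The footprint difference can be quantified as cumulative resident‑years of coverage over the first five years. A sketch, again assuming a 3‑year program:

```python
# "Service footprint": cumulative resident-years delivered over the 5-year
# horizon, where head count in any year is the sum of the last three
# PGY-1 classes. Intake numbers are the illustrative plans from the text.

def five_year_service(intakes, length=3):
    """Total resident-years delivered across the planning horizon."""
    return sum(sum(intakes[max(0, y - length + 1):y + 1])
               for y in range(len(intakes)))

linear = five_year_service([8, 10, 12, 14, 16])  # +2 per year
step   = five_year_service([8, 8, 8, 12, 12])    # flat, then +4 at Year 4

print(linear, step, linear - step)  # 134 108 26
```

Twenty‑six extra resident‑years of service over five years is the concrete difference hospital leadership is weighing when they argue linear vs. step.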
The linear model:
- Pros: Predictable annual growth, easier political sell (“we only added 2 more this year”), smoother recruitment story.
- Cons: Infrastructure often lags; faculty and space strain quietly increase; ACGME may view constant expansion skeptically if support does not keep pace.
The step model:
- Pros: Three “quiet” years to stabilize workflows; expansions tied to concrete milestones (new ambulatory site, new faculty).
- Cons: Sharp jumps in coverage demands; onboarding and culture can wobble when cohort size suddenly increases 50–100%.
If you are on the inside, you can usually tell which camp your leadership sits in from one question: Are they more afraid of ACGME or more afraid of local service coverage gaps?
- Fear of ACGME → stepwise, delayed expansion.
- Fear of coverage → linear, even aggressive year‑over‑year increases.
How Accreditation Rules Shape the Curve
Accreditation status is one of the strongest predictors of growth behavior, and it is not subtle.
- Initial accreditation with next visit in 2–3 years → most programs wait until after the first full review to expand aggressively.
- Citations on supervision, patient safety, or workload → growth freezes; some programs even cut back.
- Clean review and language commending educational environment → expansion proposals flow within 6–12 months.
You see this especially clearly in programs that launched right as their specialty transitioned to the Single Accreditation System. Osteopathic‑heritage programs that had to adapt documentation and processes sometimes paused expansion for 2–3 extra years, even when clinical volume clearly justified more residents.
The data story: each ACGME site visit effectively acts as a “gate” in a Markov process. Transition probability from “current size” to “larger size” spikes after a favorable review and collapses after a critical one.
If you mapped your program’s history, it would look something like this decision flow:
- Launch year → Years 1–2 stabilization
- ACGME review outcome:
  - Favorable review, and funding and faculty OK → propose expansion → expand positions → new steady state
  - Unfavorable review → maintain size, or freeze/reduce, then return to stabilization
The critical nuance: ACGME rarely tells you an exact number to cap at. They push you to align resident numbers with your actual educational and supervision environment. Programs that stretch early pay for it when the review team walks the wards and talks to PGY‑2s.
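The gate idea can be sketched as a toy transition rule. The probabilities here are purely illustrative assumptions, not measured rates:

```python
# Toy version of the "site visit as gate" model: the probability of moving
# to a larger size depends almost entirely on the review outcome.

EXPANSION_PROB = {
    "favorable_review": 0.70,   # expansion proposals flow within a year
    "neutral_review":   0.25,
    "critical_review":  0.05,   # growth freezes; some programs shrink
}

def next_size(current_size, review_outcome, step=2, roll=0.5):
    """Advance one 'gate': expand by `step` PGY-1s if the draw clears.
    `roll` is fixed here for a deterministic illustration; in a real
    simulation it would be a random draw in [0, 1)."""
    if roll < EXPANSION_PROB[review_outcome]:
        return current_size + step
    return current_size

print(next_size(8, "favorable_review"))  # 10 (0.5 < 0.70 -> expand)
print(next_size(8, "critical_review"))   # 8  (0.5 >= 0.05 -> frozen)
```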
Funding, Medicare Caps, and Why Some Curves Flatten at Year 5
Now to the part many people pretend is “complex funding mechanics” but is, in practice, a hard ceiling: GME funding caps.
Existing hospitals with old Medicare cost reports already have a cap. New teaching hospitals can set a new cap during their “cap building window,” often 3–5 years after starting training. That window is exactly why 5‑year curves matter.
Hospitals tend to follow one of three strategies:
- Conservative: Build to a stable, modest size and stop well before the cap.
- Cap‑maximizer: Rapidly expand multiple programs to fill the FTE cap by the end of the window.
- Opportunistic: Incremental growth driven by service lines, not by a clear cap strategy—this one looks messy in the data.
Where you see the most aggressive 5‑year ramps is in systems tipped off early by consultants: “Your cap will finalize around Year 5–6. Whatever resident FTE you have then is basically locked in.”
So yes, some of the growth you see in Years 3–5 is not about education at all. It is about not leaving federal dollars on the table.
A simplified model:
- Hospital launches an IM program with 8 PGY‑1s and a psych program with 4 PGY‑1s.
- Over 5 years, they ramp IM to 16 PGY‑1 and psych to 8 PGY‑1.
- Steady state: (16 × 3) + (8 × 4) = 80 residency FTEs.
- That 80 FTE becomes the ballpark cap metric that drives federal funding going forward.
If they had stayed at 8 and 4 with no growth, they would have locked in roughly (8 × 3) + (4 × 4) = 40 FTE instead. You can do the math on the revenue loss over a decade.
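The cap arithmetic is simple enough to sketch: steady‑state FTE is PGY‑1 intake × program length, summed across programs (3 years for IM, 4 for psychiatry):

```python
# Simplified cap model from the text: steady-state resident FTE per program
# is PGY-1 intake times program length, summed across the portfolio.

def steady_state_fte(programs):
    """programs: list of (pgy1_intake, program_length_years) tuples."""
    return sum(intake * length for intake, length in programs)

grown  = steady_state_fte([(16, 3), (8, 4)])  # ramped IM + psych portfolio
frozen = steady_state_fte([(8, 3), (4, 4)])   # no-growth portfolio

print(grown, frozen)  # 80 40
```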
So when you see an otherwise cautious institution suddenly adding 2–4 PGY‑1 positions per year across several new programs, odds are they are playing the cap game.
Specialty Differences: Who Grows Fast, Who Stays Flat
Not all specialties follow the same curve. The constraints differ.
Here is a high‑level comparison pattern (yes, there are exceptions; I am talking about medians):
| Specialty | Growth Aggressiveness (0–10) |
|---|---|
| Internal Medicine | 5 |
| Family Medicine | 6 |
| Psychiatry | 5 |
| General Surgery | 2 |
| Emergency Medicine | 3 |
Interpretation (on a 0–10 “growth aggressiveness” scale):
- Family Medicine: High growth propensity. Especially in community and rural settings with primary care shortages. Often ~+2 PGY‑1 every 1–2 years until they hit a 24–30 resident footprint.
- Internal Medicine: Moderate–high. Hospitalists and subspecialists drive demand; many IM programs have the clinical volume to double their original size over the first decade.
- Psychiatry: Similar to FM/IM now because of demand. But bottlenecked by outpatient sites and psychotherapy supervision.
- Emergency Medicine: Moderate. Growth limited by ED volume, procedure availability, and shift coverage structure.
- General Surgery: Slow. OR time is finite, and ACGME case minima are unforgiving. You rarely see surgery programs doubling in size within 5 years.
Surgical and procedural specialties tend to “under‑grow” what funding would allow because their educational and case constraints are much tighter. They might take 5–8 years to add a single extra resident per class.
Operational Impact: What Growth Curves Mean on the Ground
These curves are not abstract. They decide whether your PGY‑2s are taking 1 in 4 call vs. 1 in 6. They determine how many clinic days per block your PGY‑3s get. And they dictate faculty burnout trajectories.
A few patterns I have seen over and over:
Under‑growth programs
- Add no positions for 5 years despite rising volume.
- Residents end up covering more patients than peer programs.
- Faculty complain about “using residents as cheap labor,” and ACGME starts sniffing around workload and supervision, not size.
Over‑growth programs
- Expand positions faster than they expand meaningful educational opportunities.
- Residents struggle to meet case minima or get procedures because there are too many bodies for the same number of opportunities.
- PDs end up slicing rotations thinner, adding “electives” that are really service.
Balanced‑growth programs
- Tie each planned increase in PGY‑1 slots to concrete changes: a new clinic, additional inpatient service, or added faculty FTE.
- They can show, on paper and in data, that each added resident still gets at least the same, if not better, exposure and supervision.
The programs that end up in trouble are the ones whose growth curve is driven by political pressure, not by a structured needs and capacity analysis.
How to Model a Responsible 5‑Year Growth Plan
If you are involved in a new program, you should be able to produce a simple 5‑year model that looks like something a CFO and ACGME reviewer could both respect.
Here is the bare minimum dataset I would build:
- Planned PGY‑1 intake by year (5‑year horizon)
- Resulting total residency headcount by year (all PGY levels)
- Faculty FTE (core and key faculty) by year
- Clinical volume per learner by year (patients per clinic half‑day, encounters per ward day, OR cases, ED visits)
- Funding sources by year (Medicare GME, hospital support, grants, etc.)
Then set some guardrails:
- Faculty FTE to resident ratio should not get worse over time. If you go from 1:6 to 1:9 over 5 years, that is a red flag.
- Key volume metrics per resident should stay flat or improve. If outpatient visits per resident drop 25% because you added 4 more residents to the same clinic capacity, that is a problem.
- Duty hour violations should not spike each time you add positions. In practice, adding residents usually reduces violations, but only if they are deployed intelligently.
If your 5‑year curve violates those guardrails, you do not have a growth plan. You have a wish list.
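The guardrails above can be wired into a simple checker. A sketch comparing launch‑year and Year‑5 snapshots, where the field names and the 25% volume tolerance are assumptions for illustration:

```python
# Guardrail check: flag any ratio in the 5-year model that drifts the wrong
# way. Snapshot values below are hypothetical, chosen to trip every flag.

def guardrail_flags(year1, year5):
    """Compare launch-year and Year-5 snapshots; return a list of red flags."""
    flags = []
    ratio1 = year1["residents"] / year1["faculty_fte"]
    ratio5 = year5["residents"] / year5["faculty_fte"]
    if ratio5 > ratio1:
        flags.append(f"supervision ratio worsened: 1:{ratio1:.1f} -> 1:{ratio5:.1f}")
    if year5["visits_per_resident"] < 0.75 * year1["visits_per_resident"]:
        flags.append("outpatient volume per resident dropped >25%")
    if year5["duty_hour_violation_rate"] > year1["duty_hour_violation_rate"]:
        flags.append("duty hour violations rising with each expansion")
    return flags

launch = {"residents": 24, "faculty_fte": 4.0,
          "visits_per_resident": 900, "duty_hour_violation_rate": 0.01}
year5  = {"residents": 36, "faculty_fte": 4.0,
          "visits_per_resident": 620, "duty_hour_violation_rate": 0.03}

for flag in guardrail_flags(launch, year5):
    print(flag)
```

A plan that produces an empty flag list is not automatically safe, but a plan that trips any of these flags on paper should not survive committee review.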
The Future: Aggressive Expansion vs. Quality Risk
Given workforce shortages and federal funding pressures, expect growth curves to get more aggressive over the next decade, not less.
Three trends that matter:
Workforce shortages
- Psychiatry, primary care, and certain surgical fields have clear national and regional gaps.
- Policy discussions are already leaning toward expanding funded residency slots; systems are positioning themselves to absorb them.
Cross‑site and consortium models
- Programs will increasingly spread residents across multiple hospitals and clinics to “manufacture” enough volume to support larger cohorts.
- This enables faster position growth, but coordination cost and variation in training quality rise.
Data‑driven accreditation
- ACGME is leaning more and more on outcomes data, milestones, and real workload metrics.
- Programs attempting steep growth curves with weak data will get called out faster than a decade ago.
The worst‑case scenario is obvious: hospitals chasing GME dollars, pushing 5‑year curves that outpace faculty and case volumes. Residents end up doing more scut, learning less, and burning out faster.
The best‑case scenario is also clear: deliberate, model‑driven expansion where each added PGY‑1 slot is justified by numbers, not just sentiment.
Key Takeaways
- New residency programs follow surprisingly predictable 5‑year growth curves, with Year 3–4 as the primary expansion inflection point driven by ACGME feedback and early outcomes.
- Sustainable growth is constrained by three things: supervision capacity (faculty FTE), real clinical volume per resident, and GME funding caps. Ignore any one of those and the curve becomes unsafe.
- The smartest programs treat position expansion as a quantitative design problem, not a political decision—tying each added resident to concrete metrics rather than vague “service needs.”