
New residency programs almost never just vanish overnight. The horror stories get recycled; the actual numbers do not back them up.
Let me be blunt: the way people talk on Reddit and in group chats, you’d think half of new programs implode in the middle of the year and leave residents holding a trash bag full of scrubs and broken dreams. That’s fiction. Or more accurately: that’s a tiny number of high‑profile meltdowns being treated like the norm.
You should be wary of bad programs. You should not be terrified that “new = likely to close.”
Here’s what the data and real cases actually show.
The Reality: Closures Are Rare, Even Among New Programs
ACGME‑accredited programs do close. But they close rarely, and usually not suddenly.
ACGME publishes lists of newly accredited and withdrawn programs every year. They do not put a big red arrow saying “this one was a disaster,” so you have to look at patterns: timing, specialty, sponsoring institution, and whether there was a broader hospital or corporate collapse.
What you see if you track this over years:
- Hundreds of new residency and fellowship programs are accredited in a typical multi‑year span.
- Only a small minority of programs—new or old—are ever withdrawn.
- Of those withdrawals, most are:
  - Voluntary consolidations (e.g., merging two Internal Medicine programs under one sponsoring institution).
  - Shifts in institutional strategy (a hospital drops one specialty and grows another).
  - Consequences of hospital or health system financial collapse.
“Sudden, out‑of‑the‑blue closure of a brand‑new program mid‑year” is in the same category as “plane crash”: real, devastating, heavily publicized, and statistically rare.
To put some shape to this, here’s a rough comparison using available ACGME and news data patterns from the last decade. This is directional, not an exact census, but it’s far closer to reality than the rumor mill:
| Category | Approximate Count (last decade) |
|---|---|
| New Programs Accredited | ~400 |
| Programs Closed/Withdrawn | ~25 |
For every one program that closes or withdraws, you see many more that are created and persist.
Does that 25 include some programs that were genuinely dysfunctional or nonviable? Yes. But the catastrophic “we lost accreditation mid‑residency and had to scramble” group is a subset of that subset.
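If you like to sanity‑check claims with arithmetic, the directional figures above reduce to a two‑line calculation. A minimal sketch, using the illustrative numbers from the table rather than an official ACGME census:

```python
# Back-of-the-envelope math on the directional figures above
# (illustrative approximations, not an official ACGME census).
new_programs_accredited = 400
programs_withdrawn = 25

print(f"Withdrawals per new accreditation: {programs_withdrawn / new_programs_accredited:.1%}")  # ~6%
print(f"New programs per withdrawal: {new_programs_accredited / programs_withdrawn:.0f}:1")      # 16:1
```

Roughly one withdrawal for every sixteen new accreditations, and remember that the withdrawal bucket includes old programs and quiet voluntary consolidations, not just meltdowns.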
Why The Myth Feels So Real
So if closures are rare, why does it feel like every other week someone posts “New program shut down!!”?
Because the failures are loud, dramatic, and shareable.
You’ve probably heard versions of these:
- The hospital system that expanded too fast, then hit a financial wall and chopped GME to save cash.
- The community hospital that opened several new specialties under a weak GME office, got slammed in site visits, and chose to withdraw.
- The private equity–backed facility that built residency programs as a revenue strategy, then got sold or restructured.
These produce nightmare stories: residents reapplying, relocating, losing years of training, or scrambling to find “transfer” positions. The ACGME does step in to help place displaced residents in other programs, but it’s messy. Trainees who go through it understandably want to warn everyone.
The key cognitive trap: availability bias. You remember the emotional case you read on SDN or Twitter. You don’t remember the hundreds of quiet, functioning new programs where residents are just… doing notes and scut and complaining about Epic.
The myth grows because people repeat the trauma, not the base rates.
What Actually Causes Residency Programs To Close
Programs do not lose accreditation because they’re “new.” They close because of specific, usually predictable problems that piled up over years. When you examine real closures, several themes repeat.
1. Hospital or System Financial Collapse
This is by far the biggest real risk.
If a sponsoring hospital closes, merges, or files for bankruptcy, its residency programs are collateral damage. This has happened with:
- Safety‑net hospitals bled dry by uncompensated care.
- Community hospitals taken over by larger systems that decide GME is not worth the cost.
- For‑profit entities that used residents as cheap labor, then liquidated or restructured.
This is not a “new program problem.” Old, storied programs have gone down this way too; the 2019 Hahnemann University Hospital bankruptcy displaced hundreds of residents from decades‑old programs. The risk factor is institutional stability, not program age.
2. Chronic Non‑Compliance with ACGME Standards
ACGME does not nuke a program for one bad survey comment. Loss of accreditation almost always comes after years of warnings, citations, and probation.
Patterns you see in problem programs:
- Persistent violation of work‑hour rules.
- Systemic lack of supervision (residents “acting as attendings” at night).
- No didactics, no protected education time, garbage evaluation processes.
- Toxic culture reported consistently on anonymous ACGME surveys.
New programs are more exposed here if the leadership is inexperienced or cynical and thought “residents = cheap coverage, education optional.” But that’s not every new program; that’s a specific kind of bad leadership.
3. An Overextended, Underprepared GME Office
Some hospitals decide they want 10 new programs… but they’ve barely run one.
I’ve seen versions of this: one beleaguered DIO (Designated Institutional Official) covering Internal Medicine, Family Medicine, Psychiatry, Transitional Year, and EM, with part‑time coordinators and no real faculty development. All at once. That’s exactly how you end up with accreditation issues across multiple programs.
Again, predictable if you look for it. These places scream “too much, too fast.”
How Often Are Residents Actually Left Stranded?
Here’s where people confuse categories.
Programs can:
- Lose accreditation completely, with residents forced to transfer.
- Be placed on probation with strict oversight but continue training.
- Voluntarily withdraw over time, often completing training for current residents.
- Be closed because the hospital itself closes.
Those scenarios feel very different if you’re the PGY‑2 trying to finish training.
Based on ACGME reports and news‑verified cases over recent years, the number of residents forced to transfer out of collapsing programs in any given year is tiny relative to the roughly 160,000 residents in ACGME programs.
To make that concrete, think scale, not exact numbers:
| Category | Approximate Residents (per year) |
|---|---|
| Residents in Stable Programs | ~159,000 |
| Residents Affected by Closures | ~500 |
Most years, the displaced group is a rounding error. Painful if you’re in it? Absolutely. But statistically, it is not a normal or expected outcome.
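To put “rounding error” in concrete terms, here’s the same table as a quick calculation, again using the illustrative round numbers rather than an exact count:

```python
# Rough annual displacement risk, using the illustrative round numbers above.
residents_stable = 159_000
residents_displaced = 500
total = residents_stable + residents_displaced

rate = residents_displaced / total
print(f"Displaced share of all residents: {rate:.2%}")   # ~0.31%
print(f"Roughly 1 in {round(1 / rate)} residents")       # ~1 in 319
```

About a third of a percent in a bad year. Compare that with the attrition risk you accept at any program for ordinary reasons, and the “new programs collapse all the time” framing falls apart.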
When closures do happen, ACGME and specialty RRCs usually coordinate “placement” efforts:
- Nearby programs temporarily increase complement to take displaced residents.
- Some residents lose a year if there’s not a perfect PGY‑aligned spot.
- People may have to move states, reset seniority, or accept less desirable programs.
Brutal? Yes. Common? No.
Are New Residency Programs Inherently Risky?
The short version: no, not inherently. But there are classes of new programs that carry more risk.
Let’s split what usually gets lumped together as “new program” into some more useful categories:
| Type of New Program | Relative Risk of Sudden Problems |
|---|---|
| Expansion at major academic center | Very low |
| New program in long‑established GME hospital | Low |
| Fast‑multiplying programs at small system | Moderate to higher |
| Programs tied to unstable or for‑profit hospitals | Higher |
Low‑Risk “New” Programs
Examples:
- A new Cardiology fellowship at a university that already runs 30+ residencies.
- A fresh EM program at a big county hospital with long‑running IM, Surgery, and FM.
These are technically “new” on paper but backed by mature GME infrastructure: seasoned DIO, coordinators who actually know what a Milestone is, faculty with prior teaching experience. The chance of sudden death here is extremely low.
Moderate‑Risk Programs
Think:
- Community hospitals with 1–2 stable programs that now add 3–4 more quickly.
- Regional systems chasing the “more residents = more bodies on the floor” logic without proportionally increasing faculty or GME staff.
These can work. Many do. But if you hear “we opened IM three years ago and now have EM, Psych, FM, TY, and Neuro coming online,” your antennae should go up. That pace strains oversight and funding.
Higher‑Risk Programs
The ones I personally side‑eye:
- New programs at hospitals owned by shaky for‑profit chains or frequently flipped systems.
- Programs where residents are clearly being used to backfill what should be attending or APP roles, with no meaningful academic affiliation.
- Institutions with major, recent financial or regulatory trouble.
It’s not the “newness.” It’s the underlying business model and governance.
How To Actually Assess Whether a New Program Might Close
Instead of asking “Is a new program likely to close?” ask smarter questions: “What’s the institutional risk profile?” and “Are there early signs this place is being built on hot air?”
Here’s a simple mental checklist I wish more applicants used.
1. Sponsoring Institution Stability
Ask:
- Is the hospital part of a stable, reputable system or a recurring name in bankruptcy news?
- Have they been in the headlines for closures, mass layoffs, or losing service lines (OB, ICU, etc.)?
- Do they already host long‑standing accredited programs that seem healthy?
If a place has run an IM program for 20 years without drama, your new subspecialty there is unlikely to vanish in year three.
2. GME Office and Leadership
Red flags in actual interviews:
- The DIO cannot clearly explain how many programs they sponsor and what the growth plan is.
- Program directors look completely exhausted and vague about curriculum.
- Coordinators are brand new and juggling multiple programs solo.
On the flip side, if you see a seasoned DIO, clear policies, robust orientation, and engaged faculty, that’s a different signal.
3. ACGME Accreditation Status
New programs start with Initial Accreditation. That alone is not a red flag.
But you should:
- Ask explicitly: “Have there been any citations or concerns from ACGME site visits?”
- Look up the sponsoring institution in ACGME’s public data to see if other programs are on probation.
If multiple programs at the same institution are on probation for similar issues, that’s a systemic problem.
4. Resident Experience (Even If They’re Only PGY‑1)
Yes, PGY‑1s do not have the full picture. But they absolutely know if:
- They’re consistently working beyond duty hours with no fix in sight.
- There’s actual teaching versus constant service with ghost attendings.
- They feel safe escalating concerns or are told to “stop complaining; you’re lucky to be here.”
On interview day, ask hard questions. And watch whether the chiefs or PGY‑1s dodge eye contact when they answer.
The Data Trend No One Talks About: Expansion, Not Collapse
Another reason the “everything is closing” narrative is wrong: the long‑term direction of U.S. GME is expansion, not contraction.
There’s an ongoing mismatch between U.S. med school graduates and available residency spots. To keep up with med school growth and population needs, hospitals have been adding programs and positions aggressively, especially in community and regional centers.
You can visualize the general dynamic like this:
| Year | Active Programs | Programs Withdrawn |
|---|---|---|
| Year 1 | 10,000 | 40 |
| Year 2 | 10,400 | 45 |
| Year 3 | 10,850 | 42 |
| Year 4 | 11,300 | 50 |
| Year 5 | 11,750 | 48 |
Massive growth in active programs. Small fluctuations in yearly withdrawals. That’s the real backdrop.
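Run the numbers from that table and the asymmetry is obvious. A quick sketch with the directional figures (illustrative only, like everything in that table):

```python
# Withdrawal rate vs. growth, using the directional figures from the table above.
active = [10_000, 10_400, 10_850, 11_300, 11_750]
withdrawn = [40, 45, 42, 50, 48]

for year, (a, w) in enumerate(zip(active, withdrawn), start=1):
    print(f"Year {year}: withdrawal rate = {w / a:.2%}")
# Each year stays under 0.5%, while the active-program count
# grows by roughly 4% annually.
```

A system losing well under half a percent of its programs a year while growing around four percent a year is expanding, full stop.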
The painful truth is this: the system tolerates some failed experiments because the aggregate demand for training positions is so high. That’s not comforting if you end up in a failed experiment, but it explains why the macro reality is “more programs each year,” not “implosion.”
Practical Take: Should You Avoid All New Programs?
No. That’s lazy thinking.
You should avoid:
- Programs at obviously unstable or predatory institutions.
- Programs where residents are clearly being abused, unsupported, or silenced.
- Programs created purely as staffing hacks, with no coherent educational mission.
You should not automatically avoid:
- New programs embedded in solid, long‑standing academic or community teaching hospitals.
- Thoughtful expansions with clear curricula, invested faculty, and transparent leadership.
The residents I’ve seen get burned didn’t just choose a “new program.” They chose a new program at a hospital that already looked wobbly, or in a system that saw them as line items, not trainees. You can usually sniff that out.
A Quick Reality Check Timeline: How Closures Actually Unfold
The “sudden” part is often exaggerated. Internally, trouble builds for years.
Here’s a simplified version of how a program meltdown typically progresses:
| Phase | Timing | What Typically Happens |
|---|---|---|
| Early years | Years 1–2 | New program opens; minor growing pains |
| Early years | Years 2–3 | First ACGME site visit; initial citations |
| Warning phase | Years 3–4 | Continued complaints, poor surveys, more citations |
| Warning phase | Years 4–5 | Probation or threatened ACGME actions |
| Outcome | Years 5–6 | Program either fixes its issues and continues, or voluntarily withdraws / loses accreditation |
From the outside, you may only hear “Program lost accreditation this year.” On the inside, the warning lights were flashing for a while.
This is why talking to current residents and watching for institutional patterns is so crucial. You’re trying to pick up on those early warning years.
The Bottom Line
New residency programs do occasionally implode. But the narrative that “new program = high chance of closure” is not supported by data.
You should walk away with three core points:
- Program closures are rare events, even among new programs. Most new residencies stabilize and keep training cohorts year after year.
- The real risk is tied to institutional instability and bad leadership, not the age of the program on paper. Follow the money, the accreditation history, and the residents’ lived reality.
- Instead of blindly avoiding “new,” learn how to vet programs intelligently. If you can spot shaky hospitals and overextended GME offices, you’ve already dodged the scenarios that lead to truly sudden closures.
Be cautious. Be skeptical. But base your fear on actual risk, not on the loudest horror story in your group chat.