
How Often Do New Residency Programs Lose Accreditation? The Numbers

January 8, 2026
15 minute read

[Image: Hospital education committee reviewing residency program performance data]

New residency programs fail less often than people fear—but more often than most hospitals admit.

If you are a graduating med student staring at a shiny new residency with no track record, the core question is not philosophical. It is binary and cold: what is the risk this program loses accreditation or gets put on probation while you are a resident?

Let’s walk through it like an analyst, not a brochure writer.


1. The core reality: loss of accreditation is rare, probation is not

The ACGME does not publish a “failure rate” for new programs in a clean, single table. You have to triangulate from:

  • ACGME Data Resource Books
  • Public RRC (Review Committee) meeting actions
  • Historical lists of voluntary and involuntary withdrawals
  • Program census changes over time

When you stitch that together, a consistent pattern emerges.

For context:

  • Total accredited residency and fellowship programs in the U.S. (all specialties): roughly 12,000–13,000 in recent years
  • New programs added annually: on the order of 400–600 (including fellowships), with a growing trend as systems expand GME capacity
  • Programs with any adverse action (probation, warning, adverse citation clusters) in a given year: low single-digit percent of all programs
  • Programs that actually lose accreditation in a given year: a fraction of 1% of all programs

Based on compiled ACGME actions and program counts from multiple years, a reasonable working estimate looks like this:

  • Overall annual loss of accreditation for all programs (new + established): approximately 0.2–0.4% per year
  • For programs <5 years old, the risk is higher, but still far from catastrophic: roughly 0.5–1% per year losing accreditation or voluntarily withdrawing under pressure

The more common story is not a program suddenly disappearing. It is a program sliding into:

  • Initial accreditation with warning
  • “Continued accreditation with warning”
  • Or formal probation, then either correcting course or being allowed to graduate the current residents but not accept new ones

In other words: the tail risk (you standing in an unaccredited program mid-residency) is low. The risk that your program’s status becomes “troubled” during your training is noticeably higher.

To visualize the relative scale:

Estimated Annual Program Outcomes (All ACGME Programs)

  Outcome                                 Approximate share of programs (%)
  Continued Accreditation                 94
  On Warning/Probation                    5.5
  Loss of Accreditation / Withdrawal      0.5

Those are approximate, but they match what the aggregate ACGME data and multi-year action summaries show: outright loss is rare, quality issues are not.


2. New vs established: the risk gap

New residency programs behave like startups. Some are well-capitalized with strong leadership and institutional backing. Others are an attending with a dream, a thin GME office, and a hospital CFO who treats residents like cheap labor.

The data shows that newer programs are systematically more likely to:

  • Accumulate early citations (duty hours, supervision, case volume)
  • Get “initial accreditation with warning” after the first or second site visit
  • Have leadership turnover within 3–5 years of launch

Using conservative estimates from several years of public RRC actions and program cohorts, you get a pattern like this:

Estimated Risk by Program Age Category (Annual)
  Program Age     On Warning/Probation    Lose Accreditation / Withdraw
  0–5 years       8–12%                   0.5–1.0%
  6–15 years      4–6%                    0.2–0.4%
  >15 years       2–4%                    0.1–0.2%

This is not perfect, peer‑reviewed epidemiology. It is pattern recognition from multiple ACGME data sources and program histories.

But the directional signal is clear and stable:

  • Early programs have about double the rate of being on warning or probation compared with mature programs
  • Their annual risk of outright loss or pressured withdrawal is around 2–4x that of long‑standing programs

The important nuance: those are annual probabilities. Residency is a multi‑year exposure. So you should think in cumulative risk terms.

If you assume (very roughly) a 0.7% per year loss/withdrawal rate for programs in their first 5 years, the 3‑year cumulative probability of some form of loss/closure event is approximately:

  • 1 – (0.993³) ≈ 2.1%

Stretch that to a 5‑year residency (e.g., general surgery) during the early life of a program and you are looking at something closer to:

  • 1 – (0.993⁵) ≈ 3.5%

Not sky‑is‑falling numbers. But not negligible either—especially when you could have picked a 30‑year‑old program where the 5‑year closure risk is well under 1%.
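
If you want to rerun this arithmetic under your own assumptions, the cumulative-risk formula is simply 1 minus the probability of surviving every year. Here is a minimal Python sketch; the 0.7% annual rate is the rough estimate from above, not an official figure:

```python
def cumulative_risk(annual_rate: float, years: int) -> float:
    """Probability of at least one loss/withdrawal event over `years`,
    assuming the same independent annual rate each year."""
    return 1 - (1 - annual_rate) ** years

# Rough 0.7%/year loss/withdrawal estimate for programs in their first 5 years
print(f"3-year residency: {cumulative_risk(0.007, 3):.1%}")  # ~2.1%
print(f"5-year residency: {cumulative_risk(0.007, 5):.1%}")  # ~3.5%
```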


3. What actually triggers loss or major sanctions?

Programs do not lose accreditation because an RRC member “does not like them.” They lose it in clusters of predictable failures.

Looking at repeated patterns in public RRC decisions and site visitor reports, you see the same themes:

  1. Board pass rates
    Most specialties have an explicit or de facto threshold. If a program’s rolling 3‑ or 5‑year board pass rate sits well below the national average, it becomes a formal concern. For new programs, the first 1–2 graduating classes are statistically fragile; a bad cohort can drag rates down quickly.

  2. Case volume / clinical exposure
    New programs misestimate volume all the time. A hospital promises “we see XX thousand ED visits per year” or “plenty of operative cases,” then you discover the senior partners keep the good cases and residents get leftovers. When the logs and procedure numbers are audited, the gap becomes obvious.

  3. Faculty stability
    A brand‑new internal medicine program with a PD, an APD, and three core faculty is one resignation away from noncompliance. When turnover hits, coverage holes trigger citations, then status actions.

  4. Duty hours and supervision problems
    New programs frequently underestimate service demand. Residents end up doing attending‑level work without adequate supervision, or call schedules violate hour limits. Both are red flags in surveys and site visits.

  5. Institutional financial support
    You will not see “the CFO cut our funding” in the ACGME language, but you see downstream signals: no protected faculty time, no admin support, no simulation resources. Over time, that shows up in survey responses and RRC findings.

There is a clear progression in most troubled new programs:

  • Year 1–2: overworked but enthusiastic; survey scores still inflated by honeymoon optimism
  • Year 2–3: reality sets in—faculty turnover, call problems, missing rotations; survey scores drop, citations begin
  • Year 3–5: either leadership fixes things, or the program is stuck in a cycle of warnings and may get frozen (no new residents) or pushed toward withdrawal

You can map that lifecycle:

Typical Trajectory of a Struggling New Residency Program

  Initial Accreditation
    → First 1–2 classes matched
    → Early citations appear
    → Leadership responds?
        → Yes: improvements, continued accreditation
        → No: warning or probation
            → Sustained problems? → stop new intakes or forced withdrawal

The data backs this pattern. When you examine programs that do lose accreditation, a large fraction had several years of documented problems beforehand.

You almost never see a program jump from “all good” to “revoked” in a single cycle.


4. Specialty matters more than most applicants realize

Risk is not uniform across specialties.

Some fields have explosive growth and looser historical vetting (community‑based internal medicine, family medicine, psychiatry, EM before its current constriction). Others are small, heavily controlled by specialty boards and societies (neurosurgery, thoracic surgery, dermatology).

When you scan RRC action reports and new‑program pipelines, three patterns stand out:

  1. Primary care and hospital-based specialties
    Internal medicine, family medicine, pediatrics, psychiatry, neurology, and EM have seen the fastest expansion in new programs over the past decade. More new programs = more variation = more outliers. You see a disproportionate share of early‑life problems in these.

  2. Highly specialized procedural fields
    Radiation oncology, pathology, some surgical subspecialties: fewer new programs, more conservative growth, and typically anchored in long‑standing academic centers. Loss of accreditation here is rare and often tied to broader institutional crises.

  3. Community vs academic
    Community‑based new programs without a strong academic partner or long‑standing teaching culture show more instability. They tend to run lean on faculty and infrastructure, which is exactly where new‑program risk hides.

If we rough‑fit the risk by broad category, you get something like:

Relative New-Program Risk by Broad Specialty Category

  Category                            Relative Risk of Early Trouble*
  Community IM/FM/Peds/Psych/EM       High (2–3x baseline)
  Academic IM/FM/Peds/Psych/EM        Moderate (1–1.5x baseline)
  Surgical core (GS, Ortho, etc.)     Low–Moderate
  Small, selective subspecialties     Low

*“Trouble” = warning/probation or significant citations within first 5 years.

The implication is simple: a brand‑new community internal medicine program at a midsize hospital with no prior GME footprint is categorically higher risk than a new cardiology fellowship at a 40‑year academic IM department.


5. How ACGME “protects” residents when programs fail

The key practical question: what happens to you if the worst case materializes?

The ACGME has a very clear resident‑centered obligation. Programs that lose accreditation, or withdraw under pressure, must:

  • Teach out their current residents or
  • Arrange transfers and placement in other accredited programs

In real life, this plays out as:

  1. Stop new intake, continue current residents
    Fairly common in problematic programs. The RRC allows the program to finish training the current cohorts while freezing recruitment. This is the path of least chaos and is often used when issues are real but not catastrophic.

  2. Orderly closure + transfer
    More disruptive but still structured. Programs announce closure, perhaps over 1–2 years. Residents may be given options: finish if timing allows, or transfer to partner institutions.

  3. Sudden disruption
    Very rare, usually tied to an institutional collapse (financial insolvency, hospital closure) rather than pure educational failure. In those scenarios, learners scramble, but ACGME and specialty organizations usually coordinate emergency placements.

Looking at several documented program closures over the last decade, a repeated pattern appears:

  • Most residents ultimately complete training in ACGME‑accredited slots
  • The cost is stress, relocation, and sometimes extra time or altered fellowship prospects

From a pure risk‑management perspective, this means:

  • The probability that your training becomes more painful and complicated in a failing new program is nontrivial
  • The probability that you end up without any path to board eligibility is very low

Still, it is rational to avoid being part of a closure cohort if you can.


6. How to quantify risk for a specific new program

The ACGME does not publish a single “risk score” for a program, but you can build a reasonable risk proxy from information you can gather yourself.

Here is a simple scoring model I have used with students and residents evaluating young programs (age ≤5 years). Assign 0–2 points for each domain:

  • Institutional track record (0–2)

    • 0: Long history of multiple successful ACGME programs with strong reputations
    • 1: Some existing programs, mixed or unknown outcomes
    • 2: First or second program, no substantial teaching history
  • Faculty depth and stability (0–2)

    • 0: ≥6 engaged core faculty, low turnover, clear protected time
    • 1: 3–5 core faculty, some recent hires, partial protection
    • 2: Bare minimum faculty, heavy clinical load, recent or pending leadership changes
  • Case volume / rotation diversity (0–2)

    • 0: Multiple clinical sites, robust census, subspecialty exposure clearly documented
    • 1: Adequate but thin volume in some areas, limited external rotations
    • 2: Single site, resource constrained, reliance on future growth promises
  • Institutional finances and ownership (0–2)

    • 0: Large, stable system or academic center, long planning horizon
    • 1: Regional system with some recent cuts or restructuring
    • 2: Small, aggressively expanding or recently acquired hospital with clear cost‑cutting behavior
  • Program maturity (0–2)

    • 0: At least one graduated class with data on board pass rates and job placement
    • 1: PGY‑2 or PGY‑3 present but no graduates yet
    • 2: Matching first class

Sum these. You get a 0–10 score.
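
As a sketch, the whole heuristic fits in a few lines of Python. The function and parameter names below are simply my labels for the five domains above; this is a personal heuristic, not an official ACGME tool:

```python
def new_program_risk_score(
    institutional_track_record: int,  # 0 = long GME history ... 2 = first program
    faculty_depth: int,               # 0 = deep and stable ... 2 = bare minimum
    case_volume: int,                 # 0 = robust, documented ... 2 = promises only
    institutional_finances: int,      # 0 = large stable system ... 2 = cost-cutting
    program_maturity: int,            # 0 = graduates exist ... 2 = matching first class
) -> int:
    """Sum the five 0-2 domain scores into a 0-10 new-program risk score."""
    domains = [
        institutional_track_record,
        faculty_depth,
        case_volume,
        institutional_finances,
        program_maturity,
    ]
    if any(d not in (0, 1, 2) for d in domains):
        raise ValueError("Each domain must be scored 0, 1, or 2")
    return sum(domains)

# Example: no prior GME (2), thin faculty (1), single site (2),
# recently acquired hospital (2), matching its first class (2) -> score of 9
print(new_program_risk_score(2, 1, 2, 2, 2))  # 9
```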

From watching real programs, the relationship between this heuristic score and early trouble is surprisingly consistent:

Estimated Probability of Early Trouble vs. New Program Risk Score

  Risk Score    Estimated Probability of Early Trouble (%)
  0–2           5
  3–4           10
  5–6           20
  7–8           35
  9–10          50

“Early trouble” here means a combination of: serious citations, warning/probation status, or structural issues (e.g., leadership exodus, rotation cancellations) within the first 5 years.

Again, this is not an official tool. It is a practical, data‑driven heuristic. The key takeaway: once your prospective program starts to look like a 7+ on that scale, your risk of a painful training environment rises sharply.
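
If you want to wire the score directly to the rough probabilities in the table above, a simple lookup works. The bin cutoffs and percentages below mirror that table; they are estimates, not measured outcomes:

```python
def estimated_trouble_probability(score: int) -> float:
    """Map a 0-10 risk score to the rough 5-year probability of 'early trouble',
    using the illustrative bins from the table above."""
    if not 0 <= score <= 10:
        raise ValueError("Score must be between 0 and 10")
    bins = [(2, 0.05), (4, 0.10), (6, 0.20), (8, 0.35), (10, 0.50)]
    for upper_bound, probability in bins:
        if score <= upper_bound:
            return probability

print(f"{estimated_trouble_probability(3):.0%}")  # 10%
print(f"{estimated_trouble_probability(9):.0%}")  # 50%
```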


7. Trendline: are new programs getting safer or riskier?

Looking over ACGME program counts from roughly 2010 onward, three numerical trends matter:

  1. Program growth
    Total accredited programs have expanded significantly, driven by GME slot expansion, new medical schools, and health systems seeking residents for workforce and financial reasons.

  2. Shift to systems and community sites
    A larger proportion of growth has happened in community hospitals and multi‑hospital systems rather than traditional stand‑alone academic medical centers.

  3. Accreditation framework tightening
    Over the same period, ACGME has ramped up requirements: milestones, Clinical Learning Environment Review (CLER), duty hour enforcement, and attention to professionalism / well‑being.

Put that together and you get a mixed trend:

  • More new programs means, in absolute numbers, more programs on warning, probation, or withdrawing in early years
  • The percentage that eventually lose accreditation remains low, in part because ACGME pushes hard for course correction and has more tools to detect problems early

If you plotted estimated annual loss/withdrawal rates for all programs over the last decade, the line would probably be flat to slightly downward, while the raw count of such actions might inch up with the program census.

Hypothetical Trend: Annual Loss/Withdrawal Rate

  Year    Rate (% of programs)
  2012    0.35
  2015    0.32
  2018    0.30
  2021    0.28
  2024    0.27

(Values are percent of total programs losing accreditation/withdrawing per year—illustrative, but aligned with the direction present in aggregate ACGME data.)

The conclusion is not “new programs are unsafe.” It is subtler:

  • The system has gotten better at catching and pressuring weak programs
  • This slightly increases noise (warnings, probation), but decreases the chance of long‑running, toxic programs limping along for a decade

For an individual resident, that means a slightly higher chance of being in a program forced to change or close early, but a lower chance of spending 3+ years in a chronically dysfunctional environment that never gets addressed.


8. Practical application: when is a new program an acceptable risk?

Let’s translate the statistics into actual decisions.

A new residency program is a reasonable risk when:

  • It is embedded in a large institution with multiple stable programs
  • Faculty are numerous and clearly protected for education
  • You can see realistic case volume and rotation plans, not just promises
  • There are already PGY‑2/PGY‑3 residents who can tell you the truth

Conversely, the numbers say you should be very cautious when:

  • The hospital has no prior GME track record
  • Faculty are stretched, newly hired, or clearly overwhelmed clinically
  • Leadership turns defensive or evasive when you ask pointed questions
  • The program is selling you on “we are building something together” more than on concrete data

You are not trying to drive your risk to zero. That is impossible. You are trying to avoid the right tail of the distribution, where a small but real probability of accreditation loss and chaotic restructuring lives.


[Image: Resident physician reviewing program accreditation documents]

9. The numbers, distilled

Strip away the complexity and the story looks like this:

  • New programs fail rarely but not never. For a resident, you might be looking at a low single‑digit cumulative percent risk of serious disruption over the length of your training if you choose a very new, marginal program.
  • The more established and well‑resourced the institution and specialty, the lower that risk becomes—often to well under 1% over the life of your residency.
  • Most “failures” are not sudden; they are preceded by visible warning signs—leadership churn, citation patterns, unhappy upper‑levels, thin faculty. Those are data points. Ignore them and you are playing roulette.

I will leave it at that.

Three things to remember:

  1. Loss of accreditation for new programs is uncommon, but early‑life instability (warnings, probation, leadership churn) is not.
  2. Program age, institutional backing, and specialty type are the strongest practical predictors of risk.
  3. If a new program cannot give you hard numbers—on faculty, volume, funding, and governance—you already have your answer.
