
Board Pass Rate Red Flags: How to Interpret 3‑Year Trends by Specialty

January 8, 2026
17 minute read

[Image: Residents reviewing board exam performance data on a conference room screen]

It is late January. You are on your 7th interview, sitting in a conference room while the PD flips through a PowerPoint about the program. One slide flashes past: “Board Pass Rate (3-year rolling average): 78%.” The PD does not pause. Nobody in the room reacts. But you caught it. And now you are wondering whether you just saw the single biggest red flag of the entire cycle.

Let me break this down specifically: board pass rates are not a soft metric. They are one of the most objective, hard-outcome signals of a program’s educational culture. But you cannot interpret them in a vacuum, and you absolutely cannot compare across specialties using the same numbers.

Most applicants do this wrong. They see “>90%” and relax, or “<80%” and panic. That is lazy thinking. You need to know:

  • What the national baseline is for that specialty.
  • Whether the program is trending up, flat, or down over at least 3 years.
  • Whether the program’s narrative about those numbers makes sense.

If you understand those three pieces, you will see red flags that half your co-applicants completely miss.


1. Step zero: know the baseline for your specialty

You cannot call something a red flag until you know what “normal” looks like in that field. A 92% 3-year pass rate may be excellent in one specialty and mediocre in another.

Here is a rough, realistic view of typical 3‑year board pass rate ranges for larger, reasonably functioning programs (not the rare unicorns, not the disasters):

Typical 3-Year Board Pass Rate Ranges by Specialty

Specialty            Typical Stable Range    Truly Concerning Range
Internal Medicine    92–98%                  <90%
General Surgery      85–95%                  <80%
Pediatrics           90–97%                  <88%
OB/GYN               85–93%                  <80%
Psychiatry           90–98%                  <88%

These are ballpark figures, but they match what many applicants see in ACGME/ABMS reports and program slides.
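
If you want to keep those baselines handy while comparing programs, here is a minimal Python sketch. The dictionary values are just the ballpark "truly concerning" cutoffs from the table above, and the function name `is_concerning` is my own illustrative choice, not an official tool:

```python
# Illustrative sketch only: the cutoffs below are the ballpark
# "truly concerning" thresholds from the table above, not official figures.
CONCERNING_CUTOFFS = {
    "Internal Medicine": 90,
    "General Surgery": 80,
    "Pediatrics": 88,
    "OB/GYN": 80,
    "Psychiatry": 88,
}

def is_concerning(specialty: str, three_year_rate: float) -> bool:
    """True if a program's 3-year pass rate falls below the specialty's rough cutoff."""
    return three_year_rate < CONCERNING_CUTOFFS[specialty]

print(is_concerning("Internal Medicine", 86))  # True: well below a healthy IM baseline
print(is_concerning("General Surgery", 86))    # False: inside the typical surgery range
```

The same 86% flips from "fine-ish" to "alarming" purely based on specialty, which is the whole point of step zero.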

So the first thing you should be asking yourself:

  • For this specialty, would a normal, healthy program be sitting above 90%? Above 95%?
  • Is this program consistently below that?

If the PD’s slide says “3‑year rolling ABIM pass rate 86%” and the national is mid‑90s, that is not a small difference. That is “multiple residents failing every year” territory.
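
To translate that gap into people rather than percentages, a quick back-of-envelope calculation helps. This is a minimal sketch: the class size of 30 is a hypothetical assumption, 94% stands in for a "mid-90s" national figure, and the helper function is mine:

```python
# Back-of-envelope: turn a pass-rate gap into failures per year.
# The class size of 30 is a hypothetical assumption; use the program's real cohort size.

def expected_failures_per_year(pass_rate: float, class_size: int) -> float:
    """Average number of first-time board failures per graduating class."""
    return class_size * (1 - pass_rate)

program_rate = 0.86    # the 86% on the PD's slide
national_rate = 0.94   # stand-in for a "mid-90s" national first-time rate
class_size = 30        # hypothetical IM cohort

print(expected_failures_per_year(program_rate, class_size))   # ~4.2 failures per year
print(expected_failures_per_year(national_rate, class_size))  # ~1.8 failures per year
```

Roughly four first-time failures every year versus one or two: that is what an eight-point gap actually means in people.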


2. Why 3-year trends matter more than any single year

A single cohort can get hammered by a particularly weird exam, major curriculum changes, or even a leadership transition. One bad year is a yellow flag. A bad trend is a red flag.

Three-year data smooths out a lot of that noise. Here is what to look for when programs show 3-year pass rates (most will, especially in IM, Peds, Psych, Surgery, and OB/GYN).

There are four basic patterns:

  1. Consistently high, stable (green)
    Example: 96% → 94% → 97%

  2. Low but improving (cautious yellow)
    Example: 82% → 88% → 92%

  3. High but declining (orange bordering red)
    Example: 98% → 92% → 85%

  4. Consistently low (solid red)
    Example: 78% → 80% → 79%

You care more about the direction and magnitude of change than you do about any single year.

To visualize the impact of trends, consider three fictional IM programs:

Three Internal Medicine Programs - 3-Year ABIM Pass Trends (pass rate, %)

Year      Program A (Stable High)   Program B (Improving)   Program C (Declining)
Year 1    96                        83                      97
Year 2    95                        88                      91
Year 3    97                        92                      84

Given only this, which one carries the strongest red flag? It is not B. It is C. A program that drops from 97% to mid‑80s in three cycles has something structurally wrong: leadership, culture, selection, or curriculum.
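
If it helps to see that triage as code, here is a minimal sketch that sorts the three fictional programs above into the four patterns listed earlier. The function name, the 90% "healthy floor," and the 5-point drop are my own illustrative choices, not official cutoffs, and the floor should be adjusted per specialty:

```python
# Minimal sketch: sort a 3-year trend into the four patterns described above.
# The 90% "healthy floor" and 5-point drop are illustrative judgment calls,
# not official cutoffs; adjust the floor per specialty.

def classify_trend(rates, healthy_floor=90):
    """rates: pass rates in chronological order, e.g. [97, 91, 84]."""
    latest = rates[-1]
    change = rates[-1] - rates[0]
    if min(rates) >= healthy_floor and change >= -3:
        return "green: consistently high, stable"
    if latest >= healthy_floor and change > 0:
        return "yellow: low but improving"
    if rates[0] >= healthy_floor and change <= -5:
        return "orange/red: high but declining"
    return "red: consistently low or declining"

programs = {
    "Program A": [96, 95, 97],
    "Program B": [83, 88, 92],
    "Program C": [97, 91, 84],
}
for name, rates in programs.items():
    print(name, "->", classify_trend(rates))
```

Program C is the only one flagged as declining, which matches the eyeball read: the direction and size of the change matter more than any single year.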


3. What low or falling pass rates usually mean in real life

Board failure is a downstream symptom. The upstream problems are almost always one (or several) of these:

3.1 Weak or chaotic didactics

You see this a lot in malignant or disorganized programs. Common patterns:

  • Didactics constantly canceled for “patient care needs.”
  • Noon conference is just pharma lunches or random case talks, zero structured board content.
  • No dedicated board review curriculum; residents are told to “just do UWorld on your own.”

Residents then fail, because they are trying to self-teach a board-level curriculum while doing 70‑hour weeks.

3.2 No culture of exam preparation

Some specialties have a strong exam prep culture (IM, Peds), others less so (some community psych, smaller OB/GYN). But even in chill programs, residents who pass have some structure.

Red flag behaviors:

  • PD says, “We do not like to ‘teach to the test’ here, we focus on clinical excellence,” and cannot show any structured board prep.
  • Senior residents tell you, “Honestly no one talks about the boards until PGY‑3 spring. Then everyone panics.”

That “we train clinicians, not test-takers” line is a classic rationalization from programs that do not want to admit they are underperforming.

3.3 Poor resident selection or remediation

If a program repeatedly matches residents with major academic struggles and then offers no real remediation, pass rates suffer.

Questions you should be asking (directly or indirectly):

  • Does the program have a genuine remediation process if a resident fails the in‑training exam (ITE) or boards?
  • Or do they just shrug and say, “Well, it is on you to fix it”?

I have seen programs where PGY‑3s who scored at the 10th percentile on the ITE got no structured help. A year later, half of them failed boards. The PD was “disappointed in the cohort.” Translation: leadership abdicated their responsibility.

3.4 Structural burnout and service overload

A resident who is chronically drowning in service has no bandwidth for serious board prep. Some patterns:

  • Extreme ward coverage ratios (e.g., 20–25 patients on IM with minimal support).
  • Endless scut: manual prior auths, transporting patients, drawing all labs at night, etc.
  • No real outpatient or reading time. Ever.

These programs then blame individual residents for failing the boards. But if the schedule is fundamentally hostile to study time, outcomes will reflect that.


4. Specialty-specific baselines and red flags

Each specialty has a different exam structure, culture, and baseline pass rate, so you need to adjust your red-flag radar accordingly.

4.1 Internal Medicine (ABIM)

National pass rates for ABIM have hovered roughly in the low‑ to mid‑90s for first-time takers, which is high. In IM, anything below 90% 3‑year rolling should make your eyebrows go up.

Rules of thumb for IM:

  • Green: 93–100% and stable or improving.
  • Yellow: 90–92%, stable. You need a narrative explanation.
  • Red: <90% 3‑year, or any downward trend that touches the 80s.

If you see 85–88% over 3 years at an IM program, residents are regularly failing. That is not “one bad year.”

4.2 General Surgery (ABS QE & CE)

Surgery is trickier. You have the Qualifying Exam (QE) and then the Certifying Exam (CE). Programs often show QE pass rates; CE data is harder to see.

  • National QE rates are usually somewhat lower than ABIM’s pass rates.
  • The specialty culture is also more variable about structured didactics vs “learn in the OR.”

Still:

  • Repeated QE 3‑year pass rates under ~80% are very concerning.
  • A drop from ~90% to low 80s in a few years is also a serious warning sign.

And if you ever find QE pass rates in the 60s or 70s over a 3-year window, that is a program-level educational failure.

4.3 Pediatrics

Peds programs tend to be strong on structured education and board prep. Many good peds programs proudly report >95% 3‑year pass rates.

So in peds:

  • Stable 90–95% = fine.
  • Below 90% 3‑year = strong yellow bordering on red. You need a very convincing explanation.

4.4 OB/GYN

OB/GYN certification is split between a written exam (basic knowledge) and an oral exam. Some programs provide data on one, some on both, and some obscure the numbers.

Given the complexity, you look for consistency:

  • Are written exam pass rates ≥85–90% for several years?
  • Are oral exam pass rates respectable and stable?

Recurring 3-year averages below ~80% on written boards in OB/GYN are a flashing red flag. There is less data available here, but the pattern still holds: sustained underperformance is bad.

4.5 Psychiatry

For a long time, psych boards were considered “chill.” That has shifted somewhat as the specialty has grown more competitive.

For psych, I would expect:

  • Stable 3‑year rates in low–mid 90s for most decent programs.
  • <88–90% 3‑year rate is a yellow/red signal, especially if trending downward.

5. Three-year patterns that should immediately trigger follow-up questions

Let me be concrete. These specific scenarios are red‑flag patterns you should not gloss over.

Pattern 1: “We used to be great, but…”

Example slide (IM):

  • 2019: 98%
  • 2020: 94%
  • 2021: 86%

If the PD does not proactively address this, you should.

Questions to ask, verbatim if you want:

  • “I noticed your 3‑year ABIM pass rate has declined. What do you think contributed to that trend?”
  • “What changes have you made in your curriculum or resident support to address it?”
  • “How did the leadership transition in [year, if mentioned] affect resident exam performance, if at all?”

If the answer is, “COVID hurt us,” that is fine once. If everyone else weathered COVID and they fell off a cliff, then no, COVID is not the whole story.

Pattern 2: “We are at 100% now” but the 3‑year average is low

Sometimes programs trumpet “100% pass rate last year!” but quietly show a 3‑year rate of 82–85%.

Translation: one good year, two bad ones.

You ask:

  • “I see last year’s cohort all passed, which is great. For the prior cohorts with lower pass rates, what did you identify as the key issues, and how have those been fixed long-term?”

You are testing whether this was just a lucky cohort or whether there was a systemic change (improved didactics, protected study time, in‑training exam-based remediation).
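
To see why one perfect year cannot rescue a weak 3-year average, the arithmetic is short. This sketch assumes equal cohort sizes and a simple unweighted average, and uses 83% as the midpoint of the 82–85% range mentioned above:

```python
# If the 3-year average is 83% (midpoint of 82-85%) and last year was 100%,
# what must the two earlier cohorts have averaged? Assumes equal cohort sizes
# and a simple unweighted average.
three_year_avg = 0.83
last_year = 1.00
prior_two_avg = (3 * three_year_avg - last_year) / 2
print(prior_two_avg)  # ~0.745, i.e. roughly 3 out of 4 residents passing in each earlier cohort
```

In other words, the two earlier cohorts averaged around a 75% pass rate, which is exactly the "two bad ones" the shiny 100% slide is glossing over.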

Pattern 3: Mid‑tier numbers and no data transparency

You will sometimes see:

  • “Our board pass rates are in line with national averages”
  • But no actual numbers on the website, and vague talk during interview day.

When a program is proud of their outcomes, they put the numbers on a slide. When they are not, they hide them behind phrases like “comparable to peers.”

If you cannot find real numbers after asking, assume they are below average.


6. How to cross‑check board pass claims

You are not limited to whatever the PD says in a 20‑minute talk.

Here is how to verify and contextualize what you are told.

6.1 Use ACGME and ABMS / board websites

Not all boards publish program‑level pass rates, but some do or at least show national rates by exam year. Even knowing the national first-time pass rate gives you something to compare to the program’s claimed 3‑year rate.

If the board data says national first‑time pass is 94% and the program tells you their 3‑year average is 86%, you now have context: they are clearly underperforming, not “about average.”

6.2 Talk to senior residents off-script

Candid questions that get real answers:

  • “Do people here feel well prepared for the boards?”
  • “How many people from your class failed the last exam?”
  • “What happens if someone fails the in‑training exam—does the program actually intervene?”

Listen for hesitation, euphemisms, or “we do our best but…” responses. Residents will often hint at problems even when they feel obligated to be polite.

6.3 Pay attention to how they talk about ITE scores

The in‑training exam is the early warning system for board failure. Programs that treat ITE seriously usually have better board outcomes.

Green flags:

  • Structured, program‑wide board review based on ITE results.
  • Formal reading assignments or question quotas after low scores.
  • Dedicated academic half‑days that do not get cannibalized by service.

Red flags:

  • PD shrugs: “ITE is just a practice test, we do not care about it too much.”
  • Residents tell you, “We see our scores once and that is it. No one follows up.”

7. Distinguishing a “fixable” yellow flag from a true red flag

Not all sub‑90% numbers should scare you equally. Context matters.

Here is a straightforward way to think about it:

Interpreting 3-Year Board Pass Patterns

Pattern Type           Example Trend     My Interpretation
Stable high            95 → 93 → 96      Healthy program
Low but improving      82 → 88 → 92      Cautious optimism; ask why
High then collapsing   97 → 91 → 84      Major red flag
Stable low             80 → 82 → 81      Chronic systemic failure

I tend to be more forgiving of “low but improving” when:

  • There is a clear, credible story. For example: “We changed PDs, cut service, built a board curriculum, and pass rates have risen each year since.”
  • Senior residents confirm the changes are actually happening.

I am not forgiving of:

  • Stable 80‑something pass rates with no convincing plan.
  • PDs who blame cohorts instead of owning system problems (“That class just did not study”).
  • Programs that treat board failures as individual moral failures.

8. Board pass rates in the broader context of program quality

One more nuance: High board pass rates do not automatically mean the program is a good fit or even a good environment. They just mean the educational infrastructure is functional.

You still have to cross‑reference:

  • Case volume and clinical intensity
  • Fellowship / job placement
  • Wellness and culture
  • Support for research or specific career goals

But low board pass rates, in contrast, are rarely isolated. They almost always travel with:

  • Disorganized didactics
  • High service burden
  • Weak mentorship
  • Poor communication and leadership churn

Which is why I treat them as a true red flag. If the easiest outcome metric a program has to care about—“Do our graduates get certified?”—is suffering, you can infer a lot about what daily life there looks like.


9. How this will evolve: future of board metrics as a red flag

Boards and training are both shifting:

  • USMLE Step 1 is now pass/fail. Programs are relying more heavily on ITE, Step 2 CK, and ultimately board results to justify their educational quality.
  • Some boards are changing exam format (e.g., phasing out certain oral components, restructuring written exams). That can cause temporary volatility.

But the long‑term trend is clear: objective outcomes will become even more important as other metrics (grades, Step 1, numeric clerkship scores) get watered down or removed.

I expect:

  • More transparency pressure from ACGME and applicants on program-level board data.
  • Stronger linkage between accreditation reviews and poor board outcomes.
  • Programs with chronically low pass rates getting hammered in recruitment and either reforming or shrinking.

So learning to interpret 3‑year board trends now is not just about this match year. It is about developing the skill to distinguish “good PR” from “good training” in a landscape where everyone will claim to be “well above national averages.”


Program Profiles by 3-Year Board Pass Category (approximate share of programs, %)

High-stable programs         65
Mid-range programs           25
Chronic low-pass programs    10

That rough breakdown—most programs fine, some mediocre, a minority truly bad—is why you cannot be lazy. The disastrous 10% do exist. They will show you nice pictures of brand-new call rooms while quietly graduating uncertified residents.


Board Pass Red Flag Decision Flow

  1. Look at the program's board data.
  2. Is the 3-year rate at or above the healthy range for the specialty?
    • Yes → likely OK; confirm the trend is stable.
    • No → is the trend improving?
      • Improving → yellow flag; ask why and how.
      • Not improving → red flag; avoid unless the fit is exceptional.

FAQ

1. What 3‑year board pass rate is an automatic deal‑breaker?
If a program’s 3‑year rate is consistently below 80% in IM, Peds, Psych, or OB/GYN, or below mid‑70s in General Surgery, I consider that an automatic “do not rank unless there is an extremely unusual reason.” That level of failure reflects deep systemic problems that will not magically fix themselves during your residency.

2. How much weight should I give board pass rates compared to fellowship match lists?
They measure different things. Board pass = minimum educational competence and program support. Fellowship match = reputation, networking, and resident initiative. I would not join a program with great fellowship outcomes but chronically weak board pass rates. If they cannot reliably get people certified, fancy match lists are lipstick on a pig.

3. Is one bad year in board pass rates a reason not to rank a program?
Not automatically. A single bad year (e.g., 96% → 84% → 95%) can happen with small cohorts or a weird exam year. You care about 3‑year trends. If the program can clearly explain one anomalous year and their 3‑year rate is otherwise healthy, I would not penalize them heavily.

4. What if a program refuses to share specific board pass numbers?
That is a major red flag. Strong programs are proud of their outcomes and do not hide them. If leadership dodges or gives only vague statements like “comparable to national averages” without numbers, assume the real data is worse than they want to admit. You have other options.

5. How do small programs affect interpretation of 3‑year trends?
In a small program (say, 3–4 residents per year), a single failure swings percentages dramatically. One person failing in a 4‑resident class = 75%. That is why you should pay even more attention to the narrative: Do they have structured remediation? Was this a one‑off? Are later cohorts doing better? Small numbers mean you cannot overreact to a single year, but a three‑year pattern of multiple failures is still concerning.
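
As a rough illustration of that noise, here is how much a single failure moves a single-year rate versus a pooled 3-year rate; the cohort sizes are hypothetical:

```python
# How much does one failure move the needle at different cohort sizes?
# Cohort sizes are hypothetical; the pooled figure treats three years as one group.
for class_size in (4, 8, 30):
    single_year = (class_size - 1) / class_size            # one failure in one cohort
    pooled_3yr = (3 * class_size - 1) / (3 * class_size)   # the same single failure, 3 years pooled
    print(f"{class_size} residents/yr: one failure -> {single_year:.0%} that year, "
          f"{pooled_3yr:.0%} across a pooled 3-year window")
```

A single failure barely dents a pooled 3-year rate, which is why repeated failures across a 3-year window in a small program are a real signal rather than noise.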

6. If I am a strong test‑taker, can I ignore low program board pass rates?
You are overestimating your control. Board prep during residency is not just about personal discipline; it is about schedule, culture, and structure. A bad program can grind down even excellent test‑takers with relentless service and zero educational support. You might pass anyway, but your life will be significantly harder, and your peers will suffer. There is no prize for surviving a dysfunctional program on “hard mode.”


Key points:

  1. Always interpret board pass rates against specialty-specific national baselines and 3‑year trends, not single-year numbers.
  2. Consistently low or declining 3‑year pass rates are not cosmetic flaws; they signal deep educational and cultural problems.
  3. Ask direct questions, cross‑check what you hear, and treat poor board outcomes as a genuine red flag, not a minor footnote.

