
It’s mid‑June. ERAS isn’t open yet, your personal statement is half‑baked, and you’ve got 25+ residency programs bookmarked with names like “maybe?” and “idk but big city.”
This is the moment people waste. They scroll glossy program websites, watch the same polished “resident life” video 10 times, and convince themselves everything looks fine. Then in January they’re on a night float thinking, “How did I miss that this program has zero in‑house fellowship placement and a 60% board pass rate?”
You’re not doing that.
This summer is for one thing: systematically hunting red flags before you spend money and application slots. Online only. No phone calls, no emails. Just publicly available info and a structured, time‑based checklist.
Let’s walk through what you should do:
- Month by month (June–September)
- Week by week inside June/July (heavy research window)
- Then a tight “day‑of” checklist for each program you’re evaluating
June: Build Your Research System Before You Judge Programs
At this point you should not be obsessing over individual programs. You should be building the system you’ll use to judge them. If you skip this, you’ll end up with 40 random screenshots and no idea what actually matters.
Week 1 (Early June): Set Up Your Tracking Tools
You need a standardized way to compare programs. Otherwise the shiny websites will win.
Do this:
Create a master spreadsheet
Tabs or columns for:
- Program name
- City / region
- Size (number of residents per year)
- Type (university, university‑affiliated community, community)
- Program director name + tenure
- Core faculty count
- Recent ACGME status (continued, initial, warning, etc.)
- Board pass rate (3–5 year if available)
- Fellowship match / job outcomes
- Call schedule info
- Resident satisfaction clues (toxic vs normal complaints)
- Major red flags (yes/no + notes)
- “Apply / Maybe / Avoid” decision
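If you'd rather bootstrap the sheet programmatically than click through Sheets menus, here's a minimal sketch using Python's csv module. The column names mirror the list above; the filename and function name are my own, not a standard tool:

```python
import csv

# Columns mirror the tracking fields listed above.
COLUMNS = [
    "Program name", "City / region", "Residents per year", "Type",
    "PD name + tenure", "Core faculty count", "ACGME status",
    "Board pass rate (3-5 yr)", "Fellowship/job outcomes",
    "Call schedule", "Resident satisfaction clues",
    "Major red flags (y/n + notes)", "Apply / Maybe / Avoid",
]

def init_tracker(path: str = "program_tracker.csv") -> None:
    """Create the master spreadsheet with a header row only."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerow(COLUMNS)

init_tracker()
```

Open the resulting CSV in Excel or Google Sheets and you have your standardized comparison grid from day one.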
Pre‑define your red flags vs yellow flags
Do not invent criteria on the fly. Decide now. Example:
Major red flags (usually enough to not apply):
- ACGME warning/probation, recent accreditation loss
- Board pass rate consistently <80% for core boards
- Chronic unfilled spots in NRMP (not just one weird year)
- Multiple residents leaving mid‑year or frequent PGY‑2+ openings
- Documented harassment / discrimination problems with no visible response
- No evidence of graduates passing boards or getting solid jobs/fellowships
Yellow flags (apply with caution, ask about at interviews):
- Rapid faculty turnover, especially PD/APD changing every 1–2 years
- Very vague or missing info on call schedule and didactics
- No published list of current residents or alumni
- Consistent complaints about malignant culture or lack of support
- Unrealistic service load with poor educational structure
Your goal this summer: filter out obvious red‑flag programs and tag the yellow‑flag ones for targeted questions later.
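The filter above reduces to a simple rule: any major red flag disqualifies, yellow flags become interview questions, and a clean sheet means apply. A minimal sketch of that triage logic (function name and the "several yellows equal one red" threshold are my own framing of the criteria above):

```python
def triage(reds: int, yellows: int, yellow_cap: int = 3) -> str:
    """One major red flag is usually enough to skip a program,
    and several yellow flags together can add up to the same;
    otherwise, yellows get tagged as targeted interview questions."""
    if reds >= 1 or yellows >= yellow_cap:
        return "Avoid"
    if yellows >= 1:
        return "Maybe (ask at interview)"
    return "Apply"

# Example: an ACGME warning counts as one major red flag.
print(triage(reds=1, yellows=0))  # Avoid
```

The point isn't the code; it's that you commit to the rule in June so a glossy website can't talk you out of it in July.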
Set your personal priorities
These are not red flags; they’re your deal‑breakers. Examples:
- Must be in X region because of partner/family
- Need strong fellowship placement (e.g., cards, GI, heme/onc)
- Cannot tolerate Q3 28‑hour call
- Need decent parental leave policy
Put these in the top row of your sheet as reminders. You’ll forget them when a big‑name place dazzles you.
Week 2 (Mid‑June): Build Your Initial Program List
Now you can start collecting names, but still not deep‑diving. At this point you should be widening the net and tagging programs as “worth investigating.”
Sources to use:
- FREIDA / AMA residency search
- Specialty organization lists (e.g., ACOG, ACGME lists, ABIM programs)
- Your med school advisors and recent grads
- NRMP “Charting Outcomes” and “Program Director Survey” (to calibrate competitiveness vs your stats)
Task for the week:
- Get to 1.5–2x the number of programs you think you’ll apply to.
If you think you’ll ultimately apply to 40, you want 60–80 on your “candidate” list now.
You’re stocking the pond. You’ll cut later.
Week 3–4 (Late June): Learn to Read Between the Lines
At this point you should practice your “online red flag radar” on a few sample programs before you tackle the full list.
Pick 5–8 programs (mix of known strong, mid, and questionable) and do a full online review using the checklist I’ll give you later. As you go, refine your spreadsheet columns. Add things you realize matter:
- Moonlighting options
- Night float vs 24‑hr call
- Research expectations vs actual output
- % of residents who are IMGs vs US grads (for some specialties this matters for visa/mentorship patterns)
Do not over‑interpret yet. This is practice to speed up your process in July.
July: Systematic Online Vetting of Every Program
July is where the real work happens. ERAS isn’t open yet, your sub‑I might be starting, but you can still carve out 30–60 minutes most days.
At this point you should be moving from “I’ve heard of them” to “Apply / Maybe / Avoid” based on public data.
Week 1 (Early July): First‑Pass Screen for Obvious Red Flags
You are not doing deep research on every program. First pass is fast and ruthless: 5–10 minutes per program.
For each program on your candidate list:
Check ACGME accreditation status
- Google: "ACGME [Specialty] programs [state]" or use the ACGME public search.
- Look for:
  - "Continued Accreditation" vs "Initial" vs "Warning/Probation"
  - Any note about recent actions.
Immediate red flag: warning or probation, especially if recent and not clearly resolved.
Scan NRMP / SOAP outcomes (if available)
- Google: "[Program name] NRMP outcomes" or check whether they brag about a "100% fill rate."
- For programs that consistently go into SOAP or leave PGY‑1 spots open, mark as:
  - Yellow flag: a single recent SOAP year in a turbulent cycle (e.g., pandemic chaos).
  - Red flag: a pattern of multiple SOAP years or annual open PGY‑2+ spots.
Check for catastrophic news
- Quick Google: "[Program name] residency lawsuit," "[Hospital name] resident fired," "[Hospital name] discrimination investigation"
- You're not punishing one old incident with clear remediation. You're looking for:
  - Multiple events
  - No visible institutional response
  - Unresolved lawsuits or recent major scandals
Filter your list
- Remove clear “no” programs.
- Mark “High‑risk – needs deeper look” for anything that feels off but not fully disqualifying.
Goal by end of Week 1: Your list should be trimmed by ~20–30% from glaring issues alone.
Week 2–3 (Mid‑July): Deep Website and Public Info Audit
Now you slow down. 20–30 minutes per serious program. Here’s the per‑program checklist you run through.
1. Program Website Basics
Things that should exist in 2024+:
- Current list of residents with photos/names
- Faculty list with at least some academic interests
- Rotation schedule by year
- Call structure and duty hours summary
- Didactics schedule (even if not super detailed)
- Recent class stats or match list (for IM, peds, etc.)
Red flags:
- Website clearly outdated (>2 years old, old PD name, old graduation year lists)
- No resident list or only first names
- Rotations described only in vague marketing language, no structure
This usually tells you one of two things:
- They don’t care about recruitment.
- They’re hiding chaos or high turnover.
2. Leadership Stability
You care about who runs the place and how often they change.
Look for:
- PD name and start year
- APD(s) and their roles
- Program coordinator (title alone can tell you something; when coordinators are constantly new, residents suffer)
Yellow flag pattern:
- PD changed 2–3 times in the last 5 years
- APDs frequently "stepping down" or "moving on" with no clear explanation
Red flag pattern:
- No clear PD listed
- PD listed but you can’t find them on hospital site or LinkedIn
- ACGME notes “new program director” with recent probation
3. Board Pass Rates and Outcomes
This is not optional. You’re training for a specialty board exam.
Check:
- Website “resident achievements” or “board pass rates”
- ABIM, ABFM, other board reports when public
| 3-Year Pass Rate | Interpretation | Action |
|---|---|---|
| ≥ 95% | Strong | No concern |
| 90–94% | Acceptable | Fine if other factors good |
| 80–89% | Worrisome | Yellow flag; dig deeper |
| < 80% | Major issue | Usually avoid |
If they don’t list their pass rates anywhere, that alone isn’t a red flag, but combine it with other issues and I start to worry.
Also look at:
- Fellowship match lists (for specialties where this matters)
- For community programs: are graduates getting solid jobs in decent locations or drifting to urgent care because they’re under‑prepared?
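The pass-rate table maps cleanly to code if you want your spreadsheet to auto-flag programs. A sketch with the cutoffs taken directly from the table above (the function name is my own):

```python
def interpret_pass_rate(rate: float) -> tuple[str, str]:
    """Map a 3-year board pass rate (%) to the interpretation and
    action buckets from the table: >=95 strong, 90-94 acceptable,
    80-89 worrisome, <80 major issue."""
    if rate >= 95:
        return ("Strong", "No concern")
    if rate >= 90:
        return ("Acceptable", "Fine if other factors good")
    if rate >= 80:
        return ("Worrisome", "Yellow flag; dig deeper")
    return ("Major issue", "Usually avoid")
```

Drop this into a helper script or replicate it as a nested IF in your spreadsheet; either way, the thresholds stay fixed instead of drifting when you like a program.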
Week 3–4 (Late July): Resident Culture and Hidden Signals
At this point you should shift from structural data to culture clues.
1. Social Media Recon
Look up:
- Program Instagram / Twitter / LinkedIn
- Resident‑run accounts
- Hospital GME accounts
Patterns to pay attention to:
- Only PD‑centric, PR‑style posts vs residents actually shown learning and existing as humans
- Residents who look exhausted or are chronically under‑represented at events
- Repeated “we’re so grateful for our residents working extra this weekend!” posts — occasional is fine; constant suggests chronic understaffing
2. Third‑Party Reviews (Take With Salt, But Use Them)
You know the sites:
- Reddit r/Residency, r/medicalschool
- Student Doctor Network threads
- Doximity (for general ranking patterns, not gospel truth)
- Specialty‑specific forums or Discords
You’re not looking for:
- One bitter comment from 2017
You’re looking for:
- Multiple independent threads saying the same thing: “toxic leadership,” “scut‑heavy with no teaching,” “multiple residents left mid‑year.”
- Threads where someone says "PM me for details," followed by repeated replies like "Yeah, I rotated there… it was bad."
When you see consistent negative themes from different years, that’s a red flag. Not gossip. Pattern recognition.
August: Refine, Compare, and Sense Check
By August, ERAS is about to open. At this point you should have:
- Your trimmed list of target programs
- Per‑program notes with flagged issues
- A sense of which programs are “safe,” “cautious,” and “absolutely not”
Now you move from isolated data points to comparative judgment.
Build a Quick Comparison View
Create a side‑by‑side view of a few top choices vs borderline ones.
| Factor | Program A | Program B | Program C |
|---|---|---|---|
| ACGME Status | Continued | Warning | Continued |
| 3-yr Board Pass % | 96% | 82% | Not listed |
| PD Tenure (years) | 7 | 1 | 3 (new PD) |
| NRMP Fill Pattern | Full | SOAP x2 | Full |
| Resident Turnover | Stable | Multiple mid‑year exits | Unknown |
If you see:
- Low board pass + SOAP + warning status + mid‑year exits → that’s a hard avoid.
- One weak area (e.g., new PD) with otherwise strong metrics → yellow, not red.
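If your notes live in a spreadsheet or script, the side-by-side view is a one-liner away. A throwaway sketch that renders a comparison table like the one above from plain dicts (the program data here is the example from the table, not real programs):

```python
# Factors and example values mirror the comparison table above.
FACTORS = ["ACGME Status", "3-yr Board Pass %", "PD Tenure (years)",
           "NRMP Fill Pattern", "Resident Turnover"]

programs = {
    "Program A": ["Continued", "96%", "7", "Full", "Stable"],
    "Program B": ["Warning", "82%", "1", "SOAP x2", "Mid-year exits"],
}

def compare(programs: dict[str, list[str]]) -> str:
    """Render a factor-by-program comparison as aligned plain text."""
    header = "Factor".ljust(20) + "".join(n.ljust(20) for n in programs)
    rows = [header]
    for i, factor in enumerate(FACTORS):
        rows.append(factor.ljust(20) +
                    "".join(vals[i].ljust(20) for vals in programs.values()))
    return "\n".join(rows)

print(compare(programs))
```

Seeing "Warning / 82% / SOAP x2" lined up against "Continued / 96% / Full" makes the Avoid call much harder to rationalize away.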
Future‑Facing Red Flags: Where Is This Program Going?
So far you've been judging a snapshot. Now judge the trajectory.
Look for:
- Hospital/health system financial health
- News about closures, heavy layoffs, service line cuts
- Planned mergers that often screw GME for a few chaotic years
- Specialty‑specific vulnerability
- For EM: Are they downsizing due to oversupply? New grads struggling?
- For OB: Are L&D units being consolidated or shut down?
- Technology and modernization
- Still using dinosaur EMR with no plan to upgrade
- No mention of quality improvement, data literacy, or adapting to new guidelines
- Zero telehealth exposure where it’s clearly relevant
You’re training for future medicine. A program that’s already behind and showing no sign of evolving is risky.
September: Final Filter and “Apply / Maybe / Avoid” Decisions
It’s almost application time. At this point you should be closing loops, not collecting new programs.
Week 1: Resolve Your Biggest Question Marks
For 5–10 borderline programs, do a second‑pass online review:
- Re‑check for any new ACGME actions
- See if the website got a big update (often happens late summer)
- Quick search for new news articles or scandals
If something still feels off and you can’t find reassuring data? Put it in “Avoid” or “Maybe but low rank if interviewed.” Do not pretend “no information” equals “good information.”
Your Per‑Program “Day‑Of” Online Checklist
Here’s the tight, reusable checklist you can run for any program in 15–20 minutes once you’re in the groove:
Basic Info
- Confirm ACGME status
- Note PD/APD names and tenure
- Program size and type
Training Quality
- Rotation schedule present and coherent?
- Board pass rate ≥ 90%?
- Any obvious service‑over‑education vibe?
Outcomes
- Fellowship and job placement in the last 3–5 years?
- Any alumni lists that just stop after 2018?
Stability
- Any NRMP SOAP patterns?
- Any frequent mid‑year vacancies?
- Recent big hospital or department upheavals?
Culture
- Resident social media: exhausted and bitter vs normally tired but connected?
- Repeated online complaints about malignant leadership?
- Diversity and inclusion signals — performative or real?
Future Readiness
- Tech, QI, and innovation presence?
- Adapting to specialty trends or pretending it’s 2003?
Mark each:
- Green (no concern)
- Yellow (ask on interview)
- Red (strong reason not to apply)
Rough time allocation across the summer (% of total research effort):

| Phase | % of Effort |
|---|---|
| Initial list building | 20 |
| Red flag screening | 30 |
| Deep dives | 35 |
| Comparative review | 15 |
Summary timeline:

| Period | Task |
|---|---|
| June - Week 1 | Build spreadsheet and criteria |
| June - Week 2 | Create broad program list |
| June - Week 3-4 | Practice deep dives on sample programs |
| July - Week 1 | Fast screen for obvious red flags |
| July - Week 2-3 | Full website and outcomes audit |
| July - Week 4 | Culture and social media review |
| August - Early Aug | Compare programs side by side |
| August - Late Aug | Evaluate future trajectory |
| September - Week 1 | Recheck borderline programs |
| September - Week 2 | Finalize Apply/Maybe/Avoid list |

FAQ
1. Should I email residents or PDs before ERAS if I see a possible red flag?
Usually no. Pre‑ERAS cold emails about “I saw you had probation in 2021, what happened?” are awkward and rarely helpful. Use public data now. Save pointed questions for interview day or post‑interview communication, when you have more context and they’ve already expressed interest in you.
2. How many red flags make a program not worth applying to?
One major red flag (probation, awful board pass rate, repeated SOAP, multiple mid‑year exits) is often enough to skip, especially if you have reasonable options elsewhere. Several moderate yellow flags together can equal one big red flag. When in doubt, compare against your safer programs side‑by‑side. If you’d never choose it over your mid‑tier backups, why spend the money?
3. What if every program in my target region has some issues?
That happens. Then you sort by least bad + best fit. Maybe the city you want only has community programs with weaker research but solid board pass rates and decent culture. Fine. Prioritize:
- Accreditation and training quality
- Boards + outcomes
- Culture
Take location as a separate, conscious trade‑off, not an excuse to ignore serious problems.
4. Do “bad vibes” online matter if I can’t find hard data?
They matter enough to move a program from “Apply” to “Maybe” or “Low enthusiasm.” If your gut keeps nagging after you’ve looked at objective metrics, rank it lower if you end up interviewing there. But don’t burn all options on vibes alone. Use them as a tiebreaker, not your only tool.
Key points:
- Use June to build a system; use July–August to systematically run every program through your red‑flag filter.
- Major red flags = probation, terrible board pass rates, chronic SOAP, and repeated resident exits. Stop pretending those are “one‑offs.”
- Snapshot matters, but trajectory matters more. You’re training for the medicine you’ll practice five years from now, not the one in their glossy recruitment video today.