
Exactly How to Analyze Faculty CVs to Judge a New Residency’s Depth

January 8, 2026
16 minute read


The worst way to judge a new residency is by its glossy brochure. The best way is by its faculty CVs.

If you learn to read those CVs like an insider, you will see—very quickly—whether a shiny “new” program has real depth or is running on fumes.

You are not just asking “Do they have faculty?” You are asking:

  • Do they have enough experienced people to train me?
  • Can they honestly support research, fellowships, and career growth?
  • Is this program a temporary stopgap for the hospital or a serious long-term investment?

(See also: How to Vet a New Residency Program When There Are No Graduates Yet for a broader checklist.)

Let me walk you through a structured, ruthless way to answer those questions using only faculty CVs and a bit of discipline.


Step 1: Build a Clean Faculty Snapshot Before You Judge Anything

You cannot evaluate depth if you are staring at 30 separate PDFs in chaos.

Here is the first pass. Fast, mechanical, and non-negotiable.

  1. List all core faculty
    From the website or ACGME data:

    • Program Director (PD)
    • Associate PDs (APDs)
    • Core faculty (anyone explicitly listed as “core”)
    • Key subspecialists you know you care about (e.g., cardiology, critical care, EM ultrasound, endocrine, etc.)
  2. Create a simple grid (Excel, Notion, Google Sheets, whatever)
    For each faculty member, capture:

    • Name
    • Degree(s) (MD/DO/PhD/MPH/other)
    • Rank (Assistant/Associate/Professor or equivalent)
    • Years since residency completion
    • Board certification(s)
    • Primary site (main hospital vs off-site/community)
    • Major roles (PD/APD, clerkship director, research director, QI lead)
    • Last 5 years of publications (count only, for now)
    • Grant involvement (Y/N)
    • Fellowship training (Y/N + field)
Core Data Points to Extract from Faculty CVs:

  • Years since residency: experience depth
  • Faculty rank: academic maturity and promotion history
  • Recent publications: active scholarship vs. a stale CV
  • Grants (PI or co-PI): real research infrastructure
  • Leadership roles: program stability and direction
  3. Speed rules for this pass

    • Spend no more than 2–3 minutes per CV on this first pass.
    • Do not read line-by-line. You are scanning headings: Education, Training, Academic Appointments, Publications, Grants, Teaching, Service.

Once this grid exists, you are not guessing anymore. You can see patterns. And patterns are everything.
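
If you would rather start from a file than a blank spreadsheet, here is a minimal Python sketch of that grid written out as a CSV you can open in Excel or Google Sheets. Everything in it is illustrative: the column names, the faculty_grid.csv filename, and the single placeholder row are assumptions you should adapt to your own grid.

    import csv

    # Columns for the faculty snapshot grid described in Step 1.
    COLUMNS = [
        "name", "degrees", "rank", "years_since_residency",
        "board_certifications", "primary_site", "major_roles",
        "pubs_last_5_years", "grants", "fellowship",
    ]

    # One hypothetical row; replace with real data pulled from each CV.
    rows = [
        {
            "name": "Faculty A",
            "degrees": "MD",
            "rank": "Assistant Professor",
            "years_since_residency": 4,
            "board_certifications": "Internal Medicine",
            "primary_site": "Main hospital",
            "major_roles": "APD",
            "pubs_last_5_years": 2,
            "grants": "N",
            "fellowship": "N",
        },
    ]

    with open("faculty_grid.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)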


Step 2: Check Experience Depth – Is Anyone Actually Senior?

Here is the blunt truth: a brand-new program with almost all faculty ≤5–7 years out of training is a risk. Not automatically bad. But risky for depth, stability, and mentorship.

Focus on this question first:

How many truly senior people are here?

Look at:

  • Residency completion year
    Subtract from the current year. That is years in practice.

    • 0–5 years: junior
    • 6–10 years: mid-career
    • 11+ years: senior
  • Academic rank (if any)

    • Instructor: very junior or mostly clinical
    • Assistant Professor: early-stage academic
    • Associate Professor: mid-career, usually with a track record
    • Full Professor: senior, high institutional capital

What you want to see in a solid new program:

  • PD: ideally ≥10 years out of residency or at least ≥5 with strong leadership/education record
  • At least 3–5 core faculty who are ≥7–10 years out
  • Some mix of junior faculty is fine. All junior, not fine.

Red flags I have seen on too many “new” programs:

  • PD and APDs all ≤5 years out of training; no one at Associate Professor rank or higher
  • CVs show serial job-hopping every 1–2 years
  • Many “clinical instructor” titles with no promotions, despite 7+ years out

If your grid shows that 70–80% of core faculty are ≤5 years out and only 1–2 people look genuinely senior, you are looking at a shallow bench.
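
If you built the grid as a CSV (as in the Step 1 sketch), a few lines of Python will tally how junior the bench really is. The cutoffs below simply mirror the buckets above, and the years_since_residency column name is the hypothetical one from that earlier sketch.

    import csv
    from collections import Counter

    def experience_bucket(years: int) -> str:
        # Buckets from Step 2: 0-5 junior, 6-10 mid-career, 11+ senior.
        if years <= 5:
            return "junior"
        if years <= 10:
            return "mid-career"
        return "senior"

    with open("faculty_grid.csv") as f:
        rows = list(csv.DictReader(f))

    counts = Counter(experience_bucket(int(r["years_since_residency"])) for r in rows)
    total = sum(counts.values())
    for label in ("junior", "mid-career", "senior"):
        n = counts.get(label, 0)
        print(f"{label}: {n} of {total} ({n / total:.0%})")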


Step 3: Map Clinical Breadth – Do They Cover What You Need Trained In?

Depth is not just about years; it is about coverage.

(Related: How to Use Site Visits and Shadow Days to Evaluate New Residencies)

You want to know: can these people cover the core clinical areas of your specialty at a level that will let you graduate confident?

Scan the CVs for:

  • Fellowship training (field + institution)
  • Prior roles:
    • MICU director, ED director, trauma director
    • Service chiefs, section heads
    • “Director of X rotation”

Now, cross-reference with the program’s claimed strengths. For example, in Internal Medicine:

  • If they are bragging about strong ICU training:

    • How many faculty are Critical Care trained?
    • How many of them practice at the main teaching site full-time?
    • Are they teaching-focused (APD, ICU rotation director) or mostly service-only?
  • If they sell themselves as “academic”:

    • How many faculty have prior university appointments?
    • Any affiliations with well-known institutions (e.g., trained/faculty at major academic centers)?

You should be able to roughly fill in: “Who are my likely role models/teachers for…”

  • Procedures (lines, ultrasound, bronch, etc.)
  • Outpatient continuity
  • Subspecialty exposures you care about
  • Fellowship-typical paths (cards, GI, pulm, EM subspecialties, etc.)

If there is a giant hole—like zero faculty with credible experience in the area you think you want to pursue—be realistic. “We plan to build that” is not the same as “we have it.”


Step 4: Dissect the Publication Section – Is This Real Scholarship or Decoration?

New programs almost always talk a big game about “robust research opportunities.” The CVs will show whether that is fantasy.

You are looking for:

  1. Recent, peer-reviewed publications
    Ignore abstracts and posters for the first pass. Count actual articles.

    • How many original research or review articles in the last 5 years?
    • How many faculty with ≥3 papers in that window?
    • Are residents or students among the co-authors?
  2. Type and venue of publication

    • PubMed-indexed journals vs. obscure or predatory journals
    • Case reports are fine, but if 90% of output is case reports, there is no real research infrastructure
  3. Authorship patterns

    • Is the PD or a few senior faculty consistently senior author on multi-resident projects?
    • Do you see a line like “Resident X, MD” repeatedly? That means there is a functioning pipeline.

Here is a structured way to score research depth quickly:

Hypothetical Faculty Scholarship Profile (peer-reviewed papers in the last 5 years):

  • Faculty A: 12
  • Faculty B: 5
  • Faculty C: 0
  • Faculty D: 3
  • Faculty E: 1

  • Faculty A: prolific, likely research anchor
  • Faculty B, D: moderate activity
  • Faculty C, E: minimal or none

If your overall picture is:

  • 1–2 people with real research records
  • Everyone else with 0–1 papers, mostly case reports
  • No resident co-authors visible

Then do not believe “you can do lots of research here” without heavy self-direction and luck.
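
If you kept a publication count in your grid, a small sketch like this (same hypothetical CSV and column names as before; the cutoffs are rough working labels, not an official standard) can flag who your realistic research anchors are:

    import csv

    with open("faculty_grid.csv") as f:
        rows = list(csv.DictReader(f))

    # Rough cutoffs, not gospel: 3+ recent peer-reviewed papers suggests an
    # active author; 6+ suggests a likely research anchor for the program.
    for r in rows:
        pubs = int(r["pubs_last_5_years"])
        if pubs >= 6:
            label = "likely research anchor"
        elif pubs >= 3:
            label = "actively publishing"
        else:
            label = "minimal recent output"
        print(f'{r["name"]}: {pubs} recent papers -> {label}')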


Step 5: Look for Grants and Funded Projects – The True Research Signal

Publications tell you what has been completed. Grants tell you whether there is a running engine.

On each CV, look for:

  • NIH, foundation, or major institutional grants

    • Role: PI (principal investigator), co-PI, co-investigator
    • Active vs completed
  • Industry-sponsored trials

    • Clinical trials where faculty are site-PI or co-investigator

Here is the simple rule:

  • Any NIH or equivalent competitive grant = a serious research anchor
  • Multiple active industry or foundation grants across several faculty = the program can actually support research

If you see zero grants across all CVs, then research exists only as unfunded, opportunistic projects. That may be fine for a couple of case reports. It is not fine if you are banking on serious research or academic career development.


Step 6: Evaluate Education and Teaching Credibility – Can They Actually Train You?

You are not just joining a group of clinicians. You are joining a training environment.

On each CV, search for:

  • Formal education roles:
    • Program Director, Associate Program Director
    • Clerkship Director, Site Director
    • “Director of Medical Education”
  • Teaching awards:
    • “Outstanding Teacher of the Year”
    • Society awards for education
  • Curriculum development:
    • Designed new rotation, simulation curriculum, OSCEs, boot camps

This is where depth matters a lot in new programs.

Healthy pattern:

  • PD/APDs with a history of prior teaching roles at other programs

  • Multiple faculty with:

    • Documented teaching awards, or
    • Sustained curriculum roles (>3 years)

Thin pattern:

  • PD is a first-time PD with no prior APD or major education leadership
  • No one has teaching awards or significant educational roles beyond “precepts residents”

If the CVs do not show protected time, titles, and recognized teaching work, then all the “we care about education” talk is just that—talk.


Step 7: Trace Career Stability and Institutional Commitment

This part is subtle but crucial. A deep program does not constantly churn faculty. A fragile one does.

From the CVs, pay attention to:

  • Length of stay at each job

    • Many short (1–2 year) stints at multiple hospitals: instability.
    • Long runs (5+ years) with promotions: stability.
  • Timing of joining the new institution
    Sort your grid by “year joined current institution.” If:

    • 70–80% of the faculty arrived in the last 1–2 years
    • There are few or no long-standing physicians with 8–10+ years at that hospital

Then the hospital may be scrambling to meet ACGME requirements quickly rather than building something deliberate.

You want some people who:

  • Know the hospital politics
  • Have institutional memory
  • Can protect residents if there is institutional stupidity (there will be, somewhere, at some point)

If everyone is new, no one has that leverage yet.
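
To make the "how new is everyone" check concrete, here is a short sketch that assumes you added a hypothetical year_joined column to the Step 1 grid and then computes the share of faculty who arrived in the last two years:

    import csv
    from datetime import date

    CURRENT_YEAR = date.today().year

    with open("faculty_grid.csv") as f:
        rows = list(csv.DictReader(f))

    # Assumes you added a "year_joined" column (the year each faculty member
    # joined the current institution) to the grid from Step 1.
    recent = [r for r in rows if CURRENT_YEAR - int(r["year_joined"]) <= 2]
    share = len(recent) / len(rows)
    print(f"Arrived in the last 2 years: {len(recent)} of {len(rows)} ({share:.0%})")
    if share >= 0.7:
        print("Most of the bench is brand new to this hospital.")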


Step 8: Look for True Subspecialty Depth vs Token Presence

New programs sometimes “solve” subspecialty coverage by appointing one person to each field, often part-time.

On the CVs of subspecialists, check:

  • Do they have full-time academic appointments at your teaching hospital or are they mostly private practice with a courtesy title?
  • Do they have a track record of:
    • Fellowship involvement elsewhere
    • Subspecialty teaching roles
    • Subspecialty publications

If the only cardiologist listed as core faculty:

  • Finished fellowship 1 year ago
  • Has no publications
  • Works half the time at an outside hospital

Expect minimal true cardiology mentorship.

You want subspecialty “clusters” where possible:

  • 2–3 people in a subspecialty, at least one mid-career or senior
  • Evidence on CVs of past fellowship training programs or structured subspecialty teaching

Without that, the program will likely give you surface exposure, not depth.


Step 9: Cross-Check for Resident Involvement – Are Trainees Already Embedded in Their Work?

This is the fastest way to sense whether faculty see residents as partners or just labor.

In publication sections, grant sections, and presentations, look specifically for:

  • Names identified as “resident,” “fellow,” or “medical student”
  • Phrases like “mentor,” “supervised,” “resident project”

You are asking:

  • Are residents showing up as co-authors?
  • Are they presenting at conferences with faculty?
  • Do any CVs list formal mentorship roles (resident research director, scholarship mentor)?

If you see faculty with strong records but almost no trainee names attached, that can go either way:

  • They are new to having residents
  • Or they do not naturally include trainees in their scholarly work

In a brand-new residency, you may not find much of this yet. That is fine, but then you should lean even more heavily on their prior training environments:

  • Did they previously mentor residents and fellows somewhere else?
  • Are there older CV lines that show resident involvement at prior institutions?

Step 10: Use a Simple Scoring System to Bring It All Together

You do not need a PhD in biostatistics. You just need a consistent approach.

Here is a practical scoring framework for “faculty depth” in a new residency (total 20 points). Rate based on the core faculty only.

  1. Experience Mix (0–5)

    • 0–1: Almost all ≤5 years out, PD very junior
    • 2–3: Mixed, but majority junior, only 1–2 clearly senior
    • 4: Good mix; several faculty 7–15 years out; PD mid-career or above
    • 5: Strong spread; multiple mid/senior; PD clearly seasoned
  2. Scholarship & Grants (0–5)

    • 0–1: Very few papers overall, mostly case reports, no grants
    • 2–3: Some publications by a few people; minimal or no grants
    • 4: Several active publishing faculty; at least one with grants or multiple trials
    • 5: Robust publication output; clear research anchor(s); funded projects visible
  3. Teaching & Education Track Record (0–5)

    • 0–1: Almost no formal education roles; no awards; PD/APDs inexperienced
    • 2–3: Some teaching roles; early PD/APD; thin education footprint
    • 4: PD/APD with credible prior roles; multiple faculty with teaching awards or curriculum work
    • 5: Strong and sustained educational leadership across several faculty; clear culture of teaching
  4. Stability & Institutional Tie-In (0–5)

    • 0–1: Many with short job histories; nearly all just arrived; few long-term people
    • 2–3: Some mix; still skewed new with limited institutional roots
    • 4: A healthy mix of new and long-standing clinicians; some with 5–10+ years at that hospital
    • 5: Several long-tenured faculty with promotions at that site; visible leadership positions in the hospital/health system

Now interpret:

  • 16–20: High-depth faculty group. For a new program, this is unusually strong.
  • 11–15: Solid but with some thin areas. Acceptable if program is transparent about growth.
  • 6–10: Shaky. You will be acting partly as a beta tester. Proceed only if you have strong reasons.
  • 0–5: I would not recommend it unless you have no alternatives and are fully aware of what you are trading away.
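
If you want the arithmetic to stay consistent across programs, here is a minimal sketch of that 20-point framework as a Python function. The four sub-scores remain your judgment calls; the code only adds them up and maps the total to the bands above.

    def faculty_depth_score(experience: int, scholarship: int,
                            teaching: int, stability: int) -> tuple[int, str]:
        """Add the four 0-5 sub-scores from Step 10 and map the total to a band."""
        for name, value in (("experience", experience), ("scholarship", scholarship),
                            ("teaching", teaching), ("stability", stability)):
            if not 0 <= value <= 5:
                raise ValueError(f"{name} must be between 0 and 5")
        total = experience + scholarship + teaching + stability
        if total >= 16:
            band = "High-depth faculty group"
        elif total >= 11:
            band = "Solid, with some thin areas"
        elif total >= 6:
            band = "Shaky; expect to be part beta tester"
        else:
            band = "Not recommended without strong outside reasons"
        return total, band

    # Hypothetical example: decent experience mix, modest scholarship.
    print(faculty_depth_score(experience=3, scholarship=2, teaching=4, stability=3))
    # -> (12, 'Solid, with some thin areas')

Run it once per program and keep the outputs side by side; the comparison across your list matters more than any single absolute number.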

Step 11: Validate Your CV Impressions Against the Program Narrative

Once you have this structured analysis, compare it to what the program says on interview day.

You are listening for alignment or spin.

Typical mismatches:

  • “We have huge research opportunities”
    But you saw 0–1 faculty with real publication/grant backgrounds.

  • “Strong subspecialty exposure”
    But you only found one part-time subspecialist with limited academic history.

  • “We are very education-focused”
    But the PD has no prior education leadership and no faculty hold major teaching titles.

At that point, your question to them (politely) becomes:

  • “Can you tell me about specific resident research projects from the last year and which faculty led them?”
  • “Who are the main cardiology/ICU mentors for residents pursuing those fellowships? What have your recent graduates matched into?”
  • “Can you walk me through the educational background of your core faculty—any prior PD/APD roles or teaching awards?”

Pay close attention not to the answer itself, but to how concrete it is. Programs with real depth will talk names, numbers, and examples. Programs without depth will give you aspirations and slogans.


Step 12: Watch Out for Three Common CV Illusions in New Programs

There are a few patterns I see repeatedly that fool applicants.

  1. “Borrowed prestige” from old affiliations
    Faculty trained or previously worked at big-name institutions. Great. But:

    • Are they still publishing at that level?
    • Do they still have active collaborations?
    • Or did their academic output drop to zero after moving?

    Past logos on the CV do not equal present-day research culture.

  2. Overloaded titles with no track record
    In new programs, hospitals hand out titles like candy:

    • “Director of Research”
    • “Vice Chair of Education”
    • “Clerkship Director”
    • “Ultrasound Director”

    Check their CVs: have they ever led research or education before this job? If not, those titles might just be boxes checked for ACGME, not a sign of real depth.

  3. Single super-star masking a thin bench
    Some programs recruit one extremely strong academic and then surround them with mostly service-only faculty.

    Ask yourself:

    • How many residents can that one person realistically mentor?
    • What happens if they leave in 2–3 years?

    One star is not a system. It is a risk.


Step 13: Use Visual Tools to Spot Patterns Fast

Sometimes it helps to literally see distribution. Take your grid and sketch quick visuals.

For example, for years since residency:

Years Since Residency Among Core Faculty (hypothetical distribution):

  • 0–5 years: 8
  • 6–10 years: 3
  • 11–15 years: 1
  • 16+ years: 0

You immediately see: this is a heavily junior faculty environment. That might be fine if you like energy and change, but it is not “deep.”

Or for publications in last 5 years:

Publications in Last 5 Years by Faculty Group (hypothetical distribution):

  • 0–1 pubs: 10
  • 2–5 pubs: 3
  • 6+ pubs: 1

Now you see: only 1–2 people are actively publishing. Do not expect widespread research mentoring.
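
If you would rather not chart by hand, a short matplotlib sketch (using the hypothetical counts from the two distributions above) turns those tallies into side-by-side charts in a few seconds:

    import matplotlib.pyplot as plt

    # Hypothetical counts matching the two distributions above.
    experience = {"0-5 years": 8, "6-10 years": 3, "11-15 years": 1, "16+ years": 0}
    publications = {"0-1 pubs": 10, "2-5 pubs": 3, "6+ pubs": 1}

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.barh(list(experience.keys()), list(experience.values()))
    ax1.set_title("Years since residency among core faculty")
    ax2.bar(list(publications.keys()), list(publications.values()))
    ax2.set_title("Publications in last 5 years")
    plt.tight_layout()
    plt.show()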


Step 14: When a Weak CV Picture Might Still Be Acceptable

Not every applicant needs a powerhouse academic environment. There are scenarios where a relatively shallow CV profile could still work:

  • You aim for community practice, not fellowship, and mainly want:

    • Reasonable workload
    • Decent location
    • Supportive culture
  • You are comfortable being part of the first or second wave of residents, shaping the program as it grows.

If that is you, what you should still insist on:

  • At least a few senior or mid-career people with real teaching experience
  • Evidence that the hospital is committing resources (new hires, new leadership roles, planned expansions)
  • Honest answers about limitations instead of sales pitches

For others aiming at competitive fellowships or academic careers, do not lie to yourself: faculty depth matters more than branding.


Your Next Move: Put One Program Under the Microscope Today

Do this now:

  1. Pick one new residency you are considering seriously.
  2. Download or request CVs for:
    • PD
    • APDs
    • At least 5–7 core faculty
  3. Build a simple grid with:
    • Years since residency
    • Academic rank
    • Publications last 5 years
    • Grants Y/N
    • Prior education roles
  4. Give that program a 0–20 faculty depth score using the framework above.

If that number surprises you—higher or lower—that is good. You are finally seeing the program underneath the brochure.

Then repeat it for your top 2–3 choices. Patterns will emerge, and your decisions will get sharper.

Open a spreadsheet, list the names, and pull the first CV right now. The sooner you see the truth in those documents, the less likely you are to spend three to seven years in a program that looked good on paper but was hollow underneath.

Related Articles