
Medical education careers are being quietly strangled by a research myth: “If you don’t have first‑author randomized controlled trials, you’ll never advance.”
That claim is wrong. Not a little wrong. Completely misaligned with how most promotion committees actually function and with what the data on educator careers shows.
Let me walk through what actually gets people promoted as medical educators, who really needs RCTs, and when chasing first‑author trials is a smart move vs a career-wasting detour.
The Myth: “No RCTs = No Promotion”
The myth usually sounds like this:
- “If you want Associate Professor, you need at least one first‑author RCT.”
- “Education research without RCTs is just fluff; committees won’t respect it.”
- “The clinician‑educator track is basically a dead end unless you reinvent yourself as a PI.”
I’ve heard these lines in real hallway conversations from junior faculty in IM, EM, and pediatrics. Usually repeated by someone who has never actually sat on a promotions committee.
Here’s the problem: when you look at actual promotion criteria from major institutions, they almost never say “first‑author RCTs.” They say things like:
- “Evidence of scholarly productivity”
- “Regional / national reputation as an educator”
- “Sustained contributions to educational innovation”
- “Peer‑reviewed publications in medical education”
Not “three RCTs in JAMA.”

Let’s ground this in actual institutional data.
| Institution (Track) | Explicit RCT Requirement | Key Scholarship Language |
|---|---|---|
| Harvard (Clinician-Educator) | No | Educational leadership, innovation, publications |
| UCSF (Clinician-Educator) | No | Teaching excellence, curriculum development, scholarship |
| Mayo (Educator Track) | No | Educational research or scholarly work |
| U of Toronto (Education) | No | Contributions to education scholarship |
| Duke (Clinician-Educator) | No | Regional/national reputation, peer-reviewed output |
None of these say, “Randomized controlled trials or you’re out.”
Because that’s not how most educator careers actually work.
Reality Check #1: Medical Education Is Not a Drug Trial Factory
The RCT obsession comes from clinical research norms: if you’re studying a drug, device, or intervention, RCTs are the gold standard for causal inference. Fine.
But education is not drug development. Most meaningful educational work is:
- Context-dependent
- Iterative
- Messy
Try randomizing students to never receive feedback to “test its effect.” Ethically insane. Or randomizing programs to use a blatantly inferior curriculum for three years. Good luck with that.
When you look at influential medical education papers, you see a very different pattern.
| Study design | Approximate share of high-impact papers (%) |
|---|---|
| RCTs | 10 |
| Quasi-experimental | 35 |
| Cohort/observational | 30 |
| Qualitative/mixed methods | 25 |
RCTs are a minority of high-impact education work. The field runs on:
- Quasi-experimental studies (before/after, controlled but not randomized)
- Cohort studies (following learners through new curricula)
- Qualitative and mixed-methods work (interviews, focus groups, thematic analyses)
If you look at Academic Medicine, Medical Education, Advances in Health Sciences Education, or Teaching and Learning in Medicine, the bulk of respected scholarship isn’t RCT-heavy.
Do RCTs exist in med ed? Sure. Are they the only path to impact or promotion? Not even close.
Reality Check #2: Promotions Committees Care About Coherent Scholarship, Not One Magic Study
What actually moves the needle for medical educators is a coherent body of work.
That usually looks like:
A clearly defined educational niche
- Assessment and feedback
- Simulation training
- Curriculum design in a specific domain (e.g., ultrasound, QI, palliative care)
- Faculty development
Multiple linked outputs over time
- Curriculum creation and implementation
- Local evaluation studies
- Dissemination through workshops and invited talks
- Peer-reviewed publications (often multi-site, often not RCTs)
External recognition
- Regional or national presentations (Clerkship Directors in Internal Medicine, SAEM, STFM, APDIM, etc.)
- Invitations to speak, serve on committees, or write review pieces
- Education awards
Promotion committees do not say: “No RCT = deny.” They say: “Is this a sustained, credible, externally recognized scholarly program of work?”
They also look at role match. An educator track is supposed to value:
- Teaching
- Mentoring
- Curriculum leadership
- Educational scholarship
If your CV shows those—plus publications that are methodologically sane and peer-reviewed—lack of RCTs is not your problem. Lack of focus is.
1. Start as junior faculty
2. Define your education niche
3. Develop a curriculum or program
4. Evaluate outcomes locally
5. Present regionally or nationally
6. Publish peer-reviewed work
7. Build reputation and leadership roles
8. Promotion review
Nowhere in that realistic pathway is “conduct an RCT or game over.”
Who Actually Needs First-Author RCTs?
There are people in medical education who should be thinking seriously about RCTs. They’re just a very specific subset:
Education scientists on research-intensive tracks
- Hired with protected time specifically to do education research
- Expected to get external funding (NIH, HRSA, foundation grants)
- Evaluated like traditional researchers: grants, PI roles, methodological rigor
Faculty on classic tenure tracks in some institutions
- Where the metric is still “grant dollars + high-impact publications”
- Where education research has to “compete” with basic science or clinical trials
Those aiming for certain leadership roles in education research centers
- Directors of medical education research units
- People who explicitly brand as “education methodologists” or “learning scientists”
In those situations, you don’t have to produce RCTs, but you do need:
- Strong study designs
- Methodologically defensible work
- Something that can survive the same peer-review standards applied to clinical trials
Sometimes that means an RCT. More often, especially early on, it means strong quasi-experimental and mixed-methods work as a base—then maybe multi-site controlled trials if the question warrants it.
The key distinction: their contract, track, and funding expectations drive that need. Not some generic law of educator advancement.
Reality Check #3: What Successful Medical Educators Actually Build
Look at people who’ve clearly “made it” as educators—Associate/Full Professors on educator or clinician-educator tracks, program or clerkship directors with strong reputations.
What do their CVs really show?
Common pattern:
- 10–30 education-focused publications
- A mix of:
- Curricular innovation descriptions
- Assessment tool development and validation
- Qualitative work (learner perspectives, faculty experiences)
- Multi-site evaluations
- Maybe 0–2 RCTs. Often none at all.
What they do have consistently:
- Named roles: clerkship director, simulation center director, director of undergraduate medical education in a department, etc.
- National presentations and workshops: repeated appearances at specialty education meetings
- Involvement in guideline or consensus documents around education in their field
- Sustained teaching, with documented excellence (evaluations, peer reviews, awards)
They’ve built an education career, not a mini-clinical trials unit.
| Publication type | Count on a typical CV |
|---|---|
| Curriculum innovation | 12 |
| Assessment/validation | 8 |
| Qualitative | 6 |
| RCTs | 1 |
If you’re on an educator track and your ambition is promotion plus meaningful impact on learners, that kind of portfolio is what you’re aiming for. Not a trophy RCT to “check the box.”
The Hidden Cost of the RCT Obsession
Here’s where this myth becomes actively harmful.
I’ve watched junior faculty do all of the following:
- Sit on a great idea for a new curriculum because “we have to design it as an RCT or it won’t count.”
- Delay implementing obviously better teaching approaches because “we haven’t randomized and registered the trial yet.”
- Burn years trying to run underpowered, methodologically shaky RCTs with no funding, on tiny student cohorts, just to have “RCT” in the methods section.
- Ignore mixed-methods and qualitative approaches that would actually answer their real questions—because “committees only care about RCTs.”
The result? Fewer implemented innovations, weaker local impact, and far fewer tangible products on the CV.

For most educators, that’s backwards. The sensible order is:
- Implement a well-designed educational intervention
- Evaluate it rigorously but realistically (mixed methods, pre-post, comparisons if possible)
- Iterate based on data
- Scale or spread to other sites
- Then consider more complex designs (including RCTs) if the question and resources justify it
You earn complexity. You don’t start there just for optics.
What Promotion Committees Actually Ask When Looking at Your File
Let me translate what I’ve heard in real promotion meetings into plain English. The questions are more like:
- “Does this person have a clear educational identity?”
- “Have they produced scholarship that others actually read and use?”
- “Can we see growth from local work to regional/national impact?”
- “Are they recognized outside their own institution?”
- “Is there a trajectory that suggests they’ll keep contributing?”
Whether your flagship paper is an RCT or a well-done multi-site observational study does not usually decide the case.
What will hurt you much more:
- Five unrelated one-off projects with no thematic connection
- All local posters, no peer-reviewed publications
- No evidence your innovations spread beyond your own program
- No national presence in your chosen niche
That’s the reality you should optimize for.
1. Review the CV
2. Question the impact
3. Assess scholarship quality
4. Clear education niche?
5. Regional or national reach?
6. If yes: support promotion. If not: recommend more development.
No step or decision point in that review process is labeled “RCT present?”
How to Build a Promotion-Ready Educator Portfolio Without Chasing RCTs
If you want something practical, here it is.
You should focus on:
A narrow, defensible niche
- Commit: “I do simulation for crisis management,” not “I like education things.”
- Your name should be associated with something specific when people talk about education in your field.
Programmatic work within that niche
- Design a curriculum or program.
- Implement it for several years.
- Collect outcomes data every cycle (knowledge, skills, behavior, sometimes patient outcomes if feasible).
- Publish incrementally: initial design, validation of tools, outcomes at scale, adaptations for other settings.
Methodologically solid but realistic studies
- Use controls or comparison cohorts when possible, but don’t cripple your program chasing perfect randomization.
- Use mixed methods to understand why things work or don’t.
- Get a biostatistician or education research collaborator early; don’t DIY your stats unless you really know what you’re doing.
Deliberate dissemination strategy
- Aim for at least one peer-reviewed output per significant project.
- Pair every major paper with:
- At least one national presentation
- Materials hosted or requested by other programs
- Visible involvement in national education committees or working groups
If you later find a question that genuinely requires an RCT and you’ve built the right infrastructure and collaborators—great. Do it. But treat that as an advanced tool, not a promotion prerequisite.
| Promotion factor | Relative weight (illustrative) |
|---|---|
| Teaching excellence | 85 |
| Program leadership roles | 80 |
| Coherent scholarship body | 90 |
| Study design type (RCT vs not) | 30 |
That weighting lines up with what you hear behind closed doors far better than the RCT myth does.
The Bottom Line: Myth vs Reality
Let’s cut it down to the essentials.
You do not need first-author RCTs to advance as a medical educator on typical clinician-educator or educator tracks. Promotion is built on coherent, sustained, externally recognized educational work—not one magic study design.
RCTs matter only for a subset of roles: education scientists on research-intensive or tenure tracks, or those leading dedicated education research centers. Even then, they’re one tool among many, not a universal requirement.
Chasing RCTs too early can actively damage your career by delaying real innovations, shrinking your output, and distracting you from building the track record that promotions committees actually value: clear niche, program leadership, credible scholarship, and growing reputation.
Design good work. Study it rigorously but sanely. Publish and disseminate strategically. That’s what really moves you up in medical education.