
Structuring Your ERAS Research Entries: Bullet Examples That Impress PDs

January 6, 2026
19-minute read

Image: a medical resident updating the ERAS research section on a laptop in a quiet call room.

You are staring at the ERAS “Scholarly Activities” page.
You have 11 research items: one poster, a preprint, a chart-review project that never got published, three QI projects, a middle-author paper from undergrad, and some half-finished things you are not sure even count.

And the text box says: 1020 characters.

That is where people either quietly build an application that screams “this person knows what they’re doing”… or they write “Retrospective cohort study on stroke outcomes. Assisted with data collection and analysis,” paste it 10 times, and wonder why interviewers never ask about their research.

Let me break this down specifically: how to structure ERAS research entries, with concrete bullet-style examples that program directors actually like reading.


1. What PDs Are Really Looking For In Research Entries

Forget the mythology. Program directors are not reading your research section thinking “wow, this topic is so groundbreaking.” They care about:

  1. Can you finish what you start? (productivity, outputs)
  2. Did you actually do anything meaningful? (role and ownership)
  3. Do you understand methods enough not to be dangerous?
  4. Are you intellectually curious or just collecting lines on a CV?

Most PDs skim. They are pattern-recognizers. They scan for:

  • Clear product: “Published”, “In press”, “Submitted”, “Poster”, “Oral presentation”
  • Clear role: “First author”, “Co-first author”, “Statistician”, “Designed protocol”
  • Clear methods and scope: “Retrospective cohort of 842 patients”, “Single-center QI”
  • Clear impact: “Changed pathway”, “Reduced LOS by 0.7 days”, “Cited in guidelines”

They definitely notice:

  • Vague entries: “Worked with Dr. X on research in cardiology.”
  • CV-padding: 9 barely distinguishable chart reviews, all “assisted with data collection.”
  • Sloppy structure: walls of text, tense errors, no sense of hierarchy.

Your job is to make every entry:

  • Immediately scannable
  • Honest but maximally specific
  • Structured in the same predictable pattern

2. Core Structure For ERAS Research Entries

ERAS does not force you to use bullets, but you absolutely should “think” in bullets even if you write in compact prose. PDs like predictable repetition.

For each research entry, you want:

  1. Title / Topic + Type of project
  2. Your role and level of responsibility
  3. Methods and population (1–2 concrete details)
  4. Outputs / status (what came out of it)
  5. Impact or skill gained (when relevant)

Here’s the basic template I recommend you adapt:

  • 1st sentence: What the project is + design + population/topic
  • 2nd sentence: Your role + what you physically did
  • 3rd sentence: Outcome/product + any measurable impact

Stay around 2–4 sentences total. No fluff.
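
If you like to sanity-check things, here is a minimal, purely illustrative Python sketch of that three-sentence template. The function name, the field names, and the 1020-character limit are my own assumptions for the example (confirm the actual limit in your own ERAS text box); the sample text is adapted from the first-author example later in this article.

```python
# Hypothetical helper (not an ERAS tool): assemble a research entry from the
# three-sentence template and warn if it runs past the character limit.
ERAS_CHAR_LIMIT = 1020  # assumption based on the text box mentioned above; verify in ERAS


def build_entry(project: str, role_and_tasks: str, output_and_impact: str) -> str:
    """Join the three template sentences into one compact entry and check its length."""
    entry = " ".join(part.strip() for part in (project, role_and_tasks, output_and_impact))
    if len(entry) > ERAS_CHAR_LIMIT:
        print(f"Warning: {len(entry)} characters, over the {ERAS_CHAR_LIMIT}-character limit.")
    return entry


if __name__ == "__main__":
    print(build_entry(
        "Retrospective cohort study of 612 adults hospitalized with HFrEF to identify predictors of 30-day readmission.",
        "Independently extracted EHR data and performed multivariable logistic regression in Stata.",
        "Manuscript accepted in Journal of Hospital Medicine (2024).",
    ))
```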

Let’s make this very tangible.


3. Bullet-Style Examples: Published / Accepted Manuscripts

This is the easiest category to make impressive, yet, astonishingly, people still undersell it.

Example 1 – First-author clinical research paper

Bad (what I actually see too often):

Worked on research project looking at heart failure readmissions. Helped with data collection and writing.

Fixed:

First author on a retrospective cohort study of 612 adults hospitalized with HFrEF to identify predictors of 30-day readmission. Independently extracted EHR data, performed multivariable logistic regression in Stata, and drafted the methods and results sections. Manuscript accepted in Journal of Hospital Medicine (2024).

Notice the structure:

  • Clear type: retrospective cohort
  • Concrete N: 612 adults
  • Clear tasks: extracted EHR, did regression, drafted methods/results
  • Clear output: accepted, named journal, year

Example 2 – Middle-author but substantial contribution

Bad:

Co-author on study evaluating telemedicine for diabetes. Literature review and editing.

Fixed:

Third author on a prospective study evaluating a telemedicine-based diabetes management program for 185 adults with poorly controlled type 2 diabetes (baseline HbA1c ≥ 9%). Led the literature review, created the REDCap database, and coordinated weekly data quality checks. Contributed to revising the results and discussion; manuscript published in Telemedicine and e-Health (2023).

Middle authorship can still look serious when your contribution is specific.


4. Posters and Oral Presentations: Turning “Just a Poster” Into Something Legit

Posters are often undersold. You want:

  • Study design
  • Your role in analysis/creation
  • Conference name + level (national/regional)
  • One concrete metric or finding if space allows

Example 3 – National poster, first author

Bad:

Poster presentation on COPD outcomes at ATS conference.

Fixed:

First-author poster presentation at the American Thoracic Society International Conference (2024) on 5-year outcomes of 327 patients with severe COPD enrolled in a pulmonary rehabilitation program. Performed data cleaning and survival analysis in R, created all figures, and presented findings during the general poster session.

This is the difference between “checkbox” and “this student can actually do work.”

Example 4 – Regional oral presentation, small project

Bad:

Oral presentation about sepsis QI.

Fixed:

Oral presentation at the New England Regional Society of Hospital Medicine meeting (2023) describing a resident-led sepsis pathway implementation project. Co-designed the intervention (standardized order set and nursing checklist), collected pre/post-intervention data on 142 sepsis admissions, and presented results showing a 22% reduction in time-to-antibiotics.

That “22% reduction” is what PDs’ eyes lock onto. Specific outcomes signal real ownership.


5. Ongoing Projects, “In Progress,” and Not-Yet-Submitted Work

This is where people either sound like they are actively doing research, or like they are inventing future publications.

The key rule: describe what is already done, not only what you hope will happen.

Example 5 – Manuscript in preparation, substantial progress

Bad:

Working on a manuscript about stroke imaging. Plan to submit to neurology journal.

Fixed:

Co-first author on a retrospective study evaluating CT perfusion parameters in predicting hemorrhagic transformation among 204 patients with acute ischemic stroke undergoing thrombectomy. Completed data extraction and preliminary multivariable analysis; drafting the manuscript and figures with planned submission to a neurology journal in late 2024.

You are explicit: N=204, the work you describe is already done, and “planned submission” is a reasonable claim.

Example 6 – Data collection ongoing, but still real

Bad:

Working on chart review of rheumatology patients.

Fixed:

Team member on an ongoing chart review of biologic use in 350+ patients with rheumatoid arthritis at an academic rheumatology clinic. Developed the data abstraction tool in REDCap, pilot-tested 30 charts for inter-rater reliability, and currently abstracting medication exposure and outcome variables.

No publication promised. Just clear, legitimate work.


6. QI and Clinical Projects: How to Write Them as Research

Many schools shove QI under “Other” or “Work Experience.” On ERAS, QI fits nicely under “Research” if the project involved systematic planning, data collection, and evaluation.

Program directors like QI because it maps directly to residency life.

Example 7 – QI to reduce unnecessary labs

Bad:

QI project reducing unnecessary labs. Worked with team to implement.

Fixed:

Led a resident QI project to reduce unnecessary daily CBC/CMP orders on a general medicine service. Baseline audit identified 63% of daily labs as guideline-discordant over 4 weeks (n=211 admissions). Co-developed an educational intervention and EHR order set changes; post-intervention data collection is ongoing to assess changes in lab utilization and estimated cost savings.

Methods + baseline numbers + clear role = serious QI.

Example 8 – QI that changed a local protocol

Bad:

Worked on QI project improving discharge summaries.

Fixed:

Co-led a multidisciplinary QI initiative to improve discharge summary completeness for heart failure admissions at a community hospital. Created a standardized discharge template, trained residents and APPs, and audited 100 pre- and 100 post-intervention summaries for inclusion of guideline-recommended elements (medication list, follow-up, weight monitoring). Project results informed permanent adoption of the new template in the EHR.

Notice the subtle flex: “permanent adoption of the new template.”


7. Basic Science Projects: Showing You Actually Understood Something

For basic science or bench work, the trap is useless jargon or vague “helped in the lab” phrasing.

You instead want:

  • Specific techniques you used
  • What you worked on (protein, pathway, model)
  • Any product (poster, abstract, manuscript)

Example 9 – Undergrad bench research, no publication

Bad:

Worked in basic science lab studying cancer biology. Helped with experiments.

Fixed:

Undergraduate research assistant in a cancer biology lab studying PI3K/AKT signaling in triple-negative breast cancer cell lines. Performed Western blots, siRNA knockdowns, and cell viability assays under supervision. Contributed to experimental data included in a senior graduate student’s dissertation and a departmental research poster session.

No need to oversell. Show skills, tools, and what the work fed into.

Example 10 – Bench work that produced a poster

Bad:

Basic science project on neurodegeneration. Poster at campus symposium.

Fixed:

Conducted a one-year independent project characterizing microglial activation in a mouse model of traumatic brain injury. Performed immunohistochemistry, confocal imaging, and quantitative image analysis using ImageJ. Presented findings as first-author poster at the university neuroscience research symposium (2021).

This reads like someone who can understand methods sections in real papers, not just pipette things.


8. Student “Research Electives” and Summer Projects

Programs know that many “summer research fellowships” are 8–10 weeks of data cleaning and shadowing. That is fine. Just do not pretend you ran the lab.

Example 11 – One-summer clinical research elective

Bad:

Summer student research on colon cancer outcomes. Helped with chart review.

Fixed:

Selected for a 10-week medical student research fellowship in colorectal surgery. Joined an ongoing retrospective study of 290 patients undergoing laparoscopic vs open colectomy, where I extracted perioperative outcomes from the EHR, maintained the study database, and presented preliminary results at the department’s research conference.

Defined time period, defined task, defined product.

Example 12 – Epidemiology or public health project

Bad:

Public health research in global health during summer.

Fixed:

Participated in a summer global health project analyzing vaccination coverage in rural clinics in western Kenya using routinely collected Ministry of Health data. Cleaned and merged clinic-level datasets in Excel and Stata, generated descriptive statistics, and created visualization dashboards that were shared with local district health officers.

That sounds like someone who can handle data during residency.


9. Non-Published Work, “Graveyard Projects,” and How to Salvage Them

You probably have projects that fizzled: PI left, COVID hit, IRB purgatory, journal rejection with no resubmission. They can still be listed if:

  • You did real work
  • There was a defined project
  • You clearly label the status (e.g., “project discontinued” or “no publication”)

What you must not do is imply a paper exists when it does not.

Example 13 – Abandoned project, still usable

Bad:

Worked on research with cardiologist but project did not work out.

Fixed:

Contributed to an early-phase project assessing adherence to guideline-directed medical therapy in patients with heart failure at a safety-net clinic. Helped design the data abstraction form, piloted chart review for 25 patients, and presented feasibility findings to the clinic leadership. Project was discontinued after institutional EHR transition.

You get credit for the effort and experience without pretending there was more.


10. How Many Entries, How to Group, and When to Merge

A lot of people ask: “Should I create separate entries for each poster / each paper?”

Basic rules:

  • Each published or accepted paper: its own entry.
  • Each national or major regional poster/oral: usually its own entry.
  • Small local posters / campus symposia: can be grouped.
  • Multiple abstracts from the same underlying project: can be within one entry as multiple outputs.

Here is how you can group logically.

Example 14 – One project, multiple outputs

Prospective study evaluating a standardized education bundle to reduce ED revisits for pediatric asthma at a tertiary children’s hospital. I co-developed the intervention, enrolled 83 patients, and collected 30-day revisit outcome data. Results presented as a poster at the PAS meeting (2023) and as an oral presentation at the state AAP chapter conference (2023); manuscript in preparation.

One project, clearly explained, with all outputs.

Example 15 – Local / internal posters grouped

Medical school research day and departmental posters:
– First-author poster on predictors of medication non-adherence in an inner-city hypertension clinic (Medical Student Research Day, 2022).
– Co-author poster characterizing ICU admissions among patients with advanced cirrhosis (Department of Medicine Research Day, 2023).
– Co-author poster on implementation of a standardized handoff tool on the inpatient pediatrics ward (Pediatrics Research Day, 2023).

That can live as one ERAS entry titled “Institutional poster presentations.”


11. Formatting: Tense, Style, and Consistency That Look Like You Care

You do not need fancy bullet icons in ERAS. You do need consistent, sharp phrasing.

A few ground rules:

  • Use past tense for completed work and outputs.
    “Performed,” “analyzed,” “presented,” “published.”

  • Use present/progressive for ongoing work only.
    “Currently collecting,” “drafting manuscript,” “analyzing data.”

  • Lead with strong action verbs:
    “Led,” “co-led,” “developed,” “designed,” “performed,” “analyzed,” “created,” “presented,” “coordinated,” “implemented.”

  • Avoid throwaway verbs:
    “Helped,” “assisted,” “exposed to,” “involved in” — unless followed by something specific.
    Replace “helped with data collection” with “abstracted 200+ charts” or “administered structured interviews to 45 patients.”
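
If you want a quick automated pass before pasting entries into ERAS, here is a small, hypothetical Python sketch that flags the throwaway verbs above and checks that at least one strong action verb appears. The verb lists simply mirror this section; the crude substring matching is an assumption of convenience, not an official rule.

```python
# Illustrative draft checker: flag throwaway verbs and missing action verbs.
# The verb lists mirror the ground rules above; tune them to your own drafts.
THROWAWAY_VERBS = ["helped", "assisted", "exposed to", "involved in"]
STRONG_VERBS = ["led", "co-led", "developed", "designed", "performed",
                "analyzed", "created", "presented", "coordinated", "implemented"]


def check_entry(entry: str) -> list[str]:
    """Return warnings for one draft entry (crude substring check, good enough for a quick pass)."""
    text = entry.lower()
    warnings = [f"Throwaway verb: '{verb}'" for verb in THROWAWAY_VERBS if verb in text]
    if not any(verb in text for verb in STRONG_VERBS):
        warnings.append("No strong action verb found; consider leading with one.")
    return warnings


if __name__ == "__main__":
    draft = "Assisted in data collection and literature review for a research study in cardiology."
    for warning in check_entry(draft):
        print(warning)
```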


12. Specialty-Specific Nuances: What Matters More Where

Different specialties read research differently, and the differences are not subtle; PDs openly admit it.

Research Emphasis by Specialty Type

  • Academic IM / Neuro: publications, methods, stats
  • Surgical specialties: productivity, persistence
  • Competitive ROAD specialties: first-author work, prestige
  • Community programs: QI, practicality
  • Primary care focused: QI, population, disparities

Internal Medicine / Neurology / EM (academic-leaning)

They care about:

  • Study design and methods (cohort, RCT, regression, survival analysis)
  • Being able to interpret literature and potentially join resident research
  • First-author or stats-heavy roles are highly valued

Your entries should emphasize:

  • Analysis tools (R, Stata, SPSS)
  • Specific methods (multivariable regression, Kaplan-Meier, Cox models)
  • Any peer-reviewed products

Surgery, Ortho, ENT

They care a lot about sheer productivity and grit.

Highlight:

  • Number of projects and outputs
  • Longitudinal commitment to one group or PI
  • “Got it across the finish line after two rejections” vibe (without whining)

Your bullets for surgical research can lean into:

“Managed revisions and resubmission after initial rejection, leading to eventual acceptance in…”

That tells them you do not fold when things get annoying.

Pediatrics, FM, Psych, OB

They often value:

  • QI and systems projects
  • Disparities, community engagement
  • Practical outcomes more than esoteric lab work

Write your bullets to emphasize:

  • Process changes
  • Patient outcomes
  • Interdisciplinary collaboration

13. Two Complete ERAS-Ready Example Entries

To tie it together, here are two complete, copy-and-adapt style entries at about the right length.

Example A – Clinical research with publication and poster

Retrospective cohort study examining 30-day readmission predictors among 478 adults hospitalized with decompensated cirrhosis at a tertiary care center. I extracted clinical and laboratory data from the EHR, created a cleaned analytic dataset, and performed multivariable logistic regression and sensitivity analyses in R under faculty supervision. Presented results as first-author poster at Digestive Disease Week (DDW, 2023) and co-wrote the manuscript, focusing on the methods and results sections. Manuscript accepted for publication in Journal of Hepatology (2024).

Example B – QI project with local impact, no publication

Co-led a resident QI project to improve appropriate VTE prophylaxis prescribing on the medicine wards. Conducted a 4-week baseline audit of 196 admissions, identifying 31% underuse and 12% overuse of pharmacologic prophylaxis based on institutional guidelines. Helped design and implement an intervention including an EHR order set modification and targeted education sessions for residents and nursing staff. Post-intervention audit of 204 admissions showed underuse reduced to 12% and overuse to 5%; these results were presented at the hospital QI council and led to permanent order set changes.

Both are clearly “real.” Both are easy to scan. Both tell a PD: this person gets stuff done.


14. Common Mistakes That Make PDs Roll Their Eyes

I will be blunt here. These are red flags:

  • Listing “manuscript in preparation” for 8 different projects when you have zero publications.
  • Adding every single poster as an independent entry with one-line descriptions like “Presented poster on X at Y conference.”
  • Using the same generic description template 12 times:
    “Assisted in data collection and literature review for research study in [field].”
  • Writing future fiction: “Will submit to NEJM,” “Plan to publish,” “Expected to present at…”
  • Overusing jargon to hide that you do not know what you did: “Participated in translational precision oncology bioinformatics.”

PDs have a very sensitive BS radar. Better to be modest and precise than grandiose and vague.


15. A Quick Visual: What Strong vs Weak Research Entries Signal

Impact of Research Entry Quality on PD Perception (illustrative bar chart)

  • Clear role & methods: 90
  • Concrete outputs listed: 80
  • Vague descriptions: 25
  • Inflated future claims: 10

Interpretation, roughly from PD surveys and common sense:

  • Clear roles & methods: almost always positive signal.
  • Concrete outputs: strong positive.
  • Vague descriptions: mostly neutral or mildly negative.
  • Inflated future claims: makes people suspicious.

16. Putting It All Into a Workflow

If you want to overhaul your ERAS research section efficiently, do this stepwise.

Workflow for Rewriting ERAS Research Entries

  1. List all projects
  2. Categorize by type
  3. Identify role & outputs
  4. Draft a 2–4 sentence summary
  5. Standardize verbs & tense
  6. Check for redundancy
  7. Finalize in ERAS

And for each project, literally answer these on a scratchpad:

  1. What exactly was the design and population?
  2. What did I physically do?
  3. What products came out of it (or will realistically come out)?
  4. What’s one specific, concrete detail that proves I was actually there?

Turn those answers into 2–4 tight sentences.


17. A Few Specialty-Tailored Example Snippets

To give you more raw material, here are short, ready-to-adapt snippets by specialty flavor.

For radiology

Retrospective review of 430 CT pulmonary angiograms evaluating adherence to evidence-based ordering criteria for suspected PE. Abstracted clinical indications, Wells scores, and D-dimer use from the EHR and calculated the yield of positive studies before and after an educational intervention. Presented as a first-author poster at the state radiology society meeting (2023).

For EM

Prospective observational study of ED patients presenting with syncope, comparing diagnostic yield of routine head CT versus selective imaging based on clinical decision rules. Enrolled 102 patients, administered standardized data forms, and performed chart review for 30-day serious outcomes. Project submitted as abstract to SAEM.

For psychiatry

Chart review of 176 adolescents admitted to an inpatient psychiatric unit assessing rates of metabolic monitoring before and after initiation of second-generation antipsychotics. Developed the data abstraction tool, reviewed charts, and analyzed adherence to monitoring guidelines. Results shared with unit leadership and used to implement a standardized metabolic monitoring checklist.


18. Visual Recap: Components of a Strong Research Entry

Diagram: components of a strong ERAS research entry.

If you strip away the wording tricks, every strong entry has the same skeleton:

  • Study design / project type
  • Your role
  • Methods or tasks
  • Output
  • Impact or scale

You can vary the order, but if one of those is missing in every entry, your section will feel hollow.


19. Before-After Example Set (Side-by-Side)

To really drive it home, here’s a mini “before vs after” set.

Before vs After ERAS Research Descriptions

Before:

Worked on cardiology research project. Helped with data collection and literature review.

After:

Member of a cardiology research team conducting a retrospective study of 312 patients with NSTEMI to assess adherence to guideline-directed medical therapy at discharge. Abstracted chart data, verified medication lists, and assisted with drafting the results section for an abstract submitted to ACC.

Before:

Did research in medical school on diabetes. Poster at conference.

After:

Completed a medical student project analyzing 2-year HbA1c trends among 210 adults with type 2 diabetes at a resident clinic. Cleaned and analyzed EHR data in SPSS, created figures, and presented a first-author poster at the state ACP meeting (2022).

You do not need poetry. You need specificity.


20. Final Check: Common Questions You Should Answer For Yourself

Before you lock in your ERAS entries, ask:

  • Does each major project clearly indicate my role and what I actually did?
  • Can an interviewer pick any entry and have 2–3 obvious questions to ask me about methods or results?
  • Am I honest about status (submitted vs in prep vs published)?
  • Have I cut generic filler that could apply to any project?

If the answer is yes across the board, your research section is doing its job.


Key Takeaways

  1. Strong ERAS research entries are built on specificity: design, N, your tasks, and concrete outputs.
  2. Focus on what you actually did and what came out of it, not on vague “exposure to research.”
  3. Use a consistent 2–4 sentence structure for every entry so PDs can quickly see your role, methods familiarity, and ability to finish work.