
How IMGs Accidentally Undermine Their USCE in ERAS Descriptions

January 5, 2026
15 minute read

[Image: International medical graduate reviewing ERAS application on laptop with notes scattered on desk]

Most IMGs are sabotaging their own USCE in ERAS without realizing it.

Not with low scores. Not with weak letters. With their own descriptions.

I’ve watched strong candidates with solid US clinical experience (USCE) get quietly filtered out because their ERAS entries made that experience look trivial, unprofessional, or even suspicious. Same rotations. Same attendings. Different outcome—because of how they wrote 510 characters.

If you’re an IMG, you cannot afford to get this part wrong.

Let’s go through the most common ways IMGs accidentally undermine their USCE in ERAS—and how to fix every single one of them.


1. Treating ERAS Like a CV, Not Evidence of Fit

The first big mistake: copying a CV bullet style into ERAS and calling it a day.

ERAS is not your LinkedIn. Program directors are not counting bullets. They’re scanning for one thing: Does this person understand and fit U.S. clinical practice?

Yet I see entries like:

“Observed patient care in internal medicine clinic. Attended rounds and did chart reviews.”

That’s not wrong. It’s just useless.

What this kind of description screams is:
“Passive. Generic. Could be anyone. Might not even be real USCE.”

You need each USCE entry to answer three silent questions in the 2 seconds a PD spends scanning:

  1. Was this real and structured?
  2. Did you do anything active and clinical (within legal limits)?
  3. Does this experience prove you can function in a U.S. residency system?

If your description doesn’t clearly hit those, you’ve already lost ground.


2. Making Shadowing Sound Like Residency (Huge Red Flag)

This one gets IMGs into trouble fast.

You’re eager to show impact. You want to sound competent. So you write things like:

  • “Managed a panel of 15 patients per day”
  • “Independent decision-making in patient care”
  • “Performed procedures including central line placement and intubation”

You just told the reader one of three things:

  1. You’re exaggerating.
  2. You don’t understand scope-of-practice rules in the U.S.
  3. You’re willing to misrepresent your role to look stronger.

All three are automatic trust-killers.

For observers and externs, your role matters more than your “accomplishments.” Program directors know exactly what IMGs are allowed to do. If your description claims more than that, it’s a massive red flag.

Use role-appropriate language:

  • “Assisted with” vs “performed”
  • “Participated in clinical decision discussions” vs “made independent management decisions”
  • “Presented patients” vs “managed my own patient list”

If you were in a genuine hands-on externship, you still weren’t an intern. Don’t sell it that way. You do not need to pretend to be something you’re not to be competitive. You just need to sound honest and aligned with reality.


3. Hiding the Nature of the Experience (Observership vs Externship vs Research)

Another classic IMG mistake: ambiguity.

I see so many entries like:

“Internal Medicine Clinical Rotation – XYZ Medical Center”

Was that:

  • A paid externship?
  • An unpaid observership?
  • A student rotation?
  • Some random “clinical experience” arranged through a friend-of-a-friend?

You might think being vague makes it sound more official. It doesn’t. It makes it sound suspicious.

Be explicit. Label it clearly in the Experience Type and the Description:

  • “Internal Medicine Observership – Outpatient clinic”
  • “Hands-on Externship – Community hospital”
  • “Elective Clerkship – U.S. medical school-affiliated hospital”

If you try to blur the lines to “upgrade” an observership into a pseudo-externship, program directors will catch it. They’ve seen hundreds of these. They know which places offer what.

Here’s what you actually want your entry to say (implicitly):
“I know the difference between observership, externship, and clerkship. I’m not trying to trick you.”

[Bar chart] Common USCE Types for IMGs

  Category             | Value
  Observership         | 60
  Externship           | 25
  Elective Clerkship   | 15
  Research w/ Clinic   | 20
  Unofficial Shadowing | 40

Notice something: observerships and unofficial shadowing are common. Your job isn’t to pretend they weren’t. It’s to describe them correctly and strategically.


4. Wasting Characters on Obvious, Generic Tasks

Another way IMGs weaken their USCE: filling the limited character space with fluff.

I routinely see descriptions like:

“Attended rounds, morning reports, and conferences. Observed patient care. Learned about the U.S. healthcare system.”

That could be literally every IMG ever. It tells the reader nothing about you.

The mistake: listing activities anyone can guess, instead of highlighting behaviors that show you can function as a resident.

Bad approach:

  • Restating the rotation title
  • Listing generic things all students do
  • Writing a mini job description for “medical trainee”

Better approach: pick 2–3 specific, meaningful actions that show clinical thinking, professionalism, and communication.

For example:

“Practiced focused histories on newly admitted patients under supervision, presented assessments to the attending on rounds, and drafted problem-based progress notes that were reviewed and corrected by the team.”

That sounds real. Concrete. Close to residency tasks, but within a supervised learner role.

Ask yourself:
“If this line were removed, would any PD miss crucial information about me?”
If the answer is no, delete it.


5. Overloading the Description with Buzzwords and Jargon

On the opposite side of generic fluff is this gem: buzzword salad.

I see things like:

“Engaged in interdisciplinary, patient-centered care with evidence-based, holistic management, implementing quality improvement principles.”

That sounds like someone swallowed a program brochure.

Program directors aren’t impressed by buzzwords. They’re assessing clarity and judgment. Overly fancy language often signals the opposite.

Watch for this pattern:

  • Many adjectives, few specifics
  • “Patient-centered,” “holistic,” “interdisciplinary,” “evidence-based” stacked together
  • Zero actual examples of what you did

Strip it down.

Instead of:

“Engaged in interdisciplinary, evidence-based care for a diverse patient population.”

Say:

“Joined multidisciplinary rounds (MD, RN, PT, SW) and saw how discharge plans were coordinated for complex patients.”

See the difference? Same idea. One sounds like a brochure. The other sounds like you were actually there.


6. Ignoring the Program Director’s Real Question: “So What?”

Your USCE description is not just “what I did.”
It’s “what that experience says about how I will function as a resident.”

Too many IMGs stop at listing duties. They never connect it to residency-relevant skills.

For example:

“Observed outpatient cardiology clinic, reviewed EKGs and echocardiograms.”

Okay. So what?

Fix it like this:

“Observed outpatient cardiology clinic; practiced interpreting basic EKGs and echocardiograms with attending feedback, improving my ability to recognize common pathology before presentation.”

You’re showing growth. Reflection. A skill trajectory.

No, you’re not writing a personal statement paragraph in ERAS. But a few added words, like “with feedback,” “to improve,” or “to strengthen,” can completely change how mature you sound.

Good ERAS USCE entries:

  • Show progression (from observing to presenting, from reading to interpreting)
  • Show integration (how you used EMR, labs, imaging, notes together)
  • Show adaptation (how you adjusted to U.S. system differences)

Bad entries just read like logs.


7. Cramming Too Much Into One Line (The “Overstuffed Sausage” Problem)

ERAS gives you space. Not much, but enough.

IMGs often react in panic:
“I must include every little thing I did or it won’t count.”

So they write:

“Observed patient care, attended rounds, collected histories, performed physical exams, assisted with procedures, wrote notes, presented to attendings, participated in teaching sessions, attended conferences and M&Ms, did literature review and QI projects.”

That’s not impressive. That’s unreadable.

Program directors skim. They won’t parse that wall of text. They’ll mentally classify it as “IMG doing everything and nothing.”

Use structure. Short sentences. Implied hierarchy.

For example:

“4-week internal medicine observership in a community hospital.
Observed full inpatient workflow from admission to discharge.
Practiced focused H&Ps and problem lists under supervision, then presented to attending during rounds.
Reviewed labs/imaging in Epic and linked findings to daily management plans in case discussions.”

Is it perfect? No. Is it skimmable, believable, and residency-relevant? Yes.

You’re writing for tired people with limited attention. Respect that.


8. Failing to Connect USCE to the Target Specialty

Here’s a painful pattern:
An IMG applying to internal medicine, with 3 solid IM USCE rotations… described like random generic clinical exposure.

You’re allowed to aim.

If you’re applying to IM, your IM USCE description should read like:
“I belong on your internal medicine team.”

But instead, I often see:

“Experienced a variety of clinical cases and presentations, learned to communicate with patients and families, and gained exposure to healthcare delivery in the U.S.”

That can belong to psychiatry, surgery, OB, whatever. No identity.

Make your internal medicine USCE entries scream internal medicine. Small edits are enough:

  • Mention chronic diseases: diabetes, CHF, COPD, CKD
  • Mention core IM tasks: medication reconciliation, discharge planning, diagnostic reasoning for complex presentations
  • Mention team structure: intern, resident, attending, consultants

Example transformation:

Weak:

“Observed inpatient and outpatient cases in internal medicine.”

Stronger:

“In a 4-week inpatient IM observership, followed patients with CHF, COPD, and uncontrolled diabetes from admission through discharge planning. Learned how the IM team coordinates meds, consults, and follow-up to reduce readmissions.”

Now a PD can actually see how this relates to their world.

[Image: Residency program director reviewing ERAS applications with a focus on USCE descriptions]


9. Undermining Themselves with Sloppy Formatting and Language

Yes, language matters. No, they’re not grading you like an English professor. But they are judging professionalism and attention to detail.

I’ve seen:

  • No capitalization
  • Text-message grammar
  • Inconsistent tense: “Observed… assisting… I also present…”
  • Obvious copy-paste jobs with mismatched rotation names

This doesn’t say “non-native English speaker.” It says “I didn’t bother.”

As an IMG, you’re already being evaluated more critically. Unfair? Maybe. Real? Absolutely.

Quick rules:

  • Stick to past tense for completed experiences
  • Use simple sentence structures
  • Avoid long, twisty sentences with multiple clauses
  • Triple check spelling of: hospital names, specialties, EMR systems

And don’t try to sound “fancy.” Clear beats sophisticated every single time.


10. Burying the Most Important Signal: Recency and U.S. System Exposure

Program directors care about recency of USCE. A sharp candidate with 4 months of USCE from 5 years ago will be questioned. Same person with 2 months from last year feels much more reassuring.

You cannot change your timeline, but you can make it legible.

IMGs often forget to:

  • Make the dates clear
  • Highlight that an older USCE still reflects current practice (if it does)
  • Show any ongoing connection to the U.S. system (remote follow-up, research tied to that clinic, etc.)

Do not try to “hide” old dates. It doesn’t work.

Instead, use your description to make it still relevant:

“Completed a 4-week inpatient IM observership in 2021; stayed in contact with supervising attending and later collaborated on a retrospective chart review project based on cases from this rotation.”

Suddenly, that old USCE looks like part of a sustained relationship, not just a one-off trip.

How Program Directors Read Your USCE (flowchart summary)

When a PD sees a USCE entry, they run a quick mental checklist: Is the role clearly defined? Are the responsibilities realistic? Is it recent? Is there evidence of residency-like skills? Entries that pass leave a positive impression. Entries that look like passive shadowing or inflated claims make trust decrease, get scanned again for red flags, and land as neutral or negative. Older experiences get one extra check: is there an ongoing U.S. connection?

Your job is to move the reader toward “evidence of residency-like skills” and “positive impression,” and away from “trust decreases” and “neutral or negative.”


11. Making Every USCE Entry Sound Identical

If all your USCE descriptions sound like copy-paste with minor word changes, you’ve just told the committee:

  • You didn’t think about what each rotation truly added
  • You might be inflating or recycling content
  • You don’t understand that variety of exposure is an asset

I’ve seen ERAS entries where three different IM rotations had nearly the same description. That’s not just lazy. It actively weakens the overall story.

For each USCE entry, ask:

  • What was unique about this experience? Setting? Patient population? Level of involvement?
  • Did I gain something new here I did not mention in another entry?
  • Can I highlight one distinct strength this particular rotation built?

For example:

  • Rotation A: focus on EMR, inpatient workflow, transitions of care
  • Rotation B: focus on outpatient chronic disease management, communication
  • Rotation C: focus on diagnostic reasoning for complex multimorbid patients

Same specialty. Different angles. That creates depth instead of repetition.


Weak vs Strong USCE Description Features

  Aspect                 | Weak Description                | Strong Description
  Role clarity           | Vague, inflated                 | Clearly observer/extern/student
  Specificity            | Generic tasks                   | Concrete, rotation-specific actions
  Language               | Buzzwords or broken English     | Simple, clear, professional
  Relevance to specialty | Non-specific clinical exposure  | Tied to target specialty scenarios
  Evidence of growth     | Just a task list                | Shows feedback, learning, or improvement

12. Copying Friends’ or Agency Templates (Big Silent Killer)

This one’s touchy, but I’ve seen it a lot.

Some USCE “programs” or agencies hand out suggested ERAS descriptions. Or seniors share their “perfect” bullets. So you copy them. Or lightly edit.

Two problems:

  1. Programs start seeing identical or near-identical descriptions from multiple IMGs from the same region or agency. That looks mass-produced.
  2. Those templates are written to sell the agency, not to truthfully and precisely describe you.

If 4 applicants in the same cycle all wrote:

“Actively participated in patient care, performed physical exams, assisted with procedures, and presented to attendings daily”

Someone in that selection meeting is going to raise an eyebrow.

Do not outsource your credibility.

By all means, study good examples. Steal structure. Not sentences. Personalize:

  • Change generic “assisted with procedures” to the exact role you had
  • Name the EMR system, clinic type, or patient population
  • Reflect what you actually did, not what the flyer promised

[Image: International medical graduate comparing an agency template to a personalized ERAS description]


13. Leaving Out the Hidden Gold: Systems and Communication

The most underrated mistake: IMGs forget to show that they understand systems and communication, not just medicine.

Programs worry about IMGs struggling with:

  • EMR navigation
  • Paging systems
  • Interprofessional communication (nurses, case managers, social work)
  • Documentation norms

You don’t need to write an essay on this. You just need to drop a few very specific hints.

Examples:

  • “Learned to page consultants and clarify recommendations during rounds.”
  • “Observed how case managers and social workers coordinated safe discharges for patients with limited resources.”
  • “Practiced writing concise, problem-based notes in Epic under supervision.”

These tiny phrases carry enormous weight. They say:

  • “I’ve seen the machine.”
  • “I know residency is not just about diagnoses.”
  • “I won’t be completely lost on day one.”

Skip this, and you sound like you did clinical tourism, not residency prep.


FAQ

1. Should I put “observership” in the title of the ERAS entry or just in the description?
Don’t play games here. If it was an observership, make that clear in both the experience type and the description. The title can be “Internal Medicine Observership – XYZ Hospital.” If you try to hide it under “Clinical Experience” without specifying your role, many programs will assume you’re inflating and move on.

2. Can I mention that I “performed” histories and physical exams if I wasn’t officially responsible for the patient?
Use precise language. If you took H&Ps under supervision as a learner and they were not official billing encounters, say “practiced focused histories and physical exams under supervision” or “performed practice H&Ps which were reviewed by the attending.” That’s accurate and doesn’t suggest you were acting independently or outside your role.

3. How many tasks should I list for each USCE experience in ERAS?
Three to five well-chosen, specific points (expressed in short sentences) are usually enough. More than that and you risk either repetition or a wall of text that no one reads. Focus on variety: one about clinical interaction, one about systems/EMR, one about communication or presentations, one about learning or feedback.

4. Is it better to combine multiple short USCEs into one ERAS entry or separate them?
If they’re truly distinct (different sites, roles, specialties), separate them. That lets you highlight different skills and settings. If they’re multiple tiny, nearly identical shadowing days in the same clinic, you can reasonably group them, but be honest about the total duration and make the description specific enough that it doesn’t look like padding your hours.


Remember:

  1. Your USCE itself might be solid—but a sloppy, inflated, or generic ERAS description can completely erase its value.
  2. Specific, honest, role-appropriate language beats buzzwords and exaggeration every single time.
  3. Every USCE entry is a quiet audition: prove you understand the U.S. system and can function like a supervised, trustworthy future resident. Don’t throw that away with 510 careless characters.
Related Articles