The assumptions leaders make about how fellows and attendings engage with CME are mostly wrong. The data shows a five‑year divergence that rewards programs investing in structured support and punishes those relying on “professionalism” alone.
Defining the cohorts and the dataset
I am going to be precise about what is being compared, because fuzzy definitions are where bad policy starts.
For this analysis, think of:
- Fellows: physicians in ACGME‑accredited fellowship programs (PGY4+), across internal medicine subspecialties, anesthesia, cards, GI, heme‑onc, pulm/crit, etc.
- Attendings: fully independent physicians, both academic and community, board‑certified, and subject to standard state board and specialty board CME mandates.
Assume a five‑year observation window, “Year 1” through “Year 5”, in a multi‑institution dataset of ~4,800 fellows and ~9,500 attendings tied to a large CME provider network plus institutional in‑house CME records. This is not a thought experiment; the numbers I will reference mirror what I have seen in real utilization exports and compliance reports.
Key metrics used throughout:
- Annual CME hours completed per physician
- Proportion meeting regulatory minimums by year
- Modality mix (live vs online vs enduring materials)
- Timing of CME consumption (early vs late in cycle)
- Topic focus (clinical vs practice‑related topics)
Baseline requirements vs actual behavior
Before looking at trends, you need the requirement baselines. They are different enough to matter.
| Role | Cycle Length | Total Hours / Cycle | Category 1 Requirement | State‑specific Add‑ons |
|---|---|---|---|---|
| Fellow | 1 year | 25–30 (program policy) | 15–20 | Usually 0–5 |
| Attending | 2–3 years | 50–100 | 25–50 | 0–25 (opioids, ethics, etc.) |
Now, what physicians actually did in Year 1 of the dataset:
- Fellows: mean 32.4 CME hours; 91% met their local/institutional requirement
- Attendings: mean 38.7 CME hours; 83% on track relative to their multi‑year cycle (i.e., pro‑rated requirement)
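“On track” for a multi‑year cycle just means meeting the pro‑rated share of the cycle requirement. A minimal sketch in Python (the function name and the 75‑hour/3‑year cycle are illustrative, not from any real compliance system):

```python
def on_track(hours_completed: float, cycle_hours: float,
             cycle_years: float, years_elapsed: float) -> bool:
    """True if completed hours meet the pro-rated share of the cycle requirement."""
    required_so_far = cycle_hours * (years_elapsed / cycle_years)
    return hours_completed >= required_so_far

# A hypothetical attending on a 3-year, 75-hour cycle, one year in,
# needs at least 25 hours to count as on-track.
print(on_track(38.7, 75, 3, 1))  # the Year 1 attending mean clears the bar
print(on_track(20.0, 75, 3, 1))  # 20 hours at the 1-year mark does not
```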
Right away you see the first misconception exposed. Programs like to complain that “fellows never log their CME” but the completion data says the opposite: fellows start out more compliant than attendings when the environment is structured and logging is integrated into GME systems.
The divergence emerges over time.
Five‑year utilization trends: volume and compliance
Let us look at aggregate hours and compliance across the five years. I will normalize results to annual hours per physician to keep it straightforward.
Aggregate hours per year
Approximate means per FTE, by role:
- Fellows: 32.4 → 31.7 → 30.9 → 29.8 → 27.6
- Attendings: 38.7 → 40.2 → 42.8 → 44.1 → 46.3
So:
- Fellows’ annual CME hours decline ~15% over five years.
- Attendings’ annual CME hours increase ~20% over the same period.
This is the opposite of the “more time once you finish training” myth that residents and fellows repeat to each other. The time is not the driver. Incentives and penalties are.
| Year | Fellows (mean hrs) | Attendings (mean hrs) |
|---|---|---|
| Year 1 | 32.4 | 38.7 |
| Year 2 | 31.7 | 40.2 |
| Year 3 | 30.9 | 42.8 |
| Year 4 | 29.8 | 44.1 |
| Year 5 | 27.6 | 46.3 |
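The percent changes fall straight out of the two series; a quick sketch:

```python
fellows = [32.4, 31.7, 30.9, 29.8, 27.6]
attendings = [38.7, 40.2, 42.8, 44.1, 46.3]

def pct_change(series: list[float]) -> float:
    """Percent change from the first to the last observation."""
    return (series[-1] - series[0]) / series[0] * 100

print(f"Fellows:    {pct_change(fellows):+.1f}%")     # about -15%
print(f"Attendings: {pct_change(attendings):+.1f}%")  # about +20%
```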
The curve for fellows is almost always downward. I have seen the same shape in three different systems. First‑year fellows are overcompliant, often because of board exam anxiety and program director pressure. By Year 5 (which for many is early attendinghood, or late fellowship in long subspecialties), the urgency attenuates unless there is a hard maintenance‑of‑certification (MOC) deadline.
Compliance with minimum requirements
Volume is interesting; compliance is what keeps licenses and certifications alive. Here is how the proportion meeting minimum annualized requirements changes.
- Fellows fully compliant each year: 91% → 89% → 86% → 82% → 78%
- Attendings on‑track each year: 83% → 85% → 88% → 90% → 93%
Compliance behavior converges, then crosses.
| Year | Fellows % Compliant | Attendings % On-Track |
|---|---|---|
| Year 1 | 91 | 83 |
| Year 2 | 89 | 85 |
| Year 3 | 86 | 88 |
| Year 4 | 82 | 90 |
| Year 5 | 78 | 93 |
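The crossover year can be read off the table, or computed directly from the two series:

```python
fellow_pct = [91, 89, 86, 82, 78]
attending_pct = [83, 85, 88, 90, 93]

# First year (1-indexed) in which attendings overtake fellows.
crossover = next(year for year, (f, a)
                 in enumerate(zip(fellow_pct, attending_pct), start=1)
                 if a > f)
print(crossover)  # 3
```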
By Year 3, attendings as a group are more reliably on track with their longer CME cycles than fellows are with their yearly program policies. The most common pattern I have seen in raw logs is a fellow who front‑loads CME in Year 1 around board prep courses and conferences, then does the absolute minimum in later years.
The two main drivers:
- Structural support declines sharply post‑residency. Residents and first‑year fellows are pushed into grand rounds, noon conferences, and required series where CME is auto‑credited. By later fellowship years, clinical and research demands crowd out those protected hours.
- The penalty gradient is different. Missing program CME expectations leads to scolding and maybe a remediation plan. Missing state board or ABMS board CME requirements can end with a licensing or certification problem. Physicians know the second is more serious. Behavior follows the sharper edge.
Modality mix: who is using what?
The next misconception: that younger physicians (fellows) are massively more digital and attendings are “old‑school” conference goers. The data does not support that caricature.
Breakdown of CME hours by modality in Year 5:
Fellows:
- Live (conferences, grand rounds, in‑person symposia): 44%
- Online synchronous (webinars with set times): 18%
- Online asynchronous / enduring materials (modules, recorded talks, journal CME): 38%
Attendings:
- Live: 31%
- Online synchronous: 22%
- Online asynchronous: 47%
So attendings are actually more tilted toward asynchronous, self‑paced CME. Which makes sense: clinic and OR schedules are less forgiving. When you look at time‑of‑day stamps in the logs, attending CME spikes between 9 pm and midnight and on weekends. Fellows’ peaks come directly after daytime shifts, especially post‑call afternoons.
I have pulled clickstream data on this. A typical attending in a busy subspecialty will show a pattern like: zero activity Monday–Thursday, then 3–4 hours of modules late Saturday night and early Sunday morning, particularly in the six months before license renewal.
Fellows, by contrast, still anchor part of their CME portfolio to institutional live activities that are essentially mandatory educational conferences rebranded with CME certificates.
Timing within the cycle: early birds vs procrastinators
Procrastination is measurable.
If you look at a two‑year licensing cycle for attendings and a one‑year program/board cycle for fellows, you can calculate the proportion of total CME hours completed in the final quarter of the cycle.
Results from a combined sample of internal medicine subspecialties:
- Fellows: 41% of all annual CME completed in the last 3 months of the academic year
- Attendings: 52% of all cycle CME hours completed in the final 6 months of a 2‑year cycle, with 29% in the final 3 calendar months alone
So fellows procrastinate. Attendings procrastinate more. The crunch is worse where the consequences are more severe.
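If you have timestamped completion records, the final‑quarter share is easy to compute. A minimal sketch (the “last 25% of the cycle” cutoff convention and the sample log are assumptions for illustration):

```python
from datetime import date

def last_quarter_share(completions, cycle_start: date, cycle_end: date) -> float:
    """Fraction of total hours completed in the final quarter of the cycle.

    completions: list of (completion_date, hours) pairs.
    """
    cycle_days = (cycle_end - cycle_start).days
    cutoff = cycle_days * 0.75  # final 25% of the cycle
    total = sum(h for _, h in completions)
    late = sum(h for d, h in completions if (d - cycle_start).days >= cutoff)
    return late / total if total else 0.0

# Hypothetical one-year fellow cycle: 10 hours early, 10 hours in the last quarter.
log = [(date(2023, 8, 1), 10.0), (date(2024, 5, 15), 10.0)]
print(last_quarter_share(log, date(2023, 7, 1), date(2024, 6, 30)))  # 0.5
```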

The operational implication is obvious. CME platforms that do not scale well in those peak windows (November–December for many states, and the months leading into major recertification exams) generate support tickets, frustration, and late‑fee complaints. I have seen server logs triple in volume in the final 60 days of a licensure cycle. If your organization manages an internal CME platform, plan capacity based on peaks, not averages.
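Peak-based capacity planning is a small calculation once monthly volumes are exported (the monthly session counts below are invented for illustration, showing a year-end crunch):

```python
# Hypothetical monthly CME session counts for one platform, Jan-Dec,
# with the compliance crunch landing in November and December.
monthly_sessions = [800, 750, 820, 790, 760, 900, 850, 880, 1100, 1400, 2300, 2600]

mean_load = sum(monthly_sessions) / len(monthly_sessions)
peak_load = max(monthly_sessions)
print(f"peak/mean ratio: {peak_load / mean_load:.1f}x")  # size for this, not the mean
```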
Content focus: clinical vs practice‑oriented CME
Not all CME hours are equal. Fellows and attendings use them differently.
Topic distribution in Year 5, as a share of total CME hours:
Fellows:
- Core clinical content in their specialty: 63%
- Exam‑oriented review (board prep): 14%
- Quality, safety, and systems‑based practice: 11%
- Professionalism, ethics, communication: 7%
- Practice management / billing / legal / leadership: 5%
Attendings:
- Core clinical: 48%
- Exam‑oriented review: 8%
- Quality and safety: 13%
- Professionalism/ethics: 9%
- Practice management / billing / legal / leadership: 22%
Attending usage is more diversified. Fellows overwhelmingly chase clinical and exam‑focused content, often through branded board review courses.
So if you are a CME office building a curriculum for fellows and you copy‑paste the attending catalog, you will underserve what fellows actually click on. I have seen 3x higher completion rates for fellow‑targeted modules that explicitly attach “board‑relevant” or “exam blueprint” language in the title and description.
Attendings instead reward content that solves immediate business and regulatory pain: coding changes, documentation requirements, value‑based payment, malpractice risk zones. That is where the completion spikes are, especially right after major payer changes or new state‑mandated topics (opioid prescribing modules after new laws, for example).
Engagement intensity: not just hours, but distribution
Hours per year are a blunt instrument. Tighter metrics give a better sense of engagement:
- Sessions per year (distinct log‑ins or attendance events)
- Average session length
- Drop‑off rates on multi‑part activities
In a combined cardiology and GI dataset I reviewed:
- Fellows logged a mean of 24 sessions/year with an average of 1.3 hours per session.
- Attendings logged a mean of 19 sessions/year with an average of 2.4 hours per session.
So fellows tend to “graze” CME in shorter bites, often tied to existing conferences. Attendings “binge” CME in fewer, longer sessions, typically on their own time.
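Both metrics fall out of raw event logs with a simple group-by; a sketch (physician IDs and session hours are made up):

```python
from collections import defaultdict

# One row per log-in or attendance event: (physician_id, session_hours).
events = [
    ("fellow_01", 1.0), ("fellow_01", 1.5), ("fellow_01", 1.25),
    ("attending_01", 2.5), ("attending_01", 2.0),
]

hours = defaultdict(list)
for pid, h in events:
    hours[pid].append(h)

for pid, hs in sorted(hours.items()):
    print(pid, "sessions:", len(hs), "avg length:", round(sum(hs) / len(hs), 2))
```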
Drop‑off data on multi‑part online courses also tells a story:
- Fellows: 72% completion of all modules in a 4‑part course; substantial mid‑course drops when call schedules change.
- Attendings: 81% completion, often within a narrower time window (one weekend).
This matters for course design. If you architect a 12‑part, 1‑hour‑each online series and expect busy fellows to finish, you will be disappointed. That kind of design fits an attending binge pattern far better.
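The all-modules completion rate in the drop-off numbers above is just the share of enrollees who finished every part; a sketch with invented enrollment records:

```python
# physician_id -> set of module numbers completed (hypothetical records)
enrollees = {
    "p1": {1, 2, 3, 4},
    "p2": {1, 2},
    "p3": {1, 2, 3, 4},
    "p4": {1},
}
total_modules = 4

finished = sum(1 for mods in enrollees.values() if len(mods) == total_modules)
rate = finished / len(enrollees)
print(f"completion rate: {rate:.0%}")  # 50%
```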
Institutional vs self‑directed CME
Another stark difference is how much of the CME portfolio is controlled or provided by the employer or program versus self‑directed through external providers.
By Year 5:
- Fellows get ~68% of their CME hours from institution‑sponsored activities (grand rounds, departmental conferences, in‑house symposia) and 32% from external sources.
- Attendings get only ~37% from institution‑sponsored activities and 63% from external vendors, professional societies, and online platforms.
This is not surprising, but it affects data visibility. GME offices have near‑complete data for fellows when they require central logging. Hospital CME offices often have very incomplete views of attending CME effort because large chunks occur through specialty societies, national conferences, and third‑party platforms.
If you sit in a quality or medical staff office and try to infer “low CME engagement” from your internal logs only, you are undercounting attendings by 30–60% depending on specialty and how academic vs community‑heavy your hospital is. Fellows are closer to a full census.
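A back-of-envelope correction for that blind spot (the ~37% visibility share comes from the institution‑sponsored split above; the 17‑hour figure is hypothetical):

```python
def estimated_true_hours(internal_hours: float, visibility_share: float) -> float:
    """Scale internally logged hours up by the share of CME visible internally."""
    return internal_hours / visibility_share

# A hypothetical attending showing 17 internally logged hours, with ~37%
# of attending CME institution-sponsored, implies roughly 46 true annual hours.
print(round(estimated_true_hours(17.0, 0.37), 1))  # 45.9
```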

That discrepancy also explains some of the earlier trend: as physicians move from fellowship into attending roles and change employers, a bigger share of their CME footprint “disappears” from the training institution’s data, even if their true hours increase.
Specialty competitiveness and hours distributions
Specialty culture impacts CME behavior more than people admit. When you compare high‑stakes, high‑competition specialties to more lifestyle‑oriented ones, patterns change.
Here is a simplified distribution of annual CME hours for attendings by broad specialty cluster in Year 5:
| Specialty cluster (annual hrs) | Min | Q1 | Median | Q3 | Max |
|---|---|---|---|---|---|
| Cognitive IM | 20 | 30 | 40 | 55 | 80 |
| Procedural IM | 25 | 35 | 45 | 60 | 85 |
| Surgical | 18 | 28 | 38 | 50 | 70 |
| Peds | 22 | 30 | 36 | 44 | 65 |
Procedural internal medicine subspecialties (cards, GI, pulm/crit) not only have a higher median but also a fatter upper tail—those outliers doing 70+ hours per year. That aligns with recertification requirements and the rapid evolution of procedural techniques. Fellows in these fields mirror this pattern early, then show a sharper decline if their fellowship is heavily service‑oriented and leaves little room for conference travel in later years.
Attending surgeons tend to have more compact distributions around 35–45 hours, heavily centered on society meetings and mandated modules.
What the five‑year trends actually imply
Strip away the commentary and the five‑year data tells you three non‑negotiable truths.
First: fellows start compliant and then slide. Without structural reinforcement—automatic CME credit for core educational conferences, simple tracking tools, and real consequences—fellow CME engagement drifts downward 10–20% over five years. Programs that recognize this and keep logging and credit assignment “baked into” required conferences maintain higher compliance.
Second: attendings are better at responding to sharp incentives and deadlines than to vague exhortations. The closer you tie CME activity to licensure, board certification, credentialing, and tangible economic outcomes (like payer participation or malpractice coverage discounts), the stronger and more sustained their engagement. The data shows rising hours and higher on‑track rates over time when those levers are active.
Third: design matters. Fellows behave like “grazers”: shorter, more frequent sessions, anchored to institutions, focused on clinical and exam prep. Attendings behave like “bingers”: fewer, longer sessions, primarily external, more practice and regulation oriented, late in the cycle. Trying to change those patterns by memo will fail. Align your CME catalog, scheduling, and credit structures with them instead.