
Retention of Local Health Workers After International Partnerships: The Data

January 8, 2026
16 minute read

[Image: Local health workers in a low-resource clinic collaborating with international partners]

The global health community keeps repeating the same feel‑good story about “capacity building”. The data on local health worker retention after international partnerships tell a much harsher, more complicated story.

If you look past the glossy partnership brochures and dive into staffing numbers, exit interviews, and payroll data, a consistent pattern emerges: poorly designed international collaborations can destabilize local health workforces, accelerate brain drain, and quietly drain public systems of their best people. Well‑designed ones can do the opposite. The difference is not “good intentions”. It is structure, incentives, and measurement.

Let me walk through what the numbers actually say.


1. What we are really trying to measure

Everyone likes to say “we improved capacity.” That phrase is meaningless unless you tie it to retention.

From a data perspective, “retention of local health workers” is not a feeling; it is a measurable construct built from several metrics:

  • Annual retention rate: proportion of staff who remain in a specific facility or system after 12 months.
  • 3–5 year survival in post: proportion still in the same role (or same district) at 3 or 5 years.
  • Internal vs external migration: movement within the country (e.g., rural → capital city, public → NGO) vs movement abroad.
  • Vacancy and time‑to‑fill: how long critical posts stay empty.
  • Skill‑mix stability: whether the system retains not just raw headcount, but key cadres (nurses, midwives, anesthetists, community health workers).
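
To make these definitions concrete, here is a minimal sketch (in Python) of how the first two metrics could be computed from a facility roster. The roster format, staff IDs, and dates are invented for illustration; a real HRIS export will be messier, but the arithmetic is the same.

```python
from datetime import date

# Hypothetical roster: (staff_id, start_date, exit_date or None if still in post).
roster = [
    ("N-001", date(2019, 3, 1), None),
    ("N-002", date(2019, 3, 1), date(2020, 1, 15)),
    ("N-003", date(2019, 3, 1), date(2023, 6, 30)),
]

def retention_rate(roster, cohort_start, months):
    """Share of staff in post at cohort_start who are still in post `months` later."""
    # Crude month arithmetic: adequate for a tracking sheet, not for payroll.
    cutoff = date(cohort_start.year + (cohort_start.month - 1 + months) // 12,
                  (cohort_start.month - 1 + months) % 12 + 1,
                  cohort_start.day)
    cohort = [r for r in roster
              if r[1] <= cohort_start and (r[2] is None or r[2] > cohort_start)]
    retained = [r for r in cohort if r[2] is None or r[2] >= cutoff]
    return len(retained) / len(cohort) if cohort else None

print("Annual retention rate:", retention_rate(roster, date(2019, 3, 1), 12))    # 2 of 3
print("5-year survival in post:", retention_rate(roster, date(2019, 3, 1), 60))  # 1 of 3
```

Run the same calculation per cadre or per facility and you can compare like with like.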

The question for “international partnerships” is simple: do these metrics improve, worsen, or stay flat relative to comparable settings without such partnerships?

Most published data come from:

  • Academic–hospital partnerships (US/UK/Europe universities with African or Asian teaching hospitals).
  • NGO–MOH collaborations for HIV, TB, or maternal health.
  • Foreign volunteer programs (short‑term missions, long‑term technical assistance).

The evidence is fragmented but quantifiable enough to see patterns.


2. Baseline: retention is already fragile before partnerships arrive

You cannot evaluate impact if you do not know the baseline. Most low‑ and middle‑income countries (LMICs) start from a position of weak retention, especially in rural and underserved areas.

Let’s anchor this with a rough comparative snapshot from multi‑country HRH (human resources for health) reports:

Approximate Annual Retention Rates for Public-Sector Health Workers

  Setting / Cadre                            1-year Retention in Same Facility
  High-income urban hospital (nurses)        85–92%
  High-income rural hospital (nurses)        78–88%
  LMIC urban referral hospital (doctors)     70–85%
  LMIC rural district hospital (doctors)     45–65%
  LMIC community health workers (CHWs)       50–75%

In other words, in many rural LMIC settings, 1 in 3 or even 1 in 2 clinicians will have left a given facility within a year. Not all of them leave the country; many shift to NGOs, private practice, or the capital.

Key baseline drivers (again, data, not anecdotes):

  • Pay differentials: NGO or private salaries often 2–5× public sector for similar work.
  • Workload: Doctor‑to‑population ratios so low that burnout is almost guaranteed.
  • Infrastructure gaps: Lack of equipment, drugs, or electricity makes the work demoralizing.
  • Career ladders: Promotion opportunities heavily concentrated in urban centers or national programs.

Into this environment walks the international partner.


3. What the data show about partnerships and retention

There is no single “effect size” for all partnerships. Impact varies wildly. But if you cluster programs by a few core design features, patterns become clear.

3.1 Training‑heavy partnerships: retention is fragile, but can be improved

Most academic partnerships focus on training: residency programs, nursing schools, subspecialty fellowships.

The big fear: you train people, they leave for higher‑income countries. That happens. But the magnitude is very sensitive to design.

Across several documented physician training initiatives in Africa and Asia, three rough groups emerge:

[Bar chart] Post-training Retention of Locally Trained Specialists

  Partnership design                 Retention (%)
  Weak Bonding / No Incentives       55
  Moderate Incentives                72
  Strong Localization Policies       88

Interpretation from multiple case series and program evaluations:

  • Weak bonding / no incentives: Around 45–60% of newly trained specialists were still in the public sector 5 years after graduation. The rest had left to NGOs, private sector, or abroad.
  • Moderate incentives (some salary top‑ups, partial rural posting policies): 65–75% 5‑year retention in‑country; a chunk still moved to capital cities or better‑resourced hospitals.
  • Strong localization (local accreditation, clear career paths, bonding contracts, strong institutional identity): 80–90%+ remained in‑country, many staying in teaching hospitals or major public facilities.

The data story: training in partnership is not the problem. Training without a binding institutional and financial environment is the problem.

A concrete example pattern I have seen:

  • A new anesthesia residency starts at a national hospital with a European partner.
  • First cohort: 10 residents graduate.
  • Five years later: only 3–5 are still in the original hospital. But 7–8 are still in the country, often in NGOs or private hospitals.
  • Where the program had explicit promotion tracks and reasonable call schedules, the “still in the original hospital” number jumps.

So the nuance: partnerships often improve “in‑country retention of specific skills,” but may simultaneously accelerate internal migration away from the public system that hosted the partnership.

3.2 NGO‑heavy partnerships: retention inside the system often drops

Now look at international NGO collaborations focused on service delivery (HIV, TB, maternal health). This is where the workforce distortion is usually worst.

You can measure it directly by looking at hiring among NGO partners vs local government over time.

Typical pattern over 3–5 years in high‑funding environments:

  • Total health worker stock in a district increases by, say, 20–40%.
  • But public sector health workers decline or stagnate, while NGO‑employed health workers grow rapidly.

A stylized but data‑consistent example:

Illustrative Workforce Changes During a Major NGO Partnership (5-year period)

  Employer Type            Year 0 Staff    Year 5 Staff    % Change
  Ministry of Health       200             190             -5%
  Local private sector     50              60              +20%
  International NGOs       10              90              +800%

Where do those NGO staff come from? Largely from the first column. The data from exit interviews and HR files in several countries show:

  • 40–70% of new NGO clinical hires were previously public sector employees in the same district.
  • Salary differences of 2–4× were common, often with better working hours and more training opportunities.

So from a national perspective, the total number of health workers has increased. From the public system’s perspective, retention has just cratered.
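
To make the two perspectives explicit, here is a minimal sketch that runs the illustrative figures from the table above; the numbers are the made-up table values, not real program data.

```python
# Illustrative staffing figures from the table above (not real program data).
year0 = {"MOH": 200, "private": 50, "NGO": 10}
year5 = {"MOH": 190, "private": 60, "NGO": 90}

total_change = (sum(year5.values()) - sum(year0.values())) / sum(year0.values())
moh_change = (year5["MOH"] - year0["MOH"]) / year0["MOH"]

print(f"District-wide workforce change: {total_change:+.0%}")  # +31%: looks like success
print(f"Public-sector (MOH) change:     {moh_change:+.0%}")    # -5%: the quiet erosion
```

Both statements are true at the same time, which is exactly why partnership reports can celebrate growth while district hospitals bleed staff.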

This is usually invisible in shiny partnership reports because they track “health workers trained” or “health workers employed”, not “health workers retained in the public system that was supposedly being strengthened.”

3.3 Short‑term volunteer partnerships: minimal long‑term retention gains

The data on short‑term missions (2–8 weeks) and volunteer brigades are blunt: they rarely move the needle on local retention, except indirectly and unpredictably.

When researchers have matched facilities with a high volume of foreign teams against similar facilities without them over 3–5 years, the long‑term staffing trends look nearly identical. The only statistically consistent signals are:

  • Slight short‑term morale boosts right after visits.
  • Mild frustration or demotivation when visiting teams perform procedures or use equipment that locals cannot sustain afterward.

In other words: from a retention and workforce stability perspective, the measurable effect is close to zero or occasionally negative, especially if local staff feel overshadowed or bypassed.


4. Specific design features that move retention numbers

The interesting part for you, ethically and professionally, is not just the descriptive statistics. It is which partnership features consistently correlate with better retention outcomes.

4.1 Salary and incentive alignment

The data are boringly clear on this. If international partners pay much more than the local system for the same work, local retention drops.

Programs that take retention seriously use one or more of the following:

  • Salary top‑ups that are:
    • Transparent.
    • Time‑bound.
    • Harmonized with MOH scales instead of inventing their own.
  • Non‑salary incentives: housing, childcare, CME, exam subsidies, conference attendance.
  • Retention bonuses tied to length of service in rural or priority areas.

The win‑rate here is quantifiable. In evaluations where targeted retention packages were introduced:

  • Facilities with no structured incentives: 1‑year rural doctor retention around 45–60%.
  • Facilities with a defined retention scheme (even if modest): 1‑year retention jumps into the 65–80% band.

[Horizontal bar chart] Estimated 1-year Rural Doctor Retention by Incentive Level

  Incentive level                    Retention (%)
  No Incentives                      55
  Modest Non-salary Incentives       68
  Salary + Non-salary Package        78

The ethical implication for you as a future or current partner: if your “capacity building” project ignores pay structures, you are not actually serious about retention. You are hoping culture and gratitude will override economics. They will not.

4.2 Local accreditation and career ladders

Another strong predictor: whether training and roles are embedded in local systems.

Programs that:

  • Create or strengthen local specialty boards.
  • Ensure that new credentials are recognized for promotion.
  • Support local leadership (department heads, deans, program directors).

…tend to see much higher in‑country and in‑institution retention of trained staff.

I have seen the opposite scenario play out too many times: colleagues complete a partnership‑branded “advanced diploma” that is not fully integrated into national HR policies. Three years later, they realize it does nothing for their grade or salary. They leave for an NGO that rewards those skills informally with higher pay, even though the certificate means nothing on paper.

4.3 Distribution of training opportunities

Another subtle but powerful factor: who gets access to international fellowships and courses.

Two patterns:

  • Concentrated opportunity (same small inner circle always chosen):
    • Increases resentment.
    • Often correlates with higher attrition among those excluded.
  • Broad, transparent selection with clear criteria:
    • Reduces perceptions of favoritism.
    • Correlates with better team cohesion and slightly higher retention.

Even simple tracking shows this:

  • Teams where ≥30–40% of staff have received some recognized training (short course, workshop, or placement) in a 3–4‑year period tend to have lower attrition than teams with a small elite repeatedly sponsored abroad.

It is not just the absolute number of training slots. It is the distribution and fairness.
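
Tracking this does not require anything sophisticated. Here is a minimal sketch; the staff IDs and log format are invented for illustration, and the idea is simply to measure breadth of access and concentration as two separate numbers.

```python
from collections import Counter

# Hypothetical training log: one entry per sponsored slot over a 3-4 year window.
training_log = ["S01", "S02", "S01", "S03", "S01", "S02"]
team_size = 15

coverage = len(set(training_log)) / team_size                         # breadth of access
top_share = max(Counter(training_log).values()) / len(training_log)   # concentration

print(f"{coverage:.0%} of the team received some recognized training")     # 20%
print(f"The most-sponsored person absorbed {top_share:.0%} of all slots")  # 50%
```

A team with 20% coverage and one person holding half the slots is the “small inner circle” pattern, regardless of how many training certificates the partnership report counts.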


5. Ethics: what the numbers say you should care about

This is a “personal development and medical ethics” topic, not just a workforce planning problem. So let’s connect the dots.

5.1 Capacity building without retention is ethically hollow

If your partnership increases the number of highly trained local clinicians but half of them exit the public system or leave the country within a few years, you have not “strengthened” the system. You have subsidized a labor pipeline for NGOs and high‑income countries.

Some people will argue that as long as individuals benefit, it is fine. That argument dies as soon as you look at where the morbidity and mortality burden sits: in the same under‑resourced communities whose public facilities lose staff.

Ethically solid partnerships:

  • Track retention in the host institution and public sector, not just numbers trained.
  • Define success partly as “X% of trained staff still practicing in the public system at 5 years.”

If you are not measuring that, you are operating in ethical fog.

5.2 The “dual loyalty” problem for local staff

Internationally funded projects often create tension for local clinicians:

  • Loyalty to patients and public service vs loyalty to better‑paid, better‑equipped NGO or research projects.
  • Time “bought out” for research or vertical programs vs core clinical service.

Over time, this shows up in daily schedules. I have seen outpatient departments where the only doctor is “in a workshop” half the week, funded by a partner project, while the waiting room is overflowing.

The data manifestation:

  • Rising outpatient waiting times and unfilled appointment slots.
  • Simultaneously, high completion rates for research protocols or program indicators linked to donor money.

From an ethics standpoint, you as a visiting trainee or partner are part of that system. Pretending you are not complicit because you are “only there for 4 weeks” is naive.

5.3 Short‑term visits with long‑term consequences

Short‑term volunteers often say, “I am only helping; I cannot affect retention.” The numbers disagree.

Three specific ways short‑termers influence retention:

  1. Norms and expectations. If visiting teams routinely bypass local staff, do procedures without meaningful skill transfer, or use equipment that will disappear with them, they reinforce the message: “real medicine happens when foreigners are here.” That corrodes professional pride and long‑term motivation.

  2. Hidden workload. Local nurses and junior doctors often carry the logistical and translation burden for visitors, on top of their own work. Over time, that additional cognitive and emotional load fuels burnout.

  3. Distorted patient preferences. Patients learn to wait for “the foreigners” for certain services. Local staff become second‑choice providers. That is not motivational.

Individually, your 2‑week visit is a drop. Collectively, hundreds of such visits per year create a signal in the system. The sustainability data show that.


6. What ethically serious partners do differently (backed by retention data)

If you want to be on the right side of this, you need to align your individual behavior and your institutional choices with what we know helps retention, not harms it.

Let me translate that into concrete actions.

6.1 Ask for retention data upfront

Before joining or designing a partnership, ask explicit, slightly uncomfortable questions:

  • “Of the staff trained through this program in the last 5–10 years, what percentage are still:
    • in this institution?
    • in the public sector?
    • in‑country?”
  • “How are we tracking those numbers?”

If the answer is vague hand‑waving, that is a red flag. Serious programs often have simple tracking spreadsheets or HR reports. They may not be perfect, but they exist.
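
A tracking sheet does not need to be more than a CSV with a status column. As a rough illustration (the column names and status labels below are assumptions, not a standard), the three percentages fall out in a few lines:

```python
import csv, io

# Hypothetical tracking sheet: one row per graduate of the partnership program.
sheet = io.StringIO("""name,graduation_year,current_status
A. Nurse,2018,same_institution
B. Doctor,2018,public_sector_other
C. Doctor,2019,ngo_in_country
D. Nurse,2020,abroad
""")

rows = list(csv.DictReader(sheet))

def share(statuses):
    """Fraction of graduates whose current status falls in the given set."""
    return sum(r["current_status"] in statuses for r in rows) / len(rows)

in_institution = share({"same_institution"})
in_public      = share({"same_institution", "public_sector_other"})
in_country     = share({"same_institution", "public_sector_other", "ngo_in_country"})

print(f"Still in this institution:  {in_institution:.0%}")
print(f"Still in the public sector: {in_public:.0%}")
print(f"Still in-country:           {in_country:.0%}")
```

If a program cannot produce even this, the retention question has never seriously been asked.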

6.2 Push for alignment on pay and incentives

You may not control donor budgets, but you can refuse to ignore the obvious:

  • Advocate for harmonized salary scales where possible.
  • If differentials are unavoidable (they often are), push for:
    • Clear justification.
    • Time limits.
    • Complementary support to the MOH (e.g., financing more public positions, not just NGO ones).

From the actual numbers, even modest, fairly structured incentives do more for rural retention than occasional, flashy perks (like a single overseas conference trip).

6.3 Design training that locks into local systems

When you propose a module, fellowship, or workshop, think about:

  • Will this credential be recognized by the local HR and promotion system?
  • Are you co‑designing curricula with local faculty who will stay when you leave?
  • Are you building exam, evaluation, and accreditation capacity locally?

Retention data consistently show that when local institutions own and accredit training, graduates stick around at far higher rates than in externally controlled, brand‑name programs.

6.4 Behave on the ground as if you care about local careers

On a micro‑level, everyday behavior feeds into long‑term retention:

  • Make sure local staff are visibly leading rounds, clinics, and teaching sessions.
  • Avoid doing procedures without a local counterpart unless there is no alternative.
  • Share authorship fairly on research, prioritizing local first or senior authorship where they meet the criteria.

These are not abstract “ethics checkboxes.” They correlate with whether talented local clinicians feel they have a future in their own systems or need to leave to be taken seriously.


7. Practical decision framework for you

To close the loop, here is a simple way to think about your involvement in any global health partnership, as a student, resident, or faculty member.

[Flowchart] Ethical Partnership Engagement Decision Flow

  • Decision points: Retention Data Available?; Retention Stable or Improved?; High-Risk for Harm?
  • Actions: Considering Partnership; Review Trends 5+ Years; Advocate for Tracking; Engage and Support; Identify Design Flaws; Address Pay and Training Issues; Limit Role or Redesign
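
If it helps to see the flow as logic, here is one possible reading of the diagram in code; the branch order is my interpretation of the nodes above, not a definitive protocol.

```python
def partnership_decision(retention_data_available: bool,
                         retention_stable_or_improved: bool,
                         high_risk_for_harm: bool) -> str:
    """One reading of the decision flow above (an interpretation, not the source diagram)."""
    if not retention_data_available:
        return "Advocate for tracking before committing further"
    # Data exist: review trends over 5+ years.
    if retention_stable_or_improved:
        return "Engage and support"
    # Retention is worsening: identify the design flaws first.
    if high_risk_for_harm:
        return "Limit your role or push for redesign"
    return "Address pay, accreditation, and training issues"

print(partnership_decision(retention_data_available=True,
                           retention_stable_or_improved=False,
                           high_risk_for_harm=False))
```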

If you cannot get even basic retention numbers and nobody seems bothered by that, assume you are walking into a system that is not serious about ethical workforce strengthening.


FAQs

1. How can a medical student realistically influence retention in a large partnership?
You will not rewrite the HR policy, but you can influence culture and transparency. Ask about retention data in pre‑departure briefings. Choose electives and programs where local faculty lead and where there is visible investment in local training, not just foreign trainee experiences. Your demand as a learner shapes what institutions prioritize.

2. Are bonding contracts for trained staff ethical, or are they coercive?
The data show that structured bonding (e.g., 3–5 years of service in exchange for funded training) combined with fair pay can substantially improve retention without dramatic spikes in dissatisfaction. Bonding becomes coercive when it is imposed without clear benefit or with punitive, unrealistic penalties. Well‑designed bonds are negotiated, transparent, and paired with decent working conditions.

3. Is it always bad if local health workers leave the public sector for NGOs after partnerships?
Not always, but it depends on net system effects. If NGOs are filling genuine service gaps in alignment with national plans, some movement is acceptable. The problem is when NGO growth cannibalizes public services, leaving government facilities hollowed out. The metric to watch is not “NGO headcount” but vacancy and stability in critical public posts.

4. What single change has the biggest positive effect on rural retention?
No silver bullet, but packages that combine modest salary enhancements, reliable housing, safe working conditions, and predictable career progression consistently outperform isolated interventions. If you had to pick one lever that is often ignored, it would be clear, enforced promotion pathways for rural staff who meet competency milestones.

5. How do you ethically justify participating in a short-term clinical mission?
You justify it by being honest about what it does and does not accomplish, and by choosing programs that integrate with local systems. Ethically stronger missions: work under local leadership, have explicit teaching and skills transfer goals, avoid parallel systems, and at least attempt to track longer‑term workforce and service outcomes. If a mission is built entirely around foreign experience with no local capacity or retention metrics, you should question your participation.


Three take‑home points. First, retention is the only honest test of whether “capacity building” did more than create good conference slides. Second, the data show that partnership design—especially around pay, accreditation, and leadership—predicts whether workers stay or leave. Third, as an individual, you are not powerless; your choice of programs, questions, and behavior can nudge partnerships toward models that actually keep local health workers in the systems that need them most.
