Residency Advisor

How to Build a Simple, IRB‑Friendly Survey Study as a Premed

December 31, 2025
19‑minute read

[Image: Premed student designing an IRB‑friendly survey study]

The biggest mistake premeds make with survey research is treating it like a casual Google Form instead of a regulated human-subjects study.

You can absolutely build a clean, IRB‑friendly survey as a premed. You just need to think like a junior investigator, not like a student doing a quick class project.

(See also: Basic Biostatistics for Student Researchers for essential statistical concepts.)

Let me walk you through exactly how to do that.


Step 1: Choose a Question That Actually Works as a Survey

Not every idea belongs in a survey study. Many good ideas die here because students pick questions that require chart review, lab work, or long-term follow-up and then try to force them into a 10‑item questionnaire.

You want a research question where:

  1. The key variables are attitudes, knowledge, behaviors, or preferences
  2. Participants can reasonably self‑report those variables
  3. The answers will not require clinical verification to be meaningful

Concrete examples that work well for premed survey research:

  • “What proportion of undergraduate premed students at [your university] feel comfortable discussing mental health concerns with their premed advisor?”
  • “How do medical students at [institution] perceive the usefulness of near‑peer teaching in anatomy?”
  • “What is the level of awareness of HPV vaccination recommendations among college students aged 18–24 at [your university]?”

And examples that are poor choices for a simple survey:

  • “How effective is Drug X at lowering A1C in diabetic patients?” (needs clinical data, not self‑report)
  • “What are the predictors of mortality in ICU patients?” (chart review, complex stats)
  • “What are the genetic determinants of asthma severity?” (lab/genetic work, not surveys)

A simple test: if your entire dataset can be captured by asking someone questions once, with no labs, follow‑up, or chart reviews, you are in survey territory.

For IRB friendliness, focus on:

  • Minimal risk topics (general wellness, academic attitudes, career interests, non‑stigmatized behaviors)
  • Populations you can reasonably access as a premed (your classmates, campus organizations, online student communities, premed clubs)
  • Narrow scope: one main outcome, 1–3 predictors, not 30 variables you hope to correlate

Write your question in the PICO structure where possible (even for a survey):

  • Population: who exactly? (e.g., “undergraduate premeds at [X]”)
  • Interest/Exposure: what factor? (e.g., receiving mentorship, attending a workshop)
  • Comparator: sometimes none in simple surveys, or “those who did vs. did not”
  • Outcome: what are you measuring? (e.g., confidence, knowledge score, intention to vaccinate)

Example final question:
“Among undergraduate premed students at University X, is participation in a structured premed mentoring program associated with higher self‑reported confidence in applying to medical school?”

That is clean. That is measurable. That is survey‑appropriate.


Step 2: Anchor Yourself to a Faculty Mentor and an IRB

Premeds often try to “DIY” the IRB process. That is a shortcut to frustration.

You need three anchors:

  1. A faculty mentor with research experience
  2. An institutional home (your university, or a hospital/center willing to sponsor the project)
  3. The institutional IRB guidelines and templates

How to find a mentor for a simple survey study

Target faculty who:

  • Already work in medical education, public health, behavioral science, or student wellness
  • Have existing interest in the population you want to survey (premeds, med students, patients with a specific condition)
  • Are at your institution and have experience as PI or co‑investigator

Email template structure (keep it tight):

  • 1–2 sentences: Who you are
  • 1–2 sentences: Your interest area and very short idea
  • 1–2 sentences: What you are asking for (guidance on a small, survey‑based study under their supervision)
  • 1 sentence: Offer flexibility and willingness to do the legwork

You are not asking them to design your study from scratch. You are inviting them to be PI or co‑PI on a tightly scoped, student‑driven project.

Understand your IRB environment

Look up:

  • “[Your University] IRB”
  • Download:
    • Student research guidelines
    • Template informed consent for surveys
    • Template protocol for exempt or expedited review
    • Any “quality improvement vs. research” guidance

Most minimal‑risk survey studies of adults fall into exempt or expedited review categories in the U.S., particularly if:

  • You are not collecting identifiers, or only minimal identifiers
  • The topics are not highly sensitive (no detailed illegal behaviors, trauma narratives, etc.)
  • The sampling is non‑vulnerable adults (not prisoners, children, etc.)

Never assume exemption yourself. The IRB determines this. Your job is to design a study that could plausibly be exempt or expedited by:

  • Minimizing risk
  • Limiting identifiers
  • Avoiding vulnerable populations unless supervised by experienced investigators

Your mentor should be the PI on the protocol; as a premed, you are usually “student investigator” or similar.


Step 3: Translate Your Question into Variables and a Clean Survey Design

You now need to break your question into measurable components.

Take the example:
“Among undergraduate premed students at University X, is participation in a structured premed mentoring program associated with higher self‑reported confidence in applying to medical school?”

Key components:

  • Population: undergraduate premed students at University X
  • Main predictor (independent variable): participation in mentoring program (yes/no; maybe also frequency)
  • Outcome (dependent variable): confidence level in applying
  • Covariates (potential confounders): year in school, GPA range, first‑generation status, prior exposure to mentors, etc.

Then pick the survey architecture:

  • Cross‑sectional (single time point)
    • This is what you want as a premed. It is simpler, IRB friendly, and much easier to execute.
  • Anonymous or confidential
    • Anonymous: you collect no identifiable info at all
    • Confidential: you collect identifiers but protect them and separate them from data

For IRB friendliness, if you do not need to re‑contact people or link them to other data sources, design it as anonymous.

Concrete moves:

  • Do not ask for name, student ID, email, phone, exact date of birth, or specific small subgroups that could re‑identify someone.
  • Use broad categories (e.g., age ranges, GPA ranges) rather than precise numbers if feasible.
  • If you plan incentives, use a separate link/form to collect emails for a raffle, not within the survey itself.

At this stage, outline:

  1. Demographics/covariates: 5–10 questions (age range, gender, year in school, etc.)
  2. Main predictor(s): 3–5 questions
  3. Outcome(s): 5–10 questions
  4. Optional: open‑ended “anything else to share?” question (1 item max)

Aim for 5–10 minutes total survey length. Shorter surveys reduce drop‑off and are easier to defend as minimal burden in your IRB application.


Step 4: Build IRB‑Friendly Questions and Measures

Now you transform those variables into actual survey items.

Principles of good, IRB‑friendly survey items

  • Avoid coercive or shaming language
  • Avoid leading questions (“Don’t you think…?”)
  • Avoid double‑barreled questions (“How satisfied are you with premed advising and financial aid support?”)
  • Use established scales when possible
  • Provide “Prefer not to answer” for sensitive items

Use validated instruments when possible

IRBs are more comfortable with instruments that are:

  • Previously used in published literature
  • Clearly referenced
  • Not obviously biased or untested

Examples:

  • For mental health: PHQ‑2, PHQ‑9, GAD‑7 (these may trigger more IRB safety questions, but they are validated)
  • For burnout: Maslach Burnout Inventory subscales or shorter derivatives (many have copyright implications—check usage rules)
  • For self‑efficacy or confidence: Likert‑scale items modeled on published educational research

If you create your own questions:

  • Use clear Likert scales (e.g., 1–5: Strongly disagree to Strongly agree)
  • Standardize your scales across the survey to reduce confusion
  • Avoid excessive free‑text questions; they are harder to analyze and can inadvertently collect identifiable info

Example item set for “confidence in applying”:

Please indicate how strongly you agree or disagree with the following statements:

  • I understand the overall timeline for applying to medical school.
  • I feel confident that I can prepare a competitive medical school application.
  • I know where to seek advice if I have questions about the application process.
  • I feel that I belong in a future medical school environment.

Response options (5‑point Likert):

  1. Strongly disagree
  2. Disagree
  3. Neither agree nor disagree
  4. Agree
  5. Strongly agree

You can then create a composite “confidence score” by summing or averaging these items.
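A composite like this takes only a few lines to compute once responses are exported. Here is a minimal sketch in pandas; the column names and the 1–5 coding are assumptions for illustration, not part of any specific platform's export format.

```python
# Sketch: computing a composite "confidence score" from four Likert items.
# Column names (hypothetical) map to the four statements above, coded 1-5.
import pandas as pd

responses = pd.DataFrame({
    "q1_timeline":    [4, 5, 3],
    "q2_competitive": [3, 4, 2],
    "q3_advice":      [5, 5, 4],
    "q4_belonging":   [4, 3, 3],
})

items = ["q1_timeline", "q2_competitive", "q3_advice", "q4_belonging"]
# Averaging keeps the composite on the original 1-5 scale;
# summing (range 4-20) works equally well if reported consistently.
responses["confidence_score"] = responses[items].mean(axis=1)
print(responses["confidence_score"].tolist())  # → [4.0, 4.25, 3.0]
```

Averaging (rather than summing) makes the composite easier to interpret, since readers can compare it directly to the 1–5 response anchors.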

Key IRB‑friendly move: include a statement such as “You may skip any question you do not wish to answer” in both the consent and the survey instruction.


Step 5: Draft the Consent or Information Sheet

Even for a low‑risk anonymous survey, you are typically obtaining informed consent, often via an information sheet at the beginning of the survey.

Your IRB website likely has a template “Exempt Survey Consent” or “Information Sheet”. Use it. Adapt minimally.

Standard elements:

  • Study title
  • Investigator names and contact information
  • Purpose: 2–3 sentences, plain language
  • Procedures: explain it is an online survey, approximate time (e.g., 5–10 minutes)
  • Risks: usually minimal, maybe psychological discomfort if topics include stress, etc.
  • Benefits: often no direct benefit / potential to improve understanding of X
  • Confidentiality/anonymity: how you protect their data
  • Voluntary participation: no penalty for not participating or withdrawing
  • Compensation (if any): e.g., raffle entry
  • IRB contact information for questions about rights as a participant

For online surveys, consent is often implied by proceeding.

A standard opening:

By clicking “Next,” you indicate that you are 18 years of age or older, have read this information, and voluntarily agree to participate. You may stop at any time and skip any questions you prefer not to answer.

IRB friendliness points:

  • Be explicit about whether responses are anonymous.
  • Be realistic about risks. “There are no risks” is often not acceptable; say “minimal risk” and specify any potential discomfort in answering some questions.
  • Use non‑technical language (an 8th–10th grade reading level is appropriate for most adult surveys).

If you use incentives:

  • Clarify odds or structure (e.g., “Participants may choose to enter a drawing for one of four $25 gift cards. The odds of winning depend on the number of participants.”)
  • Separate the raffle email collection from the survey responses via a distinct link, if claiming anonymity.

Step 6: Build and Pilot the Survey Using the Right Platform

As a premed, your likely tools are:

  • Qualtrics (common at universities; strong IRB comfort, good branching logic and security features)
  • REDCap (very IRB‑friendly, more common in academic medical centers)
  • SurveyMonkey or Google Forms (often permitted for non‑PHI data in some institutions, but check institutional policies)

Prefer institution‑licensed tools (Qualtrics/REDCap) if at all possible. IRBs and IT security offices are already familiar with them.

Implementation details that matter:

  • Turn on anonymous response collection if the platform supports it.
  • Disable IP tracking and geographic location where possible.
  • Avoid free‑text fields asking “Name” or “Email” unless absolutely required and justified.

Pilot testing

You must pilot.

  • Recruit 3–10 peers similar to your target population.
  • Ask them to:
    • Time how long it takes
    • Point out confusing or ambiguous questions
    • Report any technical issues

Collect their feedback informally, but do not store or analyze pilot data as part of your main study unless your IRB approval explicitly covers it.

After the pilot:

  • Remove or reword confusing questions.
  • Merge overlapping items to keep the survey concise.
  • Check that skip logic (if any) functions correctly.

[Image: Premed student piloting a survey study on a laptop with peers]

Step 7: Write the IRB Protocol So It Actually Passes

Now you have:

  • A clear research question
  • Defined variables and study design
  • Draft survey in Qualtrics/REDCap
  • Draft consent/information sheet

Your mentor will help you submit the IRB application. You will often need to fill out:

  • A protocol narrative (1–5 pages)
  • Application form fields in the IRB portal
  • Uploads: survey instrument, consent text, recruitment script/email, any flyers

Key elements of the protocol narrative and how to frame them:

1. Background and Rationale (short and focused)

  • 1–3 paragraphs summarizing:
    • What is known from prior literature
    • What gap exists
    • Why this matters at your institution or population

Use 3–6 citations at most, not a full systematic review.

2. Objectives

State:

  • Primary objective: typically a descriptive or comparative aim
  • Secondary objectives: 0–2 additional questions

Example:

  • Primary: “To assess the relationship between participation in a structured premed mentoring program and self‑reported confidence in applying to medical school among undergraduate premed students at University X.”
  • Secondary: “To describe the overall levels of perceived preparedness for medical school application in this population.”

3. Study Design and Procedures

Describe:

  • Cross‑sectional, online survey
  • Estimated survey length
  • Where link will be hosted (emailed via listserv, posted in learning management system, etc.)
  • That participants complete it once and that there are no follow‑up interactions

Outline exactly what a participant experiences:

  1. Receives recruitment email with survey link
  2. Clicks link → sees consent/information sheet
  3. If they choose to proceed, spends ~5–10 minutes answering questions
  4. Optionally, may click a separate link to enter email for raffle

This kind of procedural clarity reassures the IRB that you have thought through the participant experience.

4. Study Population, Inclusion/Exclusion

Example:

  • Inclusion:

    • Age 18 or older
    • Currently enrolled undergraduate student at University X
    • Self‑identified as planning to apply or considering applying to medical school
  • Exclusion:

    • Under 18
    • Not currently enrolled at University X
    • Not considering a medical school application

Justify that this is a non‑vulnerable adult population and that the survey topics are appropriate.

5. Risks and Risk Management

For minimal‑risk surveys, typical risks are:

  • Psychological discomfort or mild distress in answering certain questions
  • Breach of confidentiality

Your protocol should address:

  • Participants can skip any questions and stop at any time.
  • No directly identifying information is collected in the survey (if true).
  • Data stored on secure, password‑protected institutional servers.
  • Access limited to study team.
  • Any open‑ended responses will be de‑identified before analysis.

If your survey asks about sensitive topics (mental health symptoms, substance use), plan safety language:

  • Provide information about campus counseling resources or hotlines at the end of the survey.
  • Explicitly state you are not providing clinical care or monitoring individual responses.

6. Benefits

Most simple surveys have:

  • No direct benefit to participants
  • Potential benefit to scientific understanding or to improving programs/services

Be honest: “There are no direct benefits to you for participating. The information collected may help improve mentoring programs for premedical students at University X.”

7. Confidentiality

Clarify:

  • Whether survey is anonymous or confidential
  • How data are stored, for how long
  • Who can access raw data
  • Plans for aggregate reporting only (no identifiable quotes, or de‑identified quotes only)

8. Recruitment

Attach:

  • Email script for listservs
  • Social media post text, if relevant
  • Classroom announcement text if faculty are helping you recruit

Avoid any pressure or coercion.

If a faculty member is recruiting their own students, emphasize that:

  • Participation (or non‑participation) will not affect grades or standing.
  • The instructor will not know who participated until after final grades, or not at all, depending on your design.

Step 8: Data Management and Simple Analysis (That You Can Actually Do)

The IRB will not ask you for full statistical plans in as much detail as a grant would, but you should still have a coherent plan.

Data management

  • Export data from Qualtrics/REDCap as CSV or SPSS/Stata file
  • Immediately remove any accidental identifiers
  • Assign a file‑naming convention (e.g., “PremedMentorSurvey_clean_YYYYMMDD.csv”)
  • Store in secure institutional cloud or encrypted device, per your mentor’s guidance

Document:

  • How you handle missing data (e.g., omit cases with >50% missing, simple descriptive handling)
  • That you will only analyze aggregated data
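The cleaning steps above fit in a few lines of pandas. This is a sketch under assumptions: the column names are hypothetical, and the >50%‑missing rule mirrors the example handling mentioned above.

```python
# Sketch: basic cleaning of an exported survey file.
# Column names are hypothetical; "ip_address" stands in for any
# identifier the platform may have exported by default.
import pandas as pd
import numpy as np

df = pd.DataFrame({
    "ip_address": ["1.2.3.4", "5.6.7.8", "9.9.9.9"],
    "year":       ["Sophomore", "Junior", np.nan],
    "mentored":   ["Yes", np.nan, np.nan],
    "confidence": [4.0, np.nan, np.nan],
})

# 1. Immediately remove accidental identifiers.
df = df.drop(columns=["ip_address"])

# 2. Drop cases missing more than 50% of items
#    (rows 2 and 3 above are mostly empty and get dropped).
df = df[df.isna().mean(axis=1) <= 0.5]

print(len(df))  # → 1 row retained
```

Doing the identifier drop first, before any analysis file is saved, keeps the de‑identified dataset as the only working copy.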

Basic analysis you can realistically manage as a premed

With faculty or statistician support if possible, plan:

  1. Descriptive statistics:

    • Frequencies and percentages for categorical variables (gender, class year, mentoring program participation)
    • Means/medians and standard deviations for continuous scores (confidence score)
  2. Group comparisons:

    • Compare confidence scores between mentored and non‑mentored students using t‑tests (for two groups) or ANOVA (for multiple categories) if data are reasonably continuous and normal.
    • For categorical outcomes, chi‑square tests.
  3. Exploratory regression (optional, with help):

    • Linear regression with confidence score as outcome, mentoring participation as main predictor, controlling for year in school, GPA range, etc.
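To show how little code the primary group comparison actually requires, here is a minimal sketch of an independent‑samples t‑test using scipy. The confidence scores are made up for illustration.

```python
# Sketch: independent-samples t-test comparing mean confidence scores
# between mentoring-program participants and non-participants.
# All values below are invented example data.
from scipy import stats

mentored     = [4.2, 3.8, 4.5, 4.0, 3.9, 4.4]
non_mentored = [3.1, 3.5, 2.9, 3.3, 3.6, 3.0]

t_stat, p_value = stats.ttest_ind(mentored, non_mentored)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With real data you would first check the t‑test's assumptions (roughly normal, similar variances); `stats.mannwhitneyu` is the usual nonparametric fallback, and this is exactly the kind of decision to run past your mentor or a statistician.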

You do not need extremely advanced statistics for a straightforward cross‑sectional survey. However, your mentor’s input can prevent misinterpretation.

Specify in your protocol:

  • Primary outcome: mean confidence score
  • Primary analysis: difference in mean confidence between those who did vs. did not participate in mentoring
  • Secondary analyses as exploratory

Do not promise complex structural equation models or mediation analyses unless you already have that expertise and support. Keep the analysis plan bounded and realistic.


Step 9: Presenting and Publishing Without Violating Anything

Once data are collected and analyzed, you will want to do something with them.

Usual routes for premeds:

  • Campus research day or undergraduate research symposium
  • Regional or national conferences:
    • AAMC (for educational research, often at the medical school level with a faculty sponsor)
    • AMSA, SNMA, APAMSA, or specialty‑specific student meetings
    • Public health or education conferences
  • Undergraduate research journals or medical education journals (with mentor guidance)

IRB‑related considerations:

  • Stay within the scope of your IRB protocol. If you originally said you were only analyzing certain variables, you can generally still explore others in the same dataset as long as they are covered by your consent document and pose no new risk—but if you dramatically change your aims, check with the IRB.
  • De‑identify all data in presentations:
    • No small‑cell reporting that could identify individuals (e.g., “Only 2 Native Hawaiian students, both in the mentoring group” with identifiable context).
    • Use aggregated categories if necessary.
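Small‑cell checks can be automated before any table leaves your hands. A minimal sketch, assuming a hypothetical suppression threshold of fewer than 5 respondents per cell (thresholds vary by institution, so confirm yours):

```python
# Sketch: suppressing small cells in a counts table before reporting.
# Group labels and the n < 5 threshold are assumptions for illustration.
import pandas as pd

counts = pd.DataFrame({
    "group": ["Mentored", "Not mentored"],
    "n":     [42, 3],
})

# Replace any cell below the threshold with "<5" so it cannot
# be combined with context to identify individuals.
counts["reported_n"] = counts["n"].where(counts["n"] >= 5, "<5")
print(counts["reported_n"].tolist())
```

Running a check like this over every table in your poster or abstract is faster and more reliable than eyeballing counts at the last minute.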

Work on a one‑page “study at a glance” document with your mentor:

  • Background & question
  • Methods in 5 bullet points
  • Key results in 3–5 bullet points
  • One figure or table

You will reuse this structure for abstracts, posters, and interview discussions.


Common Pitfalls Premeds Hit—and How to Avoid Them

Let me be blunt about what derails student survey projects most often:

  1. No faculty PI
    IRBs will not approve a lone undergraduate as the responsible investigator. Solution: secure a mentor early and have them listed as PI.

  2. Overcomplicated scope
    A 60‑question survey with eight aims is too much. Narrow to 1–2 main outcomes, 10–25 questions.

  3. Sensitive content without safeguards
    Asking detailed questions about self‑harm, trauma, or illegal behavior without a clinical backup plan triggers intense IRB scrutiny. For a first survey project, choose less sensitive topics or work with a mentor experienced in those domains.

  4. Lack of clear recruitment strategy
    “We will send the survey link to everyone we can find” is not a strategy. Define exactly:

    • Which listservs
    • Which classes
    • Which organizations
    • For how long you will keep the survey open
  5. Ignoring institutional IT/security rules
    Some universities prohibit use of Google Forms for research involving human subjects. Confirm with your mentor / IRB which platforms are allowed.

  6. No pilot testing
    Unpiloted surveys often have broken skip logic, confusing questions, and higher abandonment rates. A 30‑minute pilot with 5 people solves most of this.


Putting It All Together as a Premed

If you follow a clear sequence, you dramatically reduce friction.

A workable timeline for a premed over 1–2 semesters:

  1. Weeks 1–2:

    • Narrow your question
    • Identify and meet with a faculty mentor
  2. Weeks 3–4:

    • Review 5–10 key papers
    • Define variables and survey structure
  3. Weeks 5–6:

    • Draft survey in Qualtrics/REDCap
    • Draft consent text using IRB templates
  4. Weeks 7–8:

    • Pilot test with 3–10 peers
    • Revise survey
  5. Weeks 9–12:

    • Write IRB protocol with mentor
    • Submit to IRB; respond to minor revisions
  6. Weeks 13–18 (or next term, depending on IRB timing):

    • Launch survey
    • Monitor response rates, send 1–2 reminders
  7. Following term:

    • Clean data
    • Run basic analyses with mentor or statistician
    • Prepare abstract/poster

The key is not speed. The key is a clean, coherent, minimal‑risk design that your IRB can understand in one reading and that you can actually execute.


Key Takeaways

  1. Treat your survey as a genuine human‑subjects study: anchor yourself to a mentor, an IRB, and institution‑approved tools, and design a specific, minimal‑risk question that fits a cross‑sectional survey.
  2. Build IRB‑friendly instruments: short, clear, mostly multiple‑choice surveys with validated or well‑structured items, robust consent language, and a realistic data/analysis plan.
  3. Execute systematically: pilot, submit IRB, recruit carefully, protect confidentiality, and then actually disseminate your findings through posters, presentations, or manuscripts—turning a “simple survey” into meaningful, credible research experience.
