VR and AR in the OR: Safety Missteps That Can Put You at Risk

January 8, 2026
15-minute read

[Image: Surgeon using AR overlay in an operating room]

The rush to bring VR and AR into the operating room is moving faster than our safety habits. That disconnect will hurt patients and careers if you are careless.

You are being sold a future of “smart” surgery: headsets that overlay anatomy, VR platforms that “guarantee” competency after a few modules, flashy dashboards that promise zero complications. Underneath the marketing, there are landmines. Ethical ones. Cognitive ones. Very practical, lawsuit-generating ones.

Let me walk through where smart people are already making dumb mistakes with VR/AR in the OR—and how you avoid becoming the cautionary story others whisper about in M&M.


1. Blind Trust in the Tech: The Most Dangerous New Habit

Clinician attitudes toward VR/AR reliability (%):

  • Highly confident: 35
  • Somewhat confident: 40
  • Neutral: 15
  • Skeptical: 10

The most lethal error is subtle: shifting your trust from your brain and hands to whatever is projected in your field of view.

You know the line: “The map is not the territory.” AR is a map. VR simulators are maps. They are useful, but they lie. Sometimes a little. Sometimes a lot.

Common manifestations of blind trust:

  • Accepting AR overlay anatomy as “exact” instead of approximate
  • Treating VR training scores as proof of real-world competence
  • Assuming registration and tracking are still accurate 2 hours into a case
  • Believing vendor claims about error rates without reading the fine print

I have watched surgeons in early AR trials continue dissecting along an overlay path even as tissue clearly did not match: shifted vessels, aberrant anatomy, swelling. They were following the hologram, not the patient.

Do not make that mistake.

Practical rules to protect yourself:

  • Treat every AR overlay as a hypothesis, never as ground truth. You verify it with anatomy, imaging, and tactile feedback.
  • Build habits where you intentionally look away from the display and re-orient using old‑fashioned landmarks. Force your brain to stay in charge.
  • Before relying on AR guidance, say out loud to the team what its limitations are: registration accuracy, lag, reliance on pre-op imaging that may no longer reflect reality.

If the tech and the tissue disagree, the tissue wins. Every time. If you need the AR to be right to justify what you are doing, you are already unsafe.
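The registration-drift point above is checkable in principle, not just a matter of vigilance. Here is a minimal sketch of a periodic sanity check: re-sample fiducial positions mid-case and compare them against the pre-op plan. Everything here is illustrative, including the function name, the coordinates, and the threshold; no real navigation system is being described.

```python
from math import dist  # Euclidean distance, Python 3.8+

def registration_rms_error(planned, observed):
    """RMS distance between planned fiducial positions and where the
    tracker currently sees them (same units as the inputs, e.g. mm)."""
    assert len(planned) == len(observed) and planned
    return (sum(dist(p, o) ** 2 for p, o in zip(planned, observed))
            / len(planned)) ** 0.5

# Hypothetical fiducial coordinates: the pre-op plan vs. a mid-case re-sample.
planned  = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
observed = [(0.5, 0.2, 0.1), (10.4, -0.3, 0.2), (1.8, 12.1, 0.6)]

REVERIFY_MM = 1.5  # illustrative threshold, not a clinical standard
if registration_rms_error(planned, observed) > REVERIFY_MM:
    print("Overlay drift exceeds threshold: re-verify before trusting it")
```

The design point is the cadence, not the math: drift accumulates silently, so the check has to be scheduled, not left to whenever the overlay "looks off."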


[Image: Surgeon explaining an AR-assisted procedure to a patient]

2. Consent Without Substance: Hiding the Tech in the Fine Print

Another major misstep: using VR/AR intraoperatively without specific, meaningful consent.

I have seen consent forms that bury AR use in a vague line like “modern surgical visualization techniques may be used.” That is not informed consent. That is legal camouflage.

Ethical and legal red flags:

  • Not telling patients their procedure will rely on experimental AR guidance or a new VR‑based planning platform
  • Skipping discussion of what happens if the tech fails mid‑case
  • Not addressing data use if their anatomy or procedure is recorded, processed, or used to “improve algorithms”
  • Using AR/VR systems that are still under study, but not enrolling the patient in a trial or disclosing that status

You must spell out:

  • That you will use VR/AR tools, what they do in plain language, and what they do not do
  • Whether this tech is standard of care at your institution or novel/adjunctive
  • That the surgery will proceed safely if the system fails or is turned off
  • How their images/data may be stored, transmitted, or anonymized

A simple rule: if you would hesitate to explain the tech’s role honestly because you fear the patient will say “no,” then you should hesitate to use it at all.

Ethically, the patient gets to be wary of new tools. Legally, a bad outcome tied to experimental tech that was not clearly disclosed is fertile ground for litigation.

Do not hide behind “This is just like a better microscope.” It is not, especially when data is leaving the hospital or when the visualization is actively guiding your decisions.


3. Data, Privacy, and the “Oops, It Was in the Cloud” Problem

Many VR and AR platforms live on someone else’s server. That means your patient’s anatomy, images, and sometimes intraoperative video are being uploaded, processed, and stored beyond your walls.

Common, career-threatening mistakes:

  • Uploading identifiable DICOM data into VR planning software without confirming HIPAA compliance and data handling policies
  • Allowing live OR video streams to be piped through vendor servers for “support” without hard limits on recording and retention
  • Using personal devices (tablets, headsets linked to your private account) for intraoperative AR visualization
  • Failing to de-identify screen captures or AR recordings that later appear in teaching sessions or, worse, on social media

If you could see the vendor-side dashboards some companies run—where they can watch live usage, see recording logs, and sometimes pull back “samples” for QA—you would think twice before casually firing up that “free trial” AR app in a real case.

Basic protections you must insist on:

  • Written confirmation that the system is compliant with your country’s privacy laws (HIPAA, GDPR, etc.) and that data ownership remains with the institution
  • Clear answers to: Where is this data stored? For how long? Who can access it? Can it be used for model training? Can it be sold or shared?
  • Institution-approved hardware and accounts only; no mixing of personal and clinical accounts
  • A policy about VR screen captures, AR recordings, and their use in teaching, research, and marketing
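The de-identification point is worth making concrete. Below is a minimal sketch of stripping obvious identifier fields from export metadata before anything leaves the institution. The field names are illustrative and deliberately incomplete (HIPAA's Safe Harbor list is far longer, and your privacy office owns the real one), so treat this as the shape of the check, not a compliance tool.

```python
# Illustrative identifier fields only; a real list (e.g. HIPAA Safe Harbor,
# DICOM de-identification profiles) is much longer.
IDENTIFIER_FIELDS = {"PatientName", "PatientID", "PatientBirthDate",
                     "InstitutionAddress", "OperatorsName"}

def deidentify(metadata):
    """Return a copy of export metadata with identifier fields removed."""
    return {k: v for k, v in metadata.items() if k not in IDENTIFIER_FIELDS}

case_meta = {
    "PatientName": "DOE^JANE",
    "PatientID": "12345",
    "Modality": "CT",
    "StudyDescription": "Pre-op planning",
}

clean = deidentify(case_meta)
# clean retains only Modality and StudyDescription
```

Note the direction of the filter: a deny-list like this misses anything it does not name, which is exactly why "the vendor handles it" is not good enough.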

If your name is on the op note, your name is tied to what happened to that data. “The vendor handled it” is not a defense that will impress an ethics committee.


4. Overestimating VR Training: “I’m Good, The Simulator Said So”

VR simulator scores vs. real OR performance:

  • Trainee 1: VR 70, OR 60
  • Trainee 2: VR 85, OR 72
  • Trainee 3: VR 95, OR 78
  • Trainee 4: VR 88, OR 65
  • Trainee 5: VR 92, OR 69
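Taking the illustrative numbers above at face value, every trainee's supervised OR rating trails their simulator score. A few lines make the gap explicit; the pairing of each trainee's two numbers and the shared 0-100 scale are assumptions, not data from a real study.

```python
# Illustrative pairs from the chart above: (VR simulator score, OR rating).
pairs = {
    "Trainee 1": (70, 60),
    "Trainee 2": (85, 72),
    "Trainee 3": (95, 78),
    "Trainee 4": (88, 65),
    "Trainee 5": (92, 69),
}

gaps = {name: vr - real for name, (vr, real) in pairs.items()}
mean_gap = sum(gaps.values()) / len(gaps)

# Every trainee rates lower in the OR than in the simulator.
assert all(g > 0 for g in gaps.values())
print(f"Mean VR-to-OR gap: {mean_gap:.1f} points")  # prints 17.2 here
```

The point is not the exact number; it is that the simulator score is an upper bound on a good day, not an estimate of OR performance.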

VR simulators are useful. They are also dangerously flattering.

I keep meeting residents who brag, “I scored in the 95th percentile on the lap chole VR module.” Good. But that does not mean they can handle adhesions, obese abdomens, unexpected bleeding, or a glitching insufflator at 2 a.m.

Here is where people go wrong:

  • Treating VR metrics (time, error count, path length) as the primary measure of readiness, rather than one piece of a much bigger picture
  • Performing in the OR at the edge of their comfort zone because the simulator said they were “advanced”
  • Skipping slower, supervised real-world practice because their VR logbook is thick
  • Failing to appreciate what simulators usually lack: haptics, stress, incomplete data, team communication, real consequences

Ethically, overestimating your skill based on VR performance slides toward misrepresentation of competence. If you represent yourself to your attending—or worse, to a patient—as “proficient” because of a game-like platform, you are misleading them.

Use VR to:

  • Learn steps and sequences
  • Practice rare scenarios in a safe environment
  • Build muscle memory for basic maneuvers

Do not use VR to:

  • Declare yourself “ready to operate alone”
  • Downplay the need for graded responsibility, supervision, and traditional simulation with actual instruments and tissue
  • Justify taking on more than your real-world exposure supports

In short: VR can accelerate learning. It must not replace honest self-assessment and supervised practice on real cases.


5. Cognitive Overload: Turning the OR into a Heads-Up Circus

[Image: Overloaded AR display during surgery]

More data is not always better. In fact, it is often worse.

A recurring mistake in early AR deployments: cluttering the surgeon’s view with so many metrics, annotations, and holograms that their situational awareness drops, not rises. Cognitive overload is real.

Classic overload problems:

  • AR overlays showing tool trajectory, anatomy, vitals, and checklists simultaneously in the center of the field
  • Alerts and pop-ups mid‑dissection that steal attention at exactly the wrong moment
  • Poorly positioned holograms blocking direct visualization of tissue or instruments
  • Laggy displays that subtly desynchronize what your hands feel and what your eyes see

Cognitively, you only have so much bandwidth. If half of it is spent managing the interface, you have less left for surgical judgment.

Designers love demo screens that show everything. Surgeons need almost the opposite: minimal, high-yield, context-aware information.

Protective habits:

  • Ruthlessly simplify: turn off nonessential overlays by default. Use “on-demand” views rather than “always on everything.”
  • Ask yourself mid‑case: “If I turned this off right now, would my performance improve?” If the answer is yes, you are using the system wrong.
  • Train with the system in realistic, longer sessions. Many side effects (eye strain, neck fatigue, subtle distraction) only appear after 1–2 hours, not in a 10‑minute demo.

Do not be the operator who kept AR on simply because it was “cool,” even as they knew it was getting in the way. Tech vanity has no place between knife and patient.


6. Poor Integration with the Team: The Lone Cyborg Problem

Team communication risks with AR use:

  • Without sharing: surgeon using AR headset → visual info only the surgeon sees → team out of sync → miscommunication → delays and errors → patient risk
  • With sharing: surgeon verbalizes AR info → shared understanding → safer decisions

There is a weird side effect of AR: it can make the surgeon the only person in the room with access to crucial information.

You, in the headset, see a vessel map, target trajectory, or risk zone. Your scrub, anesthesiologist, and assistant see… none of it. If you do not explicitly share what you are seeing, the OR stops being a shared mental model and becomes “whatever is in the surgeon’s head.”

Common missteps:

  • Making positional decisions based on AR overlays without saying what you are doing or why
  • Moving faster, or in new ways, because the guidance “makes it clear,” while your assistant is still thinking in the old workflow
  • Failing to brief the entire team before the case on what the AR/VR system will show and how it may change the flow
  • Ignoring objections grounded in direct observation because “the system says this is correct”

You know what that looks like to the team? Arrogance dressed up as innovation.

You fix this by:

  • Doing a 60‑second team brief: “I will be using an AR overlay that shows X. It may suggest Y. If what I do based on that ever conflicts with what you are seeing or feeling, speak up immediately.”
  • Verbalizing key AR insights: “The overlay shows the pedicle more medial than expected; I am going to dissect cautiously in this zone.”
  • Encouraging and rewarding challenges: if the assistant says, “The tissue planes do not match what your overlay suggests,” you stop and reassess.

Ethically, you are obligated to maintain a culture where anyone can question a decision, including one driven by technology. “Because the headset told me” is never an acceptable justification.


7. Using Beta Tech on Real Patients: The “Demo in the OR” Disaster

Risk levels of introducing new VR/AR systems:

  • Vendor demo in a real case, no trial: Very High
  • Limited pilot under IRB oversight: Moderate
  • Use of validated, widely adopted AR: Lower
  • VR training only, no patient contact: Lowest

A vendor rolls in a shiny AR cart. They want to “support” a live case. The rep says, “We have used this at a few centers in Europe; it works great.” The department leadership wants to be seen as cutting-edge.

You are about to turn your patient into a demo opportunity.

Immediate red flags:

  • No formal evaluation or procurement process; just a “try it during this case” arrangement
  • No published validation for your specific procedure, anatomy, or setting
  • Unclear whether the system is FDA/CE approved for intraoperative use, or only for planning
  • Rep presence in the OR who knows more about system behavior than anyone on your clinical team

There is a simple ethical principle: patients are not test beds for unvetted gadgets. If you are using tech still in a beta phase, that is research, not routine care. Research requires protocols, IRB oversight, and explicit consent.

If your institution tries to bypass this with hand‑wavey “innovation days,” protect yourself and your patients:

  • Ask directly: “Is this IRB‑approved? Is this system cleared for this indication? Is there a written protocol?”
  • Document in the op note exactly how the system was used and its limitations.
  • Refuse to have any critical step depend solely on unvalidated guidance. You should be able to complete the case safely if the tech vanishes.

Promising technology deserves real evaluation, not guerrilla trials on whoever happens to be on the schedule that day.


8. Ignoring Ergonomics and Human Factors: Subtle, Cumulative Harm

VR and AR bring physical side effects nobody talks about in the sales brochures: neck strain from headsets, eye fatigue from vergence‑accommodation mismatch, dizziness, subtle balance issues.

Alone, these feel minor. Across a long case, or across many cases in a week, they erode your performance.

Common human-factor mistakes:

  • Wearing heavy or poorly balanced headsets for long cases without trialing them in shorter procedures first
  • Accepting poor resolution or lag because “the concept is impressive,” forcing constant visual refocusing
  • Not having a “bail-out” plan if you start to feel motion sick, disoriented, or visually fatigued mid‑case
  • Ignoring staff feedback—scrub or assistant complaining of headache from watching a shared VR planning screen all day

Your brain and body are core instruments. Anything that degrades them, even slowly, is a safety issue.

Be disciplined about:

  • Limiting headset use duration initially; build up only if you tolerate it without subtle errors or fatigue
  • Having a clear, rehearsed path to revert to conventional visualization instantly if you feel off
  • Reporting and documenting side effects. If multiple surgeons experience the same problem, the system needs redesign, not more “adjustment” from users.

Do not tough it out because you want to look “tech-savvy.” You can ignore your fatigue. Your patient cannot ignore your micro-errors.


FAQ

1. Is it unethical to use VR/AR in the OR if it is not yet standard of care?
Not automatically. It becomes unethical when you misrepresent its maturity, fail to obtain meaningful consent, or allow it to drive decisions beyond what evidence supports. Using VR/AR as an adjunct with clear disclosure, robust backup plans, and honest communication with the patient and team is defensible. Using it as the primary determinant of what you cut, clamp, or remove when evidence is thin is not.

2. How do I protect myself legally when my hospital is pushing aggressive tech adoption?
Document. In the consent note, specify that VR/AR tools are adjunctive and that the case can proceed safely without them. In the operative note, describe how the system was used and acknowledge its limitations. Insist on written confirmation of regulatory status and data policies. When pressured to use unstable or unvalidated features, put your concerns in writing to leadership. If the tech fails or misleads, your prior documentation that you used it cautiously and transparently will matter.

3. What is the safest way for trainees to incorporate VR into their development?
Use VR heavily for practice and minimally for ego. Build volume on simulators, but cross‑check your perceived skills with attendings who observe you in real cases. Treat high VR scores as a sign that you are ready for more supervised responsibility, not independence. Ask explicitly: “Given my VR training and what you see in the OR, what steps can I safely perform, and which should I not attempt yet?” If your real-world performance does not match your VR record, trust the OR, not the headset.


Two final points, because this is what it all boils down to:

  1. VR and AR are tools. The minute you start treating them as authorities, you have already made a safety error.
  2. Your ethical duty is to your patient, not to innovation, not to vendors, not to your own desire to look advanced. If the tech ever conflicts with that duty, you shut it off.