Virtual Reality for Health Education Advanced Learning (VR-HEAL) (2024-2025)

Background

Interventional radiology (IR) is an innovative field that uses image-guided techniques for treatments, often reducing the need for traditional surgery. Despite its growing relevance, exposure to IR procedures comes late in medical training, typically not until the fifth or sixth year of residency. Skills like tunneled and nontunneled catheter placements are fundamental for residents, yet training opportunities are scarce and traditionally rely on models that don’t capture the procedure’s complexity.

Existing IR training lacks interactive elements that can emulate the tactile feedback of live operations, an aspect critical for developing proficiency. Advances in technology offer a solution: simulation devices that integrate expertise from electrical engineering, biomedical engineering and visual studies. These devices can potentially provide a three-dimensional, immersive training experience, augmented by real-time biometric feedback. Such innovative tools are essential for equipping medical trainees with necessary IR skills, ensuring they are prepared to deliver high-quality patient care from the outset of their careers.

Project Description

This project team will develop cutting-edge virtual reality (VR) hardware and haptic feedback mechanisms that can be integrated into intuitive, practical medical training tools for interventional radiology.

The project will include five phases:

  1. Development: Team members will design and construct VR simulation hardware focusing on haptic feedback to replicate the tactile sensations of catheter placement. Students working on technical aspects of the project will work with those focused on the visual studies element to ensure the VR environment is visually coherent and pedagogically sound, utilizing principles from visual learning theory.
  2. Testing and iteration: Team members will conduct initial testing with a small group of IR experts to refine the simulation's anatomical accuracy and procedural fidelity. The team will then iterate on the hardware, using this feedback to fine-tune the haptic mechanisms.
  3. Implementation and data collection: Team members will deploy the VR training modules in a controlled educational setting by enrolling a study population of undergraduates, medical students and residents. The team will use pre- and post-intervention surveys to assess changes in interest and appreciation for IR (among undergraduates) and understanding and comfort with IR techniques (among medical residents).
  4. Analysis: The team will use qualitative and quantitative methods to assess the impact of VR training on both populations, including machine learning algorithms applied to biometric data collected during simulations to identify patterns that correlate with learning outcomes.
  5. Evaluation and feedback: Team members will gather comprehensive feedback from participants regarding the educational effectiveness and user experience of the VR platform. This data will support continuous refinement of the VR modules based on feedback and data analysis.
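The analysis described in phases 3 and 4 might begin with something like the following minimal sketch. All data here is synthetic and the variable names (e.g., `mean_heart_rate` as the biometric feature) are hypothetical illustrations, not the project's actual measures or methods:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for study data (hypothetical): pre/post survey
# scores (0-100 scale) and one biometric feature per participant,
# e.g., mean heart rate recorded during the simulation.
n = 40
pre_scores = rng.normal(55, 10, n)
mean_heart_rate = rng.normal(80, 8, n)
# For illustration only, assume calmer participants improve slightly more.
post_scores = pre_scores + 15 - 0.2 * (mean_heart_rate - 80) + rng.normal(0, 5, n)

# Pre/post comparison: per-participant learning gain and its mean.
gain = post_scores - pre_scores
print(f"mean pre-to-post gain: {gain.mean():.1f} points")

# A first step toward "patterns correlating with learning outcomes":
# Pearson correlation between the biometric feature and the gain,
# before any heavier machine learning modeling.
r = np.corrcoef(mean_heart_rate, gain)[0, 1]
print(f"correlation(heart rate, gain): {r:.2f}")
```

In a real study, this simple correlation would be replaced by the team's chosen models and paired with appropriate statistical tests on the survey data.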

Anticipated Outputs

VR training modules; haptic feedback device prototype; research publication on VR training efficacy; data set for ongoing machine learning analysis; business plan for scaling VR system to other institutions; workshop curriculum for new procedure development; proposals for market analysis

Student Opportunities

Ideally, this project team will include 5 graduate students and 8 undergraduates interested in electrical and computer engineering, biomedical engineering, computer science, visual studies and 3D modeling, and medicine.

Team members will break into graduate student-mentored subteams to tackle specific tasks, including device development (engineering students) and the 3D learning environment (computer science and visual studies students). The entire team will meet weekly, with subteams likely meeting more often to ensure progress on each phase.

All team members will have the opportunity to design medical devices and learn about prototyping and user-centered design; create AR/VR clinical procedures and simulations; gain skills in 3D modeling and programming; analyze biometric data to refine educational tools; apply statistical and machine learning techniques; test prototypes and acquire hands-on experience in clinical application and quality control; and research and draft scholarly articles.

Graduate students will also gain skills in leadership, advanced problem solving with healthcare applications, and interdisciplinary knowledge integration.

Timing

Fall 2024 – Summer 2025

  • Fall 2024: Complete literature review; establish project management structure and internal timeline design; complete initial prototype of device and educational content
  • Spring 2025: Refine device and content prototyping; begin informal testing with small user group for early feedback; develop and iterate on educational materials based on prototyping insights
  • Summer 2025 (optional): Complete in-depth testing of prototypes and educational modules; analyze user feedback to inform further refinement

Crediting

Academic credit available for fall and spring semesters; summer funding available


Image: Interventional Radiology Residency, Duke Radiology, Duke University School of Medicine


Team Leaders

  • Jonathan Martin, School of Medicine-Radiology
  • Dominic Tanzillo, School of Medicine

Graduate Team Members

  • Sarah Eom, Electrical/Computer Engineering-PhD
  • Yihang Jiang, Biomedical Engineering-PhD

Faculty/Staff Team Members

  • Jessilyn Dunn, Pratt School of Engineering-Biomedical Engineering
  • Augustus Wendell, Arts & Sciences-Art, Art History, and Visual Studies