Introduction
For the last few decades, many technical advancements in the digital world have not realized their fullest potential in the real, physical world. The two domains remain largely siloed, with each experience contained within its respective environment. A major challenge in bridging these worlds is connecting our sense of touch, feel, and space with our visual field, arguably our two most dominant senses.
Lex Fridman and Mark Zuckerberg discuss this division in the Lex Fridman Podcast #398 - Mark Zuckerberg: First Interview in the Metaverse. Mark suggests that in the next five to ten years, new integrative technologies will bridge these sensory gaps, spawning new markets and use cases and changing our daily lives (Lex Fridman/Mark Zuckerberg Interview). When constructing these systems, developers, engineers, and designers need to meticulously analyze how users interact with and experience a system, encoding sensory, spatial, and motor information.
So, change seems to be on the horizon in Big Tech, but what about healthcare?
Medicine and healthcare present a prime setting for integrating immersive and haptic technologies, because both fields hinge on how the body interprets sensory and motor data. Comprehending the intricacies of human anatomy, physiology, psychology, and biomechanics is vital for realistic virtual and physical interactions. Systems must be adaptive, accounting for individual cognitive differences and incorporating various sensory inputs, including pressure, touch, and proprioception, to ensure genuine interactions across realms (Gadhvi et al., 2023). This exploration delves into designing systems rooted in medical scenarios, emphasizing the role and applications of immersive and haptic technologies in healthcare.
In this post, we’ll examine the convergence of immersive and haptic technologies, both broadly and in healthcare, and, importantly, their huge potential to close the gap between our digital and physical worlds.
Immersive, Extended Reality (XR) Technology
Immersive technologies provide users with visual, auditory, and proprioceptive (i.e., the body’s sense of its own position in space) stimuli, enabling them to perceive and interact within virtual and real-world settings. Although often spoken of as a single technology, Extended Reality, or XR, is an umbrella term for a variety of emerging 3D visualization systems (Interaction Design Foundation).
Extended Reality varies based on how the user perceives and interacts with virtual objects. Virtual Reality (VR) fully immerses a user in a virtual or digital environment, while Augmented Reality (AR) overlays virtual objects onto the user’s surroundings (Nvidia Reference). Mixed Reality (MR) not only overlays digital objects into the user’s environment but also enables the user to interact with these objects in real time.
Haptics and XR
Haptic systems offer tactile feedback reminiscent of the sensations experienced during gaming controller actions or smartphone alerts (Business Insider). These systems deploy specific technologies to generate vibrations, such as the eccentric rotating mass (ERM) actuator. Beyond vibration-centric systems, haptics also encompass force feedback mechanisms like the automobile’s Lane Keeping Assist (LKA) and innovations like ultrasound fields (Ultraleap).
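To make the ERM actuator mentioned above concrete: these motors are commonly driven by pulse-width modulation (PWM), where vibration strength scales with duty cycle. The helper below is a minimal, hypothetical sketch (not tied to any particular motor driver); it assumes an ERM stalls below a minimum drive level, so nonzero intensities are rescaled above that floor.

```python
def erm_duty_cycle(intensity: float, start_duty: float = 0.3) -> float:
    """Map a normalized haptic intensity (0.0-1.0) to a PWM duty cycle.

    ERM motors stall below a minimum drive level, so nonzero intensities
    are rescaled into [start_duty, 1.0] rather than [0.0, 1.0].
    The 0.3 floor is an illustrative assumption, not a datasheet value.
    """
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be in [0, 1]")
    if intensity == 0.0:
        return 0.0  # motor fully off
    return start_duty + intensity * (1.0 - start_duty)
```

A notification "buzz" at half strength would then drive the motor at a 0.65 duty cycle rather than 0.5, keeping it above its stall threshold.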
While haptic feedback focuses on simulating touch, XR prioritizes the visualization of virtual entities and surroundings. Despite their distinct functionalities, haptics and XR amplify the overall experience of user sensation, interactivity, and immersion in the digital domain. As we delve deeper into the fusion of haptics and XR, it’s important to understand the key hardware components that ensure integration.
XR Hardware Overview
- User Interfaces: Equipment ranging from headsets to screens and mobile devices displays virtual elements and facilitates user interaction. Dedicated controllers or sensors guide user-system interactions.
- Tracking Devices: These employ sensors, accelerometers, gyroscopes, and both internal and external cameras. Their role is pivotal in recording the user’s surrounding environment, motion patterns, and even some physiological metrics, ensuring system responsiveness.
- Computational Power: Graphics Processing Units (GPUs) handle the parallel processing crucial for graphics rendering and machine learning workloads. Central Processing Units (CPUs), on the other hand, orchestrate the overall system and take on the demanding task of data processing, ensuring users experience near-real-time interactions.
- Connectivity: To ensure a seamless experience, system hardware must keep latency minimal, fostering real-time or near-real-time user interactions. A conventional approach tethers the headset to a host computer over HDMI and USB cables, avoiding the added latency of wireless links.
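To illustrate the tracking bullet above: a common way to fuse gyroscope and accelerometer readings into a stable orientation estimate is a complementary filter. The single-axis sketch below is illustrative only; the function name and the 0.98 blend weight are assumptions, not drawn from any particular headset SDK.

```python
import math

def complementary_filter(pitch_prev: float, gyro_rate: float,
                         accel_x: float, accel_z: float,
                         dt: float, alpha: float = 0.98) -> float:
    """Fuse gyro and accelerometer readings into a pitch estimate (radians).

    Integrating the gyro rate is accurate short-term but drifts over time;
    the accelerometer gives a drift-free but noisy gravity reference.
    Blending the two with weight alpha keeps the best of both.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt      # short-term: integrate rate
    accel_pitch = math.atan2(accel_x, accel_z)    # long-term: gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Run per sensor sample: with the device held level and still, accumulated gyro drift is pulled back toward zero by the accelerometer term on every update.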
XR and Haptics for Medicine
The need for medical professionals—including doctors, nurses, physical therapists, and students—to understand complex anatomy and physiology opens doors for developers and engineers. They have the opportunity to create rich immersive experiences that have tangible impacts in the real world. Surgeons and students can now visualize and practice intricate surgeries without the fear of real-world consequences.
In a clinical setting, immersive experiences powered by haptic technology are revolutionizing the way medical practitioners engage with digital or distant environments. Feedback in the form of forces and vibrations allows these professionals to use their sense of touch more dynamically. This technology lets them distinguish between different anatomical structures—like bones, tissues, and organs—through real-time tactile feedback, such as varying levels of force and pressure.
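One simple model behind this kind of tissue-differentiating feedback is spring-based (Hooke's law) force rendering: the deeper a virtual probe penetrates a surface, the harder it pushes back, with a stiffness set per material. The sketch below is a minimal illustration; the stiffness values are hypothetical placeholders, not clinical measurements.

```python
def contact_force(penetration_depth: float, stiffness: float) -> float:
    """Hooke's-law force rendering: force (N) grows linearly with how far
    the virtual probe has penetrated the surface (m)."""
    if penetration_depth <= 0:
        return 0.0  # probe not in contact with the surface
    return stiffness * penetration_depth

# Hypothetical stiffness values (N/m), for illustration only: bone feels
# rigid, soft tissue yields, so each returns a different force profile.
TISSUE_STIFFNESS = {"soft_tissue": 200.0, "muscle": 800.0, "bone": 5000.0}
```

For the same 2 mm penetration, the "bone" setting renders a force 25 times stronger than "soft_tissue", which is what lets a trainee feel the difference between structures.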
In medicine, the sense of touch is critically important—whether it’s for palpation in a physical exam or handling delicate tissue in a laparoscopic surgery. Doctors heavily depend on the sense of touch, so it’s clear haptics have a key role to play. As more technology is adopted in healthcare, there will be a race to perfectly mimic, and perhaps surpass, the feel that healthcare providers deeply depend on. So what are some use cases for this?
- Medical Training and Simulation: Surgical simulation and training are critical, not only for trainees but also for surgeons learning new procedures. Haptics could increase the fidelity of simulated tactile sensations, bringing them closer to real life.
- Robotic Surgery: Robotic surgical systems are swiftly evolving, and haptic technology is crucial because the surgeon operates through the robot and would otherwise lose direct tactile contact with the patient's tissue.
- Prosthetics and Bionics: Artificial devices that replace or enhance missing or damaged body parts, integrated with haptic technology to mimic natural sensation.
- Rehabilitation: Haptics can be used in physical therapy (PT) rehab to provide tactile feedback during exercises, enhancing muscle memory and guiding patients towards correct movement patterns.
- Enhanced Radiology: Using immersive technologies and imaging technologies, it may be possible to develop haptic technology to augment radiologic diagnoses non-invasively, to assess shape, size, and texture of a tumor, for instance.
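As a concrete sketch of the rehabilitation use case above, tactile guidance can be as simple as scaling vibration intensity with how far a patient's joint angle deviates from a therapist-set target. The function and its tolerance and saturation values below are hypothetical, chosen purely for illustration.

```python
def vibration_cue(actual_angle: float, target_angle: float,
                  tolerance: float = 5.0, max_error: float = 30.0) -> float:
    """Return a normalized vibration intensity (0.0-1.0) guiding a patient
    toward a target joint angle (degrees).

    Within the tolerance band there is no cue; beyond it, the cue ramps up
    linearly and saturates once the error reaches max_error degrees.
    """
    error = abs(actual_angle - target_angle)
    if error <= tolerance:
        return 0.0  # movement is within the acceptable range
    return min((error - tolerance) / (max_error - tolerance), 1.0)
```

A patient flexing to 102.5° against a 90° target would feel a gentle 0.3-intensity cue, while a 30° overshoot would produce a full-strength vibration, nudging them back toward the correct movement pattern.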
Summary
In recent decades, the digital and physical worlds have remained largely separate, with touch and vision as key senses bridging the divide. As discussed by Lex Fridman and Mark Zuckerberg, the next decade is set to witness technological advances that seamlessly blend these two realms. The healthcare sector presents a prime arena for these advancements. Understanding the intricacies of human anatomy, psychology, and biomechanics is crucial in designing systems that enable meaningful virtual and physical interactions. Immersive technologies, also known as Extended Reality (XR), and haptic feedback are the cornerstones of this integration. XR offers visual, auditory, and spatial sensations while haptic systems provide tactile feedback. Together, they are set to revolutionize the medical field. From surgical simulations and training to advanced robotic surgeries and prosthetic enhancements, the combination of XR and haptics offers myriad opportunities. In essence, the fusion of these technologies will not only redefine medical training and procedures but also have the potential to revolutionize patient care and diagnostic processes.
Gadhvi M, Waseem M, Moore MJ. 2023. “Physiology, Sensory System.” In: StatPearls. NCBI. https://www.ncbi.nlm.nih.gov/books/NBK547656/.