A wide range of sensor technologies is needed to support augmented reality and virtual reality (AR/VR) systems. Today's AR/VR implementations focus primarily on visual and audio interfaces and rely on motion tracking and speech/voice recognition sensors. That is beginning to change with the introduction of new sensor types and new forms of haptics. This FAQ begins with an overview of some of the key position detection technologies used in current AR/VR systems, presents a proposed VR system for neurorehabilitation and quality of life, discusses how thermal sensor technologies are being developed to provide a more complete AR/VR environment, and closes by considering emerging haptic technologies for thermal and touch-based feedback.

AR involves creating an environment that integrates the existing surroundings with virtual elements. In a VR environment, the system generates the entire "reality" and needs to know only the user's relative movements and orientation. A basic VR system uses an inertial measurement unit (IMU), which can include an accelerometer, a gyroscope and a magnetometer. An AR system not only needs to know where the user is, it also needs to understand what the user sees and hears and how the environment changes. As a result, AR uses more sophisticated sensing, starting with the IMU and adding time-of-flight sensors, heat mapping, structured light sensors and more.

Figure 1: Augmented reality, which integrates an existing environment with virtual elements, requires more complex sensing than creating a fully virtual environment. (Image: Consumer Technology Association)

Both AR and VR are immersive environments, and for VR in particular, full immersion is necessary for a good user experience. The potential for motion sickness is a challenge in the development of AR/VR systems: devices must capture the user's movements quickly and accurately to avoid it, so IMUs and other motion-sensing technologies must be very stable with very low latencies. The IMUs used in AR/VR systems combine an accelerometer, gyroscope and magnetometer, making it easier to correct errors and quickly produce accurate results so the system can track head movement and position.
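As a rough sketch of how that sensor fusion works, the Python snippet below implements a one-axis complementary filter, one of the simplest ways to blend a gyroscope's fast but drifting angle estimate with an accelerometer's noisy but drift-free gravity reference. The sample rate, blend factor and sensor readings are illustrative assumptions, not values from any particular headset.

```python
import math

def fuse_pitch(pitch_prev, gyro_rate_dps, ax, ay, az, dt, alpha=0.98):
    """One-axis complementary filter: integrate the gyro for fast response,
    then blend in the accelerometer's gravity-derived pitch to cancel drift."""
    # Short-term estimate from the gyroscope (degrees)
    gyro_pitch = pitch_prev + gyro_rate_dps * dt
    # Long-term reference from gravity (degrees)
    accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
    # Weighted blend: trust the gyro over short intervals, the accel over long ones
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Example: one step of a hypothetical 1 kHz update loop
pitch = 0.0
dt = 0.001  # seconds per sample
samples = [(1.5, 0.01, 0.0, 0.99)]  # (gyro deg/s, ax, ay, az in g)
for gyro, ax, ay, az in samples:
    pitch = fuse_pitch(pitch, gyro, ax, ay, az, dt)
print(f"estimated pitch: {pitch:.4f} deg")
```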

In addition, both AR and VR systems can benefit from various forms of user feedback and interaction, including gesture and voice recognition. Gesture recognition can be based on real-time video analysis (which can be energy- and computation-intensive) or on technologies such as e-field sensing, LiDAR and advanced capacitive sensing. For a discussion of gesture recognition, see the FAQ "How can a machine recognize hand gestures?"

One of the keys to AR environments is the accurate, fast and continuous generation of computer imagery registered to the actual environment. Most AR headsets rely on one or more special types of image sensors, including time-of-flight (ToF) cameras, light detection and ranging (LiDAR) sensors based on vertical-cavity surface-emitting lasers (VCSELs), binocular depth sensing, or structured light sensors. Some use a combination of these sensors.

ToF cameras combine a modulated IR light source with a charge-coupled device (CCD) image sensor. They measure the phase delay (the time of flight) of the reflected light to calculate the distance between the illuminated object and the camera. Thousands or millions of these measurements are combined into a point cloud database that forms a three-dimensional image of the surroundings. A more recently developed technology, VCSEL-based LiDAR, can produce point clouds with higher accuracy. In addition, VCSEL technology is used in smart glasses and other wearables to produce more compact, lower-power displays and gesture recognition systems.
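The phase-to-distance conversion itself is straightforward. The Python sketch below shows the standard continuous-wave ToF relationship, distance = c·Δφ / (4π·f_mod); the 20 MHz modulation frequency is an assumed value chosen only for illustration.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_rad, mod_freq_hz):
    """Convert the measured phase delay of modulated light into distance.
    The light travels to the object and back, hence the factor of 2
    hidden in the 4*pi denominator."""
    return (C * phase_rad) / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz):
    """Beyond this range the phase wraps past 2*pi and distances alias."""
    return C / (2.0 * mod_freq_hz)

f_mod = 20e6  # assumed 20 MHz modulation
print(f"{tof_distance(math.pi / 2, f_mod):.2f} m")   # ~1.87 m
print(f"{unambiguous_range(f_mod):.2f} m max range")  # ~7.49 m
```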

Structured light sensors project a specific pattern of light (IR or visible) onto the environment. The distortion of the pattern is analyzed mathematically to triangulate the distance to different points in the environment: the camera's pixel data (a kind of primitive point cloud) is analyzed to calculate the difference between the projected pattern and the returned pattern, taking into account the camera's optics and other factors, to determine the distances to various objects and surfaces in the vicinity. While a single structured light sensor can be used, more accurate results are achieved when two structured light sensors are used and their outputs are combined.
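A minimal Python sketch of that triangulation step uses the standard disparity-to-depth relationship (depth = focal length × baseline / disparity) while ignoring lens distortion and the other corrections mentioned above; the focal length and baseline are made-up values.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth from the pixel shift between the projected
    pattern and the pattern the camera actually observes."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical sensor: 600 px focal length, 7.5 cm projector-camera baseline
print(depth_from_disparity(18.0, 600.0, 0.075))  # 2.5 m
```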

Additional sensors found in AR systems include directional microphones (which may be replaced in the future by bone-conduction directional audio transducers), various biosensors such as thermal sensors, ambient light sensors, and forward- and rear-facing video cameras. In addition, a wireless connection is required to offload all of the sensor data and stream video information to create a real-time, immersive and dynamic environment.

Neurorehabilitation

AR/VR technologies are being developed for a number of medical applications. In one case, a system for immersive neurorehabilitation exercises using virtual reality (INREX-VR) is being developed. The INREX-VR system captures the user's movements in real time, assesses the joint mobility of both the upper and lower limbs, records training sessions and captures electromyographic data. A virtual therapist demonstrates the exercises, and sessions can be configured and monitored via telemedicine connectivity.

Figure 2: Hardware components of the experimental INREX-VR system for immersive neurorehabilitation. (Image: MDPI Sensors)

The INREX-VR software is designed to run on off-the-shelf VR hardware and is suitable for use by patients suffering from neurological disorders, the therapists responsible for supervising their rehabilitation, and trainees. The system can assess the user's emotional state by monitoring heart rate and stress levels via skin conductance. Face recognition can also be implemented when the system is used with VR headsets.

Warming up future AR/VR

Devices that can add a thermal dimension to AR/VR environments are a hot topic. Various thermal sensor and thermal haptic technologies are being pursued (Figure 3). Thermal sensor technologies proposed for AR/VR systems include thermoresistive, pyroelectric, thermoelectric and thermogalvanic devices. While these sensors convert heat energy into electric currents or voltages, the power levels are too low to support energy harvesting but are sufficient for thermal monitoring. Polymer composites and conductive polymers, in particular, are expected to find application in future thermal sensors. The mechanical flexibility of polymers makes them particularly suitable for wearable thermal sensors. Some of the proposed polymer-based thermal sensors can operate in both contact and non-contact modes, further increasing their flexibility. Performance stability and delayed response due to thermal hysteresis are two challenges to address before these thermal sensors are ready for use in AR/VR systems.

Figure 3: A number of technologies for implementing thermal sensing and thermal haptics in AR/VR environments are being developed. (Image: Advanced Functional Materials)
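To make the thermoresistive case concrete, the Python sketch below converts an NTC thermistor's resistance reading to temperature using the common Beta model. The 10 kΩ at 25 °C, B = 3950 part parameters are assumptions for illustration, not properties of the polymer sensors discussed above.

```python
import math

def thermistor_temp_c(r_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Beta-model conversion for an NTC thermistor:
    1/T = 1/T0 + (1/beta) * ln(R/R0), with temperatures in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / beta
    return 1.0 / inv_t - 273.15

# Resistance has dropped below the 25 C nominal, so temperature has risen
print(f"{thermistor_temp_c(8_000.0):.2f} C")  # ~30.1 C
```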

Thermal haptics, or thermal stimulation, is another emerging AR/VR area that promises an improved level of immersion in AR/VR media. Thermal haptics are expected to rely on Peltier devices or thermoresistive heaters that control the temperature of a target area; these draw more power than polymer-based thermal sensors. On the plus side, active thermal haptics may have a faster response time than the proposed polymer-based thermal sensors. With thermal haptics, users will be able to feel the temperature of a virtual object and have a more realistic interaction with their environment.

Thermoresistive heaters may have lower hysteresis than Peltier devices, but they provide only heating, while Peltier devices can provide both heating and cooling. A key factor in thermal haptic development will be user controls that allow selection of a temperature range that provides stimulation without causing discomfort or burns. New materials will be needed for Peltier wearables that are both flexible and lightweight. Materials development is a key activity enabling both thermal sensing and thermal haptics in future AR/VR systems.
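A minimal Python sketch of such a control follows, assuming a 33 °C skin baseline and an illustrative ±8 °C comfort band (neither value comes from the source): requested temperatures are clamped to the safe band, and a simple proportional term drives the Peltier element in either direction.

```python
def clamp_setpoint(requested_c, skin_c=33.0, max_delta=8.0):
    """Limit a haptic temperature request to a band around skin temperature
    so the stimulus is perceptible but never painful.
    The 33 C baseline and +/-8 C band are illustrative, not standards."""
    low, high = skin_c - max_delta, skin_c + max_delta
    return max(low, min(high, requested_c))

def peltier_drive(setpoint_c, measured_c, gain=0.1, max_duty=1.0):
    """Proportional drive: positive duty heats, negative duty cools
    (a Peltier element reverses with current direction)."""
    duty = gain * (setpoint_c - measured_c)
    return max(-max_duty, min(max_duty, duty))

target = clamp_setpoint(55.0)                # clamped to 41.0 C
print(target, peltier_drive(target, 33.0))   # 41.0, 0.8 duty (heating)
```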

Ultrasonic haptics

Ultrasound-based haptics rely on the precise control of an array of speakers that emit precisely timed ultrasonic pulses. The timing differences are designed so that the ultrasonic waves from the different speakers arrive at the same place at the same time, creating a pressure point that can be felt by human skin (Figure 4). The focal point where the ultrasonic waves meet can be controlled in real time and changed from moment to moment based on the position of the hands or other factors.

Figure 4: The combined vibrations of the ultrasonic waves at the focal point create a pressure point that can be felt by human skin. (Image: Ultraleap)
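The timing calculation behind that focusing is simple geometry: each transducer is fired with a delay that compensates for its distance to the focal point, so all wavefronts arrive together. The Python sketch below illustrates the idea for a hypothetical four-element line array; the array geometry and speed of sound are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C

def firing_delays(transducer_xy, focal_point):
    """Per-transducer delays so every wavefront arrives at the focal
    point simultaneously: the farthest transducer fires first."""
    fx, fy, fz = focal_point
    dists = [math.dist((x, y, 0.0), (fx, fy, fz)) for x, y in transducer_xy]
    t_flight = [d / SPEED_OF_SOUND for d in dists]
    t_max = max(t_flight)
    return [t_max - t for t in t_flight]

# Hypothetical 4-element line array, focal point 20 cm above its center
array = [(-0.03, 0.0), (-0.01, 0.0), (0.01, 0.0), (0.03, 0.0)]
print([f"{d * 1e6:.2f} us" for d in firing_delays(array, (0.0, 0.0, 0.20))])
```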

The tracking camera serves the dual purpose of gesture recognition and pinpointing the position of a person's hand, allowing the focal point to be placed at a specific location on the hand. By controlling the movement of the focal point, multiple tactile effects can be created. Haptic feedback is expected to find uses in AR/VR systems beyond gaming, for example in system controls and interactions for retail and vending machines, automotive interiors, building automation and other areas.

Summary

Sensors are a key technology area enabling immersive AR and VR environments. Because creating AR environments that connect the surrounding reality with virtual elements is a more complex task, AR sensor systems are correspondingly more complex. New sensor modalities are being developed to increase the realism of AR/VR systems, including thermal sensors and thermal and ultrasonic haptics.

References

Consumer AR can save lives, economy (Consumer Technology Association)
New augmented reality thermal technology (Advanced Functional Materials)
Flexible virtual reality system for neurorehabilitation and improving quality of life (MDPI Sensors)
AR/VR sensors (Pressbooks)
Touch becomes virtual (Ultraleap)
