Cross-posted from immersion.com
Immersion was founded as a gaming company over a quarter of a century ago. At that time, with the explosion of personal computing, the internet, and VR 1.0, the potential for digital technology to revolutionize the human experience seemed limitless. Our role was to bring the sense of touch online, letting gamers feel the physical effects of thrilling fantasy worlds, from the forces of flight to the intensity of battle.
The value proposition for haptics in games has always been strong. Games seek to transport you to new realities, and haptics helps that along by giving you touch sensations that correlate to the action of your avatar. Since the second generation of game consoles made their way into living rooms across the world, haptics in games have mainly taken the form of rumble, or low-frequency vibration. When game controller manufacturers added a second channel for rumble, the progress was real, but incremental.
In the past five years or so, we have seen a leap forward in haptic capabilities in game controllers. Rumble has transformed into HD vibration, which can render nuanced textures, reaching a new level of material realism. Even force feedback, which Immersion pioneered in joysticks and gaming mice back in the ‘90s, has made a resurgence with the DualSense triggers.
Given our position in the gaming market, we are often asked: what’s next? Here’s a peek at our view of how haptics for gaming will develop in the coming years.
The Trajectory of Touch Technology for Games
As you’ve probably gathered, while the DualSense controller for the PlayStation 5 is revolutionary, it’s only the beginning. The sense of touch is expansive. So much so that four channels of haptics in the hand barely scratch the surface of what’s possible. While the vision put forth in Ready Player One of haptic suits that can render arbitrary sensations anywhere on the body is still far off, it is not quite as far off as many think. However long it takes, we are already seeing milestones along the path.
The current era: Quad haptics
DualSense is an implementation of a generalized haptic gaming experience we call quad haptics. This type of system includes four channels of touch, two for each hand, and is typically packaged in a traditional game controller form factor. While quad-channel haptic controllers have been demonstrated in one form or another for many years, 2021 will be the year that multiple peripheral makers and game platforms begin to support it.
Much like the transition from stereo to multichannel audio, multichannel haptics refers to a haptic experience with many endpoints. Because touch feedback must occur on the body, a multichannel haptic system results in a set of haptic endpoints positioned on and around the body, such as gaming mice and keyboards, gaming chairs, headsets, handheld mobile devices, and so on. With this setup, games cannot be pre-programmed to know how many peripherals are available to the player or what their haptic capabilities are. Therefore, there’s a need for layers of technology that transcode and orchestrate haptic signals for each endpoint, conveying the game designer’s intent for the meaning of the haptics in the game. This vision is technically feasible today, but it is hindered by a classic chicken-or-egg problem. Without the widespread availability and mainstream adoption of multichannel haptic peripherals, content creators have little incentive to create experiences that take advantage of this opportunity. And without content that utilizes multichannel haptics, consumers have little reason to invest in the hardware. Nevertheless, there are creative solutions to this problem, and the haptic technology industry is motivated to find them.
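To make the transcode-and-orchestrate idea concrete, here is a minimal sketch of what such a layer might look like. All names and structures here are hypothetical illustrations, not Immersion's actual stack or any shipping API: a game emits one authored effect, and a runtime layer adapts it to whatever endpoints happen to be connected.

```python
# Hypothetical sketch of a multichannel haptic orchestration layer.
# The game authors one effect; the layer discovers endpoints at runtime
# and transcodes the effect to each endpoint's capabilities.
from dataclasses import dataclass

@dataclass
class Endpoint:
    name: str
    channels: int        # number of independent actuators
    max_freq_hz: float   # highest vibration frequency the hardware renders

@dataclass
class HapticEffect:
    intensity: float     # designer intent, 0.0-1.0
    freq_hz: float       # authored carrier frequency

def transcode(effect: HapticEffect, endpoint: Endpoint) -> dict:
    """Adapt the designer's intent to one endpoint's capabilities."""
    # Clamp the carrier frequency to what this hardware can actually render.
    freq = min(effect.freq_hz, endpoint.max_freq_hz)
    # Naive mapping: duplicate the intensity across every channel.
    return {
        "endpoint": endpoint.name,
        "freq_hz": freq,
        "amplitudes": [effect.intensity] * endpoint.channels,
    }

def orchestrate(effect: HapticEffect, endpoints: list[Endpoint]) -> list[dict]:
    """The game never enumerates hardware; this layer does, at runtime."""
    return [transcode(effect, ep) for ep in endpoints]

if __name__ == "__main__":
    # The same explosion effect, rendered on a quad-haptic controller
    # and on a single-motor gaming chair.
    effect = HapticEffect(intensity=0.8, freq_hz=250.0)
    endpoints = [Endpoint("controller", 4, 1000.0),
                 Endpoint("chair", 1, 80.0)]
    for cmd in orchestrate(effect, endpoints):
        print(cmd)
```

The point of the sketch is the separation of concerns: the effect describes intent, and the per-endpoint transcoding (here, just frequency clamping and channel fan-out) is where a real system would apply far more sophisticated rendering.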
Going beyond multiple channels of vibration and force feedback, there are entirely new sensations made possible with advanced haptic systems. Thermal actuators, which can become hot or cold with precisely controlled temperature, can help simulate the material of virtual objects. We use our sense of heat flux when we touch materials to help identify what they are. For example, the heat flux when touching metal is higher than when touching wood. Even without moving our hand across the surface of a real object, simply by making contact with it, we can figure out what we are touching through our nerve endings’ sensitivity to heat and cold. Simulating this experience will be especially useful in VR, where realistic object interaction is fundamental to building up an illusion of presence.
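The physics behind this is well understood. When skin touches a material, the interface settles near a contact temperature weighted by each body's thermal effusivity, e = sqrt(k · ρ · c_p); high-effusivity materials like metal pull the interface toward their own temperature, which is why room-temperature metal feels colder than room-temperature wood. The sketch below works through that comparison with approximate, rounded material properties (the skin effusivity in particular is a rough textbook-style value).

```python
# Why metal feels colder than wood at the same temperature:
# the semi-infinite-body contact-temperature model.
import math

def effusivity(k: float, rho: float, cp: float) -> float:
    """Thermal effusivity e = sqrt(k * rho * cp), in J/(m^2 K s^0.5)."""
    return math.sqrt(k * rho * cp)

def contact_temperature(t_skin: float, e_skin: float,
                        t_mat: float, e_mat: float) -> float:
    """Interface temperature when two semi-infinite bodies touch:
    an effusivity-weighted average of the two temperatures."""
    return (e_skin * t_skin + e_mat * t_mat) / (e_skin + e_mat)

if __name__ == "__main__":
    T_SKIN, T_ROOM = 33.0, 20.0      # degrees C
    E_SKIN = 1500.0                  # rough approximation for skin

    # Approximate properties: k (W/m K), rho (kg/m^3), cp (J/kg K)
    e_aluminum = effusivity(237.0, 2700.0, 900.0)   # very high effusivity
    e_wood = effusivity(0.16, 700.0, 2400.0)        # very low effusivity

    t_metal = contact_temperature(T_SKIN, E_SKIN, T_ROOM, e_aluminum)
    t_wood = contact_temperature(T_SKIN, E_SKIN, T_ROOM, e_wood)
    print(f"contact temp on aluminum: {t_metal:.1f} C")
    print(f"contact temp on wood:     {t_wood:.1f} C")
```

With these numbers, the skin-aluminum interface lands close to room temperature while the skin-wood interface stays close to skin temperature, so metal draws far more heat from the fingertip. A thermal actuator only needs to reproduce that heat flux at the skin to sell the illusion of a given material.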
Combining thermal feedback with programmable airflow (think Dyson fans linked to software) will create compelling illusions of virtual environments. A hot, dry breeze can help transport you to a virtual desert, while a cold, blistering wind can alert you to an approaching virtual storm.
Beyond airflow and temperature, new materials will let game controllers physically change and move. A controller can be made to feel “full” or “empty” by changing its shape. Social interaction can be taken to a new level with the nuance of a handshake or the affection of a loving touch transmitted in the virtual world. Eventually, these devices will allow people to develop muscle memory, blurring the line between gaming and athletic training.
At some point in the future, the need to interact with physical game controllers and other computer peripherals may go away completely. That day will come later than many expect, just as the advent of gesture interfaces a few years ago failed to kill off traditional physical interfaces. When gesture and motion tracking became possible, many attempts were made to put them at the center of all interaction. These efforts were important, and much progress was made along the way. They ultimately failed because, no matter how well it’s done, gesture input alone cannot enable natural interaction with a device or a game. Tactile, physical feedback is needed, or the experience doesn’t feel grounded in reality.
Hypermodal haptics is a catch-all term for the solution to this problem. Instead of haptic technology being targeted at a “mode” of human touch, like sensitivity to vibration or temperature, the nerves are stimulated directly with an electrical signal that imitates a physical interaction with the world. It sounds far out, but researchers have made significant progress in stimulating nerve endings directly with electrical signals to generate sensations of touch in the brain. Because these techniques tend to be invasive, they will first find use in medical and therapeutic settings. However, gaming is a natural, eventual application of hypermodal haptics because the definition of gaming is ever expanding – it is how we entertain ourselves, how we socialize, and how we learn best as more and more of our time is spent in virtual spaces.
Gaming is swallowing the world. If you want to understand the future of the culture, look no further than gaming – which is not only about playing games but increasingly about using technology to virtualize as much of human experience as possible. The industry is on the threshold of an arms race to develop new haptic technologies that engage the body and create ever-higher levels of immersion. When gaming companies compete in terms of tech specs like resolution, spatial audio, latency, and so on, what they are really doing is striving to engage as much of the human sensory processing system as possible. Haptic technology currently lags behind graphics and audio, but not for long. As touch is central to human experience, it must be included in virtual experiences to the extent that technology makes possible.