I think in the near future we will be talking about Virtual Reality 2.0 and Virtual Reality 1.0.
VR 1.0 is everything we’ve seen so far – a system that deals primarily with simulating vision and hearing, but doesn’t address the finer details associated with other types of human physiology.
VR 2.0 will be a system that addresses what scientists call the “sensory gap” – the dizziness or vertigo that arises when a system gets 90% of the way to a true virtual experience but, because of engineering limitations, can’t close the remaining distance.
Slide: “We experience every sensation” (Kim Seung Jun)
A close-up of an Oculus Go virtual reality headset resting on a light wooden surface at the Facebook Inc. former Oculus Rift building in Dublin, California, August 23, 2018. (Photo by Smith Collection/Gado/Getty Images)
First, you need to understand that terminology is shifting across the industry: people are using the term “augmented reality” loosely to describe all kinds of new platforms and capabilities.
As Nvidia’s blog explains, extended reality (XR) is the umbrella term that encompasses virtual reality, augmented reality, mixed reality, and related technologies. It covers systems that fully occlude the user’s field of view as well as see-through ones, systems with spatial boundaries, and systems that replace the physical world with a virtual one to differing degrees.
Slide: “You can’t feel what you see” (Kim Seung Jun)
Researchers at MIT, for example, are discussing a project called “Generating Synthetic Actions and Sensations,” which suggests entirely new ways of thinking about reality.
What does that mean?
What that means is that engineers today are trying to account for the entire human response to a virtual environment: they’re looking not just at vision and hearing, but at vestibular responses, other sensory activity, muscle responses, and perhaps even smell, though that’s another story.
Another aspect of this effort is addressing the “discomfort” many people feel when there is a mismatch between their physical and virtual environments. A recent presentation by Kim Seung Jun at CSAIL+IIA showed how this could work.
Slide: “How do we deal with the sickness between what we see and what we feel?” (Kim Seung Jun)
Either way, this is getting a lot of attention. MIT runs a program focused on virtual and augmented reality, and it has laid out its goals:
“Businesses are increasingly recognizing that extended reality (XR) has the ability to reinvent the way we communicate, experience gaming and other entertainment, and transform industries such as healthcare, real estate, retail, and e-commerce. According to Forbes, XR technologies, such as virtual reality (VR) and augmented reality (AR), will be “one of the most transformative technology trends of the next five years.” Organizations of all kinds are seeking technology professionals with the knowledge base, vision, and skills to implement XR applications that will provide a competitive advantage. MIT xPRO’s Virtual and Augmented Reality program is designed to equip you with a foundational understanding and fluency in XR technology, as well as the ability to consider user needs when improving applications or developing new ones.”
A group of young people receiving technical vocational training with a teacher. (Getty)
To me, this represents a groundbreaking approach to the idea that we’re going to be living in virtual worlds. If we’re going to replace the physical world with a virtual one, those other worlds have to be real; that is, they have to look like reality. They can’t just be Virtual Reality 1.0. In many ways we’ve seen the limits of what that technology can do, and now we’re moving to the next level.