VRDC Speaker Q&A: Noah Falstein on how neuroscience can make VR better

Ahead of his talk at VRDC Fall 2017, game dev vet Noah Falstein shares some information about himself and how neuroscience can help us make better VR titles.

August 3, 2017

4 Min Read

President of The Inspiracy Noah Falstein will be at VRDC Fall 2017 to present his talk A Game Designer’s Overview of the Neuroscience of VR, which will discuss what modern neuroscience can offer VR development. Here, Falstein gives us some information about himself and how neuroscience can help us make better VR titles.

Attend VRDC Fall 2017 to learn about immersive games & entertainment, brand experiences, and innovative use cases across industries.

Tell us about yourself and your work in VR/AR

I’ve been in the game industry since 1980, working on hundreds of titles over the years.  Like many others, I was exposed to VR early on (got a demo of VPL from Jaron Lanier himself in 1985) and was disappointed at how slowly it advanced – and then was turned from skeptic to believer by the latest round of VR systems a few years ago.  Until just recently I was Chief Game Designer at Google, where I worked closely with both the Tango AR and Daydream VR teams.  I left Google in April of this year because I wanted to work on development of VR titles – specifically in the areas of VR storytelling, and in Neurogaming/games for health.

Without spoiling it too much, tell us what you’ll be talking about at VRDC

I’ll be delving into what I’ve learned from working with neuroscientists, researchers, and fellow game developers about how the workings of the brain (and eyes and inner ear) combine to create the strong impact many of us have felt from VR. I’ll be focusing not so much on the neurobiology but rather on the practical, sometimes counter-intuitive lessons we’ve learned.

What excites you most about VR/AR?

Its emotional impact. VR triggers some very powerful feelings, particularly those mediated by a deep part of our brain called the amygdala, including fear, anger, and arousal. A lot of companies have started to explore horror or action games that tap into the first two of those feelings, but it’s clear to me that VR has the power to evoke empathy at stronger levels than much of our other media, and that’s very encouraging when we consider possible games, storytelling approaches, and even medical assessment and treatment.

What do you think is the biggest challenge to realizing VR/AR’s potential?

Just the usual conflict of having a low number of capable platforms at first, with a wide-open market and a very difficult path to profits – but by the time there are 100 million high-quality VR or AR headsets out there and huge profits, the successful early movers will have locked a lot of others out. It’s a pattern the games industry has seen many times in the past with each significant new platform. I’ve likened it to surfing in previous lectures: the trick is knowing when to start paddling to catch a big wave. Too early and you may run out of steam before the wave comes; too late and others will be way ahead of you.

How does your background in design influence your view on how VR/AR applications are currently being developed?

I’ve always been interested in the cutting edge of new technologies and how they affect game design. I worked on something called The 400 Project years ago that explored rules of thumb for good game design. A new medium like VR or AR requires new rules (and different rules for VR than for AR). I’ve learned over the years that considering the fundamental evolution and biology that makes us human is a good place to start with new technologies, so you can find truly appropriate, fresh approaches and not simply take the “shovelware” approach of pushing old creative approaches from previous platforms onto the new technology.

How can neuroscience help to create a better VR/AR experience?

By giving us a better understanding of how the brain works in concert with our visual perception and inner-ear vestibular system to create the experiences we have in VR/AR. For example, visual system researchers have learned that we process foveal vision (the surprisingly small area at the center of our field of view) differently from our peripheral vision. Peripheral vision does a lot to help orient us in 3D space, and if the information in our peripheral view is out of sync with how our inner ear tells us our body is moving, it can be disturbing – but we’ve learned that graying out just the peripheral view when you’re moving the player in VR, while keeping a small circle of foveal view clear, can convey the sense of movement without causing distress. That’s just one example; I’ll be talking about a lot more, and providing some examples of how to use them.
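The peripheral gray-out Falstein describes is often implemented as a "comfort vignette" whose strength scales with the player's artificial movement speed. As a rough illustration only (the function, parameter names, and default angles below are hypothetical, not taken from the talk or any specific engine), the per-pixel gray factor might be computed like this:

```python
def vignette_alpha(angle_from_center_deg, speed,
                   foveal_radius_deg=15.0, fade_width_deg=20.0,
                   max_speed=5.0):
    """Return how strongly to gray out a pixel: 0.0 = untouched, 1.0 = fully gray.

    Pixels inside the preserved foveal circle stay clear; peripheral pixels
    fade toward gray, and the overall effect ramps up with movement speed.
    All thresholds here are illustrative placeholders, to be tuned per title.
    """
    # Overall vignette intensity for this movement speed, clamped to [0, 1]
    intensity = min(max(speed / max_speed, 0.0), 1.0)

    # Keep the central (foveal) circle fully visible
    if angle_from_center_deg <= foveal_radius_deg:
        return 0.0

    # Smooth ramp from the foveal edge out to the full periphery
    t = min((angle_from_center_deg - foveal_radius_deg) / fade_width_deg, 1.0)
    return t * intensity
```

In practice this would run per-fragment in a shader rather than in Python, but the shape is the same: a radial falloff keyed to eccentricity, modulated by locomotion speed so the vignette disappears entirely when the player is standing still.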

Register for VRDC Fall 2017 to hear more about neuroscience of VR from Falstein and join other creators of amazing, immersive experiences at the premier industry event.

Gamasutra, VRDC, and GDC are sibling organizations under parent UBM Americas
