I will go over how physics-driven VR delivers a more Connected Immersive Experience than non-physics-driven VR.
Hello VR friends! In a previous post, I went over a few tips for preparing a scene to be used in VR. In this post, I will be using that scene to demonstrate my core VR development package called PuppetJump. I will go over what sets PuppetJump apart from other VR starter packages and demonstrate how its features can help create a more connected immersive experience. In this post, I will be using Connected Immersion, or Connected Immersive Experiences, as terms that describe how physically connected a user feels to a virtual world and its objects.
Just about any VR starter package provides you with the same basic set of features: the ability to touch, grab, and throw virtual objects, and the ability to explore virtual space using either teleportation or continuous motion. These features are the foundation for interaction in virtual worlds, but they fall short of delivering a truly connected immersive experience. Why do I say that?
The more you use VR applications, the more you might start to feel something is missing. You can move through spaces, you can look around, you can get a sense that you are somewhere else, you can reach out and interact with objects, you might even feel immersed, but do you feel connected? And if not, why? One answer to this disconnected feeling could be that although you are surrounded by virtual objects that you can move, none of them can move you. Here are a few examples of what I mean by that, some of which you may have seen in VR applications you have tried.
You might reach out to touch a table only to watch your hand pass straight through it as if it wasn't even there.
You might approach a wall, lean into it, only to find there's nothing rendered on the other side but an empty void.
You might grab an object, and while you hold it, it passes through other objects as if it, or they, were ghosts.
You might see two objects in a scene: a box of cereal and the table it is sitting on. Maybe both objects can be picked up separately, but when you do, there is no noticeable difference in weight between the two. Clearly a table shouldn't weigh the same as a box of cereal, or be moved as easily.
There are two things all of these examples have in common. First, they all have the potential to be immersion-breaking events, and second, they are all the result of the lack of any sophisticated force feedback technology. Without this feedback, there is no connection between the physics of the real world and the physics of the virtual world; therefore, no virtual object can affect our real-world bodies in motion.
One of the main reasons I created PuppetJump was not to solve this problem but, at the very least, to make it better. Mechanical engineers will eventually solve this with new hardware, but until then it's up to us software developers to do the best we can with what we have. PuppetJump's core concept is to take input from current VR hardware and pass that information through the physics simulation of the virtual world before rendering it to your screen. This way, at least whatever we see will be behaving under the same laws of physics, which should ultimately result in a more connected immersive experience. Although I have been working with this concept for quite some time, the first absolute need for it came with a project I was working on, and the first real evidence that it could be achieved convincingly came when I played Half-Life: Alyx.
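To make that concept concrete, here is a minimal sketch of the general technique in Unity, assuming a Transform in your XR rig that tracks the controller; the names are hypothetical and this is not PuppetJump's actual code. Instead of snapping the hand to the tracked pose, the hand gets a Rigidbody, and each physics step is given the velocity it would need to reach the target, so the physics solver, not the renderer, gets the final say on where it ends up.

```csharp
using UnityEngine;

// Hypothetical illustration of the core idea, not PuppetJump's code:
// route tracked input through the physics step instead of around it.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsDrivenHand : MonoBehaviour
{
    // Assumed to be a transform driven directly by the VR hardware,
    // e.g. a controller anchor under your XR rig.
    public Transform trackedTarget;

    private Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;        // the player's real arm "carries" the hand
        rb.maxAngularVelocity = 20f;  // allow quick wrist rotations
    }

    void FixedUpdate()
    {
        // Velocity that would land the hand exactly on the tracked pose this
        // step. If a surface is in the way, the solver stops the hand there.
        rb.velocity = (trackedTarget.position - rb.position) / Time.fixedDeltaTime;

        // Same idea for rotation: turn the remaining rotation into an
        // angular velocity for this physics step.
        Quaternion delta = trackedTarget.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        rb.angularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}
```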
In the vast majority of VR applications, interactions play out like the examples described above. I am using Rec Room to demonstrate this. You can see that my hands are not at all impeded by any virtual surface. The same issue is present while holding an object: there is no system in place to make the held object collide with other surfaces. This problem exists because the core system simply takes the absolute positional data provided by the VR hardware. That positional data is never part of the physics simulation, so it is never affected by it.
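For contrast, the non-physics approach this footage shows boils down to something like the following sketch (names hypothetical): the hand is snapped to the tracked pose every frame, so the physics solver never gets a chance to stop it.

```csharp
using UnityEngine;

// Sketch of the absolute-positioning approach described above.
public class AbsolutePoseHand : MonoBehaviour
{
    public Transform trackedTarget; // driven directly by the VR hardware

    void Update()
    {
        // No Rigidbody in the loop, so no collision response:
        // the hand passes straight through walls, and held objects clip.
        transform.SetPositionAndRotation(trackedTarget.position, trackedTarget.rotation);
    }
}
```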
In contrast, let's look at some footage from Half-Life: Alyx. Some of what is going on here is hard to show, but you can clearly see that when my hand hits a virtual surface it stops, unable to pass through. This makes the virtual world seem more real, and it makes me feel more connected to and bound by it. Also notice how my hand is able to push objects, though not without resistance. This lag between where my real-world hand is and where my virtual hand is shows that physical forces are being applied both by me and on me from the impact of the virtual object. This makes me feel as though the virtual objects have weight and that we are connected by the laws of physics.
Weight can also be felt when trying to grab and lift objects in Half-Life: Alyx. There are many objects that I can reach out and grab with one hand. But there are other objects that I cannot move, or can barely move, unless I grab them with both hands. It's difficult to convey this feeling by watching a video; it has to be felt during the experience, because the heavier the object, the greater the lag between my real-world hand and my virtual hand. Imagine stretching a rubber string between your hand and the object you are grabbing: you must continually supply more and more force to lift heavier objects. Again, this makes virtual objects feel more real and leaves me feeling more connected to the experience.
Let's take a look at PuppetJump in action to see if I can better illustrate some of these concepts. To help with the explanation, I am going to place blue cubes at the absolute positions provided by the VR hand controllers (the real-world position, or RWP), while the hand models will indicate the PuppetJump-adjusted positions (the virtual-world position, or VWP). The hand model positions (VWP) are passed through Unity's physics simulation before being rendered to the screen, while the blue cube positions (RWP) are not.
First, notice that when I am not touching anything, the PuppetJump hands and the blue cubes share the same position and rotation. This is exactly what we want, because moving our hands around in empty space should not be affected by any virtual forces other than gravity. The separation between RWP and VWP happens when I come into contact with virtual objects. Notice that when I touch a virtual surface, the PuppetJump hand stops while the blue cube continues to move, following the absolute real-world input. The PuppetJump hand is not ignoring the real-world input; it is simply following the rules of physics in the virtual world. The PuppetJump hand is constantly trying to get to where the real-world hand is, and although it appears to have stopped because it is impeded, it is actually building up more and more force behind it as the separation between the RWP and VWP grows. You can see the results of that built-up force when the RWP moves in a direction that allows the VWP to reach it unimpeded. This is the rubber string effect I talked about earlier.
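To make the rubber string concrete, here is one way to get that behavior with stock Unity physics, under the same hypothetical setup as the first sketch; it swaps the velocity-matching approach for a spring force, so the pull genuinely grows with the separation between RWP and VWP. This is an illustration of the idea, not PuppetJump's actual tuning.

```csharp
using UnityEngine;

// Hypothetical illustration, not PuppetJump's code: a spring "rubber
// string" between the raw tracked pose (RWP) and the simulated hand (VWP).
[RequireComponent(typeof(Rigidbody))]
public class RubberStringHand : MonoBehaviour
{
    public Transform realWorldPose;     // RWP: raw controller pose from the XR rig
    public float springStrength = 800f; // pull per meter of separation
    public float damping = 40f;         // bleeds off velocity so the hand doesn't oscillate

    private Rigidbody rb;               // VWP: the physics-simulated hand

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;
    }

    void FixedUpdate()
    {
        // The further an obstacle or a heavy object holds the virtual hand
        // back, the larger the separation and the harder the spring pulls.
        Vector3 separation = realWorldPose.position - rb.position;
        rb.AddForce(separation * springStrength - rb.velocity * damping);
    }
}
```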
This build-up of force is also what creates the sense of weight for a virtual object. If the VWP comes into contact with a light, movable object, the separation between RWP and VWP will remain small as the object moves. However, when a virtual object has more mass, the difference between the RWP and VWP will be greater, because a greater amount of force is required to move it. Although there is no physical effect on the RWP, due to the lack of force feedback in the real world, there is a physical and visual effect on the VWP. This tricks our brain into thinking virtual objects have weight, and it makes us feel more connected to the virtual world, because we are used to objects in the real world having a variety of weights.
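Note that with a spring model like the sketch above, this behavior falls out for free: while you push, the spring force roughly balances the force the object needs in order to move, so the visible gap settles at about that force divided by the spring strength. A heavier object demands more force, which means a proportionally larger gap between RWP and VWP, with no special-case code for weight.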
The same effect applies to grabbing objects in PuppetJump. Not only do held objects collide with other objects, but they also give a sense of weight. Objects with a low mass can be grabbed with one hand, while objects with a large mass might require two hands to pick up. Heavier virtual objects also cannot be moved around as quickly as lighter ones. This is because physical forces, including gravity, are always in effect between virtual objects and the VWP of PuppetJump's hands.
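Here is a hedged sketch of how a physics-based grab can work, again using plain Unity components with hypothetical names rather than PuppetJump's actual API. The key choice is to connect the object to the hand with a joint instead of parenting it, so the object keeps its Rigidbody, keeps colliding, and its mass keeps resisting the hand's pull.

```csharp
using UnityEngine;

// Hypothetical grab sketch (not PuppetJump's API). Lives on the
// physics-driven hand; joints require a Rigidbody, which the hand
// already has.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsGrabber : MonoBehaviour
{
    private FixedJoint joint;

    public void Grab(Rigidbody target)
    {
        // Joint instead of parenting: the held object stays in the
        // simulation, so it still collides and its mass still matters.
        joint = gameObject.AddComponent<FixedJoint>();
        joint.connectedBody = target;
        joint.breakForce = 5000f; // let go if the player yanks something immovable
    }

    public void Release()
    {
        if (joint != null) Destroy(joint);
    }
}
```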
There are many use cases for physics-driven VR input. Levers, turnable wheels, and even buttons on a virtual keyboard can all feel more real by allowing separation between RWP and VWP. In the end, it's all intended to provide the user with a sense of connected immersion.
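As one example, a lever like the ones mentioned above can be built almost entirely out of stock Unity physics; a minimal, assumed setup might look like this, with the physics-driven hand doing the pushing.

```csharp
using UnityEngine;

// Minimal assumed lever setup: a HingeJoint with limits. The physics-
// driven hand can push the handle, and the joint stops it at the ends
// of its travel, so the lever resists just like a real one.
[RequireComponent(typeof(Rigidbody))]
public class LeverSetup : MonoBehaviour
{
    void Awake()
    {
        var hinge = gameObject.AddComponent<HingeJoint>();
        hinge.axis = Vector3.right;   // pivot around the lever's local X axis
        hinge.useLimits = true;
        hinge.limits = new JointLimits { min = -45f, max = 45f };
        // connectedBody left null: the hinge anchors the lever to the world.
    }
}
```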
Although I have made an attempt here, the truth is there's no great way to explain how this type of connected immersion feels. Watching a video or reading an explanation does not do it justice. The inability to explain, or to understand, without experiencing it is exactly what can separate VR experiences from all other mediums. Connected immersion, as I have defined it here, can not only make VR experiences better but may also be a clear justification for why an experience requires VR at all.