This post describes how we modified our multiplayer code for our GearVR game Daydream Blue in order to stream the game to other devices and displays, without any significant performance hit or framerate drop.
A couple of weeks ago I had a series of conversations about the fact that people were not streaming or recording video of GearVR games because there was no universal solution for streaming from the device. The only readily available option was sending frame data, which crushes the frame rate and requires significant bandwidth.
We have developed a method, leveraging our multiplayer tech for Daydream Blue, to render what the player is seeing in the GearVR on additional monitors and screens. Our technique has minimal impact on performance and allows for an undistorted camera view.
Unfortunately, we don’t have a solution for everyone, and getting something similar up and running will require you to write a bit of network code. Nevertheless, this solution works well for us, and we wanted to share a high-level outline with others who may be interested.
We are using Unity and Photon Networking. We run a modified version of our multiplayer networking code on a PC that is also running the game. The difference from standard multiplayer is that we sync the player themselves to the PC, as opposed to syncing them as just another avatar in the world.
We essentially drive a marionette version of the player over the network from the GearVR. This means we send a small amount of data each frame to keep the world in sync and drive the camera, and let the game code play itself out on the PC, which renders each frame.
The downside of this method is that there is a significant amount of information, especially around UI elements, that the player sees but that we don’t normally sync with other players. This required a bit of extra finagling, and while we still have features we want to sync, it was worth it the first time we could watch what people were doing in the GearVR. Additionally, all of our player info is diegetic or meta in presentation, so maintaining social presence really requires us to sync almost all of it over the network for multiplayer as well.
This technique requires two instances of the game. The player fires up the version on their PC, then runs the game on their GearVR. To keep our streaming mode separate from standard multiplayer, we use a constant string value, concatenated with a unique value for the event or the streamer's YouTube/Twitch handle, to tell the Daydream Blue network code this is streaming mode. We then automatically connect the GearVR into this defined room on the server. Using keyboard input on the PC version, we (or the streamer) connect to the same Photon room on the server, and everything begins syncing.
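As a rough illustration of that connection flow, something like the following could work with Photon Unity Networking (the PUN API of that era). The room-name prefix, streamer handle, key binding, and auto-join-on-Android check are placeholder assumptions, not Daydream Blue's actual values.

```csharp
using UnityEngine;

// Sketch: joins a dedicated "streaming" room instead of normal matchmaking.
// The prefix, handle, and F9 key are illustrative placeholders.
public class StreamingRoomConnector : Photon.PunBehaviour
{
    const string StreamRoomPrefix = "STREAM_";        // constant marker for streaming mode
    public string streamerHandle = "MyTwitchHandle";  // unique per event or streamer

    void Start()
    {
        PhotonNetwork.ConnectUsingSettings("1.0");
    }

    // Assumes auto-join lobby is enabled in the Photon server settings.
    public override void OnJoinedLobby()
    {
        // The GearVR build joins the streaming room automatically;
        // the PC build waits for a key press (see Update below).
        if (Application.platform == RuntimePlatform.Android)
            JoinStreamRoom();
    }

    void Update()
    {
        // On the PC build, a keyboard shortcut joins the same room.
        if (Input.GetKeyDown(KeyCode.F9) && !PhotonNetwork.inRoom)
            JoinStreamRoom();
    }

    void JoinStreamRoom()
    {
        string roomName = StreamRoomPrefix + streamerHandle;
        PhotonNetwork.JoinOrCreateRoom(roomName, new RoomOptions(), TypedLobby.Default);
    }
}
```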
Every frame, we send camera rotation data to keep the camera in sync with as little latency as possible. In the same way that latency can disconnect a player from their actions in-game, we want to keep our PC/monitor view within a threshold of the player's head movement in order to preserve the feeling of a window into the virtual world. We keep objects in the world in sync by interpolating and syncing transform values, sending state data, and sending event data. For some objects, we send force and velocity data as well as transform data, and interpolate between values. This lets us leverage the Unity physics system while keeping things in sync over time; the interpolation of transform values is necessary to prevent objects from drifting out of sync due to the nondeterministic nature of Unity physics.
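For the camera portion, a minimal sketch using PUN's serialization callback might look like this. The component name, smoothing factor, and Slerp-based interpolation are assumptions for illustration, not our exact implementation.

```csharp
using UnityEngine;

// Sketch: streams head rotation each serialization tick and smooths it on the
// receiving (PC) side so the spectator camera stays within a small threshold of
// the player's actual head movement.
[RequireComponent(typeof(PhotonView))]
public class HeadRotationSync : Photon.MonoBehaviour
{
    Quaternion networkRotation;

    void Start()
    {
        networkRotation = transform.rotation;
    }

    // Called by the PhotonView observing this component.
    void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.isWriting)
        {
            // GearVR side: send the current head rotation.
            stream.SendNext(transform.rotation);
        }
        else
        {
            // PC side: store the latest rotation and interpolate toward it in Update.
            networkRotation = (Quaternion)stream.ReceiveNext();
        }
    }

    void Update()
    {
        if (!photonView.isMine)
        {
            // Slerp toward the last received rotation to hide network jitter
            // while staying close to the player's real head movement.
            transform.rotation = Quaternion.Slerp(transform.rotation, networkRotation, Time.deltaTime * 15f);
        }
    }
}
```

The same pattern extends to physics objects by also sending position and velocity through the stream and lerping the transform toward the received values, which is what keeps Unity's nondeterministic physics from drifting apart between the two instances.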
UI brings additional work and challenges. For example, our movement system is a blink-and-warp approach where the screen fades out, the player moves, and the screen fades back in. In standard multiplayer, there is no reason to sync the fade effect; only the local client needs to see it. However, when we are syncing the same avatar, these effects and UI pieces all need to be sent over the network so that people watching the stream have all the information about the game world state and what the player is doing.
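One way to broadcast such an effect is to wrap it in an RPC so the PC instance fades in step with the headset. The CanvasGroup-based fade, timings, and method names below are placeholder assumptions rather than Daydream Blue's actual fade system.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: in standard multiplayer the blink-and-warp fade is local-only, but in
// streaming mode the same event runs on every connected instance so the stream
// matches what the player sees in the headset.
public class BlinkWarpSync : Photon.MonoBehaviour
{
    public CanvasGroup fadeOverlay;   // full-screen black overlay present in both builds

    // Called locally when the player triggers a blink-and-warp move.
    public void BlinkWarpTo(Vector3 destination)
    {
        photonView.RPC("RPC_BlinkWarp", PhotonTargets.All, destination);
    }

    [PunRPC]
    void RPC_BlinkWarp(Vector3 destination)
    {
        StopAllCoroutines();
        StartCoroutine(DoBlinkWarp(destination));
    }

    IEnumerator DoBlinkWarp(Vector3 destination)
    {
        yield return Fade(0f, 1f, 0.2f);   // fade to black
        transform.position = destination;  // move while the screen is dark
        yield return Fade(1f, 0f, 0.2f);   // fade back in
    }

    IEnumerator Fade(float from, float to, float duration)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            fadeOverlay.alpha = Mathf.Lerp(from, to, t / duration);
            yield return null;
        }
        fadeOverlay.alpha = to;
    }
}
```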
Watching a camera look about randomly because the UI system is not synced leads to confusion and some boring gameplay footage! We briefly considered sending input data for UI syncing, but realized that small discrepancies in gaze direction could yield very different results. Instead, we settled on sending RPC events when buttons are clicked. This means far less data passing over the network, and we let our UI system play out as it normally would. We are still working on getting everything synced, but so far everything is working well!
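Conceptually, only a button identifier crosses the network once a click is confirmed locally, and each instance's UI system resolves it on its own. The handler and button IDs below are hypothetical examples of that pattern.

```csharp
using UnityEngine;

// Sketch: rather than streaming gaze direction (where tiny discrepancies could
// select different UI elements on each instance), send a button identifier when
// a click is confirmed and let each instance's UI play out the result itself.
public class UIButtonSync : Photon.MonoBehaviour
{
    // Called by the local gaze/input system once a click on a button is confirmed.
    public void OnLocalButtonClicked(string buttonId)
    {
        photonView.RPC("RPC_ButtonClicked", PhotonTargets.All, buttonId);
    }

    [PunRPC]
    void RPC_ButtonClicked(string buttonId)
    {
        // Both the GearVR and PC instances resolve the same press, so menus and
        // panels stay in step without sending raw input data.
        switch (buttonId)
        {
            case "OpenMenu":      /* open the menu panel locally */      break;
            case "ConfirmAction": /* trigger the confirmed action locally */ break;
            default: Debug.LogWarning("Unhandled button id: " + buttonId); break;
        }
    }
}
```

The design choice here is that the data sent is tiny and deterministic, while all the visual work (highlighting, animation, panel transitions) happens locally on each instance.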
The number one struggle for us was the standard multiplayer code. While our streaming solution required additional code, the bulk of our time was spent just getting standard multiplayer up and running - which means, unfortunately, that a dev team without multiplayer couldn’t use our streaming solution without first building a multiplayer solution. The effort made sense for us because multiplayer is a major feature of Daydream Blue.
The other significant challenge we faced dealt with UI state and data. Inventory data and numbers are not synced over the network in standard multiplayer, as other players have no need to know these values. Events are sent when things are pulled out or put in the inventory; what is inside the inventory is a mystery to other players. For streaming, we needed this information in order to display UI information correctly. This is one example of the many challenges we faced with keeping data in sync with the UI.
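One possible way to handle that extra data is to push the inventory contents to the PC instance whenever they change, so it can draw the same UI. The data layout and method names below are placeholders, not our actual inventory code.

```csharp
using UnityEngine;

// Sketch: inventory contents are not synced in standard multiplayer, so in
// streaming mode the GearVR instance sends the full item list whenever it
// changes, letting the PC instance display the same inventory UI.
public class InventorySync : Photon.MonoBehaviour
{
    // Called on the GearVR instance whenever an item is added or removed.
    public void OnInventoryChanged(string[] itemIds, int[] counts)
    {
        if (photonView.isMine)
            photonView.RPC("RPC_SetInventory", PhotonTargets.Others, itemIds, counts);
    }

    [PunRPC]
    void RPC_SetInventory(string[] itemIds, int[] counts)
    {
        // PC instance: rebuild the inventory UI from the received data.
        for (int i = 0; i < itemIds.Length; i++)
            Debug.Log(itemIds[i] + " x" + counts[i]); // replace with the actual UI update
    }
}
```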
A few days ago we set up at a demo event and tried out streaming in public for the first time. It was incredibly exciting to finally see what players were seeing in our world, after six months of playtesting in the dark :p We also watched as the live-stream attracted attention from the event attendees, and unlike with a video, we could answer their first question by saying “Yes, that is what the player is seeing!”
We hope our solution will help some developers implement a method to stream their games. We are looking for YouTube personalities and Twitch streamers to play Daydream Blue, so if anyone has advice on that front please let me know in the comments! We hope other GearVR devs implement something similar as well, so the GearVR can get more attention on media outlets!
Thanks for reading!