If you could create worlds with infinite detail and lighting, what would you make? And how would you go about doing it? Here are some of the things I’ve experienced as a developer, and where the technology is headed.
April 11, 2022
Sponsored by NVIDIA
by Richard Cowgill
Every gamer has experienced it. You're fully engaged in your favorite game of the moment. Tension is high, adrenaline is rushing, you are fully immersed…but then you see something that just does not look right. It could be something as subtle as a light, a reflection, or a shadow that doesn’t react the way it should or look the way it does in nature. And that snaps you back to reality. It takes you out of the game’s world and puts you back in front of your keyboard and display.
This is the challenge we face as game developers, and now we have a powerful new tool to combat it. With the technologies found inside Unreal Engine 5, we’re approaching the ability to create worlds with a quality, fidelity, scope and scale we’ve never seen before.
It raises the question: if you could create worlds with infinite detail and lighting, what would you make? And how would you go about doing it?
I’ve done art and design on projects big and small, and every one of them has had to answer the same questions: What is our visual target? How is it going to be achieved?
On every game I’ve been involved with, we have had to figure out, often from scratch, how we were going to reach our graphics goals. Each generation of hardware presented us with new goals to pursue. Usually that meant meeting higher, more realistic standards. Sometimes, however, you would need to think back, digging through your knowledge of late ‘90s techniques, to deliver high-quality graphics on a mobile phone.
All techniques and methods have their shortcomings; no solution gives you everything you want. For example, if you want both high-quality lighting and fast rendering, you’ll be baking your levels with lightmaps. That comes at the cost of increased memory usage, the lighting isn’t dynamic, and it can be unaffordable in open worlds, which account for a large share of the games made today. Perhaps most dauntingly, time spent waiting on light bakes dramatically reduces work efficiency.
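To put the memory cost in rough perspective, here’s a back-of-envelope calculation; every figure in it is an illustrative assumption rather than a measurement from any particular game.

```python
# Rough back-of-envelope estimate of baked lightmap memory.
# All figures below are illustrative assumptions, not measurements.
num_static_meshes = 5000      # static objects in a mid-sized level
lightmap_resolution = 256     # texels per side, per object
bytes_per_texel = 4           # varies with compression format; 4 is a round figure

total_bytes = num_static_meshes * lightmap_resolution ** 2 * bytes_per_texel
print(f"~{total_bytes / 2**20:.0f} MiB of lightmap data")  # ~1250 MiB
```

Multiply that across the streamed sublevels of an open world and it becomes clear why baked lighting alone can be hard to afford.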
This is why I was excited in 2018 when I heard that real-time ray tracing was going to be a reality and not just a pipe dream. I knew that ray tracing was a foundation for consistent, realistic lighting under a unified renderer. I had grown up with 3D software that could render ray-traced images offline on consumer hardware. In the ‘90s, you could render a single rudimentary ray-traced frame in hours, and only if you took shortcuts and kept the resolution very low.
The technology was coming very fast to Unreal Engine, which I had spent much of my professional and hobbyist career using. I knew UE very well, and it seemed like a great fit for my needs.
Unreal Engine 4.22 was released almost exactly three years ago, with Epic re-architecting major portions of the renderer and how it handled geometry. I jumped on the technology as soon as I could. At the time I had a GTX 1080 that I’d purchased in 2016, and to NVIDIA’s credit, performance kept improving over the lifetime of that GPU, so even it gained a basic level of real-time ray tracing capability.
Back then, I considered it a victory if I got anything more than one frame per second. Putting the words “real-time” next to “ray tracing” seemed impossible.
To my surprise, despite some early glitches and growing pains, I could get 20-30fps at a lower resolution on that hardware and was very happy with the quality. You could see how Unreal Engine 4 had the potential to enable a vast improvement in graphics, but it was going to take time for the software to advance and the hardware to catch up.
During my time at NVIDIA, we have seen dramatic improvements in both areas. The software is in a better place, much faster and more capable than ever, with advances in global illumination, shadows, denoising, and upscaling. This technology, and all the associated technologies RTX hardware enables, moves fast.
It’s moved so fast that I don’t believe many developers have fully caught up with the options they now have at their disposal. Unreal Engine has not held still; in fact, it has gone into hyperdrive. Lumen, a core feature of Unreal Engine 5, is a real-time rendering technology that uses software ray tracing to approximate the results of hardware ray tracing, and switches to hardware ray tracing on systems that support it. Nanite, another new UE5 technology, gets us a step closer to scenes of near-infinite geometry. This is important because if you’re going to simulate an entire city, you’ll need a system that can render geometric complexity at orders of magnitude beyond what was ever thought possible.
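To make that concrete, here is a small editor Python sketch of flipping those systems on: it sets Lumen-related console variables and enables Nanite on one static mesh asset. The console variable names and the nanite_settings property reflect UE 5.0 as I understand it, so treat them as assumptions to verify against your engine version; the asset path is purely hypothetical.

```python
import unreal

# Switch global illumination and reflections to Lumen, and prefer the
# hardware ray tracing path when the GPU supports it. Console variable
# names are as of UE 5.0 -- verify against your engine version.
for command in (
    "r.DynamicGlobalIlluminationMethod 1",  # 1 = Lumen
    "r.ReflectionMethod 1",                 # 1 = Lumen reflections
    "r.Lumen.HardwareRayTracing 1",         # use hardware ray tracing if available
):
    unreal.SystemLibrary.execute_console_command(None, command)

# Enable Nanite on a single static mesh asset (the path is hypothetical,
# and the property name is an assumption based on UE 5.0).
asset_path = "/Game/Meshes/SM_Cliff"
mesh = unreal.EditorAssetLibrary.load_asset(asset_path)
nanite_settings = mesh.get_editor_property("nanite_settings")
nanite_settings.set_editor_property("enabled", True)
mesh.set_editor_property("nanite_settings", nanite_settings)
unreal.EditorAssetLibrary.save_asset(asset_path)
```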
Our goal at NVIDIA has been to expand what rendering can do while making it faster, more capable, and higher quality. One day we hope to leave light bakes behind in favor of fully dynamic rendering. That will give development teams better workflows: if you can see your lighting in real time, you can work faster and more creatively. And it will help games achieve more dynamic, even more procedural, environments.
It’s worth noting that we’re still transitioning between older rasterized techniques and newer forms of rendering like ray tracing. With Unreal Engine 5, however, we’re seeing several key advancements. Offline path tracing, a higher-fidelity form of rendering, has been a usable option in UE since ray tracing was introduced. Combined with the Movie Render Queue (MRQ) system, it allows artists to render out high-quality photorealistic movies and still images from their game content.
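As a small illustration, the path tracer’s quality can be pushed up from editor Python before a render. The sketch below assumes UE5’s r.PathTracing console variables and uses illustrative values, so verify the names against your engine version; MRQ also exposes its own sample-count settings in its render configuration.

```python
import unreal

# Raise path tracer quality before rendering high-quality stills or movies.
# Console variable names are assumed from UE5's path tracer -- verify them
# against your engine version before relying on this.
path_tracer_settings = {
    "r.PathTracing.SamplesPerPixel": 2048,  # more samples = less noise, longer renders
    "r.PathTracing.MaxBounces": 8,          # deeper light transport for interiors
}
for cvar, value in path_tracer_settings.items():
    unreal.SystemLibrary.execute_console_command(None, f"{cvar} {value}")
```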
That brings us to our current work on ray tracing. We’ve developed a new way of doing real-time light and shadow, called RTXDI (Ray-Traced Direct Illumination), that allows for very high light counts. You can place all the shadow-casting lights you want, of any type, and they’ll be rendered in a single pass. Light counts effectively no longer matter: in the RTXDI SDK, we have scenes with hundreds of thousands of shadow-casting lights. Given that for the past three decades we’ve been limited to scenes with one to four shadow casters, RTXDI represents a major step forward.
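Under the hood, RTXDI builds on reservoir-based resampled importance sampling (the ReSTIR family of techniques): rather than evaluating every light at every pixel, each pixel streams a few candidate lights through a tiny reservoir and keeps one in proportion to its estimated contribution. The Python below is only a conceptual, single-reservoir sketch of that idea, not NVIDIA’s implementation; the light list and target_pdf function are stand-ins you would supply.

```python
import random

class Reservoir:
    """Keeps one candidate, chosen with probability proportional to its weight."""
    def __init__(self):
        self.sample = None
        self.weight_sum = 0.0
        self.num_candidates = 0

    def update(self, candidate, weight):
        self.weight_sum += weight
        self.num_candidates += 1
        # Replace the kept sample with probability weight / weight_sum.
        if self.weight_sum > 0.0 and random.random() < weight / self.weight_sum:
            self.sample = candidate

def pick_light(lights, target_pdf, num_candidates=32):
    """Resampled importance sampling: examine a few lights, keep one good one.

    target_pdf(light) should approximate the light's contribution at the
    shading point (unshadowed radiance, distance falloff, and so on).
    """
    reservoir = Reservoir()
    source_pdf = 1.0 / len(lights)          # candidates drawn uniformly
    for _ in range(num_candidates):
        light = random.choice(lights)
        reservoir.update(light, target_pdf(light) / source_pdf)

    if reservoir.sample is None or target_pdf(reservoir.sample) == 0.0:
        return None, 0.0
    # Unbiased contribution weight used when shading the chosen light.
    contribution_weight = reservoir.weight_sum / (
        reservoir.num_candidates * target_pdf(reservoir.sample))
    return reservoir.sample, contribution_weight
```

The key property is that per-pixel cost scales with the number of candidates examined rather than the total number of lights in the scene, which is what makes hundreds of thousands of shadow-casting lights tractable; the full technique then reuses reservoirs across neighboring pixels and frames.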
Shadow quality is equally important in RTXDI. The technique uses the same shadow and penumbra algorithms that are inherent to offline path tracing, and matches their output exactly, which is astounding in real time. Traditional real-time ray tracing (almost a funny thing to say, given that it’s only three years old!) approximates realistic shadow behavior, whereas RTXDI reproduces it.
We’ve developed RTXDI as an SDK, and the heart of this work has been brought into our branch of Unreal Engine, called NVRTX. With NVRTX coming to UE5, you have a system for near-infinite geometry (Nanite) married to a system for near-infinite lighting (RTXDI).
Of course, Unreal Engine 5 is new, and it’ll take time to fully bring these systems together and find ways to take advantage of them. But it may lead to one of the greatest challenges of all: What would you do with all this power at your disposal?
We’re not far off from this goal. In the span of a typical AAA development cycle, two or three years, we will likely see this mature and come together in ways that will enable us to create things that were thought impossible in 2022. It may take a big imagination to take advantage of it.
Where will graphics be in 2025? My hope is the maturation of today’s technologies will result in a world where you don’t have to make many compromises. You can have great graphics that run fast and are fully dynamic, without baking or interrupted workflows.
Learn about NVIDIA resources for Unreal Engine developers here.
- - -
Richard is a game developer with over 20 years of experience, having contributed to the Battlefield and Borderlands series as well as many others. His indie game Stay in the Light is one of the first ray-tracing-only games. As Ray Tracing Unreal Evangelist for NVIDIA, his goal is to help developers fully utilize Unreal Engine and ray tracing technology.