What makes for a believable, truly human game character?

What makes for a believable, truly human character? And why is realism increasingly important to the gaming community? Cubic Motion, home to the world's most powerful facial animation technology, reveals the pipeline behind AAA game characters.

David Barton, Blogger

February 22, 2019


Humans interact with each other every day and, as a result, are able to recognise a wide range of complex facial expressions. We constantly analyse and communicate through body language, even if subconsciously.

What if we could give a computer that same ability? What if technology could understand an image in the same intrinsic way humans can? At Cubic Motion, we use computer vision technology to quickly translate human expressions into data and apply the results to almost any medium – from video games to film, VR and holograms. The result is increasingly lifelike game characters, as seen in Sony's God of War, Ninja Theory's Hellblade: Senua’s Sacrifice, Insomniac's Spider-Man, and other AAA titles. 
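As a rough illustration of what "translating expressions into data" can mean, the sketch below fits blendshape weights to tracked facial landmarks with a simple least-squares solve. Everything here is a stand-in: Cubic Motion's actual solver is proprietary, and the landmark count, blendshape basis, and data are invented for the example.

```python
import numpy as np

# Hypothetical illustration: recover blendshape weights from tracked
# 2D facial landmarks. This is not Cubic Motion's solver; it just
# shows the general idea of turning expressions into animation data.

NUM_LANDMARKS = 68   # e.g. a standard 68-point face annotation scheme
NUM_SHAPES = 40      # number of blendshapes on the character rig

rng = np.random.default_rng(0)

# Each column describes how one fully-engaged blendshape displaces the
# flattened (x, y) landmark vector away from the neutral pose.
basis = rng.normal(size=(NUM_LANDMARKS * 2, NUM_SHAPES))
neutral = rng.normal(size=NUM_LANDMARKS * 2)

def solve_weights(tracked_landmarks: np.ndarray) -> np.ndarray:
    """Least-squares fit of blendshape weights to one video frame."""
    delta = tracked_landmarks.ravel() - neutral
    weights, *_ = np.linalg.lstsq(basis, delta, rcond=None)
    return np.clip(weights, 0.0, 1.0)  # rig weights live in [0, 1]

# One frame of tracked landmarks from a head-mounted camera (stand-in data).
frame = (neutral + basis @ rng.uniform(0, 1, NUM_SHAPES)).reshape(NUM_LANDMARKS, 2)
print(solve_weights(frame)[:5])
```

In production this runs per frame, per camera, so the solve has to be fast and robust to occlusion; the least-squares step above is only the simplest possible stand-in for that machinery.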

Put simply, we teach the computer to read and record complex facial expressions and instantly reproduce this spectrum of emotion in a digital human. But what makes for a believable, truly human character? And why is realism increasingly important to the gaming community?

Capturing every nuance of performance

Translating computer vision data into a truly photoreal character involves a myriad of technologies and animation techniques – from hair shading to skin textures, lighting, and rendering. Every facet requires strict attention to detail and must be brought into a cohesive whole; if any one element drops out of sync, the entire illusion disappears.

Eye movement, for example, needs special attention. Accurate modeling of the sclera and cornea is key to making a character’s eyes as lifelike as possible. We can scan an actor’s eyeball for greater accuracy, making sure light reacts with the pupil correctly. Eyeball shape also affects the skin around the eyes as a character looks around – this 'soft eye' effect is important for realism. Finally, pupil dilation is needed for close-up shots and can be achieved by capturing detailed eyeball movements from high-resolution head-mounted cameras. When all of these nuances come together, the eyes convey a significant amount of emotion in the tiniest of twitches. They’re known as the ‘window to the soul’, after all, and it’s important to get them right.
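The pipeline above captures dilation from real footage, but when captured data isn't available a runtime system sometimes drives it procedurally instead. As a toy sketch of that alternative approach, the snippet below eases a pupil toward a luminance-dependent target diameter; the constants and response curve are invented purely for illustration.

```python
import math

# Illustrative only: a simple procedural pupil-dilation model driven
# by scene luminance, with exponential smoothing so the pupil reacts
# over time rather than snapping. All constants are invented.

PUPIL_MIN_MM = 2.0   # bright-light constriction
PUPIL_MAX_MM = 8.0   # dark-adapted dilation

def target_pupil_mm(luminance: float) -> float:
    """Map scene luminance (0 = dark, 1 = bright) to a pupil diameter."""
    t = 1.0 - max(0.0, min(1.0, luminance))
    return PUPIL_MIN_MM + (PUPIL_MAX_MM - PUPIL_MIN_MM) * t

def step_pupil(current_mm: float, luminance: float, dt: float,
               response_rate: float = 3.0) -> float:
    """Move the pupil toward its target with a smooth exponential lag."""
    target = target_pupil_mm(luminance)
    alpha = 1.0 - math.exp(-response_rate * dt)
    return current_mm + (target - current_mm) * alpha

# A character stepping from a bright exterior into shadow, at 30 fps:
pupil = 3.0
for frame in range(5):
    pupil = step_pupil(pupil, luminance=0.1, dt=1 / 30)
    print(f"frame {frame}: {pupil:.2f} mm")
```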

The process of replicating live performance in a digital character has become more and more nuanced. Doc Ock from Insomniac Games’ Spider-Man is a prime example. Based on the iconic Marvel villain, Dr. Otto Octavius, this character is actually a true-to-life digital double of the actor. The actor is well matched to Doc Ock’s age and body type, so the character inherits all the same skin creases and appropriate shapes. Why is this so important? Because when somebody smiles, and their face starts to crease up, those are the markers for a smile we are used to seeing in reality. Those markers can make or break a digital human.

In fact, Doc Ock is the only game character to have been recognised at the 2019 VES Awards, which celebrate excellence in visual effects. Keep in mind that the Visual Effects Society is known not for video game accolades, but for film. With titles like Spider-Man leading the charge – having received a nomination for ‘Outstanding Animated Character in a Real-Time Project’ – game graphics may soon be able to match the level of CGI used in modern blockbusters.

Maintaining the cinematic illusion

Nowadays, we’re seeing a convergence in broadcast and game production technology, especially as the gap in visual quality starts to narrow between these two mediums. Many feature films and TV series use real-time renderers like Unreal Engine to turn over fast iterations of any given shot. Meanwhile, game developers are starting to provide the same feature-length, realistic narratives usually found on the big screen.

With the growth of RPGs, walking simulators and open world experiences, players are beginning to expect high-quality animation across the entirety of a game – not just in cinematic cutscenes. Developers are racing to deliver on audience demand and that’s why believable, true-to-life characters have become so sought after. For most studios, it’s standard to create 60 minutes or so of beautifully rendered cinematics. But step out of a cutscene, talk to an NPC, and there will be a significant drop-off in quality. The illusion is ruined. The future will come down to animation techniques and technologies that can deliver realism at large scale, capable of creating over 100,000 lines of performance to keep gamers immersed.

When producing a massive volume of characters at high quality, there are certain rules of thumb. At Cubic Motion, we’ve built up enough experience to know that consistency is key. Make sure to use the same scanning process across all talent and build a universal rig across your cast of characters. Well-designed, consistent mesh topology is also a must.
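A concrete way to enforce that consistency is to validate every rig against a reference control set before batch processing begins, so one solver setup fits the whole cast. A minimal sketch, with all control names invented for the example:

```python
# Illustrative consistency check (control names invented): before
# batch-processing a cast, verify every character rig exposes exactly
# the same set of facial controls as the reference rig.

REFERENCE_CONTROLS = {"jaw_open", "brow_raise_L", "brow_raise_R",
                      "blink_L", "blink_R", "smile_L", "smile_R"}

cast_rigs = {
    "hero":   {"jaw_open", "brow_raise_L", "brow_raise_R",
               "blink_L", "blink_R", "smile_L", "smile_R"},
    "npc_01": {"jaw_open", "brow_raise_L", "brow_raise_R",
               "blink_L", "blink_R", "smile_L"},  # missing smile_R
}

for name, controls in cast_rigs.items():
    missing = REFERENCE_CONTROLS - controls
    extra = controls - REFERENCE_CONTROLS
    if missing or extra:
        print(f"{name}: missing={sorted(missing)} extra={sorted(extra)}")
    else:
        print(f"{name}: OK")
```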

The volume of shots can be very high when capturing performances for video games. Inevitably, these performances are broken into individual lines of dialogue. It’s vital to blend between these lines and make a character’s attitude and expressions appear continuous. We recommend having a varied library of idle animations that can be triggered in between the specific performance animations, then trial different blending algorithms to make the transition as smooth as possible. It’s a real collaboration between Art and Technology teams.
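One simple blending algorithm worth trialling is a smoothstep crossfade over a short overlap window between the outgoing line and the incoming idle or dialogue clip. A minimal sketch, assuming each performance is stored as a per-frame array of blendshape weights:

```python
import numpy as np

# A minimal sketch of blending between two lines of performance:
# crossfade the outgoing clip's blendshape weights into the next clip
# over a short overlap window, using a smoothstep curve so the
# transition eases in and out rather than popping.

def smoothstep(t: np.ndarray) -> np.ndarray:
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def crossfade(clip_out: np.ndarray, clip_in: np.ndarray,
              overlap: int) -> np.ndarray:
    """Blend the last `overlap` frames of clip_out into clip_in.

    Both clips are (frames, num_blendshapes) weight arrays.
    """
    fade = smoothstep(np.linspace(0.0, 1.0, overlap))[:, None]
    blended = clip_out[-overlap:] * (1.0 - fade) + clip_in[:overlap] * fade
    return np.concatenate([clip_out[:-overlap], blended, clip_in[overlap:]])

# Two stand-in clips: 60 frames each, 40 blendshape channels.
line_a = np.random.rand(60, 40)
idle = np.random.rand(60, 40)
result = crossfade(line_a, idle, overlap=10)
print(result.shape)  # (110, 40)
```

A linear crossfade works too, but the eased curve hides the seam better on expressive channels like brows and lips; in practice, teams trial several such curves and overlap lengths per character.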

We’re in constant development at Cubic Motion to increase the realism of all characters, across an entire world. There’s no point in creating a photoreal avatar whose movements then fall apart under a procedural animation system – because, once again, that would kill the fantasy. If you’re looking to invest in expensive, high-quality scanning techniques to make game characters look more realistic, you’ll have to give the same level of attention to performance and movement within the cinematics.

Putting the pipeline in place

In some ways, we’ve been waiting for the game industry to realise that digital humans are achievable now. The pipeline and technology are already in place; it’s simply a matter of letting the industry know this level of fidelity is possible.

We can create immersion across hundreds of NPCs and thousands of lines of performance, while developers focus on making the best gameplay experience possible. As seen in some of the biggest games in recent memory, including Spider-Man and Sony’s God of War, the results can make for a truly engaging experience.
