The tech tricks that make computer games look real

Scene demonstrating ray-tracing technology (Nvidia)

The police officer approaches a derelict, graffiti-scrawled industrial building. His bodycam, shaking as he steps forward, captures the scene.

It is overcast. Weeds push through paving cracks outside. A dog barks somewhere in the distance but nobody seems to be around.

Inside the shadowy, rubble-filled interior, it becomes clear people are hiding here. People who would kill him.

“This is the only game I’ve ever seen that actually confuses my brain into thinking it’s real,” reads the top comment below the YouTube video of gameplay from Unrecord, an upcoming title by French indie game studio Drama.

The video racked up millions of views in just a few weeks and caused a sensation in the games industry. Some commentators on social media questioned whether it really was a game and, if so, whether it was actually a little too real – too raw – an experience.

Drama declined an interview with the BBC, saying: “We are currently busy with investors and publishers.”

But graphics in many different games have been getting noticeably more sophisticated, arguably approaching what’s known as “photorealism” – indistinguishable from photos or videos of the real world.

The Unrecord demo looks so lifelike partly thanks to some clever techniques, says Piers Harding-Rolls, head of games research at Ampere Analysis.

Mr Harding-Rolls points to the shaky camera, which imitates actual crime scene footage. The dull lighting, grittiness and urban bustle audible in the background all help, too.

But could it make some people uncomfortable?

“That setting for it is quite reminiscent of some of the more horrendous footage you get out of real life,” notes Mr Harding-Rolls.

In a statement posted on Twitter, Drama said the game is not inspired by specific real-life events.

Also, look closely at still images from the video and it’s possible to see some objects and textures that don’t look realistic at all. This might not matter – but it does undermine the idea that the game is photorealistic.

Mr Harding-Rolls points out that, in general, advances in graphics matter for the games industry: “Consumers definitely want that. They love looking at things and thinking, ‘Wow, this looks amazing’.”


Rachel McDonnell, professor in creative technologies at Trinity College Dublin, agrees that the Unrecord video is impressive, though she notes that certain character animations are a little clunky.

They call to mind characters in other games falling over and dying in pre-programmed sequences.

“Animation hasn’t caught up with the rendering at all in games yet,” she says, adding that crowds are particularly tricky to make lifelike.

“You’ll still see them doing very strange behaviour, running around in circles and getting stuck – that instantly breaks you from your presence in the game.”

Experiments she and her colleagues have conducted suggest that, for players’ immersion in games, how an on-screen character looks is much less important than how they move.

Screenshot of a computer-generated lion and cub from Monster Emporium

Marc Whitten, president of Create Solutions at game software firm Unity, notes that today’s most realistic content relies on highly detailed 3D modelling of objects.

Last year, Unity showed off a computer-generated clip of a lion and its cub featuring two million individually rendered strands of fur.

“If you don’t do that, it does not come across as photorealism,” argues Mr Whitten. The firm has also developed highly lifelike models of humans, where digital puppetry controls their subtle facial expressions.

There’s still room for improvement, he adds. There are many other difficult-to-simulate materials, such as clothing, which are still a long way from looking photorealistic in games.

One important emerging technology for game graphics is neural radiance fields, or NeRFs. California-based Luma AI specialises in this and says it already has customers using the tech to make games.

A NeRF is an artificial intelligence (AI) system that can learn a 3D representation of real-world objects or scenery from photographs or video footage.

“When you show it these images from different sides, the network learns how light is bouncing off of everything,” explains Luma AI co-founder Amit Jain. “It measures light and it learns from light.”

The way light reflects off a motorbike’s leather seat versus a headlamp, for example, is completely different and simulating that in a game is very challenging. NeRFs could help to automate the process.
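At its core, a NeRF is a small neural network that takes a 3D position and a viewing direction and returns a colour and a density, learned by comparing its predictions against real photographs. The toy sketch below illustrates only that input-to-output shape – the network sizes, names and random weights are assumptions for illustration, not Luma AI’s actual system, and no training is shown.

```python
import numpy as np

def positional_encoding(p, n_freqs=4):
    """Map coordinates to sin/cos features at several frequencies,
    which helps a network represent fine detail."""
    feats = [p]
    for i in range(n_freqs):
        feats.append(np.sin((2 ** i) * np.pi * p))
        feats.append(np.cos((2 ** i) * np.pi * p))
    return np.concatenate(feats)

class TinyNeRF:
    """Toy MLP: (3D position, view direction) -> (RGB colour, density).
    Weights are random here; a real NeRF trains them against photos."""
    def __init__(self, n_freqs=4, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        # Encoded position + encoded direction, concatenated.
        in_dim = 2 * (3 + 3 * 2 * n_freqs)
        self.n_freqs = n_freqs
        self.w1 = rng.normal(0, 0.1, (in_dim, hidden))
        self.w2 = rng.normal(0, 0.1, (hidden, 4))  # 3 colour channels + density

    def __call__(self, position, direction):
        x = np.concatenate([positional_encoding(position, self.n_freqs),
                            positional_encoding(direction, self.n_freqs)])
        h = np.maximum(0.0, x @ self.w1)        # ReLU hidden layer
        out = h @ self.w2
        rgb = 1 / (1 + np.exp(-out[:3]))        # sigmoid: colours in [0, 1]
        density = np.log1p(np.exp(out[3]))      # softplus: density >= 0
        return rgb, density

model = TinyNeRF()
rgb, density = model(np.array([0.1, 0.2, 0.3]), np.array([0.0, 0.0, 1.0]))
```

Because the viewing direction is an input, the same point can return different colours from different angles – which is exactly how a NeRF captures view-dependent effects like the differing shine of leather versus a headlamp.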

Some of the best game graphics today use what’s known as ray-tracing – accurate simulations of the way light bounces off surfaces or creates glowing effects around neon signs and so on.
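The core step of ray-tracing is geometric: fire a ray from the camera, find where it hits a surface, and bounce it according to the mirror-reflection rule. The minimal sketch below shows that one step for a single ray and a single sphere; the scene values are made up for illustration, and a real renderer repeats this for millions of rays per frame with many bounces.

```python
import numpy as np

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along a unit-length ray to the nearest
    sphere intersection, or None if the ray misses."""
    oc = origin - centre
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * c  # direction is unit length, so a = 1
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None

def reflect(direction, normal):
    """Mirror a ray direction about a surface normal."""
    return direction - 2.0 * np.dot(direction, normal) * normal

origin = np.array([0.0, 0.0, 0.0])
direction = np.array([0.0, 0.0, 1.0])          # camera looking down +z
centre = np.array([0.0, 0.0, 5.0])             # sphere ahead of the camera
t = ray_sphere_hit(origin, direction, centre, 1.0)   # t == 4.0
hit = origin + t * direction                   # hit point (0, 0, 4)
normal = (hit - centre) / np.linalg.norm(hit - centre)
bounce = reflect(direction, normal)            # reflected straight back
```

The reflected ray is then traced onward to see what it hits next – that recursive bouncing is what produces accurate reflections and the glow around bright objects.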

AI is making it possible to produce these effects in games despite only modest improvements in chip performance, says Bryan Catanzaro, vice president of applied deep learning research at Nvidia.

“We have to be smarter in how we construct the world and how we render it,” he explains.

A new mode for Cyberpunk 2077, an action-adventure game, called Ray Tracing: Overdrive, demonstrates the difference this can make.

Nvidia says its Deep Learning Super Sampling (DLSS) technology allows developers to create high resolution, high frame-rate graphics featuring ray-tracing with the help of AI.
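The underlying idea is to render fewer pixels and then upscale the frame. DLSS itself does the upscaling with a trained neural network; as a contrast, the sketch below shows the naive alternative it improves upon – nearest-neighbour upscaling, which simply repeats pixels and so looks blocky. The frame sizes are illustrative, not anything specific to Nvidia’s pipeline.

```python
import numpy as np

def nearest_neighbour_upscale(image, factor):
    """Upscale an HxWx3 image by an integer factor by repeating pixels.
    This is the crude baseline that learned upscalers like DLSS beat."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

rng = np.random.default_rng(0)
low_res = rng.random((540, 960, 3))            # e.g. a 960x540 rendered frame
high_res = nearest_neighbour_upscale(low_res, 2)   # scaled up to 1920x1080
# The game only rendered a quarter of the output pixels; the upscaler
# fills in the rest. A neural upscaler predicts plausible detail instead
# of repeating pixels, which is why the result can look native-resolution.
```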

“The model’s trained to know what things in the real world look like,” explains Mr Catanzaro.

Games are increasingly difficult to tell apart from real life at times, says Nick Penwarden, vice president of engineering, Epic Games.

However he says it is still very hard to render certain materials convincingly – such as an iridescent layer of oil on a puddle of water.

“Those are aspects that we don’t yet have the power to simulate in real time,” he says.

And doing that on games consoles or home PCs is what matters. For movies featuring computer generated imagery, it’s possible to use huge computers and take many minutes or more to render individual frames.
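The gap in compute budgets is easy to quantify: a game running at 60 frames per second has roughly 16.7 milliseconds to draw each frame, while an offline film renderer can take minutes. The arithmetic below uses 10 minutes per frame as an illustrative figure, not a quote from any studio.

```python
# Frame-time budgets: real-time games versus offline film rendering.
game_budget_ms = 1000 / 60             # 60 fps -> about 16.7 ms per frame
film_budget_ms = 10 * 60 * 1000        # 10 minutes per frame (illustrative)
ratio = film_budget_ms / game_budget_ms   # roughly 36,000x more time per frame
```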


The most popular games of the future might not need to be photorealistic. Consider Minecraft or Epic’s own Fortnite – both enormously successful, both very far from photoreal.

Improved lighting effects and material simulations help artists working on all kinds of games, though, argues Mr Penwarden. They can give stylised or cartoonish environments more depth and complexity, too.

“One of the great benefits about having the ability to do photoreal images is the tech can start to do a lot of the work for you,” he says.

