
Framerate and Refresh, games and movies are not the same.

Why does a 24FPS movie look better than a 60Hz game? Real motion blur vs fake motion blur. 30FPS vs 60FPS gaming. What is the difference?

Roger Hågensen, Blogger

November 5, 2012



I read this recently: Capcom Explains Why 30 FPS Isn't That Bad, and I'm sorry to say that I think it is pure nonsense.


A movie deals with light over time.

Pretty much any game engine for consoles or PC deals with time slices of light, while a movie deals with light over time. What this means is that a movie at the cinema which "only" runs at 24 FPS (Frames Per Second) gathers 1/24th of a second of light per frame, while a game (crudely explained) renders 30 FPS where each frame represents more like 1/1000th of a second of light. The same is true at 60 FPS.


You get to see "all" the light during 1/24th of a second despite only having a 24 FPS rate.

How can this possibly be so? Well, those familiar with photography should know the term "exposure time" quite well. For the rest of you, think of it this way: if you took 1000 one-millisecond "frames" in a second and divided them among 24 film frames, each film frame would get about 41.66 of them, and those 41+ slices would be blended into a single frame. This way you get to see "all" the light during that 1/24th of a second despite only having a 24 FPS rate. Please note that although I'm saying "blend" this is not really how it works; check Wikipedia for how film exposure really works, as that article explains it much better than I currently can.
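To make that arithmetic concrete, here is a minimal C++ sketch of the blending idea (this is not real film chemistry, and renderAt is a made-up stand-in for a renderer that just draws a moving dot). Averaging 42 roughly one-millisecond slices into one 24 FPS frame turns the moving dot into a streak, which is the motion blur a film exposure gives you for free.

```cpp
#include <cstdio>
#include <vector>

using Frame = std::vector<float>;   // one grey value per pixel

// Hypothetical stand-in for a renderer: a bright dot sweeping across a 1D strip.
Frame renderAt(double timeSeconds, int pixelCount)
{
    Frame f(pixelCount, 0.0f);
    int dot = static_cast<int>(timeSeconds * 240.0) % pixelCount;
    f[dot] = 1.0f;
    return f;
}

// Blend many short sub-slices into one frame, a crude box-filter "exposure".
Frame exposeFrame(double frameStart, double exposureSeconds,
                  int subSlices, int pixelCount)
{
    Frame accum(pixelCount, 0.0f);
    for (int s = 0; s < subSlices; ++s) {
        double t = frameStart + exposureSeconds * (s + 0.5) / subSlices;
        Frame slice = renderAt(t, pixelCount);
        for (int p = 0; p < pixelCount; ++p)
            accum[p] += slice[p];
    }
    for (float& v : accum)
        v /= static_cast<float>(subSlices);   // average the gathered light
    return accum;
}

int main()
{
    // One 24 FPS frame built from 42 roughly one-millisecond slices: the moving
    // dot smears out into a streak instead of landing on a single pixel.
    Frame film = exposeFrame(0.0, 1.0 / 24.0, 42, 64);
    for (float v : film)
        std::printf("%.2f ", v);
    std::printf("\n");
}
```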


I've yet to see a 1000FPS capable game.

A computer game would have to replicate this, and I've yet to see a 1000FPS capable game. In fact, I cannot recall any games that render at a higher FPS and blend frames down. Why? That is simple to answer: it would waste resources, and you might as well show the frames at a higher rate instead (like 60FPS) if the hardware can handle it. Now, if the hardware is able to render faster than 60FPS, the refresh rate is 60Hz, and the rendering engine supports blending, then you could in theory get very good looking "film" motion blur by rendering twice as many frames and blending each pair into one.
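A rough sketch of that last idea, assuming the simplest possible blend of two frames rendered 1/120th of a second apart into one frame shown at 60Hz (Frame here is just a vector of grey values standing in for a GPU render target):

```cpp
#include <cstddef>
#include <vector>

using Frame = std::vector<float>;   // grey values; a stand-in for a render target

// Blend two frames rendered 1/120 s apart into the one frame shown at 60 Hz.
// Equal weights give a crude 1/60 s "exposure" built from two time slices.
Frame blendPair(const Frame& a, const Frame& b)
{
    Frame out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i)
        out[i] = 0.5f * (a[i] + b[i]);
    return out;
}
```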


Unlike film exposure, a game just creates a morphed blur over frames.

So what about motion blur, can't computers fake the "movie blur" as well? Yes, they can. But unlike film exposure, a game just creates a morphed blur over frames (in other words, 30FPS rendering with a blur effect applied). I'm sure there are developers trying to use game world details to help tweak the interframe blur, and if done well a 30FPS game with interframe blur displayed at 60FPS could look good. But again, I've yet to see this in any current games. Computers have enough issues keeping a steady framerate when playing regular movies, so tackling this is obviously a major undertaking.
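For illustration, here is a sketch of the kind of "fake" blur described above, assuming the common approach of smearing each pixel along a per-pixel motion estimate taken from the previous frame (the names and the one-dimensional layout are mine, not any particular engine's API):

```cpp
#include <cstddef>
#include <vector>

using Frame = std::vector<float>;

// velocity[i] = how many pixels the content at pixel i moved since the last frame.
Frame postProcessMotionBlur(const Frame& current,
                            const std::vector<int>& velocity,
                            int taps = 8)
{
    Frame out(current.size(), 0.0f);
    for (std::size_t i = 0; i < current.size(); ++i) {
        float sum = 0.0f;
        for (int t = 0; t < taps; ++t) {
            // Step backwards along the motion vector and average what we find.
            long j = static_cast<long>(i) - static_cast<long>(velocity[i]) * t / taps;
            if (j < 0) j = 0;
            if (j >= static_cast<long>(current.size()))
                j = static_cast<long>(current.size()) - 1;
            sum += current[j];
        }
        out[i] = sum / static_cast<float>(taps);
    }
    return out;
}
```

Note that this can only smear pixels that already exist in the rendered frame; unlike a real exposure it cannot recover anything that happened between the 30FPS snapshots.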


A lot of timing is millisecond based.

Part of the reason for these timing issues is the NTSC not-quite-60Hz problem. The PAL (and majority of the rest of the world) 50Hz refresh timing is easier to handle. After all, 1000ms / 50Hz = 20ms per frame, while with NTSC you get a fractional number instead. Why is this an issue? Well, 60Hz (really ~59.94Hz for NTSC) is very common, and 1000/60 = 16.66+, and as a lot of timing is millisecond based you can imagine the trouble with hitting 16.66 exactly. A lot of game engines try to compensate by varying between 15 and 17 ms per frame so that over time it evens out to 60Hz. At 30FPS this jerkiness is less noticeable, but animation feels stiffer instead, and visual feedback to user input is slower.
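One common way to even that out (a sketch of the general idea, not any specific engine's scheduler) is to carry the leftover fraction of a millisecond forward from frame to frame, so that whole-millisecond waits still average out to 1000/60:

```cpp
#include <cstdio>

int main()
{
    const double target = 1000.0 / 60.0;    // 16.666... ms per frame
    double error = 0.0;

    for (int frame = 0; frame < 6; ++frame) {
        double ideal = target + error;      // what we still owe this frame
        int wait = static_cast<int>(ideal); // whole milliseconds we can actually wait
        error = ideal - wait;               // fraction carried into the next frame
        std::printf("frame %d: wait %d ms (carry %.3f ms)\n", frame, wait, error);
    }
    // Prints a mix of 16 ms and 17 ms waits that averages out to ~16.67 ms,
    // in the same spirit as the "varying between 15 and 17 ms" compensation above.
}
```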


Do not confuse frame rate with refresh rate.

Also, do not confuse frame rate with refresh rate. Refresh rate refers to how often the display path (graphics card output, TV, or monitor) updates what you see, while frame rate refers to how many frames per second are rendered. Rendering more frames than the refresh rate is just wasting resources, as the extra frames get discarded because they can never be shown. Ideally you want the refresh rate set to the native rate of whatever device you use, and the frame rate set to whatever can be rendered smoothly. This can mean 60FPS at 120Hz, or 24FPS at 60Hz. A high refresh rate generally makes a game feel more responsive (though there are ways to stay responsive even at low refresh rates). Sadly, many games have the frame rate and refresh rate locked together, which tends to decrease responsiveness.
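A sketch of one way to avoid rendering frames that can never be shown, assuming a simple sleep-based limiter (a real engine would more likely rely on vsync or a higher-resolution wait, and renderOneFrame is a hypothetical callback):

```cpp
#include <chrono>
#include <thread>

// Render at most one frame per display refresh; anything faster would be
// rendered only to be discarded. renderOneFrame returns false when the game quits.
void frameLimitedLoop(double refreshHz, bool (*renderOneFrame)())
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> frameBudget(1.0 / refreshHz);

    bool running = true;
    while (running) {
        auto start = clock::now();
        running = renderOneFrame();          // draw exactly one frame
        auto spent = clock::now() - start;
        if (spent < frameBudget)             // finished early: wait out the rest
            std::this_thread::sleep_for(frameBudget - spent);
    }
}
```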


With film/cinema a frame is light over a period of time, while in a computer/console game a frame is a slice from a particular moment in time.

So please remember that with film/cinema a frame is light over a period of time, while in a computer/console game a frame is a slice from a particular moment in time. Oh, and 1000 is how many milliseconds there are in a second; milliseconds (or ms for short) are often used for timing of video on computers. A film frame (24FPS) contains 1/24th of a second of all the light the camera sees, and that light is not really tied to a 1000ms clock at all: photons arrive continuously throughout the exposure, so dividing 1000 by 24 is just a computer-friendly approximation of something that is, for all practical purposes, continuous. The film simply gathers whatever light reaches it during that 1/24th of a second, which is why a handful of millisecond-spaced snapshots can only ever approximate it.
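If it helps, the distinction can be written as a simple formula (my own shorthand, with L(x, y, t) standing for the light reaching a point of the image at time t, and ignoring that a real shutter is only open for part of each 1/24th of a second): a film frame integrates the light over the exposure, while a game frame evaluates the scene at a single instant.

```latex
\[
F^{\text{film}}_{n}(x, y) = \int_{t_n}^{t_n + 1/24} L(x, y, \tau)\, d\tau
\qquad \text{versus} \qquad
F^{\text{game}}_{n}(x, y) = L(x, y, t_n)
\]
```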


Very costly computationally.

Will real film blur (aka the film look) ever be possible in games? Yes, one day it will. For many years now anti-aliasing has been used to improve the look of games/rendering; in its most basic form this renders a frame at a resolution higher than the display and then scales it down, blending the pixels together in the process. This is not unlike how rendering at a higher framerate and then blending frames down would work. But doing it like that is very costly computationally. So what is the solution? Considering that computer hardware (and software) already has anti-aliasing and tessellation (which creates more detail using hints), something similar should be possible with frame rates as well.
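For comparison, the most basic spatial form looks something like this sketch: render at twice the width and height, then average each 2x2 block into one displayed pixel. Interframe blending would be the same averaging applied along the time axis instead of the spatial axes.

```cpp
#include <cstddef>
#include <vector>

using Image = std::vector<float>;   // row-major grey values

// Downscale a frame rendered at 2x width and 2x height by averaging each
// 2x2 block into a single output pixel (the simplest form of supersampling).
Image downsample2x2(const Image& hi, int hiWidth, int hiHeight)
{
    const int loWidth  = hiWidth  / 2;
    const int loHeight = hiHeight / 2;
    Image lo(static_cast<std::size_t>(loWidth) * loHeight, 0.0f);

    for (int y = 0; y < loHeight; ++y)
        for (int x = 0; x < loWidth; ++x) {
            float sum = hi[(2 * y)     * hiWidth + 2 * x]
                      + hi[(2 * y)     * hiWidth + 2 * x + 1]
                      + hi[(2 * y + 1) * hiWidth + 2 * x]
                      + hi[(2 * y + 1) * hiWidth + 2 * x + 1];
            lo[static_cast<std::size_t>(y) * loWidth + x] = sum / 4.0f; // blend the four samples
        }
    return lo;
}
```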


The key is to keep the framerate and the refresh rate independent of each other and keep user input timing separate from both of them.

Hardware or software could be made that uses frame (or rather scene) hints to create interframes. There is still the issue of a stable framerate; computers will probably struggle with that for a long time before framerate timing vs refresh rate is fully smooth, and while some developers get it right, many don't. The key is to keep the framerate and the refresh rate independent of each other, and to keep user input timing separate from both of them. With anti-aliasing (not needed for really high DPI displays), tessellation, improved timing, and in the near future (I hope) Interframe Blending (my term for it), getting that film look should indeed be possible.
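As a sketch of that separation (the function bodies are placeholders, not a real engine API): input is polled and the simulation stepped at a fixed rate, while rendering runs at whatever rate the display allows and interpolates between the two most recent simulation states.

```cpp
#include <chrono>

// Placeholders standing in for a real engine's input, simulation and renderer.
void pollInput() {}
void stepSimulation(double /*dtSeconds*/) {}
void renderInterpolated(double /*alpha*/) {}   // alpha in [0,1): blend between states
bool stillRunning() { static int frames = 1000; return frames-- > 0; }

void gameLoop()
{
    using clock = std::chrono::steady_clock;
    const double fixedDt = 1.0 / 120.0;   // simulation and input rate, independent of the display
    double accumulator = 0.0;
    auto previous = clock::now();

    while (stillRunning()) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= fixedDt) {  // catch the simulation up to real time
            pollInput();                  // input sampled at the fixed simulation rate
            stepSimulation(fixedDt);
            accumulator -= fixedDt;
        }
        // Render as often as the display allows, blending between the last two
        // simulated states so motion looks smooth at any refresh rate.
        renderInterpolated(accumulator / fixedDt);
    }
}

int main() { gameLoop(); }
```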

(It's not easy to explain something complicated in a simple manner. If there are any glaring mistakes regarding light, speed, math, or rendering technology, feel free to comment and I'll correct the text!)

 

Roger Hågensen considers himself an Absurdist and a Mentat, hence believing in Absurdism and Logic. Has done volunteer support in Anarchy Online for Funcom. Voice work for Caravel Games. Been an Internet Radio DJ with GridStream. Currently works as a freelancer doing Windows application programming, web site development, making music, writing, and just about anything computer related really. Runs the website EmSai where he writes a journal and publishes his music, ideas, concepts, source code and various other projects. Has currently released 3 music albums.
