Optimizing for videogames

Troubleshooting

If you’re having issues during your stream, check out the dedicated troubleshooting guide first.

If your stream is having issues, try every fix below, even if you think it couldn’t possibly be your internet connection, CPU, etc.

My internet is acting up

A lot of people think internet connection issues for streaming come down to whether you have a fast enough connection to do it. Not so. Packet loss is a thing, even with fast connections.

Download the free/trial version of PingPlotter and ping google.com. Don’t show the app on your stream as it shows your IP.

Keep this running in the background and you’ll see whether your connection is losing packets (usually indicated by red colours). As with your frame issues, these tips don’t necessarily fix your issues as much as find the source of them.

Use PingPlotter’s awesome troubleshooting guide to make sense of your results.

My games are having performance issues

If you’re playing Overwatch and have an AMD graphics card, roll back to the latest 16.x version, as one of the 17.x drivers started messing up performance.

You can use Display Driver Uninstaller (DDU) to remove every last part of your old driver before you downgrade to a 16.x version.

If this doesn’t fix the issue, Alt + Tab out of your game, open Task Manager (Ctrl + Shift + Esc), and sort by CPU use in descending order. Look for anything hogging a huge percentage of your CPU there.

Common offenders include Chrome, iTunes, and Dropbox.

OBS is having performance issues

First of all, do not increase the Process Priority of your OBS. Just don’t. Reset it to its default Normal if you changed it.

Second, inspect the Stats dashboard under the View pane for anything amiss.

Third, up- or downgrading your OBS might fix it. The files you need to back up are in %APPDATA%\obs-studio\.

Fourth, you could have a memory leak. Check OBS’s log files in Help > Log Files > Show Log Files.

At the bottom of your log files, you may see this:

=================================================
Number of memory leaks: [number]

Disable sources, especially browser sources, in your current scene to rule them out. It’s also possible that sources in other scenes keep running in the background.

Make sure you check “Shutdown source when not visible” in your sources, too.

For further troubleshooting, you can go to your menu bar, click Scene Collection, and create a new one with just one scene to rule out other scenes hogging your performance. Use Duplicate and delete the scenes you don’t need rather than recreating the scene from scratch.

Check out the OBS Project’s “General Performance And Encoding Issues” guide.

Official OBS troubleshooting guide

Turns out there’s a great official guide for fixing your OBS issues called “Dropped Frames And General Connection Issues”.

The gist is that the issue is internet-related and the idea is to troubleshoot accordingly.

See the section “My internet is acting up” right above this one for more on dealing with internet issues.

Viewers are saying my voice sounds weird

Twitch currently seems to transcode your stream in a way that might make your voice sound off on video settings below Source.

I’m getting input lag in Overwatch

Go to your video settings, turn off Reduce Buffering, hit Apply, turn it on again, and hit Apply. Alt-tabbing is one of the usual culprits behind the issue so just make a habit of doing this whenever you alt-tab from fullscreen.

Network optimization

Use an Ethernet cable

Do not use wireless.

Cable length doesn’t matter for latency below 100 m, so feel free to pull the cable across rooms or something crazy. The signal travels at a sizeable fraction of the speed of light.

You can always switch between the two depending on what you’re doing if wired is cumbersome.

Use at least a CAT5e cable

No one ever pays attention to what Ethernet cable they’re using, but the difference between an older cable and a newer one means a significant change in speed and susceptibility to interference—so-called “crosstalk”.

Cat5, Cat5e, and Cat6 can deliver 100, 1,000, and 10,000 Mbps respectively (Cat6 only reaches 10 Gbps over shorter runs), so check the printed text on your Ethernet cable to make sure you’re not still using a Cat5 cable. Throw out your Cat5 cables and make sure you don’t buy an old cable standard for the same price as a modern one the next time you shop for cables. As of this writing, Cat7 is very affordable.

And in case you weren’t sure, all cables are backward compatible.

Prices and standards keep changing so I can’t tell you which category is best for you at the exact moment you’re reading this. Just be mindful of the difference between cables the next time you go shopping.

Like me, you’ve probably got a billion devices in your house with an Internet connection; just make sure your desktop computer, NAS and bottlenecks like routers or switches get the good Ethernet cables instead of your toaster.

Just don’t forget the bottleneck on the other side; not all devices come with support for 100 Tbps Ethernet, so don’t get caught off guard when your PlayStation 4 won’t go beyond 1,000 Mbps.

Port forwarding

The first thing you should do to deal with any network issues after going wired instead of wireless is to forward the ports for your game.

Some games are able to detect unforwarded ports and will show you some kind of “NAT” warning. Your NAT type indicates how walled off you are from the Internet.

PlayStation 4 goes as far as to display an error prompt if your NAT type is 3. NAT 2 should work just fine; it does for me at least. Never disable all your security measures just to play a damn videogame. Open the specific ports for your specific device with just the specific protocol: TCP or UDP.

Look up the ports and protocols your game needs you to forward on your router; just google it.

Once you’ve got them, you have to find your local IP. On PC, press Win + R, type cmd, and hit Enter to open the Windows command prompt, then type ipconfig in it. Use the IP listed next to IPv4 Address. It’ll usually read 192.168.0.{x}.

On a console, you can just go to its Network settings and look it up. It will also usually read 192.168.0.{x}.
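
If you’d rather script the PC lookup, here’s a minimal Python sketch (my own, not part of any official tool) that asks the operating system which local address it would use to reach a public address; it should print the same 192.168.x.x address that ipconfig shows.

import socket

def local_ip():
    # Open a UDP socket "towards" a public address (Google DNS here).
    # No packets are actually sent; the OS just picks the local
    # interface and IP it would route through.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]

print(local_ip())  # e.g. 192.168.0.10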

As an example, let’s take Overwatch whose ports are listed on Blizzard’s website.

TCP ports: 1119, 3724, 6113, 80
UDP ports: 3478-3479, 5060, 5062, 6250, 12000-64000

Let’s say our desktop computer’s local IP is 192.168.0.10.

In our router settings, this should be set up something like this:

Port start   Port end   Protocol   IP
1119         1119       TCP        192.168.0.10
3724         3724       TCP        192.168.0.10
6113         6113       TCP        192.168.0.10
80           80         TCP        192.168.0.10
3478         3479       UDP        192.168.0.10
5060         5060       UDP        192.168.0.10
5062         5062       UDP        192.168.0.10
6250         6250       UDP        192.168.0.10
12000        64000      UDP        192.168.0.10

Pro tip: add a few lines at a time. Sometimes, you’ll get an error that resets all your hard work. Port 80 should already be forwarded somewhere, so if you get an error that it’s already assigned, you should just move on to the next thing.

Most of the time, you won’t get any feedback if you’ve used the wrong IP or port. You’ll just be annoyed that you’re never able to join a server or form a party with your friends. Try to find a method to verify that you’ve set up things properly.
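
One low-tech way to verify a forwarded TCP port is a tiny listener like the Python sketch below (my own improvisation, not an official checker): run it on the machine you forwarded to, then connect to your public IP on that port from outside your network, for instance with an online port checker or a phone off wi-fi. If the script prints a connection, the forward works. UDP ports and game-specific handshakes are another story.

import socket

PORT = 1119  # one of the TCP ports you forwarded (example value)

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", PORT))  # listen on all interfaces
    server.listen(1)
    print(f"Waiting for a connection on port {PORT}...")
    conn, addr = server.accept()    # blocks until something connects
    print(f"Connection from {addr[0]} - the port forward works.")
    conn.close()

Close the game first so the port isn’t already in use.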

If you don’t have the right ports forwarded, you may get a NAT type error in your games, especially on the PlayStation Network.

Dynamic QoS

This is a feature on routers that keeps other devices or members of your household from nuking your internet with a sudden spike in download or upload.

Stream-specific optimization

You’ve already learnt how to set up your OBS Studio from my guide. But you can still do more for your stream quality.

What you’ve learnt is how to encode your video.

But you also need to learn about what happens next: compression.

Compression is not either-or.

A thought exercise

Imagine you have some data:

jbabclkajclaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

That’s 32 a’s at the end.

Maybe we should do something about this redundancy.

Rather than type this, you could compress it as follows:

jbabclkajcla32

You’ve cut your data by more than half. You’re better off doing this than sending the original data—assuming the compression is computationally feasible.
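
If you want to see the idea in code, here’s a toy run-length encoder in Python (a sketch of the concept, not what any real codec does):

from itertools import groupby

def rle(data):
    # Collapse runs of the same character into "<char><count>",
    # leaving lone characters alone.
    out = []
    for char, run in groupby(data):
        count = len(list(run))
        out.append(char if count == 1 else f"{char}{count}")
    return "".join(out)

print(rle("jbabclkajcl" + "a" * 32))  # -> jbabclkajcla32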

And speaking of computational cost, OBS Studio of course supports changing the CPU encoding preset from ultrafast towards the slower presets depending on your processor’s computational prowess.

Video compression

Video compression is a little different.

What is a video? A series of images: frames. Videos may also have a separate audio track.

Remember our thought exercise. Now imagine these are two consecutive frames:

asdjklajlkjjkljblkalda
asdfqlajjbjjkljblkfada

What if, instead of transmitting information about every single frame, we do something clever: we only transmit the difference between frames?

asdjklajlkjjkljblkalda
   fq   jb        fa

Remember, the more information you have to communicate (transmit), the more data you need.

What this means is that the more the next frame differs from the current one, the more data your compressed video (or stream) requires.
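
As a toy illustration (again just a sketch of the idea, nothing like a real codec), here’s how you might transmit only the positions that changed between two equally sized “frames”:

prev = "asdjklajlkjjkljblkalda"
curr = "asdfqlajjbjjkljblkfada"

def frame_delta(previous, current):
    # Record (position, new character) only where the frames differ.
    return [(i, c) for i, (p, c) in enumerate(zip(previous, current)) if p != c]

delta = frame_delta(prev, curr)
print(delta)                                  # a handful of (index, char) pairs
print(len(delta), "of", len(curr), "characters changed")

# The receiver rebuilds the new frame from the old frame plus the delta.
rebuilt = list(prev)
for i, c in delta:
    rebuilt[i] = c
assert "".join(rebuilt) == curr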

On Twitch, the measure of how much we can transmit to servers is the bitrate. Right now, the cap is at 6,000 kbps.

I’ll let Tom Scott piece all this together in his fantastic explainer; you don’t need to watch it if you follow so far, but it’s a splendid video in its own right.

So the more that “happens” on stream, the harder our video is to compress, and at a fixed bitrate that comes at the expense of image quality.

But as you might expect, this is a simplified explanation. Today’s compression algorithms also utilize prediction between frames.

And with modern codecs like H.264, the difference can be measured across not one, but multiple reference frames—which comes in handy for periodic movement such as a bird flapping its wings.

What really trips up compression algorithms is, what should we call it, “complex difference”? “Unique difference?”

What also sucks is that we have to use compression algorithms for extremely generalized purposes. Maybe we’d be better off with one compression algorithm for movies and another one for videogames. But we all know how much diversity is contained in videogames alone: car games, twitchy FPSes, RTS, cinematic games, lush and epic narrative games, and so on.

A paper from 2014 suggests that we employ an algorithm to detect the nature of a scene and choose the optimal compression codec for it. And the idea of Dynamic HDR is to dynamically optimize HDR for a given scene instead of setting it just once for the entire video.

One imagines videogames and streaming might one day warrant their own codec—or codecs.

Update Jan 04, 2018: Streamlabs have released their own OBS fork doing just that.

Oh, and there are also resizing or downscaling algorithms to consider. At least OBS Studio keeps the list of options short, with Lanczos as the highest-quality choice.

These are all the options in the otherwise user-friendly ImageMagick:

-sample
-resample
-scale
-resize
-adaptive-resize
-thumbnail

By default, ImageMagick’s convert -resize downscales images with a Lanczos filter but upscales with a Mitchell filter. You can read the full ImageMagick guide to resampling filters if you want to go down that rabbit hole.

More on this insanity in my video editing guide.

Stream-optimized videogame settings

A cruel irony of Twitch is that its biggest game right now¹ is PLAYERUNKNOWN’S BATTLEGROUNDS (PUBG). (And yes, they really want you to capitalize it like that.)

Why cruel? Because a game like PUBG looks terrible on streams due to how video compression works.

If you’ve watched the Tom Scott video, you’ll see that confetti and snow disrupt image quality significantly. A common element in videogames with a similar effect is foliage. As a result, not only does foliage worsen the stream’s video quality; streamers tend to resort to lowering foliage to the lowest graphics setting, which makes the game look more barren and artificial than it otherwise would.

But this is the age we live in, and videogame developers will eventually face the fact that if they want their games to look good on Twitch or YouTube—streamed, recorded or just the trailers alone—they will have to base their art design around what looks good with video compression.

Aside from dense foliage and heavy snowfall, we’re probably going to see a decline in film grain, or at least the high-noise variety. Good riddance in most cases anyway.

Similarly, large particle effects can work much like confetti and snowfall—but in most cases, turning them off will make the game uglier for everyone, so I would personally stick to enabled or high particle effects as a rule of thumb. Just be mindful of how all that stuff looks to your audience when it kicks off.

Two other foils that may not be entirely obvious are motion blur and depth of field. Motion blur refers to blurring when you move the camera, or sometimes use some kind of ability, whereas depth of field blurs based on distance or focus, usually in FPS games. While their effect is harder to quantify, try recording footage with and without either of those two and look for any noticeable difference.

Hold the camera

Scott also brings up that video compression algorithms can compensate for him moving around on screen as his position changes. The thing about videogames is that having the character on screen move usually results in the whole camera shifting as well, which is a whole lot harder to process than Scott moving against a plain, static background.

This is where a second component to making your stream look good beyond tweaking settings comes in. You are basically the cameraperson in the videogame, so it falls on you to work within the limitations of video compression. On top of this, you want your playthroughs to feel immersive, so treating them like movies and “roleplaying” things will make the experience enjoyable for more people than just you.

This includes doing things like:

This is all the more important when streaming a console game where you rarely if ever can change graphics settings.

Games differ, as does advice

Last, remember that games are diverse and that some types and genres make these factors more pressing than others. A third-person MOBA like Paragon is easily hobbled by blur and particle effects, whereas a turn-based card game like Hearthstone barely restricts what settings you can use or what you can do.

Overall, the main aspects of a game to consider are things like

As always, to help boil my advice down, you can check out my settings repository on GitHub for an easy overview of how I’ve set up my games. Check out both game > video in general.yml and the game-specific settings in games/.

General optimization

Windows 10 Game Mode

While FPS gains are slim to none, the video above suggests that Game Mode might reduce input lag when turned on in the Windows Settings—not the so-called Game Bar (Win + G). However, you won’t be able to run other applications like OBS Studio in the background, so Game Mode is only worth even considering for people who play without streaming.

Test it and see for yourself (TIASFY) is what I would suggest. It’s not going to be worth it for most people, but the edge cases matter, too.

Update your drivers

Especially your graphics drivers. Make sure the driver version you installed is reflected in the driver software’s interface.

Pay attention to whether a new driver introduces issues for you.

Install and uninstall drivers in Safe Mode if you want to avoid as many conflicts as possible.

If you ever run into issues with a driver, Display Driver Uninstaller is useful, if nothing else then as a last resort.

Close apps

iTunes and Dropbox can really mess with your performance. While having your game launchers around can be convenient, you’ll definitely want to shut them down while playing, since they tend to download and install all sorts of updates in the background.

Make a file named something.bat and put this inside:

:: Background apps
taskkill /f /im Dropbox.exe
taskkill /f /t /im iTunes.exe
taskkill /f /t /im iTunesHelper.exe
taskkill /f /t /im Slack.exe

:: Game launchers (comment lines out with :: to keep the ones you use)
taskkill /f /im Agent.exe
:: taskkill /f /im Battle.net.exe
:: taskkill /f /im "Battle.net Helper.exe"
taskkill /f /im GalaxyClient.exe
taskkill /f /im Origin.exe
taskkill /f /im Steam.exe
taskkill /f /t /im UplayWebCore.exe
taskkill /f /t /im upc.exe

pause >nul

You can probably figure out what the commands do except the taskkill options: /f forcefully kills the process, /t also kills any child processes it spawned, and /im targets the process by its image (executable) name.

:: comments out (i.e. disables) the rest of the line. You’ll want to use only some of these lines, so customize the script to your own preferences.

Battle.net also has an option to shut itself down when a game is launched. You may want to enable it.

You’ve just created a batch script. Double-click the file to run it; that’s all.

Disable fullscreen optimization

People in the Overwatch community are reporting that disabling fullscreen optimization under the Compatibility setting for the executable improves performance significantly.

This only applies to people who’ve installed the Windows 10 Creators Update. It sounds like the feature is related to Game Mode introduced in the same update.

The trick sounds like it’s GPU- and not CPU-related, but it takes a second to do, so why not test for yourself.

Here is a succinct video that shows you how it’s done:

Again: TIASFY.

Battle(non)sense’s optimization checklist

Chris has a general list to improve your overall experience in games, not just the performance.

Here’s a simplified list of his tips:

  1. Mouse
    • Use the same mouse sensitivity for zoom, ADS, and regular aiming
  2. Keyboard
  3. Graphics card
    • Disable VSync
    • Always stay above 60 FPS on 60 Hz monitors—try capping at 120 FPS
  4. CPU
  5. Storage drive
  6. Monitor (144 Hz)
  7. Sound
  8. Ethernet, Powerline, wi-fi
    • Avoid wireless as much as possible
    • Use 5 GHz instead of 2.4 GHz
  9. Gaming router or QoS

One of his videos also shows a minor improvement in input lag by choosing Windowed Mode over Borderless Windowed Mode. It shows a significant improvement by choosing Fullscreen Mode over either, though. I’ll try to find it, but by all means always use Fullscreen Mode if you only have a single monitor or have a tunnel-like focus and never do stuff outside your games during your sessions.

You can switch fairly quickly between Windowed and Fullscreen with Alt + Enter.

Another thing that increases input lag is triple buffering. Make sure you disable it–as well as double buffering.

Update Aug 18, 2017: Battle(non)sense has released a new optimization video:

Monitors

Calibrate

The default settings on monitors and TVs are absolute garbage meant to look interesting in stores. For monitors, make sure to install the most recent driver.

Second, copy a calibration profile for your monitor from a decent review site. Beyond some fine-grained profiles, this usually involves turning off a bunch of dumb post-processing features for TVs. Make sure your RGB Range is set to Normal/Limited instead of Full on both your console and TV. If you’re on a monitor, which is usually Full RGB, you may have to change your console settings accordingly.

Likewise, you may get better input lag from turning on Game Mode, though there’s a chance you’ll lose some image quality in the trade-off; a fully calibrated TV should minimize the difference between having it on and off. Remembering to switch profiles is an absolute pain on the PS4, where many of us both play games and watch Blu-ray movies and TV.

For both monitors and TVs, make sure you’re using the right cable so you don’t use an HDMI 2.0 cable for a device and TV that support 2.0a (HDR).

A lot of HDMI switches advertise things like 4Kp60 but hide the fact that they only do HDMI 2.0. This means that anything that goes through them loses HDR. If you want to play with or record HDR, get a switch that supports 2.0a or find another way to route your input.

HDMI, DisplayPort, DVI

Your desktop monitor comes with a bunch of ports. To make it brief, you’ll usually want to use these ports in order of preference:

  1. DisplayPort
  2. DVI
  3. HDMI

Your TV will usually only have HDMI ports for you to worry about. (Make sure you use 2.0a cables for HDR.) Meanwhile, DisplayPort is the “standard” port for desktop monitors, even though it has the clunkiest, ugliest cables.

There may also be performance gains:

If you go beyond 1080p60, you risk running into feature or throughput bottlenecks with HDMI and DVI. If you aren’t going beyond that, who cares; use whatever.

Just don’t be like the people who think they’re watching 4K with HDR on the new TV even though they’re using the wrong cables or a playback device that doesn’t support it.

FPS, screen tearing, and input lag

Most discussions focus on frames per second—and whether it’s below the monitor refresh rate.

But two other major issues with videogames never get the attention they deserve:

  1. Screen tearing
  2. Input lag

Addressing either problem usually makes things worse for the other.

What is screen tearing?

If your computer can’t render frames fast enough to keep up with your monitor refresh rate (MRR), frames get repeated and make your game jittery.

To understand screen tearing, you have to imagine how an image is sent to your monitor. The frame has to be rendered by your computer and then sent to your monitor. Keep this simple chart below in mind.

[Frame2] -> [Frame1]
  1. The frame is rendered by the computer
  2. The frame is saved to the GPU’s frame buffer
    • Don’t overthink what “buffer” means
  3. The frame is “sent to” the screen buffer
    • Again, people had to name it something by the end of the day
  4. The frame is “drawn” on the screen

If you remember the “progressive scan” terminology from the HD Ready vs Full HD days of flatscreens, you might know that an image is drawn one line at a time from top to bottom.

A 60 Hz monitor updates every 1/60 s, or roughly every 16.7 ms. It does so without skipping a beat. Every time the frame is redrawn, it is drawn line by line from top to bottom.

With a framerate that goes up and down all the time, handing off the correct frame in perfect sync with the monitor is a terribly difficult task that ends up with some drawn lines on the screen being out of sync. This is screen tearing.
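
To build some intuition for that, here’s a deliberately crude simulation in Python (my own toy model, not how GPUs actually schedule buffer swaps): frames finish rendering at irregular times, the monitor scans out every 16.7 ms, and a buffer swap that lands mid-scanout leaves a tear at that point on the screen.

REFRESH_MS = 1000 / 60  # a 60 Hz monitor redraws every ~16.7 ms

# Pretend these are the moments (in ms) at which new frames finish rendering.
frame_done_at = [10.0, 21.0, 43.0, 47.0, 65.0, 90.0]

for t in frame_done_at:
    # How far into the current scanout are we when the swap happens?
    progress = (t % REFRESH_MS) / REFRESH_MS
    if progress == 0:
        print(f"{t:5.1f} ms: swap lands exactly on a refresh (no tear)")
    else:
        print(f"{t:5.1f} ms: swap lands {progress:4.0%} of the way down "
              f"the screen -> visible tear line there")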

VSync

VSync was created to deal with tearing. But what is it and how does it work?

From VESA’s Adaptive-Sync whitepaper (.pdf):

With ‘VSync’ enabled, the display buffer is only refreshed during the vertical blanking interval between frames, so that a full frame is always displayed and no tearing is visible.

In other words, with VSync the monitor never switches to a new frame partway through a redraw. The frame buffer is only swapped during the vertical blanking interval, so every frame shown on screen is a complete one.

You can already imagine how that’s an issue, right?

This is great if the game’s rendering framerate is higher than the refresh rate of the display. If the game’s framerate drops below the refresh rate of the display (eg. during a short period of intensive action), then the new frame will not be ready in time for the display’s blanking interval, and the previous frame is repeated on the display.

(…) Render Frame B took so long to render, that Frame A had to be repeated. This effect manifests itself as stutter and lag to the end user. The alternative for the gamer is to disable VSync, which virtually eliminates stutter and lag, but can produce visible tearing, especially during scenes with fast movement.

Unfortunately, VSync introduces input lag and also starts doing really bad things when your framerate drops below your MRR.

In other words, you need to make sure you don’t dip below your MRR to avoid stuttering, but you also have to avoid going above it to avoid tearing. And you get input lag with VSync either way. This is where companies started looking into replacements for VSync.

You’ll also see a setting for triple buffering to address this, but it also comes with snags like input lag.

I personally never turn on VSync or triple buffering. But some people seem to be really sensitive to tearing. If you’re a streamer, you may also weigh performance against how the game looks to your audience.

If you have a game where input lag doesn’t matter and where your FPS will always stay above your MRR—eg turn-based or slow RPGs/RTSes—you can try out VSync. Triple buffering will limit the hits you take from your FPS dipping at the cost of additional input lag.

I’d give you a link to a great VSync explainer, but they’re all pretty bad and contradictory. I didn’t know how this worked beforehand, so take my attempt at an explanation with a grain of salt.

Adaptive-Sync

The best and most expensive solution, the one that addresses all of these issues at once as much as possible, is to get a monitor and graphics card that support either G-Sync or FreeSync.

Both are based on the Adaptive-Sync standard which is built on the idea of Variable Refresh Rate (VRR) where the Monitor Refresh Rate (MRR) is synced to the framerate. Amazing how someone managed to come up with a good name for something for once, right?

To summarize:

Recently, “Game Mode VRR” was announced as an optional feature of the HDMI 2.1 spec. The spec doesn’t mention Adaptive-Sync outright, so it might be a completely different standard.

NB: Game Mode VRR for HDMI 2.1 should not be confused with AMD’s FreeSync for HDMI. As AMD write in their HDMI-on-FreeSync slides:

AMD fully supports the addition of standardized variable refresh rate to future HDMI® specs, but we couldn’t wait

Why HDMI when we already have DisplayPort? TVs. AMD provide the GPU for most videogame consoles, so this is a big deal for VRR on consoles.

So why did we want to use VRR again? Two major reasons:

  1. Fixes tearing without hurting framerate and input lag
  2. Variable monitor refresh rate is more power efficient

For TVs, the power savings work particularly well for movies and TV shows that are usually recorded in 24p—a fixed framerate—which would only require 24 Hz compared with your standard 60 Hz.

The problem with optional spec features

If you want Adaptive-Sync, you’ll have to choose between FreeSync and G-Sync. AMD only supports FreeSync, and Nvidia only supports G-Sync. This means that if you have a Radeon card you don’t plan on replacing, you’ll get a FreeSync monitor. With a GeForce card, you’ll look at getting a G-Sync monitor. And vice versa, if you’ve got a *Sync monitor and want a new graphics card, you’ll be locked into one choice the same way.

*Sync support in monitors is implemented in the scaler chip. For G-Sync, Nvidia’s expensive proprietary scaler chip module is required. For FreeSync, manufacturers can just adapt their own scaler.

AMD named it “FreeSync” for a reason.

Adaptive-Sync is not mandatory in DP 1.2a, and whatever “Game Mode VRR” is isn’t mandatory in HDMI 2.1. You might think “that sucks, but people will just pass over the devices that don’t support it in favour of those that do”. However, a huge problem is what happens if companies like Nvidia refuse to support it.

Currently, AMD’s graphics cards aren’t doing so hot compared to Nvidia’s, ceding huge market share to them.

Chart: the July 2017 Steam Survey, going back to February 2016, showing that about 2/3 of users run Nvidia and 1/5 ATI

If Nvidia chooses not to support Adaptive-Sync, their graphics cards won’t be able to use the variable refresh rate of new monitors that do support it. Your only option as a GeForce owner is then to buy a monitor with Nvidia’s proprietary and expensive G-Sync, which earns Nvidia a comfortable additional revenue stream.

But don’t take my word for it: look up two monitors with the same specs and compare the prices for the FreeSync and G-Sync version. FreeSync is royalty-free; G-Sync isn’t.

So what do we do about this?

One silver lining is Intel’s commitment to supporting Adaptive-Sync, which will require native iGPU support on the CPUs. As of August 2017, two years later, this remains to be seen. But of course, we also haven’t seen any Adaptive-Sync monitors. Then again, between Intel and monitor manufacturers, it’s not clear who’s the horse and who’s the carriage in driving adoption.

Hopefully the HDMI 2.1 spec will light a fire under Intel who already feel AMD breathing down their neck. Ironically, their future iGPU will probably be from AMD.

When this happens, any monitor with Adaptive-Sync should work with either Intel CPUs or AMD graphics cards, leaving G-Sync out in the cold to work only with Nvidia graphics cards.

Hopefully AMD will work towards CPU support for Adaptive-Sync, too.

We’ve already seen Apple adopt VRR with their new iPads and Microsoft with their Xbox Scorpio, AKA the Xbox One X, which uses FreeSync.

Special thanks to Apple for coining it “ProMotion”, which makes it literally impossible to look up online.

Low-framerate compensation

If the framerate drops below the monitor’s minimum refresh rate, Adaptive-Sync stops working for obvious reasons.

Nvidia pulled some tricks to make sure this wasn’t a problem by using frame-repeating to artificially keep the framerate above the minimum. FreeSync, meanwhile, was initially unable to deal with this.

In a 2015 update, this was addressed with AMD’s so-called low framerate compensation (LFC).

Another really interesting thing here is that handling low framerates with tricks like frame-repeating could be used for more than staying above the monitor’s minimum refresh rate. This is now also used to smooth over the jitter we all know from low FPS.

I’ll let Anandtech from the link explain the rest:

Frame reuse is simple in concept but tricky in execution. Not unlike CrossFire, there’s a strong element of prediction here, as the GPU needs to guess when the next frame may be ready so that it can set the appropriate refresh rate and repeat a frame the appropriate number of times. Hence, in one of the few things they do say about the technology, that AMD is implementing an “adaptive algorithm” to handle low framerate situations. Ultimately if AMD does this right, then it should reduce judder both when v-sync is enabled and when it is disabled, by aligning frame repeats and the refresh rate such that the next frame isn’t unnecessarily delayed.

The good news here is that this is a GPU-side change, so it doesn’t require any changes to existing monitors—they simply receive new variable refresh timings. However in revealing a bit more about the technology, AMD does note that LFC is only enabled with monitors that have a maximum refresh rate greater than or equal to 2.5 times the minimum refresh rate (e.g. 30Hz to 75Hz), as AMD needs a wide enough variable refresh range to run at a multiple of framerates right on the edge of the minimum (e.g. 45fps). This means LFC can’t be used with Freesync monitors that have a narrow refresh rate, such as the 48Hz to 75Hz models. Ultimately owners of those monitors don’t lose anything, but they also won’t gain anything with LFC.

FreeSync 2 improves on this:

As a result, AMD has tightened the standards for FreeSync 2. All FreeSync 2 certified monitors will be required to support LFC, which in turn means they’ll need to support a wide enough range of refresh rates to meet the technology’s requirements. Consequently, anyone who buys a FreeSync 2 monitor will be guaranteed to get the best variable refresh experience on an AMD setup, as opposed to the less consistent presence of LFC on today’s FreeSync monitors.

In other words, you’re still better off with just a solid framerate, but if you do take a dip, it won’t be as miserable as it normally would be thanks to the LFC measures. Jitter—or “judder” if you’re AMD—will still occur, but at much lower framerates than before.
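
To make the frame-repeating trick concrete, here’s a toy calculation (my own illustration; AMD’s actual “adaptive algorithm” is far more involved): pick how many times to show each frame so the effective refresh rate stays inside the monitor’s variable refresh range.

def lfc_repeat(fps, vrr_min, vrr_max):
    # Show each frame n times so that fps * n lands inside [vrr_min, vrr_max].
    n = 1
    while fps * n < vrr_min:
        n += 1
    if fps * n > vrr_max:
        raise ValueError("VRR range too narrow for LFC at this framerate")
    return n

# 48-144 Hz panel, game dips to 35 FPS: show each frame twice -> 70 Hz refresh.
print(lfc_repeat(35, 48, 144))  # 2

# Narrow 48-75 Hz panel at 40 FPS: 40 x 2 = 80 Hz overshoots the 75 Hz maximum,
# which is why panels with less than ~2.5x headroom can't do LFC reliably.
try:
    print(lfc_repeat(40, 48, 75))
except ValueError as err:
    print(err)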

You’ll see more on why this is a big deal in the discussion of frametimes below.

Framerate limiting and *Sync

So you got yourself a *Sync monitor; Battle(non)sense looked into what your ideal settings should be:

The video below shows what happens if you don’t configure your settings according to the tips above. He talks about setting your FPS cap below your MRR at 12:52. At 15:38, he brings up the need to lower your FPS cap to 130 on FreeSync while G-Sync manages with 142.

Since Battle(non)sense is just testing this on his own PC in just one game, you may very well have to lower the framerate by even more. The lesson here is not the specific numbers, but that *Sync isn’t just plug and play.

Of course, if your game breaks with *Sync forcing you to disable it, you should set your FPS cap somewhere above your monitor refresh rate.

Another important point here is that *Sync does not increase input lag, meaning there’s no reason for you to not turn it on. You’ll need to make sure you set your frame limit low enough, however, as outlined above.

For another extremely in-depth investigation, check out Blurbusters’ article on G-Sync.

Framerate limiting without *Sync

What do the rest of us mortals without a *Sync monitor do?

The case for FPS caps

I don’t want to make the anecdotal a science, so let’s call it a rule of thumb: take your MRR and add 60 to it to get a decent FPS cap.

60 + 60 = 120

This is what I use. It could be higher, but I don’t see the point. Some people have talked about the advantage of running without an FPS cap, and there may very well be wisdom to it, but until then, I’ll just cap it.

There are two basic reasons for this cap:

  1. The framerate must not dip below the MRR
  2. The computer should not be sweating frames I won’t see

My frames will occasionally drop during heated moments in Overwatch; this is normal. When this happens, I want to make sure the framerate won’t dip below my MRR. This is what the 60 FPS padding is for; it keeps the CPU and GPU at the right workload calibration so they won’t have to suddenly “spin up” to manage the heavier load. This probably also gives you a better idea of what settings like fan speed and graphics you’ll be able to pull off to meet your minimum of 60/MRR.

By setting your limit well above your MRR, you’ll also get a better idea of where your “steady FPS plateau” is and just how close you are to dipping below 60.

I recommend using something like MSI Afterburner to bring up a real-time graph of your framerate as you play, since the plain FPS number hides quick dips.

For the same reason, I don’t want the FPS to run higher than 120, because my computer will spin up to handle a load that’s way beyond what’s needed. At the very least, this means more heat, more electricity, and noisier fans, and it can potentially bottleneck other processes like OBS Studio.

This is also why games that haven’t been coded by morons cap the FPS on menus to the MRR (or a little more) to make sure people’s computers don’t melt trying to render something inane. You’d think this was obvious, but even Blizzard has forgotten this, and there have been cases of people’s graphics cards feeling their oats and frying.

The case against FPS caps

Back in the olden days, some poorly designed graphics engines would give players an increased rate of fire the higher their framerate was. While edge cases like that should no longer be a major concern, there are people in the competitive scene who swear by a framerate in the hundreds well beyond the MRR, arguing it improves their input lag.

You may already have seen Overwatch pros running their games at an uncapped framerate (the maximum of 300). This is the reasoning behind it.

I’m not going to make the case; instead, I’m going to leave it to 3kliksphilip:

Update Apr 15, 2018: Here is Hardware Unboxed making the case—at least for fast twitchy games like FPSes (4:14):

There’s a lot of very anecdotal evidence in this, so let’s take a step back and focus on something more tangible as one Reddit commenter brings up:

Frame time

One thing is input lag and FPS for a given situation; another is inconsistent input lag and FPS that makes adjusting to variable conditions extremely difficult. While getting 120 FPS is great, the real performance measure comes down to your consistent framerate.

In light of this, you might argue that a consistent framerate around 120 is better than a framerate that bounces between, say, 150 and 180. But a stable framerate of 300 might still be more desirable in theory. And if your framerate hangs out down around 90, lowering your cap from 120 to 90 might be preferable. Hypothetically.

At this point, you start turning down graphics settings instead of fidgeting with FPS caps, not to get a certain FPS, but to get a stable FPS.

For someone like me with an ancient computer who mainly plays some casual Overwatch, all this is more of a theoretical science than something practical and applicable. That said, I’ve still got eyes, and this “optimization” guide was originally created as a troubleshooting guide due to all the issues I had running Overwatch.

As important as science, whitepapers, and promises are, you must always remember to trust your instincts and play around with the different settings in your preferred environments as well. If I hadn’t done that, I would never have discovered my obscure Overwatch issue where the increased tick rate resulted in microstutter, because my CPU was too slow to keep up with the additional simulations to be performed.

I even made a series of YouTube videos to document my experiment and slow descent into insanity.

Enabling Limit Server Send Rate in the Gameplay settings fixed this. My thanks to Battle(non)sense for responding to my inquiry at a time I thought I was losing my mind trying to figure out what was going on.

This is where frametimes come in.

As great as it is that some sites and YouTubers have started including minimum and average FPS in benchmarks, it’s still not enough for the tryhards out there who want to perfect their setup.

Digital Foundry don’t just make the ugliest and most opaque benchmarks; they were also one of the first to make frametimes an integral part of their benchmarking:

Checking for potential errors is also helped immensely by the frame-time graph, which visualises exactly how long any given image remains on-screen. Frame-time is enormously important for representing the ‘in the moment’ experience - effectively, it highlights hitches and stutters during gameplay and it’s also essential in verifying good frame-pacing, but it’s also useful for us in making sure that the analysis is accurate, in that potential oddities are highlighted. 16ms frame-time spikes in a 30fps game… really? 50ms stutter in a 60fps game? That’s odd and should be checked out. The vast majority of titles scan in to FPSGui and export out again with no real issue, but over the years we’ve found that as good as our algorithms are, sometimes external variables cause problems on a small number of titles - and that’s where our visualisation modes are required to fully understand what’s going on. At the end of the day, it’s all about human verification of the algorithm’s results in order to ensure accuracy.

“Inside Digital Foundry: How we measure console frame-rate”

By including frametimes—which they sometimes refer to as “frame pacing”—they’ve been able to show that although “4K” consoles tout improved performance over their non-4K counterparts, better framerate or resolution may also come with frametime issues whose jitter ruins the overall experience.

This also gets into why console games are framerate-locked at 30. When you know millions of people will play your game on the exact same console you have in your developer studio, you can look into things like this. Framerate-locked games don’t always improve the experience, however.

Check out their article on FPS-locking console games.

If not for Digital Foundry, console players wouldn’t know these issues existed, because most benchmarks only focus on framerate and graphics—not frametimes.

To do your own framerate and frametime research on PC, download FRAPS and Frafs Bench Viewer to log and plot your performance—including frametimes.
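
If you’d rather crunch the numbers yourself, here’s a small Python sketch. I’m assuming a two-column CSV of frame index and cumulative timestamp in milliseconds with a header row; check your own log’s format and adjust the parsing before trusting the output.

import csv
import statistics

def frametime_stats(path):
    # Read cumulative timestamps (ms) and turn them into per-frame times.
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps = [float(row[1]) for row in reader if row]
    frametimes = [b - a for a, b in zip(stamps, stamps[1:])]

    avg = statistics.mean(frametimes)
    p99 = sorted(frametimes)[int(len(frametimes) * 0.99)]  # ~99th percentile
    print(f"average frame time: {avg:.2f} ms (~{1000 / avg:.0f} FPS)")
    print(f"99th percentile:    {p99:.2f} ms (~{1000 / p99:.0f} FPS)")

frametime_stats("frametimes.csv")  # hypothetical file name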

Check out the discussion in the FRAFS link that goes into more detail about frametimes and “microstutter” and links to a bunch of other interesting articles.

Some of that discussion is about FRAPS not being a perfect tool. Nvidia have released their own FRAPS alternative called FCAT: Frame Capture Analysis Tool. You’ll want to check out the Downloads section of FCAT.

There’s also a new version of MSI Afterburner that supports plotting your framerate and frametimes directly in your overlay; this is particularly useful for recording benchmark videos.

What is good enough when perfect doesn’t exist?

Remember, folks: there is no such thing as zero input lag or an instant update rate. We can’t optimize for the impossible, “perfect”, but we can try to understand what holds us back.

A useful way to think of input lag and update rate is as sunblock. Sorry if I’m using an analogy unfit for the target demographic.

1,000/30  = 33.330 ms
1,000/60  = 16.670 ms
1,000/72  = 13.890 ms
1,000/120 =  8.330 ms
1,000/144 =  6.940 ms
1,000/240 =  4.170 ms
1,000/288 =  3.470 ms

These are the most useful figures. Going from the usual 60 Hz monitor to 144 Hz cuts 9.72 ms off each update, a pretty decent improvement. Going from 144 to 288 Hz, a jump of 144 Hz versus the 84 Hz jump before, only cuts another 3.47 ms.
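
The table is just 1,000 divided by the rate; here’s a throwaway Python sketch that generates it and shows how the gains shrink with every step up:

rates = [30, 60, 72, 120, 144, 240, 288]

prev = None
for hz in rates:
    ms = 1000 / hz
    gain = f"  (saves {prev - ms:.2f} ms over the previous step)" if prev else ""
    print(f"{hz:3d} Hz -> {ms:6.2f} ms per update{gain}")
    prev = ms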

Let me just plot this from 2 to 300 Hz to make it easier to understand:

Chart of update rate vs time to update

You might see some discussions of frametimes with the axes flipped: ms per frame on the x-axis and FPS on the y-axis. This betrays the basic principle of y being the dependent variable affected by x, and not the other way around, so don’t flip the axes just because it looks good. (End of chart rant.)

As you can see in the chart, you quickly get asymptotically diminishing returns. At some point, increasing the rate of frames or monitor redraws stops making sense. Perfect is not a possibility.

One thing to note is that the low-framerate compensation provided by FreeSync and G-Sync also addresses frametimes, AKA microstutter. This is another huge feature of the two Adaptive-Sync implementations that makes getting a monitor that supports one of them extremely compelling.

To put things into perspective, consider how we decide how much sunscreen we use to avoid burns and skin cancer—things that are a lot more important than our videogames.

SPF, or sun protection factor, is explained by the Skin Cancer Foundation as follows:

Most sunscreens with an SPF of 15 or higher do an excellent job of protecting against UVB. SPF—or Sun Protection Factor—is a measure of a sunscreen’s ability to prevent UVB from damaging the skin. Here’s how it works: If it takes 20 minutes for your unprotected skin to start turning red, using an SPF 15 sunscreen theoretically prevents reddening 15 times longer—about five hours.

Another way to look at it is in terms of percentages: SPF 15 filters out approximately 93 percent of all incoming UVB rays. SPF 30 keeps out 97 percent and SPF 50 keeps out 98 percent. They may seem like negligible differences, but if you are light-sensitive, or have a history of skin cancer, those extra percentages will make a difference. And as you can see, no sunscreen can block all UV rays.

But there are problems with the SPF model: First, no sunscreen, regardless of strength, should be expected to stay effective longer than two hours without reapplication. Second, “reddening” of the skin is a reaction to UVB rays alone and tells you little about what UVA damage you may be getting. Plenty of damage can be done without the red flag of sunburn being raised.

The “exposure” has a simple formula: 1/SPF.

1/15 = 6.7%, which gives you 93.3% protection.
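
Same shape, different domain; here’s the arithmetic behind the SPF percentages quoted above, in the same sort of throwaway Python:

for spf in (15, 30, 50):
    exposure = 1 / spf
    print(f"SPF {spf:2d}: lets through {exposure:.1%}, blocks {1 - exposure:.1%}")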

Let’s plot SPF in increments of 5:

Chart of SPF vs exposure

Look familiar?

In case you didn’t figure it out, it’s because the formulas share the same shape: for frames and monitor redraws we had 1 s/rate, and for sunscreen we had 1/SPF.

The maximum SPF tends to go to 50, which gives you a protection of 98%, but it’s still not 100%. And we haven’t even included pricing in our considerations.

But if 98% protection against the big yellow cancer ball in the sky is good enough, maybe you can find it in your heart not to chase a perfect gaming experience that doesn’t exist.

VSync alternatives

Not everyone can afford a monitor based on Adaptive-Sync (G-Sync and FreeSync), so what are your options if the tearing becomes unbearable?

To make things as confusing as possible, Nvidia created something called Adaptive V(-)Sync:

NVIDIA's Adaptive VSync fixes both problems by unlocking the framerate when below the VSync cap, which reduces stuttering, and by locking the framerate when performance improves once more, thereby minimizing tearing.
Adaptive-Sync explained by Nvidia, regardless of what they call it.
“Adaptive VSync dynamically turns VSync on and off to maintain a more stable framerate.”

What this does is enable VSync unless the framerate dips below the MRR, at which point it’s disabled.

More recently, however, Nvidia released Fast Sync, and AMD Enhanced Sync.

Like Adaptive V(-)Sync, Fast Sync is disabled when the framerate is lower than the MRR.

Fast Sync and G-Sync aren’t mutually exclusive and can be used together.

Because we can’t have nice things, Battle(non)sense found that Fast Sync results in less fluid motion than VSync:

To repeat an important point

As important as science, whitepapers, and promises are, you must always remember to trust your instincts and play around with the different settings in your preferred environments as well. If I hadn’t done that, I would never have discovered my obscure Overwatch issue where the increased tick rate resulted in microstutter, because my CPU was too slow to keep up with the additional simulations to be performed. Enabling Limit Server Send Rate in the Gameplay settings fixed this. My thanks to Battle(non)sense for responding to my inquiry at a time I thought I was losing my mind trying to figure out what was going on.

Same thing with HDMI RGB range on consoles, but that’s another story for another time.

Sync all the things

By this point, you need to be able to remember and distinguish between V(-)Sync, Fast Sync, Enhanced Sync, Adaptive(-)Sync, Adaptive V(-)Sync, G-Sync, and FreeSync. ez.

Here’s a brief reminder:

The best simplified overview and comparison of the major *Syncs can be seen in Battle(non)sense’s video at 11:52.

Losing your mind over how to capitalize and hyphenate all the *Syncs? Don’t worry, even Nvidia can’t figure it out.

Settings

Check out my own settings where settings.yml contains my general settings for videogames along with game-specific ones in the games/ directory. It’s definitely a lot simpler than combing through all this text.

These settings are specific to my setup (1080p60, no *Sync), but should still cover 95% of what you want.

  1. I can’t provide a link or screenshot as the TI is going on right now and taking up pole position, but check out the stats at the bottom of Twinge. ↩︎

  2. That a new technology is royalty-free of course does not mean there aren’t additional costs to supporting it. The monitor’s scaler chip is tasked with handling this. ↩︎