Nvidia – higher framerates mean a better Kill Death ratio

What is a Kill Death ratio, I hear non-gamers asking? A Kill Death ratio measures how many kills a player gets for each death.

Gadgeteer gamer Arun Lal writes: A Kill Death ratio is Kills divided by Deaths. For example, if a player gets ten kills and five deaths in a game, they have a Kill Death ratio of two. The higher the better.
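Arun's arithmetic can be sketched in a few lines of Python. This is just an illustration of the division above; the `kd_ratio` function name and the zero-death convention (some games simply report the kill count) are my own assumptions, not from any game's scoring code.

```python
def kd_ratio(kills: int, deaths: int) -> float:
    """Kills divided by deaths. If the player never died,
    treat the ratio as the kill count (convention varies by game)."""
    return kills / deaths if deaths else float(kills)

print(kd_ratio(10, 5))  # -> 2.0, matching the example above
print(kd_ratio(7, 0))   # -> 7.0 under the zero-death convention assumed here
```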

The casual gamer might not follow all this, so NVIDIA released a video showing how 60, 144, and 240 FPS/Hz affect gameplay in fast-paced shooters like Counter-Strike: Global Offensive (CS:GO). NVIDIA also published a study on the relationship between framerate and Kill Death ratio.

The truth is that 60 FPS (frames per second) is generally considered sufficient for shooting games. But some games in the eSports scene run at insanely high rates – in the 200+ FPS range.

What’s the difference between framerate (FPS) and monitor frequency (Hz)?

FPS is the average number of frames your GPU renders per second. Note that while the human eye can theoretically detect changes at up to 1,000 FPS, anything over roughly 150 FPS approaches the ‘critical flicker fusion’ threshold, beyond which telling individual frames apart becomes guesswork.

The FPS can go as high as your GPU (and CPU) allow. The framerate in a game rarely stays constant and varies from scene to scene.

Hz (hertz) is your monitor’s refresh rate – the number of times per second the display redraws the image. Feeding it frames out of step with that cycle causes screen tearing (slices of two different frames shown at once). It is often claimed the human eye cannot perceive refresh rates above 60 Hz.
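One way to make the refresh-rate numbers concrete is the frame budget: the milliseconds the GPU has to finish each frame to keep pace with the display. This is a simple back-of-the-envelope calculation, not anything from NVIDIA's study; the rates chosen are the ones the article mentions.

```python
# Frame budget: 1000 ms divided by the refresh rate gives the time
# the GPU has to render each frame without missing a refresh.
for hz in (60, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
```

At 240 Hz the GPU has barely 4 ms per frame, which is why high-refresh eSports setups pair fast monitors with equally fast GPUs.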

What’s the relationship between the two?

It’s easy to think of it like this: the GPU renders a bunch of frames per second and passes them on to the display.

The display picks these frames up according to its refresh rate. So, if it’s a 60 Hz monitor, it will pick up 60 frames per second and output at 60 Hz.

If the GPU produces fewer frames than the refresh rate, the display repeats frames and gameplay looks choppy. If it provides more frames than the display can refresh, you get screen tearing because the screen cannot process all the extra information.

For the best performance, the frame rate and refresh rate should be as high as…
