
As illustrated above, a higher refresh rate refers to the frequency at which a display updates the onscreen image. The time between these updates is measured in milliseconds (ms), while the refresh rate of the display is measured in hertz (Hz). In other words, the refresh rate of your display is how many times per second it is able to draw a new image. For example, if your display has a refresh rate of 144Hz, it is refreshing the image 144 times per second. When paired with the high frame rates produced by a GPU and CPU working together, this can result in a smoother experience and potentially higher FPS.
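
To make the relationship between hertz and milliseconds concrete, here is a minimal Python sketch; the function name and the sample refresh rates are illustrative, not from the article:

    # Time between display refreshes: a 144Hz panel draws a new image
    # roughly every 6.94ms (1000ms / 144).
    def frame_time_ms(refresh_rate_hz: float) -> float:
        return 1000.0 / refresh_rate_hz

    for hz in (60, 144, 240):
        print(f"{hz}Hz -> a new image every {frame_time_ms(hz):.2f}ms")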

The monitor can only display an image at the rate the system produces it, so it's important that your CPU and GPU are capable of completing this process quickly. If your CPU and GPU are incapable of supplying the monitor with a sufficiently high number of frames, then your monitor won't be able to produce a high refresh rate image regardless of how good its specs are. If your monitor has a refresh rate of 144Hz but the GPU is only supplying 30 frames per second, that higher refresh rate is not being utilized.
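
As a rough sketch of that bottleneck: the update rate you actually see is bounded by whichever side is slower. The function below is illustrative and assumes a steady frame rate from the GPU:

    def effective_rate(gpu_fps: float, refresh_rate_hz: float) -> float:
        # The display can only show new images as fast as both the
        # panel and the system allow.
        return min(gpu_fps, refresh_rate_hz)

    print(effective_rate(30, 144))   # 30 -- the 144Hz panel sits underutilized
    print(effective_rate(200, 144))  # 144 -- the GPU outpaces the display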

The level of hardware required to drive a higher refresh rate varies based on the refresh rate you hope to achieve, as well as the games you're playing. Generally speaking, the higher a monitor's refresh rate, the more FPS your CPU and GPU will need to supply, and the more benefit you'll receive from higher-performance options. With that in mind, games vary in how much load they put on the CPU and GPU. Older games, or games that don't emphasize the latest graphics technologies, will be considerably less resource-intensive than a cutting-edge title. The graphics settings used will also impact how hardware-intensive the experience ends up being. That means higher refresh rates might be attainable on less powerful hardware, depending on the game you want to play.
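
A quick way to sanity-check this is to compare per-game frame rates against a target refresh rate. The titles and FPS numbers below are hypothetical, purely for illustration:

    # Hypothetical benchmark results -- the games and numbers are made up.
    measured_fps = {
        "older title, low settings": 220,
        "cutting-edge title, ultra settings": 55,
    }
    TARGET_HZ = 144

    for game, fps in measured_fps.items():
        verdict = "can fully drive" if fps >= TARGET_HZ else "cannot fully utilize"
        print(f"{game}: {fps} FPS {verdict} a {TARGET_HZ}Hz monitor")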
