What’s the ideal resolution for a PC game? Ask most players and they’ll immediately answer, “Whatever your monitor can support.” That’s the obvious solution—after all, it would hardly make sense to render graphics at higher than your equipment can actually output and your eyes can see, right? …Right?
A Quick Primer on Supersampling
Perhaps not. Now that PC game developers have become experts at getting their games to run at 60 frames per second even on middling hardware, and even under-$200 graphics cards are getting ridiculously powerful and efficient, a new technique to make games look better has emerged. It’s called “supersampling,” among other names, and the basic gist is that the game renders its graphics at a resolution above what the monitor can display…then scales the result down to your monitor’s native resolution. Various software-only solutions for this have been around for a while, but now video cards are powerful enough to brute-force the technique onto games that don’t natively support it.
The benefit is that you “see” graphics at a much higher level of detail, avoiding some basic pitfalls like aliased polygon edges and lighting artifacts. You’re basically using your GPU’s graphical power to render images at a much higher resolution than your eyes can see on the screen, producing various subtle but pleasing enhancements to the way polygonal edges and lighting effects appear. This can be achieved in other ways with more complex anti-aliasing techniques, but GPUs now have enough juice to dispense with the subtlety and just render things much more sharply behind the scenes. The downside, of course, is that your graphics card has to work harder to render super-high-res graphics and then down-sample the image to fit your display…which can drag the game below 60 frames per second (or whatever your monitor’s refresh rate is), giving you diminishing returns on the visual improvement.
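To make the render-then-downsample idea concrete, here’s a minimal sketch in Python. It treats an image as a simple grid of brightness values and averages each block of high-res pixels into one output pixel (a basic “box filter”; actual GPU drivers may use fancier filters, and the function and sample values here are purely illustrative, not any real engine’s code):

```python
def supersample_downscale(hi_res, factor=2):
    """Downscale a 2-D grid of brightness values by averaging each
    factor-by-factor block of high-res pixels into one output pixel.
    This is the core of supersampling: detail computed at the higher
    resolution is blended into the final, native-resolution pixel."""
    out = []
    for y in range(0, len(hi_res), factor):
        row = []
        for x in range(0, len(hi_res[0]), factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Hypothetical example: a 4x4 "high-res" patch containing a hard,
# aliased edge (0 = dark, 1 = bright), downsampled 2x to a 2x2 patch.
hi = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
lo = supersample_downscale(hi, factor=2)
print(lo)
```

The jagged stair-step in the high-res patch comes out as an intermediate 0.75 value in the downscaled pixel, which is exactly the smoothing effect you see on polygon edges when supersampling is enabled.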
Here’s an Overwatch character being rendered with standard, screen-matching resolution on the left and a 200% supersampling technique on the right. Both are displaying at 1080p, the maximum resolution of many standard monitors. But the image on the left is being rendered in the game’s engine at 1080p, while the image on the right is being rendered at 4K (3840×2160). Note the smoother, less jagged lines at the edges of rendered objects like Lucio’s goggles and the more even transition of shadows and skin tones. And predictably, I observed a significantly lower framerate while the game ran at 200% of its normal resolution, dropping down into the 40s and 30s during complex battle scenes where previously the game ran at a rock-steady 60fps.