Sachys: Gameplay above all.
Performance / graphical fidelity etc second.
That's it.
I couldn't find a game with a gameplay option in the graphic settings ^^
dtgreene: The problem I have with integer scaling is that, if the ratio between resolutions is not an integer, you end up with black bars on all 4 sides of the screen, leaving only a small portion of the screen usable.

Personally, for scaling, I prefer to preserve the aspect ratio and use as much of the screen as possible, and I think I prefer nearest rather than linear filtering. (I note that nearest is definitely better for pixel art games; linear filtering just looks too blurry.)
Cavalary: By integer scaling down I was referring to a lower resolution that's a result of dividing the native one by an integer, like 800x600 if native is 1600x1200 or 1280x720 if native is 2560x1440 or full HD if native is 4K.
Even still, if you have, say, a 1024x768 monitor, and the game runs internally at 1600x1200, integer scaling down is not going to fill the display.
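To put numbers on the black-bars point above, here's a rough sketch of the integer-scaling math (a Python illustration; the function name and example resolutions are my own, not from any particular scaler):

```python
# Rough sketch of integer scaling: pick the largest whole-number factor at
# which the game's resolution fits the display, then center the image.
def integer_scale(native_w, native_h, game_w, game_h):
    k = min(native_w // game_w, native_h // game_h)
    if k == 0:
        return None  # game is larger than the display; integer upscaling impossible
    scaled_w, scaled_h = game_w * k, game_h * k
    bar_x = (native_w - scaled_w) // 2   # black bar width on each side
    bar_y = (native_h - scaled_h) // 2   # black bar height top and bottom
    used = (scaled_w * scaled_h) / (native_w * native_h)
    return k, (scaled_w, scaled_h), (bar_x, bar_y), used

# 640x480 game on a 1920x1080 display: 3x is too tall (1440 > 1080), so only
# 2x fits, giving a 1280x960 image with 320 px side bars and 60 px top/bottom
# bars, and only ~59% of the screen actually used.
print(integer_scale(1920, 1080, 640, 480))
```

Whenever neither dimension divides evenly, bars appear on all four sides, which is exactly the complaint about non-integer ratios.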

Sachys: Gameplay above all.
Performance / graphical fidelity etc second.
That's it.
neumi5694: I couldn't find a game with a gameplay option in the graphic settings ^^
Sometimes, graphic settings do affect gameplay.
* There may be some visual cues that disappear or become less obvious under certain settings.
* As has been mentioned, some games can break if running at the wrong framerate.
* Some accessibility options are graphic options. This can include things like captions, text size, and the option to reduce flashing.
Post edited March 05, 2023 by dtgreene
StingingVelvet: What are your top five?
1. "Actually looks sane / sensible". By that I mean there are some settings I always turn off regardless of performance (eg, Depth of Myopia, Chromatic Abhorration, etc) that just look visually ridiculous to me. "I now can't see anything beyond 8ft distance because I'm speaking to someone 4ft away" is not remotely how eyesight works, never has been and never will be. "Bokeh photography" is a nice static photo best hung on the wall but makes no sense at all in games. If your in-game avatar was as blind as some overly cinematic games try and represent, you wouldn't be out saving the world, you'd be at home phoning your Ophthalmologist for much needed corrective eye surgery...

2. Max draw distance. In some games like Morrowind with very low 'vanilla' defaults, using MGE XE / OpenMW to increase it can massively improve gameplay: you can actually see where you're going and use distant structures as navigation landmarks instead of switching to a map every few seconds to see through the "fog".

3. Native resolution is always sharper than upscaling. DLSS is constantly hyped (mostly because people over-bought on monitor pixel count during the same era as GPU prices went through the roof), yet as long as games have to be individually written for it (with specific versions further gated behind certain generations of GPU on top), 99% of the 70,000 available PC games won't support it. For games that have to be upscaled (DOS, ScummVM, AGS, etc.) I'd prefer the ability to control whether scaling is pixel-perfect or not, and being able to control the ratio (3x, 4x, 5x) can stop things getting overly large, especially on larger 32" monitors.

4. 60fps (I can 'see' 144Hz vs 60Hz, but I've never found it the "life changer" some claim, nor am I interested in 'ESports' games). Many older game engines are capped to 60fps (physics issues), whilst others have caps below that anyway, eg, 30fps for DOS; Adventure Game Studio's 40fps cap is a non-issue for most people due to the nature of the genre. Diablo 2 is way down at a 25fps cap, yet after 2-3 mins you're "habituated" and don't notice it. I'd always prefer a "stable" frame-rate to a high one, and that applies psychologically between games too, ie, if you play a lot of older games, constantly switching between 144-240Hz and 60Hz can be a lot more irritating / jarring than just becoming "normalized" to 60/75fps in everything. FreeSync has also done a lot to solve the "tearing vs stutter & lag - pick one of two" problem without needing high frame-rates to hide it.

5. Non-blurry AA. I agree with TAA looking like a smear-fest in some games. No idea why it ever became popular. Even SMAA / old fashioned MSAA doesn't look as "smudgy" as some temporal "solutions" do.
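On the stable-frame-rate preference in point 4: the usual mechanism is a frame limiter that sleeps out the remainder of each frame's time budget. A minimal Python sketch (the function and constant names are mine, not from any engine):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.67 ms budget per frame

def run_capped(frames, update):
    """Call update() at most TARGET_FPS times per second (sleep-based limiter)."""
    next_deadline = time.perf_counter()
    for _ in range(frames):
        update()
        next_deadline += FRAME_TIME
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wait out the rest of the frame budget
        else:
            # Missed the deadline: reset instead of bursting to catch up,
            # which keeps pacing stable rather than fast-forwarding.
            next_deadline = time.perf_counter()
```

Engines that tie physics to this loop are the ones that break when the cap is raised, which is why so many older games stay locked at 60fps or below.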
StingingVelvet: I've given up the fight against TAA; games don't even look right with it turned off now. They expect the blending TAA does, among other things. I turned TAA off in Red Dead and suddenly the trees had barely any leaves. Fun fun.

I also think that at 4K, TAA looks a lot sharper and better too.
Yeah, can't blame you really... it's an unfortunate trend nowadays with TAA. Agreed, in most games I find leaving it on (with some sharpening at times) is better at 2160p than turning it off completely, though it's a pretty backwards approach and a balancing act of sorts. I'm more inclined to disable it at lower resolutions though. When TAA is disabled, some games will show bright artifacts on reflections, like super bright specks of magic dust seemingly out of place all over the wet ground etc. Quite distracting to some... so pick your poison really. Devs are getting used to hiding stuff under smearing again, and one would think we left that behind in the 90s.
This is my usual procedure for setting priorities (not restricted to graphics):

1) Reducing music volume to "barely audible" - or completely off (if I find it nerve grating).
2) Resolution set to native screen resolution (if possible - else the highest available resolution below my screen resolution).
3) Setting the highest details possible, that still allow me to play the game stutter-free.
4) Enabling blood/gore details (if the game has such a setting, and they aren't enabled by default).
5) Changing the controls to what I'm used to (if possible).
Fullscreen
Native desktop resolution if new game, acceptable legacy resolution if old game (e.g. 1024x768 at least)
Keep aspect ratio (although my monitor/GPU setup should force that on games automatically)
Vsync on (just in case, to prevent possible screen tearing)

I think usually that's all. I don't really worry about FPS and all that unless the game feels choppy, laggy or prone to making me motion sick / giving me a headache (which is not a frequent occurrence).
Post edited March 05, 2023 by Leroux
BreOl72: This is my usual procedure for setting priorities (not restricted to graphics):

1) Reducing music volume to "barely audible" - or completely off (if I find it nerve grating).
2) Resolution set to native screen resolution (if possible - else the highest available resolution below my screen resolution).
3) Setting the highest details possible, that still allow me to play the game stutter-free.
4) Enabling blood/gore details (if the game has such a setting, and they aren't enabled by default).
5) Changing the controls to what I'm used to (if possible).
I don't lower music volume.

On the other hand, if the game has voice acting, I will turn it off (making sure to have subtitles enabled, of course). (I don't like it when a game has voice acting with no option to disable it; to me, setting the voice acting volume to 0 just isn't enough.)
AB2012: 1. "Actually looks sane / sensible". By that I mean there are some settings I always turn off regardless of performance (eg, Depth of Myopia, Chromatic Abhorration, etc) that just look visually ridiculous to me. "I now can't see anything beyond 8ft distance because I'm speaking to someone 4ft away" is not remotely how eyesight works, never has been and never will be. "Bokeh photography" is a nice static photo best hung on the wall but makes no sense at all in games. If your in-game avatar was as blind as some overly cinematic games try and represent, you wouldn't be out saving the world, you'd be at home phoning your Ophthalmologist for much needed corrective eye surgery...
I struggle with this stuff a lot. I hate that developers want their games to look like movies, but I also feel like games look weird when you don't view them as intended. Like I recently replayed Resident Evil 3's remake and the depth of field and chromatic aberration are insane, and yet turning them off makes the game look kinda flat and shows poor textures obviously meant to be hidden by blur. Outer Worlds had insane chromatic aberration, but turning it off shows how much it was used to make the world feel more alien and abstract. Now it looks like AA budget rock textures.

So while those effects annoy me, and I wish devs wanted to be Warren Spector instead of Steven Spielberg, I tend to leave them on nowadays because games are so designed around them.
Depends on the game: if I'm playing a competitive game, I try to aim for 60 FPS, but most of the time I prefer higher graphical fidelity and 30 FPS.

1- Native Resolution.
2- Texture Quality.
3- Eye Candy (SSAO, SSR, AA, God Rays, etc.).
4- 60 FPS.
Post edited March 06, 2023 by Ruvika
Aim for 60fps, even if it means potato-graphics. If I can't get 60fps in a twitch action game, I don't play it. For stuff like the newer Fallouts, I can accept FPS drops.
1920x1080 (native res) fullscreen, but will play in a smallish window if necessary for FPS
Bloom off, because it sucks
Motion blur too
Shadows, AA and Ambient Occlusion are the first settings I plunder for FPS if necessary
AFAIK textures and anisotropy have little impact on FPS so they're among the last to lower
Vsync on by default if full-screen, nowadays it seems necessary on most games because of screen-tearing
I try to go for native resolution when possible. For older games, I prefer to keep the aspect ratio.

I don't really mess with the graphics settings all that much unless the game is running too slow for my liking. Usually, anti alias is the first to go.
Lesser Blight Elemental: AFAIK textures and anisotropy have little impact on FPS so they're among the last to lower
I don't think I ever lowered texture settings in my life until last month. New games this year like Dead Space and Hogwarts gobble up VRAM like pigs at a trough. It's ridiculous.
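For a rough sense of why texture quality eats VRAM: an uncompressed RGBA8 texture costs width x height x 4 bytes, and a full mipmap chain adds about a third on top. This is back-of-envelope only (shipped games use block compression like BC7, which cuts the footprint by 4-8x), but the scaling with resolution is the same; the function name below is my own:

```python
def texture_vram_bytes(w, h, bytes_per_pixel=4, mipmaps=True):
    # A full mip chain (1/4 + 1/16 + ...) adds roughly 1/3 to the base size.
    base = w * h * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

MIB = 1024 * 1024
# One uncompressed 4096x4096 RGBA8 texture with mips is ~85 MiB, so even with
# compression, a few hundred unique 4K textures can swamp an 8 GB card.
print(texture_vram_bytes(4096, 4096) / MIB)
```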
1) 100 frames per second + Vsync on + Motion Blur off. These settings allow me to play with backlight strobing activated on my monitor and reach CRT-like motion nirvana (seriously, more people should try this).
2) 1440p/1620p resolution, downscaled to 1080p (monitor res) with DLDSR. Combine with DLSS 2.0, if available.
3) Disable any form of temporal anti-aliasing, if possible.
4) If it is not possible to get 100 fps and 1440p, then set 1080p + nasty TAA.
5) If it is not possible to get 100 fps, then favor visual quality as long as frame rate >= 40 fps.

In lightweight games capped to 60 fps (such as many sidescrollers), sometimes I set the monitor to 120 Hz and use a Black Frame Insertion software. The flickering is very noticeable, but I prefer that over the awful sample-and-hold motion blur.
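The sample-and-hold blur mentioned here comes down to persistence: how long each frame stays lit. A quick illustrative calculation (assuming ideal timing, which real backlight strobing and BFI tools only approximate):

```python
def persistence_ms(refresh_hz):
    # On a sample-and-hold display each frame stays lit for a full refresh.
    return 1000.0 / refresh_hz

# Plain 60 Hz: every frame is lit for ~16.7 ms.
# 120 Hz with every other frame black: each 60 fps content frame is lit for
# only one ~8.3 ms refresh, halving perceived motion blur (at the cost of
# visible flicker and roughly half the brightness).
print(persistence_ms(60), persistence_ms(120))
```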
What a great idea!

Tinkering with all the settings and trying some magic performance/quality combinations is what made me switch to PC.

Here is my list:
1) Anti-aliasing: I know, I know, it may be strange to see it here, but one thing I hate in videogames is all the jagged edges. I play on a 1080p 144Hz monitor, since I still don't have the money to upgrade, nor do I really want to, as I'm happy with this resolution.
I hate the fact that almost every developer has dropped support for MSAA (which in my opinion is the best AA method available), since it looks really good and, with MFAA enabled, isn't that performance-hungry. I totally hate every temporal or post-processing AA method, like FXAA and TAA, since they only blur the image. There is also DLAA from Nvidia, which seems really good, but it's implemented in like 10 games.

2) FPS: holy God, how I regret not buying a high refresh rate monitor. With G-Sync/VRR it isn't really important to always get high frame rates or to enable vsync; I hate all those broken horizontal lines on my screen.

3) Upscaling method: this is a new entry from the past few years, which gained a lot of popularity thanks to Nvidia. DLSS isn't perfect in every game though: in Remedy's Control I couldn't play with it enabled, since the amount of fog and small text made everything too blurry for my eyes.

4) Texture resolution: I love seeing all those little details in the scenery, from little notes to pictures to NPCs... and I also like spending some time modding the games I play, just to make them look better and better.

5) Render distance: I want to see more, over all the hills and beyond!

Bonus: I tend to max out shadows, since I hate seeing jagged edges on the floor under my character. But I completely don't understand why everyone is going mad over RTX: we still haven't reached the point where we can use those features without burning our GPUs, and not every game is well optimized (like Hogwarts Legacy, where normal shadows look better than the RTX ones).
StingingVelvet: I don't think I ever lowered texture settings in my life until last month. New games this year like Dead Space and Hogwarts gobble up VRAM like pigs at a trough. It's ridiculous.
Same for me, but I have to say that in Hogwarts Legacy, if you choose low or high textures, nothing changes; it's just a placebo.
Post edited March 07, 2023 by pippo-san