bevinator: The eye can't really see much of a difference (or any difference) past around 35 fps,
That's not the case with most games. I can clearly see the difference between 30 and 60 fps. That's because in games the fps fluctuates, so the time interval between frames keeps changing, and that variation is what you notice. At 60 fps it's very hard to see, and at 120 it might be impossible, but at 30 it's very visible.
Movies are only 24 fps, but the intervals are constant, which is why you don't notice the low framerate.
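Rough illustration of what I mean, sketched in Python (the frame times are made-up numbers, purely to show the idea):

# Same idea in code: a steady 24 fps vs a fluctuating "30 fps".
# Frame times are in milliseconds and invented for illustration.
movie_24fps = [41.7] * 6                            # constant intervals
game_30fps  = [20.0, 45.0, 28.0, 50.0, 22.0, 35.0]  # fluctuating intervals

def describe(frame_times_ms):
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    spread = max(frame_times_ms) - min(frame_times_ms)
    print(f"avg {1000.0 / avg_ms:.0f} fps, interval spread {spread:.0f} ms")

describe(movie_24fps)  # avg 24 fps, spread 0 ms  -> looks smooth
describe(game_30fps)   # avg 30 fps, spread 30 ms -> visible judder

Same average rate in both cases; the spread in intervals is what your eye picks up.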

Maybe everyone should read this: http://www.100fps.com/how_many_frames_can_humans_see.htm
Whitewraith: I can't seem to make up my mind. I hate losing FPS and hate screen tearing, so in some games I am constantly turning it on and off. I can usually find that sweet spot where it's fast enough but without tearing, but in some newer games I can't (I'm looking at you, Kingdoms of Amalur and Space Marine).
For anyone with a very modern PC, vsync is probably one of the options that must be switched on while playing Legend of Grimrock; otherwise your graphics card will overheat. Just look at the video FAQ in the Legend of Grimrock subforum. :)
Post edited April 16, 2012 by Lexor
I think I've checked/changed the vsync settings the last time on my old 3Dfx Voodoo 2 card, sometime in the late 90s. I was probably trying to make Descent 2 or glQuake run as fast as possible by disabling vsync.

Ever since, I haven't really paid attention to whether it is on or off. If I had to decide, I'd probably want to keep it on, in case that helps run some old games at the correct speed. For example, why the heck does "The Reap" run so damn fast on this PC?

With single-player 3D games, I think I've been quite happy if I get constant 20 fps. That's plenty for me in most cases. Competitive multiplayer action games were and are a different matter.
Post edited April 16, 2012 by timppu
Why would I hate it? Admittedly there are some poorly coded games that work better without it, but that's more a matter of hating poor coding than vsync.
Triple buffered or GTFO in my opinion. Don't want drops from 60FPS to 30FPS in an instant.
Screen tearing is one of the most obnoxious things ever, so yeah I like vsync
To all the folks replying to me:
My statement was true. The second half of my sentence (which nobody is quoting) is the reason games look choppy at 30 fps. Games don't have a "set" fps; it changes constantly, and if you've ever had an fps counter open you'll have noticed that even at 100+ fps it can swing 40 fps in either direction almost instantly, depending on what's onscreen. If you're running a game at 60 and it drops to 40, it still looks fluid. If you're running a game at 30 and it drops at ALL, it looks choppy as hell. I wasn't saying that games don't need higher than 35 fps to look good.

*The goal of a high fps is to make sure that no matter what is onscreen at any given time, the fps never drops below the point where your eye can see the difference.* From one frame to the next, if you're playing at 60 fps, your "real" fps can be 40 or less, even though the average (or base) fps is higher. Any lower than 60 average/base and you can see major changes.
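To put rough numbers on the average-vs-"real" fps point, a little Python sketch (frame times invented for illustration):

# Sketch: a high average fps can still hide slow individual frames.
# One second's worth of hypothetical frame times, in milliseconds.
frame_times_ms = [14, 15, 16, 25, 14, 16, 15, 24, 16, 15] * 6

avg_fps   = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)   # the slowest single frame

print(f"average fps: {avg_fps:.0f}")        # ~59 fps on the counter
print(f"worst-frame fps: {worst_fps:.0f}")  # ~40 fps where your eye catches it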

Also Gromuhl's link is good, as it helps spell out the differences between gaming fps and other things, like identifying a short blip of light.
Some time ago I read an article that simply and succinctly explained the troubles with v-sync, so let me briefly quote what many of you probably already know:
"(...) you want to use v-sync to prevent tearing—an artifact that occurs when in-game frame rates are higher than the display’s refresh and you show more than one frame on the screen at a time. Tearing bothers gamers to varying degrees. However, if you own a card capable of keeping you above a 60 FPS minimum, there’s really no downside to turning v-sync on.

Dropping under 60 FPS is where you run into problems. Because the technology is synchronizing the graphics card output with a fixed refresh, anything below 60 FPS still has to divide evenly into 60. So, running at 47 frames per second, for instance, actually forces you down to 30 FPS. The transition from 60 to 30 manifests on-screen as a slight stutter. Again, the degree to which this bothers you during game play is going to vary. If you know where and when to expect the stutter, though, spotting it is pretty easy."
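The quantization the article describes is easy to work out yourself: with plain double-buffered v-sync on a 60 Hz display, every frame has to be held for a whole number of refresh cycles, so the effective rate snaps to 60/1, 60/2, 60/3 and so on. A rough Python sketch of that (assuming simple double buffering, no triple buffering or adaptive sync):

# Sketch: how double-buffered v-sync quantizes frame rate on a 60 Hz panel.
REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ   # one refresh cycle, ~16.7 ms

def vsynced_fps(render_fps):
    render_ms = 1000.0 / render_fps
    # With double buffering, a finished frame waits for the next refresh
    # boundary, so each frame occupies a whole number of refresh cycles.
    cycles = -(-render_ms // REFRESH_MS)   # ceiling division
    return REFRESH_HZ / cycles

for fps in (75, 60, 47, 31, 29):
    print(f"renders at {fps} fps -> displays at {vsynced_fps(fps):.0f} fps")
# 75 -> 60, 60 -> 60, 47 -> 30, 31 -> 30, 29 -> 20

Triple buffering softens this by letting the card keep rendering into a third buffer instead of idling, which is why the "triple buffered or GTFO" post above has a point.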

Personally, I feel uncomfortable with anything less than a consistent 60 FPS. 30 FPS can be playable, but as soon as I switch from it to 60 it's a true relief (and a moment when I regain faith in my aiming skills).
Always off.
...and to think that I sometimes play games at 10-15 fps and enjoy them :)
It's set to 'always on' in my gpu options, wouldn't have it any other way.
With some games on certain hardware, you need to turn it on for a stable frame rate, for example the Pro Evolution Soccer series and League of Legends.
bevinator: To all the folks replying to me:
My statement was true. (...) If you're running a game at 60 and it drops to 40, it still looks fluid. If you're running a game at 30 and it drops at ALL, it looks choppy as hell. (...)
I'm not trying to pick a fight or anything, OK? But you missed the link I posted:

http://frames-per-second.appspot.com/

If you set the FPS any less than 60 there, you'll notice a difference.
30 is crappy. 40 is less crappy. 50 is sorta good. 60 is good. Above that, I can't see it.

40 doesn't look fluid to me at all.
Fluid, to me, is when you are playing at 60 fps at all times. That's fluid. That's why I don't use VSYNC very much: for the very reasons you posted (FPS drops). I find that the more FPS I have, the less likely it is to drop below 60.

Hope I didn't sound rude or anything. It wasn't my intention. :D
Certain "virtual console" applications tend to run really fast, too fast to play, if you don't have v-sync on.
OmegaX: Like it. I can't stand screen tearing and I don't need more than 60 FPS.
This, especially if it was managing more than 60fps.

I'd even rather have 30fps synchronised with 60Hz than have a tear-y picture because 53.9fps doesn't go into 60Hz cleanly.
Post edited April 17, 2012 by SirPrimalform