Was hoping for some feedback on a topic I've been mulling over recently...

I've come to the realization that many of the games I play on PC these days are either (a) not that demanding to begin with (lots of indie titles come to mind, as well as games like Stacking and Amalur), or (b) highly scalable for great performance on many systems (like Blizzard games and Source titles).

Perhaps it's just the games I play, and I'm in a minority, but I'm running a PC that's now over two years old with no major component upgrades, and I'm still having no trouble running most things at full bore (or close enough for my tastes). The recommended specs for the majority of PC games, furthermore, don't seem to be going up in dramatic fashion. There are still certain benchmark titles out there, like BF3 and The Witcher 2, but the boundary pushers just don't seem as prevalent as they once were.

Maybe it's due to the length of this console era, but when almost all games are either console ports (7-year-old hardware, and still likely to keep plugging away) or indie titles built on less demanding engines, is it really worth shelling out the cash for top-spec hardware anymore? I'm starting to wonder if I should skip upgrading altogether and just pick up a new laptop instead, as the portability is starting to look more worthwhile than an actual hardware boost. Again, I know this is just my experience, but I'm wondering how other people here feel about it, based on what they play and what their preferences and tolerances are.

TL;DR: With a lot of (not all - benchmark titles still exist) PC games these days existing as either console ports or less demanding indie efforts, is it still worth the large investment in top-of-the-line hardware?
Post edited April 13, 2012 by EC-
It really only seems that a more powerful PC is needed if you want to run games at high resolution and get high frame rates. For that, high-end video cards are needed, and for those to run properly, a decent to very good CPU is needed.
The problem with games at the moment is that a lot of them are console-first (so they only need moderate PC requirements) - games don't seem to be cutting edge these days, like Crysis was years back. Until game makers produce more games with better or higher tech, you probably won't need to get those $$$$$$ video cards.
Ok, I think there are 2 factors here:

1) First, I think the hardware push has slowed down a bit.

Hardware is still improving fast, but not as fast as it did 10 years ago.

Yes, you have multi-processor hardware now, but taking advantage of it is trickier... just because you have 4 cores doesn't mean that CPU-heavy applications will run 4 times as fast, the way they did when you had a single CPU that was 4 times the speed.
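
A back-of-the-envelope way to see why (the 80% figure below is just an assumption for illustration) is Amdahl's law: if a fraction p of a program's work can run in parallel, the best possible speedup on N cores is

    S(N) = 1 / ((1 - p) + p/N)

so even a game where 80% of the work parallelizes perfectly tops out at 1 / (0.2 + 0.8/4) = 2.5x on 4 cores, not 4x.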

So, the end result is that the hardware you get now takes longer to become truly obsolete.

2) Where the bulk of the resources are going for most games: into fancy graphics and physics engines.

However, graphics technology has improved so much that I'm guessing the return on investment is diminishing rapidly.

I would theorize that pushing the envelope graphics-wise is becoming riskier and riskier financially (you need to spend more $$$ to get an advancement that will be readily apparent to most users).

With less meaningful advancements in graphics and physics engines, the need for more and more hardware is diminished.
Post edited April 13, 2012 by Magnitus
You never buy a PC with only one specific use in mind. I mean, say you only wanted a computer for word processing and internet browsing - would you buy a computer so old and crummy that that's all its hardware was capable of? Of course not.

You buy the best computer you can afford, because you don't know whether in a month or a year you might want it for more taxing tasks.
We still get great exclusives on the PC, some of which need more power than others. Then there's the fact that just about every multiplatform game that gets a PC release looks and plays better on the PC, thanks to better framerates, graphics and resolution - if you have the power.

For me, games like Civilization V are worth owning a PC for, and that would never work on a console for me. It can also be a very demanding game on the PC, due to the amount of calculations and whatnot that has to be done, which only escalates as the game progresses. That needs a decent CPU to run optimally.

Then you get games like The Witcher 2 and Skyrim, both of which can require a powerful graphics card if you want to run them with most of the bling intact.

On the question of longevity, I'm pretty sure that if you build a medium-to-high-range gaming PC at the same time as a new generation of consoles comes out, your games on the PC will look better than the console counterparts from day 1. And with today's compatibility with gaming controllers (like the X360 pad, which is worth buying even if you only own a PC) and HDMI ports, you can play your better-looking PC versions on your TV with ease.

There are still a few games you can miss out on if you don't own consoles at all, but if I had to choose, I would rather build a powerful PC than own all of the consoles for a few extra exclusives. But that depends entirely on what your favourite genre is. I don't think JRPGs are very well represented on the PC, unless you dive into the world of emulation of older titles.
It depends dramatically on which games you like, but in the long run... no, there's no need for new hardware.

First: Memory. A 32-bit process can only use 4 GB of memory, regardless of OS or total available memory. I have yet to see a 64-bit game.
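
For anyone curious, that 4 GB ceiling falls straight out of the pointer size. A minimal C check (purely illustrative) that reports how wide the address space of the running build is:

    #include <stdio.h>

    int main(void) {
        /* Pointer width determines how much memory one process can address:
           4-byte pointers (a 32-bit build) give 2^32 bytes = 4 GB at most,
           no matter how much RAM is installed or which OS is running. */
        size_t ptr_bytes = sizeof(void *);
        printf("pointer size: %zu bytes (%zu-bit address space)\n",
               ptr_bytes, ptr_bytes * 8);
        return 0;
    }

Compiled as 32-bit, it reports a 32-bit address space (4 GB); a 64-bit build is what a "64-bit game" would need in order to go beyond that.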

Second: Processor. A lot of games whine or just hang on multi-core rigs. I have a 6-core and I can count on one hand the number of games that use more than two cores... and even those are rare, with the majority running on only one core.

Third: GPU. Upgrade if, and only if, you game at 1080p. If you have a 1440x900 display (as I do), a high-tier GPU from two generations ago (a Radeon 4870, for example) is fine for the majority of games.

In summary: a beefed-up dual core with 8 GB and a high-tier 2-3 year old GPU is enough for most games.
Post edited April 13, 2012 by isaac.camin
Keep in mind the current console generation is winding down. Soon enough the average game will start pushing the limits of a few-year-old computer. Maybe you don't need to rush out and upgrade today, but in a year or so it won't be a bad idea.
I think the "console generation is keeping back the PC game requirements as well" was already true in Playstation (1) time. Quite many PC games were the same as the Playstation version, only with higher resolution and smoothed textures (and maybe perspective correction & Z-buffer, something that the PC 3D accelerators offered "for free" over Playstation graphics). Maybe some like ID and Epic Megagames still pushed the envelope with some of their PC-only titles, but many others didn't.

It wasn't until PS2 came out that most PC games' HW requirements seemed to suddenly jump ahead as well, with much higher polygon counts on games etc. than in the Playstation 1 era. So, maybe the consoles do give some kind of yardstick for PC HW requirements (and I am not talking about the actual HW units in consoles, but what they are capable of vs. on Windows PC, with DirectX APIs etc.). PC versions mostly are like the console versions, only with higher resolution, and bigger textures if we are lucky. Polygon counts etc. are still the same.

And in a way, I like that, since I'm not in a habit of constantly upgrading my PC piece by piece, especially as it seems my PC gaming will be on laptops mostly anyway. I'll buy a powerhouse gaming laptop, and I hope it will be enough for quite some time.
Post edited April 13, 2012 by timppu
timppu: especially as it seems my PC gaming will be on laptops mostly anyway. I'll buy a powerhouse gaming laptop, and I hope it will be enough for quite some time.
This is kind of what I am getting at. The way I see it, we're at the point where you can game comfortably on a laptop that isn't even necessarily a 'powerhouse gaming' machine. For instance, my brother games solely on a base-model Dell XPS 15 with a mobile 525 and a Core i5, and he does pretty well for himself. He's able to run Dead Space 2 at full bore at the laptop's native res, output Rainbow Six Vegas 2 to a 1080p TV with no issues, and run BF3 on medium settings, which is (IMHO) better looking than the 360 version.

I guess I just remember a time when guys would spec high-priced cards and processors to keep up with benchmark-smashing software, and upgrades were required on a much more frequent basis. I wonder whether that era (for better or worse) is over, whether the entry cost for PC gaming has dropped considerably, and whether laptops are now a viable way to meet gaming needs.
Post edited April 13, 2012 by EC-