- okay... so the "more ROPs" announcement was a successful 1 April joke!

- maybe you guys have also heard about this new upcoming GPU from Bolt, called ZEUS? Atm as fast as a 4090?
Check out the vid: https://www.youtube.com/watch?v=8m-gSSIheno&t=225s (it might be another April Fools joke though)

- yea yea but so, is it better to buy an xx70-series Nvidia card every season than to invest in anything higher, from an efficiency-minded Dutch standpoint??
Probably a 1 April joke, yeah... because honestly, even in my dreams i can not find it realistic that Nvidia would offer MORE than they actually advertise... just impossible. A really good joke though... it hits the truth so hard, it goes deep.

Regarding VRAM... according to my current tests, 12 GB VRAM is the minimum for 1080P at high or even highest settings, because some games get pretty close to 12 GB if RT and all the other shinies are used. At 1080P, a modern mid-range GPU surely has sufficient performance; the best of them may even handle it without frame generation or upscaling.

However... at highest settings above 1080P, any card except the 4090 and 5090 will need some upscaling and/or frame generation help.

Currently, my most demanding game is probably Spider-Man 2... more demanding than the newest Cyberpunk 2077 or the most up-to-date Witcher 3 at highest settings.

At highest settings, meaning native DLAA and full RT at the highest possible level, it can barely maintain 40+ FPS in most scenarios, sometimes lower, sometimes higher. It surely is still playable, but not "the easy way", at 1080P.

At 1440P... the 4090 will likewise only deliver around the same FPS as the 3090 TI. At 4K, even the 5090 may only deliver around 40+ FPS... most likely even lower, so it can "dive" into problematic performance territory WITHOUT frame generation and/or upscaling.

The VRAM demand is surely higher than 12 GB at highest settings above 1080P... but i can not say by "how much".

Nonetheless... most games surely are fine with 12 GB VRAM, for now... but i would not consider it future-proof. 16 GB+ is future-proof, i would say.

On modding... more VRAM is always useful (16 GB is the minimum i say, 24 GB surely very safe, for 32 GB i currently see no use)... but this applies to heavily modded games only, for example a maxed-out Skyrim.

https://ibb.co/Fk5cR9TJ
https://ibb.co/G44hyr7G
https://ibb.co/V0CWcrQv

More info:
https://www.gog.com/forum/general/good_games_that_are_available_drmfree_but_currently_are_not_sold_on_gog/post127
Post edited April 06, 2025 by Xeshra
P. Zimerickus: - yea yea but so, is it better to buy an xx70-series Nvidia card every season than to invest in anything higher, from an efficiency-minded Dutch standpoint??
Personally, i hate it when i have to buy a still-expensive GPU (i mean, 700+ coins is NOT free; sure, Nvidia was dropping prices now because they finally got competition) with a somewhat cut-down VRAM interface. Yes, 12 GB is probably sufficient, but it is the MINIMUM; not good and surely not the maximum. So why should a sane human pay a lot of coins and still get the minimum? I do not get it... unless i were a fan of a company, which is not the case for me. If AMD does a better job i can go to AMD, no issue here. If Nvidia does a better job, i stay with Nvidia.

Currently, AMD's 9070 XT is "the best bang for the buck°°"; my personal view. However... i would rather enjoy an even more powerful card, as this card is about the same performance as my old card, just with way higher efficiency. Not enough to make me buy it... unless my current card were weaker than that.

°°Some 9070 XT cards are sold at not much more than 700 coins... clearly a fairly priced card.

My own approach is to wait for the next gen to arrive, around 2027, which is no issue for my long-lasting 3090 TI. In 2027 it will be either the UDNA flagship from AMD or hopefully an Nvidia flagship, which is probably priced at 2500+ but AT LEAST way better than the pretty lame 5090 (for its price).

The only exception is if i can get a true bargain on a 5090 "luxury design", mainly the MSI Suprim LIQUID: if it is around 2500 (which is most likely impossible, but who knows...) then i may risk it...
At 3000+... no way! Go home Nvidia, along with your just-as-greedy vendors.

If it can not come down into more fairly priced territory, then Nvidia can simply go pump up the AI industry instead... up to the point they may have to vomit, when it goes just over the bloody edge of any amount!

P. Zimerickus: - maybe you guys have also heard about this new upcoming GPU from Bolt, called ZEUS? Atm as fast as a 4090?
Check out the vid: https://www.youtube.com/watch?v=8m-gSSIheno&t=225s (it might be another April Fools joke though)
Currently there is a lot of hot air but no "empirical evidence". So i suggest waiting for a true reveal and taking this stuff with a huge grain of salt. Stuff that is not available is simply not of much use... it is good to be rational about this.
Post edited April 06, 2025 by Xeshra
Xeshra: Regarding VRAM... according to my current tests, 12 GB VRAM is the minimum for 1080P at high or even highest settings, because some games get pretty close to 12 GB if RT and all the other shinies are used. At 1080P, a modern mid-range GPU surely has sufficient performance; the best of them may even handle it without frame generation or upscaling.
Xeshra: The VRAM demand is surely higher than 12 GB at highest settings above 1080P... but i can not say by "how much".

Nonetheless... most games surely are fine with 12 GB VRAM, for now... but i would not consider it future-proof. 16 GB+ is future-proof, i would say.
This is something that bothers me about the 5070 compared to the 9070. Even if 12GB is fine for now, the 5070 is restricted in some scenarios. Testers have shown that the 5070 can fail at settings that are fine for the 9070.

Game developers also have different ways of approaching VRAM, which adds to the confusion. Some games will allow you to exceed your limit, which results in stutters and crashes. Other games will secretly swap out textures, lower object quality, or hide things to stay within your VRAM limit.

Another source of confusion is allocated VRAM, which is higher if a card has more VRAM. Some gamers think this allocated VRAM is useless, but it seems to me the game is just storing more information since it has more VRAM to work with. This is a good thing for stability and for loading textures and objects.
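
A quick way to see allocated VRAM for yourself is to poll the driver while playing. Here is a minimal sketch via NVML (assuming an NVIDIA card and the nvidia-ml-py / pynvml Python package; the 10-second interval is an arbitrary choice). Keep in mind NVML reports what the driver has handed out across all processes, not what a game strictly needs:

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        # "used" is allocated VRAM, which tends to be higher on a bigger
        # card -- it is not the game's hard minimum requirement.
        print(f"VRAM allocated: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(10)  # sample during a long session and watch it creep up
finally:
    pynvml.nvmlShutdown()
```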

In any case, a mid-range GPU should offer 16GB at minimum. This has already been the standard for AMD since RDNA 2. NVIDIA continues to restrict its buyers.
Yeah, i do agree. The VRAM amount someone is able to detect during gameplay does not show "the full story", and often the game sets its "own limitation" by using many "tricks" in order to stay within the limit set by the GPU.

However... as i use a 24 GB VRAM card, in my tests i surely do not run into such restrictions.

Anyway, as i said already... the 9070 XT is currently the best value, and my opinion is still the same.

Sure interesting to see that the 5070 (non-TI) loses a lot of performance in a good bunch of 1440P+ games compared with any card with 16 GB and above. To be fair, this card was designed to run at 1080P; yet without frame generation it will even lose to a 3090 TI, which is not exactly a card i would ever switch my card out for. Not even counting that the very old flagship is 2 times more power hungry... at high settings the 5070 can not even match the very old flagship (2 generations behind).

In general, i expect at least 50% more performance for a solid upgrade, and usually at the same price (which means 1300 coins). However... since money has been falling through the floor in value... "the same price" is nowadays rather a myth... in that case i want at least 50% more performance.

Money just has almost no value anymore... face the truth. The money hoarders made it happen.
Post edited April 06, 2025 by Xeshra
Xeshra: However... as i use a 24 GB VRAM card, in my tests i surely do not run into such restrictions.
I'm on 16GB and have had long play sessions in Cyberpunk where my VRAM reaches 14GB. This is due to the game storing more and more information as I play. But upon initial boot up the game only says 9GB so one may think it would never go above 12GB. For your GPU it may even go above 16GB since it would have more room than mine.

9070/XT are great deals at the moment if they can be found at close to MSRP prices. My 9070 will last me a very long time.
Well, true... upon initial boot-up, the VRAM value tends to be a good amount lower, because obviously the game is not loading all the VRAM data required right from the start. It usually adds more data with extended playing time, as long as there is sufficient VRAM available. If it lacks VRAM... the game will try to store much of the data on slower parts of the system (main RAM, or even an SSD cache file), which can make the game slower over time.

Testing simply is a very extensive procedure, and most people, including me, may not take a sufficient amount of time for it. At least not "just in order to prove something". Truth is... more resources may simply offer way more benefit than what appears "on the surface level".

So, yeah, people should not let themselves be fooled by the "surface data" available to them... often much of the stuff is hidden from them.

There are even some games that will try to make use of as much main RAM (not only VRAM) as possible, loading as much cache data into main RAM as they can... which depends on the main RAM available to them. As soon as a game runs out of any safe amount of main RAM (or VRAM), it will start to store much of the cached data in some SSD space... which will make the game much slower... depending on the kind of data stored there.
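
To illustrate that tiering, here is a toy model (purely illustrative: the budgets, asset sizes and fallback order are made up, and no real engine works exactly like this):

```python
# Toy model of a tiered asset cache: fill VRAM first, overflow into main
# RAM, and spill to an SSD cache file as the last resort. Sizes in MiB.
VRAM_BUDGET = 12 * 1024  # e.g. a 12 GB card
RAM_BUDGET = 16 * 1024   # headroom the game allows itself in system RAM

class TieredCache:
    def __init__(self):
        self.vram, self.ram, self.ssd = {}, {}, {}

    @staticmethod
    def _used(tier):
        return sum(tier.values())

    def load(self, name, size):
        if self._used(self.vram) + size <= VRAM_BUDGET:
            self.vram[name] = size  # fast path: resident on the GPU
        elif self._used(self.ram) + size <= RAM_BUDGET:
            self.ram[name] = size   # each access now costs a PCIe copy
        else:
            self.ssd[name] = size   # each access now costs a disk read: stutter

cache = TieredCache()
for i in range(2000):               # 2000 x 16 MiB = ~31 GiB of assets
    cache.load(f"texture_{i}", 16)
print(len(cache.vram), "in VRAM /", len(cache.ram), "in RAM /",
      len(cache.ssd), "spilled to SSD")
```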

Just as you said already... the truth is way more complicated than "a game uses exactly so and so much VRAM, or even main RAM"... it can sometimes become pretty dynamic.

RAM is in general a nasty thing, because... as long as it is sufficient, it barely gets noticed. But as soon as you run out of RAM... it gets noticed more than any other part of the system... as performance can drop pretty drastically. A modern game usually tries to "avoid this worst-case scenario" by storing the "overflow" in some free cache location on the system... but if even that location runs out of space, the bomb finally explodes.

The main rule simply is: filled-up RAM is bad RAM... under optimal circumstances there should always be some headroom left for "worst-case scenarios". With 12 GB VRAM... there is simply too little valuable headroom.
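
That rule is easy to turn into a simple check... a trivial sketch (the 15% margin is just my own illustrative pick, not any published standard):

```python
def headroom_ok(used_mib, total_mib, min_headroom=0.15):
    """True if at least min_headroom of the memory pool is still free."""
    return (total_mib - used_mib) / total_mib >= min_headroom

# 11 GiB in use: below the margin on a 12 GiB card, plenty on a 16 GiB one.
print(headroom_ok(11 * 1024, 12 * 1024))  # False (~8% free)
print(headroom_ok(11 * 1024, 16 * 1024))  # True (~31% free)
```
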
Post edited April 06, 2025 by Xeshra
Xeshra: The main rule simply is: filled-up RAM is bad RAM... under optimal circumstances there should always be some headroom left for "worst-case scenarios". With 12 GB VRAM... there is simply too little valuable headroom.
100%, that's the keyword: headroom. It's not about what I need at the moment, it's about having the space for worst-case scenarios.
However... there are many gamers that seem to be happy with the 12 GB 5070 card. They even say the price of 700+ coins is good value, the performance is top, the VRAM amount is good, it has 3 fans and can be overclocked...

And way more in this direction... perhaps all they care about is "Nvidia", no matter if AMD is actually offering a better product.

Humans simply enjoy "being fans" and enjoy "believing"; important to realize.
Post edited April 06, 2025 by Xeshra
This is maybe a nice addition to the discussion about "what do you want?"

After several comments noting how unnecessary 1080P CPU testing actually is, the guys from Hardware Unboxed decided to review the 9800X3D, this time at non-scaled 4K ultra settings...

Guess what happened?

view it here: https://www.youtube.com/watch?v=jlcftggK3To
P. Zimerickus: This is maybe a nice addition to the discussion about "what do you want?"

After several comments noting how unnecessary 1080P CPU testing actually is, the guys from Hardware Unboxed decided to review the 9800X3D, this time at non-scaled 4K ultra settings...

Guess what happened?

view it here: https://www.youtube.com/watch?v=jlcftggK3To
A trolling April Fools video!
Well, you see... without "dirty tricks" even the 5090 looks like it lacks muscle. Native + RT... that pretty much kills any GPU, and the CPU will not matter anymore.

The 5090 in my view is not a strong card, it is just the strongest available. Guess we have to get used to using frame generation and upscaling methods in many games; "faking stuff" seems to be part of "how we are able to play", or how a GPU is actually able to make it work.
Post edited April 07, 2025 by Xeshra
Xeshra: Well, you see... without "dirty tricks" even the 5090 looks like it lacks muscle. Native + RT... that pretty much kills any GPU, and the CPU will not matter anymore.

The 5090 in my view is not a strong card, it is just the strongest available. Guess we have to get used to using frame generation and upscaling methods in many games; "faking stuff" seems to be part of "how we are able to play", or how a GPU is actually able to make it work.
With the ever-growing influence of gaming, i wouldn't be surprised if GPUs soon expand their pricing again. That would make them more of an above-average-income young adult's expenditure than anything else, and of course the extras will probably have merit, but the pricing will lack any sense of logic, unlike what we are used to from pricing in every other segment, which is able to reflect a certain amount of authenticity.
Well, 2, actually 3, things:

- The number of gamers is growing: Correct.

- Prices going up due to increased popularity: Not necessarily. Prices are set in a way "so it works for those asking the price", based on a lot of different approaches and philosophies. But one mentality is the same for everyone: never charge less than required.

For Nvidia, there is currently not much reason to drop prices on their highest-range cards, as they have barely any "fine amount" left. Almost the entire production capacity is still being burned on the very hungry AI market, which is able to pay a very solid price. As long as this issue with the coin-rich AI competition remains... gamers do not see a lot of daylight anymore, neither in prices nor in availability.

The mid-range GPUs are a different matter, because they do not directly compete with the AI market (and its leftover chips), so they are designed for "the poor". Yet capacity is still not necessarily high, because it is controlled by current market demand, so there are never "way too many chips", making them scarcer than they need to be... with higher pricing in mind. The economy simply charges the highest possible price, and in general the overall income is slowly going up worldwide (some countries have up to 50% inflation a year, it is crazy) while taxes are going up as well... so in general, prices know only one way, and that is "upward"... in the long term.

- Gaming is going to be more of a luxury (some hints you made there): Well, i guess this is correct, because the price of gaming-capable hardware is increasing faster than the average income worldwide... no matter whether it is a PC or a gaming console (Switch 2... PS6... they will all be pricier than their last generation). However, the industry may not care about making gaming a bit more "difficult to access", because the number of gamers is sufficiently high... what they truly want is gamers who buy more games on average and pay more for each game; so, to become "less casual". The very casual gamers tend to be pretty unpredictable = trend-based and difficult to manage, which can cause a lot of trouble in predicting market demand.

It is countered by making a mainstream game extremely casual = easy to approach, but that creates the risk of upsetting many of the more solid gamers, and those are usually the very loyal ones. Nintendo, for example, has a strong foundation built on a lot of very loyal fans... and those fans are still important for Nintendo. So it is not all about "making it as casual as possible in order to attract the big mass", but also about keeping the foundation as stable and steady as possible, because this is something a company can "count on"... and usually easy to predict. So, a growing audience is not bad, but it is important not to upset the foundation. Else... we can see with Ubisoft what may happen after...
Post edited April 08, 2025 by Xeshra
nVidia chips compete with themselves for fab space, including the AI chips, but that also applies within the 5000 series, since they all use the same process/node. So a 5060 (or 5070) now competes with the 5090 for fab space, and profitability-wise the 5090 wins by a large margin. You can make more low-tier chips from the same wafer, but you still have more profit per mm^2 on the bigger chip. That spells bad news for anyone on any sort of budget.
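
To put rough numbers on that, here is a back-of-the-envelope sketch using the classic gross-dies-per-wafer estimate (the die areas and per-chip prices below are illustrative assumptions, not actual figures):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies on a round wafer: area term minus an edge-loss term (ignores yield)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Assumed, illustrative numbers: a ~750 mm^2 flagship die vs a ~263 mm^2
# mid-range die, with made-up selling prices per chip.
for name, area, price in [("big die (5090-class)", 750, 2000),
                          ("small die (5070-class)", 263, 550)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~{n * price} per wafer")
```

Under these made-up prices the big die earns more per wafer despite yielding far fewer chips, which is exactly the squeeze on the budget tiers.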

Companies also have to book fab space up to a year in advance, and there is seldom any excess space to be had short-term, because the limiting factor for most tech companies' chip production is fab space. If there is spare capacity, loyal/long-term customers get first refusal, and nVidia is not seen as a loyal customer.

People tend to let TSMC off the hook for the price increases, but they've been consistently increasing prices even on mature nodes, when historically nodes only got cheaper as they matured. It's pretty clear price gouging, but with no alternative there isn't much that can be done in the short term.

OTOH the 9070s from AMD don't have much internal competition for space. They're the top-end chip of the line, and their CPUs don't compete since they're made on a slightly different node, as are the console chips. But again, short-term fab capacity to cope with acute demand is basically non-existent, and there's no way AMD would or could have booked enough space a year in advance to compete with nVidia on volume.

Perhaps the best illustration is the constant shortages/out-of-stocks on AMD's X3D-class CPUs. Clearly, AMD would love to make more, as multi-month delays result in people being forced to buy other chips, including the direct Intel competitors, but they simply can't juggle the fab space to get enough supply.