Gaming only makes up about 8% of nVidia's business so I'm surprised they even bother with it at all. I think it's mostly a fallback position in case AI does not pan out.
There is an extremely high amount of cash to be made in the digital-industrial complex, as those people are very close to "where the coins are made" and have big political influence. So... indeed... I guess Nvidia currently only makes around 10% of its income from its gaming business, and the only reason they did not stop supporting gamers is that they want to keep some "backup plans"... kinda like what GOG is compared to Steam.

So, Nvidia simply doesn't feel "bothered", and as far as I know they are already supplying the "AI industry" with tons of cards, each priced comparably to a new Tesla... of course.

So, we pretty much had a "paper launch" and almost no one seems to be getting their cards... no matter at what price. Although, as long as the price paid is less than that of a brand-new car... gamers will always be "second-class customers".

Paper Launch (Gamers Nexus)

AMD may have some potential, but their entire PR over the last few years was nothing short of a "catastrophe". They were focusing on Nvidia way too much, trying to offer something competitive, which simply did not work out, so they even stopped competing. A very big failure... as they actually had some potential of their own, for example steady, "on par with Nvidia" rasterization performance (Nvidia made no advancement there in the current gen) as well as fine RT performance in the current generation. Instead of making something out of it... with good marketing... they chose to "nearly disappear" and let Nvidia take over almost everything that is left... out of sheer incompetence in "marketing terms".

So, ... not the best time for gamers, for sure... and the only affordable cards are in the mid-range, so most of us play at "mid-range performance" at best... which is suitable for me at 1080p, at least. So, at least my Plasma TV still has several more years of making me happy.

Besides, for those enthusiasts who want to get a 4090 instead... no... there is no affordable 4090 either... actually their prices have increased beyond what they were several years ago... 3000 now... but at least some of them are available.

Just forget it... the "high-end market" is destroyed for an unknown period, and what is left of it... is the AI market.

Guess I can be happy my 3090 Ti is made with very high-quality parts... as is the rest of the system, including the almost invincible Plasma TV... so I can wait a very long time for better stuff to arrive.
Post edited January 31, 2025 by Xeshra
avatar
drxenija: So, does anyone know if the Founders Editions from NVIDIA are better or worse than the manufacturer versions of the 4080 or 5080?
avatar
AB2012: TechPowerUp have reviewed all the models....
https://www.techpowerup.com/review

...and done a comparison page.
Thank you! Purrfect. It looks like they are all a rip-off.
MSI is clearly my most "recommended" brand, and in fact... my motherboard and my GPU are from MSI.

My 3090 Ti is from MSI, using their cooler design, yet the PCB parts are comparable to the Founders Edition, with the industry's best controller... one of the best combos I can think of. Indeed, on the 5000 series... I would still prefer an MSI product.
It does not seem like anyone unable or unwilling to "ride the Nvidia hype (?) train" is missing much... unless they are masochistic in nature:
Serious issues with NVIDIA 572.16 Driver! Beware!

Besides, the 5090 is realistically... not on paper but in practice... not a 2500 USD card, rather a 3500 USD card... and not only in my country... almost everywhere. So I was right with my "crazy" predictions regarding pricing. The 5080 is a 2000 USD card... likewise in almost any place around the world... if people even get one, that is.

JayzTwoCents seems to be pretty neutral in this surely not "one-sided" video.

The official price... sure... is 1999 for the 5090... we are faaaaaar away from that... and on the 5080 I dunno, but surely just as faaaar away from the MSRP. 999, they say? Thanks for the joke... just forget it and keep dreaming your fairy tale.

As for me... my personal "red line" has surely been reached on both cards, 5090 and 5080... meaning they do not offer sufficient value for "what I realistically can get out of them", so... for me, as long as there are no big price drops, a great aftermarket model and in general suitable availability of those cards... I am out... this is "too much" to handle... even for an apparently "very wealthy" Swiss citizen.

As long as nothing big changes here... I simply recommend using "an upper-middle-class" card, which I already got... so no need for an upgrade. Rather invest the saved money into a big enterprise HDD, so even more backups can be made.
Post edited February 06, 2025 by Xeshra
Someone is telling some hard... bone-breaking truth... GPUs will probably never be "the same" anymore. Gamers are just too poor compared to the AI datacenter industry. Either demand a higher salary, which will not happen for most of us, or say farewell to "high end".

Not much different with HDDs... as they are sought after by datacenters... the per-TB price, despite the increased sizes... is not going down anymore. That's why the biggest HDDs are reserved for datacenters only (normal customers get no access) and the "leftovers"... which are 26 TB or smaller... cost over 500 coins. So... "playing" home datacenter is a pricey thing... in which GPUs and HDDs are both part of the "imitation". That's why most people seem to rely on foreign servers, as well as on the high cooonvenience... and uhm... OOKAY...

PlayStation Network Is Down... And Sony Isn't Saying Why
https://www.youtube.com/watch?v=K8WztvGcIck

GPUs Will NEVER Be the Same:
https://www.youtube.com/watch?v=3LQOToy6e4c

The thing with AMD is, they do not care about GPUs that much anymore either, as they too have a datacenter business... in the CPU market. The 8-core chips for gamers are just some cheap leftovers, and 500 is just about all they could charge... before most gamers would quit on it. I guess the PS5 Pro was a "market test" as well, and it seems Sony was kinda hitting the red line of the usual customer. It may have an effect on the PS6 hardware and its pricing...

Yet, honestly... do we all want a Switch console, just because it is more affordable? What's clear is that it is the most sought-after console. Despite this fact I would never buy it... its performance is just a nuisance. The PS5 Pro is simply a luxury, albeit one that was worth it for me in the end. My "life" time is incredibly valuable and I have no time for playing "worse"... sorry for being that selfish.

Looking at the new, now worse GPU generation with barely any advancement... I would say it was the right thing to do, as a backup for my DRM-free games.
Post edited February 09, 2025 by Xeshra
It seems like the "melted cable issue" is not gone yet... and even people who care a lot may still be running into it.

In my mind, 600 W is just too much power going through a single cable at 12 V... they should use 2 cables with 350 W each. I surely do not feel safe with so much power at 12 V. Ideally... 24 V or more would be the way to go... or simply several cables.
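To put rough numbers on that (a back-of-the-envelope sketch in Python, assuming ideal DC and a perfectly even split across the six 12 V power pins of a 12VHPWR-style connector, which real connectors do not guarantee):

```python
# I = P / V: per-pin current for a 12VHPWR-style connector,
# assuming the load splits perfectly evenly across its 6 power pins
# (an idealization; real connectors do not guarantee this).
PINS = 6

def per_pin_amps(power_w: float, volts: float) -> float:
    return power_w / volts / PINS

print(per_pin_amps(600, 12))  # ~8.3 A per pin: 600 W over one 12 V cable
print(per_pin_amps(350, 12))  # ~4.9 A per pin: the suggested 2x350 W split
print(per_pin_amps(600, 24))  # ~4.2 A per pin: the same 600 W at 24 V
```

Those pins are reportedly rated at roughly 9-10 A each, so 600 W at 12 V leaves very little margin the moment the split is anything less than perfect, while either the 2-cable split or a 24 V rail would roughly halve the per-pin load.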

12VHPWR on RTX 5090 is Extremely Concerning:
https://www.youtube.com/watch?v=Ndmoi1s0ZaY

Sure, they made a cable revision and even changed the name, but in the end the only true change was the shorter sense pins... so in theory the connector should be quicker to detect a lack of contact, I assume.

The most important thing is not even multi-rail or whatever else... simply a PSU sufficiently powerful at any load, and a cable able to withstand the high current.

Surely challenging, because that is the kind of power I would almost cook my water with... and that usually runs at 220 up to 400 volts.

I really hope that MSI creates a 2-cable design for their most "top notch" model... it seems the safer route.
Post edited February 11, 2025 by Xeshra
Awesome...

Another card no one will be able to buy for 2 years just so my electric bill can go up another $100 a month so some server farm can mine fake money that no one can use but somehow makes exactly 4 guys rich beyond belief.

Well, at least the massive corporate data collection structure will be able to steal even MORE of my personal information and lose it even quicker!

Thank God I get a single piece of paper in the mail letting me know they're going to do exactly NOTHING about it!
avatar
Ixamyakxim: Awesome...

Another card no one will be able to buy for 2 years just so my electric bill can go up another $100 a month so some server farm can mine fake money that no one can use but somehow makes exactly 4 guys rich beyond belief.

Well, at least the massive corporate data collection structure will be able to steal even MORE of my personal information and lose it even quicker!

Thank God I get a single piece of paper in the mail letting me know they're going to do exactly NOTHING about it!
Don't forget to mention the GAMES!

I can't wait to look back in disgust from prioritizing capacity rather than style.
I CAN'T HEAR YOU FROM ALL THE TRIANGLES BEING RENDERED ON SCREEN RIGHT NOW.
avatar
Ixamyakxim: Awesome...

Another card no one will be able to buy for 2 years just so my electric bill can go up another $100 a month so some server farm can mine fake money that no one can use but somehow makes exactly 4 guys rich beyond belief.

Well, at least the massive corporate data collection structure will be able to steal even MORE of my personal information and lose it even quicker!

Thank God I get a single piece of paper in the mail letting me know they're going to do exactly NOTHING about it!
avatar
.erercott: Don't forget to mention the GAMES!

I can't wait to look back in disgust from prioritizing capacity rather than style.
I CAN'T HEAR YOU FROM ALL THE TRIANGLES BEING RENDERED ON SCREEN RIGHT NOW.
People still use these to play games? Weird! Hehe

This bolded line is hilarious!
avatar
Xeshra: It seems like the "melted cable issue" is not gone yet... and even people who care a lot may still be running into it.

In my mind, 600 W is just too much power going through a single cable at 12 V... they should use 2 cables with 350 W each. I surely do not feel safe with so much power at 12 V. Ideally... 24 V or more would be the way to go... or simply several cables.

12VHPWR on RTX 5090 is Extremely Concerning:
https://www.youtube.com/watch?v=Ndmoi1s0ZaY

Sure, they made a cable revision and even changed the name, but in the end the only true change was the shorter sense pins... so in theory the connector should be quicker to detect a lack of contact, I assume.

The most important thing is not even multi-rail or whatever else... simply a PSU sufficiently powerful at any load, and a cable able to withstand the high current.

Surely challenging, because that is the kind of power I would almost cook my water with... and that usually runs at 220 up to 400 volts.

I really hope that MSI creates a 2-cable design for their most "top notch" model... it seems the safer route.
RTFM still counts, I guess
Did you even watch the video? Guess not...

The thermal camera was showing over 100 °C on many of the cables; this is so damn hot... your finger would get burned in probably less than 10 seconds. Imagine how much worse it could get over time if it keeps heating up... with increased temperature of ANY part, the conductivity goes down... until at some point there is a sudden rise in resistance... which produces a "domino effect", and within rather few seconds it can heat up to hundreds of °C, to the point where stuff is melting and burning.

However... according to my logic... in 99.9% of cases the cable will simply "hold" the temperature at a steady 100-150 °C without failing... until... something gets a little bit too weak from overheating all the time... that is the moment stuff can go boom pretty quickly.

So, I guess almost everyone has cables which are "too hot", probably even me at full GPU load... however... I may still be safe on my 3090 Ti because, firstly, it does not use more than 450 W even at full load, and secondly... I usually use Vsync... so my 3090 Ti usually stays below 300 W, which means my comparable 12VHPWR cable only needs to handle about "half the current", which probably keeps my cable below 100 °C... and that is indeed considered safe... even for long-term use.

The fact is simply... the whole design is at its very limit, and instead of trying to downplay the matter all the time, those involved should revise the entire design and not allow the old cable design to carry more than 300 W on a "single cable": that means 300 W per slot on the GPU side and 150 W per slot on the PSU side, and of course perfectly distributed (which is probably not the case). It simply cannot handle it in an absolutely safe way... I dunno how many times stuff has to run very hot until they realize this flaw.

Nope... the heat will build up just about everywhere, even directly at the pins... the only parts not directly affected are inside the PSU or on the PCB, because those parts get far better "thermal cooling", so it may never go above 100 °C there.

However, as soon as the conductivity is busted, somewhere around 200 or 300 °C I predict... the domino effect will kick in (lower conductivity... more heat... even lower conductivity... even more heat... a pretty fatal chain reaction), and at that point, if the OCP is not quick enough, and not even the short-circuit protection... then many parts can become irreversibly damaged. This is still not the maximum damage possible, because the very worst case... which probably has not happened yet (there are just far too few owners of such cards)... is that the entire PC may start to burn, causing a fire... the fire will spread and in the end an entire house may burn down. This worst-case scenario is surely almost impossible, because usually... at the required load... the PC owner is nearby and will notice "the burning smell" and... shut down the PC... which has always been the case up to this point.
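Just to make that chain reaction concrete: it is a feedback loop in which contact resistance rises with temperature, and I²R heating rises with resistance. A toy model in Python (every value here is an illustrative assumption, not a measurement of any real connector):

```python
# Toy model of the "domino effect": contact resistance rises with
# temperature, which raises I^2*R heating, which raises the temperature.
ALPHA = 0.004    # per-degree-C resistance increase (copper-like)
R0 = 0.010       # ohm, assumed contact resistance at ambient
THETA = 30.0     # C per watt, assumed thermal resistance to ambient
T_AMB = 25.0     # C
I = 8.3          # A, one pin's share of 600 W at 12 V

t = T_AMB
for step in range(200):
    r = R0 * (1 + ALPHA * (t - T_AMB))  # resistance at current temperature
    p = I * I * r                       # heat dissipated in the contact
    t_next = T_AMB + p * THETA          # equilibrium temperature for that heat
    if abs(t_next - t) < 0.01:
        print(f"settles near {t_next:.0f} C")  # stable (if warm) hot spot
        break
    t = t_next
else:
    print("no equilibrium within 200 steps: thermal runaway")
```

With the assumed healthy ~10 mΩ contact, the loop settles at a warm but stable temperature; raise R0 to something like 0.15 Ω (a degraded contact) and the loop gain exceeds 1, so the iteration never settles... which is the runaway case described above.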

Guess I do recommend a temperature sensor at the cables and the pins...

Ah yes, and I may even predict why the OCP or short-circuit protection is not sufficiently quick:

1. The malfunction is not happening INSIDE the PCB or INSIDE the PSU... so the failure is somewhat "externalized", with a weaker signature.

2. The overcurrent or short circuit is usually not perfectly spread across every single cable... instead it usually starts by affecting 1... maybe 2 or 3 cables... and the protection is not attached to every single cable but to the entire bundle instead.

There are only 2 good solutions: either put a protection on every single cable (meaning dozens of rails, very expensive) or simply spread the load across way more cables and connectors. Although this only works well if the power is perfectly distributed... if not, it may still cause "single cable overload" situations.
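Point 2 can be sketched quickly: parallel pins form a current divider, so current flows where resistance is lowest, and a bulk limit watching only the total never notices one overloaded pin (the numbers here are again assumptions):

```python
# Six 12 V pins in parallel form a current divider: each pin carries
# current in proportion to its conductance (1/R). One unusually good
# (low-resistance) contact hogs the current, while a bulk over-current
# limit that only watches the 50 A total sees nothing wrong.
resistances = [0.010] * 5 + [0.002]  # ohms; last pin has the best contact
I_TOTAL = 50.0                       # amps total, e.g. 600 W at 12 V

g = [1 / r for r in resistances]
currents = [I_TOTAL * gi / sum(g) for gi in g]
print([round(c, 1) for c in currents])
# -> [5.0, 5.0, 5.0, 5.0, 5.0, 25.0]: one pin carries 25 A, five times
#    what each of its neighbours carries, yet the total still looks normal.
```

Per-cable sensing would catch the 25 A outlier; a single bundle-level limit cannot.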

However... what I can say is: using another cable may not help at all, because the design simply has clear boundaries. Albeit... it is important not to mix different platings with each other... and under optimal circumstances we have gold plating everywhere (male and female pins).
Post edited February 13, 2025 by Xeshra
avatar
Xeshra: In my mind, 600 W is just too much power going through a single cable at 12 V... they should use 2 cables with 350 W each
To be honest, 600 W space heaters in the average Joe's PC were a bad idea anyway. Even if the cable does work properly, and even if the case can expel that heat, it's not even pleasant playing games in the middle of a 35-40 °C summer heatwave with a 600 W heater turned on, on top of everything. Same thing.
Guess I have to agree... 600 W is just over the top... even for me. I think my personal limit is 450 W, which is my current card's limit. Sure... I usually use Vsync, so I may not exceed 2/3 of the total headroom (which is 400 W on a 600 W GPU, or 300 W on a 450 W GPU), but in general... 300 W with Vsync is more than enough.
The issue is simply that they did not raise the rasterization efficiency one little bit, because of the "AI industry"... all they have cared about for the past 2 years is AI, this is the hard truth. So we got kind of a "4090 Ti Super" that has more performance, but unfortunately I would not consider it safe anymore... outside a server room with 24/7 excessive cooling and a lake of cables everywhere.

Nonetheless... I cannot "step down" to a 4090 anymore, because those cards are now selling for crazy prices... even on the used market. So I will simply have to deal with 50% less performance for some more years to come... which is fine at 1080p, but not at 4K for modern games.

All the other cards perform barely or only marginally better than the card I already have... so not much use.

Sure, you can still power-limit a 5090 to 450 W (minus 30%, about 1/3, is the limit Nvidia seems to allow for lowering the TDP, yet the card may still run at 80-90% performance, they say) "to be safe"... it surely works... but the price will be insane and the performance "only" approximately 30% above a 4090 running at the same consumption.

We simply got a very big cost increase but, in comparison to that cost, very low gains.

In my mind, AMD had a real chance of competing with Nvidia in rasterization and RT, even with a flagship model... yet AMD simply did not care anymore... and is leaving this market to Nvidia, and even Nvidia does not really care anymore... the situation is a mess.
Post edited February 11, 2025 by Xeshra
avatar
Xeshra: Did you even watch the video? Guess not...

.

However... according to my logic... in 99.9% of cases the cable will simply "hold" the temperature at a steady 100-150 °C without failing... until... something gets a little bit too weak from overheating all the time... that is the moment stuff can go boom pretty quickly.

So, I guess almost everyone has cables which are "too hot", probably even me at full GPU load... however... I may still be safe on my 3090 Ti because, firstly, it does not use more than 450 W even at full load, and secondly... I usually use Vsync... so my 3090 Ti usually stays below 300 W, which means my comparable 12VHPWR cable only needs to handle about "half the current", which probably keeps my cable below 100 °C... and that is indeed considered safe... even for long-term use.

The fact is simply... the whole design is at its very limit, and instead of trying to downplay the matter all the time, those involved should revise the entire design and not allow the old cable design to carry more than 300 W on a "single cable": that means 300 W per slot on the GPU side and 150 W per slot on the PSU side, and of course perfectly distributed (which is probably not the case). It simply cannot handle it in an absolutely safe way... I dunno how many times stuff has to run very hot until they realize this flaw.

However... what I can say is: using another cable may not help at all, because the design simply has clear boundaries. Albeit... it is important not to mix different platings with each other... and under optimal circumstances we have gold plating everywhere (male and female pins).
Still the most arrogant person in the world, who thinks proper forum addressing policies are for other people.

I read the article

Short summary:

Oh my god, there was a slight deviation in an otherwise perfectly working device or support...

The user had absolutely nothing to do with this...

I think people should just limit their new toys until they know for certain what's up.