After reviewing the latest NVIDIA update, I found the mention of updated protocols for their auto-tuning software encouraging enough to try it out and see how, and if, I would benefit from such a tool. I also decided to check out their new OSD feature.

After deciding on Warhammer as a first testbed, it didn't take long to find a staggering 80 W difference in consumption between using NVIDIA's overlay and not using it.

To my mind, this is not what you want to aim for when your lower-tier customers are running cards that draw 100 to 200 W. That is around 50% of your power consumption aimed at keeping that background program fed. I can imagine even 2080 Ti users with 300 W cards raising their eyebrows.

Now, I'm not sure whether the program's consumption will go down after a few minutes, or perhaps decrease in heavier usage scenarios, but even then, is this something the conscious consumer wants to be confronted with?! Especially since the overlay is enabled by default. And we all know how many conscious users are out there!

So, in my view, this is another topic that lines up nicely with the ongoing complaints against NVIDIA.
NVIDIA did it again; the JayzTwoCents episode that aired yesterday had a nice comparison going, likening NVIDIA's recent actions to that schoolkid or friend, the clueless, hopelessly nervous extravert type known to get into all sorts of trouble.

And in case you were wondering what went wrong this time: NVIDIA refuses to respond to accusations of announcing a release date for a card while there isn't even stock to show. Fans who set up camp outside shops only to receive "no" as an answer the next morning when the store opened its doors, eBay scalpers, you know the drill.

Attached are two screenshots in 2K, taken right after the game loaded.
Attachments:
nvidia.jpg (493 Kb)
Post edited June 26, 2021 by Zimerius
Hmm, hope AMD is better.

Overlays are pretty lame.
Post edited June 26, 2021 by Orkhepaj
Orkhepaj: Hmm, hope AMD is better.

Overlays are pretty lame.
Putting my hope on Intel; let's see what their new line of cards will bring with the next generation.
Orkhepaj: Hmm, hope AMD is better.

Overlays are pretty lame.
Zimerius: Putting my hope on Intel; let's see what their new line of cards will bring with the next generation.
Did they start making GPUs?
Orkhepaj: Did they start making GPUs?
Yeah, the first model is the Intel Xe DG1, although it's only slightly better than the integrated graphics from an AMD APU.

However, early benchmarks from prototypes of their next model, Xe HPG, show it performing around the level of an NVIDIA RTX 3070. It's supposed to be released in 2022. While it won't compete with AMD's or NVIDIA's absolute top of the line, if Intel can price the cards aggressively they could carve out a niche for themselves as a good value for non-4K gaming.
Post edited June 26, 2021 by Ryan333
I really don't know what I'm supposed to be looking at. Beyond the fact that the GPU is down-clocking in the latter pic (1350 MHz = 73.5 W) but not in the former (1980 MHz = 150 W) while neither is under load, it could be anything, including a changed Power Management Mode. Also, taking screen captures at two different and very odd resolutions (2253x1267 vs 2176x1224) obviously makes it harder to cross-compare.
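
If you do want an apples-to-apples comparison, it helps to log both runs the same way instead of grabbing a single screenshot each. A minimal sketch of what I mean, assuming nvidia-smi is on the PATH and a single GPU (the field list, sample count and output file name are just placeholder choices):

    # Log GPU clock, power draw, utilisation and P-state once per second,
    # so an overlay-on run and an overlay-off run can be compared afterwards.
    import csv
    import subprocess
    import time

    FIELDS = "clocks.gr,power.draw,utilization.gpu,pstate"

    def sample():
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        return [value.strip() for value in out.split(",")]

    with open("gpu_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS.split(","))
        for _ in range(300):  # roughly five minutes at one sample per second
            writer.writerow(sample())
            time.sleep(1)

Run it once with the overlay enabled and once with it disabled, in the same in-game scene, and compare the averages rather than a single moment.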
Orkhepaj: Did they start making GPUs?
Ryan333: Yeah, the first model is the Intel Xe DG1, although it's only slightly better than the integrated graphics from an AMD APU.

However, early benchmarks from prototypes of their next model, Xe HPG, show it performing around the level of an NVIDIA RTX 3070. It's supposed to be released in 2022. While it won't compete with AMD's or NVIDIA's absolute top of the line, if Intel can price the cards aggressively they could carve out a niche for themselves as a good value for non-4K gaming.
Cool, more competition is better.
Hope they will have a good value/$ ratio.
BrianSim: I really don't know what I'm supposed to be looking at. Beyond the fact that the GPU is down-clocking in the latter pic (1350 MHz = 73.5 W) but not in the former (1980 MHz = 150 W) while neither is under load, it could be anything, including a changed Power Management Mode. Also, taking screen captures at two different and very odd resolutions (2253x1267 vs 2176x1224) obviously makes it harder to cross-compare.
Fair point
After receiving some criticism of the chosen method, I decided to put a bit more effort in.

I started a couple of games, with the overlay enabled and disabled.

Findings so far: the on-screen display function from NVIDIA seems to be the culprit at the moment.

I did not expect that; the last thing you'd expect from an OSD function is that it actually influences your play.

So, with the NVIDIA overlay enabled, consumption seems to be around the same as with no overlay, but their OSD function increases consumption in different games by around 60 to 80 W.

Silly them ;p
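
To put a rough number on that 60 to 80 W rather than eyeballing single screenshots, the idea is simply to average the logged power draw per configuration and look at the delta. A sketch, under the assumption that two CSV logs exist in the shape of the logging example a few posts up (the file names and the column position of power.draw are made up for illustration):

    # Compare average power draw between two logged runs.
    import csv

    def avg_power(path, power_col=1):  # power.draw assumed to be the second column
        with open(path, newline="") as f:
            rows = list(csv.reader(f))[1:]  # skip the header row
        watts = [float(row[power_col]) for row in rows]
        return sum(watts) / len(watts)

    base = avg_power("overlay_off.csv")
    osd = avg_power("osd_on.csv")
    print(f"overlay off: {base:.1f} W, OSD on: {osd:.1f} W, delta: {osd - base:.1f} W")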

Added some other screenshots from Mortal Shell.
Attachments:
I bought an RTX 2070 Super two years ago, and it will also be the last NVIDIA card I buy.

Privacy is a thing (for example, an active account required for GeForce Experience? WTF is that requested for?). I have disabled countless logs and services that they keep launching (just search for "telemetry").

I'm sick of NVIDIA's overpriced products that give little to nothing back.
Zimerius: So, with the NVIDIA overlay enabled, consumption seems to be around the same as with no overlay, but their OSD function increases consumption in different games by around 60 to 80 W.
It's possible that it could be increasing the load slightly. To be honest, though, I don't understand the point of running two overlays at the same time (MSI Afterburner / RivaTuner plus NVIDIA's on top of that). I just install the driver itself (without GeForce Experience or any overlays), use MSI AB on its own, and don't seem to have any issues.
Zimerius: So, with the NVIDIA overlay enabled, consumption seems to be around the same as with no overlay, but their OSD function increases consumption in different games by around 60 to 80 W.
BrianSim: It's possible that it could be increasing the load slightly. To be honest, though, I don't understand the point of running two overlays at the same time (MSI Afterburner / RivaTuner plus NVIDIA's on top of that). I just install the driver itself (without GeForce Experience or any overlays), use MSI AB on its own, and don't seem to have any issues.
I don't see using almost 50% of the GPU's stated power draw as "slightly". Maybe you need to adjust your own sense of scale, and don't focus on why I am running two separate instances to measure something, when in fact there could be a lot more to it than just the OSD aspect.
The TDP of NVIDIA cards increased dramatically with the GeForce 30 generation. This comes partly from the switch to GDDR6X, and partly from NVIDIA upclocking the processor by about 10% to edge out gains over AMD cards. That 10% clock increase equates to about 80-100 watts of additional TDP. If you downclock your GPU by 10%, you will save that many watts.

You can see this reflected in the TDP of a Quadro-class card compared to its GeForce equivalent: the RTX A4000 is rated for a 140 W TDP and has 6144 CUDA cores, whereas the 3070 Ti also has 6144 CUDA cores but a TDP of 290 watts.
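
For anyone who wants to try the downclocking idea above, here is a rough sketch of one way to cap the clock from a script, assuming a reasonably recent driver and admin rights (the clock values are made-up examples for illustration, not tuned recommendations):

    # Cap the graphics clock roughly 10% below an observed ~1980 MHz boost clock,
    # using nvidia-smi's lock-gpu-clocks (-lgc) option.
    import subprocess

    MAX_CLOCK_MHZ = 1780  # hypothetical example value, not a recommendation
    subprocess.run(["nvidia-smi", "-lgc", f"300,{MAX_CLOCK_MHZ}"], check=True)

    # To restore the default behaviour afterwards (reset-gpu-clocks):
    # subprocess.run(["nvidia-smi", "-rgc"], check=True)

Lowering the board power limit instead (nvidia-smi -pl <watts>) gets a similar effect without having to pick clock values by hand.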
Samweisse: The TDP of NVIDIA cards increased dramatically with the GeForce 30 generation. This comes partly from the switch to GDDR6X, and partly from NVIDIA upclocking the processor by about 10% to edge out gains over AMD cards. That 10% clock increase equates to about 80-100 watts of additional TDP. If you downclock your GPU by 10%, you will save that many watts.

You can see this reflected in the TDP of a Quadro-class card compared to its GeForce equivalent: the RTX A4000 is rated for a 140 W TDP and has 6144 CUDA cores, whereas the 3070 Ti also has 6144 CUDA cores but a TDP of 290 watts.
Saaaam