[Tested] ASUS ENGTX580 1536MB at Geeks3D Labs

ASUS ENGTX580 Review Index

7 – ASUS ENGTX580 Power consumption and Overclocking

ASUS ENGTX580, FurMark 1.8.2

For the power consumption, I used FurMark 1.8.2 and did a quick test with the upcoming FurMark 1.9.0. The graphics workload in FurMark 1.9.0 has been slightly increased, leading to higher power consumption (a few watts more).

The total power consumption of my testbed in idle is 117W.

The GTX 580 comes with a power draw limiter (see here: GeForce GTX 580 Power Monitoring Details). To bypass the GTX 580 power draw limiter, I used a special version of GPU-Z (see here: GeForce GTX 580 Unlocked: 350W Under FurMark Thanks To GPU-Z).

When the power draw limiter is enabled, the total power consumption of the system is 304W (FurMark settings: 1920×1080, fullscreen, Xtreme burn-in, no AA, no postfx). When the power draw limiter is disabled, the total power consumption reaches 439W for a max GPU temperature of 90°C (the ENGTX580 is used with its default clocks).

We can then calculate the power consumption of the ASUS ENGTX580. The Corsair AX1200 PSU has an efficiency factor of around 0.9 (see this article, which includes a graph of the AX1200's efficiency).
P = (439-117) * 0.9
P = 290 watts
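The formula above can be wrapped in a tiny helper (a minimal sketch; the 0.9 efficiency factor is the approximate AX1200 figure mentioned above):

```python
def card_power(total_w, idle_w, psu_efficiency=0.9):
    """Estimate the card's power draw from wall measurements:
    subtract the system's idle draw, then scale by PSU efficiency."""
    return (total_w - idle_w) * psu_efficiency

# 439 W at the wall under FurMark, 117 W idle -> ~290 W for the card
print(round(card_power(439, 117)))  # 290
```

The same helper applied to the limiter-enabled reading (304 W at the wall) gives roughly 168 W for the card, which shows how aggressively the power draw limiter clamps FurMark.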

I tested ASUS SmartDoctor to overclock the ENGTX580. I quickly modified the VDDC (or Vcore or GPU core voltage) as well as the GPU core clock to try some overclocking settings:

  • GPU core: 820MHz, Vcore=1.068V. Total power consumption: 450W (GTX580 alone: 299W), GPU temp: 90°C, GPU current: 96A and GPU power draw: 96W
  • GPU core: 851MHz, Vcore=1.075V. Total power consumption: 468W (GTX580 alone: 316W), GPU temp: 91°C, GPU current: 100A and GPU power draw: 101W
  • GPU core: 871MHz, Vcore=1.088V. Total power consumption: 484W (GTX580 alone: 330W), GPU temp: 92°C, GPU current: 104A and GPU power draw: 106W
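As a quick sanity check on GPU-Z's readings, GPU power should be roughly current × core voltage. A minimal Python sketch with the values above (the products come out a few watts above the reported power figures, presumably because the sensed voltage under load droops slightly below the Vcore set in SmartDoctor):

```python
# Cross-check the GPU power readings: P ≈ Vcore × current.
# Tuples are (core clock MHz, Vcore in volts, GPU current in amps)
# as read during the three overclocking runs.
runs = [
    (820, 1.068, 96),
    (851, 1.075, 100),
    (871, 1.088, 104),
]
for clock_mhz, vcore_v, current_a in runs:
    power_w = vcore_v * current_a
    print(f"{clock_mhz} MHz: {power_w:.1f} W")
```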

ASUS SmartDoctor

ASUS ENGTX580 - GPU core clock=871MHz
ASUS ENGTX580 stressed by FurMark 1.8.2, res 1920×1080, no AA, no post FX, Xtreme burn-in mode

To reproduce such an overclocking test, I recommend quality hardware, because we exceed the electrical specifications of the system. In my case: a GIGABYTE A-UD5 motherboard and a high-end PSU, Corsair's AX1200!

As you can see, with high-quality products, stressing an overclocked GTX 580 with FurMark is not a problem.

Power: 330W – ASUS ENGTX580 (core: 871MHz, Vcore:1.088V)
Power: 290W – ASUS ENGTX580 (default settings)
Power: 272W – EVGA GTX 480
Power: 185W – ATI Radeon HD 5870
Power: 147W – ASUS EAH6870
Power: 135W – MSI N460GTX Cyclone 768D5 OC

The max total power consumption of the ENGTX580 is 330W while the GPU core alone draws 106W. The difference, 330 − 106 = 224W, is the power consumption of the rest of the board (especially the VRM).
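The same arithmetic splits the card's draw between the GPU die and the rest of the board (the attribution to the VRM is the article's; GPU-Z does not report it directly):

```python
total_card_w = 330  # max card power, computed from wall measurements
gpu_w = 106         # GPU power reported by GPU-Z at 871 MHz

board_w = total_card_w - gpu_w
print(board_w)  # 224

# Share of the card's total draw taken by the rest of the board
print(f"{board_w / total_card_w:.0%}")  # 68%
```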

Just for fun, I did a test with a non-blacklisted FurMark (1920×1080 fullscreen, Xtreme burn-in disabled, no AA, no post-FX) with OCP enabled and then with OCP disabled (thanks to GPU-Z):

ASUS ENGTX580 + OCP enabled + non-blacklisted FurMark

ASUS ENGTX580 + OCP disabled + non-blacklisted FurMark

We can clearly see the GPU throttling in action in the first picture, while the second shows a more conventional temperature curve. In Xtreme mode, when OCP is enabled, the GPU is permanently throttled back, while this is not the case when Xtreme mode is not used.

9 thoughts on “[Tested] ASUS ENGTX580 1536MB at Geeks3D Labs”

  1. Psolord

    Great stuff indeed! Awesome test JeGX! 😉

    Some minor mistakes in the parts where you mention ASUS SmartDoctor and in 4.5, DirectX SDK.

    All this tessellation is nice and cool, but the problem is that of all these benchmarks you can only actually play LP2 and HAWX, and HAWX runs pretty nicely everywhere anyway.

    Personally I have my eyes on the 570 or 69XX, since the 580 is too expensive for my taste and I am aiming for SLI anyway; I am upgrading from 5850 CrossFire and just one card wouldn't cut it. I will be CPU limited anyway even with the 570s, as were a couple of the tests in this review.

    PS Aquamark FTW yay! 😀

  2. Farzid khan

    wth!! wait till the HD 6970 launches
    bakre ki amma kab tak khair manayegi (Hindi proverb: "how long can the goat's mother stay safe?", i.e. the reckoning will come eventually)

  3. Krian

    Excellent material in this review, but as mentioned by Psolord only two of the tests are playable games, and we would like to see some heavier game testing, e.g. Crysis, Metro 2033, Call of Duty: Black Ops, Supreme Commander 2 and so on.

  4. JeGX Post Author

    @Krian, @Psolord: I'll try to improve my next review with some games. Thanks for the feedback!

  5. Sachin

    The thing is, the GTX 580 is the overall best video card ever made, giving you an awesome experience of PhysX and DX11 tessellation

    All hail Nvidia

    mighty GTX 580

    ATI can never overtake the GOD (Nvidia)

  6. Psolord


    In a word, I agree about the 580 being the best card right now. The problem is that it is still slower than the 5970, and that's a one-year-old card, not to mention that it's God-awful expensive.

    Also there are solutions that are both cheaper and faster right now, like 6870 CFX or GTX 460 SLI.

    Now, using expressions like "all hail Nvidia" and such quickly demotes you to the fanboy realm.

  7. Pingback: Multipage Post Test: ASUS GTX 580 | JeGX's HackLAB

  8. Blackice504

    The problem is that real-life gaming and benchmarks are really two different things. Nvidia cards might seem slower in a benchmark, but what people really need to remember is that games, as far back as I can remember (even that crazy DOS version of GTA), are frame limited: all games are capped at 60fps max, some are still at 30fps, and a DVD is 25fps. Then you must remember that ATI (or AMD, whatever you want to call them) does not work with every title; some games need PhysX or some other instruction set that only Nvidia can do. Nvidia offers compatibility
    with every game and every application. That being said, there are many games you will get better performance from because they wrote the game with Nvidia in mind, or they need instruction sets that only Nvidia cards have. The hardware fact is they last far longer than any ATI card; I have a TNT card that still works and I use it in a media server, while for ATI cards from back then, no chance.

Comments are closed.