Sunday, 8 April 2018

Relation between GPU utilization and a graphics card's power consumption


I wonder what the relation is between GPU utilization and a graphics card's power consumption.


E.g., in the screenshot below, GPU 2's utilization is 92%, while its power usage is 129 watts out of 250. Why isn't the power usage around 250 × 0.92 = 230 watts?


[Screenshot of a GPU monitoring tool showing GPU 2 at 92% utilization and a power draw of 129 W out of a 250 W limit]



Answer



The utilization figure tells you how much more of the same computation could still be scheduled, not what fraction of the chip's total processing capability that computation is using.


For example, your 92% means that, on average, the GPU was doing something during 920,000 out of every 1,000,000 clock cycles. It doesn't mean that 92% of every single circuit in every single shader processor was active, let alone 92% of every single circuit on the whole board (the VRAM controller, the DAC, the shader and raster units, the branch predictors, the texture lookup units, and so on).
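
You can watch the two numbers drift apart yourself. The following is a minimal sketch using the pynvml bindings (from the nvidia-ml-py package); it assumes an NVIDIA card with a recent driver, and the device index 0 is an arbitrary choice. It prints the busy percentage next to the power draw, expressed as a fraction of the enforced power limit:

import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetUtilizationRates, nvmlDeviceGetPowerUsage,
    nvmlDeviceGetEnforcedPowerLimit,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # GPU 0; adjust for your card
    limit_w = nvmlDeviceGetEnforcedPowerLimit(handle) / 1000  # mW -> W
    for _ in range(10):
        util = nvmlDeviceGetUtilizationRates(handle).gpu   # % of time busy
        power_w = nvmlDeviceGetPowerUsage(handle) / 1000   # mW -> W
        print(f"busy: {util:3d}%   power: {power_w:6.1f} W "
              f"({100 * power_w / limit_w:5.1f}% of {limit_w:.0f} W limit)")
        time.sleep(1)
finally:
    nvmlShutdown()

The same two columns can also be watched without any code via nvidia-smi --query-gpu=utilization.gpu,power.draw,power.limit --format=csv -l 1.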


If your workload only takes advantage of a few GPU features, you might well run at 100% of the throughput of those features while leaving half the chip asleep, and idle circuitry draws far less power than active circuitry, which is why the power draw stays well below the card's limit. But the half that's asleep couldn't be used for that type of work at all.
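
To see the effect, you can run a bandwidth-bound phase and a compute-bound phase back to back while a monitor like the sketch above (or nvidia-smi) is open. This is only an illustrative sketch and assumes the cupy package plus enough free VRAM for a few 8192×8192 matrices; both phases should report utilization near 100%, yet the matrix multiplies typically draw noticeably more power:

import cupy as cp

a = cp.random.rand(8192, 8192, dtype=cp.float32)
b = cp.random.rand(8192, 8192, dtype=cp.float32)

print("phase 1: memory-bound (large copies)")
for _ in range(200):
    c = a.copy()          # mostly exercises the memory subsystem
cp.cuda.Device().synchronize()

print("phase 2: compute-bound (matrix multiplies)")
for _ in range(50):
    c = a @ b             # keeps the arithmetic units busy
cp.cuda.Device().synchronize()

Both loops keep the GPU continuously busy, so the utilization counter saturates in each phase; the difference shows up only in the power column.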

