bit-tech.net

What does TDP mean, Nvidia?

Posted on 11th Nov 2010 at 16:09 by Clive Webster with 53 comments

TDP is typically defined as Thermal Design Power, the amount of power (heat) that a cooler must dissipate in order to keep a silicon chip within its operating temperatures. While Intel and AMD disagree as to what test to run to measure this, both agree that it’s a measurement of waste heat output from a chip.
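
As a rough illustration of how that definition feeds into cooler design, the sketch below estimates the thermal resistance a heatsink would need for a given TDP; the temperature limits are assumed, illustrative values rather than anything from a vendor datasheet.

```python
# Back-of-envelope cooler sizing from a TDP figure.
# All numbers are illustrative assumptions, not vendor specs.

tdp_watts = 140.0     # heat the cooler must dissipate (W)
t_case_max = 68.0     # assumed maximum allowed case temperature (deg C)
t_ambient = 35.0      # assumed air temperature at the heatsink (deg C)

# The cooler's total thermal resistance (deg C per watt) must not exceed:
theta_required = (t_case_max - t_ambient) / tdp_watts
print(f"Required cooler thermal resistance: {theta_required:.2f} C/W")
# Roughly 0.24 C/W here; a higher TDP or a hotter case demands a lower (better) figure.
```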

Reading through the Nvidia GeForce GTX 580 1.5GB review guide threw up a new definition, however: ‘TDP is a measure of maximum power draw over time in real world applications.’ Even if Nvidia’s definition is the correct one, I have to wonder why it wanted to use the term in this way at all.

The use of the term TDP will inevitably lead to comparisons between the power consumption of Nvidia’s graphics cards and the heat output of CPUs and GPUs. By defining TDP as an input – and, in fact, as the maximum power draw of the entire card, rather than just the GPU – Nvidia is making its product look terrible by comparison. A 6-core LGA1366 Core i7-980X Extreme Edition has a TDP of 140W, likewise an Athlon II X6 1090T Black Edition, but with a ‘TDP’ of 255W Nvidia is making the GTX 580 1.5GB seem to be twice as hot or twice as power-hungry as those CPUs.

And for what? Detractors and the flippant will say that all this power is merely being used to play some games that can run just as well on an Xbox – what a waste of valuable resources!

Of course, you and I know better. For a start, some of the best games around require fast, modern PC hardware and many aren’t even available for consoles. More pertinently, we know that TDP should be a measurement of waste energy (heat, unless something goes drastically wrong) and not, as Nvidia says, a measurement of maximum input power.

Lars Weinand, Senior Technical Marketing Manager, EMEA, told us that, ‘The problem with TDP is there is no “standard” for this. So everyone is measuring TDP in a different way and TDPs are only really comparable within the same manufacturer… We are using TDP in a way that makes most sense for us.’ Even Lars ultimately suggests that we put more faith in our power consumption tests than a number written on a spec sheet.

But for all that, why would Nvidia invite the kind of criticism and confusion I’ve briefly outlined? It’s not as if a maximum board power of 255W is especially high for a top-end card – others have hit 293W and no-one has complained. Nvidia’s definition of what it means by TDP may be sufficiently clear that I don’t care which term it uses, but Nvidia’s marketing department really should.

53 Comments

Snips 11th November 2010, 16:29 Quote
You can't open up a can of worms purely based on a marketing department. ALL the manufacturers do it, Nvidia, Intel, AMD, Ann Summers claiming their products do this or do that. Real world benchtests can prove otherwise so let's just keep it at that. I constantly moan about the AMD marketing, something I'm serving time for, but I went down that trap of believing the pap and then seeing the crap. Be warned, you gonna get it bigtime Clive.
wuyanxu 11th November 2010, 16:34 Quote
perhaps it is the power draw for the actual GPU, rather than the 293w measured for the whole card?

but agree with article, why did nvidia choose to use power consumption rather than heat output? it doesn't exactly make them look good, only adds more to confusion
Picarro 11th November 2010, 16:47 Quote
Unless they went for the "MOAR NUMBERS IS MOAR!!1!!!!1" tactic .. Wouldn't really surprise me
greigaitken 11th November 2010, 16:50 Quote
besides the 1/2w for sound - surely ALL the power consumed is turned to heat? surely it's better to know how much power the device will consume so that you know how much you will need to supply it. which should also be the same as how much heat it produces. Where else would the energy go?
barrkel 11th November 2010, 16:51 Quote
Thanks to the magic of conservation of energy, the power used by solid-state hardware (no moving parts) is all but equal to the amount of heat it needs to dissipate.
mclean007 11th November 2010, 17:01 Quote
This is kind of an odd blog post - essentially all of the power input of a computer component is output as waste heat. This isn't some kind of physical machine where some of the power is actually used to move something from A to B or to light a room or whatever. A CPU or GPU which uses 100W outputs 100W of heat. The logic (work) done by the chip comes from the entropy gap between (low entropy) electricity and (high entropy) low level waste heat. So to draw a distinction between measuring TDP by power input and by heat output is pure semantics - the figure is the same. Okay so Nvidia may be selling themselves short by using the TDP for the entire card rather than just the GPU, but ultimately this is a less misleading measure - what users really care about is (a) how much electricity does it use (i.e. how much is it going to cost me in electricity bills); and (b) how much heat will it dump into my case (or room, if it exhausts to the rear as many GPUs now do). The two things are the same.
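
To put the electricity-bill half of that in numbers, here is a quick sketch; the board power, gaming hours and tariff are placeholder assumptions rather than figures from the article.

```python
# Rough monthly electricity cost for a graphics card under load.
# The draw, hours and tariff below are illustrative assumptions.

board_power_w = 255.0    # assumed sustained draw while gaming (W)
hours_per_day = 3.0      # assumed gaming time per day
pence_per_kwh = 12.0     # assumed 2010-era UK tariff (pence per kWh)

kwh_per_month = board_power_w / 1000.0 * hours_per_day * 30
cost_per_month = kwh_per_month * pence_per_kwh
print(f"{kwh_per_month:.1f} kWh/month, roughly {cost_per_month:.0f}p on the bill")
# About 23 kWh and 275p (GBP 2.75) a month under these assumptions.
```
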
sleepygamer 11th November 2010, 17:06 Quote
Quote:
Originally Posted by Snips
You can't open up a can of worms purely based on a marketing department. ALL the manufacturers do it, Nvidia, Intel, AMD, Ann Summers claiming their products do this or do that. Real world benchtests can prove otherwise so let's just keep it at that. I constantly moan about the AMD marketing, something I'm serving time for, but I went down that trap of believing the pap and then seeing the crap. Be warned, you gonna get it bigtime Clive.

Ann Summers Scientist does a press conference:

"Our new product, the 'Mysterious Passion' nightdress, complete with stockings and boots has been shown in tests to increase the TDP of your significant other by upto 20%. Please ensure that you have adequate hardware to tame this increase in heat."

Hee...
Bakes 11th November 2010, 17:27 Quote
Clive,

nVidia is actually using that definition in order to fool the consumer. Whilst Intel and AMD/ATI are using the definition of the power output over every condition (so an Intel TDP would be the absolute maximum that chip/unit would use), nVidia is defining 'over standard operating conditions', that being games. So, when you get an ATI TDP and an nVidia TDP, they can mean wildly different things.

For example:

http://www.behardware.com/medias/photos_news/00/28/IMG0028713.png

Original source: http://www.behardware.com/articles/787-5/report-nvidia-geforce-gtx-480-470.html

That site measured the actual power input using a special PCIe riser card - as you can see the ATI 5870 is very close to its published TDP of 188w, but the GTX 480 is far higher - the published tdp is around 260w but under furmark (ie full load rather than the load you'll see in games) it increases by around 40 watts.

This makes the nVidia tdp values even more incomparable - since whilst ATI shows tdp values that roughly correlate to the power draw, nVidias only correlate to the power draw whilst playing games - doing a high-stress test such as furmark or folding will increase the power draw massively.
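
For anyone wondering what such a riser-card measurement adds up, the arithmetic is just a sum of volts times amps over every rail feeding the card (the slot's 3.3V and 12V pins plus the 6/8-pin connectors). The readings in this sketch are invented, not behardware's data.

```python
# Total board power = sum of (voltage * current) over every rail feeding the card.
# All readings below are hypothetical, purely to show the sum.

rails = {
    "slot 3.3V": (3.3, 1.5),    # (volts, amps) - assumed readings
    "slot 12V":  (12.0, 4.8),
    "6-pin PEG": (12.0, 6.0),
    "8-pin PEG": (12.0, 9.5),
}

total_w = sum(volts * amps for volts, amps in rails.values())
print(f"Measured board power: {total_w:.0f} W")
# With these example readings: 4.95 + 57.6 + 72 + 114 = roughly 249 W.
```
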
HourBeforeDawn 11th November 2010, 17:41 Quote
@Bakes cant get the image to load and also do they have a pic of this special riser card???
MaverickWill 11th November 2010, 17:44 Quote
Not helped by the fact that nVidia are apparently using circuitry on the card itself to prevent full power draw on the 580. Anandtech managed to circumvent it here, which shows, if we subtract the single-card result from the SLI result, a difference in power usage of 325W. Calling an efficiency of 87% (good, efficient PSU), I make that to be 283W.
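
The back-calculation MaverickWill describes works because the extra draw measured at the wall includes the PSU's conversion loss, so multiplying by the assumed efficiency gives the DC power the second card actually pulled. A minimal sketch using the figures quoted above:

```python
# Estimating a card's DC draw from wall-socket measurements.
# The SLI-minus-single delta is measured at the mains, so it includes PSU losses;
# scaling by the assumed efficiency converts it to DC watts delivered to the card.

delta_at_wall_w = 325.0   # SLI reading minus single-card reading (W), as quoted
psu_efficiency = 0.87     # assumed "good, efficient PSU"

dc_power_w = delta_at_wall_w * psu_efficiency
print(f"Estimated DC draw of the extra card: {dc_power_w:.0f} W")   # ~283 W
```
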
jrs77 11th November 2010, 17:44 Quote
TDP is totally uninteresting tbh.

Maximum powerdraw at the AC-plug is a way better measurement as this is what you pay for with your electricity-bill.

So it would be great if manufacturers would just introduce a new term, which then would be called: MPD.

As it is currently it's totally flawed, and with a short example you can picture it very well.

The fuel needed for cars is measured in litres/100km (standard triple-mix) for example, and they speak about the fuel you need to fill in your tank.
Now, if a manufacturer would instead measure the whole thing in kW used to drive 100km instead, that would totally ruin the picture, as motors used in cars vary between 30-50% efficiency.
Jampotp 11th November 2010, 17:47 Quote
Quote:
Originally Posted by barrkel
Thanks to the magic of conservation of energy, the power used by solid-state hardware (no moving parts) is all but equal to the amount of heat it needs to dissipate.

+1 ;)
Phil Rhodes 11th November 2010, 18:33 Quote
Quote:
More pertinently, we know that TDP should be a measurement of waste energy (heat, unless something goes drastically wrong) and not, as Nvidia says, a measurement of maximum input power.

But those numbers should be so nearly identical as to make no difference.

Other than the power required to run the fan (a couple of watts) and the energy required to drive the PCIe bus lines and DVI outputs (not insignificant, but not much), all of the energy supplied to the card will eventually be dumped as heat.

If everyone measured TDP as power input, we'd be much better off.
alwayssts 11th November 2010, 18:46 Quote
I think this discussion is interesting, and I've mentioned it in the comments of articles from a couple different sites because it really could change the perception of a product, or what is allowable under pci-e spec. I'm glad the people at Bit-tech publicly exposed this issue, as I'm sure I'm not the only one that thinks there needs to be a standard.

Not everyone looks at charts and graphs like me, or perhaps you. Even if they choose to inform themselves beyond a TDP rating, independent reviews have variables, such as power supplies and their efficiency. A consistent test platform to test/meet a pci-e spec TDP does not seem much to ask. Both nVIDIA/AMD conform to the pci-e spec, yet can define how the spec is measured? That's ridiculous. Furmark/OCCT seems like a test fair for a workstation card, but something like Crysis makes sense for consumer products.

I compared the 4870x2 for example, versus the 5970. 4870x2 has a TDP of 286W. 5970 has a TDP of 294W. In reality, under a normal gaming scenario, the former uses close to it's TDP while the 5970 uses less than 225W. Hell, 5970 uses less power gaming than GTX580! The confusion is simply because AMD made a stupid decision starting with the 5000-series to change their TDP testing. Furmark/OCCT are not representative of the power/heat for the 5000/6000 series at all, and I hope AMD changes back. I understand some may think nVIDIA is twisting numbers, and that's a fair argument if you agree with AMD's method, but I happen to believe nVIDIA is calling this correctly and AMD is shooting themselves in the foot. Who is to say a 5870x2 wouldn't use under 300W using nVIDIA's method? It likely would. Why cock-block themselves?

As I've said before, it will be interesting to see if and/or when dual GF104/Cayman products are released, and if this discrepancy still exists.
jrs77 11th November 2010, 19:03 Quote
Quote:
Originally Posted by Phil Rhodes
But those numbers should be so nearly identical as to make no difference.

Other than the power required to run the fan (a couple of watts) and the energy required to drive the PCIe bus lines and DVI outputs (not insignificant, but not much), all of the energy supplied to the card will eventually be dumped as heat.

If everyone measured TDP as power input, we'd be much better off.
Quote:
Originally Posted by alwayssts
I think this discussion is interesting, and I've mentioned it in the comments of articles from a couple different sites because it really could change the perception of a product, or what is allowable under pci-e spec. I'm glad the people at Bit-tech publicly exposed this issue, as I'm sure I'm not the only one that thinks there needs to be a standard.

Not everyone looks at charts and graphs like me, or perhaps you. Even if they choose to inform themselves beyond a TDP rating, independent reviews have variables, such as power supplies and their efficiency. A consistent test platform to test/meet a pci-e spec TDP does not seem much to ask. Both nVIDIA/AMD conform to the pci-e spec, yet can define how the spec is measured? That's ridiculous. Furmark/OCCT seems like a test fair for a workstation card, but something like Crysis makes sense for consumer products.

I compared the 4870x2 for example, versus the 5970. 4870x2 has a TDP of 286W. 5970 has a TDP of 294W. In reality, under a normal gaming scenario, the former uses close to it's TDP while the 5970 uses less than 225W. Hell, 5970 uses less power gaming than GTX580! The confusion is simply because AMD made a stupid decision starting with the 5000-series to change their TDP testing. Furmark/OCCT are not representative of the power/heat for the 5000/6000 series at all, and I hope AMD changes back. I understand some may think nVIDIA is twisting numbers, and that's a fair argument if you agree with AMD's method, but I happen to believe nVIDIA is calling this correctly and AMD is shooting themselves in the foot. Who is to say a 5870x2 wouldn't use under 300W using nVIDIA's method? It likely would. Why cock-block themselves?

As I've said before, it will be interesting to see if and/or when dual GF104/Cayman products are released, and if this discrepancy still exists.

Like I said there already. Manufacturers simply need to agree upon "maximum powerdraw" at 100% load and state this number as "MPD" instead of using anything else.
Xtrafresh 11th November 2010, 19:28 Quote
Clive, do i understand correctly that you are criticizing the nVidia Marketing department for not spinning this number to their advantage?

@jrs77: nVidia and ATI disagree about 100% load though. A card runs hotter and uses more power in furmark then when playing games, yet both generate 100% load if you believe GPUz.
Altron 11th November 2010, 19:46 Quote
The real issue here is the wording

"TDP is a measure of maximum power draw over time in real world applications"

What's "maximum power draw over time"? That's not an engineering term. You can have a maximum instantaneous power draw, and an average power draw over time, but not a "maximum power draw over time".

And what's "real world applications" - I feel like this would exclude articifial benchmarks that run every part of the chip at 100% utilization.

But splitting hairs over "input power" and "power wasted as heat" - that's just stupid. The only thing that produces an actual energy output other than heat is the fan (produces kinetic energy in air and produces sound) and the very low power signal going out over the video cables or the PCIe data lanes. That amount of energy is marginal compared to the power spent as heat.
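
Altron's distinction between a maximum instantaneous draw and an average draw over time is easy to see with a logged power trace; the sample readings below are invented, purely to show that the two figures come out differently.

```python
# "Maximum instantaneous draw" vs "average draw over time" from a sampled trace.
# The per-second readings are invented example values.

samples_w = [180, 240, 255, 310, 230, 195, 290, 245]

peak_w = max(samples_w)                      # maximum instantaneous draw
average_w = sum(samples_w) / len(samples_w)  # average over the logged period

print(f"peak: {peak_w} W, average: {average_w:.0f} W")
# Peak 310 W vs average ~243 W: two different specs, and neither is
# "maximum power draw over time".
```
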
thehippoz 11th November 2010, 19:50 Quote
I didn't even know there wasn't a standard set until I read this.. you know nvidia though.. they will spin it to look better for themselves every time - even if what they are talking about has wood screws in it
jrs77 11th November 2010, 19:53 Quote
Quote:
Originally Posted by Xtrafresh
Clive, do i understand correctly that you are criticizing the nVidia Marketing department for not spinning this number to their advantage?

@jrs77: nVidia and ATI disagree about 100% load though. A card runs hotter and uses more power in furmark then when playing games, yet both generate 100% load if you believe GPUz.

100% load = 100% of all the card can possibly deliver before it starts friying it's components.

I don't give a **** about benchmarks etc...
Bakes 11th November 2010, 22:14 Quote
Quote:
Originally Posted by HourBeforeDawn
@Bakes cant get the image to load and also do they have a pic of this special riser card???

Sorry, fixed :P hotlinking doesn't always work :P

Yeah you can see here: http://www.behardware.com/articles/781-1/report-the-true-power-consumption-of-73-graphics-cards.html a picture of their riser card - i'm not totally sure why it's necessary considering that you can just assume 75w are going through the pcie slot, but anyway.
Phoenixlight 11th November 2010, 23:59 Quote
Quote:
Originally Posted by Clive Webster
likewise an Athlon II X6 1090T Black Edition, but with a ‘TDP’ of 255W

A what now?
Xtrafresh 12th November 2010, 00:20 Quote
Quote:
Originally Posted by jrs77
100% load = 100% of all the card can possibly deliver before it starts friying it's components.

I don't give a **** about benchmarks etc...

When i buy a car, i couldn't care less about how much gas it uses at top speed, i wanna know how much it'll cost me at the speed i'm going to drive it.
Siwini 12th November 2010, 00:38 Quote
Real world apps only. You're not fooling anyone with that TDP crap. And Xtrafresh, if u want to buy a Ferrari and cruise at 25mph go ahead.
jrs77 12th November 2010, 01:18 Quote
Quote:
Originally Posted by Xtrafresh
When i buy a car, i couldn't care less about how much gas it uses at top speed, i wanna know how much it'll cost me at the speed i'm going to drive it.

When I buy a car based on that method, then I don't need no car that can run faster then 130 km/h :)

And then... the fuel-consumption for cars is measured at a triple-mix with the speedlimits on 95% of the roads in the world 50/80/130kmh taken into account. Additionally the test involves no driving-styles and is done at standardized revs... It's not even measured on the road, but in a lab, to make the results 100% comparable.
So you actually won't know, how much fuel you'll need in average, until you've driven your first 10000km.

This is why the maximum powerdraw of the whole card and not only the chip would be the only fair comparison between GPUs, as this would be the only situation where they're 100% comparable at. Other PC-components and drivers do influence the benchmark-results and draw an inexact picture.
If I use the same GPU as you, but with a slower CPU and an older gfx-driver, then my GPU might actually draw less or more power then yours in the same gaming-benchmark.
And as there's no testbeds for GPUs, where the conditions never change over time, the only hard and uninfluenced number is the absolute maximum powerdraw of the card.
Tangster 12th November 2010, 01:27 Quote
Quote:
Originally Posted by jrs77
This is why the maximum powerdraw of the whole card and not only the chip would be the only fair comparison between GPUs, as this would be the only situation where they're 100% comparable at. Other PC-components and drivers do influence the benchmark-results and draw an inexact picture.
If I use the same GPU as you, but with a slower CPU and an older gfx-driver, then my GPU might actually draw less or more power then yours in the same gaming-benchmark.
And as there's no testbeds for GPUs, where the conditions never change over time, the only hard and uninfluenced number is the absolute maximum powerdraw of the card.

Which would mean partners that do not make reference designs(be it PCB or coolers) would have to measure the power draw separately and list it differently to the reference manufacturer card.
jrs77 12th November 2010, 02:27 Quote
Quote:
Originally Posted by Tangster
Which would mean partners that do not make reference designs(be it PCB or coolers) would have to measure the power draw separately and list it differently to the reference manufacturer card.

So?
new_world_order 12th November 2010, 03:09 Quote
Quote:
Originally Posted by wuyanxu
why did nvidia choose to use power consumption rather than heat output?

I've been to a few exhibitions where I saw 1200 Watt power supplies in deep freezing rigs, but even under full load (all cores and all threads doing some sort of intense benchmarking or calculations) the meter was showing that only 1050W was being drawn.

I think power consumption is a good metric to use for you worst-case-scenario power supply estimations. Remember, the closer you get to the max output of your PSU, the higher the probability that the signal-to-noise ratio will degrade somewhat. This has a tendency to cause instabilities and crashes.

The heat load should be used for estimations of the type of cooling solution you would need.

The power consumption should be used to determine the spec for the PSU.
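
As a minimal sketch of that sizing exercise (the component draws and the 30% headroom margin are made-up assumptions, not recommendations):

```python
# Rough PSU sizing from worst-case component power draw.
# All figures, including the headroom margin, are illustrative assumptions.

draw_w = {
    "CPU": 130,
    "graphics card": 255,
    "motherboard/RAM": 60,
    "drives and fans": 40,
}
headroom = 0.30   # keep well below the PSU's rated limit, per the comment above

worst_case_w = sum(draw_w.values())
required_w = worst_case_w * (1 + headroom)
print(f"Worst case {worst_case_w} W -> look for a PSU of at least {required_w:.0f} W")
# 485 W worst case -> roughly 630 W, so a quality ~650 W unit under these assumptions.
```
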
greigaitken 12th November 2010, 04:49 Quote
but as i and others have said,
heat load = power consumption so both numbers are the same
Xtrafresh 12th November 2010, 08:14 Quote
Quote:
Originally Posted by jrs77
When I buy a car based on that method, then I don't need no car that can run faster then 130 km/h :)

And then... the fuel-consumption for cars is measured at a triple-mix with the speedlimits on 95% of the roads in the world 50/80/130kmh taken into account. Additionally the test involves no driving-styles and is done at standardized revs... It's not even measured on the road, but in a lab, to make the results 100% comparable.
So you actually won't know, how much fuel you'll need in average, until you've driven your first 10000km.

This is why the maximum powerdraw of the whole card and not only the chip would be the only fair comparison between GPUs, as this would be the only situation where they're 100% comparable at. Other PC-components and drivers do influence the benchmark-results and draw an inexact picture.
If I use the same GPU as you, but with a slower CPU and an older gfx-driver, then my GPU might actually draw less or more power then yours in the same gaming-benchmark.
And as there's no testbeds for GPUs, where the conditions never change over time, the only hard and uninfluenced number is the absolute maximum powerdraw of the card.
Not to be a bore here, but do we really need cars faster than 130km/h? For the most part, our cars' top speed is marketing fluff, as we'll never use it that way. It's interesting to know, but we really don't care much how much fuel it uses under those conditions.

Back to the cards. Now back to me. Now back to the cards....

...

...

I'm on a forum.

sorry, couldn't resist.

Ok, so, back to the cards. I don't care to know how much it will use in furmark, because i hardly ever run it.
Consumption while gaming is much more interesting. Just like the normalised car test, we'd need some repeatable test, but that test should include scenarios that simulate gaming.
Enzo Matrix 12th November 2010, 12:05 Quote
Quote:
Originally Posted by Phoenixlight
A what now?

On top of it being a phenom IIx6, it is also NOT 140W TDP. I remember this quite clearly because I was surprised at launch. The TDP of it is 125W. Plus there are even 95W versions.

I signed up to comment on this to try to get it corrected. Too many mistakes on this site, if you guys aren't careful, you're going to become like Tom's Hardware.

And about your i7 950 review: It's the i7 950, NOT the i5 950. First page, in the section with the prices.:(
new_world_order 12th November 2010, 14:49 Quote
Quote:
Originally Posted by greigaitken
but as i and others have said,
heat load = power consumption so both numbers are the same

If you were correct, I would agree with you. Not all power supplied get transformed into heat. That would imply your computer needs no energy of any kind in order to run, and that you have a process more efficient at heat creation than a nuclear power plant.
new_world_order 12th November 2010, 14:58 Quote
Quote:
Originally Posted by greigaitken
but as i and others have said,
heat load = power consumption so both numbers are the same

Case in point: One of my computers has a 750W power supply. I also have a 750W space heater.

I turn both of them on. I have internal temp sensors inside the computer. I turned off the cooling solution by removing power leads to it. I have an external thermometer placed on the space heater.

The space heater reaches 50C fairly quickly, in just under 1 minute.

I can feel the heat radiating off of the space heater.

My computer, case off, takes 6 minutes to reach 50C. By this time, my space heater is at 75C, which is its max temp for safety reasons. The room is much warmer near the space heater, within a 3 foot radius.

I can be 12 inches from my computer, and not feel any real noticeable heat on the back of my hand.

If you are saying I can use my 750W power supply to warm my house for the winter, I would have to disagree.
Xtrafresh 12th November 2010, 18:07 Quote
Quote:
Originally Posted by new_world_order
Case in point: One of my computers has a 750W power supply. I also have a 750W space heater.

I turn both of them on. I have internal temp sensors inside the computer. I turned off the cooling solution by removing power leads to it. I have an external thermometer placed on the space heater.

The space heater reaches 50C fairly quickly, in just under 1 minute.

I can feel the heat radiating off of the space heater.

My computer, case off, takes 6 minutes to reach 50C. By this time, my space heater is at 75C, which is its max temp for safety reasons. The room is much warmer near the space heater, within a 3 foot radius.

I can be 12 inches from my computer, and not feel any real noticeable heat on the back of my hand.

If you are saying I can use my 750W power supply to warm my house for the winter, I would have to disagree.
What a great way to demonstrate you don't understand it!

The 750W power supply is capable of powering a machine that uses 750W, but that does not mean that any machine you hook up to it automatically uses that amount. Looking at your sig, your PC will probably use around 200W in idle, on account of that overclocked 860.

And yes, power drawn IS heat produced with a PC, aside from the bit that goes to moving parts (HDDs, Fans).

As for the power consumption of all the chips involved: mclean007 said it better than I could:
Quote:
The logic (work) done by the chip comes from the entropy gap between (low entropy) electricity and (high entropy) low level waste heat.
TheBlackSwordsMan 12th November 2010, 19:56 Quote
Turbo Diesel Powered ? ^^
rickysio 13th November 2010, 15:52 Quote
Quote:
Originally Posted by new_world_order
Quote:
Originally Posted by greigaitken
but as i and others have said,
heat load = power consumption so both numbers are the same

Case in point: One of my computers has a 750W power supply. I also have a 750W space heater.

I turn both of them on. I have internal temp sensors inside the computer. I turned off the cooling solution by removing power leads to it. I have an external thermometer placed on the space heater.

The space heater reaches 50C fairly quickly, in just under 1 minute.

I can feel the heat radiating off of the space heater.

My computer, case off, takes 6 minutes to reach 50C. By this time, my space heater is at 75C, which is its max temp for safety reasons. The room is much warmer near the space heater, within a 3 foot radius.

I can be 12 inches from my computer, and not feel any real noticeable heat on the back of my hand.

If you are saying I can use my 750W power supply to warm my house for the winter, I would have to disagree.

As Xtrafresh mentioned, absolutely wrong.

More simply put, RATED CAPACITY IS NOT POWER USED.
new_world_order 14th November 2010, 01:11 Quote
Quote:
Originally Posted by Xtrafresh
What a great way to demonstrate you don't understand it!

I understand it perfectly well.

If what all of you are saying is true, your computers are nothing more than heaters, and they would do no computation at all, and they would not illuminate a single light or LED of any kind.

Energy is: work performed, heat given off, or light emitted.

Does the computer give off heat? Certainly.

Any lights on your boxes? Of course. Where is this energy coming from? It needs power from some source.

Does the computer "do anything"? Yes, therefore that requires power also.

Or do your electrons migrate through the 32 nm instantaneously as well, meaning any computer would be the functional equivalent of "infinite frequency?"
Cthippo 14th November 2010, 01:22 Quote
Quote:
Originally Posted by new_world_order
If what all of you are saying is true, your computers are nothing more than heaters, and they would do no computation at all, and they would not illuminate a single light or LED of any kind.

Perhaps not nothing more than a heater, but it does perform that task quite well
D B 14th November 2010, 14:28 Quote
As has been pointed out, the rating on your power supply has nothing to do with how much power your PC is using ... a 750 watt power supply does not mean that your PC is using 750 watts.
Think about this ..
That 750 watt heater has been designed to convert that energy into heat in the most efficient way, example .. does it have a reflector?
And, is your PC designed to efficiently heat your room using the same total power draw?
Xtrafresh 14th November 2010, 15:24 Quote
perhaps he will read it if I put it in bold letters:
Quote:
The logic (work) done by the chip comes from the entropy gap between (low entropy) electricity and (high entropy) low level waste heat.
Just to be clear, I don't 100% understand the physics of that, but that there is the heat that comes off a chip. In your theory where you put a 750W PSU in a PC and expect it to draw 750W, I'd like you to explain this:
http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/8

Notice it says power drawn at socket.
Bakes 14th November 2010, 15:28 Quote
Quote:
Originally Posted by new_world_order
I understand it perfectly well.

If what all of you are saying is true, your computers are nothing more than heaters, and they would do no computation at all, and they would not illuminate a single light or LED of any kind.

Energy is: work performed, heat given off, or light emitted.

Does the computer give off heat? Certainly.

Any lights on your boxes? Of course. Where is this energy coming from? It needs power from some source.

Does the computer "do anything"? Yes, therefore that requires power also.

Or do your electrons migrate through the 32 nm instantaneously as well, meaning any computer would be the functional equivalent of "infinite frequency?"
Quote:
Originally Posted by new_world_order
If you were correct, I would agree with you. Not all power supplied get transformed into heat. That would imply your computer needs no energy of any kind in order to run, and that you have a process more efficient at heat creation than a nuclear power plant.

A little question that might help you understand what is wrong with what you are saying: If only a portion of the energy that is drawn from the wall by a computer is converted into heat, what happens to the energy that is not? Does it evaporate into thin air?

Just research the principle of conservation of energy. It should enlighten you and might empower you to study past assumptive science instead of making ignorant statements on topics you know nothing about.
Guinevere 14th November 2010, 22:10 Quote
Quote:
Originally Posted by Xtrafresh
Not to be a bore here, but do we really need cars faster than 130km/h? For the most part, our cars' top speed is marketing fluff, as we'll never use it that way.

That's quite a general thing to say. One of my cars is a VW T5 based camper, and the thought of it being limited to 80MPH like you suggest makes me shiver. I'd never get anywhere if I had to drive that slow.

And that's just my camper van!
Xtrafresh 14th November 2010, 22:50 Quote
Some people always feel the need to drive way faster than allowed, some even feel the need to tell everyone about it.

Anyway, if cars only ever drove 130km/h, I don't think anybody would suffer too dearly from it. Do some maths on it: how much time does it save you to drive 160 vs 130 on a 100km stretch? And that's assuming no lorries are overtaking or people are sticking left at 120km/h.

How did this turn into a debate about maximum speed? Oh wait, i did that by bringing up the stupid car analogy again. :D

My apologies.
Ljs 14th November 2010, 23:45 Quote
Quote:
Originally Posted by Guinevere
One of my cars is a VW T5 based camper

Marry me!
Adis 15th November 2010, 09:13 Quote
Quote:
Originally Posted by Altron
What's "maximum power draw over time"? That's not an engineering term. You can have a maximum instantaneous power draw, and an average power draw over time, but not a "maximum power draw over time".

You do understand that if you have an average then you have a maximum and a minimum, else how would you calculate the average? Then the "maximum power draw over time" would be the maximum value you reach during your testing.

That said, I do agree that the way it is worded "maximum power draw over time" is misleading/incorrect. You don't need the "over time" part, just "maximum power draw" would be enough.
Adis 15th November 2010, 09:29 Quote
Quote:
Originally Posted by new_world_order
If you were correct, I would agree with you. Not all power supplied get transformed into heat. That would imply your computer needs no energy of any kind in order to run, and that you have a process more efficient at heat creation than a nuclear power plant.

+1
Altron 15th November 2010, 17:30 Quote
Quote:
Originally Posted by Adis
You do understand that if you have an average then you have a maximum and a minimum, else how would you calculate the average? Then the "maximum power draw over time" would be the maximum value you reach during your testing.

That said, I do agree that the way it is worded "maximum power draw over time" is misleading/incorrect. You don't need the "over time" part, just "maximum power draw" would be enough.

Yes, "maximum power draw" or "power draw over time" would be two separate things. A maximum power draw isn't an over-time measurement, because once you find the maximum you just discard the other points. It's splitting hairs, but clearly "maximum power draw over time" was not a term created by an engineer, unless he has something to hide.
Wingtale 16th November 2010, 16:38 Quote
What happens if the water leeks all over the components?
Bakes 16th November 2010, 16:59 Quote
Quote:
Originally Posted by Wingtale
What happens if the water leeks all over the components?

What, the water in the vapour chamber? There'd be a problem then - but remember that a vapour chamber is simply a larger version of the heatpipes that are in all our coolers.
Xtrafresh 16th November 2010, 17:15 Quote
Quote:
Originally Posted by Wingtale
What happens if the water leeks all over the components?
leeks?

[embedded YouTube video: GCO62VNm67k]
new_world_order 16th November 2010, 19:50 Quote
Quote:
Originally Posted by Bakes
A little question that might help you understand what is wrong with what you are saying: If only a portion of the energy that is drawn from the wall by a computer is converted into heat, what happens to the energy that is not?

The energy that is "not?" What are you, a Houyhnhnm from Gulliver's Travels?
Quote:
Originally Posted by Bakes

Does it evaporate into thin air?

If you think that "energy" has anything in common with a "liquid" that could evaporate, there is no metric that can measure the order of magnitude of your ignorance.
Bakes 16th November 2010, 21:45 Quote
Quote:
Originally Posted by new_world_order
The energy that is "not?" What are you, a Houyhnhnm from Gulliver's Travels?

Again, my question is: What happens to the energy that is not released as heat - where does it go?


Quote:
If you think that "energy" has anything in common with a "liquid" that could evaporate, there is no metric that can measure the order of magnitude of your ignorance.

Pedantry doesn't help anyone. 'Evaporate into thin air' is a figure of speech.

Again, all energy that goes into a computer (bar the energy that goes into the fans and the leds) is released as heat energy.
Altron 17th November 2010, 06:03 Quote
Quote:
Originally Posted by new_world_order
If you think that "energy" has anything in common with a "liquid" that could evaporate, there is no metric that can measure the order of magnitude of your ignorance.

Clearly, he doesn't think that. He was asking you that question. Since your reading comprehension seems poor, let me rephrase it:

Tell us, where does the energy that doesn't get converted to heat, doesn't get converted to very low-power signals on data cables, and doesn't get converted to light by the power LEDs go? You've been arguing that there is some energy that doesn't go to any of these things, so where does it go?
mclean007 17th November 2010, 11:44 Quote
Quote:
Originally Posted by Altron
Clearly, he doesn't think that. He was asking you that question. Since your reading comprehension seems poor, let me rephrase it:

Tell us, where does the energy that doesn't get converted to heat, doesn't get converted to very low-power signals on data cables, and doesn't get converted to light by the power LEDs go? You've been arguing that there is some energy that doesn't go to any of these things, so where does it go?
Indeed. From a thermodynamic perspective the logic (useful work) done by a computer is a decrease in entropy (information disorder) and is achieved by converting low entropy energy (electricity) into high entropy energy in the form of low level waste heat. This is required because the second law of thermodynamics states that in general the total entropy of any system will not decrease other than by increasing the entropy of some other system. So the increase in information order produced by the computer's work (a decrease in entropy in that system) is "paid for" by an increase in entropy in the surrounding atmosphere by dumping highly disordered low level heat into the air.

As has been stated many times above, essentially all of the energy input of a computer system (ignoring the negligible amount that is output as light, data signals and electromagnetic radiation) is dumped as heat. This is the first law of thermodynamics (conservation of energy) at work and, new_world_order, you would do well to educate yourself on this simple principle before lambasting other forum users for their "ignorance". It just makes you look like a fool.
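
One way to put a number on how small the 'useful work' term could possibly be is the Landauer limit, the thermodynamic minimum of k·T·ln 2 per irreversible bit operation; the bit-operation rate in the sketch below is a deliberately generous assumption for a big GPU.

```python
import math

# Landauer limit: minimum energy that must be dissipated per irreversible bit operation.
k_B = 1.380649e-23                       # Boltzmann constant (J/K)
T = 300.0                                # room temperature (K)
e_min_per_bit = k_B * T * math.log(2)    # ~2.9e-21 J per bit

# Assumed (generous) rate of bit operations for a large GPU.
bit_ops_per_second = 1e18

landauer_watts = e_min_per_bit * bit_ops_per_second
print(f"Thermodynamic minimum for the computation itself: {landauer_watts:.3f} W")
# ~0.003 W against a ~250 W board: the 'work' is a rounding error, so to all
# intents and purposes power drawn from the wall equals heat dumped into the room.
```
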