bit-tech.net

IBM heats university via supercomputer

The IBM Aquasar uses a watercooling system running between 60°C and 65°C to keep the servers cool and the building warm.

IBM has teamed up with the Swiss Federal Institute of Technology to develop a supercomputer which requires far less cooling than ever before – and uses its excess heat output to warm the building to boot.

As reported by CNet, the impressively named Aquasar supercomputer – which is based around a pair of IBM's BladeCenter servers filled with a mixture of traditional Intel Nehalem-based processors and the rather more exotic IBM PowerXCell 8i – is predicted to offer a not inconsiderable 10 teraflops of processing power to its users.

With so much power crammed into a pair of racks, heat is a concern – but here's where things get clever. Rather than the traditional high-powered HVAC systems required by standard supercomputers – systems which have a high power draw and significantly increase the cost of running such a system – the Aquasar uses a watercooling system to keep things ticking over.

While watercooling is nothing new, the way the system works is something a bit special: the water used isn't actually chilled, but instead is introduced into the system at a temperature of 60°C. IBM believes this will be sufficient to keep the processors below their maximum operating temperatures of 85°C.
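
To put rough numbers on that claim: with a 60°C inlet, the chips stay under 85°C as long as the per-chip heat output and chip-to-coolant thermal resistance are modest. The figures below are illustrative assumptions rather than IBM's published specifications – a minimal back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the thermal headroom described above.
# The chip power and thermal resistance are assumed values, not IBM figures.

coolant_in_c = 60.0        # water enters the blocks at 60 °C (from the article)
t_max_c = 85.0             # stated maximum processor operating temperature

chip_power_w = 100.0       # assumed per-chip heat output
r_chip_to_water = 0.15     # assumed chip-to-coolant thermal resistance, K/W

chip_temp_c = coolant_in_c + chip_power_w * r_chip_to_water
headroom_c = t_max_c - chip_temp_c

print(f"Estimated chip temperature: {chip_temp_c:.1f} °C")   # ~75 °C
print(f"Headroom below the 85 °C limit: {headroom_c:.1f} °C")
```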

By dispensing with traditional chilling systems, IBM believes that its new system will use upwards of 40 percent less power than traditional supercomputer rigs. Further savings – both monetary and environmental – are made by using the by now quite toasty water, which leaves the system at around 65°C, to heat the building in which the system is installed.

The new system – which further increases efficiency by using a system of jet impingement cooling where the water actually makes direct contact with the surface of the chip – will use a sealed-loop system containing around ten litres of water, which will be pumped through the system three times every minute. A heat exchanger will deliver the excess heat directly to the university's existing heating system without compromising the sealed loop.
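
For a sense of scale, the figures quoted are enough to estimate how much heat that loop can move. The arithmetic below is a standard Q = ṁ·c·ΔT sketch using the article's numbers, not IBM's own calculation:

```python
# Rough estimate of how much heat the sealed loop carries to the exchanger,
# using only the figures mentioned in the article.

loop_volume_l = 10.0        # ~10 litres in the sealed loop
passes_per_minute = 3       # pumped through the system three times a minute
t_in_c, t_out_c = 60.0, 65.0

flow_kg_per_s = loop_volume_l * passes_per_minute / 60.0   # ~1 kg per litre of water
c_p = 4186.0                # specific heat of water, J/(kg·K)

heat_w = flow_kg_per_s * c_p * (t_out_c - t_in_c)
print(f"Heat carried to the exchanger: {heat_w/1000:.1f} kW")   # ≈ 10.5 kW
```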

Fancy the thought of heating your house in winter via your Folding@Home farm, or does the entire concept of a watercooling system that runs at a whopping 60°C seem madness? Share your thoughts over in the forums.

26 Comments

mjm25 25th June 2009, 13:04 Quote
how do they get the water to 60C in the first place to introduce into the system? haha

...someone put the kettle on...
PT88 25th June 2009, 13:13 Quote
If I understand correctly the water is part of the University's heating system, so the water in the heating loop will probably be 60 degrees. I guess it's just a matter of taking some "used" water from the system to cool the rig, i.e. water that has already been through the radiators.

Good idea
Krikkit 25th June 2009, 13:36 Quote
The two loops are separate. No water is taken from the heating through the computer.

The rack itself will heat the water to 60°C from cold without too much trouble anyway - just lower the flow of the water. Even if they did use a kettle, don't forget you'd only have to heat the water to 60°C once - it doesn't go off every night like normal computers.

Clever idea really - in summer when the heating's off they can just keep exchanging the heat with the giant heating system anyway, so no worries about overheating!
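(For what it's worth, the maths backs Krikkit up: warming the loop from cold takes very little energy compared with what the racks put out. The rack power below is an assumed figure - the article doesn't give one.)

```python
# Rough check: how long would the rack take to warm the sealed loop from
# room temperature to 60 °C on its own? Rack heat output is assumed.

loop_mass_kg = 10.0        # ~10 litres of water
c_p = 4186.0               # specific heat of water, J/(kg·K)
t_start_c, t_target_c = 20.0, 60.0

rack_heat_w = 20_000.0     # assumed total heat output of the racks

energy_j = loop_mass_kg * c_p * (t_target_c - t_start_c)
print(f"Energy needed: {energy_j/1e6:.2f} MJ")                       # ~1.7 MJ
print(f"Time at {rack_heat_w/1000:.0f} kW: {energy_j/rack_heat_w:.0f} s")  # ~84 s
```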
liratheal 25th June 2009, 13:46 Quote
If anything, I'm more surprised that this hasn't been done before.
mclean007 25th June 2009, 13:53 Quote
I read not too long ago about a plan to exhaust server room heat into large greenhouses. This is not dissimilar. Makes perfect sense to me. There was also that other plan involving an offshore tidal generator off the north of Scotland, the difficulty being the cost and inefficiency of moving the power generated hundreds of miles to civilisation. The genius plan was to build a server farm up there to soak up the majority of the power, use the waste heat to warm greenhouses, connect to the local grid to feed in spare power or receive top-up power where needed, and just have a bundle of fibre-optics going up there rather than dirty great UHV power lines. Smart thinking I say.
robyholmes 25th June 2009, 14:18 Quote
I have seen someone online heat a swimming pool with 4 computers? Or help to heat it, at least.
p3n 25th June 2009, 14:28 Quote
Quote:
Originally Posted by liratheal
If anything, I'm more surprised that this hasn't been done before.

Google have been pioneering things like this for a while now - basically, people realised that you don't need to keep data centers at -20 degrees. I believe Google's most nerdy datacenter uses an evaporation tower to get rid of excess joules...
Evildead666 25th June 2009, 14:56 Quote
Google also did a study on HDDs, where the dead disks were tallied by whether they'd been run cold, warm or hot.
They found that more disks went funny if they were kept cold and/or hot, but the longest life came from running them warm.
Or so I remember.
Basically, the conclusion was that you didn't need to keep the disks cold.

I find 60°C a bit high for a starting point, but they will have tested it to death, and a safety cut-off will be in place so nothing will fry.
A very good idea.
Flibblebot 25th June 2009, 15:27 Quote
60°C is perfectly within the operating range of modern processors. They're not trying to overclock them, after all, so there's no real need to cool, just to keep the chips from overheating.
I-E-D 25th June 2009, 15:32 Quote
Quote:
Originally Posted by liratheal
If anything, I'm more surprised that this hasn't been done before.

Exactly my thoughts, son :)

So how many normal computers would I need to watercool to provide enough heat for all the radiators in my house?

:D Make one massive loop. 5 OC'd i7 computers, attached to 1000mm radiators, and a huge waterpump. Sorted :) Well, bye bye heating bills, hello electricity :S
B1GBUD 25th June 2009, 15:48 Quote
I've been considering using my loop for underfloor heating! Although leak detection would mean pulling the floor up!
Goty 25th June 2009, 15:53 Quote
Quote:
Originally Posted by Flibblebot
60°C is perfectly within the operating range of modern processors. They're not trying to overclock them, after all, so there's no real need to cool, just to keep the chips from overheating.

The water is 60°C, not the processors. You've got to keep in mind the CPU-water temp delta. Granted, assuming their system doesn't royally suck, their delta shouldn't be anywhere near high enough to matter.

As for the system itself, 10 TFLOPS seems kind of... low...

Seriously, there are plenty of enthusiasts out there that approach half this computing power on video cards alone.
kempez 25th June 2009, 15:53 Quote
That's pretty cool (;))

Our data centre is used to heat the building too :)
Phil Rhodes 25th June 2009, 16:16 Quote
This suggests itself as a combined technology with that remote game playing thing, where the game runs remotely and sends you the result as a video image. Scale that up a few orders of magnitude, and you've got a civic computer cluster that serves a whole town, and provides heat to it as well.

Of course we'll all be living in arcologies by the time that happens...

P
Cupboard 25th June 2009, 16:55 Quote
Like an extension of combined heat and power: combined heat power and computing :)
Burnout21 25th June 2009, 16:58 Quote
My computer has been heating my room for years! It was the only thing keeping me warm at night in uni halls - I had it running Intel's burn test to keep it at 100% load overnight.
g3n3tiX 25th June 2009, 17:23 Quote
I read another article about the same setup where the water temp was 40°C.
Nice initiative though.
The_Beast 25th June 2009, 17:31 Quote
very cool but I doubt it's a new idea


On the colder days around here I'll leave my computer on (running at 75% with Folding@home) to keep warm.
RichCreedy 25th June 2009, 18:04 Quote
See, I said I should have patented my idea, lol
ch424 25th June 2009, 18:33 Quote
Quote:
Originally Posted by Krikkit
Clever idea really - in summer when the heating's off they can just keep exchanging the heat with the giant heating system anyway, so no worries about overheating!

Oh yeah! Awesome :D

I used to heat my room using my computer, but now that I have to pay my electricity bill and heating is included, I switch it off as much as possible!

Remember that gas heating is more efficient than making your computer waste energy (in terms of kWh per £) if you are paying for both though!
Skiddywinks 25th June 2009, 20:24 Quote
Quote:
Originally Posted by I-E-D

Exactly my thoughts, son :)

So how many normal computers would I need to watercool to provide enough heat for all the radiators in my house?

:D Make one massive loop. 5 OC'd i7 computers, attached to 1000mm radiators, and a huge waterpump. Sorted :) Well, bye bye heating bills, hello electricity :S

Sorry to rain on your parade with nitpicking, but surely the pressure you would need would be much higher than what the blocks are rated for?
Quote:
Originally Posted by Burnout21
My computer has been heating my room for years! It was the only thing keeping me warm at night in uni halls - I had it running Intel's burn test to keep it at 100% load overnight.

Tell me about it! My uni room was unbearable most days. A 4870X2 and an OC'd Q9550 kept my room very toasty running through my push-pull BI GTX 360. Since it's all in a Cosmos S as well, the heat is literally dumped right into my room with minimal chance to cool down or dissipate first.

I remember my little brother thinking that water cooling would make my room cooler, since the cooling was better. Clearly he didn't think it through before he said it :P
tejas 25th June 2009, 20:49 Quote
My room heats up unbearably with a GTX 295 in one PC and a 4890 in the other. I can only use one PC in the summer as I would pass out with any more heat. I would happily use them to heat the water in the house! I need an A/C unit very badly but need to work out costs and, more importantly, aesthetics...
ch424 25th June 2009, 21:21 Quote
Quote:
Originally Posted by Skiddywinks
the heat is literally dumped right into my room with minimal chance to cool down or dissipate first

If your computer is in your room, where would the heat go other than heating your room!?
Ending Credits 25th June 2009, 21:51 Quote
Quote:
Remember that gas heating is more efficient than making your computer waste energy (in terms of kWh per £) if you are paying for both though!

Yes, but a 1kW PC should produce 99% as much heat as a 1kW electric fire, and your computer can do other things in the meantime. The only problem with this is that you are perhaps losing more money in shortening the lifespan of the components than from electricity, although as enthusiasts this might not matter so much. :p
ch424 26th June 2009, 00:03 Quote
No.. you missed my point.

If you have a 1kW computer and leave it on for an hour, you are paying the electricity company for the gas needed to produce 1kWh of electricity - which is probably about 2kWh of gas burned - plus their staffing costs plus infrastructure costs.

If you use your central heating to produce 1kWh of heat, you're just paying for 1kWh's worth of gas, plus a bit on infrastructure.
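
To put illustrative numbers on ch424's point - the tariffs below are assumed, roughly 2009-era UK prices, not figures from the thread:

```python
# Rough cost comparison of heating with a PC versus a gas boiler.
# Both prices and the boiler efficiency are illustrative assumptions.

electricity_price = 0.12   # assumed £ per kWh of electricity
gas_price = 0.04           # assumed £ per kWh of gas
boiler_efficiency = 0.90   # assumed condensing boiler efficiency

heat_needed_kwh = 1.0

pc_cost = heat_needed_kwh * electricity_price              # the PC turns ~all of its input into heat
boiler_cost = heat_needed_kwh / boiler_efficiency * gas_price

print(f"1 kWh of heat from the PC:     £{pc_cost:.3f}")     # ~£0.120
print(f"1 kWh of heat from the boiler: £{boiler_cost:.3f}") # ~£0.044
```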
Publ!c Enemy 7th August 2010, 16:06 Quote
What happens in the summer?