bit-tech.net

AMD demos Fusion Render Cloud

CES 2009: AMD President and CEO Dirk Meyer demonstrated the next era of cloud computing, the HD Cloud, during his keynote address at CES.

Meyer introduced the Fusion Render Cloud reference design, a GPU supercomputer designed to break the one petaFLOPS barrier with more than 1,000 graphics processors.

"We anticipate it to be the fastest supercomputer ever and it will be powered by OTOY's software for a singular purpose: to make HD cloud computing a reality," said Meyer. "We plan to have this system ready by the second half of 2009."

The Fusion Render Cloud will break one petaFLOPS with a tenth of the power consumption of the world's fastest supercomputer today, and it requires a significantly smaller footprint than a conventional supercomputer - it will fit in a single room rather than requiring a whole building. What's more, the Fusion Render Cloud will be upgradeable, unlike virtually any other supercomputer, so AMD anticipates further performance increases in the future.
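As a back-of-the-envelope check (our round figures, not AMD's official specs), one petaFLOPS spread across roughly 1,000 GPUs implies about a teraFLOPS per processor, which is in the right ballpark for the single-precision peak AMD quoted for its then-current graphics chips:

```python
# Rough sanity check (assumed round figures, not AMD's official specs):
# what per-GPU throughput does one petaFLOPS across ~1,000 GPUs imply?
TARGET_FLOPS = 1e15   # one petaFLOPS
NUM_GPUS = 1000       # "more than 1,000 graphics processors"

per_gpu_tflops = TARGET_FLOPS / NUM_GPUS / 1e12
print(f"~{per_gpu_tflops:.1f} TFLOPS per GPU")   # ~1.0 TFLOPS per GPU
```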

That's all well and good, but what would you use a 1,000+ GPU supercomputer for? Well, it's designed to serve HD content over the Internet (or cloud) through your web browser. Meyer demonstrated full-resolution Blu-ray movie playback through a web browser before moving on to show Mercenaries 2 running without any lag whatsoever in the same web browser.

The demonstration was polished and impressive - it was possible to move the browser window around while playing the game without any adverse effect on the gaming experience. That has little to no real-world use, but it was designed to prove that this wasn't just a canned demo.


Meyer invited Richard Hilleman, Chief Creative Officer at Electronic Arts, on stage after the Mercenaries 2 demo. "At Electronic Arts, we have been lucky enough to be a part of the creation of a number of changes in the world of personal computer gaming. From the first PCs to CD gaming to the advent of Internet gaming, we have embraced each new evolution of technology as an opportunity to bring new experiences to our customers," he said. "OTOY and AMD are at the cutting edge of thin client gaming, and we look forward to the new customers we can reach and the new interactive expressions that emerge from revolutionary technology like the AMD Fusion Render Cloud."

There's no denying this is a massive step forwards for cloud computing, but we have a few concerns - the biggest is that there were no details on how much bandwidth the HD Cloud will require to deliver experiences like this. The demo ran across a wired network rather than over the Internet, which makes us wonder just how steep the requirements are. Nevertheless, that doesn't detract from how impressive the demo was.

Discuss in the forums.

salesman 10th January 2009, 02:16 Quote
That's truly amazing. I can't wait to hear more about it.
biebiep 10th January 2009, 09:57 Quote
So wait.

That thing plays Crysis FOR me?
DriftCarl 10th January 2009, 12:20 Quote
Nice, so in the future, instead of paying hundreds of pounds to upgrade our own PCs, we let AMD upgrade theirs and we just have a thin client at home instead of a PC.

Sounds good to me. It will be like a guarantee that everyone can play the latest game from a tiny thin-client terminal hidden in a desk or even a monitor.
ch424 10th January 2009, 12:21 Quote
So basically it takes your local input (keyboard+mouse), sends it over the internet to their servers, which render the game and send it back as a video stream? That's really cool except for the bandwidth problem: most people's internet connections struggle with iPlayer, let alone the high-quality version.

Did they mention what resolutions were possible over ADSL?
perplekks45 10th January 2009, 16:21 Quote
By the looks of it those are 19" TFTs with a native resolution of ~1280x1024? So this little window with the game is what? 640? 800? Not enough for all those GTX 280 triple-SLI or 4870 X2 CrossFireX people out there. Sounds like an awesome idea though. Like DriftCarl and ch424 said, it'd be nice to have just a little box standing around somewhere [hidden], plus keyboard and mouse [wireless most likely], and just play your favourite game without having to have the most up-to-date hardware.

What impact that'd have on my electricity bill!!! Holy s***!

I think I want it. Even if it means back to gaming at 'low resolutions'. ;)
n3mo 10th January 2009, 16:48 Quote
I don't mind spending cash on hardware and couldn't care less about electricity usage, so the idea of entertainment-related remote computing doesn't appeal to me. While I use remote computing/administration extensively (not to mention parallel computing), I need to know the machines I work with and feel able to control and trust them. It's a psychological thing :)
devdevil85 10th January 2009, 22:37 Quote
Thin client gaming....never thought I'd see the day.....
airchie 10th January 2009, 23:58 Quote
There are a few advantages to this that nobody's mentioned yet.

Firstly, no more cheating in games since you're just sending key+mouse to them and getting video back.

Secondly, a standard platform so game devs can spend less time bug-hunting and play-testing and just concentrate on getting the game to actually be any good.

Thirdly, it pretty much solves the piracy problem, meaning the game devs have nothing else to blame when their crap game doesn't sell.
This could also lead to a reasonable pay-per-play model and make game demos redundant.
They could just give you 4 hours of free play to let you work out if you liked it.

Lots of potential but a worrying amount of scope for devs to abuse it to milk their customers. :(
Er-El 11th January 2009, 16:17 Quote
Quote:
Originally Posted by ch424
So basically it takes your local input (keyboard+mouse), sends it over the internet to their servers, which render the game and send it back as a video stream? That's really cool except for the bandwidth problem: most people's internet connections struggle with iPlayer, let alone the high-quality version.

Did they mention what resolutions were possible over ADSL?
Hence it obviously won't be an option for everyone right away. Also, I welcome something like this because it will encourage ISPs to finally do something about an infrastructure that will make gigabit internet (both up and down) possible where it isn't already.
perplekks45 11th January 2009, 16:30 Quote
Gigabit internet? Where? In the UK? The rest of Europe? I don't think so. Not in the near future. Germany has a 75 Mbit connection available; that's the fastest I know of. Standard in Germany is 8-16 Mbit, which is okay but nowhere near fast enough for thin-client gaming or live HD streaming.
johnmustrule 12th January 2009, 01:33 Quote
Quote:
Originally Posted by aon`aTv.gsus666
Gigabit internet? Where? In the UK? The rest of Europe? I don't think so. Not in the near future. Germany has a 75 Mbit connection available; that's the fastest I know of. Standard in Germany is 8-16 Mbit, which is okay but nowhere near fast enough for thin-client gaming or live HD streaming.

I think Japan has a 100 Mbit service available; however, the same applies: it's simply not enough. And if they had 1,000 users, wouldn't that still be like having one graphics card per computer? Not to mention lag issues and video compression, which is my main concern. To deliver me uncompressed video at the same quality I can produce on my desktop would require sending me 5-15 TB of data in only several hours. I run 1920x1200 and I'm not giving up a single pixel to MPEG - sniping would be impossible!

All that said I would greatly appreciate being able to render my 3DS max stuff on it.
perplekks45 12th January 2009, 02:46 Quote
There was a news article here on BT a while ago about 1Gbit internet to be made available in Japan. Sad we're that far behind...
mclean007 12th January 2009, 10:56 Quote
So the compute cloud decodes Blu-ray and then streams it to your PC? But if it has been decoded remotely, then the bandwidth required would be mammoth - uncompressed 24 fps 1080p needs over a gigabit per second - so they must use some kind of compression to transfer it, which (a) needs to be decompressed at the client end; and (b) will surely introduce some additional quality loss. Cool as a tech demo, but I really don't see the practical use. Remote gaming, maybe, but remote movie decoding? Nah.
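The "over a gigabit per second" figure is easy to verify; this sketch assumes 24 bits per pixel (8 per RGB channel) and no chroma subsampling, which are our assumptions rather than anything stated in the thread:

```python
# Uncompressed 1080p at 24 fps, assuming 24 bits per pixel (RGB, no subsampling).
width, height, fps, bpp = 1920, 1080, 24, 24

bitrate = width * height * bpp * fps      # bits per second
print(f"{bitrate / 1e9:.2f} Gbit/s")      # 1.19 Gbit/s
assert bitrate > 1e9                      # comfortably over a gigabit
```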
steveo_mcg 12th January 2009, 11:04 Quote
Remote Blu-ray will never happen - can you imagine the kittens the MPAA and Sony would have over sending their precious movies over the internet? Look at the lengths they went to securing playback kit in the first place.

Now, you and I know you could mostly secure it using encryption, but execs tend to have the tech smarts of a mammoth and, tbh, I'd like to keep Sony as far away from the IP spec as possible.
p3n 13th January 2009, 13:06 Quote
ISPs complain about bbc iplayer...heh.
airchie 13th January 2009, 17:14 Quote
Quote:
Originally Posted by mclean007
will surely introduce some additional quality loss.
Depends on the compression. If it's lossless compression then it'll not lose anything. It might introduce stuttering, though, if the CPU at your end isn't powerful enough to decode the compression in real time.
Which, kinda defeats the point of the whole thin-client concept... :D
mclean007 13th January 2009, 17:47 Quote
Quote:
Originally Posted by airchie
Depends on the compression. If it's lossless compression then it'll not lose anything. It might introduce stuttering, though, if the CPU at your end isn't powerful enough to decode the compression in real time.
Which, kinda defeats the point of the whole thin-client concept... :D
Show me a lossless codec that can compress full HD video to a manageable bitrate for streaming over the internet and I'll eat my own shirt. It just seems pointless - any device capable of receiving and making use of a 1080p stream will likely have sufficient power to decode the original VC-1 or H264 video stream locally, so all the AMD Cloud does (apart from act as a neat tech demo) is massively increase the bandwidth required to get the same effect.
airchie 13th January 2009, 22:31 Quote
Quote:
Originally Posted by mclean007
Show me a lossless codec that can compress full HD video to a manageable bitrate for streaming over the internet and I'll eat my own shirt.
The point is, the bandwidth isn't there yet for making any use of this tech. It's only showing it's possible if Internet speeds to the home increase significantly.

Quote:
Originally Posted by mclean007
any device capable of receiving and making use of a 1080p stream will likely have sufficient power to decode the original VC-1 or H264 video stream locally
I'm not sure about this, but I thought the main overhead with watching Blu-ray was the processing power needed to decrypt everything on the fly? Surely, if something else was doing that and just passing all the pre-decrypted content directly to something like a thin client, the thin client wouldn't need much power to simply display it?
i.e. the render cloud would be doing the work of a standalone Blu-ray player and the thin client would be acting like a TV?
perplekks45 13th January 2009, 23:11 Quote
A TV that decodes the re-encoded HD content.
I can see mclean's point. If you have to decode the stream, you'd rather not waste all that bandwidth... at least I wouldn't.
mclean007 14th January 2009, 11:05 Quote
Quote:
Originally Posted by airchie
The point is, the bandwidth isn't there yet for making any use of this tech. It's only showing it's possible if Internet speeds to the home increase significantly.


I'm not sure about this, but I thought the main overhead with watching Blu-ray was the processing power needed to decrypt everything on the fly? Surely, if something else was doing that and just passing all the pre-decrypted content directly to something like a thin client, the thin client wouldn't need much power to simply display it?
i.e. the render cloud would be doing the work of a standalone Blu-ray player and the thin client would be acting like a TV?
There are 2 main processing tasks involved in getting data from a Blu-Ray disc to a display. The first, as you say, involves decrypting the AACS and BD+ protection on the data; the second is to take that decrypted data and decode it from its lossy compressed format into an uncompressed digital data stream.

You could in theory offload the first task to a cloud computer, but there are a couple of heavy drawbacks. First, home broadband in most places (certain inner-city regions of Sweden, Japan, South Korea etc. are the exceptions) are nowhere near ready to take on streaming 1080p video at Blu-Ray bitrates. Secondly, I really can't imagine movie studios lining up to let their precious intellectual property be streamed over the web with no encryption. Thirdly, although the processing power required to decrypt AACS / BD+ in software is not insignificant, as far as I understand, it is much less demanding than the decoding of high bitrate VC-1 / h264 video. Most new graphics cards will happily offload the decoding process, and the decryption and decoding can be handled very efficiently with dedicated hardware as found in consumer Blu-Ray drives. The point is that the capability to decrypt and decode high bitrate protected content can economically and easily be incorporated into any form factor that needs it, so the idea of offloading the processing to a cloud seems rather pointless!
airchie 15th January 2009, 00:52 Quote
Does Blu-ray really use a lossy compression algorithm?
Seems daft to do that.
I thought the whole point of Blu-ray, HD DVD etc. was to allow HD content without the need for compression?

Anyway, I see your other points.
I never said it was a good idea, just that it was a nice concept.
I do still think we'll move more towards online distribution models as internet speeds get faster. :)
perplekks45 15th January 2009, 01:35 Quote
Don't know if BR is compressed, though I don't think it is... where would the sense be in that?
What we were talking about was the fact that you'd have to compress it after it has been decrypted by the cloud, to be able to send it to the client without needing a 3 Gbps connection. And that the client would then have to decompress it, which might require a pretty beefy CPU, thus killing the whole thin-client idea.
airchie 15th January 2009, 04:23 Quote
Quote:
Originally Posted by aon`aTv.gsus666
What we were talking about was the fact that you'd have to compress it after it has been decrypted by the cloud, to be able to send it to the client without needing a 3 Gbps connection. And that the client would then have to decompress it, which might require a pretty beefy CPU, thus killing the whole thin-client idea.
Isn't that what I said earlier? :)
mclean007 15th January 2009, 11:16 Quote
Uh, yes - Blu-Ray and HD DVD both use a lossy compression algorithm for video. Many discs now come with lossless multichannel HD audio, but an uncompressed video stream at 1080p24 (1920 x 1080 pixels x 24 frames per second) needs a lot more space than either optical format can provide. Dual-layer Blu-Ray maxes out at 50 GB. Ignoring audio, extra features and all, let's assume all of that 50 GB is available for the video content of, say, a 2-hour movie.

Uncompressed, the movie needs 1920 x 1080 x say 32 bits per pixel = 66,355,200 bits per frame = ~8MB per frame. At 24 frames per second, a 2 hour movie contains 86,400 frames, so that's 691,200 MB at 8 MB per frame. So you're looking at about 675 GB (1,024 MB in 1 GB) for an uncompressed movie. You need to compress it by a factor of 13.5:1 to fit on your dual layer Blu-Ray. Lossless compression generally gets nowhere near this, and in fact you need to compress it more than that because you have to leave room for audio and extras etc.

Lossy compression using a modern codec such as VC-1 or H.264 generally provides excellent results at a significantly better compression ratio than lossless can achieve.

Maybe for the next-next-gen, when we have multi-TB holographic discs, we'll have the capacity for lossless ultra-high definition video, but who knows?
Tim S 15th January 2009, 12:14 Quote
Btw, I'm trying to get video of the demo as I think it'd be useful for you guys to see. :)
perplekks45 15th January 2009, 12:49 Quote
Sorry airchie, confusion on my side then. :)

I don't really think we'll get UHDV anytime soon. 16 times the pixels of 1080p? It would definitely be nice, but HD hasn't even taken over the whole market yet... And as many people bought silly 'HD Ready' TVs with a resolution of 1024x768, they won't buy a new TV for the next couple of years because they think they're ready for HD. :|
The funniest thing is that technically XGA counts as 'HD Ready': 768 lines is enough to fulfil the requirements because it can display 720p.

Seems I got sidetracked...
mclean007 15th January 2009, 13:13 Quote
Quote:
Originally Posted by aon`aTv.gsus666
Sorry airchie, confusion on my side then. :)

I don't really think we'll get UHDV anytime soon. 16 times the pixels of 1080p? It would definitely be nice, but HD hasn't even taken over the whole market yet... And as many people bought silly 'HD Ready' TVs with a resolution of 1024x768, they won't buy a new TV for the next couple of years because they think they're ready for HD. :|
The funniest thing is that technically XGA counts as 'HD Ready': 768 lines is enough to fulfil the requirements because it can display 720p.

Seems I got sidetracked...
Don't even get me started on the whole "HD Ready" debacle - 720p has to be scaled by a weird ratio or letterboxed to fit on a 768-line display. At least 1080-line displays can show 720p with 3:2 scaling. And as for a 16:9 widescreen display with a pixel grid of 4:3 (1024 x 768)... rectangular pixels - not cool. Here's an idea for TV manufacturers: if a screen is too small to make it worth putting in a 1080p panel (let's face it, you're unlikely to see the difference on a 20" TV at 10 feet viewing distance, right?), then why not use 1280 x 720? That way 720p can be displayed full screen without scaling, 480i/p can be displayed full screen with 3:2 scaling, and 1080i/p can be displayed with 2:3 scaling.
perplekks45 15th January 2009, 21:11 Quote
Didn't mean to start a rant here. :p
airchie 16th January 2009, 00:44 Quote
Damn you aon, stop trolling! ;) :D
perplekks45 16th January 2009, 01:01 Quote
Everybody hates me... :(
Tim S 20th January 2009, 18:34 Quote
Here's the video of the demo for those interested. Apologies for the delay. :)
perplekks45 21st January 2009, 15:53 Quote
Interesting video. Thanks, Tim.

One thing though:
When they show off the live-rendered city it lags horribly. Impressive stuff, sure, but no word about bandwidth. So, again, the question: Do we all have to move to Japan to be able to stream our HD movies?
mclean007 21st January 2009, 16:14 Quote
Quote:
Originally Posted by Tim S
Here's the video of the demo for those interested. Apologies for the delay. :)
Thanks Tim - interesting to watch. I still think though that the apps demonstrated were really just intended to show the power and versatility of the Fusion render cloud. The demos were obviously running very smoothly, but then I imagine the server was sitting on the same gigabit LAN as the clients, so you'd hope it could run smoothly! To run HD content (be that pre-recorded like movies or dynamic like games) remotely, you're still going to need oodles of bandwidth - that is unavoidable.

Streaming video (even HD) over a high-speed internet connection to a browser is nothing new. The remote gaming thing is more novel, but I think latency may be a real problem for many games. It's one thing having the common FPS latency issue, where if your ping is too high you shoot people and miss because they aren't where your client software expects them to be (they have moved but your screen hasn't updated yet) - in fact, playing a game on Fusion could potentially eliminate that issue: if everyone was playing remotely using hosts located in the Fusion render node, latency between them would be minimal, just like playing on a LAN. The much more serious issue is the sluggishness introduced by the connection latency, plus the compression (at the server) and decompression (at the client) of the game's graphics for transmission along a broadband line - necessary because even the fastest domestic broadband (say 100 meg) is insufficient to transmit uncompressed video at 800 x 600, let alone 1920 x 1200. I can't see any way to avoid that.
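That bandwidth claim is easy to check: even a modest uncompressed 800 x 600 stream overwhelms a 100 Mbit line. The 24 bits per pixel and 30 fps below are our assumed figures, not ones from the post:

```python
# Uncompressed 800 x 600 video, assuming 24 bits per pixel and 30 fps.
bitrate = 800 * 600 * 24 * 30            # bits per second
print(f"{bitrate / 1e6:.1f} Mbit/s")     # 345.6 Mbit/s
assert bitrate > 100e6                   # more than triple a 100 Mbit line
```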

Don't get me wrong - I'm sure the concept of a GPU supercomputer hosting cloud computing applications has many powerful and useful applications (many of which won't even have been dreamt up by the machine's creators yet), but the idea of running FPS style games remotely through a thin client is total pie in the sky until we have sufficient market penetration of ninja fast broadband (as in gigabit) with near-zero latency.
mclean007 21st January 2009, 16:34 Quote
Quote:
Originally Posted by aon`aTv.gsus666
Do we all have to move to Japan to be able to stream our HD movies?
Well, the Blu-Ray spec allows for a maximum AV (audio and video) bitrate of 48 Mbit/s (and, disregarding bonus features etc., a dual layer 50GB disc has sufficient storage capacity that it could, in theory, carry over 2 hours of video at max bit-rate), so you'd need a hefty internet connection to stream video at that quality. However, acceptable 1080p video can typically be achieved with a significantly lower bitrate by using an advanced codec such as VC-1 or H.264 (both supported by Blu-Ray), maybe as low as 10 Mbit/s or even lower for many types of content. Remember that the more there is going on on screen (i.e. the more the image is changing frame-to-frame), the less redundant information there is so the more the image quality will suffer under heavy compression.
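The "over 2 hours at max bitrate" aside checks out, assuming the disc's marketed decimal 50 GB (our assumption; the post doesn't say which convention it uses):

```python
# How long can a 50 GB dual-layer disc sustain Blu-Ray's 48 Mbit/s max AV rate?
DISC_BYTES = 50e9         # dual-layer Blu-Ray capacity, decimal GB as marketed
MAX_AV_BITRATE = 48e6     # Blu-Ray maximum audio+video bitrate, bits/s

seconds = DISC_BYTES * 8 / MAX_AV_BITRATE
print(f"{seconds / 3600:.2f} hours")     # 2.31 hours
assert seconds / 3600 > 2                # over 2 hours, as the post says
```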

As a general rule, there is a trade-off between image quality, file size / bitrate, and decoder processing power - VC-1 and H.264 can produce excellent quality at quite low bitrates, but require a lot more processing power to decode, especially when image optimising features are used at the encoding stage, than say MPEG-2, which generally produces lower image quality at comparable bitrates. But for what I would consider Blu-Ray quality HD, you're going to need a reasonably high bitrate as well as some chunky decoding power, either in software running on a powerful CPU (or GPU) or as a dedicated hardware decoder.

I guess the beauty of an "on-the-fly" online streaming service (as opposed to simply pushing the raw HD stream from a Blu-Ray disc down a pipe) is that it would be able to adjust the stream dynamically, so if it detected that there was insufficient bandwidth (say the client reported it was stalling or its buffer was running low) then the server could reduce the bitrate until the buffer was refilled; similarly, if the client hardware was insufficiently powerful to decode certain codecs or certain advanced features or video above a given bitrate, it could adjust the stream accordingly, even reducing resolution if required. So, if you have your super-fast PC on a 100 meg connection, you get ultra-high bitrate 1080p with lots of codec optimisations for ultimate image quality, and you also get 7.1 lossless sound which you can pipe into your surround system; on the other hand, someone streaming to his EeePC over 3G gets a lower bitrate stream (so his 3G connection doesn't crap out) at a resolution scaled to fit his EeePC (no point streaming 1080p if it's only going to be displayed on a 800x480 screen, right?) using a codec that the EeePC can handle with ease; he gets bog standard stereo sound for his headphones. So every device gets a service tailored to its requirements.
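The dynamic adjustment described above can be sketched as a simple feedback loop. The bitrate ladder and buffer thresholds below are illustrative inventions, not anything AMD or OTOY published:

```python
# Toy adaptive-streaming logic: step the bitrate down when the client's
# buffer runs low, and step it back up once the buffer recovers.
BITRATE_LADDER = [2, 5, 10, 20, 40]   # Mbit/s options (illustrative values)

def choose_bitrate(idx, buffer_seconds):
    """Pick the next ladder index from the client's reported buffer level."""
    if buffer_seconds < 2 and idx > 0:
        return idx - 1                            # nearly stalled: step down
    if buffer_seconds > 10 and idx < len(BITRATE_LADDER) - 1:
        return idx + 1                            # healthy buffer: step up
    return idx                                    # otherwise hold steady

idx = 3                                # start at 20 Mbit/s
for buf in [12, 1, 1, 15]:             # simulated client buffer reports
    idx = choose_bitrate(idx, buf)
print(f"{BITRATE_LADDER[idx]} Mbit/s") # 20 Mbit/s after the reports above
```

A real service would also fold in the client's reported decode capability and screen resolution, as the post suggests, but the buffer-driven core is the same idea.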
Tim S 21st January 2009, 18:03 Quote
I'm still not convinced about the bandwidth issues, even though AMD said there wouldn't be any. I'm not buying it until I see demos across the internets and not on stage. :)
perplekks45 21st January 2009, 21:38 Quote
Did AMD give any timescale for that? They said the server will be available in H2 2009... does that mean I will be able to watch movies using it by then, or does it mean there will be one or two 'demos' I can stream?

Thanks for putting all my thoughts in one post, mclean. It was nice reading exactly what I would've said if I'd had the time. ;)
mclean007 22nd January 2009, 10:42 Quote
Quote:
Originally Posted by perplekks45
Thanks for putting all my thoughts in one post, mclean. It was nice reading exactly what I would've said if I'd had the time. ;)
Thankfully I am a ninja fast typist ;)
perplekks45 22nd January 2009, 17:43 Quote
Yea, we're very proud to have you here... the typing Ninjaaaaah! ;)
cheeriokilla 7th May 2009, 17:17 Quote
This is very exciting news - maybe game developers will stop overlooking the PC as the alpha male of the gaming industry when everyone has access to that. I'm not talking about us geeks who build our own PCs with 4870 X2s - I'm talking about the masses! Awesomeness!