bit-tech.net

R600 family has independent video processor

All your video are belong to UVD... in the R600 family, at least.

Last night, AMD announced that its upcoming R600 graphics processor family will feature dedicated universal video decoding hardware that works alongside Avivo.

Traditionally, even if the graphics card is capable of accelerating video decode, it typically handles only around 50% of the work – the rest is picked up by the CPU.

With AMD’s R600 family, the company’s engineers have designed a product that can almost completely remove the CPU from video processing: colour correction, scaling and playback of common codecs will all be handled by the GPU.
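
As a rough illustration of that division of labour – our own sketch, not AMD's published breakdown – the stages of playback might map onto the hardware like this:

```python
# Illustrative sketch of where each video-playback stage could run on an
# R600-class part. The stage list and the assignments are our assumptions,
# not an official AMD breakdown.

PLAYBACK_STAGES = [
    ("bitstream parsing / entropy decode",       "UVD (fixed-function)"),
    ("inverse transform + motion compensation",  "UVD (fixed-function)"),
    ("deblocking filter",                        "UVD (fixed-function)"),
    ("colour-space conversion",                  "GPU (Avivo)"),
    ("scaling / deinterlacing",                  "GPU (Avivo)"),
]

for stage, unit in PLAYBACK_STAGES:
    print(f"{stage:<42} -> {unit}")
```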

The problem with the Radeon R5xx line is that only the top-end GPUs have enough horsepower to decode 1080p. Because the universal video decoder (UVD) is separate from the 3D engine, even the most basic R6xx is capable of handling full 1080p high-definition playback.
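
For a rough sense of the workload a dedicated decoder takes off the CPU, consider a back-of-envelope count of H.264 macroblocks per second (the resolutions and frame rates below are illustrative, not AMD's figures):

```python
# Back-of-envelope estimate of H.264 decode workload at various resolutions.
# All figures are illustrative assumptions, not vendor specifications.

MB_SIZE = 16  # H.264 macroblocks are 16x16 pixels

def macroblocks_per_second(width, height, fps):
    """Macroblocks the decoder must process every second."""
    mbs_x = -(-width // MB_SIZE)   # ceiling division
    mbs_y = -(-height // MB_SIZE)
    return mbs_x * mbs_y * fps

for label, (w, h, fps) in {
    "480p DVD ": (720, 480, 30),
    "720p HD  ": (1280, 720, 24),
    "1080p HD ": (1920, 1080, 24),
}.items():
    print(f"{label}: {macroblocks_per_second(w, h, fps):>9,} macroblocks/s")
```

At 1080p the decoder is chewing through roughly five times the macroblock rate of a DVD, which is why only the top-end R5xx parts could keep up without fixed-function help.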

AMD has also integrated an audio controller into its low- and mid-range RV6xx graphics cards, as those GPUs natively support HDMI.

Since HDMI carries both video and audio streams over one cable, this is worth taking advantage of, especially in HTPCs. With an internal audio controller, the sound loop-back no longer requires external cables, and the audio is synced up in the graphics software before the frames are sent out.

This is a far superior way of doing things for two reasons: first, there are no messy cables running out of your sound card and back into your graphics card; second, it avoids sync issues. Video can take longer to process than audio, especially if you’re upscaling or applying a lot of post-processing effects, so you want software to manage the audio and sync it with the processed video before it hits your screen.
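
To make the sync point concrete, here is a minimal sketch of the idea – hold the audio back by the video pipeline's latency so both streams present together. The 120ms latency figure and the timestamps are invented for illustration:

```python
# Minimal lip-sync sketch: video frames emerge from the pipeline late
# (decode + scaling + post-processing), so audio must be held back by the
# same amount before the combined stream leaves the card. The 120ms
# latency and the timestamps below are illustrative assumptions.

VIDEO_PIPELINE_LATENCY_MS = 120

def presentation_times(timestamps_ms, delay_ms):
    """Return (original_pts, actual_presentation_time) pairs."""
    return [(pts, pts + delay_ms) for pts in timestamps_ms]

frames = [0, 33, 67, 100]  # matching audio and video timestamps (ms)

# Uncorrected: audio plays as soon as it's decoded, video arrives 120ms
# later, so the sound leads the picture.
audio_naive = presentation_times(frames, 0)
video_out = presentation_times(frames, VIDEO_PIPELINE_LATENCY_MS)
drift = video_out[0][1] - audio_naive[0][1]
print(f"uncorrected: audio leads video by {drift}ms")

# Corrected: delay the audio by the video pipeline's latency.
audio_synced = presentation_times(frames, VIDEO_PIPELINE_LATENCY_MS)

for (pts, a), (_, v) in zip(audio_synced, video_out):
    assert a == v, "audio and video should present together"
    print(f"pts {pts:>3}ms -> audio and video both presented at {a}ms")
```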

This audio syncing technology isn’t wholly new; it’s essentially the same as we saw in the AMD 690G integrated graphics chipset last month.

15 Comments
dgb 16th March 2007, 09:15 Quote
I'll really start to care when they get a product out. Interesting enough, indeed good to hear about it because it will force nVidia to think about responding with similar technology, but it all looks like damage control over the late launch now. :(
DougEdey 16th March 2007, 09:48 Quote
It sounds like they are still in R&D
Fod 16th March 2007, 10:03 Quote
nah my guess is that they wanted to wait 'til launch to drop all these awesome bombs about the tech, but were held back from launching for whatever reason, so to spoil nvidia's party they're having to release some info to keep the punters hot.
r4tch3t 16th March 2007, 10:25 Quote
What's the audio going to be like on these?
Saivert 16th March 2007, 10:51 Quote
This is awesome news. Finally some people got it right. We need dedicated video processors. PCs start to look more like consoles nowadays with multiple cores and multiple specialized processors. This is the future. Old-thinking generic processing cores just don't cut it anymore with ever more demanding games and video codecs.

In order to play a full 1080p VC-1 HD DVD right now you need at least a 2GHz dual-core CPU and a powerful graphics card with HDCP. Is anyone surprised that AMD suddenly realised an independent (dedicated) video processor is the solution?

I don't believe GPUs handle 50% of the job of playing back video. IMO GPUs only deinterlace and do IVTC; decoding of H.264 or MPEG-2 is still handled by the main CPU. I've got an NVIDIA 7800 GTX 256MB based graphics card and it just can't play back 1080p AVC and MPEG-2 even when hardware acceleration is activated. The only player I have tested hardware-accelerated AVC decoding with is Nero ShowTime, which indicated that this is enabled during playback. I'm using NVIDIA's own PureVideo decoder for MPEG-2 acceleration when watching MPEG-2 .ts (transport stream) movies and DVDs. 720p is just fine, but 1080p stutters and lags.
I will add that I have an AMD Opteron 150 single-core CPU @ 2.6GHz (OC) and 2GB RAM.
Paradigm Shifter 16th March 2007, 11:24 Quote
Quote:
Originally Posted by Saivert
PCs start to look more like consoles nowadays with multiple cores and multiple specialized processors.
Surely this line should be the other way around? PCs have had multiple CPUs for far longer than consoles have; it's just that recent (ie: 'next gen') consoles took the lead for a while because they are (arguably) so ludicrously overpowered. ;) Intel seems dead set on making dual-core PCs standard in just about every home, and quad-core affordable. :D

...

This is a good thing for AMD/ATi - they seem to be trying to get a lot of things out into the wild at the minute that their competitors don't have... however, it would be nice to see some hardware, rather than just endless press-releases... :)
EQC 16th March 2007, 11:46 Quote
Quote:
Originally Posted by the article
Because the universal video decoder (UVD) is separate from the 3D engine, even the most basic R6xx is capable of handling full 1080p high-definition playback.

Nice! It's about time. I don't game, so I've no need for a high end video card...but I'd like good video playback. If they stick one of these chips in a sub-$80 video card, and it can decode 1080p H.264 without leaving the CPU with work to do, I'll be very very happy. Especially if the video card, being cheap/low-power, is also passively cooled.
flabber 16th March 2007, 15:23 Quote
Something I don't understand... there are HD recorders that can play 1080p video (as long as the recorded/copied movie was already 1080p, of course). Such a box doesn't have a big video card in it, and isn't that power-hungry.

Then how is it possible that PC users are "struggling" to get 1080p on their monitors? Am I wrong to think that someone with a 30" monitor and a video card that supports it can play PC games at the native resolution? I mean, 20" and bigger monitors have been around since well before 2007, and we've seen vertical resolutions of 1050 pixels for a few years now. So why is it suddenly so hard to view 1080p movies on a PC?

Maybe it's a silly question, but with similar monitor resolutions having been around for quite some time, and gaming at those resolutions having been possible for a long time as well... it just seems weird.
r4tch3t 16th March 2007, 16:50 Quote
The standalone players have a specialised chip dedicated to HD decoding – that's exactly what's being implemented here.
DougEdey 16th March 2007, 16:56 Quote
Thinking about it, surely the existing hardware is more than powerful enough to do it. Why bother adding more hardware, more complexity and more cost to an already powerful unit?
Tyinsar 16th March 2007, 17:09 Quote
Quote:
Originally Posted by EQC
...If they stick one of these chips in a sub-$80 video card, and it can decode 1080p H.264 without leaving the CPU with work to do, I'll be very very happy. Especially if the video card, being cheap/low-power, is also passively cooled.
I suspect that you're not alone in that. It would be silly if this is only for high-end cards (that shouldn't need it).

What interests me is the built-in sound. If it's of any decent quality, I could see this as a challenge to Creative. Perhaps Nvidia would have to bring back an integrated version of SoundStorm as a counter to this.
Ramble 16th March 2007, 19:28 Quote
Awe-some!
Bindibadgi 16th March 2007, 19:32 Quote
Quote:
Originally Posted by r4tch3t
What's the audio going to be like on these?

It just pulls whatever sound your chosen soundcard produces and pushes it internally rather than externally. I've no idea HOW it works exactly – I'm guessing it lets the soundcard do what it does and then just grabs the output signal – but we won't know until the R6xx launch.
Kipman725 16th March 2007, 21:10 Quote
The audio quality should be perfect as it's digital, so the sound quality should depend on your decoding hardware. Although I say "should" with a great deal of emphasis – there are many ways of losing information.
Tyinsar 16th March 2007, 21:54 Quote
Quote:
Originally Posted by Bindibadgi
It just pulls whatever sound your chosen soundcard produces and pushes it internally rather than externally. I've no idea HOW it works exactly – I'm guessing it lets the soundcard do what it does and then just grabs the output signal – but we won't know until the R6xx launch.
Thanks for the clarification. I'm now wondering what effect this will have on Digitally Restrictive Mangling.