bit-tech.net

AMD boasts of Vega NCU improvement claims

AMD's Vega graphics processors, featuring what the company has called Next-Generation Compute Units, have some major architectural changes - but will that translate into improved real-world performance?

AMD has officially taken the wraps off its next-generation Vega graphics processor architecture, pointing to the improvements it hopes will help it win back market share lost to discrete graphics leader Nvidia.

Vega, due to launch early this year and appearing first in the company's Radeon Instinct accelerator family, is being positioned by the company as a major leap over previous-generation graphics processors. In support of this claim, the company has treated the tech press to a technological overview of just why Vega should have a place in everything from gaming rigs to high-performance computing projects.

The company's promises for Vega begin boldly, with the claim that it includes 'the world's most scalable GPU memory architecture'. Using High Bandwidth Memory 2 (HBM2), the company is claiming doubled bandwidth per pin and an eight-fold capacity increase per stack, along with a halved footprint compared with off-chip GDDR5 memory. A high-bandwidth cache controller linked to system RAM, meanwhile, is claimed to offer smart movement of data between system RAM and video RAM, resulting in lower overall VRAM requirements when running games at Ultra HD resolution.
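
For a sense of scale, the sketch below runs a rough per-stack comparison of first-generation HBM against HBM2. The bus width, pin rates and capacities used here are assumptions drawn from publicly available HBM specifications, not figures AMD has confirmed for Vega.

```python
# Rough per-stack comparison of HBM1 vs HBM2, using public spec-level figures
# (assumed values for illustration; not AMD-confirmed Vega specifications).
def stack_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    """Peak bandwidth per stack in GB/s: pins * rate per pin / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

hbm1 = {"pin_rate_gbps": 1.0, "capacity_gb": 1}   # first-generation HBM (Fiji-era)
hbm2 = {"pin_rate_gbps": 2.0, "capacity_gb": 8}   # HBM2 at its rated maximum

for name, mem in (("HBM1", hbm1), ("HBM2", hbm2)):
    bw = stack_bandwidth_gbs(1024, mem["pin_rate_gbps"])  # 1,024-bit bus per stack
    print(f"{name}: {bw:.0f} GB/s and up to {mem['capacity_gb']} GB per stack")

# HBM1: 128 GB/s and up to 1 GB per stack
# HBM2: 256 GB/s and up to 8 GB per stack  -> doubled bandwidth per pin, 8x capacity
```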

Vega also includes an entirely refreshed programmable geometry pipeline, offering a claimed doubling of peak throughput per clock, which combines nicely with what the company describes as an 'Intelligent Workgroup Distributor' for improved load-balancing between the processor's various components and pipelines.

The meat of Vega, though, comes in what the company calls its 'Next-Generation Compute Unit', or NCU. This, the company claims, offers 512 8-bit, 256 16-bit, or 128 32-bit operations per clock, and can be split for high-performance, mixed-precision mathematics. The chip is also optimised both for higher clock speeds and for an increased number of instructions per clock (IPC), packing up to four operations into each lane where the company's previous compute units would have managed only one.
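
The arithmetic behind those figures is straightforward packed math: narrower values share the same 32-bit lanes, so halving the precision doubles the operations per clock. The sketch below reproduces the quoted numbers on the assumption of 64 ALUs per NCU with each ALU performing a fused multiply-add (counted as two operations) per clock; those are assumptions carried over from previous GCN compute units, not details AMD has published for Vega.

```python
# Worked example of the claimed NCU throughput, assuming 64 ALUs per NCU and
# a fused multiply-add (FMA) counting as two operations per ALU per clock.
ALUS_PER_NCU = 64          # assumption, carried over from previous GCN compute units
OPS_PER_FMA = 2            # one multiply plus one add

def ops_per_clock(precision_bits):
    """Packed math: a 32-bit lane holds 32 / precision_bits values at once."""
    values_per_lane = 32 // precision_bits
    return ALUS_PER_NCU * OPS_PER_FMA * values_per_lane

for bits in (32, 16, 8):
    print(f"{bits}-bit: {ops_per_clock(bits)} ops per clock per NCU")

# Matches the figures AMD quotes:
# 32-bit: 128 ops per clock per NCU
# 16-bit: 256 ops per clock per NCU
#  8-bit: 512 ops per clock per NCU
```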

Additional improvements promised in the Vega architecture include a draw stream binning rasteriser, which is claimed to both boost performance and reduce power draw, while the shifting of the render back-ends to act as clients of the level two (L2) cache is claimed to boost the performance of deferred shading.
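
The general idea behind a binning rasteriser is to sort geometry into screen-space tiles ('bins') first, then process one tile at a time so pixel data stays in on-chip cache and hidden surfaces can be rejected before they are shaded. The toy sketch below illustrates only the binning step; the tile size, data structures and names are invented for illustration and say nothing about how AMD's hardware actually implements it.

```python
# Toy illustration of binning: assign each triangle's screen-space bounding box
# to the tiles it touches, so each tile can later be rasterised in one pass.
# Tile size and data layout are arbitrary choices for this example.
TILE = 32  # pixels per tile edge (illustrative, not a Vega figure)

def bin_triangles(triangles, screen_w, screen_h):
    """triangles: list of ((x0, y0), (x1, y1), (x2, y2)) in pixel coordinates."""
    bins = {}  # (tile_x, tile_y) -> list of triangle indices
    for idx, tri in enumerate(triangles):
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Clamp the bounding box to the screen, then mark every tile it overlaps.
        x_min, x_max = max(min(xs), 0), min(max(xs), screen_w - 1)
        y_min, y_max = max(min(ys), 0), min(max(ys), screen_h - 1)
        for ty in range(int(y_min) // TILE, int(y_max) // TILE + 1):
            for tx in range(int(x_min) // TILE, int(x_max) // TILE + 1):
                bins.setdefault((tx, ty), []).append(idx)
    return bins

# Each bin can then be rasterised while its pixels stay resident in cache, and
# overlapping triangles within a bin can be depth-tested before shading.
```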

There's one thing that didn't form part of AMD's presentation, however: real-world figures. While the company has crowed about the potential improvements available from Vega as a result of its architectural changes, it has not yet put its money where its mouth is and subjected Vega-based cards to real-world benchmarking. AMD has also yet to formally confirm launch dates and card specifications for Vega's release.

10 Comments

Vault-Tec 5th January 2017, 14:04 Quote
Star Wars at 4k 60 FPS is the best they can do yet. No other benchmarks, graphs or anything else, only that supposedly it is better than Volta, a tech Nvidia have not released yet.

Why? Why do AMD continually talk crap?

Because of the Volta thing I don't believe a sausage of what they say and will be waiting until these land in the hands of reviewers.
bowman 5th January 2017, 14:28 Quote
More hype devoid of substance, and the card will arrive a full year after the 1080, giving Nvidia plenty of time not just to polish their 1080 Ti but work on their next architecture as well.

I fear this will be 'too little, too late', both for AMD, and for our hopes of competition in the GPU market.
Vault-Tec 5th January 2017, 14:42 Quote
Quote:
Originally Posted by bowman
More hype devoid of substance, and the card will arrive a full year after the 1080, giving Nvidia plenty of time not just to polish their 1080 Ti but work on their next architecture as well.

I fear this will be 'too little, too late', both for AMD, and for our hopes of competition in the GPU market.

It's not hype, it's facts, mate (yes, I am being incredibly sarcastic). Not only have they shown Star Wars at 4k 60 FPS, they also showed how well it runs Doom at 4k.

http://i.imgur.com/HtmKayP.png

Bet they didn't think people would see that...
rollo 5th January 2017, 17:41 Quote
If that Doom screen is accurate I assume it's Nightmare; if it's Ultra it's slower than the old Fury X.
Corky42 5th January 2017, 18:18 Quote
Quote:
Originally Posted by Vault-Tec
<Picture>

Bet they didn't think people would see that...

Is anything known about what setting that was, as this video shows Doom using Vulkan running at 4k@60fps?

https://www.youtube.com/watch?v=p4YvbwAHJrc

It would be very odd if they released a new card / architecture that performs worse than existing cards; isn't the Fury X able to hit 60fps@4k with Doom?
Harlequin 5th January 2017, 18:24 Quote
If that's minimum fps then it's higher on Nightmare than a GTX 1080.
Corky42 5th January 2017, 18:31 Quote
Think I managed to track down where Vault-Tec got it from; I think this sums it up nicely.
Quote:
To clarify, it's a dip that lasts only a few frames during the head popping. However, as we all know, high fps is nothing if the average min fps is bad. Hopefully it's something as innocent as a CPU usage spike, but we will have to wait and see.
A "dip" is probably not much to worry about when talking about new card with what i assume are beta drivers, time will tell i guess.
Cr@1g 6th January 2017, 13:51 Quote
And Nvidia bored us to death with Shield? I like how this is playing out, a proper game of poker this one. Nvidia didn't show their hand and I reckon AMD thought it best not to either at this stage. My money is on AMD, they're out for blood on this one and I reckon it's coming this time.
Anfield 6th January 2017, 15:51 Quote
They better bring the heat, 'cause getting to 60 FPS at 4K soon won't be enough any more:

Dell announced an 8K monitor and Asus a 144Hz 4K one...
rollo 6th January 2017, 16:59 Quote
Being equal to a 1080 6-9 months later (depending on launch) is only really good if it massively undercuts them. After the Brexit vote that is now very unlikely.

Anandtech's post suggests an April to June launch window, which would put it nearly a year later.