Gigabyte Z68 boards to use ‘virtual GPU’ software

Lucid's software will enable Gigabyte Z68 boards to seamlessly combine on-board and discrete graphics.

Gigabyte has announced that at least four of its forthcoming Z68-based motherboards will use LucidLogix’s Virtu software to combine Intel’s Sandy Bridge processor graphics unit (PGU) with any add-in GPU, creating an all-in-one 'virtual' GPU.

Lucid previously made a name for itself with the Hydra chip, which aimed to combine the power of any mix of graphics cards, regardless of manufacturer. However, the company's Virtu technology takes the form of software rather than hardware.

Lucid claims that the software ‘will dynamically balance the advanced power-efficient, built-in media features of the Intel Core processor graphics with the high-end, DirectX 11 3D, anti-aliasing and performance features of discrete GPUs, while significantly reducing the power drain of traditional entertainment desktops.’

The idea is that the software will take advantage of the media-centric capabilities of the Sandy Bridge PGU, which Intel claims is capable of encoding video faster, and with lower power consumption, than mid-range discrete GPUs. This might sound unlikely, but it is plausible.

This is how Virtu works. Easy, no?

Intel uses fixed-function logic in the PGU that's highly optimised for the task of video encoding, while Nvidia and AMD rely on shoving the task through the general-purpose stream processors of their GPUs, which can be less efficient.
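To see why dedicated logic tends to win here, consider this toy sketch (purely illustrative: zlib merely stands in for ‘compression work’, and no real encoder works this way):

```python
import zlib

# Toy contrast only: a dedicated path handles the whole frame in one
# specialised pass, while a general-purpose path splits the work into
# many small operations, each paying its own setup overhead.
frame = bytes(1920 * 1080)  # a blank single-channel 1080p frame

def encode_fixed_function(frame: bytes) -> bytes:
    return zlib.compress(frame)  # one pass through 'dedicated' logic

def encode_on_shaders(frame: bytes, block: int = 4096) -> bytes:
    # Many small general-purpose steps, as on stream processors.
    return b"".join(zlib.compress(frame[i:i + block])
                    for i in range(0, len(frame), block))
```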

Our current understanding of Z68 is that its graphics system can operate in a hybrid fashion, enabling a power-hungry discrete graphics card to shut down completely while the PGU takes over basic Windows tasks.

However, the trick of the Virtu technology is to make any transition from one graphics unit to the other seamless. According to Lucid, the Virtu software ‘is able to assign tasks in real time to the best available graphics resource based on power, performance and features considerations,’ while the display output is permanently routed through the Sandy Bridge graphics system. The company claims that the end result will be a fast PC that produces minimal heat and noise, while also reducing the power draw.
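In outline, the policy Lucid describes might look something like the following sketch (hypothetical names and task labels, not Lucid's actual code):

```python
# A minimal sketch of Virtu-style routing: media and desktop work go to
# the power-efficient Sandy Bridge graphics, 3D work to the discrete
# card, and the display is always driven by the integrated unit.
def route(task: str) -> str:
    if task in ("video_encode", "video_decode", "desktop_compositing"):
        return "igpu"   # fixed-function media features, low power
    if task in ("d3d11_render", "antialiasing"):
        return "dgpu"   # high-end 3D features and performance
    return "igpu"       # default to the low-power unit

for task in ("desktop_compositing", "video_encode", "d3d11_render"):
    print(task, "->", route(task))
# Whichever unit does the work, the finished frame is presented through
# the Sandy Bridge output, so the switch is invisible to the user.
```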

‘We are excited about our new partnership with Lucid because of the huge potential for switchable graphics in the PC DIY market, where Gigabyte motherboards enjoy considerable market share,’ said Henry Kao, vice president of R&D at Gigabyte.

Are you excited by the prospect of virtual graphics, or are you happy with what you’ve got? Let us know in the forums.

29 Comments

perplekks45 6th May 2011, 10:25 Quote
Interesting. Sounds a bit like nVidia's Optimus.
c0ldfused 6th May 2011, 10:50 Quote
Sounds like a great idea, I use my PC for surfing the net or watching movies but have a 560Ti for gaming.
Could save money on the bills in the long run.
Flibblebot 6th May 2011, 10:57 Quote
But if everything is output through Sandy Bridge, does that mean having to use the motherboard video connection? Does that mean that you'll lose the ability to use multiple monitors?

It sounds like a cool idea in theory, but I'll wait to see what it's like in practice before getting excited.
thelaw 6th May 2011, 10:58 Quote
Hybrid transitions between discrete graphics and the CPU/GPU graphics, eh?

I agree with the comments: the biggest issue will be to design the system to seem flawless, switching off the discrete graphics card when just in Windows and turning it back on when the system demands graphics processing power, without a time delay/lag... just sounds like an extra feature to charge more for a Z68 board...
Deders 6th May 2011, 11:35 Quote
Quicksync?
Bindibadgi 6th May 2011, 11:37 Quote
Quote:
Originally Posted by Deders
Quicksync?

The ASUS Z68 boards can do QuickSync even if you plug the display output into the graphics cards because they have additional power hardware; it doesn't matter what output you use. AFAIK Gigabyte boards cannot do this. Hopefully bit-tech will test this :)
TheStockBroker 6th May 2011, 11:47 Quote
From the article: "The idea is that the software will take advantage of the media-centric capabilities of the Sandy Bridge PGU, which Intel claims is capable of encoding video faster, and with lower power consumption, than mid-range discrete GPUs. This might sound unlikely, but it is plausible."

Has Bit-Tech not yet done their own tests?

Everywhere else is showing quite literally unbelievable encoding results.

TSB
xaser04 6th May 2011, 11:55 Quote
Quote:
Originally Posted by thelaw
Hybrid transitions between discrete graphics and the CPU/GPU graphics, eh?

I agree with the comments: the biggest issue will be to design the system to seem flawless, switching off the discrete graphics card when just in Windows and turning it back on when the system demands graphics processing power, without a time delay/lag... just sounds like an extra feature to charge more for a Z68 board...

This is the biggest "niggle" that affects Optimus. Users of the M11x R2 have noted many times that games simply refuse to use the discrete GPU, instead defaulting to the integrated one, regardless of following the correct Optimus-related procedures (adding the game to the "whitelist").

Hopefully this won't affect the way this (LucidLogix) is supposed to work.
azrael- 6th May 2011, 12:04 Quote
The way Virtu (and nVidia's Optimus/Synergy) works is by copying the content of the discrete GPU's frame buffer to the internal GPU's frame buffer. The problem here, as I see it, is that this may incur quite a substantial performance penalty. Although, if Virtu is a hit (and perhaps if Intel should buy Lucid or do their own Virtu), this process might move into hardware at some point.
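A minimal sketch of that per-frame hand-off (every object below is a stand-in, not the real driver path):

```python
# The discrete GPU renders into its own frame buffer; each finished
# frame is then copied into the iGPU's buffer, which drives the display.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4

dgpu_framebuffer = bytearray(WIDTH * HEIGHT * BYTES_PER_PIXEL)
igpu_framebuffer = bytearray(WIDTH * HEIGHT * BYTES_PER_PIXEL)

def present_frame() -> None:
    # This ~8.3MB copy per frame is where the performance penalty lives.
    igpu_framebuffer[:] = dgpu_framebuffer

present_frame()  # repeated every frame, e.g. 60 times a second
```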

Oh, and regarding the Z68 chipset. I haven't really been able to figure out what all the commotion is. Z68 is the same chipset as P67 and H67 as well as all the other revisions of Cougar Point. The difference comes from which parts of the chipset are fused off and which aren't. It's called artificial market segmentation, and it's a game Intel just loves to play... :)
TWeaK 6th May 2011, 12:20 Quote
Quote:
Originally Posted by Flibblebot
But if everything is output through Sandy Bridge, does that mean having to use the motherboard video connection? Does that mean that you'll lose the ability to use multiple monitors?

This. How will the GPU pass its display data to the system? Will it go back over the PCI-E connection? I know we're nowhere near saturating that connection, but it strikes me as pretty inefficient to have data shuffled around like that. Surely it would've been better to output whatever the Sandy Bridge PGU does over the discrete card.
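As a rough back-of-envelope (assuming a 1080p, 60fps desktop and a PCIe 2.0 x16 link), the copy is small next to the link's capacity:

```python
# How much bandwidth does copying each finished frame actually need?
width, height, bytes_per_pixel, fps = 1920, 1080, 4, 60
per_frame = width * height * bytes_per_pixel   # ~8.3MB per frame
per_second = per_frame * fps                   # ~498MB/s sustained
pcie2_x16 = 8e9                                # ~8GB/s per direction
print(f"{per_second / 1e6:.0f} MB/s needed, "
      f"{100 * per_second / pcie2_x16:.1f}% of PCIe 2.0 x16")
# -> 498 MB/s needed, 6.2% of PCIe 2.0 x16
```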
Bungletron 6th May 2011, 12:33 Quote
Quote:
Originally Posted by xaser04
This is the biggest "niggle" that affects Optimus. Users of the M11x R2 have noted many times that games simply refuse to use the discrete GPU, instead defaulting to the integrated one, regardless of following the correct Optimus-related procedures.

I am an M11x R2 owner, and as far as I am concerned the issue was fixed when the updated drivers were released last summer. As long as you are happy to add programs to the whitelist, I would say the Optimus system is flawless: cool and quiet doing run-of-the-mill stuff, a performance boost when discrete graphics are required, and the transition is seamless.

If this system works like Optimus it would be very useful. For example, I use two computers as a main rig and a server, and running the rig 24 hours a day is inefficient. If the discrete card was not used when not gaming, it would be far more viable to use one machine as both gaming rig and server; I would cut my power consumption in half.

I would hope that all video is routed internally (likely through the motherboard's video connector, which may negate its use in multi-monitor setups, as was mentioned). One thing I am unsure about is the mention of software: as far as I am aware, Optimus is a hardware implementation for the most part, and surely that is where the flawless, full-performance transitions must come from. I fear any implementation in software would cause overhead and cripple performance; we shall see.
dunx 6th May 2011, 12:40 Quote
Be honest! How many of you actually look at the power consumption?

dunx

P.S. i7-960 + i7-870 + i7-870 + i5-655K + Q6600...
wuyanxu 6th May 2011, 12:43 Quote
Some Gigabyte boards don't have a VGA output; most likely they just desoldered the P67 on returned boards, soldered on a Z68 and off it goes.

So Nvidia Optimus (aka Synergy) won't work on those boards any more, as they require the monitor to be plugged into the integrated VGA output. This press release is simply Gigabyte's way of saying "there, have some duct tape to fix this issue we are too lazy to fix".

Using Nvidia Optimus for Nvidia cards, and AMD's solution for AMD cards, will be much better, considering they already have strict game/driver test procedures.
Hawkest 6th May 2011, 12:54 Quote
I thought Hydra wasn't as impressive as it was made out to be... have Lucid got this right?
Deders 6th May 2011, 13:06 Quote
Quote:
Originally Posted by Hawkest
I thought Hydra wasn't as impressive as it was made out to be... have Lucid got this right?

I think that has more to do with multi-GPU gaming performance, where there will be overhead in translating the ATI and Nvidia driver code into something compatible with each other. I think with the Z68 implementation only one GPU will be used at a time, for whatever purpose it's best suited to.
Denis_iii 6th May 2011, 13:32 Quote
Is Z68 the X58 replacement?
l3v1ck 6th May 2011, 16:54 Quote
I'd be interested in this as a way to save battery life on laptops, but on desktops it isn't an issue for me, as I'd expect these boards to carry a slight price premium over the standard boards without this technology.
bobwya 6th May 2011, 18:24 Quote
Quote:
Originally Posted by dunx
Be honest! How many of you actually look at the power consumption?

dunx

P.S. i7-960 + i7-870 + i7-870 + i5-655K + Q6600...

Quite often, since we had a new electricity meter fitted which actually appears to measure our household usage (six computers at peak, plus various gadgets)... :-(
I preferred the old meter, which didn't appear to notice I was running a 4GHz Core i7 920 and dual 2.6GHz Opteron 270s, both with 8800GTXs and an HD array, 24/7... Just about as inefficient as you can get, since the motherboard the Opterons are in doesn't support AMD PowerNow! properly and the 8800GTX doesn't have a low-voltage 2D mode!
Deders 6th May 2011, 18:27 Quote
Now and then I like to see how much power certain games and processes take up. Crysis 2 consistently draws the most for me, nearly 400W from the wall. Only the last Protoss level from StarCraft 2 came close.
Guinevere 6th May 2011, 21:03 Quote
I wait with bated breath for the first benchmarks proving that under certain very specific conditions you get a 15 per cent benefit in either performance or power management, but that for everything else it either makes no difference or actually makes performance, power drain or stability worse.

Oh, and of course for the feature to last no longer than one season, before being dropped for the next big marketing thing, be it overclocking doo-dads, gaming sound cards, low-latency gaming NICs or other useless power management acronyms (UPMAs).
Sloth 6th May 2011, 21:18 Quote
Unless there are marked performance benefits (which I find unlikely versus my 5870, but who knows) I'm not terribly excited; the small power savings from a technology like this just aren't worth it, since the addition of this software will likely come with a higher price tag. If you're really trying hard to save power, there's much lower-hanging fruit in your average household.
JumpingJack 7th May 2011, 04:56 Quote
Quote:
Originally Posted by TheStockBroker
From the article: "The idea is that the software will take advantage of the media-centric capabilities of the Sandy Bridge PGU, which Intel claims is capable of encoding video faster, and with lower power consumption, than mid-range discrete GPUs. This might sound unlikely, but it is plausible."

Has Bit-Tech not yet done their own tests?

Everywhere else is showing quite literally unbelievable encoding results.

TSB

http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/9

This is one of the better, more detailed comparisons out of the SB reviews. Looking at this data, I would place SB above a high-end GPU.
StoneyMahoney 7th May 2011, 10:27 Quote
Sounds good, if it works. The manual switchable graphics on my laptop may not be the most convenient solution in the world, but it seems human beings and Optimus sometimes disagree on which GFX chip should be active, and I like my manual control, thanks all the same.

Of course, get a decision-making system that actually works properly and I'll be all over it.
slothy89 9th May 2011, 04:49 Quote
If you are concerned about power consumption and are also an avid gamer, there's an easy solution: buy a cheap basic PC for when you just want to check emails, and use your e-peen gaming machine just for games. That way you have zero chance of driver clashes, or of the software deciding that Crysis 2 will run better on the Intel GPU.

Or better yet, use your phone for email/web! Most people have smartphones these days...
ZERO <ibis> 9th May 2011, 06:43 Quote
I couldn't care less about this in my desktop unless they create a way for me to use all the GPU power I have at once, as I only care about maximum top-end performance. In addition, I run BOINC, so power efficiency is no issue, as I am donating that cost to help science.

Now for my laptop this would be great, because I don't run BOINC on that, and a laptop is something I really want to optimise the power usage of, so I can keep using it while on battery.
Xir 9th May 2011, 09:31 Quote
Quote:
Originally Posted by JumpingJack
This is one of the better, more detailed comparisons out of the SB reviews. Looking at this data, I would place SB above a high-end GPU.
SB is almost at the quality level of CPU rendering here, but much faster.
What made me wonder is the poor quality CUDA produced in that article...

The 4-in-1 comparison pictures would be great in a similar test here on bit-tech. ;)
Nikumba 10th May 2011, 10:25 Quote
This tech works wonderfully well on the 2011 MBP; the Intel HD 3000 and AMD 6750M swap with no issue, no noticeable lag or anything like that, just a little notification to tell me the card has changed.

So I am sure that if a Sandy Bridge Mac can do it, an Intel board should be able to do it. And it works just as well with an external monitor connected for dual screens.

Kimbie
Andy Mc 11th May 2011, 12:29 Quote
What's the betting that it performs worse than their last outing, the Hydra chipset?
azrael- 11th May 2011, 13:18 Quote
Quote:
Originally Posted by Andy Mc
What's the betting that it performs worse than their last outing, the Hydra chipset?
You'd lose...