bit-tech.net

Gigabyte anticipates quad CrossFire

Gigabyte GA-M790-DQ6 board offers the potential of quad CrossFire.

It's been quiet on the AMD front recently, but suddenly the new RD790 has reared its head on the Gigabyte stand this morning. ATI hinted at quad CrossFire in Tunisia, at the HD 2900 XT launch, but nothing has been said since then.

Gigabyte’s AMD Product Manager fully expects quad CrossFire in the near future, and the board shown has four slots, running either two at x16 or all four at x8. We can only hope it performs better than the fiasco that was Quad SLI.

The board also uses PCI-Express 2.0, which offers twice the bandwidth of the original spec for the same pin count and a maximum slot power of 150W. RD790 is the new chipset and will beat Intel's X38 to market; it also supports AM2+ and HyperTransport 3.0 to match AMD's Phenom range.

Naturally, until quad CrossFire arrives you've got the choice of using normal CrossFire, or simply running four graphics cards for eight displays, as Asus has already shown with the Intel 975X workstation board it released last year.

The GA-M790-DQ6 is built with all-Japanese solid-state SMD capacitors (like all DQ6 boards), as well as copper heatpipes, Realtek ALC889a audio and dual PCI-Express Gigabit Ethernet.

Interested in the potential performance benefits of quad CrossFire or think it's yet another excuse to try and pry more money out of us? Let us know in the forums.

13 Comments

Gravemind123 6th June 2007, 06:06 Quote
How is that going to work with HD 2900XTs? You would have to replace the cooler on it to a single-slot solution, and would pretty much mean water cooling for a card of that energy use. Although I suppose if you can afford quad HD 2900s that isn't much of an issue for you. Interesting idea, could you have oct-crossfire if you used 4 X1950Pro duals or the dual HD 2600 Sapphire is going to make? I would love to see oct-crossfire in action!
Tim S 6th June 2007, 06:41 Quote
Quote:
Originally Posted by Gravemind123
How is that going to work with HD 2900XTs? You would have to replace the cooler on it to a single-slot solution, and would pretty much mean water cooling for a card of that energy use. Although I suppose if you can afford quad HD 2900s that isn't much of an issue for you. Interesting idea, could you have oct-crossfire if you used 4 X1950Pro duals or the dual HD 2600 Sapphire is going to make? I would love to see oct-crossfire in action!
I did mention this in my pre-show computex coverage, but here we've actually got pics of the board running :)
Gravemind123 6th June 2007, 06:59 Quote
I would like to see some pictures and benchmarks of the board up and running. Did you get any word there on possible octo-crossfire?(sorry for being repetitive, but the thought of it is pretty cool).
Tim S 6th June 2007, 07:06 Quote
Quote:
Originally Posted by Gravemind123
I would like to see some pictures and benchmarks of the board up and running. Did you get any word there on possible octo-crossfire?(sorry for being repetitive, but the thought of it is pretty cool).
We can't benchmark the board at the moment, since it's unreleased hardware (I've already asked if we could...). I'm not sure about Octo CrossFire, but you'd be waiting on drivers for a while, since quad-CrossFire drivers (for the dual HD 2600 XT) are months away. Also, that card has a dual slot cooler on it, so you'd not be able to get a pair in Gigabyte's board.
HourBeforeDawn 6th June 2007, 10:05 Quote
ya I was just about to say something about that, would be sick to see octal actually work on that, be insane numbers not to mention the ability to have 16 DVI monitor support lol
Dr. Strangelove 6th June 2007, 11:05 Quote
Quote:
...or think it's yet another excuse to try and pry more money out of us?

No I think it's because the "Canadians" are hoping for some more global warming, so that they don't freeze so much in the winter. With the power needed to run quad CrossFire you would think that governments would soon start charging greenhouse tax on our GFXs
r4tch3t 6th June 2007, 11:25 Quote
What are the other PCI-E slots at when in 16x-16x mode?
And octo wouldn't work even if you replaced the cooler, the PCI bracket is a dual one and there are DVI ports on both.
Still Quad is just money grabbing, even SLI/Crossfire are really just money grabbers.
Hugo 6th June 2007, 11:34 Quote
Not to mention that, if you think about it, the cost vs performance figures are going to be terrible. The processing overhead for keeping 4 cards in sync has got to be exponentially higher than for two cards, so you're probably looking at getting maybe 3x the performance of a single card rather than 4x.

Of course until we actually see it working that's just a hypothesis, but it's a reasonable assumption based on current data.
DXR_13KE 6th June 2007, 13:13 Quote
this will need a thermonuclear reactor to feed it.
Toka 6th June 2007, 15:28 Quote
Quote:
Originally Posted by Dr. Strangelove
with the power needed to run quad crossfire you would think that the governments soon would start charging greenhouse tax on our GFXs

rofl

Paradigm Shifter 6th June 2007, 16:53 Quote
Not so sure it'll be as power-hungry as many are worried about. Since it can only cope with single-slot coolers, that limits the really ultra-power-hungry cards right off the bat. And the HD2400 and 2600 are supposed to be cool running chips.
devdevil85 6th June 2007, 17:35 Quote
Yeah once the 65nm R650 comes out, heat will be less (as one would expect) and who knows, maybe driver support for Octo (which is completely insane). Would the two extra slots that are x8's be bandwidth limiting? If so, would it just be best to wait for a mobo that supports (4) x16's? In my next rig, I want a lot of headroom for the future and for OCing. I just don't want to get bottlenecked right from the start. It would be a wise investment to go with a 4-slot PCI-E 2.0 config that can support CrossFire because you can continually upgrade every year or so and notice improvement....well at least I think you would, or would all the new cards run at the same clock as the slowest? O man, somebody help me figure that one out cause that would suck.
Jipa 6th June 2007, 23:57 Quote
TBH I don't get it. Like the current SLI-systems wouldn't heat up like hell and be powerful enough?

Uh oh.. wait... AHA it's an ATI/AMD system, sure you need four of those to compete with the half-a-year-old 8800GTX SLI! That explains it.. But still, has someone actually been waiting for such a board?