bit-tech.net

Gigabyte GA-Z68AP-D3 Review

Comments 1 to 12 of 12

Hustler 11th October 2011, 13:24
"they only have Crossfire X certification - there's no SLI."

Oh, great, give us the inferior dual-card system, why don't you...

If you're only going to give us one, then at least let it be Nvidia's, which seems to be a far more stable and reliable dual-card system.

Seriously, Gigabyte... I do wish you'd gone with SLI.
Xir 11th October 2011, 13:44
Z68 boards have been just a tenner or so more expensive than comparable P67 boards, so I really can't hear this any more:
Quote:
"it doesn't add many useful features over Intel's P67 chipset, despite the extra cost."
Smart Response (and sometimes encoding on the iGPU) for very little more. How is this not a win?
favst89 11th October 2011, 13:49
I guess more features without much added cost is good, especially while maintaining the overclocking ability.

As for CrossFire vs SLI, I haven't tried either, so I can't comment on which is better.
I believe SLI certification requires two slots with x8 lanes each, whereas CrossFire only requires x8 and x4. I assume this means extra cost for components/wiring of some form. It may also cost Gigabyte a fee to have the certification in the first place.
damien c 11th October 2011, 14:56
SLI or Crossfire on Sandy Bridge is not as good as it could be, simply because I don't think any of the boards will give it a true x16 for each slot.

I know when I was running SLI'd GTX 580s, whenever I installed the Nvidia drivers I always got a message that the bottom card was running slowly because it wasn't in a high-speed slot.

As for this board not having SLI, it could put off buyers who may want to SLI some cheap Nvidia cards, like GTX 560s, but what it offers is fine for those who decide to use two ATI cards or a single GTX 590.

Seems like a decent board for someone on a budget, but anyone with enough cash to build something like an SLI rig needs to spend more on a board, and unless you can get a board that gives both cards x16 speed, I think they should look at the X58 setups, as those will give each card the full speed available to them.

Good job Gigabyte, keep it up!
hurrakan 11th October 2011, 15:57
Why weren't the benchmarks for the "Asus Maximus IV Gene-Z" included in the charts in this review?
hurrakan 11th October 2011, 16:01
"The one useful feature the more expensive chipset has is Intel Smart Response"

This comment is misleading. It's only true for people who have a single, small-capacity SSD; for anyone with several SSDs, Smart Response is useless.
Christopher N. Lew 11th October 2011, 17:43
Does Smart Response make a difference in the real world?
Xir 12th October 2011, 10:02
Quote:
Originally Posted by Christopher N. Lew
Does Smart Response make a difference in the real world?
For Smart Response, I think there was a test right here on bit somewhere?
For Virtu-for-transcoding, check Tom's or Anand (as the bit article tested the iGPU and dGPU the wrong way round).
:'(
phuzz 13th October 2011, 21:51
Does SLI or Crossfire actually work yet?
As in, does it actually provide faster frame rates than a comparable-cost single-card system, without loads of headaches?
YingKo 14th October 2011, 05:07
Seriously, a parallel port?
fluxtatic 15th October 2011, 06:38
If you're dropping $120 on a mobo, I doubt you really care much about XFire or SLI, honestly. Are you really going to cheap out on the board and then deal with dual-card headaches on cheap cards? Why not buy a single, better card in the first place? If money were no object I might go in for a multi-card setup, but it seems to be demonstrated over and over again that it doesn't scale as nicely as you might like. Aside from that, driver issues seem to be a constant pain. Sure, some games will scale monstrously under XF/SLI, but it mostly seems to be an e-peen thing.

As to favst89's comment, that may be the case (cba to check atm), but the board should be able to provide that, since SB does have 16 lanes (or are there other components eating into those lanes?). I would guess he's dead on about the fee part, though. Maybe XFire is just cheap enough that Gigabyte can make it a tick-box feature. Even an enthusiast might think it pretty lame that their shiny new mobo won't do either one, whether or not it matters to them personally. If I could have got my new board for a few bucks less without XF, I would have in a heartbeat, if that was the only thing that got dropped. (I do find it funny that mine is quad-GPU certified when the slots run at x16/x4, though - how well would that even work, shoving two 6990s in there? Not to mention spending 10x the cost of the board on the video cards. Yes, I know the 6990s aren't the only dual-GPU AMD cards, but shiny new mobo! Need shiny new video cards!)

Word to YingKo - why the hell would I be buying a Z68 board if I needed a parallel and/or serial port? In most scenarios I can think of, Z68 would gain me no advantage in that sort of setup - better off getting a low-end ASRock or Biostar AMD board and shoving an Athlon II X2 in there, which would be a hell of a lot cheaper than even a cheap Intel board. Not counting a CNC controller or similar, I suppose - who was it mentioning their greatly stripped-down XP install running their CNC machine? Total focus on running the CNC control so the part doesn't get hosed by WinUpdate being 'helpful' or the like. Would the superior platform help in that situation?
accountlink 1st November 2011, 21:16
What heatsink/cooler are you using to achieve 4.9GHz with this Gigabyte mobo?

I have the Gigabyte GA-Z68AP-D3-B3 with an i5 2500K, using a Cooler Master Hyper 212+, and I'm able to overclock to 4.5GHz with no problems.

I'd like to repeat that success at 4.9GHz if possible, using the cooler you guys tested with. Thanks.