Core Clock: 825MHz
Memory Clock: 1,800MHz
Warranty: Three years parts and labour
Nvidia has been pretty dominant at the top of the graphics card market ever since the GeForce 8800 GTX launched back in November 2006, and it's fair to say that AMD has struggled a little in recent times. R600 wasn't fast enough, while RV670 offered a great balance of features and performance per watt, but it was never going to win any performance medals at the high end.
However, when AMD paired a couple of RV670 chips together to create R680, or the Radeon HD 3870 X2, it ended up giving Nvidia a brief run for its money. ATI's first dual-GPU card since the ATI Rage Fury MAXX was competitive with cards like the GeForce 8800 GTX and GeForce 8800 GTS 512, but it fell down when it came to price.
Both competing cards from Nvidia were available for less and the Radeon HD 3870 X2's shaky performance in a couple of titles led to us recommending Nvidia's cards over the new dual-GPU solution from AMD. Things have moved on a bit since the end of January and we're now three months into the Radeon HD 3870 X2's lifespan, so it's as good a time as any to see how things have progressed.
What's interesting is that, unlike many of AMD's partners, Asus has taken the word innovation to heart with its EAH3870 X2 graphics card – not only is there a new cooler, but the company has also added support for up to four digital displays on the card. Yes, you read that right – this graphics card has four HDCP-compliant dual-link DVI-I ports.
Because of this, the PCB has been slightly re-worked to accommodate the additional display connectivity, and the company's engineers have also taken the time to fix one of our biggest annoyances with the reference Radeon HD 3870 X2 card: the power connectors. On the AMD-designed board, the power sockets sprouted out at 90 degrees to the PCB, which made cabling a bit of a nightmare – especially when you've got a pair of them in CrossFireX. Asus has corrected this by rotating the connectors 90 degrees so that cables plug into the top edge of the PCB – it's a little thing, but it makes quite a big difference when you're installing a Radeon HD 3870 X2 in a case where the graphics card sits below the power supply.
While this is a step forwards, we can't help but feel that the cooler is a bit of a step backwards. Instead of just one fan, the Asus-designed board features two, which in theory means twice as much noise – or at least double the number of points of failure – but there is a good reason for it: because Asus has opted to fit four DVI ports, air can no longer be exhausted out of the chassis through the PCI bracket.
This reasoning alone doesn't make it a good solution though, and to compensate for the non-existent exhaust, Asus has implemented quite a novel heatsink design under the two fans. There are two separate aluminium radiators, each connected to a copper cooling plate via a couple of heatpipes that are flattened at one end. These help to transfer heat away from the GPU cores and into the flow of cool air as quickly as possible.
Anyway, before I get too far away from the point of noise, I should mention that, based on testing the two cards side by side, this card is a little louder than a Radeon HD 3870 X2 1GB with the reference cooling solution. However, it's nowhere near loud enough to warrant sticking a pair of scissors in the fans to get some peace and quiet. What's more, once you factor in the other noise in your system and close the side panel, you won't be able to hear the fans spinning away.
Interestingly, on the rear of the card, Asus has done away with the black cooling plate that covered most of the back of the reference card, which makes us wonder whether it was needed in the first place. However, I have a thing about bare memory chips (having had a few cards die over the years because of exposed memory going bad), and purely from my own experience, I would have liked to see something that at least protects the DRAMs on the back of the card – regardless of whether or not it actually aids their cooling.