bit-tech.net

Palit Revolution 700 (Radeon HD 4870 X2)

Naberius 13th January 2009, 09:00 Quote
It's not exactly the most sensible design. Most motherboards are now designed to accommodate two-slot coolers, so using a third slot means only a limited number of motherboards could run two of these together. Then there's the D-sub connection on the rear: who would spend £370 on a graphics card and not buy a decent monitor with DVI?

I think this is a great example of the underdogs proving that sometimes they should just leave it to the market leaders, and that's before you get onto the minimal performance benefits and the huge power consumption.
liratheal 13th January 2009, 09:17 Quote
I think this card is massively confused. It feels like someone mixed three design teams together and forgot to tell them they were all working on the same product.

On the one hand you've got the epic cooling team, on the other you've got the compatibility team, and rolling around on the floor is the performance team.

I have no doubt that if Palit cleaned up their evidently sloppy card design, they could well produce a decent card worthy of the price point they've gone for.

As for the bundle, I think it's refreshing to see one that isn't concerned with how many free games it can tack on as a selling point. People buying cards of this ilk usually already have the games in question. T-shirts are a no-no too, as you're then left with sizing issues. Personally, I love t-shirts, but I'll be damned if I've ever had a free t-shirt that actually fitted me. That doesn't leave much for a bundle. In my opinion, of course.

Side note: Harry, have you fallen in love with your 'e' key? :p
naokaji 13th January 2009, 09:28 Quote
Quote:
Originally Posted by Naberius
Then there's the D-sub connection on the rear: who would spend £370 on a graphics card and not buy a decent monitor with DVI?

I'm no friend of D-sub connections either, but there are still people out there who refuse to use TFTs and want to stick with their CRTs.
Baz 13th January 2009, 09:34 Quote
Quote:
Originally Posted by liratheal


Side note: Harry, have you fallen in love with your 'e' key? :p

wah?
Xtrafresh 13th January 2009, 09:38 Quote
Why on earth did you guys waste all that time putting this card through your whole benchmarking suite when it's only got a tiny bump to the memory? Normally the graphs are really interesting, but this time I skipped the lot and went straight to the conclusion.
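
For perspective, here's a rough sketch of the bandwidth maths, assuming the reference 4870 X2's 256-bit GDDR5 bus per GPU and an illustrative 900MHz-to-920MHz factory bump (the exact clocks here are assumptions, not figures from the review):

# Rough peak-bandwidth comparison for a small GDDR5 memory bump.
# Bus width is per GPU; GDDR5 transfers four bits per pin per clock.
BUS_WIDTH_BITS = 256
TRANSFERS_PER_CLOCK = 4

def bandwidth_gb_s(mem_clock_mhz):
    """Peak memory bandwidth per GPU in GB/s."""
    return mem_clock_mhz * 1e6 * TRANSFERS_PER_CLOCK * BUS_WIDTH_BITS / 8 / 1e9

stock = bandwidth_gb_s(900)   # reference HD 4870 X2 memory clock: ~115.2 GB/s
bumped = bandwidth_gb_s(920)  # hypothetical small factory overclock
print(f"{stock:.1f} GB/s -> {bumped:.1f} GB/s "
      f"(+{(bumped / stock - 1) * 100:.1f}%)")

A couple of per cent of peak bandwidth was never going to move the graphs.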

It would have been better to run all your tests, including the power consumption, at a safe 800/1000 overclock; I think that's the sweet spot for most potential buyers anyway.

Otherwise good article, and good find on that power consumption!
liratheal 13th January 2009, 09:51 Quote
Quote:
Originally Posted by Baz
wah?

"around [eurl=http://www.ebuyer.com/product/148274 ]£340 [/eurl]following " :B
Baz 13th January 2009, 10:33 Quote
Quote:
Originally Posted by liratheal
Quote:
Originally Posted by Baz
wah?

"around [eurl=http://www.ebuyer.com/product/148274 ]£340 [/eurl]following " :B

aha! Fixed.
eXpander 13th January 2009, 11:03 Quote
Did I miss it, or did you not put up any test results with the card overclocked? If I were to buy it, I would take it just for the overclock.

Too bad they didn't bundle a t-shirt! :P liratheal ;)
mauvecloud 13th January 2009, 16:23 Quote
This makes me curious whether there are any third-party VGA coolers that would fit on a 4870 X2 - it seems like most are only designed to fit single-GPU cards.
MJudg 13th January 2009, 17:47 Quote
7 for performance? Come on, bit-tech...
rollo 13th January 2009, 18:03 Quote
It deserves a 7.

No single card should be above an 8 anyway.

383 watts idle is just way, way excessive.

520 watts under load. OMG. From a single card. Whoever created this is out of their mind.
Xtrafresh 13th January 2009, 18:17 Quote
Hold it right there, rollo: the quoted usage is SYSTEM usage, not the card alone! And that's an overclocked Core i7 system, not known for its low power draw.

And what do you mean by "no single card should be above 8"? The performance rating isn't an absolute number; it's based on how well the card should perform versus how well it actually does.

I think Palit are lucky to get a 7 out of this. A card with a cooling and price premium like this should perform a lot better than a stock card, but it's nearly identical unless you start overclocking.
Bindibadgi 13th January 2009, 18:28 Quote
Quote:
Originally Posted by rollo

520 watts under load. OMG. From a single card. Whoever created this is out of their mind.

That's "at the wall" - so including Core i7 system + PSU inefficiency, which for the Galaxy DXX is about 80%
Quote:
Originally Posted by Xtrafresh

And what do you mean by "no single card should be above 8"? The performance rating isn't an absolute number; it's based on how well the card should perform versus how well it actually does.

Exactly ;)
wolfticket 13th January 2009, 21:23 Quote
It's been mentioned above, but I thought I'd reiterate: pages 3-14 are one of the most thorough and comprehensive proofs of the bleedin' obvious I have ever seen.

I admire your benchmarking professionalism though ;)

I wonder if, with some decent case airflow, you could rip the shroud and the fans off and have a passive card that doesn't burst into flames. Not sure I'd like to try it myself, however.
Bindibadgi 13th January 2009, 22:07 Quote
Quote:
Originally Posted by wolfticket
It's been mentioned above, but I thought I'd reiterate: pages 3-14 are one of the most thorough and comprehensive proofs of the bleedin' obvious I have ever seen.

Sometimes a custom power circuit throws up a quirk, which is why we have to test it all. In addition, if we didn't post it all, other people would get fussy :p
HourBeforeDawn 13th January 2009, 22:28 Quote
Lol, the VGA port is funny. Why even bother? They should have just dropped it and opened up that portion for more ventilation.
Xtrafresh 13th January 2009, 22:42 Quote
The VGA port is there for two reasons:
1: CRT zealots.
2: Benchmarkers using cheap, generic 19" screens from the office.
Aragon Speed 14th January 2009, 01:09 Quote
I have a medium level of hardware knowledge, meaning I understand most of what is said, but some things still elude me, so I have to ask.

I understand the derision the inclusion of a VGA port deserves when Palit could have added a £2.50 DVI-to-VGA converter to the bundle instead, but why would you need more than one DVI port unless you are running a multi-monitor setup? I can see the extra port's usefulness if that is what you want to do, but in my experience the common gamer would rather have one very good, large monitor, and very rarely has more than one.

So while the option would be nice, it is hardly a point that deserves a huge amount of attention. A mention for those who want to run more than one monitor, yes, but a lot of complaining? Or am I missing something here?
frojoe 14th January 2009, 02:53 Quote
Quote:
Originally Posted by Aragon Speed
I have a medium level of hardware knowledge, meaning I understand most of what is said, but some things still elude me, so I have to ask.

I understand the derision the inclusion of a VGA port deserves when Palit could have added a £2.50 DVI-to-VGA converter to the bundle instead, but why would you need more than one DVI port unless you are running a multi-monitor setup? I can see the extra port's usefulness if that is what you want to do, but in my experience the common gamer would rather have one very good, large monitor, and very rarely has more than one.

So while the option would be nice, it is hardly a point that deserves a huge amount of attention. A mention for those who want to run more than one monitor, yes, but a lot of complaining? Or am I missing something here?

A gamer, yes, but photo editing, 3D design and many other uses that require some graphics power could make use of multiple monitors, especially with a card this powerful.
Aragon Speed 14th January 2009, 04:12 Quote
Quote:
Originally Posted by frojoe
A gamer, yes, but photo editing, 3D design and many other uses that require some graphics power could make use of multiple monitors, especially with a card this powerful.
As I said, a mention for those who need two DVI ports is important, but that is not the majority of people. ;) And about 90% of the people who have that type of job don't buy the hardware themselves - the company does - and I don't know of many companies that would be buying 4870 X2s.

So again I ask: why do so many people focus on this particular aspect of a card as a major downside? (And I don't just mean in this thread, but in general.) I'm not trying to be a pain, just trying to understand the general point of view.

Having the extra port is a bonus and a good thing, but the lack of one does not seem to be that great an issue for the general populace, so it makes me wonder why so many condemn a card for not having it.

I do a lot of texture and 3D work myself, so in my case the extra port is important and not having it would put me off buying this card (so I understand your point about certain people needing the two DVI outputs), but I am talking about the majority of people here, not the minority.

It's the majority that seems to have the issue with the lack of a second DVI output rather than the minority, so I am trying to see why. :)
wolfticket 14th January 2009, 17:02 Quote
Remember, it is easy and cheap to get an HDMI-to-DVI adaptor or cable, as the two are basically the same signal (except for the connector and the lack of audio on DVI).
So this card doesn't really lack a second digital output; it just adds DisplayPort and VGA.
Xtrafresh 14th January 2009, 17:13 Quote
The reason for all the complaining is not that many of us have a second display, but that most people who buy this card would like to have one. This card does not appear to have the connectivity (though in fact it does, as wolfticket pointed out), so it will put some potential buyers off, especially since two DVI ports plus component output has been the industry standard for ages now.
eXpander 14th January 2009, 18:27 Quote
I work in a computer store, and most of the monitors we have in stock have a VGA connector rather than DVI - most of the cheaper 19" and 22" models only come with VGA.

So, from my perspective, it's a bonus to have VGA on the board and not have to use a DVI-to-VGA adapter.