Radion Vs GeForce

Catsby

One of Freddy's beloved
Joined
Apr 21, 2004
Messages
249
Catsby would like you to have a look at this link:
http://www.hardocp.com/article.html?art=NjI4

Catsby has read it through, and found the method of comparing the cards far more useful than the old benchmarking method.

Catsby hopes that this method is not so time consuming that it prevents proper comparisons being made.
 

Panda On Smack

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,030
I feel bad that I picked up on how you type your forum messages; I wasn't having a go, it just seemed to annoy me. This brings me onto my next point, which makes me feel even worse:

What's a Radion?
 

Milkshake

Loyal Freddie
Joined
Dec 22, 2003
Messages
496
I'm getting me a Radeon X800 XT Platinum 256MB for £170 :)

Gotta love having 'contacts' :D

*claps his hands like a seal*
 

Catsby

One of Freddy's beloved
Joined
Apr 21, 2004
Messages
249
Catsby is fallible. Catsby offers his profuse apologies.

Catsby clearly owns a GeForce4.
 

Escape

Can't get enough of FH
Joined
Dec 26, 2003
Messages
1,643
Escape suspects Catsby will be interested in this article, which compares the 9800 XT, X800 Pro, X800 XT and 6800U.
After a thorough trouncing, the 6800U falls to pieces with sub-standard Far Cry results, which Escape will have you believe is the only reason to buy a next-generation card.

Escape agrees the "highest playable" benchmarks are a worthy addition to graphics card reviews, but would like to see the "apples to apples" tests maintained in future reviews, for reference.
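For anyone unfamiliar with the method, Escape offers a toy sketch of the "highest playable" idea (the numbers are invented purely for illustration; it just picks the top resolution that keeps the average fps above a chosen floor):

# Toy "highest playable settings" search: pick the highest resolution
# whose measured average fps stays above a playability floor.
PLAYABLE_FPS = 40  # assumed playability threshold

# Hypothetical measurements for one card in one game (resolution -> avg fps).
results = {(1024, 768): 71, (1280, 1024): 55, (1600, 1200): 33}

playable = [res for res, fps in results.items() if fps >= PLAYABLE_FPS]
best = max(playable, key=lambda r: r[0] * r[1]) if playable else None
print(f"Highest playable resolution: {best}")  # -> (1280, 1024)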
 

Bodhi

Once agreed with Scouse and a LibDem at same time
Joined
Dec 22, 2003
Messages
9,283
Dear oh dear. That review doesn't look good for nVIDIA at all. Their latest and greatest can barely outperform their old card. Good lord, how the tables have turned.
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Aside from the fact that the HardOCP article doesn't include the 5950 (thus making comments comparing it against NVIDIA's last generation a tad retarded), the conclusory data here:


[image: HardOCP conclusions chart]


doesn't seem to indicate any kind of trouncing.

HardOCP ran beta Catalyst 4.5s against early release-candidate drivers for the 6800 Ultra (the current WHQL drivers are 61.34s, much newer), and their 6800 was apparently clocked at a 400MHz core, whereas retail cards are shipping at 440MHz.

:rolleyes:
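For a rough sense of the scale of that clock difference (a back-of-envelope sketch, assuming performance scales at best linearly with core clock, which fill-rate-bound tests roughly do):

# Back-of-envelope: best-case gain if performance scaled linearly with core clock.
review_clock_mhz = 400  # clock of the 6800 Ultra sample in the review
retail_clock_mhz = 440  # clock retail cards reportedly ship at

uplift = retail_clock_mhz / review_clock_mhz - 1
print(f"Up to ~{uplift:.0%} extra performance from clocks alone")  # -> ~10%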
 

Quige

Fledgling Freddie
Joined
Dec 22, 2003
Messages
118
... other than the GeForce 6800 only being able to manage 1600x1200 in one game, Call of Duty, which is based on an old engine, and in fact having to drop one or two resolutions compared to the X800 cards to maintain playable framerates at similar AA and AF settings. Seems fairly trounce-like to me, though the point about the underclocked core and beta drivers may be relevant.
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Quige said:
and in fact having to drop one or two resolutions compared to the X800 cards to maintain playable framerates at similar AA and AF settings.
lol, in more than half of the tests the 6800 is running equal settings to the X800 Pro - talk about selective interpretation ;)

Ah well, I'm not fussed. I've got both boards and I know which I'd rather use. In light of such bias above, I guess I should just worry about the games I play and kit my rig out accordingly ;)

Xav
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Better still, reading HardOCP's updated X800 Pro coverage I notice they're still using the beta 60.72 NVIDIA driver, and the margin between the Radeon and GeForce has closed somewhat. Roll on retail boards with proper stock clocks and WHQL driver evaluation :D

[image: HardOCP's updated X800 Pro benchmark chart]



Oh, and incidentally chaps, Crytek have acknowledged that ATI chips have issues with Far Cry and its shadowing; it seems half the shadow calculation gets dropped by the GPU, some of the symptoms of which can be seen here. So, before you go drawing any conclusions on that one, I'd wait until Patch 1.2 comes out at the start of next month.
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
GeForce 6800s hit retail courtesy of PNY this week, and X800 Pros are also now out. ATI are having temperature problems mass-producing their XTs, but they're available in quite limited volume.
 

Quige

Fledgling Freddie
Joined
Dec 22, 2003
Messages
118
Xavier said:
lol, in more than half of the tests the 6800 is running equal settings to the X800 Pro - talk about selective interpretation ;)

Ah well, I'm not fussed. I've got both boards and I know which I'd rather use. In light of such bias above, I guess I should just worry about the games I play and kit my rig out accordingly ;)

Xav
:) Not trying to get into an argument, but I was looking at the X800 XT results, which were a lot better. You're looking at the Pro, which is just as selective as me only comparing the XT. It just looks like the X800 XT is the faster card all round, if money is no object.

The second set of benchmarks does again show the cards as more equal, but there's no X800 XT for comparison.

Across the board the ATI seems to do better in more games than the GeForce; that's all I took from the graph there.
 

Danya

Fledgling Freddie
Joined
Dec 23, 2003
Messages
2,466
Given the X800s are cheaper, apparently have better image quality, are faster than the 6800s, and also only use one molex instead of two, I don't think there's really much in it TBH. I'll certainly not be buying a 6800.
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Danya said:
Given the X800s are cheaper,
The X800 XT, ATI's current top GPU (and the one NVIDIA now beat with retail clock speeds/drivers, looking at the Reg review), costs the same as the 6800 Ultra.
Danya said:
apparently have better image quality, are faster than the 6800s,
Heh, well we know that's not true, don't we ;) ATI have already been cheating with their image optimisation on the new family of GPUs, right down to passing the correct mipmaps (the method they told journos to use to verify IQ, no less) but only selectively running their AF. Their IQ can at best be described as 'different' - definitely not better ;)
Danya said:
and also only use one molex instead of two,
haha, big deal - my 350W PSU has no problems running both connectors, and with PCI Express they're both single-plug anyway

*yawn*
 

WPKenny

Resident Freddy
Joined
Dec 22, 2003
Messages
1,348
Regardless of finding an outright "BESTEST CARD EVA!" style winner, I'd be much more interested in finding out which card delivers the best performance per £.

I don't want to spend 100 quid more on a card just 'cos it's the best when it's only getting 5 fps more.
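Something like this is all I mean (made-up prices and framerates, purely to illustrate the fps-per-£ idea):

# Hypothetical prices and average framerates, purely for illustration.
cards = {
    "Card A": {"price_gbp": 299, "avg_fps": 78},
    "Card B": {"price_gbp": 399, "avg_fps": 92},
    "Card C": {"price_gbp": 399, "avg_fps": 88},
}

# Rank by value: frames per second per pound spent.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["avg_fps"] / kv[1]["price_gbp"],
                reverse=True)
for name, c in ranked:
    print(f"{name}: {c['avg_fps'] / c['price_gbp']:.3f} fps per pound")

With these invented numbers the cheapest card wins on value despite the lowest raw fps, which is exactly the point.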

Once HL2 and D3 are firmly on their way I shall be upgrading my gfx card, so this info is of great interest to me.
 

Quige

Fledgling Freddie
Joined
Dec 22, 2003
Messages
118
WPKenny said:
I don't want to spend 100 quid more on a card just 'cos it's the best when it's only getting 5 fps more.
Too true ... unless I win the pools or an aged unknown relative dies and leaves me a bunch of cash, I can't see myself getting either of them until they're at the sub-£150 price point anyway ... by which point I expect I'll have a PCI Express motherboard. Hopefully by then these issues will be clearer on a value-for-money basis. There may even be games using Shader 3.0 by then as well :)

I recently upgraded from a GeForce4 Ti to a Radeon 9800 Pro, so I'm still appreciating that improvement and the fact that I can play Far Cry with the settings up high!
 

Danya

Fledgling Freddie
Joined
Dec 23, 2003
Messages
2,466
Xavier said:
The X800 XT, ATI's current top GPU (and the one NVIDIA now beat with retail clock speeds/drivers, looking at the Reg review), costs the same as the 6800 Ultra.
I'll withhold judgement on that till I see a review with useful benchmarks, I think. 3DMark scores mean little, especially given some of the cheating used by both sides on 3DMark in the past. Halo isn't a game I'd play on the PC, let alone use as the basis for making a purchasing decision.

Xavier said:
Heh, well we know that's not true, don't we ;) ATI have already been cheating with their image optimisation on the new family of GPUs, right down to passing the correct mipmaps (the method they told journos to use to verify IQ, no less) but only selectively running their AF. Their IQ can at best be described as 'different' - definitely not better ;)
I'm going on the screenshots posted - the X800 looks better to me, thus I call it better. IQ is subjective by its very nature, especially given the complexity graphics processing is now approaching. I'm not hugely bothered whether ATI's method is considered "cheating"; if it looks better, then it's not cheating by my estimation. ;)

Xavier said:
haha, big deal - my 350W PSU has no problems running both connectors, and with PCI Express they're both single-plug anyway.
I pay for my power; I'd rather use less than more, especially given my computer frequently runs for extended periods. The gfx card is easily one of the largest consumers of power in my system, so anything which reduces consumption is good. Besides, lower-power devices are generally easier to cool, which makes for a quieter PC (a big factor for me).
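A rough sketch of the sums (the extra wattage, duty cycle and tariff below are all assumptions for illustration, not measured figures):

# Rough annual running cost of an extra 30W of graphics card draw.
extra_watts = 30       # assumed difference in draw between the two cards
hours_per_day = 12     # assumed duty cycle
pence_per_kwh = 7.0    # assumed early-2000s UK tariff

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_gbp = kwh_per_year * pence_per_kwh / 100
print(f"~{kwh_per_year:.0f} kWh/year, roughly £{cost_gbp:.2f} at {pence_per_kwh}p/kWh")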

Xavier said:
*yawn*
Didn't your mother ever teach you manners?
 

Catsby

One of Freddy's beloved
Joined
Apr 21, 2004
Messages
249
Xavier said:
*yawn*
Catsby wonders why Xavier is here if he finds the rest of us so boring and uninformed.
 

Escape

Can't get enough of FH
Joined
Dec 26, 2003
Messages
1,643
I think some people get a little too personal about this stuff. It's not likely that I'll ever pledge allegiance to an electronics manufacturer, so maybe I'm missing the point.

What I want from the next card: dual DVI outputs, video in/out and video encoding, plus good fps :p
The 6800 comes with video encoding built into the GPU, whereas the X800 XT-PE has an extra chip for it (Rage Theater). I'd be interested to see which one performs better, although I haven't seen any mention of either in reviews...
 

Danya

Fledgling Freddie
Joined
Dec 23, 2003
Messages
2,466
Yes, I was quite interested in the video encoding hardware, but nothing has really been made of it as yet. Makes you wonder if it'll ever really get used.
 

Bodhi

Once agreed with Scouse and a LibDem at same time
Joined
Dec 22, 2003
Messages
9,283
Xavier said:
Aside from the fact that the HardOCP article doesn't include the 5950 (thus making comments comparing it against NVIDIA's last generation a tad retarded), the conclusory data here:

[image: HardOCP conclusions chart]

doesn't seem to indicate any kind of trouncing.

HardOCP ran beta Catalyst 4.5s against early release-candidate drivers for the 6800 Ultra (the current WHQL drivers are 61.34s, much newer), and their 6800 was apparently clocked at a 400MHz core, whereas retail cards are shipping at 440MHz.

:rolleyes:

Ya what? I looked at their conclusory data and saw ATI's top product running near enough max settings at 1600x1200 in every game that will run that resolution, which is, by all accounts, pretty awesome. I also see nVIDIA's latest product struggling to achieve 1280x1024 across the board, which is what my card can do, and my card is fairly standard these days (9800 Pro). Now (there's a wee bit of maths involved here, so stay with me if you can) last time I looked:

Awesome >>>>>>>>>>> Standard

It's Sweden vs Bulgaria all over again. It's not even a win on points, ladies and gentlemen; it's a third-round knockout with a cheeky punch to the goolies. It's a trouncing. Thank you and goodnight.
 

JBP|

Part of the furniture
Joined
Dec 19, 2003
Messages
1,360
What interests me more is the advent of PCI Express, where I believe NVIDIA's 1st-gen cards will be AGP cards with an "added" bridge chip to convert the signalling (thus only giving AGP 8x speeds),

whereas ATI's 1st-gen card will be a "genuine" PCI Express thingymabob (running at the full x16 speed),

so already ATI have the upper hand in the next-gen cards.

Having said that, I'll be waiting for the 2nd or 3rd gen of such cards anyway.
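For reference, the peak theoretical numbers behind that comparison (a quick sketch; real-world throughput is lower than either figure):

# Peak theoretical bandwidth: AGP 8x vs first-generation PCI Express x16.
agp_8x_gb_s = 2.1       # AGP 8x: ~2.1 GB/s, shared between both directions
pcie_lane_mb_s = 250    # PCIe 1.0: 250 MB/s per lane, per direction
pcie_x16_gb_s = 16 * pcie_lane_mb_s / 1000

print(f"AGP 8x:   {agp_8x_gb_s} GB/s total")
print(f"PCIe x16: {pcie_x16_gb_s} GB/s per direction")  # -> 4.0 GB/s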
 
