X800XT v 6800 Ultra

Status
Not open for further replies.

Ukle

One of Freddy's beloved
Joined
Dec 22, 2003
Messages
410
Ok, with both cards being released in the UK within the next few weeks, I'm finally going to get my hands on one... question is which one to go for.

Both are similar on Performance it seems.

X800XT wins slightly on price (40 quid difference)

X800XT wins on not having silly high 12 volt power requirements

6800 wins on potentially better long term...

So far I'm opting for the 6800, as I'm not a fan of ATi due to the various driver issues I've had with every ATi card I have ever owned, while the 3 GeForces have never been a problem at all. I've also got a PSU that should be able to cope with the 12 volt requirement, although it's a bit low on volts (11.76) at the moment, even though it supplies 35 amps (I think) on the line...
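Ukle's headroom worry can be sanity-checked from the numbers in the post. A minimal sketch in Python, assuming the sagging 11.76 V reading and the 35 A rating he quotes, plus a roughly 110 W worst-case board draw (my guess for a 6800 Ultra, not an official figure):

```python
# Hypothetical 12 V rail headroom check for a 6800 Ultra.
# The 11.76 V and 35 A figures come from the post above; the 110 W
# board draw is an assumption, not a quoted spec.

def rail_watts(volts, amps):
    """Power available on a PSU rail in watts (P = V * I)."""
    return volts * amps

available = rail_watts(11.76, 35)  # sagging 12 V rail from the post
gpu_draw = 110                     # assumed worst-case board draw (W)

print(f"Rail supplies {available:.0f} W; headroom {available - gpu_draw:.0f} W")
```

Even with the rail sagging to 11.76 V, a genuine 35 A rating leaves plenty of margin; the more common problem with cheap PSUs of the era was label amperage being optimistic.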

So what's your opinion then?
 

fatbusinessman

Fledgling Freddie
Joined
Dec 22, 2003
Messages
810
It's a bit of a gamble really - the Pixel Shader 3.0 effects on the GeForce look damn impressive, but then you have to hope that enough games companies take advantage of them to make it worthwhile (tech demos are pretty, but usually a bit slight on gameplay).

Personally I'd say go for the 6800, but then again I'm being a tightwad and buying a second-hand 5700 Ultra :)
 

TdC

Trem's hunky sex love muffin
Joined
Dec 20, 2003
Messages
30,925
I was an utter nVidia fanbwoi up to a gf4600ti, after which I moved to an ati9700pro on a whim. I must say that I am really, *really* chuffed with the 9700 and odds are I will buy ati again unless nVidia makes something that performs better and costs less.

hardOCP article here: http://www.hardocp.com/article.html?art=NjExLDE=
 

wyrd_fish

Fledgling Freddie
Joined
Dec 27, 2003
Messages
537
I'd go for the X800XT... basically because there's nothing that annoys me more than those extra power connectors
 

Ukle

One of Freddy's beloved
Joined
Dec 22, 2003
Messages
410
Was reading that and a few other articles at HardOCP at dinner time. One interesting thing is that they might be altering the power consumption of the 6800 via a BIOS patch; hope they do, as not many people will be willing to spend an extra £100+ (for the new PSU) just to go non-ATi.

As for the power connectors, I don't care how many it has, just how much power it's consuming.

Only thing I have against the ATi, as I said (other than the better future-proofing on the 6800), is the drivers; I just don't want to spend £350 on something and then have problems with it. Hmm, the more I think about it, the more I think I've already made up my mind to go for the 6800 due to my past experience :/

Although, knowing my luck at the moment, the ATi will have perfect drivers and the nVidia ones will stink :mad:
 

SawTooTH

Can't get enough of FH
Joined
Dec 22, 2003
Messages
819
Most reviews I've read come down in favour of the ATI card. I don't imagine that I would be able to spot the difference anyway. If anything, the ATI card's image quality is often quoted as crisper, but it depends, as always, on the res you run the games at. I'm stuck at 1280x1024 due to my TFT, so I'm not interested in how fast it runs at higher resolutions.

Needless to say, I'm buying the ATI card as I'm a HL2 fan at heart.
 

JBP|

Part of the furniture
Joined
Dec 19, 2003
Messages
1,363
well both yoni and I have ati cards (9700pro and 9800XT respectively)
we have both had nvidia cards in the past too

I personally rate the ati cards over nvidia any day of the week

what's this thing about drivers though?

I am using Catalyst 4.3 while yoni uses 3.6; neither of us have any problems running the games we play, nor do we have any driver "issues"
 

old.user4556

Has a sexy sister. I am also a Bodhi wannabee.
Joined
Dec 22, 2003
Messages
16,163
When the 6800 came out, I felt the need to buy one immediately because I was so impressed.

However I waited because the X800 Pro and XT (the Pro is cheaper than the 6800) appeal to me over the nVidia card because of their quiet operation, single slot design and lower power consumption than even the 9800XT.

As Sawtooth has already said, I'm a Half-Life fan at heart, so ATi wins me over. Additionally, as FBM said, PS3.0 on the nVidia is very impressive. But at the moment, it's all still demos, eye-candy and tech examples.

I'm holding on to the 9800 Pro for the next couple of months because there isn't anything it doesn't handle at the moment. Besides, by the time HL2 is here the price will have dropped a little.

ATi for me :drink:

G
 

Danya

Fledgling Freddie
Joined
Dec 23, 2003
Messages
2,466
X800 - PS3.0 can do virtually nothing 2.x can't; I doubt we'll be seeing any games which won't run with full effects on both cards in the lifetime of either. The X800 is much quicker on PS2.0+ stuff to boot, which is good if that's your bag.
Plus I have two monitors and ATIs multimon support kicks nvidia's arse bigtime.
Stability? Not had a driver issue since I got my 9800 Pro.
 

old.user4556

Has a sexy sister. I am also a Bodhi wannabee.
Joined
Dec 22, 2003
Messages
16,163
Yes - while my 9800 Pro was away being replaced I ran an nVidia card, and I had to live with christ-awful nVidia dual monitor support.

Edit ~ I actually like nVidia's drivers, but their dual monitor support was a total PITA.

Worse still, when I ran my projector as the second device, it flat out refused to send 1280*1024 to the TFT and 800*600 to the projector. It got all stroppy and would only send 640*480 to the projector, even though it said 800*600 in the control panel. There's something not quite right with their overlay controls too - they seemed to totally arse up the res on my DVD playback.

9800 Pro came back, all was sorted.
 

Krazeh

Part of the furniture
Joined
Dec 30, 2003
Messages
950
Isn't the X800 just essentially a turbocharged 9800? It's not exactly a new design, just a lot more power behind it?

Whereas the 6800 is a brand new design? It'll be interesting to see what the performance is like a few driver revisions down the line; I can't see that either set of drivers is fully optimised yet, and especially not the nVidia ones, owing to the fact the GPU is brand new.
 

JBP|

Part of the furniture
Joined
Dec 19, 2003
Messages
1,363
erm, I don't think so

the 9800XT is a turbocharged 9800 Pro (hence it weighing in at about a tonne, due to the cooling it requires to run)

the X800 is a new chip with a new design giving twice the power of the 9800XT but without the need for the excessive cooling


that's how I've read the info anyway


what interests me more is the advent of PCI Express, where I believe nvidia's 1st gen cards will be AGP cards with an "added" chip to convert the information, whereas ati's 1st gen card will be a "genuine" PCI Express thingymabob

having said that, I'll be waiting for the 2nd or 3rd gen of such cards anyway
 

SawTooTH

Can't get enough of FH
Joined
Dec 22, 2003
Messages
819
JBP| said:
well both yoni and I have ati cards (9700pro and 9800XT respectively)
we have both had nvidia cards in the past too

I personally rate the ati cards over nvidia any day of the week

what's this thing about drivers though?

I am using Catalyst 4.3 while yoni uses 3.6; neither of us have any problems running the games we play, nor do we have any driver "issues"

I agree - the ATI reputation was deserved about 2 years ago, when they first moved into performance cards. I bought one of the early ones and the drivers were buggy as hell. Now, however, I believe they are much better than nVidia in driver turnaround, and problems, where they exist, are addressed quickly.
And before anyone suggests I'm an ATI fanboy: I have an nVidia card in my main machine at the moment, but I'm moving back to ATI.
 

Ukle

One of Freddy's beloved
Joined
Dec 22, 2003
Messages
410
My issues with drivers are not old ones; I had problems with the last ATi I bought (a 9600XT) - it either played appallingly in some games or flat out refused to play, Trackmania being the most notable example :/ Meanwhile I never had any problems with the 5900 I got for the other machine...

This is the problem: I know the ATi is probably a simpler and more effective design, but the nVidia is a better card in terms of technology.

Also, I use dual monitors all the time and never had any problems with it on the nVidia, and never felt like trying it with the ATi due to the instability :/ Also thought that ATi dual monitor support was crap as it didn't do DirectX 'right'? (Something along the lines of not treating the 2 monitors as one display area; instead it treats them as 2 monitors and can only update 1 at a time?)

If the ATi was reliable I would definitely go for it, but my past experience has clouded my judgement :mad:
 

TdC

Trem's hunky sex love muffin
Joined
Dec 20, 2003
Messages
30,925
tbh I felt much the same way until I got my 9700pro, and I'd been using nvidia since I got a TNT1 card (which I ran with 2 Voodoo2s in SLI - the V2s ran better, heh), allowing me to choose between the two different accelerators. everything I read about ati told me that they had the better card, so I went with it and it rocks. I only use it on a single monitor to play games, but it's not let me down yet. thing is, I can't work out if I'm going to upgrade to a 9800XT or to the new model.
 

SheepCow

Bringer of Code
Joined
Dec 22, 2003
Messages
1,365
I've had a nice assortment of ATI and nVidia cards and at the moment I've got a Radeon 9800 pro and I love it. So for now I think I'll stick with ATI unless nVidia do something interesting, which I don't think the new Geforce is.

My CPU and RAM are the bottleneck at the moment so I doubt I'll be upgrading any time soon ...
 

fatbusinessman

Fledgling Freddie
Joined
Dec 22, 2003
Messages
810
SheepCow said:
So for now I think I'll stick with ATI unless nVidia do something interesting, which I don't think the new Geforce is.
Wow - you have pretty high standards of interesting. A 100% performance increase plus an extended shader engine sounds pretty interesting to me...
 

SheepCow

Bringer of Code
Joined
Dec 22, 2003
Messages
1,365
As Clown says, they've improved from crap, so that wasn't really unexpected. Both the new ATI card and the nVidia one are very similar performance-wise - hence I'm not that impressed by either tbh (there won't be much using that power for a long time). But atm ATI are in my good books, so nVidia have to do something special (not just equal their rivals) to get back on top ;)
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Ok, sadly we haven't had time to publish our own conclusions just yet (I recently became a dad, don'tchaknow...) but I've already had a bit of a tinker with both boards, and I think some of the comments made above are a tad skewed...

In the situations where NVIDIA have doubled performance over the NV3X family with their 6800, ATI are generally doing pretty much the same over the 9800XT (I'm sure we can all find a couple of benchmarks out there which don't quite conform... but that's not the point). A lot of the differences which existed between ATI and NVIDIA in the last generation were down to the inherent differences of the 4x2 and 8x1 architectures - this time around they're 16 pipes apiece and the playing field is a great deal more level.
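The 4x2 vs 8x1 point can be made concrete with rough fillrate arithmetic; a sketch using the commonly quoted launch clocks of the era (treat the exact MHz figures as approximate):

```python
# Theoretical single-texture pixel fillrate = pixel pipes * core clock.
# Clock figures are the commonly quoted launch clocks and may be off.

def fillrate_mpixels(pipes, clock_mhz):
    """Theoretical pixel fillrate in Mpixels/s."""
    return pipes * clock_mhz

gf_5950u = fillrate_mpixels(4, 475)   # NV38: 4x2 design, 4 pixels/clock
r9800xt  = fillrate_mpixels(8, 412)   # R360: 8x1 design
gf_6800u = fillrate_mpixels(16, 400)  # NV40: 16 pipes
x800xt   = fillrate_mpixels(16, 520)  # R420: 16 pipes

# Last generation the architectures diverged by roughly 1.7x; this
# generation the gap is down to the clock difference alone.
print(gf_5950u, r9800xt, gf_6800u, x800xt)
```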

Ignoring the recent cheating (though it's nice to see ATI finally eating their words after playing holier-than-thou for 9 months...) the choice is pretty simple. Right now there are two very high-end GPUs which, when they're both in retail, will perform in many cases almost on a par... on one hand you've got the R4x0 VPUs (which can almost be thought of as the fourth-generation die of their R300 architecture, and maturity is good - after all, look at how far the Northwood took Intel), on the other hand NVIDIA's 6800 - which, as you may have read, is being shipped at OVER the originally specified clocks in most cases, and offers full Shader Model 3.0 support.

If you're only interested in CS, which will ultimately never touch SM3.0, then the 6800 isn't for you - but then neither really is the Radeon; a 9800XT or 5950 Ultra will happily do the job and save you about 50%... However, I'm guessing that most of us are planning on playing a selection of the big titles waiting to land this summer (HL2, Doom3, Thief 3 for instance...), and unlike when the Radeon 9700 arrived - first with SM2.0, 6-8 months before DX9 rolled out, let alone any DX9 games - there are already announcements of support in games due now (and a few already out, with UbiSoft promising a SM3.0 patch for Far Cry before the 6800 Ultra hits shelves).

Xav
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
SheepCow said:
so nVidia have to do something special (not just equal their rivals) to get back on top ;)
Did you even read Fattie's comment, fella? X800XT = SM2.0 / 6800 = SM3.0 - that's hardly equal.

Xav
 

SheepCow

Bringer of Code
Joined
Dec 22, 2003
Messages
1,365
Shader version 3.0 is used in 1 engine, the Unreal 3 engine. I've heard that a few other games will be implementing it, but you can accomplish most of the things they've been showing off with Shader 2, so I doubt there will be much of a difference.

Hopefully nVidia will improve their drivers before the card gets out en masse, as at the moment it is a bit behind the ATI card on image quality.

edit: I was referring to image quality and frame rates when I said equal ;)
 

Danya

Fledgling Freddie
Joined
Dec 23, 2003
Messages
2,466
Xavier said:
Did you even read Fattie's comment, fella? X800XT = SM2.0 / 6800 = SM3.0 - that's hardly equal.

Xav
I'm sorry, that's the wrong answer, but thanks for playing!

I'm tired of seeing this misinformation spread. If you're going to state something as fact on a technical forum, get it right.
The X800 supports PS2.x, NOT 2.0 - there's a very large difference. Basically all the stuff people are getting wood over in 3.0 is also in 2.x. Especially the fancy flow control and suchlike - that's in 2.x. 3.0 offers only a few improvements: they reworked the register organisation, but that's largely a cosmetic thing. They added a couple of new registers which aren't likely to be used in many shaders. They increased some minimums on instruction counts and registers available - but ATI can support those in ps2.x anyway.
In real terms, and given most shaders are written in HLSL or Cg, I doubt programmers will see any difference whatsoever in terms of functionality on either card.

For more details check out: http://msdn.microsoft.com/library/d...aders/PixelShaders/PixelShaderDifferences.asp
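Danya's argument boils down to a caps comparison. A minimal sketch of the relevant DirectX 9 feature minimums as I understand them - the slot counts and branching flags below are from memory and should be checked against the DX9 docs rather than taken as authoritative:

```python
# Illustrative (not authoritative) feature comparison of D3D9 pixel
# shader models. The ps_2_x figures assume a card exposing the optional
# caps (long programs, flow control); whether a given card actually
# does is a driver/caps question, not guaranteed by the model name.
SHADER_MODELS = {
    "ps_2_0": {"instruction_slots": 96,  "dynamic_branching": False},
    "ps_2_x": {"instruction_slots": 512, "dynamic_branching": True},  # caps-dependent
    "ps_3_0": {"instruction_slots": 512, "dynamic_branching": True},
}

def same_feature_set(a, b):
    """True if two models expose identical slots and branching support."""
    return SHADER_MODELS[a] == SHADER_MODELS[b]

# The crux of Danya's point: 2.x with the right caps already matches
# 3.0 on the headline features, while plain 2.0 clearly does not.
print(same_feature_set("ps_2_x", "ps_3_0"))
print(same_feature_set("ps_2_0", "ps_3_0"))
```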
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Sheepcow - you didn't even read my post! Far Cry will offer SM3.0 support c/o Ubisoft in the very near future. They're not the only developer who will have Shader 3.0 stuff on the market before the end of the summer either - when we're able to talk about it openly, we will too ;)



Danya - I wrote a nice long rebuttal, but as the majority of HW forum-goers don't understand much beyond register counts and the basics of the rendering pipeline, I'll save the verbose version of why you're talking utter bollocks for the article. In the interim I'd take a read of the piece Dave Baumann of B3D wrote on NV40 here and look closely at his conclusions covering SM3.0.


Oh, and if you're going to post links for the differences in shader models, then do it properly: your linkage above is to Microsoft's pixel shader pages on MSDN, which obviously doesn't look at the differences within VS3.0 (i.e. the other half of Shader Model 3.0) or the additional functionality exposed by the hardware and drivers to OpenGL for Shader 3.0, which has a few added tweaks and widgets.


Xav
 

JBP|

Part of the furniture
Joined
Dec 19, 2003
Messages
1,363
ok this is getting way too g33ky for me


/me runs away confused
 

SheepCow

Bringer of Code
Joined
Dec 22, 2003
Messages
1,365
I did read your post, and I said "I've heard that a few other games will be implementing it". I know of about 4 other games (1 of which is Far Cry) that will be implementing it - but I've also spoken with some games developers, and they seem to believe that anything they can do in Shader 3.0 they'll be able to do with Shader 2.

The graphics technology atm is pushing too far ahead of what's actually being used, and the games industry is getting into trouble.
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
I wouldn't say so - in fact quite the opposite.

When the GeForce3 and Radeon 8500 GPUs came out, the first PS & VS processing chips, developers had to hand code all their shaders in assembler... teh nasteh... but since before the arrival of DX9 there are now high-level languages which not only make shader development a LOT easier for existing developers, but also more accessible to codies and artists who are just starting out.

As long as the high-level shading languages are kept up to date with the APIs and hardware, the net result is only likely to be better-looking games arriving to market sooner, not later - and that shouldn't be a problem, as it's down to the manufacturers to keep these things in parallel.
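The workflow Xavier describes can be sketched as follows. The profile names are real D3D9 compile targets, but `compile_shader` is a stand-in for illustration only - it is not the actual fxc or Cg toolchain, and the embedded HLSL snippet is a hypothetical example:

```python
# Sketch of "write once in a high-level language, compile per profile".
# One HLSL source can target both an X800-class part (ps_2_b) and a
# 6800-class part (ps_3_0) without hand-writing assembly for each.

HLSL_SOURCE = """
float4 main(float2 uv : TEXCOORD0) : COLOR {
    return tex2D(diffuse, uv) * tint;
}
"""

def compile_shader(source, profile):
    """Pretend compiler: validates the target profile, tags the output."""
    supported = {"ps_2_0", "ps_2_a", "ps_2_b", "ps_3_0"}
    if profile not in supported:
        raise ValueError(f"unknown profile: {profile}")
    return f"; compiled for {profile}\n{source.strip()}"

# The same high-level source serves both vendors' current parts.
for profile in ("ps_2_b", "ps_3_0"):
    print(compile_shader(HLSL_SOURCE, profile).splitlines()[0])
```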

In addition to the languages and tools available to developers, both NVIDIA and ATI have hordes of FAEs and developers whom they send out to development houses to assist with this kind of stuff - the general consensus being that they're speeding up the time-to-market for game support for the latest and greatest eye-candy.

Xav
 

Danya

Fledgling Freddie
Joined
Dec 23, 2003
Messages
2,466
Xavier said:
Danya - I wrote a nice long rebuttal, but as the majority of HW forum-goers don't understand much beyond register counts and the basics of the rendering pipeline, I'll save the verbose version of why you're talking utter bollocks for the article. In the interim I'd take a read of the piece Dave Baumann of B3D wrote on NV40 here and look closely at his conclusions covering SM3.0.


Oh, and if you're going to post links for the differences in shader models, then do it properly: your linkage above is to Microsoft's pixel shader pages on MSDN, which obviously doesn't look at the differences within VS3.0 (i.e. the other half of Shader Model 3.0) or the additional functionality exposed by the hardware and drivers to OpenGL for Shader 3.0, which has a few added tweaks and widgets.
If you're going to post a discussion of the differences between shader models, try to at least address the two models in question. That's a 3.0 vs 2.0 comparison, and as such is misleading when comparing a GF6800 and an X800.

To compare vertex shaders, the only thing you miss with vs2.x is the texture lookup functionality. That sounds interesting, but it remains to be seen if it's actually useful or not. I'm sure you can find the vs3.0 vs 2.x comparison on MSDN too if you look.

Frankly, I'm not much bothered about what OpenGL uses as it's very much been left by the wayside - the vast majority of games use D3D, so that's what is interesting to look at.

Have any game developers actually said they will be doing effects on a GF6800 that won't run on a X800, or are you just assuming? I can well see new stuff not running on the current generation of 2.0 cards as they have much less functionality. Besides which, GF-FX cards suck the big one for ps2.0 speed.
 

SawTooTH

Can't get enough of FH
Joined
Dec 22, 2003
Messages
819
mumbles summit about who could spot the difference? I don't know about you, but when you see screenies of 2 images from both these cards I'm pushed to see the difference. Are they just spot-the-difference competitions? Cos I'm crap at 'em.
 
