Jesus Christ - NV40 benchies.

SheepCow

Bringer of Code
Joined
Dec 22, 2003
Messages
1,365
HL2 runs fine on a 9800 Pro, so I doubt it'll give people with this new nvidia card anything to worry about.
 

SheepCow

Bringer of Code
Joined
Dec 22, 2003
Messages
1,365
MrBlack said:
there's a clear loss of quality in Far Cry compared to the ATI card.

Do you have a link that has a comparison of ATI to nVidia image quality with Far Cry? I wouldn't mind seeing this.
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Actually, Far Cry has a totally different shader path for NV3x, so if the 6800 Ultra is using those shaders, all he needed to do was change a few config values to get the NVIDIA card using the ATI ones... But every TomsHardware article needs something to moan0r about, so they wouldn't go and do something simple like that, would they? ;)
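
Something like this, conceptually - and to be clear, the config key and path names below are ones I've made up for illustration, not Crytek's actual cvars. The point is just that the shader path is a data-driven choice the engine makes at startup, so it can be overridden:

// Illustration only: the config key and path names here are invented,
// not Far Cry's real cvars.
#include <iostream>
#include <map>
#include <string>

std::string PickShaderPath(const std::map<std::string, std::string>& cfg,
                           bool isNV3x)
{
    // Honour an explicit override from the config file if one is present...
    auto it = cfg.find("r_ShaderPath");              // hypothetical key name
    if (it != cfg.end())
        return it->second;                           // e.g. "PS20_Generic"
    // ...otherwise fall back to whatever the hardware detection decided.
    return isNV3x ? "PS20_NV3x_PartialPrecision" : "PS20_Generic";
}

int main()
{
    std::map<std::string, std::string> cfg{{"r_ShaderPath", "PS20_Generic"}};
    // Even on an NV3x-detected card, the override forces the generic (ATI) path.
    std::cout << PickShaderPath(cfg, /*isNV3x=*/true) << '\n';
}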
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
Moriath said:
but the image quality in the shader dept is not as good as the XT tho ... they still have some work to do before they release it, it seems.
*slaps forehead*

The result of shader calculations on both is equal, it HAS to be equal, they're mathematical calculations FFS... it's only when the cards use different levels of precision, or have built-in optimisations for a specific application, that *any* difference in terms of IQ will arise, aside from those due to different AA/AF implementations.

Incidentally, with the 6800 Ultra able to do full-speed FP32 and the 9800XTs (and the new X800XTs) limited to FP24, ATI are going to end up lagging behind on IQ for this generation...

The only issue currently is that *many* games developers need to include NV4x detection code, so as to default to the higher precision rather than falling back to the NV3x FP16 path as they do currently.
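
For anyone curious, this is roughly the kind of check I mean - a sketch, not anything lifted from a shipping game. Under D3D9 you can read the adapter identifier and caps, and NV3x never reports ps_3_0 while NV4x does, so telling them apart is a few lines:

// Sketch only - not from any actual game. Under Direct3D 9, NV3x parts cap
// out at ps_2_x while NV4x reports ps_3_0, so "NVIDIA card without PS 3.0"
// is a crude but workable trigger for the FP16 partial-precision path.
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

bool WantsNV3xPartialPrecisionPath(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    D3DCAPS9 caps = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    const bool isNvidia = (id.VendorId == 0x10DE);   // NVIDIA's PCI vendor ID
    const bool hasPS30  = caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);

    // NV40 reports ps_3_0 and so falls through to the full-precision path.
    return isNvidia && !hasPS30;
}

That's the whole job - no per-title hacks needed, which is why I say it's down to the developers.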

Xav
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
MrBlack said:
Ok, I'll use small words so you'll be sure to understand ;)

Take a look at the screenshots in the THG review. nVidia seem to be making a habit of trading visual quality for bigger frame rates. They got panned for this a while back when they made their drivers look out for particular executables, such as 3DMark and Quake 3, and activate what they called "optimisations". (Although I seem to remember ATI being guilty of similar acts in the past.)

It was widely criticised as a cheap attempt to score higher benchmark numbers at the expense of image integrity.
This seems to be a more generalised move, but there's a clear loss of quality in Far Cry compared to the ATI card.

I think this is what Gabe Newell was getting at when he criticised nVidia's codepath and DX9 compliance.

http://www.hardwareanalysis.com/content/article/1654/
http://www.theinquirer.net/?article=11515
http://www.tomshardware.com/business/20030911/

The 6800 still looks like a stunning card and it would be hard to believe that Forceware 60 series drivers still wouldn't be fully compliant, but the shots from the THG review are a bit worrying.
I'd refer you to my post above then, and hope you'll spend a little longer thinking before posting such drivel... Shader quality can't differ unless one of the cards is substituting the original shader, as NVIDIA did with 3DMark'03.

Hence my comment.

Have a look at HardOCP's article, which focuses heavily on IQ; it's pretty clear from them that where NVIDIA did once differ (brilinear/trilinear filtering) they've now drawn level, and they still come out on top by huge margins...

The phrase 'cacky shader quality' alone (which is what I initially referred to) just demonstrates how little you understand about these technologies to begin with, and posting a bunch of links commenting on the IQ of a previous generation of 3D architecture doesn't help your cause either. The only people responsible for 'cacky shaders' are developers: once a shader is in their game, if it goes unaltered, it *has to* render identically on any board capable of executing it - it's not even a point for discussion.

Numpty. ;)
 

Xavier

Can't get enough of FH
Joined
Dec 22, 2003
Messages
1,542
This probably needs posting separately, so I'll risk quad posting simply 'because'...

In the 'real world', the only debate over IQ on the DX9 offerings thus far has revolved around floating point precision (FP16/FP32 vs FP24), true trilinear filtering, and AA/AF methods.

The first is down to developers having their code correctly recognise and handle the GPU's capabilities; so in the case of Far Cry, until Crytek patch their game, IQ comments based on shader codepaths are a bit pointless... When the game *does* recognise NV4x, IQ should be equal to what ATI produce, simply because the other aspects of the GPU are already on a par (see HardOCP's IQ piece...).

Precision is the biggie, with ATI behind NVIDIA now (even with their R4xx VPUs) in terms of floating point accuracy. DX9.0c now asks for FP32 as 'full precision' and anything below that is partial. ATI haven't upgraded that aspect of their new chips and thus can still only offer FP24, which means when later DX9 games appear it's them who will be getting drubbed for banding and lesser frame quality.
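
To put rough numbers on it (assuming the usual layouts - FP16 carries 10 mantissa bits, ATI's FP24 carries 16 and IEEE FP32 carries 23), here's a quick back-of-envelope program that rounds a value to each mantissa width and shows the error you'd eat at each precision:

// Back-of-envelope only: round a value to N mantissa bits to mimic the
// storage precision of FP16 (10 bits), ATI's FP24 (16) and FP32 (23).
#include <cmath>
#include <cstdio>
#include <initializer_list>

double RoundToMantissaBits(double x, int bits)
{
    int e = 0;
    double m = std::frexp(x, &e);                  // x = m * 2^e, 0.5 <= |m| < 1
    const double scale = double(1 << (bits + 1));  // keep bits+1 significant bits
    return std::ldexp(std::round(m * scale) / scale, e);
}

int main()
{
    const double texcoord = 1023.37;               // e.g. a coordinate in texels
    for (int bits : {10, 16, 23})
    {
        const double r = RoundToMantissaBits(texcoord, bits);
        std::printf("%2d mantissa bits: %9.5f (error %.5f)\n",
                    bits, r, std::fabs(texcoord - r));
    }
}

With those layouts FP24 is over a hundred times coarser than FP32, and the error compounds as shaders get longer - which is where the banding complaints would come from.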

The third, the matter of 'trilinear or brilinear', did indeed plague all of NVIDIA's cards, but again, check with HardOCP and you'll see quite clearly that true trilinear filtering is a couple of clicks away and the performance delta between the two is now negligible, as you'd expect with such a quick chip... AA and AF methods will continue to differ marginally, but look at the images HardOCP compare: in places NVIDIA now wins, in others ATI does - almost a dead tie.
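
For anyone wondering what 'brilinear' actually changes, here's a toy model (the band width number is arbitrary - the real hardware behaviour isn't published): true trilinear blends the two nearest mip levels across the whole fractional LOD range, while 'brilinear' only blends in a narrow band around the transition and uses plain bilinear everywhere else.

// Toy model of trilinear vs "brilinear" mip blending. The 0.25 band width
// below is an arbitrary illustration, not a measured value for any card.
#include <algorithm>
#include <cstdio>

// Weight given to the next-smaller mip level for a given fractional LOD.
// blendBand = 1.0 reproduces true trilinear (weight == lodFraction);
// smaller values snap to the nearest mip outside a band around 0.5.
double MipBlendWeight(double lodFraction, double blendBand)
{
    const double lo = 0.5 - 0.5 * blendBand;
    return std::clamp((lodFraction - lo) / blendBand, 0.0, 1.0);
}

int main()
{
    for (double f = 0.0; f <= 1.0; f += 0.25)
        std::printf("lod frac %.2f  trilinear %.2f  'brilinear' %.2f\n",
                    f, MipBlendWeight(f, 1.0), MipBlendWeight(f, 0.25));
}

The narrowed band is why it's faster most of the time, and why the worst cases can show as visible mip transition lines.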

It really pisses me off to hear people who think they know what they're talking about creating new and nonexistent gripes about boards they've probably never even used. All it serves to do is put people off perfectly good products and create unnecessary doubt. It's widely accepted by anyone who has worked with one of these boards for any period of time that NVIDIA have finally pulled out all the stops and 'gotten it right' - ATI are going to have to get up pretty damn early in the morning now to even draw level, let alone better what this GPU can deliver.
 

MrBlack

Fledgling Freddie
Joined
Dec 24, 2003
Messages
148
Ah, there is that. I was testing you. Honest.

I posted the previous links to give people a bit of back story. I was too eager to jump on the "ooh, cheaty nVidia" bandwagon after reading that THG review to stop and think for myself about it. So I used that as a basis for the post, preferring to assume that they'd cheat themselves some better scores, though it was more of an "I'm worried, but I'll wait and see" post. I see now that my concern is almost certainly wasted.

I suppose the only thing actually using the full precision abilities is that ShakyCam Unreal Engine footage. Kind of hard to make image quality arguments based on stills from this (unless ForceWare 6 is adding a laugh track and people's blurry heads in the foreground for that extra cinematic quality)

Anyway, consider me suitably chastised, flagellative(?) and educated. I know this stuff, really, :worthy:
 
