nVidia GeForce FX Product Family Update


Jonty

Guest
nVidia has finally updated their site and officially announced the GeForce FX 5600 and 5200 Ultra, the mainstream and budget members of their ill-fated GeForce FX line.

A product comparison shows that the 5600 Ultra will ship with 256MB DDR-I RAM, which is quite interesting. All specifications seem to focus on the Ultra cards at the moment.
Code:
Feature           GFFX 5800    GFFX 5600    GFFX 5200
CineFX Engine     X            X            X
Intellisample     X            X
nView             X            X            X
DDR-II            X
AGP 8X            X            X            X
Maximum Memory    128MB        256MB        128MB
Vertices/sec.     200 million  88 million   81 million
Although rumours exist that put the non-5800 cards way behind in the performance stakes, review samples have yet to be sent, so nothing can be confirmed (and nVidia aren't rushing, after all the changes they implemented to the 5800 Ultra once the review boards were sent out).

All cards will be fully DX9 compliant, and there should be 1.5M of them in circulation by the end of April. Even the 5200, which starts around $79, is fully DX9 compliant; something which ATi does not currently offer on its budget range, not even in its (dubiously :p) rebranded 9200 cards.

I'm a little biased towards nVidia, but I wouldn't say no to a 256MB 5600 Ultra :D nVidia claims the GeForce FX 5600 GPUs will deliver 30% more performance at half the price of the GeForce4 Ti 4600. If these cards end up outperforming the Radeon 9500/9600 Pro/Non-Pro (which they might, but it's likely they'll be about even) then nVidia ought to be able to bounce back.

At the end of the day, competition is good. It forces creativity and reduces prices. The 5800 Ultra, for all its bad press, was only ever aimed at the top 2% of the market, and its 'loss' is negligible, thankfully.

Roll on the NV35 :D

Kind Regards
 

Jonty

Guest
I was under the impression that no samples had been sent, but here's a shot of the 5600 Ultra card that nVnews are testing (results released 10 March 2003).

5600_ultra.jpg


As you can tell, it's a single-slot solution (thank goodness :D) and uses a cooling system very similar to that of the Quadro FX (if not the same). It does require an external power connector, but at least you will not have to guess what's going on due to crashes and driver errors, as per ATi (I speak from experience). Thankfully nVidia have this one covered . . .

power_indicator.gif


Finally, here are some of the specs that nVidia's GeForce FX 5600 page doesn't go into due to their technical nature (however some are questionable, so don't take them as gospel :)) . . .

  • TSMC's 0.13 Micron Process
  • 80 Million Transistors
  • 350MHz Graphics Processing Unit
  • 700MHz (Effective) DDR Memory
  • 128-Bit Memory Bus
  • 11.2GB/sec Physical Memory Bandwidth (see the quick check after the list)
  • CineFX Architecture
  • Intellisample Technology
  • 128-Bit Floating Point Precision
  • 4x1 Pipeline Architecture
  • Lossless Color And Z Compression
  • Z Occlusion Culling
  • AGP 8X Interface
  • Integrated TV-Encoder, TMDS Transmitters
  • Dual Integrated 400MHz RAMDACs
  • Integrated Full Hardware MPEG-2 Decoder
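
For anyone wanting to sanity-check the bandwidth figure, it falls straight out of the effective memory clock and the bus width. A quick back-of-envelope sketch in C (just my own illustration; it assumes only the two figures from the list) . . .
Code:
#include <stdio.h>

int main(void)
{
    /* 350MHz DDR memory transfers data twice per clock: 700MHz effective. */
    double effective_mhz = 700.0;

    /* A 128-bit bus moves 128 / 8 = 16 bytes per transfer. */
    double bytes_per_transfer = 128.0 / 8.0;

    /* 700e6 transfers/sec * 16 bytes = 11.2e9 bytes/sec. */
    printf("Peak bandwidth: %.1f GB/sec\n",
           effective_mhz * 1e6 * bytes_per_transfer / 1e9);
    return 0;
}
Note the arithmetic only balances with a 128-bit bus; 256-bit at the same clock would give 22.4GB/sec.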

Kind Regards
 

Wij

Guest
Is it low noise?

My new techie project is to make my PC silent :) It's a cheaper project than making it the fastest one out there...
 

Xavier

Guest
the 5200 non-ultra doesn't even need a fan...

oh, and on the note of 5600 and 5200 samples, I have both boards in for testing at the moment, alongside the 9800 Pro and 5800 Ultra :D

heh

/rubs his hands together while he waits for the RV350 sample later today...
 

Jonty

Guest
Groovy :D Bet you don't want to give some of these things back once you're done, hehe :)

As for it being low noise, I presume it will be no louder than your standard graphics card. Gainward claims to have reduced the 5800 Ultra to a mere 7dB (about the same noise level as a human heartbeat), and eVGA's single-slot 5800 Ultra with ASC3 cooling ought not to be particularly loud. Obviously neither is silent, though.
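
To put that 7dB in perspective: decibels are logarithmic, so a source that quiet simply vanishes next to an ordinary case fan. A rough sketch in C (the 30dB fan figure is just an illustrative assumption; the sources are treated as incoherent) . . .
Code:
#include <stdio.h>
#include <math.h>

/* Combine two incoherent noise sources given in dB. */
static double combine_db(double a, double b)
{
    return 10.0 * log10(pow(10.0, a / 10.0) + pow(10.0, b / 10.0));
}

int main(void)
{
    /* A hypothetical 30dB case fan plus the 7dB card. */
    printf("Fan alone:     30.00 dB\n");
    printf("Fan plus card: %.2f dB\n", combine_db(30.0, 7.0));
    return 0;
}
The card adds a few hundredths of a dB on top of the fan, i.e. nothing you could actually hear.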

Kind Regards
 

Xavier

Guest
the 5200 doesn't need a fan as in it's cool enough to run passively... zero noise.
 

Wij

Guest
Cool. I presume it will be better than my GF3? :)
 

Testin da Cable

Guest
ah well, I cba waiting (yet again) for (yet another) gfx card upgrade. prolly getting a new system this weekend :)
 

Jonty

Guest
Please forgive the big image posting, but I thought this was a useful comparison chart from Gainward. Yes, it focuses only on Gainward's lineup, but it does highlight the product differences nicely.

Gainward's bundles are also amongst the most extensive on the market, just a shame about the names :p ("Hi, yeah, I'm after a Gainward FX PowerPack! Model Ultra/1000 Plus Golden Sample, please." :D)

gainward.gif


Kind Regards
 

Embattle

Guest
Nvidia doesn't impress me any more, too much hype and not enough product.
 

Jonty

Guest
Originally Posted by Embattle
Nvidia doesn't impress me any more, too much hype and not enough product.
I see where you're coming from, but I beg to differ :p The hype surrounding the GeForce FX was something akin to Duke Nukem Forever; that is, an official announcement, the odd bits from the developers, but 95% from the rumour mill. The 5800 Ultra was a card for the top 2% of the market, the 'enthusiasts' as nVidia calls them (or just those with 'more money than sense' to the rest of us ;)). And yet, despite this, the press chose to give a disproportionate amount of attention and hype to the card.

ATi's impressive gains only fuelled this, with the rumour mill purporting that the GeForce FX must be the best thing since sliced bread, able to kill off the competition etc etc. And, just like Duke (I fear), when the product finally arrives there is no way the developer will ever be able to satisfy the hype that has ultimately spun out of control (just like Star Wars: Episode I, come to think of it :))

There are plenty of examples of this, but at the end of the day it's our duty to just step back and put things in perspective. nVidia are still incredibly powerful and lead the graphics market. But they now have to work hard to compensate for the brilliant progress ATi (and possibly 3DLabs) has made. Being at the forefront isn't easy, but, biased as I am, I still believe nVidia can compete; they just have to restart their infamous six-month product cycle :p

Kind Regards
 

Embattle

Guest
Oh, I know a lot of it was self-induced hype, but the final product still disappointed many, and the fact it still isn't available makes it even worse.
 

Jonty

Guest
Yeah, I'll give you that :D It's now outperformed by the pre-release 9800 Pro, which isn't promising considering the close release proximity of the two. Whether the 5600 and 5200 will keep up with the 9600 and 9200 should be interesting (just coincidence that the last three numbers of both companies' products now match? :))

Kind Regards
 

Xavier

Guest
yes, considering the 9200 is just a refresh of the 9000 to AGP 8x, it won't perform any quicker; they had to make the part 8x for their SIs and tier ones.
 

Jonty

Guest
Last post, honest :p The following sites all have short previews of the GeForce FX 5600 and 5200. No performance tests as yet, which is just as well considering these are unoptimized pre-production units :)

Kind Regards

Jonty

P.S. Does the Radeon 9800 Pro (or the rest of the 9x00 line, for that matter) fully support OpenGL 2.0 in its current state? I know this is of less importance than it used to be, considering DirectX's growing dominance, but I did notice nVidia only fully supports OpenGL 1.4.
 

Xavier

Guest
the only real current qualifier for claiming OpenGL 2.0 is the ability to run pixel shaders of non-finite length, hence ATI's F-Buffer...

Two immediate problems:

Check OpenGL.org: the latest version of the standard is 1.4; 2.0 remains unratified.

3DLabs basically fleshed out most of GL2/OGL2 themselves, hence all their press releases with ATI recently. They're trying to join together and push a new standard to steal some of NV's limelight, as developers really haven't responded well to RenderMonkey (or, for that matter, true HLSL) - Cg penetrated the community much further than either expected.

It's the 9700 all over again: MS only ratified and finalised DX9 in October of last year, but the GPU was available at the end of August; ATI pulled a sneaky with their 96-bit FPU on the chip and got away with it... Now they're claiming OGL2.0, and because 3DLabs give them the thumbs up (not the ARB, whose approval is what will actually count in future) they get GL2 on the list too.
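
Incidentally, if you want to see what a driver actually reports, rather than what the box claims, glGetString() will tell you once a context exists. A minimal sketch in C (using GLUT purely because glGetString() returns nothing without a live context) . . .
Code:
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    /* glGetString() returns NULL without a current GL context,
       so create a throwaway window first. */
    glutInit(&argc, argv);
    glutCreateWindow("gl-caps");

    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
Today's drivers will report 1.3 or 1.4 in that version string, whatever the marketing says about GL2.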

Having gotten myself a stealthed-up 5800 Ultra now, I'm happy to stick with the FX card; it's not like any of the features of the 9700/9800/FX really matter with today's games - with everything turned up at 1280x1024, gameplay is pretty similar...

Speaking to ATI on Thursday when the news of the 5200 hit, they didn't expect NVIDIA to get their top-to-bottom solution out so quickly, and it's netted NVIDIA a shed load of design wins in a matter of days. ATI pushed the arrival of DX9 boards well ahead of the competition, which has benefited everyone really, and now there's a solution at the low end which will sell in real volume; it's just a pity they don't have one down there too.

Think from the angle of the likes of Mesh, Simply, Advent and Multivision, who are in essence box-shifters: if they can have full DX9 support for a dollar or two more, as opposed to DX7 or DX8 from the likes of the GF4MX/R9000/R7500, there's only one choice...
 

Jonty

Guest
Thanks for the heads up, Xavier. You are indeed a fountain of knowledge *Jonty looks in vain for a Fountain of Knowledge emoticon and settles for Merlin instead :merlin:*

Kind Regards
 

Jonty

Guest
Ooo, harsh :D Considering it was fairly obviously a subjective, opinion-based reply, you can't slag him off for, in this instance, favouring nVidia when you clearly do not. His opinion is his own to have :rolleyes:

Kind Regards
 

Xavier

Guest
all the same, we can't have that kind of language around here :D
 

Jonty

Guest
Some intriguing benchmarks over at Bjorn 3D. It seems, at least for the 5600 Ultra, I have been a little over excited in terms of its performance potential :( *sigh*

Kind Regards
 

Wij

Guest
Not impressed.

I want something that will at least make my GF3 weep rather than moderately annoying it.
 

Embattle

Guest
Originally posted by Jonty
Some intriguing benchmarks over at Bjorn 3D. It seems, at least for the 5600 Ultra, I have been a little over excited in terms of its performance potential :( *sigh*

Kind Regards

You keep getting over excited ;)
 

Jonty

Guest
Originally posted by Embattle
You keep getting over excited ;)
I'm not even going to dignify that with a reply . . . which, perversely, I am doing, so I should say, for the record, that I don't get over excited, but I am a wee nVidia fanboy from time to time because of my long history with them. But that doesn't make me your b****, Embattle. Licking your shoes indeed :rolleyes:

Okay, genuinely last post. Au revoir :)
 

Jonty

Guest
A few interesting NV35 details from the exclusive testing at CeBIT . . .
  • 256 Bit Memory Bus
  • 500MHz DDR-I (Effective 1000MHz)
  • 500MHz GPU
  • Low noise cooling solution (Distinct from that of the NV30)
Apparently a quick benchmark in Quake III Arena (1600x1200, 4XAA and 8xAF) averaged 111 FPS, compared to a GeForce FX 5800 Ultra that got 48 FPS. Both chips were clocked at 250MHz, however: firstly because the NV35 was only recently taped out, and secondly because a balanced comparison with the NV30 was desired, which was hence also running at 250MHz.
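
The raw bandwidth maths backs that up: doubling the bus width at the same effective clock doubles the peak figure. Another quick sketch (the NV30's 128-bit bus is from its published specs; the NV35 numbers are from the list above) . . .
Code:
#include <stdio.h>

/* Peak bandwidth in GB/sec from effective memory clock (MHz) and bus width (bits). */
static double peak_gb(double effective_mhz, double bus_bits)
{
    return effective_mhz * 1e6 * (bus_bits / 8.0) / 1e9;
}

int main(void)
{
    printf("NV30, 128-bit at 1000MHz effective: %.1f GB/sec\n", peak_gb(1000.0, 128.0));
    printf("NV35, 256-bit at 1000MHz effective: %.1f GB/sec\n", peak_gb(1000.0, 256.0));
    return 0;
}
Twice the peak bandwidth won't buy a 2.3x frame rate by itself, but at 1600x1200 with 4XAA and 8xAF the NV30 is thoroughly bandwidth-starved, so a gap that size is believable.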

Kind Regards
 

Embattle

Guest
I think I'll wait for some proper benchmarks and not those odd ones that tend to seep out of trade shows ;)
 

Jonty

Guest
hehe, I agree, but I would expect serious improvements over the NV30 purely on the basis of the 256-bit memory bus utilised in the NV35. That said, the Radeon 9900 Pro should be ready to roll by the time the NV35 is released, so the competition should be interesting.

Kind Regards
 

Embattle

Guest
I would expect it to be good, since the NV30 was a total screw-up really; it left them with a high-end card that hasn't, in reality, done well in the performance or saleability stakes. If they screw up again, even the nVidia diehards will have to question nVidia's abilities as a high-end graphics card producer ;)
 

bodhi

Guest
FX 5600 and 5200 Ultra got a great review at [H]ardocp.

*snigger*
 
