3DMark 2003

cjc1665

Guest
So that would be 3DMark 2003 released, then.

Bang goes the internet.
 
FatBusinessman

Guest
Yeah, I noticed. Getting a paltry 35 K/sec download :(

(on 10 Mbit/sec University uber-broadband)
 
Durzel

Guest
It ate my PC for breakfast. :(

I was getting 2 fps on the "CPU Test 2" test :( Time for an upgrade methinks.
 
Testin da Cable

Guest
I don't trust it any further than I can throw St Paul's :p
 
Durzel

Guest
How come? That statement confuses me somewhat...

All it does is benchmark your CPU and graphics card in a "realtime" gaming environment. In that respect it does exactly what it says on the tin.

If it's a case of agreeing with the result or not, then surely that's a moot point; since everyone is running the same benchmark, the results are immutably a measurement of your PC against everyone else's.

What exactly don't you trust about it?
 
Testin da Cable

Guest
I don't trust the benchmark because nearly every tech site I know has shat all over it. Even nVidia has denounced it as inaccurate and severely limited in its tests.

On the other hand, if you run it and it looks nice - no stuttering, no screwed-up graphics et al - then your computer is just fine regardless of the score it produces.

imo naturally :)
 
Durzel

Guest
It's hardly a surprise that nVidia denounce it when they've been routinely shat on in the tests by ATi for the past 6 months.

Thing is, if it's using DX primitives, and runs the same on all systems - surely it's an empirical benchmark? It's a bit like saying a rolling road isn't an accurate measure of a car's power because one has an analogue display to show the BHP, and the other has digital.

One may be more accurate than the other, but if you compare all systems on the same thing - it's still an "accurate" (in terms of consistent inaccuracy) measurement.
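
To put that more formally - a minimal sketch, assuming (and it is only an assumption) that the benchmark distorts every system's true performance $t_i$ by one and the same factor $k$, so the measured score is $m_i = k\,t_i$:

\[
\frac{m_A}{m_B} = \frac{k\,t_A}{k\,t_B} = \frac{t_A}{t_B}
\]

The $k$ cancels, so relative comparisons survive any uniform bias. The counterargument raised later in the thread is precisely that $k$ is vendor-dependent ($k_{\mathrm{ATi}} \neq k_{\mathrm{nVidia}}$), in which case nothing cancels.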
 
Testin da Cable

Guest
Oh, I wasn't aware that Futuremark had an ATi connection. In that case it's no surprise at all that the competitors aren't enthusing about the product, heh.

It's true that you can test systems and compare them if you consistently use the same benchmark, but my argument is that if said method is inherently flawed you will get a warped result which will inaccurately portray the true capabilities of the kit.
 
Xtro

Guest
Originally posted by Durzel

Thing is, if it's using DX primitives, and runs the same on all systems - surely it's an empirical benchmark? It's a bit like saying a rolling road isn't an accurate measure of a car's power because one has an analogue display to show the BHP, and the other has digital.

Not long 'til t'pub opens - come on lads, write more interesting stuff or I'll have to do some WORK FFS before I go for a liquid lunch!
 
Durzel

Guest
But if it's using DX primitives (i.e. not vendor-specific instructions), surely the "warpage" would be consistent across all platforms, and therefore by definition it's an equally "fair" benchmark.

Or something. My head hurts.
 
Testin da Cable

Guest
I dunno. I'm not nearly caffeinated enough to decide if we're right, wrong or both partially correct. Perhaps I'll dream up a better argument after lunch.
 
WPKenny

Guest
I think the point that people are trying to make is that it runs better on ATI cards than on Nvidia cards.

Therefore, although the benchmarks will be consistent, the apparent performance difference between two near-identical setups with different graphics cards will not be a true reflection of the systems' real-world performance in games etc.

If I may demonstrate in a graphical way:

ATI 3dmark speed ---------------------------------------------]
Nvidia 3dmark speed ------------------------]

ATI Q3 speed ---------------------------------------------------]
Nvidia q3 speed ------------------------------------------]

They're implying the 3DMark test makes ATI cards look far better relative to Nvidia than they actually are.
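
To put a number on that - a quick sketch in C++, with hypothetical scores picked only to mirror the bars above (not real benchmark results):

#include <cstdio>

int main() {
    // Hypothetical scores, chosen only to mirror the bars above.
    double ati_3dmark = 5000.0, nv_3dmark = 2600.0; // synthetic benchmark
    double ati_q3fps  = 300.0,  nv_q3fps  = 250.0;  // real-game fps

    double synthetic_gap = ati_3dmark / nv_3dmark;  // gap 3DMark claims
    double ingame_gap    = ati_q3fps  / nv_q3fps;   // gap Quake 3 shows

    // If the benchmark were a perfect proxy for games, these two would match.
    std::printf("3DMark gap %.2fx, Q3 gap %.2fx, exaggeration %.2fx\n",
                synthetic_gap, ingame_gap, synthetic_gap / ingame_gap);
    return 0;
}

An exaggeration factor near 1.0 would mean the benchmark tracks real games; well above 1.0 is the distortion being claimed here.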
 
adams901

Guest
3DMark is useless. I ran it the other week to compare my system against other people's of a similar spec, and do you think I could find anyone who hadn't overclocked their PC in some way?

I want to see if I get similar scores to people with the same system as me, not compare scores with people who have the same system as me but have overclocked it to the point of blowing up, or against people who have a lower-spec system that shows up as something it isn't.

I demand justice! I demand MadOnion create a separate test for stock machines only :eek:
 
leggy

Guest
I tend not to worry about frame rates and benchmarking anymore. I became obsessive, and at the end of the day, if the PC isn't grinding and my games are as smooth as possible, I leave it be.

Doesn't stop me spending too much money on next to useless upgrades though :D
 
.cage

Guest
OMFFFFFFFFFG OMG OMGO M LADS I GOT LIIKE 3,000,000 3D MARKS LIKE OMG LIKE LUMAYO LIKE OMG LIKE 3D MARKZZZZ LIKE KARL
 
Durzel

Guest
Originally posted by WPKenny
I think the point that people are trying to make is that it runs better on ATI cards than on Nvidia cards.
Because ATi is currently faster? :)

If I may demonstrate in a graphical way:

ATI 3dmark speed ---------------------------------------------]
Nvidia 3dmark speed ------------------------]

ATI Q3 speed ---------------------------------------------------]
Nvidia q3 speed ------------------------------------------]

They're implying the 3DMark test makes ATI cards look far better relative to Nvidia than they actually are.
It isn't unusual for performance differences to show up in different "categories" of graphics tests. Q3A is really only testing two real aspects of a graphics card - vertex shading and fill rate. And of course CPU power.

This is further borne out by the tests HardOCP ran on the Radeon 9700 Pro vs GeForce FX, where it ended up being faster in one test and slower in all the others.

The fact that 3DMark2003 utilises a multitude of different graphics benchmark tests (pixel shaders, bump mapping, etc.) means it is a better assessment of a graphics card's overall capability than, say, Q3A. The fact that games don't often use all of these features doesn't invalidate its results.

As hard as it may be to accept, the reason Nvidia come out slower than ATi in 3DMark is quite simply because they are slower currently. It really is as simple as that.

As I stated before, the exact same measurement applied to different pieces of hardware is as fair and unbiased a benchmark as you're ever likely to get.

Of course, how much credence you lend to 3DMark depends entirely on what you want out of your graphics card. My card barely managed to make 2 fps in the "CPU Test 2" test, but that doesn't stop my system as a whole being perfectly able to play the current raft of games.
 
.cage

Guest
So long as it gets at least a constant 60fps in everything I'm happy.

GeForce FX looks like a fucking miniature Dyson though :/
 
Scouse

Guest
I believe that the new "nature" demo makes an ATI 9700 Pro grind to a halt.

I'll check that out when I get my new PC :)



Apparently THIS is a piccie of it.... nice.....

(Set it off downloading before leaving for work - not seen it yet - is the "demo" any good?)
 
Will

Guest
3D Mark is bad. If I get one more person moaning down in the hardware forums "I changed blah, and lost 2000 3D Marks" I'll...well, I'll be sarcastic as always.

It's about how it feels when you play, not about how many Marks you get.
 
adams901

Guest
Originally posted by Will.
3D Mark is bad. If I get one more person moaning down in the hardware forums "I changed blah, and lost 2000 3D Marks" I'll...well, I'll be sarcastic as always.

It's about how it feels when you play, not about how many Marks you get.

Does the same apply to sex? It don't matter how big it is as long as you enjoy it :)
 
PR.

Guest
Here is how I benchmark my system...

1. Install a recent, highly demanding game (e.g. Unreal 2)
2. Load it (if it loads, it's passed benchmark 1)
3. Turn all the Gfx up to max
4. Play Game
5. If it runs well with minimal stutters it passes


There are exceptions to this - BF1942, for example, which we all know is slow and stuttery anyway :/
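
Step 5 is the fuzzy one. "Runs well with minimal stutters" can be made concrete by logging frame times and looking at the worst ones rather than the average - a smooth average can hide nasty spikes. A minimal sketch; render_frame() here is a hypothetical stand-in for the real game's render loop, not any actual engine call:

#include <algorithm>
#include <chrono>
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

// Hypothetical stand-in for one iteration of the real game's render loop.
static void render_frame() {
    volatile double sink = 0.0;
    for (int i = 0; i < 100000; ++i) sink += std::sqrt(static_cast<double>(i));
}

int main() {
    using clock = std::chrono::steady_clock;
    std::vector<double> frame_ms;

    for (int i = 0; i < 1000; ++i) {            // sample 1000 frames
        auto t0 = clock::now();
        render_frame();
        auto t1 = clock::now();
        frame_ms.push_back(
            std::chrono::duration<double, std::milli>(t1 - t0).count());
    }

    double avg = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0)
                 / frame_ms.size();
    std::sort(frame_ms.begin(), frame_ms.end());
    double p99 = frame_ms[frame_ms.size() * 99 / 100]; // worst 1% of frames

    // Average fps summarises the whole run; the 99th-percentile frame time
    // is what you actually feel as stutter.
    std::printf("avg: %.1f fps, 99th-percentile frame: %.1f ms\n",
                1000.0 / avg, p99);
    return 0;
}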
 
leggy

Guest
Originally posted by PR.
There are exceptions to this - BF1942, for example, which we all know is slow and stuttery anyway :/

I get a solid smooth frame rate on all maps now.

I was getting a solid 99 on Market Garden.
 
kan

Guest
As a benchmarker I think it's crap tbh...


Saying a Celeron 900 with a Radeon 9700 is a better gaming machine than a P4 3GHz with a Ti4600 in it is flawed imho.


Plus I'm gutted my system only got 4660 3DMarks :)
 
Scouse

Guest
Originally posted by kan
Saying a Celeron 900 with a Radeon 9700 is a better gaming machine than a P4 3GHz with a Ti4600 in it is flawed imho.

Does it really come up with that, or was it a throwaway comment??!! I'd be surprised...


If so then it's seriously flawed. :(
 
Durzel

Guest
3DMark2003 is a DirectX 9 benchmark.

The GeForce4 Ti4600 supports pixel shader version 1.3 and vertex shader version 1.1, and doesn't support DirectX 9 (so there are no optimisations for it).

The Radeon 9700 supports pixel shader version 2.0 and vertex shader version 2.0, and is optimised for DirectX 9.

Gaming benchmarks are heavily GPU-loaded (since the graphics card does about 75% of the work).

So are the results that surprising?
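
Those version numbers are checkable in code: a DirectX 9 app can query the driver caps and read off exactly which shader versions a card exposes. A minimal sketch using the d3d9.h caps call - the struct, macro and function names are the real DX9 ones; the surrounding program is just illustrative (Windows-only):

#include <cstdio>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("D3D9 runtime not available\n"); return 1; }

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Shader versions are packed DWORDs: major in bits 8-15, minor in 0-7.
    std::printf("Pixel shader %lu.%lu, vertex shader %lu.%lu\n",
                (caps.PixelShaderVersion  >> 8) & 0xFF,
                 caps.PixelShaderVersion        & 0xFF,
                (caps.VertexShaderVersion >> 8) & 0xFF,
                 caps.VertexShaderVersion       & 0xFF);

    // This is the distinction being drawn above: a Ti4600 reports less
    // than D3DPS_VERSION(2, 0), a Radeon 9700 reports 2.0 or above.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("PS 2.0 capable: can run the DX9 tests\n");
    else
        std::printf("No PS 2.0: DX9-only tests will be skipped\n");

    d3d->Release();
    return 0;
}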
 
Nos-

Guest
It does, due to the Radeons being DX9-compliant and the Geffers missing out on some tests because they're not ;/
 
