All This 32bit Bullshit


old.sjp

Guest
I've finally left me Voodoo2s :) and gone all 32-bit, BUT where is this great increase in picture quality you all went on about?

As far as I can see, the only REAL difference is the frame rate!

I must say I feel really cheated :). I remember all the "once you have seen 32bit, there is no going back" statements which were thrown around a while ago.


I mean it's a really hard choice :-

1) Piss-slow 32-bit color, sod-all image difference, and low resolutions.

2) Fast 16-bit color and 2 or 3 resolutions higher.

3) Can't we have 16-bit and textures which are twice as large instead?


Hehe, I'm interested to hear your thoughts on this :)
 

kryt

Guest
Errrr....
I run a measly TNT2U: 32-bit colour, 800x600 (in Q3), all the fads nads, and I'm only on a Celeron 366 running at 458 atm. It gives me 64fps. 16-bit colour instead of 32 gives me 51fps - OK, a 13fps difference, but it's still smooth and I can see a visual difference, mostly in the reflections, mirrors, explosions etc. There is a diff, you just need to see it.

And I hope for your sake you are not trying to extract 32-bit colour out of games like QW, Q2 or UT :)
 

old.TUG

Guest
32 bit on *old* games means fuck all

32bit in q3 is essential :)

I have a celly 366@578 atm, DDR GeForce... 32bit @ 1024*768 with everything maxed gets me 60fps in demo001 and 002...

16 bit... not sure wot I get, cuz it's nice and fast in 32bit so I never really test it in 16bit :)

PIII 650@900MHz+ next week anyway... :D
 

old.sjp

Guest
Sorry, I should have said... I only really play Q3A, so the textures REALLY are 32-bit :)

And TUG, you hit the nail on the head -
60fps is sod all! :)

When you allow for the variations in fps during intense moments, plus the fact that the human eye IS sensitive to rates OVER 80fps (before anyone says "can't see faster than 25fps", f@ck off and research it :)), you need over 100fps to maintain a steady 85fps (the real minimum that companies should be aiming for, imnsho).
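(Rough numbers, if anyone wants them - a quick Python sketch of that headroom argument; the 20% slowdown in heavy scenes is me own guess, not a measurement:)

# How much headroom does a 100fps average give you over 85fps?
avg_fps = 100.0
frame_ms = 1000.0 / avg_fps      # 10.0 ms per frame on average
heavy_ms = frame_ms * 1.20       # assume intense moments cost ~20% more
heavy_fps = 1000.0 / heavy_ms    # ~83fps - only just under the 85fps target
print(f"average {avg_fps:.0f}fps -> worst case ~{heavy_fps:.0f}fps")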

In 16-bit mode with everything on max (inc. all texture details, very high geometry etc.) I can get over 100fps at 1024; if I turn the detail down a bit, 1280 is totally smooth (ie >100fps).

In 32-bit I can't! (tis better than 60 tho) :)


You also illustrate my other point - would you HONESTLY rather play @ 1024 in 32-bit than @ 1280 in 16-bit?


O yea, TUG (while you are here :)) - I can only overclock the memory on me Herc GeForce2 (32MB) by a couple of MHz (ie to around 337MHz). Is this normal :), or have I been a bit unlucky and got a card which is already maxed out (tis fine at 333 so I can't really grumble)?
 

old.TUG

Guest
Sure 60fps isn't fast, but it's only a lil' celeron ;)

Although, I put the 5.22 dets on with S3TC and it never drops below 45 now, tis quite good in heavy scenes too.

Any other game is piss fast, but me new CPU should see me with 80fps+ rather than 60 @ those settings :)

Yeah, I can tell the diff between 30, 60 and 90fps; I think ppl who say they can't tell the diff are speaking arse too :)

I'd rather play in 1024*768 in 32bit simply becuz I only have a 15" monitor :)
Heh, the Herc's RAM heatsinks are shite, and they have air pockets under em according to many people... thus, rip em off, get some proper ones and you could possibly get your RAM up to a healthy 400MHz! :)
 

old.frankie

Guest
I heard that 32-bit colour doesn't add any extra pressure on the vid card's chip, because the chip on the card is just as fast at 16-bit as it is at 32-bit. The only thing that causes the slowdown is the memory bandwidth limitation; so in theory, if a TNT2U had unlimited mem bandwidth, there would be no slowdown from 16 to 32-bit colour. Or so I hear.
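(A quick sketch of why, in Python - the 3x overdraw is an assumed figure, and this counts colour writes only, ignoring Z and texture traffic, so the real numbers are higher:)

# Framebuffer colour-write traffic for one second of Q3 at 1024x768.
width, height, fps, overdraw = 1024, 768, 60, 3
for bytes_per_pixel, label in [(2, "16-bit"), (4, "32-bit")]:
    mb_per_sec = width * height * bytes_per_pixel * overdraw * fps / 1e6
    print(f"{label}: ~{mb_per_sec:.0f} MB/s of colour writes")
# Prints roughly 283 MB/s for 16-bit vs 566 MB/s for 32-bit - double the
# memory traffic for the same frame rate, which is where the card chokes.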

Also, 32-bit colour grows on you after a while. If you just swap from 16 to 32 you prolly will think it's pointless, but after a while it really grows on you, like an addiction.

Anyway, who cares if it only runs at 60fps in 32-bit? Only serious speed-demon freaks whose only goal is to win would care (those people run at 512x400 in 1-bit to get like 500fps on a brand spanking new GF2).



------------------
DR_FRANKENSTIEN
 

Embattle

Guest
60 fps is fine since you can't tell any difference at a higher rate.

32-bit is bull, because in Q3A you don't stand around looking at walls - unless you're TUG, who must get killed a lot then :D.
 

old.TUG

Guest
heh frankie - interesting theory ;)

Probably isn't the case though m8.

Oi embattle - look @ da sky when you are playing Q3 ;) Compare 16bit to 32bit and you'll see a huge diff... why am I lookin at the sky you ask? Well... u ain't seen me l33t mouse usage skills innit ;)

heh, no banding from RL shots in 32bit either ;)

But, to tell the truth I only use 32bit for Q3 cuz it looks nicer :) Can't tell the diff on many games. Tried FSAA on me GeForce, I can at least run Q2 quite fast with it on :)
 

Embattle

Guest
What I mean by bull is: sure, it makes a diff, but when playing Q3A you'll not notice - unless, like TUG, you're sky-watching and a whole army goes by with him still looking at the sky ;).
 

old.sjp

Guest
hehe :)

Right, I played Q3 for a while last night (with everything maxed out: 32-bit/res/textures etc.) and apart from the fact I was only getting ~50fps, the main difference was in the fog :). I'll admit that THAT looks a lot better.


And here's another one for TUG :)

I've noticed that the texture probs on me card seem to have more to do with texture compression being enabled at max texture detail than with the absolute clock speed.

I get slight (v rare) corruption in Q3 @ 1280, max textures + compression, at the default speed (333MHz).

When I increase the clock it steadily gets worse (@ 345 there's a lot) BUT if I turn texture compression OFF it disappears and it's fine at 350MHz. Is this a known problem?

Any suggestions (but to be honest I can cope with the default settings :))?
 

old.Nosser-

Guest
60fps is plenty for gaming.

"You beat me cause I only get 60fps" is the same as saying "I'm shite and will try blaming my plentiful frame rate"

I can play quite happily at 30fps; on my old system Q2 gave me about 15-20fps and I still had no trouble playing it.

------------------
:/
 

old.TUG

Guest
Nah 60fps is slow to me ;)

I can tell the diff between 60 and 90 sooo easily in q2! I have to cap it to 60 online though which is a cunt, but offline = mmmmmmmm :)

PIII 650@900+ will see Q2 timedemos in the 180fps+ range - that's comin' on Saturday.

Never heard no probs about texture compression and image corruption... have a look @ www.tweak3d.net/faq to see if owt is listed there
 

stu

Guest
Hm.

Last time I played Q3 (probably the same day I installed it, shite game) I got well over 100fps solid. That was with 1024x768, 32bit, EVERYTHING turned on.

Having a decent 32bit rendering card makes a MASSIVE difference. If you can't tell the difference between SLI and a GeForce running Q3, you need a white cane and a labrador. Hell, if you can't tell the difference in Q2... 3DFX chipsets tend to make the graphics look very washed-out and weak. GeForce makes it a hell of a lot richer.

The human eye can detect up to about 80fps, btw (hence why 80Hz is "flicker free"). Who said anything about 25fps?

Can you actually tell that much difference between 1024 and 1280? Or rather, does it actually make that much difference in game? One of the things that struck me, going from an utterly shite PC to a bells-and-whistles one, is that the improved graphics are great for 5 minutes, but tbh I barely notice them... I'm concentrating more on shooting people.

If you've got a 333MHz chip, that's your problem. The faster your chip, the wider the gap between the SLI and GeForce is going to become.
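(Toy model of that, in Python - frame time is set by whichever of the CPU and the card is slower, so a faster CPU exposes the card; all the millisecond costs here are made up for illustration:)

def fps(cpu_ms, gpu_ms):
    # The frame can't finish before both the CPU and the card are done.
    return 1000.0 / max(cpu_ms, gpu_ms)

for cpu_ms in (18.0, 9.0):                # slow CPU, then fast CPU
    sli = fps(cpu_ms, gpu_ms=14.0)        # slower card
    geforce = fps(cpu_ms, gpu_ms=7.0)     # faster card
    print(f"CPU {cpu_ms}ms/frame: SLI {sli:.0f}fps, GeForce {geforce:.0f}fps")

# Slow CPU: both cards stuck at ~56fps, the CPU hides the gap.
# Fast CPU: SLI caps at ~71fps while the GeForce hits ~111fps.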

BTW, the problem with the texture rip... sounds like the one I had. Turn your i/o voltage up by 0.1v - your card isn't getting enough power. The reason why it becomes more apparent when you clock is because the CPU is draining more power, so the card is getting less.
 

old.sjp

Guest
Right, an update here :). Last night I played Q3 for a while with everything maxed out (1280, max textures, 32-bit etc.) and APART from the crappy frame rates ;) the only real difference I could see was fog/smoke - that is A LOT better! But I can live with the dithering in 16-bit :)

1st, ta Stu, I'll give that a go :)

But (here I go all thicky :eek:) does anyone know if you can change the i/o voltage on an Asus K7V? There are options for the core voltage in the BIOS, but I ain't seen one for the i/o (I take it they aren't the same thing)?

2nd, Q3A is NOT a shite game, tis the best :)

3rd, I have said the GeForce2 was quicker (but I could hit 115fps on me V2s......); it was just that I don't think the image improvement is worth the frame rate drop when you go 32-bit.

4th, personally I have found the image quality to be worse on me GeForce than on me V2s (smoke/fog a lot better tho) (talking 16-bit of course). Funnily enough I would describe me GeForce as "washed-out and weak" whereas me Voodoo was nice and rich (nowt funnier than folk I guess) :)

5th, 1280 looks a lot less jaggy so yes I can tell the diff, but when I start playing, the detail/mode slowly drops to keep everything as smooth as possible.

6th, tis all relative; to be honest a 200MHz core/333MHz mem vid card isn't REALLY that slow (I'd just rather have more if possible) :)

7th, you normally can't make a post which mentions frame rates without getting the following kind of reply........


------------------------------------------
Jonny Pleebb Here

U talking CRAP blah-blah-blah-blah-blah
the human eye can't see faster than 25fps blah-blah-blah-blah-blah TV is smooth blah-blah-blah-blah-blah no difference between 25 and 95 blah-blah-blah-blah-blah blah-blah-blah-blah-blah blah-blah-blah-blah-blah

:) :) :) :) :) :) :) :) :) :)
 

old.TUG

Guest
I wanna see stu's full system specs if he can honestly pull 100fps in q3 timedemos at 1024 with everything on...
 

old.sjp

Guest
HeHeHeHe

Out of interest, I get around 80fps on high quality with r_picmip 0, r_subdivisions 1, r_lodbias -2, r_lodCurveError 10000 @ 1024.
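(For anyone wanting to try the same settings, a sketch of the autoexec.cfg lines - the cvar values are the ones quoted above; the r_mode and r_colorbits lines are me own extras, tweak to taste:)

// Q3 console settings - drop into baseq3/autoexec.cfg
seta r_picmip "0"             // full texture detail
seta r_subdivisions "1"       // very high curve geometry
seta r_lodbias "-2"           // max model detail
seta r_lodCurveError "10000"  // stop curves dropping detail at range
seta r_mode "6"               // 6 = 1024x768 (5 = 960x720, 8 = 1280x1024)
seta r_colorbits "16"         // 16 or 32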

PC = GeForce2 (32MB), Asus K7V, 256MB RAM (133 @ CAS2), K7 700 (@735 :(), Aureal Vortex 2 soundcard.

also the "sweat spot" resolution wise for me is 960x, tis a could balance between frame rate AND jaggies . . . . . also (if iam honest :)) 32-bit+max textures is quite playable at this resoultin (>85fps). (would like 1280 instead tho :))

Stu - I bumped the i/o voltage up a bit and it has helped stabilise me system in general (was getting the odd lock-up when rebooting), but I still get the texture probs... :(

After looking into it a bit more closely, it only seems to happen at res >1024 WHEN TRILINEAR filtering is on???


Here's 2 o/c questions :) -

1. When I bump the FSB up on me PC (only to around 107) it runs fine (v stable) and boots from cold without any probs. BUT when I leave the BIOS settings screen or restart Windows it locks up? I don't think it's a heat prob, cos you can turn the PC off/on STRAIGHT away and it boots fine?

2. When o/c'ing a vid card, what is it normal to test it on?

I was/am using HQ @ 1280 with max textures on the quaver demo (thought if anything was gonna show probs up, that would).

If I was testing using lower res's, me card would be classed as stable at >350 mem and 215 core?

If I drop the texture detail down I get the same results;

if not, I get the flashy textures @ 334/200MHz (odd ones at default)?

So the question is: what constitutes a stable card? When website www.Some-site.com says it clocks to 360, what are they normally testing it with? And stu, what do you use?
 
