Graphics Cards


old.sjp

Guest
gotta say buying the radeon was a(nother) cockup bodhi .....

they are expensive, slower than a geforce2 gts and what's all this shit about dvd's ?

i have a Herc Geforce2 gts (includes tv output) and the picture quality is great (using windvd2000), tbh, most hardware acceleration helps to reduce the cpu load NOT improve the picture quality and since i assume you AREN'T using a celery 400, who gives a fuck ???

out of interest the cpu load never goes above 45% when i am watching a dvd (k7 @ 735) so what's the problem ?




and seeing as frame rates and 32bit colour have been mentioned ............

32 bit colour DOES NOT look massively better than 16 bit and certainly is NOT worth the massive frame rate hit. ffs, in 16 bit i can play q3 with everything maxed out - geometry, textures (dyn lights if i wanted :)) - @ 970x and get a constant (and capped) frame rate of 125. in 32bit (which i used for a while) it averaged at ~90 BUT it DROPPED to ~45 on quite a regular basis (and yes there IS a big diff between 40, 90 and 120). maybe the V6000/Geforce2 ultras can finally make 32bit practical (anyone got one and can comment ?) but nothing else on the market does ......

of course the above only applies if you play at greater than 800x600 and i guess it depends on what you call playable, but if you're into first person 90 is the lowest you should be aiming for.

how the fuck review sites can say 40 is ok mystifies me, an average of 40 translates to moments of ~15fps when it gets heavy (bit like playing quake on my old 486 :)) so they should stop misinforming ppl .........



k :), back to the original question, get a Geforce1 DDR (or a geforce 2 gts if you can afford it), it's quicker than an MX (ppl stop talking crap and face the truth :)), it's around the same price (i just looked on aria's website) and there is nothing stopping you from trying to overclock it when you get it :p, but remember you might get one which won't ... it happens :(
 

old.TUG

Guest
Bodhi, I could have done you that radeon for a nice price of £263. Nevamind eh m8? Haha :D
 

bodhi

Guest
Believe it or not the intention of my post was not to start a radeon vs gh3yforce 2 flame war (mainly cos nVIDIlamers are about as insufferable as Q3 players - although they are mainly one and the same - anyway I digress), 'twas just to inform you that there are cards out there other than, and in some cases more suitable than, Geffer 2s.


BTW The geffer 2 is not faster than the radeon. It depends on the res/colour depth. Geffer is faster at low res and 16 bit, Radeon smacks it down at high res and 32 bit. And let's face it, if you have spent nigh on 300 quid on a graphics card, you aren't going to play in 640x480x16 are you? (Unless you're an anal Deathmatcher who can't play with less than 457390850934856093760957690545 fps.) The 50fps I get atm is fine enough for me, especially as I hardly ever play q3.

To quote anandtech.com, "If you want a fast and fully featured card, the radeon is your best bet". And as I want a fast and fully featured gfx card.........

Anyway I shall report back when it's installed.

Bodhi

P.S. TUG, oh no, you could have saved me 20 whole quid. I am like so gutted.
 

old.Davehart

Guest
graphics cards? Ok.....


Quick challenge for you:

Who can get the maximum (however little) FPS using an ATi Expert 98?

mine's about 12 average with a P3 500 and 128MB ram
 

old.Alpha

Guest
ATI Radeon - very nice card, feature packed and well designed, but then we come back to the same problem that haunts ATI time after time: imo very poor driver support. It's well known they can't write high quality drivers for their cards and never have.

I'm at that stage now where I'm looking for a new card. My v3 is now a venerable oap in the gfx card market and as such I'm obviously eyeing up the geforce 2's. I'm running what's classed (on most review sites) as a low end system - p3 500.
As someone said previously, a Geforce DDR is a sound investment, comparing price and performance and new products on the horizon. There's always a better card and that always muddles decisions: do you wait for it or buy now?
A fellow clan member who works for Bullfrog and is already working with Xbox dev kits made a good point - The NV20 will be the entry level card when games such as Doom3 appear.
With that in mind, I'm going to get a Geforce2 GTS. Depending on where you shop, it's between £40-60 more expensive than a standard Geforce DDR but will buy me an extra few months down the line.


There's a good article here

SharkyExtreme - Geforce2 on a value system

on comparing a geforce 2 on high end (1gig) and low end (700) amd and intel systems. What's important is to realise that the higher the resolution you go, the more the difference in fps between high and low end drops. To quote the benchmarks in the article:

Running Q3 at 1280 yields approx 49fps on the low end system and 53fps on the high end.

Rich.
 

stu

Guest
Originally posted by Bodhi
BTW The geffer 2 is not faster than the radeon. It depends on the res/colour depth. Geffer is faster at low res and 16 bit, Radeon smacks it down at high res and 32 bit...

To quote anandtech.com, "If you want a fast and fully featured card, the radeon is your best bet". And as I want a fast and fully featured gfx card.........

Interesting. Because when I looked on Anandtech it said this:

GeForce2 GTS owners should be happy with the new drivers because the card now outperforms the Radeon in almost all situations, even with the Radeon's updated drivers.
 

old.sjp

Guest
Bodhi, :)

first, i am not pro nvidia, it's just they make the fastest cards at the moment (a few yrs ago i'd be saying the same for a v2 sli to all those tnt owners).

k, now that's over :p, you are correct (possibly) about the radi being quicker than some geforce2 at high res/colour BUT you obviously didn't read the rest of my post ..... NEITHER is really PLAYABLE at those levels so the geforce 2 is effectively a hell of a lot quicker at resolutions that you CAN ACTUALLY PLAY USING .... i'll repeat again slowly :)

m y g e f o r c e 2 g i v e s 7 5 f p s i n q 3 at 1 0 2 4 i n 3 2 b i t m o d e t h i s i s W A N K a n d i s n o t r e a l l y p l a y a b l e o n l i n e. :p

if you (or others) are gonna turn round and honestly say you get far better results on the high quality settings (1024 @ 32) you/they are talking shite, the only card which is practical at these res's is the geforce2 ultra (and the v6000 if it ever appears).

now (the nice bit :)) the radi is a nice card which looks good (as long as u don't use win2k) and has some nice multimedia features which are not standard on the geforce range (but can be got if you look around). horses for courses really - if you mostly play games the geforce is a far better card, BUT if you play the odd game, and like to mess around with video and watch dvd's, the rad is better for you. BUT it's expensive (why is everyone going on about how dear the geforce2 is ? i got one the wk it came out over here and paid £240 for a herc with tv out and a free game ....)

anyway whatever you get it will be a big step up from Davehart's ATi Expert :p (i got 32 fps in q3 btw :p, 256mb, k7 735, can't remember the settings :))
 

granny

Guest
Just to reply to sjp about the 32/16 bit thing... mate if you can't see the huge difference between 16bit and 32bit you really need your eyes tested. Yeah there's a big fps hit and I certainly don't use 32bit for any online FPS gaming (mind you all I play is CS anyway ;) but for single player gaming it can't be beaten - try firing up something like Deus Ex in 16bit then have a look at it in 32bit, like jesus.. different game.. same with Homeworld, etc etc.

Reminds me of the shite people used to come out with when they'd just spent loads of dosh on some kind of 3dfx spank and were trying to justify their purchase in the face of faster & better looking cards from nvidia :p

Oh god I said nvidia, that fool Bodhi's gonna pipe up again now isn't he? Sigh...
 

old.Mikey

Guest
I'd also recommend the GeForce DDR, got one on my C300 @ 450 and it's brilliant. I get no real slowdown at 1024 * 768 and 32 bit colour.

The current graphics cards are just too damn expensive!



Mikey out!!
 

old.sjp

Guest
tis ok m8 me eyes are fine (i had them tested l8ly :p)

and strangely, when i upgraded my old v2 sli's to the geforce2 i WAS NOT blown away by the quality improvement, it was simply a lot quicker (i always preferred the 16 bit look of me old v2's, but i guess i'm gonna be called a 3dfxllama now ? :p)

k, sensibly, maybe it depends on the type of game you play, but i personally prefer q3 (cue q3llama comments :p) and i have to say it doesn't look much different in 16 bit mode (apart from the fogging and the fact the 32 one is half the speed).

the fog does look better in 32 bit mode, but there are ways of improving that in 16 bit if u do a few tweaks ....

also personally i'd rather have nice sharp textures at 1280x in 16bit (which is quite playable offline on my system, quicker than 32bit at 1024 anyway) than having 800x600 running at 32bit.




hmm, that sounds wrong -> i mean i find 32bit with max eyecandy fine at 800, but too slow at 1024, but 16bit 1280 is somewhere in between and is quite playable :)
 

stu

Guest
Originally posted by sjp
m y g e f o r c e 2 g i v e s 7 5 f p s i n q 3 at 1 0 2 4 i n 3 2 b i t m o d e t h i s i s W A N K a n d i s n o t r e a l l y p l a y a b l e o n l i n e. :p

I don't get it. Are you trying to tell me that 75fps is unplayable because it's too slow??
 

old.sjp

Guest
ffs YES !! :)

it's a bloody average, an average of 75 includes moments of less than 40 which is way too low !

like i said earlier the LOWEST you should aim for (if playing online) is 90 fps, this means when it slows down you will still get around 60 fps.

remember i am talking about playing online, if it's a single player game 40fps is ok.


also, altho' you can't always see the difference, you certainly can feel it, remember a frame rate of 40fps equates to a ping of around 25 ..... and my steady 90 would give 10.
 

bodhi

Guest
That's interesting. I used to play online with a P200 and voodoo 1. Got 35fps average. Seemed more than playable enough to me. But then, as I said earlier, I'm not really an anal q3 player who cries when my fps go to less than 40000004938549038590348590.6. I even find SLI playable under q3.

Choosing a radeon over a geffer 2 had nothing to do with speed. Consider this. If you had a bad experience with, say, a Ford Focus, you'd be very wary of buying Ford again. I had a bad experience with a Quantum hard disk, hence I shall never buy Quantum again. And funnily enough, I had a bad experience with a TNT2u (visual quality a Rage Pro would be ashamed of - OK mebbe a slight exaggeration but you get the picture - performance about as average as you can get - how does paying 180 quid for a 3fps gain over a Rage Fury grab you? - and plain awful DVD performance (I was waiting for the next generation of set top boxes to come out)), hence I shall never be buying nVIDIA again.

I also feel the need to stand up for ATi's drivers (someone has to :) ). The drivers that got them this bad reputation were the drivers for the Rage 128. Now ignoring the fact that I have had a Rage Fury for 18 months (I gave the TNT2u to the person who bought the Rage off me when I got the TNT) without a single driver related problem, there is a reason why early Rage 128 drivers had issues. ATi redid the way the drivers handled the hardware, and hence were starting with a blank slate. Compare this to nVIDIA, whose chips have shared components since the Riva 128, and you can probably see why nVIDIA's drivers are a bit better. Which brings me onto something which annoys me intensely about anandtech, tomshardware and the other big hardware sites. nVIDIA improve their drivers and leap in front of the Radeon. What they don't seem to acknowledge is, what's to stop ATi doing the same? Afaik reputation never got in the way of development. This has nothing to do with the fact that I like ATi cards. It would be the same if 3dfx, matrox, s3 or even number fucking nine (anyone remember them? :) ) were in the same position.

/rant over (Sorry it was a bit lengthy, but I'm stuck in a physics lab getting intensely bored.)

Anyway the bottom line is, this new competition between ATi and nVIDIA is good for us consumers. All we need now is 3dfx to come back from the dead and prove they can still make a decent chip (Rampage?....) and it could be the best situation for us in years. Perhaps we should stop the "My chipset is better than yours" and just be happy that no one company is dominating everything.

Let the good times roll :)

Bodhi
 

bids

Guest
Bloody 'ell Bodhi, that was a half rational rant for a change ;)

Given up winding people up then ;)
 

old.sjp

Guest
yep,


like i said, it (the rad) is a nice enuff card and tbh, most cards today give quite respectable performance in most games.


and yes, it's a shame that 3dfx went down the pan, if only the voodoo 5 had been out when it was supposed to have been ..... :( .... o well


but i still say ATI's Win2k drivers are pretty bad :p
 

old.logic7

Guest
I just got my GeForce MX the other day and the only problem that I have with it is that I prefer the visual quality of the Savage4 that I was running before. Other than that, it RAWKS!!!!!! I was playing Tribes last night at 1024x768x32 with everything maxed and it was very playable. It was still faster than the Savage4 at 640x480x16. I fired up UT; it's butter smooth. The Savage4 using Metal is still the best looking version of UT, but it's not as fast as the MX.

To sjp:

I've never had a problem playing online... Even when I was playing Quake on my Cyrix P200MX with a Matrox Mystique220 and 33.6k modem, I didn't have a problem. If someone wants to complain about playing online 'cause he's gettin 70FPS, you need to check yourself. When Q3A Test first hit, I was playing it with a P200 o/c to 262MHz and a Real 3D i740 card. I could play then and I can still play today. Hell, the other game machine on my network had a Permedia2 in it and we played online with it (Celeron366, card at 32bit color) with no problem. I think that the problem is with you. You must suck gerbil ass at online games. If me and my crew could play successfully with a damned Permedia2, you can play with a GeForce.

Your fps does not in any way, shape, or form equate to your ping. Your modem's packet speed does. Guys like you kill me with your little foo-foo problems.

[Edited by logic7 on 26-10-00 at 14:50]
 

old.sjp

Guest
hmmm, wtf you on about ?

for starters i am sure i could kick ur butt at q3 on any system u wish to pick, with whatever type of connection you want :p
it's just that IT'S A LOT MORE FUN with a LOW PING and a nice HIGH FPS, but i guess you're some sort of perv that likes to punish himself (hehe, why play on a 1GHz p3 when u can use that old 8086 that's sitting upstairs ?)

now that's out of the way, i DID NOT SAY fps was ping, just that it had a similar effect. Of course a low ping / high fps combo is best :), but personally i found playing on a modem with a geforce2 far better than having a crappy frame rate and isdn.


Now, here's what i was getting at - it's simple math (as the US would say :)) -

25fps = a delay of 40 msecs between screen updates
40fps = a delay of 25 msecs between screen updates
75fps = a delay of 13 msecs between screen updates
125fps = a delay of 8 msecs between screen updates
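
(if anyone wants to check the arithmetic themselves, here's a minimal Python sketch of the same sum - frame time in milliseconds is just 1000 / fps; the fps values are only the examples from the table above, nothing card specific)

# rough sketch: per-frame delay for a handful of frame rates
for fps in (25, 40, 75, 125):
    delay_ms = 1000 / fps
    print(f"{fps} fps = a delay of {delay_ms:.0f} msecs between screen updates")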

now do we agree that this delay exists between screen updates ? yes ? well it's a tiny bit like your ping then isn't it ?

also if you could knock 25 msecs off your ping would you bother or just say "it's not worth it" ?

also plonkers, the q3 physics model is tied to the frame rate NOT THE PACKET RATE, try and do a quick mid-air rail kill and you can soon feel the diff, the mouse movement is nowhere near as smooth at 40Hz (let's not even mention all the jumps you can do at 125Hz (hehe 1.17 roks :)))

also to be fair, all i was saying is that this generation of video cards is not UP TO RUNNING games in 32bit mode at high res's with max eyecandy, why does this upset some of you ?



*********************************************************

remember, i am not saying your card is shit or anything, i am slagging all manufacturers off equally ......... they should all pull their fingers out and give us ones that can play at 1280x at 32 bit colour depth :)

given a choice i'd rather have 32 bit (although it's not that much better imho :p) and high res over 940x at 16 bit, but for me, we are not at that point yet.
 

old.logic7

Guest
sorry dude, but ping and fps do not have a similar effect. I have never read or heard of such a thing. The only reason that we want high fps is for overhead when things get intense (so the action doesn't get choppy). We don't want it to look like Doom on a 486sx 25MHz. Your fps should always stay above 30fps; at that framerate everything appears to be fluid. 60fps is ideal, but not always attainable. If you're like me (and many other gamers), you usually turn off a lot of eyecandy to get the highest fps possible.

You complain that 70+ is not enough... Take a look at the Gaming section of my website (I haven't updated the section in ages!) http://members.theglobe.com/reflexx1

Take a look at the screenshots for Q2. In Demo1, I was able to get 77.9fps. In Crusher (which is far worse than anything I've ever encountered online) I got 38.9fps. Sounds a bit like you with Q3A with your GeForce, huh? I saw no problem when playing online. I have a broadband connection and see sub-100ms ping times regularly. Even when my ping times go over 200 (usually on a foreign server), I still have fluid images and the only problem is firing lag. Skill is the real problem here. Guys with less than I have can beat me, you, and anyone else they want because of skill. You can blame your hardware all you want, but in the end, it all boils down to skill.
 

old.sjp

Guest
sorry but you are talking shit logic7.


for starters FUCK OFF and don't equate "i want a nice smooth gaming environment" with "i am shit and can't play on a crap system". i can play well on any system, i just WANT it to be as nice as possible.

and if you never notice big frame drops on 32bit eye candy when playing online, i guess you only duel, cos you have obviously never walked into a room and had a firefight.

also pls don't compare q2 with q3, almost every card on the market will give excellent performance on q2, but plenty stutter a bit on q3.

and i am very sorry to break this to you, but if you are hitting 30 fps IT DOES NOT LOOK FLUID, it looks damn jerky, the eye is very sensitive to changes in frame rates.

YES a steady 25 fps looks fine on TV (but that contains motion blurring, but let's not go down that road) and yes a CONSTANT 30 fps in game is "ok", but 60 -> 30 -> 75 in the space of a few seconds stinks.


[Edited by sjp on 27-10-00 at 00:03]
 

old.sjp

Guest
now pay attention and take notes :p

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

I WANT THE BEST FRAME RATES AT THE HIGHEST RESOLUTIONS USING THE BIGGEST TEXTURES AT THE HIGHEST COLOUR DEPTHS WITH AS MANY POLYGONS ON THE SCREEN AS POSSIBLE AT A CONSTANT 85 FRAMES PER SECOND

AND I WANT YOU ALL TO HAVE THE SAME :p

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

ok ? :), now honestly answer the following questions .. :)



do you disagree with the above statement ?


your screenshots show 25 fps at 800x600 on demo001 in q3, do you HONESTLY SAY THAT'S FINE ?
{ can we have a quick vote pls - who thinks 25fps in demo001 in q3 is an acceptable frame rate :) }


a decent player can play well on (almost) anything, so should i want the shittest pc available, to show off me l33t skills ? :p


a low fps is NOT like ping BUT IS an extra delay .... yes ?


mouse aiming is nicer and smoother at higher fps .... yes ?


the physics model is intrinsically linked to the frame rate in q3 and a frame rate of 125 is better than a frame rate of 40 .... yes ?


frame rate variations are bad and should be eliminated at all costs .... yes ?


the minimum frame rate/refresh rate should be 85Hz BECAUSE the human eye is sensitive up to around 85Hz (now be an arse and say no it's not ... :)) .... yes ?


if we want to get picky, the optimal frame rate should be whatever your monitor's refresh rate is (120 in my case) .... yes ?




hehe, now all fuck off and enjoy your chosen game at whatever resolution/frame rate/ping you want :)
 

old.MeddlE

Guest
Originally posted by sjp
the minimum frame rate/refresh rate should be 85Hz BECAUSE the human eye is sensitive up to around 85Hz (now be an arse and say no it's not ... :)) .... yes ?

I'll be an arse and point out that the human eye is only sensitive up to 72Hz, which is why that is considered the first flicker-free refresh rate on a monitor. The higher the better. I've only just got up and felt like being pedantic. ;)
 

old.sjp

Guest
hehe, no prob m8, i can be pedantic 2 :p

actually if you read more up to date research you will find that "now" (it goes up every month :p) the peripheral vision cells in the eye are sensitive up to around 90Hz.

on a personal note, i can "see" my monitor refresh whenever it's set to lower than 100Hz, hence i try and use resolutions that give that as a min (unfortunately that's 1152x... or less :()
 

old.TUG

Guest
Well me can tell the difference between 75Hz and 85Hz refresh rate easily and I can seriously tell the diff between 30fps, 60fps and 90fps+

I dunno if this is any use to this gay argument but that stuff I just wrote above is all true for meee.
 

old.logic7

Guest
I'll give you a moment to go back and tell me where I said 25 fps was good (Quote me if you can)...




There, you didn't find it, did you? Didn't think so. I'm not here to start anything, but you're the one whining about 90fps being the lowest anyone should aim for.

"of course the above only applies if you play at greater than 800x600 and i guess it depends on what you call playable, but if your into first person 90 is the lowest you should be aining for.

how the fuck review sites can say 40 is ok mistifies me, an average of 40 translates to moments of ~15fps when it gets heavy (bit like playing quake on my old 486) so they should stop missinforming ppl ......... "

How can review sites say that 40fps is fine, hmmm.... Maybe because it IS fine. And, if I remember correctly, most current cards don't drop to an average of 40fps until you up the resolution past 1024x768.

Now, about the Q2/Q3 comparison. I pointed that out because we appeared to get similar framerates in these games (I was getting close to 40fps when things got heavy and close to 80 when it wasn't). At the time I took my screenshots, Q3Demo was all there was to get, the full game hadn't been released yet. I haven't re-run any tests cause I'm lazy and, to be quite honest, gaming isn't that important in my life.

Quake 2 and the Crusher Demo were, at one point, the de facto standard for measuring performance. When I took those screenshots, there was no GeForce, Radeon, or VSA100 like there is now. Even then, if you got better than 30fps in Crusher, you had a good card. Nowadays, if you can get 40-60 fps from Quake 3 running the Quaver demo at 1024x768x32 you're doing good (Since I run at 800x600 or 640x480, I don't have any worries). I was drawing a parallel between the two, maybe I should have elaborated on that.

If you believe that all of these review sites are spreading misinformation, why don't you tell them all that they're wrong? You could send a mass email to everyone to let them know that they are spreading lies, making us all feel that we're l33t cause we can get 40fps under heavy conditions. Personally, I don't give a damn, I have my GeForce MX and I'm happy with it. I'm more productive now because of it (remember, I'm a 3D/Graphic artist too, see the "Design" section of my site), and my games are running better (no more crashes, lockups or things of that nature).
 

bodhi

Guest
Never knew flame wars could ensue without my input :) .

Admittedly I sit on the side who can't tell much of a difference between 60fps and 100fps, but I have to say I think we're losing track of the point here, and that is that most of today's cards (voodoo 5 and TNT2 M64 excluded :) ) can run your game of choice at a more than acceptable framerate. I do feel that the phrase "if you play fps you want more than 90 fps" is a bit of a fallacy tho. Have you tried getting more than 90 in UT using D3D? But then 30 fps in UT feels like 60 in q3, which shows that different fps games have different standards, which has to be remembered. Anyway there was a point to this post when I started it, but due to places being open selling intoxicating liquors, I can't remember what it was.

To quote joecartoon "Oh my freakin head, Im so wasted"

:D

Bodhi

P.S I shall no doubt regret this post in the morning. Ah well....


[Edited by Bodhi on 28-10-00 at 04:15]
 

old.sjp

Guest
look, i don't know why i've caused a flame war :p, i will rephrase it again :) -


========================================================

we should all be able to play our chosen fps @

1) 1280x resolution
2) 32-bit colour
3) massive detailed textures
4) lots and lots of tiny polygons
5) a constant frame rate of 85 fps

========================================================

ok, why is that causing a flame war ? :p






now back to the flame :)

logic7 (again) you are talking shit m8 :p

my point is that SOME review sites (toms hardware always fell into this category) talk plain shit when they review a card, i have read a few reviews where they say 25 fps is more than ok !!!, this is plainly shit.

think back, how many said get a TNT because it does 32 bit colour ? quite a few. how many said 32-bit was so slow that it was totally unusable ? not many :/

not all sites do this though thankfully (firing squad/Sharky extreme et al) - they give ppl a more honest review, say what segment of the market it suits, and at least use 60fps as a MINIMUM, moving that up if it's aimed at the "hardcore gamer" (god i h8 that phrase :))



and bodhi is sort of (damn nooooooooooooooo :p) right, required fps varies from game to game (but UT's poxy 40fps in DX mode does NOT feel like quake3's 75 (can't comment on 60 as i never had it that low :)), personally i played q2 quite happily online with a cap of 40.

also driving games look good at 30 (i can't think of a recent one i DON'T run at 1280x at 32bit).

but in quake3 (the game i personally prefer) a high frame rate makes the movement feel so much better (like i keep saying) - it does NOT LOOK any better but it certainly feels it.
 

old.sjp

Guest
logic7,

there is quite an easy way to stop this silly argument :) -

please fire up Q3, play with the settings until u get around 125 fps on demo001 (shouldn't take long on an MX :p).

now pick a map of your choice and move around it using these settings, use the rail a bit, do some strafe jumps (try and do a few impossible jumps :))......


THEN


use \com_maxfps 40 and do the same things at 40Hz (apart from the impossible jumps :)) ....


then come back and tell me u HONESTLY could NOT feel a big difference :)
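
(for the lazy, a rough cfg-style sketch of the test - com_maxfps is the cvar quoted above, the other names (cg_drawfps, timedemo, demo) are the usual q3 ones as far as i remember, so double check them on your point release)

// quick A/B test for an autoexec-style cfg (cvar names other than com_maxfps are assumed - verify on your version)
seta cg_drawfps "1"      // show the frame counter on screen
seta com_maxfps "125"    // first run: capped at 125 - play the map, rail, strafe jump
// timedemo 1; demo demo001   // optional: a repeatable benchmark number instead of "feel"
seta com_maxfps "40"     // second run: capped at 40 - do the same moves and compare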
 

Xavier

Guest
SJP

the framerate and refresh do not equate to a delay ingame; the time from server to client, i.e. the pipe delay, is what affects gameplay

the rest is aesthetics

it's true that the game plays better over 60fps... quake3 uses a scalable physics model that is only really 100% when your framerate scoots over 55fps

BUT

a higher framerate does not equal a lower ping. when you time 0-62 in your motaah do you include turning the key? or the time taken to start a stopwatch? what has the delay in transmission of packets of data between client and server got to do with the fractional delay between screen refreshes? playing at 40fps compared to 60 or 80fps isn't playing in slow motion, it's just not smooth...
 
