Xavier (Guest)
By now we've all read articles pointing fingers at company X for "optimising" their drivers specifically for application Y, and for the most part the only outcome has been slapped wrists all round for those concerned...
Personally, I don't believe you can call any alteration to a game's codepath which results in poorer IQ an 'optimisation'; after all, if you wanted things to look worse but run quicker, that's what the texture detail slider is for.
What if, instead of sticking the optimisations into your driver, the 'bias' occurs at an application level, as we saw with 3DMark 2003 back in April - is that any more forgivable? If you accept input from one manufacturer's developer relations team to improve your game's performance on their GPUs, surely you owe it to your prospective customers to have similar relationships with all the major players?
The only other logical answer is to avoid such tweaks entirely, but it would be crazy to suggest that games developers code 'down the line', producing generic DX9 code with zero optimisation for either side; the hardware architectures are, after all, becoming far too diverse.
With games developers beginning to talk in terms of whose hardware their titles favour, a division is already beginning to appear, and they seem fine with that... Isn't it their job to make sure their engine is geared to extract equal performance and eye candy from the flagship products of NVIDIA, ATI et al.? Do we really want to end up buying our graphics card based on which games a GPU will run? Or to have to choose a GPU knowing that certain games will be a non-starter? I thought the days of needing a specific GPU to play a certain game, as we saw with Glide and 3DFX, were long gone... but with some of the rumbles we've been hearing from certain camps lately, we're not a million miles from such a situation becoming reality.