Two possibilities:
First, you have to remember that your monitor has a "native" resolution. That's what the panel runs best at - the GPU has to do minimal scaling at native compared to trying to run something different.
When your PC knows the native resolution of the display - and you change it - you're making the GPU do twice the work. First it has to draw the picture in memory, then redraw it at the odd, funky scale you picked just because you don't want to use the native res. That isn't exactly how it works under the hood - but running at a non-native res is often harder on the video card.
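If you want to check what "native" actually is on your setup, here's a rough sketch of my own (nothing to do with any particular game) using SDL2's SDL_GetDesktopDisplayMode, which reports the desktop mode - on most systems that's the panel's native resolution:

/* query_native.c - sketch: ask SDL2 for the desktop display mode,
 * which is usually the panel's native resolution.
 * Build (assuming SDL2 dev files are installed):
 *   gcc query_native.c -o query_native `sdl2-config --cflags --libs`
 */
#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_DisplayMode mode;
    if (SDL_GetDesktopDisplayMode(0, &mode) != 0) {  /* display 0 = primary monitor */
        fprintf(stderr, "SDL_GetDesktopDisplayMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* If a game runs fullscreen at anything other than this, something
     * (GPU or monitor scaler) has to stretch the image to fit the panel. */
    printf("Desktop (likely native) mode: %dx%d @ %d Hz\n",
           mode.w, mode.h, mode.refresh_rate);

    SDL_Quit();
    return 0;
}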
Second - with CPU-intensive games, the CPU is in charge of "throttling down" on the graphics. When you push the graphics settings up, you're actually not making the CPU spend cycles "deciding" what to draw and what not to - it just sends it all to the video card, and the GPU draws it. Raising the detail level actually frees up the CPU: more data moves off the CPU and onto the GPU, less info sits in the CPU's cache and more of it - like texture data - lives in the card's memory. As long as the GPU can handle it, you're going to see better frame rates in the rare cases where the game is a CPU hog rather than a GPU hog. It really all depends on where the bottleneck is - you can get a feel for that yourself, as the rough sketch below shows.
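One crude way to see which side is the bottleneck is to time the two halves of each frame: the CPU-side work (game logic plus submitting draw commands) versus the present, where the CPU ends up waiting on the GPU. This is only an illustration of the idea, again with SDL2, and the busy loop standing in for "game logic" is a made-up placeholder:

/* frame_timing.c - sketch: split each frame into "CPU work" and
 * "present/wait" and report the averages. If the CPU half dominates,
 * you're CPU-bound; if most of the frame goes into present, the GPU
 * (or vsync) is the limit. Crude, but it shows the idea.
 */
#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    SDL_Window *win = SDL_CreateWindow("frame timing", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 1280, 720, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1,
                          SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    const double freq = (double)SDL_GetPerformanceFrequency();
    double cpu_ms = 0.0, present_ms = 0.0;
    const int frames = 300;

    for (int i = 0; i < frames; i++) {
        Uint64 t0 = SDL_GetPerformanceCounter();

        /* Placeholder for game logic + draw submission (the CPU side). */
        volatile double sink = 0.0;
        for (int k = 0; k < 100000; k++) sink += k * 0.5;
        SDL_SetRenderDrawColor(ren, 30, 30, 30, 255);
        SDL_RenderClear(ren);

        Uint64 t1 = SDL_GetPerformanceCounter();
        SDL_RenderPresent(ren);   /* CPU stalls here if the GPU is behind */
        Uint64 t2 = SDL_GetPerformanceCounter();

        cpu_ms     += (t1 - t0) * 1000.0 / freq;
        present_ms += (t2 - t1) * 1000.0 / freq;
    }

    printf("avg CPU-side work: %.2f ms, avg present/wait: %.2f ms\n",
           cpu_ms / frames, present_ms / frames);

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}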
__________________
Good Hunting!
Captain Haplo