Old 06-07-09, 08:57 AM   #9
CaptainHaplo
Silent Hunter
Join Date: Apr 2007
Posts: 4,404
Downloads: 29
Uploads: 0
Yes and no. The explanation I gave isn't perfect by a long shot; both do some scaling, in a sense. What happens is your video card outputs a signal at a certain resolution - whatever you set it to in the OS. If that isn't the same as what the monitor is set to, then yes, the monitor has to scale it to fit. However, most current monitors simply have a range of supported resolutions, so they accept any matching input and display it - no scaling needed. The "native" resolution is the one the monitor displays best. Only monitors with their own "manual" resolution selection thus need to do any scaling. Edit - I forgot about the auto-adjust function; that does apply some minimal scaling to the input so it fills the full screen area. So pretty much all monitors do SOME scaling.
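
To illustrate the idea above, here's a rough sketch (not real monitor firmware - the mode list and function name are made-up example values) of the decision a display makes when a signal comes in:

```python
# Hypothetical example: how a monitor might decide what to do with an
# incoming signal. The supported-mode list and native mode are invented
# for illustration only.
SUPPORTED_MODES = [(1920, 1080), (1680, 1050), (1280, 1024), (1024, 768)]
NATIVE_MODE = (1920, 1080)  # the resolution the panel displays best

def display_action(input_mode):
    """Return how the monitor handles an incoming resolution."""
    if input_mode == NATIVE_MODE:
        return "display 1:1, no scaling"
    if input_mode in SUPPORTED_MODES:
        # the auto-adjust case: stretch a supported mode to fill the panel
        return "accept and auto-adjust to fill the screen"
    return "out of range"
```

So a native-resolution input passes straight through, while other supported modes get that minimal auto-adjust stretch.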

The key is the video card. Ever notice in Windows, when you go to set the resolution, that there is a "recommended" one? That recommendation is based on the monitor, and it's also the resolution Windows draws at initially. So the GPU renders at this resolution for output to the monitor. But if you have the monitor set to a different resolution, the GPU then has to scale the rendered frame down - because IT is the one that knows what resolution to output. The monitor is passive in that exchange. Make better sense now?
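
That GPU-side rescaling step can be sketched in a few lines - this is just a toy nearest-neighbour resample over a frame stored as a 2D list, not what any real driver does:

```python
def scale_frame(frame, out_w, out_h):
    """Nearest-neighbour rescale of a frame (list of rows of pixel values).

    Toy illustration of the GPU scaling a rendered image to the output
    resolution; real hardware uses filtered scaling, not this.
    """
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]
```

For example, scaling a 2x2 frame up to 4x4 just repeats each source pixel in a 2x2 block.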
__________________
Good Hunting!

Captain Haplo