» Tue May 17, 2011 10:46 am
No, but the signal must be converted from digital to analog (as per what I wrote). There is a DAC chip before the DVI-I port on the video card which works with the GPU to decide what format the detected monitor requires.
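Just to illustrate the idea (a rough sketch, not the actual RAMDAC circuitry): the DAC maps each 8-bit digital color value to an analog voltage, and VGA video levels run from 0 V (black) to 0.7 V (full intensity), one channel each for R, G, and B.

    # Minimal sketch of what a video DAC does conceptually:
    # map an 8-bit digital color code to a VGA analog voltage.
    def dac_level(code: int) -> float:
        """Map an 8-bit color code (0-255) to volts on one VGA color line."""
        if not 0 <= code <= 255:
            raise ValueError("8-bit DAC expects a code from 0 to 255")
        return 0.7 * code / 255  # 0 V = black, 0.7 V = full intensity

    # Example: mid-grey sent on each of the R, G and B lines
    print(dac_level(128))  # ~0.351 V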
VGA connectors and cables carry analog component RGBHV (red, green, blue, horizontal sync, vertical sync) video signals, plus VESA Display Data Channel (VESA DDC) data. In the old days, and still now on large live events, we use what is called an RGBHV data cable, which is about 3 times as thick and contains 5 shielded coaxial cables with bayonet (BNC) connectors on either end, usually colored red, green, blue, white (H), and black or yellow (V). This used to be the only way of achieving high quality data projection. Now the industry is moving to DVI-D with fibre optic cables.
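If you're curious what that DDC data actually carries: the monitor answers with a 128-byte EDID block, and byte 20 of it (the Video Input Definition) tells the card whether the display wants a digital or analog signal. Here's a small sketch that checks that, assuming you already have an EDID dump in a file (the name "edid.bin" is just an example; on Linux you could get one with a tool like read-edid):

    # EDID blocks start with a fixed 8-byte header.
    HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    with open("edid.bin", "rb") as f:
        edid = f.read(128)

    if edid[:8] != HEADER:
        raise ValueError("not a valid EDID block")

    # Byte 20, bit 7: set = monitor declares a digital input, clear = analog.
    print("digital input" if edid[20] & 0x80 else "analog input")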
So if the monitor only has a VGA input, then the signal must be converted to analog in order to be displayed on a CRT monitor, which is what I'm assuming the OP is now using.
This might help in understanding: http://www.thesmallest.com/lessonettes/dviandvga.html