Question : Problem: Digital output to monitor is blurry; analog output is fine

I recently built a computer, and after some troubleshooting almost everything seems to be working correctly.

The problem: my digital output is not working correctly. In digital mode, the LCD is fuzzy, as if it were not running at its native resolution. (The monitor's information box reports a strange resolution, 1280x1080 if I recall correctly.) Windows' display control panel reports that the resolution is set to 1440x900.
The vexing thing is that if I use a VGA cable (with a VGA-DVI adapter so I can plug it into the card), everything works fine.
I've tried different DVI cables to no avail. I have hooked the monitor up to a different computer (with an ATI video card) over DVI and got a proper image.
I have downloaded the most recent motherboard and video card drivers. I think I have the most recent monitor drivers, but Samsung's website does not specifically list the 906BW as a model with drivers.
After an hour on the phone with BFG's tech support, they concluded that my card might be faulty. I RMA'd it and got a replacement that is almost certainly not faulty.
Unfortunately, the problem still exists.
Has anyone encountered this, or does anyone have any idea what the problem might be? I *can* use VGA and it looks just fine, but since I have digital capability, I'd like to use it.
Thanks!

Theoretically Relevant Computer Specs:
EVGA 680i SLI motherboard
E6600 dual-core CPU
Good RAM (passes memory tests)
X-Fi sound card
BFG 8800 GTX 768MB video card
Samsung SyncMaster 906BW widescreen monitor

Answer : Problem: Digital output to monitor is blurry; analog output is fine

Try lowering the resolution. The monitor is telling you it is in a 1080-line (HDTV-style) mode, but the computer says it is outputting 1440x900, which is also a widescreen format. Each time you change the resolution, check the monitor's menu: what does it report? There has to be a point where the monitor's reported resolution and the video card settings in Display Properties match; once we find that, we can continue troubleshooting.

I am aware that VGA works fine, but here is the difference between the two ports: the DVI port sends the pixel data digitally, while the VGA port sends an analog signal mapped against time, produced by the D/A converter on your video card. Over DVI you are relying on the monitor's ability to interpret the digital signal correctly, which it might not be able to do if the card has a bad digital ground and noise is interfering with the signal.
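If it helps to pin down what the card is actually sending, the mode Windows is outputting can be read programmatically instead of eyeballing Display Properties. Below is a minimal sketch, assuming a Windows box with Python installed, that calls the Win32 EnumDisplaySettingsW API through ctypes: it prints the current output mode and then every mode the driver advertises, so you can check whether 1440x900 even appears over the DVI connection. (Python and this script are my own illustration, not something from the original thread.)

```python
# Sketch: query the mode the video card is actually outputting via the
# Win32 EnumDisplaySettingsW API (Windows + Python assumed).
import ctypes

ENUM_CURRENT_SETTINGS = -1  # special index: "the mode currently in use"

class DEVMODEW(ctypes.Structure):
    """Display-device variant of the Win32 DEVMODEW structure."""
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
        ("dmICMMethod", ctypes.c_ulong),
        ("dmICMIntent", ctypes.c_ulong),
        ("dmMediaType", ctypes.c_ulong),
        ("dmDitherType", ctypes.c_ulong),
        ("dmReserved1", ctypes.c_ulong),
        ("dmReserved2", ctypes.c_ulong),
        ("dmPanningWidth", ctypes.c_ulong),
        ("dmPanningHeight", ctypes.c_ulong),
    ]

user32 = ctypes.WinDLL("user32")

def current_mode():
    # Ask the driver which mode is active right now.
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    if user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm)):
        print(f"Card is outputting: {dm.dmPelsWidth}x{dm.dmPelsHeight} "
              f"@ {dm.dmDisplayFrequency} Hz, {dm.dmBitsPerPel}-bit")

def supported_modes():
    # Walk every mode the driver advertises (index 0, 1, 2, ... until the
    # call fails). If 1440x900 is missing while on DVI, the card never
    # learned the panel's native mode, which would point at the EDID handshake.
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i, seen = 0, set()
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        seen.add((dm.dmPelsWidth, dm.dmPelsHeight))
        i += 1
    for w, h in sorted(seen):
        print(f"{w}x{h}")

if __name__ == "__main__":
    current_mode()
    supported_modes()
```

If the script says the card is outputting 1440x900 while the monitor's own menu still claims something like 1280x1080, the mismatch is happening downstream of Windows, which would suggest the DVI link's handshake (cable, port, or the monitor's digital input) rather than the resolution setting itself.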