
I've tried a lot of things but I can't seem to get it working correctly. Below is my setup.

The laptop is a Dell Latitude E6535, which has an NVIDIA NVS 5200M video chip.

I have two external AOC i2367Fh 23-inch IPS LED monitors: one connected to the laptop through HDMI and the other through VGA.

The problem is that the text and overall image look perfect and sharp on whichever of the two I connect through HDMI, while the one on VGA just does not seem right: the text is not crisp, and the image is just blurry enough that you can't quite tell it's blurry, but you feel something is off. I asked my wife to look at both monitors without telling her there was an issue, and she said the same thing.

Now, I have made sure it is not the cable: as I said, if I connect the monitor that looks odd through HDMI instead of VGA (swapping which monitor uses which cable), it looks fine and the other one looks bad. I've also tried two different VGA cables.

When I go to the NVIDIA settings, only one of the monitors can use the NVS 5200M chip, while the other one uses the Intel HD Graphics 4000 adapter. I think that depends on which one I make my main display, but making the VGA monitor my main display (so that it uses the NVS 5200M) does not fix the issue; it still looks bad.
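To confirm which adapter Windows associates with each output, a quick Python sketch like the one below (just an illustration using the Win32 EnumDisplayDevices call through ctypes; it only reports what Windows itself sees) lists every attached display together with the adapter driving it:

```python
import ctypes
from ctypes import wintypes

# Win32 DISPLAY_DEVICEW structure, as used by EnumDisplayDevicesW.
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),     # e.g. \\.\DISPLAY1
        ("DeviceString", wintypes.WCHAR * 128),  # adapter name, e.g. "NVIDIA NVS 5200M"
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

ATTACHED = 0x1   # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
PRIMARY = 0x4    # DISPLAY_DEVICE_PRIMARY_DEVICE

user32 = ctypes.windll.user32

i = 0
dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    if dev.StateFlags & ATTACHED:
        primary = " (primary)" if dev.StateFlags & PRIMARY else ""
        print(f"{dev.DeviceName}: {dev.DeviceString}{primary}")
    i += 1
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
```

On this setup the output should match what the NVIDIA control panel reports: one display behind the NVS 5200M and the other behind the Intel HD Graphics 4000.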

The resolution that I'm using is the native one for the monitors, 1920 x 1080.

I already tried tuning ClearType, but that did not fix it either.

Any ideas are welcome. Thanks

EDIT:

Thank you, everyone, for the responses/suggestions. It seems that in order to get this working I will need either a docking station or an adapter; I'm considering a USB-to-HDMI adapter or a docking station with a DP-to-HDMI adapter.

LAST UPDATE:

The USB-to-HDMI adapter did NOT work for me at all. I went with the docking station and a DP-to-HDMI adapter instead, and it worked flawlessly: crisp image and text on both external monitors.

silverCORE

7 Answers


VGA is analog; HDMI is digital. That means the digital output of your computer is converted to an analog VGA signal, and that analog signal is converted back to a digital signal by your monitor. The quality of these conversions depends on the cable, the connectors and especially the analog/digital converter components in your graphics card and your monitor. It can be very good, but never perfect: some data is always lost or changed. At low resolutions this difference is not noticeable; at higher resolutions it is. And since the analog use case is not very common nowadays, you can expect vendors to use cheaper/worse A/D converters in current hardware, not better ones. See also: http://www.brighthub.com/computing/hardware/articles/23769.aspx
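To make that concrete, here is a rough numpy sketch (the noise and bandwidth figures are invented purely for illustration) of one scanline going through a digital -> analog -> digital round trip; the recovered pixel values no longer match the originals:

```python
import numpy as np

rng = np.random.default_rng(0)

# One scanline of a worst-case sharp pattern: alternating black/white pixels.
pixels = np.tile([0, 255], 960).astype(float)          # 1920 pixels wide

# "DAC" in the graphics card: map 0..255 onto the ~0..0.7 V VGA signal range.
voltage = pixels / 255.0 * 0.7

# Cable, connectors and limited analog bandwidth: a little noise plus smearing
# of each sample into its neighbours (values made up for illustration).
voltage += rng.normal(0.0, 0.005, voltage.shape)
voltage = np.convolve(voltage, [0.1, 0.8, 0.1], mode="same")

# "ADC" in the monitor: sample the voltage back to 8-bit pixel values.
recovered = np.clip(np.round(voltage / 0.7 * 255.0), 0, 255)

print("original :", pixels[:6])
print("recovered:", recovered[:6])
print("worst-case error:", int(np.abs(recovered - pixels).max()), "grey levels")
```

Even this simplistic model leaves the black/white pixels tens of grey levels away from where they started, which is exactly the kind of softening you see on the VGA monitor.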

Matthias

Each and every VGA flat-panel display has an "Auto" button. It automatically adjusts the interpretation of the analog signal to achieve a (more or less) pixel-perfect mapping.

Activate this function while the displayed image reaches all the way to the edges of the screen (no black borders) and some text is visible.

Still, 1080p is at the upper end of what's possible at all over VGA, and many devices nowadays have fairly low-quality VGA output.

user219095

Basically, HDMI is digital and VGA is analogue.

There are a few solutions:

1) Buy a docking station which provides access to 2x HDMI or 1x HDMI + 1x DVI.

2) Use an on-board DVI instead of the VGA.

3) Buy an HDMI -> VGA converter: with both monitors then running over analog VGA, the first one will no longer seem blurry in comparison with the HDMI one.

Martin F
Sarima

As others have said, VGA is an analog signal but the pixels in a flat panel display are digital. The monitor has to know where in the VGA analog waveform to sample the signal to convert it to digital. The auto adjust feature on flat panel displays attempts to guess the best timing for that sampling. But if the timing isn't perfect, you will get a blurry image.
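Here is a rough numpy sketch of that sampling problem (the numbers are arbitrary, only meant to illustrate the effect): an alternating black/white pixel row becomes a band-limited "analog" waveform, and sampling it at the wrong phase lands on the transitions, producing grey values instead of pure black and white:

```python
import numpy as np

pixels = np.tile([0.0, 1.0], 16)   # 32 pixels of a single-pixel checkerboard row
oversample = 100                   # "analog" samples per pixel

# Ideal analog waveform: each pixel's level held for `oversample` samples,
# then low-pass filtered to mimic the limited bandwidth of card, cable and connectors.
analog = np.repeat(pixels, oversample)
analog = np.convolve(analog, np.ones(60) / 60, mode="same")

def sample(signal, phase):
    """Take one sample per pixel, `phase` samples into each pixel period."""
    idx = np.arange(len(pixels)) * oversample + phase
    return signal[np.clip(idx, 0, len(signal) - 1)]

good = sample(analog, oversample // 2)  # sampling in the middle of each pixel
bad = sample(analog, 0)                 # sampling right on the transitions

print("good phase:", np.round(good[1:9], 2))   # ~0 and ~1: crisp black and white
print("bad phase: ", np.round(bad[1:9], 2))    # ~0.5: everything turns grey
```

The auto adjust function is essentially searching for that "good phase" value, which is why the image it sees while searching matters so much.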

To give your monitor the best chance for the auto adjust to pick the correct timing values, you need to display an image with lots of high-contrast transitions. My go-to for this type of image is a single-pixel black/white checkerboard. And this is the place I always go to get that checkerboard: http://techmind.org/lcd/phasing.html
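If you prefer generating the pattern locally, a few lines of Python with Pillow (my own quick sketch, not from that page) will write a native-resolution single-pixel checkerboard you can open full screen before pressing Auto:

```python
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 1080   # use your monitors' native resolution

# Single-pixel checkerboard: white where (x + y) is even, black where it is odd.
x = np.arange(WIDTH)
y = np.arange(HEIGHT)[:, None]
board = (((x + y) % 2) * 255).astype(np.uint8)

Image.fromarray(board, mode="L").save("checkerboard.png")
print("Wrote checkerboard.png - open it full screen, then press the monitor's Auto button.")
```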

tl;dr: Go to http://techmind.org/lcd/phasing.html, maximize your browser window, and press your monitor's auto adjust button.

longneck

I suspect the lower quality you see is due to the analogue nature of the VGA signal. The higher the bandwidth you use (higher resolutions), the worse it becomes.

I see one reliable but not exactly cheap solution: a docking station. I checked briefly, and the one for the 6540 comes with dual DisplayPort and dual dual-link DVI outputs; make sure it has the ports you need before you buy. Then, with both screens connected digitally, you should see crisp text on both. You might still see a difference in colour, though.

TheUser1024

I think the following may be of help to you.

I've actually had the same problem myself (I obviously can't tell if the root cause is the same, but the symptoms were very similar). Here are my specs, my findings and the trick I use to work around it:

Spec: I'm using two identical Samsung monitors, a custom-built desktop and a Lenovo laptop (no docking station). The Lenovo laptop has a Mini DisplayPort and a VGA port. Both computers go through a dual-screen video switch, which lets me switch between them while still using both monitors with each.

Symptoms: When I switch to the laptop, the monitor plugged in through VGA (VGA -> switch -> VGA) gets a slight blur, which is pretty aggravating when you're typing, reading, etc. The problem never happens to the other monitor AND, more importantly, it also doesn't happen to that same monitor when I switch to my desktop, which sends its output through DVI (DVI -> switch -> VGA). That isolates the problem to the laptop side (the monitor, the switch and pretty much the cable are fine).

Findings: Initially, I realized that when switching to the laptop, Windows discovered the monitors differently (one had its factory name and one was just recognized as a generic monitor). Along with those discovery settings, the refresh rate was different (60 Hz on the correct output, 59 Hz on the other one). So I set the second one to 60 Hz and that seemed to fix the problem temporarily. A few days later the problem showed up again, and this time the refresh rate was still correct, sitting at 60 Hz. I ended up getting a correct picture again by triggering a new discovery run, clicking "Detect" (both screens go black for a second and then the display comes back).

Current trick: This is a very weird bug, possibly in the display output, drivers or hardware of the laptop, which I can't control anyway because it's a company laptop. All I know is that a straight-up discovery run doesn't always discover thoroughly, so my trick is to unplug the VGA output, hit "Detect" again, and then plug the VGA cable back in. That seems to force the "deep" discovery and fixes my problem temporarily.

Obviously, if anyone can pinpoint the root cause and fix the problem long term, please write something here! All I know is that the answers above are pure speculation and hence largely irrelevant.

nairod

I ran into the same issue, and I am also using an AOC monitor. I almost went with the recommended answer (the one with the green tick).

But I did a bit of research: the monitor's on-screen menu has a setting, Extra -> Auto Config, which defaults to No. Change it to Yes; the monitor then shows a progress bar, and when it reached 100% the issue was gone.

No need to buy anything!

The question was asked 10 years ago; I hope my answer can still help.