
I'm trying to understand what causes screen tearing. Suppose that a monitor could update every single one of its pixels instantaneously. Then I imagine refreshes would work like this:

  1. A monitor decides it's going to start a refresh.
  2. It looks at whatever frame the GPU is currently sending it.
  3. It atomically updates all of its pixels at once.

With this kind of procedure, it seems like it should be completely impossible to get screen tearing, regardless of refresh rate or FPS: only a single image is ever drawn at a given time.
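To make that mental model concrete, here's a rough sketch of what I'm imagining (all the names here are made up, not a real API):

```python
# Hypothetical "atomic" refresh: the monitor grabs whatever frame the GPU
# is currently sending and updates every pixel in one indivisible step.
def atomic_refresh(monitor, gpu):
    frame = gpu.current_frame()   # step 2: look at the current frame
    monitor.pixels = frame        # step 3: all pixels change at once
```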

Now I know this isn't how CRT monitors work with their scanning gun (or whatever it's called). But I was under the impression that newer monitor technologies didn't work like this. Do they actually update pixels gradually, and not at once?

alecbz
  • 368

2 Answers


Steps 2 & 3 assume that all frame data is transferred to the monitor in an immediate and atomic fashion; it is not. A "dumb" monitor never "sees" (or buffers) a full frame of video. Monitors still work on the same principles from when we used scanning electron beams to draw pictures. Snazzier TVs might buffer images and do inter-frame processing, but a computer monitor probably won't.

What the monitor sees is merely a data stream coming from your graphics card. All sorts of preliminary information is sent to the monitor to tell it what format that data stream will be in, so it gets timing details, the number of horizontal lines, the number of vertical lines, and the colour format, but what it actually gets after that is simply a long string of pixel colour data.
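As a rough sketch of that streaming model (the function and parameter names here are invented for illustration, not any real display protocol):

```python
# Scanout sketch: the card streams pixel data one line at a time. The
# monitor never holds a complete frame; it lights pixels as data arrives.
def scan_out(framebuffer, width, height, send_pixel):
    for y in range(height):              # one horizontal line at a time
        for x in range(width):
            send_pixel(x, y, framebuffer[y][x])
        # horizontal blanking interval would go here
    # vertical blanking interval: the gap before the next frame begins
```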

Your steps 2 & 3 actually occur in the graphics card, and step 3 will only "appear" to be the case if you enable vertical sync.

At any point in the video frame the GPU can decide to swap its video buffer to a new picture and carry on sending data from that point in the buffer. If vertical sync is not enabled then it will carry on sending the new buffer data to the monitor from the exact same point it left off in the old buffer. This is your "tear" point.

If you have vertical sync enabled then the GPU will wait for the full frame to be sent before it switches buffers, in which case you will not see a "tear".
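Here is a toy simulation of both cases, just to illustrate the idea (the frame height and names are made up):

```python
HEIGHT = 8  # lines per frame in this toy example

def scan_frame(front, swap_at_line=None, do_swap=None):
    """Read out one frame line by line from whichever buffer is 'front'."""
    shown = []
    for line in range(HEIGHT):
        if line == swap_at_line and do_swap:
            do_swap()                    # GPU swaps buffers mid-scanout
        shown.append(front()[line])
    return shown

old_frame = ["old"] * HEIGHT
new_frame = ["new"] * HEIGHT
state = {"front": old_frame}

def swap():
    state["front"] = new_frame

# No vsync: the swap lands while line 5 is being sent, so the monitor
# shows the top of the old image and the bottom of the new one.
print(scan_frame(lambda: state["front"], swap_at_line=5, do_swap=swap))
# ['old', 'old', 'old', 'old', 'old', 'new', 'new', 'new']  <- tear at line 5

# With vsync the swap is deferred until the whole frame has been sent
# (the vertical blank), so every scanned-out frame is from one buffer.
```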

Mokubai
  • 95,412

Screen tearing is a visual artifact in video display where a display device shows information from multiple frames in a single screen draw. The artifact occurs when the video feed to the device is not in sync with the display's refresh rate.

Source: Wikipedia.

All monitors, LCD and CRT, refresh on a clock at a known and predictable time. Various technologies allow the graphics card to know this clock cycle so that the GPU can send data at the best time for the monitor to receive it and display the next screen's-worth of pixels.

This is typically called the refresh rate; on some TVs you'll see it advertised as a Hz (cycles per second) number. The most common these days is probably 60 Hz, which means 60 cycles per second, though 120 Hz is becoming more common.
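As a quick sanity check on those numbers, the time budget per refresh is simply one over the rate:

```python
# Refresh period = 1 / refresh rate, shown in milliseconds.
for hz in (60, 120):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")
# 60 Hz -> 16.67 ms per refresh
# 120 Hz -> 8.33 ms per refresh
```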

The synchronization technologies mentioned above are generally used to prevent screen tearing, though they usually have limits on how much they can resolve.

music2myear
  • 49,799