102

Modern monitors often have a "sharpness" setting.

But I don't really understand how it makes sense for such a setting to exist.

The software asks the monitor to display a particular pattern of 32-bit RGB values, right?
i.e. The OS might ask the monitor every frame to display a particular 1920×1080×32 bitmap.

But adjusting "sharpness" means letting nearby pixel values affect each other, which would seem to imply that the input is no longer being faithfully represented... meaning it would no longer be displaying what it is asked to display, which doesn't make sense. So I just don't see where this leaves any logical room for sharpness adjustment.

Where exactly does the degree of freedom for adjusting sharpness come from?

user541686
  • 23,629

11 Answers

82

Per https://www.cnet.com/uk/how-to/turn-down-your-tv-sharpness-control/, "sharpness" on an LCD is part of the post-processing.

Even leaving aside rescaling/upsampling (e.g. if you try to display an SD signal on an HD monitor) and the complexities of colour calibration, the monitor does not always display the image as given. This is an unfortunate side effect of marketing.

Monitor manufacturers like to distinguish their product from other products. From their point of view, if you feed the same signal to their monitor and a cheaper competitor, and it looks identical, this is bad. They want you to prefer their monitor. So there's a bunch of tricks; usually out of the box the brightness and contrast are wound right up beyond what is sensible. The "sharpness" is another trick. How do you make your picture look sharper than a competitor that is displaying the picture exactly as sent? Cheat.

The "sharpness" filter is effectively that used in Photoshop and similar programs. It enhances the edges so they catch the eye.

pjc50
  • 6,186
67

Original Question: Where exactly does the degree of freedom for adjusting sharpness come from?

Sharpness is directly related to the type of signal and content you are viewing. Movies typically look better when sharpness is turned down and the pixels are allowed to blur together a bit. On the other hand, a computer display wants high sharpness for clear text and sharp images. Video games are another example where higher sharpness is better. Low-quality TV signals can also be enhanced with sharpness controls.

Because monitors can be used to display a computer screen, a movie, or virtually any other video source, sharpness is still a useful setting.

https://www.crutchfield.com/S-biPv1sIlyXG/learn/learningcenter/home/tv_signalquality.html

EDIT: The OP has indicated in comments that this does not answer the question.

OP: Where in the problem is there room for any adjustment? Like if I tell you x = 1 and y = 2, and then say "oh, and I want x - y = 3". That makes no sense.

The process of converting a live image/video to electrical analog/digital signals, transmitting it over some medium, and recreating that image on a display device is NEVER a 1-to-1 process.

Signal noise, compression loss, manufacturing and equipment variations, cabling/signal type, and other factors come into play. All the adjustments on a monitor are designed to work together to give the end user the highest quality viewing experience - according to the end user. The interpretation is entirely subjective.

OP: This answer does not answer the question of why have the viewer adjust the sharpness when this is already defined by the content creator (be it Spielberg or Excel).

If we are to follow this logic, then why do monitors need or have ANY adjustments at all? The answer is that what we see on the screen is not a 100% accurate representation of the original data.

Appleoddity
  • 11,970
38

You are correct that for a perfect reproduction of the input, the monitor should simply present each pixel as it is delivered.

However, your eyes (and your brain) don't see pixels as separate entities, they see a picture formed from pixels. Depending on what it represents, a picture looks 'better' (more attractive) if parameters are intentionally 'falsified'.

Sharpness typically increases the contrast at colour-change edges. For example, a letter in this text is represented by rows of pixels; one line might look (simplified) like 2-2-2-2-7-7-7-2-2-2-2, where 2 is light grey and 7 is dark grey. Increasing the 'sharpness' increases the brightness falloff at the edge, so the last 2 before the first 7 becomes even lighter (= 1), and the first 7 after the last 2 becomes even darker (= 8). Repeat for the other edge, and you get 2-2-2-1-8-7-8-1-2-2-2. This will look a lot 'sharper' to your eyes.
This is done in both dimensions, and in a somewhat more sophisticated way, but that should convey the basic idea.
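As a quick check, this tiny sketch (plain Python; the 0.2 factor is chosen only so the numbers come out the same as in the example above) reproduces that transformation by pushing each pixel away from the average of its neighbours:

    row = [2, 2, 2, 2, 7, 7, 7, 2, 2, 2, 2]   # 2 = light grey, 7 = dark grey
    amount = 0.2

    sharpened = list(row)
    for i in range(1, len(row) - 1):
        # How much this pixel differs from the average of its two neighbours;
        # exaggerating that difference makes each edge steeper.
        detail = 2 * row[i] - row[i - 1] - row[i + 1]
        sharpened[i] = round(row[i] + amount * detail)

    print(sharpened)   # [2, 2, 2, 1, 8, 7, 8, 1, 2, 2, 2]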

Edit: I thought I made that clear in my answer, but the OP claims he didn’t understand it:
OP Question: ‘what sense does it make’ -> Answer: it appears sharper to your brain.
Many people want that; if you don’t care for it, don’t use it.

Aganju
  • 10,161
38

The answer is that a pixel is not what you think it is. There is no 1-to-1 correlation between digital pixels and physical pixels, due to "subpixel rendering". The way colors are displayed differs from monitor to monitor, but most LCD monitors have distinct RED, GREEN, and BLUE elements arranged in a triangle. Some additionally have a white element, making a quad of elements per "pixel".

[Image: examples of LCD subpixel layouts]

Thus, not all layouts are created equal. Each particular layout may have a different "visual resolution", or modulation transfer function limit (MTFL), defined as the highest number of black and white lines that can be rendered simultaneously without visible chromatic aliasing.

Monitor drivers allow renderers to correctly adjust their geometry transform matrices in order to correctly compute the values of each color plane, and take the best profit of subpixel rendering with the lowest chromatic aliasing.

The "sharpness" on your monitor reduces the natural blending algorithm used to make lines appear to be contiguous when they are not. Turning the sharpness up will increase chromatic aliasing while producing cleaner lines. Reducing the sharpness will give you better color blending and smooth the lines that fall between the subpixel dot pitch.

For more detailed information, see this article: https://en.wikipedia.org/wiki/Subpixel_rendering
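As a rough, hedged illustration of that trade-off (Python/NumPy; the layout, values, and 1-2-1 blend are invented for the example and are not what any particular panel or driver does), consider a thin line that falls between pixel boundaries on one row of RGB stripes:

    import numpy as np

    width = 8
    subpixels = np.zeros(width * 3)   # one sample per subpixel: R, G, B stripes left to right
    subpixels[10:13] = 1.0            # a one-pixel-wide line covering subpixels 10, 11, 12

    # "High sharpness": drive each subpixel from its own sample. Pixel 3 lights
    # its G and B elements, pixel 4 only its R element: a crisp line, but with
    # coloured fringes (chromatic aliasing) at its edges.
    crisp = subpixels.reshape(width, 3)

    # "Low sharpness": blend each subpixel with its neighbours (1-2-1 weights),
    # trading the colour fringing for a softer, wider line.
    blended = np.convolve(subpixels, [0.25, 0.5, 0.25], mode="same").reshape(width, 3)

    print(crisp)
    print(blended)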

HackSlash
  • 5,015
17

You're absolutely right that setting sharpness on your monitor somewhat "distorts" the image from the pixel-accurate data as sent by the computer (or whatever is attached to the other end of the video cable). However, it allows the user to improve their visual experience if the sharpness of the pixel-accurate data being sent does not correspond to their desired sharpness in the image they're viewing.

So the monitor is in effect not doing this:

  1. Receive bitmap from cable
  2. Render bitmap
  3. Goto 1.

but this:

  1. Receive bitmap from cable
  2. Modify bitmap based on user's preferences
  3. Render bitmap
  4. Goto 1.

So the degree of freedom for adjusting sharpness is explicitly added in by the monitor manufacturer, for the purpose of improving user experience.

13

The software asks the monitor to display a particular pattern of 32-bit RGB values, right? i.e. The OS might ask the monitor every frame to display a particular 1920×1080×32 bitmap.

That's not how VGA works. At the monitor level there are no pixels at all.

How displays traditionally worked before the age of LCD is this:

  1. Software asks the device driver to display a bitmap image

  2. Device driver splits the image into three waveforms for R, G and B. That's right, waveforms! Exactly like audio waveforms. Now, these waveforms have a specific format, because while audio is 1-D, pictures are 2-D (see the small sketch after this list).

  3. The analog signals for the lines on the screen are sent to the monitor.
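As a toy illustration of step 2 (plain Python; the numbers are only indicative, and a real driver/RAMDAC is far more involved), one row of 8-bit red values simply becomes a sequence of voltage levels, nominally within VGA's roughly 0-0.7 V range, swept out over the duration of one scanline:

    red_row = [0, 64, 128, 255, 255, 128, 64, 0]             # digital pixel values for one line
    red_waveform = [0.7 * value / 255 for value in red_row]  # approximate analog voltage levels
    print(red_waveform)   # the monitor only ever sees this voltage varying over time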

The monitor never sees a pixel, it only sees lines.

  4. The monitor spits out electrons moving at nearly light speed from three electron guns, and the beams are deflected by a group of controlling electromagnets, causing them to paint the entire screen.

Here is where the sharpness control comes in.

Due to manufacturing tolerances the electron beams almost never converge correctly, and they produce blurry pictures right off the assembly line. In the really old days it was up to you, the person who bought the monitor, to adjust the sharpness at home. Later, more modern versions of these ancient displays had an automatic adjustment process at the factory, but the sharpness adjustment still had to be built in for the process to work.

So the answer is really simple: the sharpness adjustment is there to ensure the picture on the display is sharp.

slebetman
  • 712
6

On a (digital) TV, sharpness controls a peaking filter that enhances edges. That is not so useful on a display used as a computer monitor.

In the previous century, on a high-end analog CRT monitor, sharpness may have controlled the focus voltage of the electron gun. This affects the spot size with which the picture is drawn. Set the spot size too small (too sharp) and the line structure becomes too visible. There may also be annoying moiré interference with the structure of the shadow mask. The optimum setting depends on the resolution (sample rate) of the picture, as many CRT monitors were capable of multiple resolutions without scaling (multi-sync). Set it just sharp enough.

High-end CRT TVs had Scan Velocity Modulation, where the scanning beam is slowed down around a vertical edge, and also horizontal and vertical peaking filters and perhaps a horizontal transient improvement circuit. Sharpness may have controlled any or all of these.

Sharpening in general enhances edges by making the dark side of the edge darker, the bright side brighter, and the middle of the edge steeper. A typical peaking filter calculates a second-order difference, e.g. (-1, 2, -1) in digital processing, and adds a small amount of that peaking to the input signal. If you clip off the overshoots, it reduces to "transient improvement".
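A hedged sketch of that idea (Python/NumPy; the 0.5 amount, the ramp values, and the clipping rule are illustrative, not any particular TV's implementation):

    import numpy as np

    def peaking(signal, amount, clip_overshoot=False):
        # Add a scaled (-1, 2, -1) second difference to the signal.
        padded = np.pad(signal, 1, mode="edge")
        second_diff = np.convolve(padded, [-1, 2, -1], mode="valid")
        out = signal + amount * second_diff
        if clip_overshoot:
            # Limit each sample to the min/max of the original sample and its
            # immediate neighbours: the edge still gets steeper, but the
            # bright/dark halo is removed ("transient improvement").
            windows = np.stack([padded[:-2], padded[1:-1], padded[2:]])
            out = np.clip(out, windows.min(axis=0), windows.max(axis=0))
        return out

    ramp = np.array([100, 100, 100, 120, 160, 180, 180, 180], dtype=float)  # a gradual edge
    print(peaking(ramp, 0.5))                        # steeper edge plus under/overshoot (halo)
    print(peaking(ramp, 0.5, clip_overshoot=True))   # steeper edge, overshoots clipped away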

On some digital devices, the sharpness of a scaler may be controlled, e.g. in my digital satellite TV receivers. This sets the bandwidth of the polyphase filters of the scaler, which converts from a source resolution to the display resolution. Scaling cannot be perfect; it is always a compromise between artefacts and sharpness. Set it too sharp and annoying contouring and aliasing become visible.

This may be the most plausible answer to your question, but only if the monitor is scaling. It would do nothing for an unscaled 1:1 mode.

Source: 31 years of experience in signal processing for TV.

StessenJ
  • 259
5

It doesn't make sense. Or at least it doesn't on most LCD monitors. You will almost always want your "sharpness" set to 0, depending on the monitor or TV (some will blur the signal at 0, so the real unfiltered setting might be somewhere in the middle); otherwise it will apply an edge-enhancement filter, which makes the darker side of an edge darker and the lighter side lighter. This is especially noticeable on cartoons and text. Your mileage may vary, but I think it looks bad in nearly every case.

This is a lossy, irreversible filter that you will probably not want to be activated. Your computer is sending pixel-perfect data, so "sharpness" and blurring filters are generally undesirable.

Also note that the "sharpness" filter/setting is a misnomer. It is impossible to make an image sharper (i.e. give it more detail), only less detailed. The only way to get a sharper image is to use a higher-definition source image.

Beefster
  • 167
4

Sharpness settings exist on LCD panels because manufacturers think digital effects will sell more monitors and TVs. Rather than faithfully represent the input from the computer, the manufacturer gives the user options to tweak the picture to suit personal tastes, however poor those tastes may be.

"Sharpness" is relevant for analog signals (like VGA) and for CRT displays, where the signal is represented by waveforms at some point. Because analog tends to be imprecise, sharpness settings allow calibration for tolerances and compensation for imperfections in analog display output and signal transmission.

Sharpness ought to be irrelevant on LCD panels using DVI, HDMI, and other "pixel-perfect" data sources with a 1:1 resolution mapping. Yes, sharpness distorts the picture in this scenario. Displays in big-box stores often have sharpness and other digital filters cranked to extremes to appear more dramatic than the surrounding displays. Some consumers might actually want these effects because they have grown accustomed to them, or because they are trying to compensate for a poor-quality LCD panel that looks bad to the eye at native output. Sharpness might also be relevant when using a digital signal that must be resized because the source and display have different resolutions.

Overall, you probably want sharpness set to Off or 0 on a modern LCD display with a 1:1 digital signal.

http://hifi-writer.com/wpblog/?page_id=3517 and https://forums.anandtech.com/threads/why-does-an-lcd-tv-with-hdmi-input-need-a-sharpness-control.2080809/

bendodge
  • 241
4

Many monitors can accept a video signal that doesn't have the same resolution as the panel and attempt to scale it as appropriate. If a monitor which is 1280 pixels wide is called upon to display an image which is 1024 pixels wide, and the source material consists of black and white stripes that are one pixel wide, the display would likely show a repeating 5-pixel pattern. On a scale of 0-4, the pattern would likely be 03214. If the black and white stripes in the original are "meaningful", showing them as above may be helpful. On the other hand, the 5-pixel repeating pattern would be a distraction which isn't present in the original. Adding some blur to the image would reduce the aliasing effects of scaling.
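For the curious, here is a small sketch of where such a pattern comes from (Python/NumPy, using plain linear interpolation; a real scaler's filter and phase will produce somewhat different values, so treat the exact numbers as illustrative):

    import numpy as np

    src = np.tile([0.0, 1.0], 512)        # 1024 columns of alternating 1-pixel black/white stripes
    x = np.arange(1280) * (1024 / 1280)   # where each of the 1280 display pixels samples the source
    left = np.floor(x).astype(int)
    right = np.minimum(left + 1, 1023)
    frac = x - left
    dst = (1 - frac) * src[left] + frac * src[right]   # linear interpolation between source pixels

    print(np.round(dst[:10], 2))   # a repeating 5-pixel pattern instead of clean 1-pixel stripes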

supercat
  • 1,819
-1

Different settings are good for different content. These settings could also be changed at the source, but I (and probably many others) don't know where you can change the sharpness setting on a PC.

So there is an easy to access menu on the monitor, where sharpness can be changed.

Christian
  • 247