The Definitive Guide to the Truth About Video Specifications

by Scott Wallace

Shopping for a video display - whether a flat panel TV or projector - can be a virtual minefield littered with a host of specs whose big numbers are supposed to imply superior performance. And while, yes, numbers tell part of the tale, they are far from the only barometer, and arguably not even the clearest indicator of potential performance. Video performance is a complex mix of many design choices that ultimately show up on screen in one way or another. Let’s look at some of the headlining specs that companies often use to promote a product’s capability.

Contrast Ratio


Easily the most abused specification in the industry. Contrast ratio is, by definition, the ratio between a display’s brightest white point and how close to black it can get - how the two ‘contrast’ with each other. The resulting ratio (1000:1, for example) becomes the one published. But… if one adjusts a video display ONLY to achieve the deepest black it can produce, the resulting picture is most likely unwatchable, as the impact on the rest of the picture is significant. Similarly, if one adjusts a video display only to be as bright as possible, the resulting picture is not only unwatchable but likely painful to watch! Actual contrast ratio is a byproduct of all settings working together to achieve the proper balance of black level and white level (brightness, contrast, hue, etc.). Often, the video display with the larger published contrast ratio is not the one with the higher contrast ratio once both are properly set up.
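To make the math concrete, here is a quick sketch of how the ratio itself is computed. The luminance numbers (in nits) are purely hypothetical, chosen to show how pushing a panel to its extremes inflates the spec-sheet number versus a balanced, watchable setup:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Contrast ratio is simply peak white luminance divided by black luminance."""
    if black_nits <= 0:
        raise ValueError("black level must be > 0 nits (a true 0 gives an infinite ratio)")
    return white_nits / black_nits

# Hypothetical measurements from a properly balanced calibration:
print(f"{contrast_ratio(300.0, 0.30):.0f}:1")  # 1000:1

# The same (hypothetical) panel pushed to its extremes for the spec sheet -
# blinding white, crushed black, unwatchable picture, bigger number:
print(f"{contrast_ratio(450.0, 0.09):.0f}:1")  # 5000:1
```

Note that the published figure depends entirely on which settings produced those two measurements, which is exactly why two displays with the same spec can look nothing alike.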

When the video display is a projector, there are even MORE factors at play. The size of the screen, the type of screen material (white screen, gray screen, is it acoustically perforated or woven for sonic transparency as opposed to being a solid surface, etc.), the amount of light in the room (and how much of that light is hitting the screen), and the color of the wall and ceiling surfaces in the room all conspire and commingle to create a picture that can be very different from one room to the next, and can negatively impact the output of even the most high-end projectors if not properly managed.

Oh, but wait, there’s more!


Once the light from the projector hits the screen, the resulting picture becomes a light source itself that then broadcasts into the room and hits adjacent wall and ceiling surfaces. The lighter the surroundings, the more they act as a light source reflecting back on to the screen. There is, after all, a reason that movie theaters all have very dark color schemes, and that all but the stair and exit lights go off once the movie starts. The closer a home theater can get to this environment (and in fact, it can easily surpass it, since one can choose to have all light in the room off during a movie or video), the better the video performance will be… especially with projection.

A flat panel TV is its own light source. Therefore what changes is how our eyes see the picture, not the picture itself. The reason to have dark surroundings with a flat panel TV has more to do with removing visual distractions.

It is worth noting that the contrast control itself only adjusts the white level.
So, what adjusts black level? You guessed it…


Brightness

The brightness control of a video display raises or lowers the black level. When raised to a higher number, our eyes perceive the picture as getting lighter, but it’s actually diluting those deep blacks into a milkier kind of gray. When brightness is lowered, the image gets darker and gets closer to black, but with significantly less detail visible in darker areas of the picture. Cinematographers work very hard to shape light into defined areas of light and shadow. So when setting up a video display to accurately portray this, one MUST adjust for the deepest black possible while also assuring that shadow details are not lost into an indiscernible mass of black.

This was actually a big topic of discussion a few years ago during Game of Thrones season 8, when many fans took to the internet to complain that the episode “The Long Night” was way too dark to see anything. A perfect segue into our next setting…



Gamma

‘Gamma’ defines how quickly a picture comes out of, or goes into, its deepest black. A (measured) gamma of 2.2 is the industry standard. Gamma is akin to how we might compare a good subwoofer to a poor one. A good subwoofer will stop and start as the signal dictates. A poor subwoofer that can’t stop and start on the proverbial dime will produce sound after the actual signal has stopped. And so, with video, proper gamma will reveal those light and shadow details as they were designed to be seen by the director and cinematographer. Improperly set gamma will see those details fall into black, or fade out of black, incorrectly.
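As a rough illustration of why gamma matters for shadow detail, here is a simplified power-law model (real display standards also add wrinkles like a small linear segment near black, so treat this as a sketch, not a calibration tool). It shows how much light the same dim 10% signal produces at three different gamma values:

```python
def display_luminance(signal: float, gamma: float = 2.2) -> float:
    """Relative light output (0..1) for a normalized input signal (0..1),
    using a simple power-law gamma model: output = signal ** gamma."""
    return signal ** gamma

# A dim shadow detail sitting at 10% signal level:
for g in (1.8, 2.2, 2.6):
    print(f"gamma {g}: relative luminance {display_luminance(0.10, g):.4f}")
# gamma 1.8: relative luminance 0.0158
# gamma 2.2: relative luminance 0.0063
# gamma 2.6: relative luminance 0.0025
```

At gamma 2.6, that same shadow detail puts out well under half the light it does at 2.2 - which is precisely how detail “falls into black.” Too low a gamma has the opposite problem: shadows float up out of black and the image looks washed out.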


Color Temperature


Video content is mastered at a color temperature of 6500 Kelvin, sometimes referred to as D65. It is common for TV manufacturers to ship TVs to retail locations on a setting that produces a higher color temperature, as that results in a bluer picture that the eye can perceive as being brighter and cleaner. Remember those laundry detergent commercials promising ‘whiter whites’? Yeah, tinting things blue is how that’s done. Ick. That is an optical illusion and not technically correct. Reproducing white without tint or color shift requires a color temperature of 6500 Kelvin. For TVs, this color temperature is usually hidden underneath verbiage like ‘Cinema’ or ‘Expert’, whereas a bluer color temperature uses verbiage like ‘Standard’ or ‘Natural’. Don’t be fooled. If the picture seems to take on a slightly warmer, browner tone in Cinema or Expert mode, this is much closer to correct. Projectors, however, tend to have settings that specifically call out the target color temperature, and in such cases, D65 represents true 6500 Kelvin.

It’s not a bug… it’s a FEATURE!

Flat panel TVs in particular often promote “features” that claim to improve contrast, sharpness, or motion. How best to set these picture ‘enhancements’? In most cases, don’t use them at all. These controls fall into three primary categories:

  1. Control of the TV’s LED array to enhance subjective contrast.
  2. Adjustments that are (effectively) sharpening tools.
  3. Motion related adjustments that play with the refresh rate.

Any adjustment that alters the picture on the fly to overcome what is effectively a limitation of the video display itself is best left off. For those times when you may perceive a benefit, there will be many others where its use has a negative impact on subjective picture quality.

Overzealous picture sharpening of one form or another represents another sub-category of adjustments. “Sharpness,” when set too high, produces a picture artifact: artificial outlines around edges. The resulting subjective impression is that the image is sharper. In fact, you have added noise to the picture that wasn’t there before engaging the ‘feature’. Different manufacturers have different landing spots for the actual sharpening control to achieve a correct level of picture sharpness without defocusing the image, but also without artificial outlining. Ask your Definitive Audio salesperson where it should be for the TV that you have!

Refresh Rate

The adjustment of refresh rate is a controversial subject. A faster refresh rate can be perceived by the human eye as crisper motion that remains in focus throughout the movement. However, this runs counter to how our vision works. Still, for some this is appealing for that very reason. For others, the effect can read as artificial because it deviates from the way our own eyes perceive motion. If you scan quickly across something with your eyes (go ahead, look at something to your left, then quickly look to your right and focus on something there), your eyes rest and focus only two times: when you first start on an object, and where your eyes stop and land on another. You are not perceiving sharp detail throughout the movement. The standard refresh rate for our North American TV system is 60Hz (movies are historically shot at 24 frames per second, but digging into this is a subject for another time, and in the modern era it is no longer always the case). Once refresh rates enter 240Hz territory, we get into a phenomenon that has come to be known as the “soap opera effect.” This high a refresh rate gives motion an artificial ‘clarity’ that may look technically sharper, but for most people is simply too unnatural. We have one employee for whom an all-day headache awaits if he even looks at a TV with this high of a refresh rate! We generally do not recommend this high a refresh rate ever be engaged as a “feature.”
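To see where those extra frames come from, here is a deliberately crude sketch of motion smoothing. Real TVs use far more sophisticated motion-compensated interpolation, not the simple pixel blending shown here, but the core idea is the same: the TV synthesizes in-between frames that were never in the source, which is what produces that artificial ‘clarity’ (all frame data below is made up for illustration):

```python
def interpolate_frames(frame_a, frame_b, steps):
    """Crude frame interpolation: linearly blend pixel values between two
    source frames to synthesize `steps` new frames that never existed."""
    synthesized = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # blend weight moves from frame_a toward frame_b
        synthesized.append([a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)])
    return synthesized

# Two consecutive 24 fps frames, reduced to tiny 4-pixel "images" (values 0..255):
a = [10, 50, 90, 130]
b = [20, 60, 100, 140]

# Insert 4 synthetic frames between them to fill out a 120Hz cadence:
for frame in interpolate_frames(a, b, 4):
    print([round(p, 1) for p in frame])
```

Simple blending like this would just produce ghosting; a real TV estimates how objects move between frames and shifts them along that path, which is computationally harder and is where interpolation artifacts come from.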

Next up in our video series… “Should I get a flat panel TV or a projector?” We have opinions!