A Guide to HDTVs

In this HDTV Guide

Introduction
Video Basics
Screen Size & Viewing Distance
LCD vs. Plasma
Brightness, Black Level & Contrast Ratio
Color Reproduction
Viewing Angles
Image Retention, Burn-in & Dead Pixels
Motion Handling
Screen Uniformity
Power Consumption
3D
Source Material
Calibration
Conclusion


Introduction

One of the most desirable pieces of technology steadily infiltrating homes across the world is flat panel High Definition Television (HDTV). These TVs retail with a perplexing range of names, features, sizes and technical specifications, guaranteed to confuse all but the most tech-savvy buyer. Even after you purchase your exciting new HDTV and get it home, working out how to get the most out of it and ironing out the kinks can be a chore.


I'm sympathetic to the fact that most people aren't necessarily interested in the details and tech jargon surrounding HDTVs. They just want to know how to select the best display, and how to set it up for optimal viewing enjoyment. Unfortunately things are never that simple. There's a lot of conflicting and often deliberately misleading information available on the Internet which makes things very difficult. This is compounded by the TV manufacturers filling up technical spec sheets with vague marketing terms and dubious claims. Furthermore, there is currently no perfect display technology; they all have pros and cons, and require certain compromises.

The only way to make an educated decision, both at the time of purchase, and subsequently when adjusting the settings on your TV, is to develop a good understanding of the fundamental workings of HDTVs, combined with careful consideration of your own particular circumstances and tastes. This guide helps you to achieve just that, with information ranging from the very basic to more advanced topics for HDTV buyers and owners alike.


Video Basics

If you're not familiar with the terminology frequently thrown around when discussing HDTVs, then now's the time to get a handle on the basics of how video material is played back before going any further. Below is a very condensed run-through of the key concepts.

The picture on a modern HDTV is made up of lots of individual still images of digital material called Frames, shown in rapid succession to create the illusion of moving video. A frame is a single still image, like a digital photo. And just like any digital image, it's made up of lots of individual dots called Pixels, which are the smallest unit of graphical information. The Resolution of a video image is measured in pixel width x pixel height, with the most common resolutions for digital video being 720x480 (NTSC) or 720x576 (PAL) for DVD, and 1920x1080 for Blu-ray. The resolution is often shown in shorthand notation such as 480i or 1080p, referring to the pixel height of the image, and whether it is Progressive (p) or Interlaced (i) video - more on these last two shortly.

LCD and Plasma flat panels have a fixed number of pixels on their screen, and this determines their Native Resolution, again measured in pixel width x pixel height. For example, a 1080p flat panel display has a 1920x1080 native resolution. For any TV to be considered "High Definition", it must be able to natively display a 720p, 1080i or 1080p image. In marketing speak, HDTVs limited to a 720p maximum (e.g. a 1280x768 native resolution) are referred to as "HD Ready", while those which can do 1080p natively are called "Full HD". Where the digital video source being played back on the TV has a resolution which doesn't match the TV's native resolution, the source video will be automatically rescaled up or down by the DVD or Blu-ray player, or by the TV itself, to best fit the screen. Furthermore, if the video maintains its original Aspect Ratio - that is, the ratio of its width to its height - such that it isn't squashed and doesn't have portions cut off, then you may see black bars to the sides or above and below the image.

Original video content can be shot at varying framerates of 24, 25, 30, 50 and 60 Frames Per Second (FPS). The majority of movie content is filmed at 24 FPS. This is a low framerate, and if left unaltered, during fast action it can appear choppy, and can also produce noticeable flickering. To appear more pleasing to the eye, the video needs to undergo some changes. One way to do this is to adapt the original video frames to a higher Refresh Rate, measured in hertz (Hz), which is the number of times per second the screen updates the image it displays. Frame rate and refresh rate are not always the same. In a cinema for example, the projector will actually flash (refresh) each frame of a 24 FPS movie two or three times per frame, resulting in a 48Hz or 72Hz refresh rate, which reduces the perception of flickering that a 24Hz image would otherwise show. The primary benefit of a higher refresh rate is that it leads to less visible flickering, while a higher framerate results in smoother motion.
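The cinema example above is simple arithmetic; here is a minimal sketch (the function name is my own, not a standard term):

```python
def refresh_rate_hz(fps, flashes_per_frame):
    """Refresh rate when each frame is flashed (refreshed) multiple times,
    as a cinema projector does with 24 FPS film."""
    return fps * flashes_per_frame

# A 24 FPS film flashed two or three times per frame:
print(refresh_rate_hz(24, 2))  # 48
print(refresh_rate_hz(24, 3))  # 72
```

Note that the frame rate (24 FPS) never changes here; only the number of times each frame is shown does.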

On traditional Cathode Ray Tube (CRT) TVs, it was originally decided for reasons to do with mains power frequencies to use 60Hz as the standard refresh rate in NTSC countries, such as North America and Japan, and a 50Hz standard in PAL countries, which includes most of Europe, China, Africa and Australasia. There is some trickery involved in converting a 24 FPS movie into the 50Hz or 60Hz standard as relevant.

On an analog CRT television, the image is actually composed of Fields, not frames. A frame is a whole image, while a field is part of an image. To save on bandwidth in broadcast television, a process known as Interlacing was used whereby each frame on a TV screen was actually composed of two separate fields, each containing half a frame. One field would show only the odd-numbered lines of one frame, while the other showed only the even-numbered lines of the next frame. So two slightly different half frames (fields) would be interlaced together, and when shown rapidly in sequence on a phosphor-based CRT, the human eye didn't notice the interlaced fields. The benefit of this method was that it effectively doubled the refresh rate without using any additional bandwidth, resulting in much less flicker and smoother perceived motion than if the original source had been shown unaltered.

But how is a 24 FPS movie actually converted into a 50 Hz or 60 Hz refresh rate? The number 24 doesn't divide evenly into either 50 or 60. In PAL countries, it commonly involves speeding up the 24 FPS movie to 25 FPS, which is only a 4% increase in speed and thus not noticeable. When doubled via interlacing, that 25 frames per second becomes 50 fields per second (50Hz) which is the PAL standard. Things are more complex for NTSC video. If the source is 24 FPS, a process known as Pulldown is used, also known as 2:3 Pulldown or 3:2 Pulldown. Instead of repeatedly interlacing two slightly different fields, pulldown employs an alternating pattern: 2 fields, then 3 fields, then 2 again, then 3, and so on. This 2:3:2:3 field pattern repeats every four frames. With 10 fields being generated for every 4 frames, this equals 60 fields per second (60Hz) for every 24 frames per second (24 FPS), which accomplishes the required conversion. The main problem with Pulldown is that it introduces some Judder, whereby the uneven repeating field pattern can make motion appear slightly jerky at times.
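The 2:3 field pattern can be sketched in a few lines of code; this is an illustration of the counting involved, not an actual video processing routine:

```python
from itertools import cycle

def pulldown_fields(num_frames):
    """2:3 pulldown: each source frame is held for 2 fields, the next for 3,
    alternating. Returns one source-frame index per output field."""
    fields = []
    for frame, repeats in zip(range(num_frames), cycle([2, 3])):
        fields.extend([frame] * repeats)
    return fields

# Four film frames yield ten fields (the 2:3:2:3 pattern):
print(pulldown_fields(4))        # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
# So 24 frames per second become 60 fields per second (60Hz):
print(len(pulldown_fields(24)))  # 60
```

The uneven hold times (some frames shown for 2 fields, others for 3) are exactly what produces the judder described above.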

This process changes slightly on HDTVs, because modern digital displays always show whole frames, and don't generate fields made up of partial frames. This is known as Progressive scan video, and provides a smoother image than interlaced video. On an HDTV, any interlaced video (e.g. 1080i broadcasts) must be converted to progressive via a process known as Deinterlacing. This deinterlacing is not perfect, and depending on the method used, may result in some visual glitches known as artifacts. As a rule, progressive video is smoother and clearer than interlaced video, especially for fast motion. Fortunately, movie content stored on Blu-ray disc is typically encoded in the original 24 FPS progressive scan format, also known as 24p. This means no deinterlacing is required, as whole frames, not fields, are being output. However, conversion to 50Hz or 60Hz (and multiples thereof, such as 100Hz or 120Hz) using speeding up or some form of pulldown will still need to occur.

There is an alternative available for film purists who own an HDTV and a Blu-ray player which are both capable of "native" 24p playback: the movie can be played back at its original 24 FPS without any conversion such as pulldown. The TV may still refresh each frame multiple times to achieve a higher refresh rate to reduce flicker (e.g. 48Hz, 96Hz or 120Hz), but the original film frame rate is not altered by pulldown or speedup.

The above is of course a highly simplified summary, and there are a lot of complexities, nuances and omissions which videophiles will undoubtedly point out. For now though, it's enough if you feel you have a reasonable understanding of what's covered above. We will expand upon some of these topics later in this guide.


Screen Size & Viewing Distance

Before considering anything complex, let's first address the most commonly-asked question when people go to purchase a new HDTV these days: "How big should I go?" Unfortunately the most commonly-provided answer is: "The bigger the better!". This is not a universal truth - bigger is not always better; indeed bigger can sometimes be worse, especially if you sit close to a large screen.

There is no precise scientific value for the distance you should sit from a particular sized screen, but several very general rules are commonly cited:


  • 3 x Picture Height - Measure the actual height of the screen area, then multiply that by three to determine the viewing distance.
  • 1.5 - 3 x Diagonal Size - Use the diagonal size of your screen - which is what the listed screen size actually refers to - and multiply it by 1.5 and 3 to get the minimum and maximum recommended distances respectively.
  • THX Viewing Angle - THX recommends that the TV screen take up 40 degrees or less of your field of view to give an immersive experience. Divide your diagonal screen size by 0.84 to get the viewing distance required to meet this recommendation.
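The three rules above boil down to simple arithmetic. Here is a minimal sketch, with all sizes and distances in inches (the function names are my own):

```python
def distance_3x_height(picture_height):
    """3 x Picture Height rule: viewing distance from screen height."""
    return 3 * picture_height

def distance_range_diagonal(diagonal):
    """1.5 - 3 x Diagonal Size rule: (minimum, maximum) viewing distance."""
    return (1.5 * diagonal, 3.0 * diagonal)

def distance_thx(diagonal):
    """THX 40-degree viewing angle recommendation: diagonal divided by 0.84."""
    return diagonal / 0.84

# For a 50" (diagonal) screen:
print(distance_range_diagonal(50))  # (75.0, 150.0) - about 6.3 to 12.5 feet
print(round(distance_thx(50), 1))   # 59.5 - about 5 feet
```

As the output shows, the different rules can disagree substantially, which is why the sections below weigh them against pixel structure, personal taste and source material.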

To make things easier, you can also use an online viewing distance calculator, which takes a few of these types of recommendations into account. Most calculators let you work in either feet or meters; if you need to convert, simply remember that roughly 3.3 feet = 1 meter.

Now here's the controversial part: viewing distance calculators may tell you that, to take one example, you can sit up to 6.5 feet away from a 50" TV and still fully resolve all the detail on a 1080p screen. However this advice should not necessarily be taken literally to mean that you should sit 6.5 feet or less from a 50" screen. Similarly, the 'THX recommended viewing angle' result from the calculator says that at 6.5 feet viewing distance, a 58" screen is actually recommended, which reinforces the mindset that bigger is better. Big screen flat panels are getting cheaper by the day, and people are now automatically opting for the largest screen size they can afford based on this sort of advice. This is not the correct approach. Instead, the three most important factors you will need to take into account when considering screen size and viewing distance are: pixel structure, personal taste, and the source material you will typically view.

Pixel structure: Flat panel TVs are fixed-pixel displays. This means that a 1920x1080 HDTV for example has a total of 2,073,600 pixels, regardless of its screen size - whether 32", 42", 65" or 103", the same number of pixels are on the TV. So the larger the screen size, the larger the individual pixels. Keep in mind also that a 720p display has fewer pixels than a 1080p display (typically 1280x768 = 983,040 pixels). If you sit close enough to any fixed-pixel display, it will become obvious to your eyes that the image is actually composed of small dots or rectangles. This is also known as Screen Door Effect, because it looks like the image is being viewed through the fine mesh of a screen door.
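The relationship between screen size and pixel size is easy to quantify. A rough sketch for a 16:9 panel, assuming (as is typical) square pixels:

```python
import math

def pixel_pitch_mm(diagonal_inches, width_px=1920, height_px=1080):
    """Approximate pixel size (pitch) in millimetres, derived from the
    diagonal screen size and the native resolution of a fixed-pixel panel."""
    aspect = width_px / height_px
    height_in = diagonal_inches / math.sqrt(1 + aspect ** 2)
    width_in = height_in * aspect
    return (width_in / width_px) * 25.4  # inches to mm

# The same 1920x1080 pixel grid spread across different screen sizes:
print(round(pixel_pitch_mm(32), 2))  # 0.37 (mm per pixel)
print(round(pixel_pitch_mm(65), 2))  # 0.75 (mm per pixel)
```

Doubling the diagonal roughly doubles the size of each pixel, which is why pixel structure becomes visible sooner on a larger screen at the same viewing distance.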

If you can see any hint of pixel structure on the screen at any time, you're sitting too close to the TV. Move back to the point where you can't see any pixel structure, and that is your true minimum viewing distance.

Personal Taste: Image size is in large part a subjective choice, which is why some people sit in the first few rows of a cinema, some sit in the middle, and some sit towards the back. The THX Viewing Angle recommendation in the Viewing Distance Calculator further above takes this into account by providing a 'Maximum THX viewing distance' number which corresponds to the THX requirement for a movie theater's screen to have at the very least a 26 degree viewing angle when viewed from the back seats. So you can use that recommendation instead if you're the type who sits towards the back of a theater for example. Otherwise the only viable option is for you to visit a store and find out how close you can get to a particular screen before the sheer size of it becomes overwhelming for you, and you find it uncomfortable to track the image with your eyes. It's true that most people become used to a larger screen over time; however, this is a matter of degree. You will almost certainly adjust to a screen which at first seems slightly larger than you expected. But don't automatically expect to become accustomed to a 65" screen at five feet for example, despite a viewing distance calculator or other people telling you otherwise.

Source Material: This is the most important consideration, and one which the distance calculators and viewing recommendations often fail to take into account. While you may be able to easily afford a very large screen, and indeed not feel the size to be overwhelming at all, the reality is that the majority of source video material today is actually laden with subtle and not-so-subtle image quality flaws, even when viewed using the best quality players and TVs available. See the Source Material section for more details and examples of the common flaws visible in all forms of digital video. Though present on most Blu-rays, these issues are exacerbated on DVD and poor quality Digital TV broadcasts. There are ways of improving image quality and reducing these flaws, which we examine in the Calibration section. Ultimately however, it is a fact that viewing any image on a larger screen and/or closer to the screen will mean you will notice more of any flaws in the image. Only a very small percentage of the best quality Blu-ray transfers will stand up to closer scrutiny without noticeably exhibiting these sorts of flaws. Keep the issue of source material quality foremost in your mind before automatically opting for a larger screen, because quality should be more important than size when it comes to HDTV. If a reasonable proportion of your viewing is of mediocre quality, particularly digital TV and DVD sources for example, then it would be best to opt for the size which ensures cleaner image quality across a range of sources, not just the biggest screen you can afford.

OK so where does that leave you with your choice of screen size? The viewing distance rules and calculators we discussed earlier are still applicable, however remember that they only provide you with a range within which you might be comfortable with a particular screen size. Ideally, you should take an average quality live action movie disc which represents your commonly viewed source material (not the best Blu-ray you own) with you to a store and ask them to play it back for you on various TV sizes. Since the quality of the movie on the disc is already known to you, what should become apparent is the difference in quality between the TVs, and the distance at which a particular screen size starts to exhibit more flaws, becomes overwhelmingly large, is difficult for your eyes to follow comfortably, or shows hints of pixel structure on the screen.

Finally, if you're concerned about how a particular TV will fit into your existing display area, then check the manufacturer's website as they usually have the specifications on the width, height and depth of each TV. Some manufacturers even have applications which help you to better visualize how a TV will look in your viewing environment. Panasonic's free Viera AR Setup Simulator App for example allows you to use an iOS device with a camera, such as an iPhone or iPad, to simulate the placement of one of their Viera TVs in your viewing environment.

When it comes to size and viewing distance, any calculated minimums or maximums should only be taken as a general guide. None of these calculators or rules, nor the individuals who give you advice, take into account your commonly viewed source quality or your personal taste. Don't be goaded by others into buying a bigger TV just for the sake of size over quality or to keep up with the Joneses. By the same token, be aware that it is highly likely that you will adapt to a TV which is at least one size larger than you would initially prefer.


LCD vs. Plasma

Let's turn to perhaps the biggest area of contention and confusion for the average HDTV buyer: the type of display technology to choose. At present there are two major types of display technology which are commonly used in the consumer HDTV arena: Plasma TV and LCD TV. Within the LCD category, there are various types of displays, including the more recent LED-backlit LCD displays.

It should be noted at this point that the Projectors category, which includes Rear Projection TV (RPTV), is not discussed in this article. Front projectors are a perfectly valid choice for home theater enthusiasts, especially those who want the largest screen real estate. But they are not included in this guide simply because they are neither as convenient nor as versatile as flat panel displays. Projectors are more suited to specialist home theater applications requiring fairly strict control over the light in the viewing environment. Rear projection TVs are more convenient in this respect, but have been almost completely phased out of production, and are also not discussed. A fourth display type which you may have heard about, called Organic Light Emitting Diode (OLED), is extremely expensive and not yet produced in large enough screen sizes to be considered a genuine proposition for HDTV buyers. OLED is discussed in more detail under Future Technology in the Conclusion section of this guide.

So it boils down to the classic LCD vs. plasma war which rages daily on the Internet. As mentioned in the Introduction to this guide, there is currently no perfect display technology, and none is on the horizon. This guide doesn't pretend to resolve the LCD vs. plasma debate; each of these display types has its pros and cons, and requires a level of compromise. The only way you can make the choice is by understanding the technologies involved, the technical specifications and what they really mean, the various quirks and issues of each, and thus the suitability of these displays to your particular circumstances.

The Underlying Technology

As we'll soon come to see, every pro and con for LCD and plasma derives from the underlying technology they use. These two display types may look similar when sitting side-by-side in a store, but they're very different in the way they approach the reproduction of video.



A Plasma TV earns its name from the fact that the primary component used to generate light output is an electrically charged, ionized gas known as plasma. A plasma screen is made up of a grid array of small gas-filled cells which are very similar to neon lights. Each plasma pixel consists of a set of three of these phosphor-coated cells called sub-pixels, one for red, green and blue respectively. The charged gas in a cell lights up the phosphor coating inside it, producing the relevant colored light. These lit pixels shine through a glass panel to display an image.

Plasma TV has a lot in common with the traditional CRT TV many of us grew up with. In a CRT, an electron gun at the back of the set shoots beams at a glass screen coated with phosphors to produce light, and there are red, green and blue phosphors to produce a color image.



An LCD TV has a grid array of small liquid crystals which can each change shape, twisting to allow various amounts of the light shining from behind them to come through and help produce an image on the screen. Each pixel in an LCD display is made up of three sub-pixel crystals, one for red, green and blue respectively. The light shining from behind the crystals is called Backlighting, and traditionally, LCD displays have used Cold Cathode Fluorescent Lamps (CCFL). The introduction of Light Emitting Diodes (LED) as the backlighting for LCD displays has improved the way in which they can perform but has somewhat inaccurately earned them the title of LED TVs. This gives the impression that the screen is made up of LED lights, which is false. The correct term is LED-backlit LCD TV, or LED-LCD for short. There are two main types of LED-backlit LCD displays (Full Array and Edge-lit) and one associated factor (Local Dimming) which can apply to either as covered below:


  • Full Array LED - The LED lights are situated just like a normal CCFL-backlit LCD TV, in an array across the back of the screen. The image quality results are similar to CCFL-backlit LCD, but there may be better screen uniformity and better color reproduction.
  • Edge-lit LED - The LED lights are situated at the edges of the panel and their lighting is then projected towards the middle of the screen and distributed via a diffuser. This is the most common configuration and the main benefit is that it allows for very thin LCD TVs, at the cost of screen uniformity.
  • Local Dimming - An important factor which can be used with either type of LED backlighting, local dimming provides the best results when combined with a Full Array LED-backlit screen. It basically allows portions of the backlighting to be independently dimmed or brightened depending on the scene. This provides much better contrast ratios and black levels on an LCD screen.



So in summary, plasma is a self-emitting technology because it creates light directly within each pixel. LCD works on the opposite principle, with each pixel filtering the light from a source shining from behind the pixel. The commonality between plasma and LCD-based displays is that they're both relatively thin, hence the name flat panel, and they're both fixed-pixel digital displays with red, green and blue sub-pixel structures. It's their differences however that make them more or less suitable to certain applications, and this is what we examine throughout the rest of the guide.


Brightness, Black Level & Contrast Ratio

A major determinant of the image quality of any display is its brightness, black level and contrast ratio. These related factors affect how much depth and detail the image on an HDTV screen appears to have. A display where blacks look more like greys, or where the image isn't particularly bright or looks flat, is not a particularly desirable one. Unfortunately choosing the right display is not just a case of picking one which is the brightest, or looks to have the darkest blacks in the showroom. For one thing, the typical TV store's display area is quite bright, which favors TVs which are brighter, but which may not have good black levels. It is difficult to distinguish a TV's true contrast ratio in a bright environment. There are also a range of tricks of the trade manufacturers use to enhance these aspects of any TV, at the cost of other areas of image quality. Let's understand the fundamentals first before getting onto those.

The Brightness of a display is measured by its Luminance, usually presented as a figure in either candela per square metre (cd/m2) or foot-lamberts (fL), where 1 fL = 3.426 cd/m2. You will see various reviews or technical specifications quoting the maximum and average brightness of a display, sometimes with very high values up to 500 cd/m2 or more shown. In practice a target value of between 80 and 120 cd/m2 is suitable for most displays in a normal viewing environment. While an LCD can typically provide greater maximum brightness than a plasma, both display technologies can usually achieve sufficient brightness to suit most people. Keep in mind however that when discussing the brightness of a TV, there are two other factors to consider: the amount of ambient light in your viewing environment, and the contrast ratio of your TV - we'll discuss these shortly.

One important difference between plasma and LCD is that while the liquid crystal in an LCD can twist to varying degrees to allow different amounts of light to filter through, and hence vary its brightness that way, plasma phosphors are either lit up brightly (on) or dark (off) at any one moment. Once lit up, they only stay lit for the merest fraction of a second. By using Pulse Width Modulation (PWM) to pulse the amount of current flowing through the cell, the phosphors are lit up hundreds of times a second to maintain brightness, and by varying the width of the pulse so that each phosphor stays on for slightly more or less time for each pulse, the level of brightness of the image can be varied.
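The effect of PWM on perceived brightness can be sketched in a couple of lines. This is an illustrative simplification (the function name and figures are my own, not plasma specifications):

```python
def perceived_brightness(duty_cycle, max_luminance_cdm2):
    """With PWM, a plasma cell is either fully lit or off at any instant;
    perceived brightness scales with the fraction of time (duty cycle) it is lit."""
    return duty_cycle * max_luminance_cdm2

# A cell capable of 200 cd/m2 when fully lit, pulsed so it is lit 60% of the time:
print(perceived_brightness(0.6, 200))  # 120.0
```

Widening or narrowing the pulses shifts the duty cycle, and with it the brightness the eye averages out over those hundreds of pulses per second.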

Black Level is another oft-quoted but not-fully-understood metric which is a critical element of good image quality. The ability to create darker blacks allows a TV to have a higher contrast ratio, a term which will be explained shortly. The darker the blacks, the greater the appearance of depth and richness in the image shown. Black level is actually not particularly complex; it's just a measure of the level of brightness of a display when showing video black. It is usually a very low luminance value, such as 0.004 fL (about 0.0137 cd/m2), but rarely reaches 0.0 cd/m2 (true black) because most HDTVs can't achieve this.

To confirm this for yourself, show a black screen on a typical LCD or plasma TV in a pitch black room, and you will still see some light coming from the screen. So why isn't black on an HDTV actually equivalent to zero luminance, which is the total absence of light?

On an LCD-based display, black is simulated by the twisted crystals of the panel being completely shut, along with a polarized layer behind the crystals, to prevent light from the backlight filtering through when not required. Yet precisely because the backlight is always on, and the structure of the crystal array is not perfect, some amount of light will leak through the crystals and be seen - this is discussed further in the Screen Uniformity section. More recently, with the advent of local dimming backlighting, some LCD-based displays can switch off portions of their backlight to produce close to true black in parts of the image which require total darkness. Unfortunately this method isn't perfect, as there may be haloing of light around any brighter parts of the image.

On a plasma display, each pixel can be independently switched off to remove its light output, and since there is no backlight, in theory a plasma can produce true black. The reality is that each plasma cell has to be constantly pre-charged so that it can respond quickly enough when light output is required, giving plasma its extremely fast response time. The side-effect of this pre-charging is that there is always some residual glow in the pixels, and thus true black is usually not possible on a plasma. On average though, plasmas provide much darker black levels than LCDs.

It should be noted that what some consider the king of black levels, the traditional CRT TV, does not necessarily achieve perfect black either. A CRT's black is darker than either plasma or LCD, primarily because a CRT can simply have its electron beam avoid lighting up particular portions of the screen. However when displaying any scene containing brighter elements, some stray light may affect the dark areas of the screen. In other words, when a CRT is showing an all black screen, black levels are pretty much true black, but when displaying a normal scene containing a mix of brighter and darker elements, black levels on a CRT are not true black, and may be similar to or even worse than a plasma screen.

Now that we understand the way HDTVs can display brightness and the lack of it on the screen, it's time to look at a metric which is supposed to show the range between these extremes on any TV. Contrast Ratio measures the difference in the luminance between the whitest image and the darkest image that a display can show. It's usually presented in a format such as 4,000:1 - this example would indicate that the whites on this display can be up to 4,000 times brighter than the blacks. The main benefit of a high contrast ratio is that in a scene containing both bright and dark elements, a TV can reproduce both elements correctly. That is, the dark areas will look suitably dark, while the bright areas will remain bright. Displays with poor contrast ratios will give more of a "washed out" image due to less of a difference between dark and bright areas.
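The contrast ratio calculation, together with the fL-to-cd/m2 conversion given earlier, can be sketched as follows (the example luminance figures are illustrative, not measurements of any particular TV):

```python
def fl_to_cdm2(fl):
    """Convert foot-lamberts to candela per square metre (1 fL = 3.426 cd/m2)."""
    return fl * 3.426

def contrast_ratio(white_cdm2, black_cdm2):
    """Native contrast ratio: peak white luminance divided by black-level luminance."""
    return white_cdm2 / black_cdm2

# A display with 120 cd/m2 whites and a 0.03 cd/m2 black level:
print(round(contrast_ratio(120, 0.03)))  # 4000, i.e. a 4,000:1 contrast ratio
# A 0.004 fL black level expressed in cd/m2:
print(round(fl_to_cdm2(0.004), 4))       # 0.0137
```

Notice that the ratio is driven as much by the black level as by peak brightness: halving the black level doubles the contrast ratio without making the whites any brighter.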

Unfortunately manufacturers quickly became aware that consumers were paying attention to contrast ratio figures, and since there is no standard enforced as to how to consistently measure it, contrast ratio figures have now been elevated into ridiculously high numbers, such as 5,000,000:1 or 9,000,000:1. This can be achieved for example by measuring a pixel when it is completely switched off, then comparing that to the pixel when it is lit to the maximum possible level of brightness, even though this in no way represents real-world contrast ratios in normal scenes consisting of both bright and dark elements. As we see in the Motion Handling section, a similar approach is taken to Response Time measurements. This makes the contrast ratio numbers you see in technical specifications virtually meaningless.

In order to achieve darker blacks and whiter whites, which in part justify these unrealistic figures, a technique known as Dynamic Contrast Ratio is now frequently used in HDTVs. The way it works is that the display constantly alters the brightness of the entire image, reducing screen brightness for scenes which are predominantly dark, and increasing the overall brightness for scenes which are mostly bright. The true measure of contrast ratio, also known as Static or Native Contrast Ratio, should provide the difference between the darkest and brightest luminance possible in the same scene - and most displays don't have native contrast ratio capabilities anywhere near the dynamic contrast ratio numbers. A dynamic contrast ratio has several unwanted side-effects, including greyer blacks in bright scenes, and washed out whites in dark scenes. Additionally, depending on how it's implemented, the constant shift in overall panel brightness may become noticeable to the viewer, resulting in what's often described as Floating Blacks, Fluctuating Brightness or Fluctuating Gamma.

If the TV has an option to disable dynamic contrast then you can turn it off; however, some TVs do not have any such option available. Furthermore, for plasma owners, something known as the Automatic Brightness Limiter (ABL) can't be turned off. This is a protective feature built into plasmas to control power consumption, since on a plasma brightness is directly related to the amount of power consumed. In scenes with a high Average Picture Level (APL) - that is, scenes which have a high proportion of bright elements - the ABL will reduce the overall brightness output of the plasma panel to stabilize power consumption; conversely in scenes which have a lower proportion of brighter elements (low APL), the overall brightness of the scene is allowed to be higher. For example a full white screen is not going to have as much luminance as a small window of white on a dark background, precisely due to ABL.

From this discussion we can gather an important fact: most HDTVs, whether LCD or plasma, now use some type of dynamic contrast ratio. This can have annoying side-effects, and can render contrast ratio figures meaningless. So how does someone make sense of all of this? The answer is to take into account your viewing environment, combined with the measured black level and maximum luminance of a display as typically given in reviews. These factors will be sufficient to make a determination, as explained below.

Even the best of us can't see differences in brightness at any one time beyond a notional 1,000:1 contrast ratio. Our eyes have a form of built-in dynamic contrast ratio: depending on the ambient lighting of our surroundings, we can detect a narrower or wider range of differences, up to that 1,000:1 ratio. As our surroundings become darker or brighter, our iris adjusts to allow more or less light in, and this affects our perception of and general sensitivity to brightness and darkness at any point in time. For example, in a pitch-black room, a weak torch shone in your eye can effectively blind you; in a bright sunlit room, the same torch has much less effect. Similarly, in a pitch-black room you will notice some light coming from a black screen on even the best HDTV, while even a small amount of ambient lighting in the room can make that same screen look completely black.
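The interaction between room light and on-screen blacks can be put in rough numbers. The sketch below uses the common approximation that a matte screen reflects about ambient_lux * reflectance / pi as added luminance; the reflectance and panel figures are assumptions, not measured values:

```python
import math

def perceived_contrast(white_nits, black_nits, ambient_lux, reflectance=0.05):
    """Effective contrast once room light reflects off the panel. Reflected
    luminance adds equally to black and white, so it crushes the ratio far
    more at the dark end. All parameter values here are illustrative."""
    reflected = ambient_lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

# In a pitch-black room a 300 / 0.1 nit panel delivers its full 3,000:1,
# but about 200 lux of living-room light drops it to roughly 90:1.
```

This is why black level matters most in a dark room, while peak brightness and screen coatings matter most in a bright one.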

Taking advantage of this property of our eyes, if most of your viewing is done in a brighter environment, such as a sunlit room, then a display capable of higher levels of brightness, such as an LCD-based TV, is advisable. The black levels on such a display may not necessarily be great, but your perception of black levels will also be reduced in a bright environment, making it less of an issue. Plasma TVs suffer more than LCDs from having their image "washed out" when subjected to bright ambient lighting. Many plasmas and some LCDs come with special coatings on the screen, known as Anti-Reflective (AR) Filters, designed specifically to counter reflections and glare and thus help preserve a good image under bright light. While the AR filter helps, in practice it still doesn't fully prevent plasmas from suffering more than LCDs in a bright environment.

In darker environments, our perception of light becomes heightened, and thus blacks can look more like greys if the TV doesn't have a good black level. For this reason, plasma is more appropriate for those who do most of their viewing in a darker environment given its superior black level. Alternatively, if you have an older plasma or an LCD TV with relatively poor black levels, or indeed any TV viewed in near darkness, you can install what is known as Bias Lighting - soft ambient lights situated behind the TV which greatly improve perceived contrast on the TV.

For those who have mixed viewing habits in both dark and bright environments, an LED-LCD with a full-array local dimming backlight is a reasonable compromise, capable of both good blacks at night and higher brightness levels during the day.

Copyright © 2014-2016 HTL Display Group Co., Ltd. All Rights Reserved.