Technology Training

TV Technologies explained

LED and LCD

Despite having a different acronym, an LED TV is just a specific type of LCD TV. The proper name would actually be “LED-lit LCD TV,” but that’s too much of a mouthful for everyday conversation, so people generally just refer to them as LED TVs. An LED TV uses a liquid crystal display (LCD) panel to control where light is displayed on your screen. These panels are typically composed of two sheets of polarizing material with a liquid crystal solution between them. When an electric current passes through the liquid, it causes the crystals to align so that light can (or can’t) pass through. Think of it like a shutter, either allowing light to pass through or blocking it out.

Since both LED and LCD TVs are based around LCD technology, you’re probably wondering what the difference is. Actually, it’s more about what the difference was. Older LCD TVs used cold cathode fluorescent lamps (CCFLs) for backlighting, whereas LED LCD TVs use an array of smaller, more efficient light-emitting diodes (LEDs) to illuminate the screen.

Since LEDs are smaller and more efficient, all LCD TVs now use LED backlighting and are colloquially considered LED TVs.

Backlighting

There are three basic forms of illumination that have been used in LCD TVs: CCFL backlighting, full-array LED backlighting, and LED edge lighting. Each of these illumination technologies differs from the others in important ways.

CCFL Backlighting

CCFL backlighting is an older, now-abandoned form of display technology in which a series of CCFLs sit across the inside of the TV behind the LCD display. The lights illuminate the crystals fairly evenly, which means all regions of the picture will have similar brightness levels. This affects some aspects of picture quality, which we discuss in more detail below. Since CCFLs are larger than LED arrays, CCFL LCD TVs are thicker than LED-backlit LCD TVs.

Full-array backlighting

Full-array backlighting swaps the outdated CCFLs for an array of LEDs spanning the back of the LCD screen, comprising zones of LEDs that can be lit or dimmed in a process called local dimming. TVs using full-array LED backlighting make up a healthy chunk of the high-end LED TV market, and with good reason — with more precise and even illumination, they can create better picture quality than CCFL LCD TVs were ever able to achieve, with higher efficiency to boot.

Edge lighting

Another form of LCD screen illumination is LED edge lighting. As the name implies, edge-lit TVs have LEDs along the edges of the screen. There are a few different configurations: LEDs along just the bottom, LEDs along the top and bottom, LEDs along the left and right, or LEDs along all four edges. These configurations result in differences in picture quality, but the overall brightness capabilities still exceed what CCFL LCD TVs could achieve. While edge lighting has some drawbacks compared to full-array or direct backlight displays, the upshot is that it allows manufacturers to make thinner TVs that cost less to manufacture.

To better close the local-dimming quality gap between edge-lit TVs and full-array backlit TVs, manufacturers like Sony and Samsung developed their own advanced forms of edge lighting. Sony’s technology is known as “Slim Backlight Master Drive,” while Samsung employs “Infinite Array” in its line of QLED TVs. These retain the slim form factor of an edge-lit design while delivering local dimming quality closer to that of full-array backlighting.

What is local dimming?

Local dimming is a feature of LED LCD TVs wherein the LED light source behind the LCD is dimmed and illuminated to match what the picture demands. LCDs can’t completely prevent light from passing through, even during dark scenes, so dimming the light source itself aids in creating deeper blacks and more impressive contrast in the picture. This is accomplished by selectively dimming the LEDs when that particular part of the picture — or region — is intended to be dark.
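To make that concrete, here is a minimal illustrative sketch in Python/NumPy of how a controller might set the brightness of each backlight zone based on picture content. It is not any manufacturer’s actual algorithm, just the basic idea: zones behind dark regions get dimmed, zones behind bright regions stay lit.

```python
import numpy as np

def backlight_levels(frame_luma: np.ndarray, zones=(8, 12)) -> np.ndarray:
    """Compute a per-zone backlight level (0.0-1.0) from a frame's luminance.

    frame_luma: 2D array of pixel luminance values in the range 0.0-1.0.
    zones: grid of dimming zones as (rows, cols); real TVs range from a
           handful of zones (edge-lit) to hundreds (full-array).
    """
    rows, cols = zones
    h, w = frame_luma.shape
    levels = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            # The pixels sitting in front of this zone of the LED array.
            zone = frame_luma[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
            # Drive the zone just bright enough for its brightest content;
            # a dark zone gets its LEDs dimmed, deepening black levels.
            levels[r, c] = zone.max()
    return levels

# Example: a mostly dark frame with one bright highlight.
frame = np.zeros((1080, 1920))
frame[100:200, 1600:1800] = 0.9   # e.g. a street lamp in a night scene
print(backlight_levels(frame))    # only the zones behind the lamp stay lit
```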

Local dimming helps LED/LCD TVs more closely match the quality of older Plasma displays (RIP) and modern OLED displays, which feature better contrast levels by their nature — something CCFL LCD TVs couldn’t do. The quality of local dimming varies depending on which type of backlighting your LCD uses, how many individual zones of backlighting are employed, and the quality of the processing.

OLED vs. QLED

As if it wasn’t already confusing enough, once you begin exploring the world of modern display technology, new acronyms crop up. The two you’ll most commonly find are OLED and QLED. Despite the similar-sounding name, OLED (organic light-emitting diode) TVs are in a category all their own.

An OLED display uses a panel of pixel-sized organic compounds that respond to electricity. Since each tiny pixel (millions of which are present in modern displays) can be turned on or off individually, OLED displays are called “emissive” displays (meaning they require no backlight). They offer incredibly deep contrast ratios and better per-pixel accuracy than any other display type on the market.

Because they don’t require a separate light source, OLED displays are also amazingly thin — often just a few millimeters. OLED panels are often found on high-end TVs in place of LED/LCD technology, but that doesn’t mean LED/LCDs lack premium technology of their own.

QLED is a premium tier of LED/LCD TVs from Samsung. Unlike OLED displays, QLED is not a so-called emissive display technology (QLED pixels are still illuminated by lights from behind). However, QLED TVs feature an updated illumination technology over regular LED LCDs in the form of Quantum Dot material (hence the “Q” in QLED), which raises overall efficiency and brightness. This translates to better, brighter grayscale and color, and enhances HDR (High Dynamic Range) abilities.

Things will get even more confusing in the future, with Samsung currently working on tech that combines QLED and OLED to give folks the best of both worlds.

4K vs. UHD

The simplest way of defining the difference between 4K and UHD is this: 4K is a professional production and cinema standard, while UHD is a consumer display and broadcast standard. The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160, exactly four times the pixel count of the previous standard for digital editing and projection (2K, or 2,048 by 1,080). 4K refers to the fact that the horizontal pixel count (4,096) is roughly four thousand.

The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. Ultra High Definition, or UHD for short, is the next step up from what’s called full HD, the official name for the display resolution of 1,920 by 1,080. UHD quadruples that resolution to 3,840 by 2,160. It’s not the same as the 4K resolution defined above — and yet almost every TV or monitor you see advertised as 4K is actually UHD. Sure, there are some panels out there that are 4,096 by 2,160, which works out to an aspect ratio of 1.9:1. But the vast majority are 3,840 by 2,160, for a 1.78:1 aspect ratio.
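If you want to check the numbers yourself, a quick bit of arithmetic (sketched here in Python purely for illustration) lays the formats side by side:

```python
# Pixel counts and aspect ratios for the formats discussed above.
formats = {
    "Full HD": (1920, 1080),
    "UHD": (3840, 2160),     # consumer "4K"
    "DCI 4K": (4096, 2160),  # cinema standard
}

for name, (w, h) in formats.items():
    megapixels = w * h / 1e6
    aspect = w / h
    print(f"{name}: {w} x {h} = {megapixels:.1f} MP, aspect ratio {aspect:.2f}:1")

# Full HD: 1920 x 1080 = 2.1 MP, aspect ratio 1.78:1
# UHD: 3840 x 2160 = 8.3 MP, aspect ratio 1.78:1
# DCI 4K: 4096 x 2160 = 8.8 MP, aspect ratio 1.90:1
```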

4K or 8K

In a nutshell, 8K has four times the resolution of 4K, delivering an even clearer picture. Your 4K (UHD) TV at home has a maximum resolution of 3,840 x 2,160 pixels. With 8K, that increases to a massive 7,680 x 4,320 pixels.
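That “four times” figure is easy to verify; the jump comes from doubling the pixel count in each dimension (quick arithmetic, in Python for consistency with the sketches above):

```python
pixels_4k = 3840 * 2160   # 8,294,400 pixels (~8.3 million)
pixels_8k = 7680 * 4320   # 33,177,600 pixels (~33.2 million)

print(pixels_8k / pixels_4k)       # 4.0  - four times the pixels...
print(7680 / 3840, 4320 / 2160)    # 2.0 2.0  - ...from doubling each dimension
```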

More pixels mean more detail

That’s a lot of pixels, and it makes all the difference, with even more detail crammed into the screen. If you thought 4K detail was immense, just wait until you see 8K in the flesh. You won’t even need to get up close to see intricate details.

Nature documentaries will be a real treat – you’ll love how 8K displays lush greens and outdoor settings. And sport will look better than ever; you’ll really feel part of the action.

Every little detail appears to pop out from the screen like never before. Having seen a 4K picture alongside an 8K picture, we can confirm there is a real improvement. You’ll notice the difference, too.

More intelligent TVs

This is smart TV, but not as we know it. Samsung’s Q900R 8K QLED TV uses artificial intelligence to upscale pictures. That means it can transform lower quality images into glorious 8K.

But it’s how the TV does this that makes it so interesting. It uses a machine learning tool that analyses content from multiple sources, and applies that data to upgrade the picture as you watch. The TV just knows what to do, and how to make the picture better.

It reduces noise, smooths the edges, and removes all the rough bits. The result is a picture that’s smoother and more vibrant than anything you’ve seen before.
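To make the idea of upscaling concrete, here is a deliberately naive sketch in Python/NumPy that simply maps a 4K frame onto an 8K pixel grid. It is purely illustrative and is not Samsung’s algorithm; the TV’s machine-learning processing goes much further, inferring detail, reducing noise, and smoothing edges rather than just duplicating pixels.

```python
import numpy as np

def naive_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour upscale: each source pixel becomes a factor x factor block.

    This is the crudest possible scaler - it adds pixels but no new detail.
    An ML-based upscaler instead tries to infer plausible detail (edges,
    textures, noise-free gradients) that the low-resolution source lacks.
    """
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# A 4K (3840 x 2160) frame mapped onto an 8K (7680 x 4320) pixel grid.
frame_4k = np.zeros((2160, 3840), dtype=np.uint8)
frame_8k = naive_upscale(frame_4k, factor=2)
print(frame_8k.shape)  # (4320, 7680)
```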

More colours

8K TVs will be able to display more colours, thanks to better HDR capabilities. Samsung’s new TV, for example, supports HDR10+, which uses dynamic metadata to fine-tune brightness and colour scene by scene for a wider colour palette. The end result is a picture that looks more natural.

An 8K TV won’t just give you a better picture, but better sound, too. While watching, you’ll feel like you’re in a cinema, at a gig, or at the ground of your favourite football team.

What do you need to run 8K?

Besides an 8K screen, 8K video requires a high-speed connection leading into that screen. Four times as many pixels, each of which might carry more information than pixels typically do, means 8K video takes up a lot of bandwidth. That’s a concern whether you’re watching 8K content from an as-yet uninvented 8K optical disc or streaming it over a 5G internet connection. The short answer is that 8K requires (at least) HDMI 2.1. The HDMI standard applies to both cables and home theater devices, and it ensures that your cables and the source devices you use can handle the bandwidth necessary to carry 8K content.

HDMI 2.1 is a relatively new standard intended for high-quality 4K and 8K content. It features a maximum bandwidth of 48Gbps—three times that of the HDMI 2.0 standard (18Gbps), which supports up to 4K60 video. The lower a connection’s bandwidth, the lower the resolution and video frame rate you can send over it, and the more compressed the video has to be, which hurts fine details. HDMI 2.1 can handle high-quality, uncompressed 8K video at up to 60 frames per second, and it can carry uncompressed 4K120 video as well.
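To see why those bandwidth figures matter, here is a back-of-the-envelope sketch in Python. It counts raw pixel data only and ignores blanking intervals, audio, and link-encoding overhead, so real-world requirements are somewhat higher:

```python
def video_gbps(width, height, fps, bits_per_channel=10, channels=3):
    """Approximate uncompressed video data rate in gigabits per second.

    Counts pixel data only; a real HDMI link also carries blanking
    intervals, audio, and encoding overhead, so actual needs are higher.
    """
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K60:  {video_gbps(3840, 2160, 60):.1f} Gbps")   # ~14.9 Gbps
print(f"4K120: {video_gbps(3840, 2160, 120):.1f} Gbps")  # ~29.9 Gbps
print(f"8K60:  {video_gbps(7680, 4320, 60):.1f} Gbps")   # ~59.7 Gbps
```

The 8K60 figure assumes 10-bit colour with no chroma subsampling; in practice, HDMI 2.1 leans on 4:2:0 subsampling or Display Stream Compression to fit the very largest formats into its 48Gbps link. Either way, the comparison makes the point: 4K120 and 8K formats far exceed what an 18Gbps HDMI 2.0 connection can carry.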

So yes, that means you’ll probably have to buy new cables. Top-of-the-line certified HDMI Premium High-Speed cables are rated for only up to 18Gbps; to handle 8K’s requirements, you’ll need to wait until the HDMI Forum officially certifies some Ultra High-Speed HDMI cables, or otherwise look for cables that meet the criteria and have a maximum bandwidth of 48Gbps (Monoprice currently has two 8K HDMI cables out of its dozens of different versions). Your source device needs to support 8K video and have HDMI 2.1 compliance as well. HDMI 2.1 defines everything in the signal path from source to screen, including the ports on your media streamer, game system, or Blu-ray player. Even if your device can play 8K video, that won’t matter if it can’t actually get that video to your TV.

Is Now the Time to Jump to 8K?

Now is not the time—at least, not for the majority of buyers. You can buy 8K TVs right now. They’re just very, very expensive, and you can’t watch any native 8K consumer content on them. They’re early-adopter toys for people who can easily drop five digits on a TV almost purely for bragging rights.
