MORE PIXELS AGAIN!

Is 8,294,400 Pixels 4 Times Better Than HD?

Radio’s a little box that you buy on the installment plan and before you tune it they tell you there’s a new model out.
-- Melvyn Douglas to Greta Garbo, "Ninotchka," 1939


Las Vegas CES 2014 interleaved all its city-consuming racecar and rubber-iPhone-case awesomeness with the usual loud and colorful aviary of video screens. This year's struggle for supremacy in the perennial mating dance between display and distributor passed over the quickly dying 3D fad and was waged through loud, squawking proclamations of the superiority of "4k" or "UHD" over the now-common "HD" resolution. Nobody can really blame display vendors for trying to accelerate a replacement cycle (e-waste, pooh, but new revenues, yay!), but it is fair to look at the real and claimed justifications. And to take a stab at guessing where things may actually settle out.

SPOILER: JR thinks that 4k will win.

There are always other features that want to conflate themselves with a high-profile runner. At CES, screens were curving into 4k, flexing into 4k, back and front projecting into 4k, ultra-green-coloring into 4k, three-D-ing into 4k, and supersizing into 4k. But for the nonce, we will peel off those extra blandishments and just look at the 4k vs. HD issues.

Usefulness
What do you need it for? For most content, it's unlikely you'll notice the difference without training and conscious effort. The effect is a lot more subtle than SD to HD, and UHD doesn't change the screen shape to cue you in as HD's widescreen did. If you're watching normal programming from a normal viewing distance while munching normal takeout, most of UHD's benefit will come via the placebo effect. It might be a good idea to thumbtack a brochure to the wall by that screen, actually, to help remember why you bought the thing.

Data bandwidth
It takes data to feed those pixels. A 4k x 2k UHD screen has four times the pixels of a 2k x 1k HD screen (put your hands down, we all know it’s 3840 x 2160 and 1920 x 1080), so ultimately, it has to scrounge up four times the data to light them up. Network bandwidth doesn't actually quadruple, because compression algorithms are smart enough to take advantage of qualities like spatial and temporal redundancy (a blue sky in 4k doesn't carry much more information than the same sky in HD, for example), but you can count on a good solid doubling of data load.
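
For the arithmetically inclined, here is a back-of-the-envelope sketch of that claim in Python. The HD bitrate and the 2x scaling factor are placeholder assumptions standing in for the rough doubling described above, not measured figures.

# Back-of-the-envelope arithmetic for the pixel and bandwidth claims above.
# The HD bitrate and the 2x delivered-bandwidth factor are illustrative
# assumptions, not measurements.

HD_W, HD_H = 1920, 1080
UHD_W, UHD_H = 3840, 2160

hd_pixels = HD_W * HD_H      # 2,073,600
uhd_pixels = UHD_W * UHD_H   # 8,294,400

print(f"Raw pixel ratio: {uhd_pixels / hd_pixels:.0f}x")   # 4x

# Compression exploits spatial and temporal redundancy, so delivered bitrate
# grows far less than the raw 4x pixel count -- call it roughly 2x.
ASSUMED_HD_BITRATE_MBPS = 8   # hypothetical HD stream bitrate
ASSUMED_UHD_SCALING = 2       # the "good solid doubling" from the text

print(f"HD stream:  ~{ASSUMED_HD_BITRATE_MBPS} Mbps (assumed)")
print(f"UHD stream: ~{ASSUMED_HD_BITRATE_MBPS * ASSUMED_UHD_SCALING} Mbps (rough estimate)")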

What? Good question, let me repeat it for those who didn't hear:
"Yabbut hey Rodman how 'bout H.265 instead of H.264 and VP9 instead of VP8 ha ha?" The newer generation of video compression algorithms (coder-decoders, or codecs), and I'll include H.264 High Profile in this set as a sort of H.264-and-a-half, take advantage of the decreasing cost of high complexity (meaning that codecs can do lots more computation now) to deliver about twice the coding efficiency: they just need half the network bandwidth of the H264/VP8 generation of codecs to produce the same quality image for the same resolution. But HD images benefit from this incremental improvement as well as UHD, so apples being apples, a UHD video coded with one of these newer codecs will still suck the data pipe about twice as hard as an HD video if it’s using the same newer codec.

Screen manufacturability and cost
Nah, I'm just messing with ya. You and me, we don't care about display manufacturability and cost. Maybe the display industry is biting its fingernails about these things, but that's not our problem. We need to think about UHD vs. HD assuming that these display formats are both available, and that they're cost competitive. If their cost is not competitive, the question answers itself through simple market dynamics. If the cost is competitive...well, market dynamics apply here too, and depend on what that weasel word "competitive" means.

4k, same cost? Go for it.

4k, incremental cost increase? Weigh the other tradeoffs in this list.

4k, major cost increase? These buyers will have a compelling application (medical, industrial, education, etc.) with a quantitative justification, but the unshaven fan with the beer hat and 49ers T-shirt is no longer the target audience. Which is a shame, because I love those guys; they're my peeps.

UHD spec sheet has bigger numbers
I got nothing here. Those numbers are sure bigger. Look how big they are. Big, big, big, oh my. So yes, UHD has a definite specsmanship advantage. If you really want big numbers, though, you might want to hold off and see how soon the 8k x 4k proofs-of-concept we saw at CES enter the mainstream.

UHD has a growing momentum
I think that's the dominant reality here. What we saw at CES was a broad adoption of 4k displays despite, at the moment, minimal technical or market justification. But assuming their price comes into line with reasonable speed, it won't be hard to convince the display-buying public that an incremental cost bite is well worth not getting stuck with a nearly obsolete 1948 Philco TV set (or its modern descendant, the HD display).

Victory in the UHD vs. HD battle is not just a matter of which wins, but when. UHD vs. HD is one of those places where the right player will win, but maybe at the wrong time: that is, 4k will win, but it will win just about now rather than a few years from now. It's not a badly misplaced victory, but in my view of a perfect world (insert reader's snark here), 4k would go mainstream in 2017-2018, not 2014-2015.

In summary: in the imperfect world we're all sharing today, 4k is irrevocably among us, and is rapidly moving to become the dominant display format.