With the recent announcement of the Retina 5k iMac, a staggeringly high-resolution display, I began to wonder: at what point is the pixel density of a display so high that it no longer matters? At what point does 2k/4k/5k just become a marketing gimmick?
What is ‘High Definition’?
To begin with, let’s define some nomenclature. What is HD? High definition. Easy. But what constitutes high definition? That gets tougher. In general, it seems to be agreed that if a screen can produce 480 or more horizontal scan lines, it is considered HD. A scan line can loosely be defined as a row of pixels on the display. This is where we get 480p, 720p, 1080p, etc. The ‘p’ just stands for “progressive,” which describes how the scan lines draw the image: all in one pass, as opposed to the older interlaced (‘i’) formats that draw every other line. The higher the ‘p’ number, the more horizontal scan lines, and the more high definition the picture looks. If we have a 480p 50” display and a 1080p 50” display, the 1080p display is going to have a much higher pixel density, and therefore a crisper image. This is because pixels packed closely together are harder for the eye to discern individually than fewer, larger pixels.
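That density comparison is easy to put numbers on. Here’s a quick sketch (assuming a 640×480 panel for the 480p set; widescreen 480p panels also exist): pixels per inch is just the diagonal pixel count divided by the diagonal size.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.sqrt(width_px**2 + height_px**2) / diagonal_in

# The two 50-inch displays from the example above
print(round(ppi(640, 480, 50)))    # 480p  -> 16 ppi
print(round(ppi(1920, 1080, 50)))  # 1080p -> 44 ppi
```

Same size screen, nearly triple the pixel density.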
What is a ‘Retina’ Display?
What about Apple’s lingo? What about Retina? Retina is Apple’s way of saying the pixels on their Retina devices are packed so tightly that the human retina is unable to differentiate them from a certain distance. That distance varies by device: roughly 10, 15, and 20 inches for iPhones, iPads, and MacBooks, respectively.
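Those distances line up with a rule of thumb from optometry: a 20/20 eye resolves detail down to about one arcminute of visual angle. Here’s a minimal sketch of the math (the one-arcminute figure is my assumption; Apple hasn’t published its exact threshold):

```python
import math

ONE_ARCMIN = math.radians(1 / 60)  # ~20/20 visual acuity, in radians

def retina_distance_in(ppi):
    """Distance (inches) beyond which pixels at this density subtend
    less than one arcminute and blur together for a 20/20 eye."""
    pixel_pitch = 1 / ppi  # inches between adjacent pixel centers
    return pixel_pitch / math.tan(ONE_ARCMIN)

print(round(retina_distance_in(326), 1))  # iPhone, 326 ppi      -> ~10.5 in
print(round(retina_distance_in(264), 1))  # Retina iPad, 264 ppi -> ~13.0 in
```

Close to the 10- and 15-inch figures above, so the eye-chart math and Apple’s marketing are at least in the same ballpark.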
So, is ‘Retina’ considered HD? The short answer is yes. The iPhone 6 is considered roughly 720p (1334×750, to be exact), while the iPhone 6 Plus contains a true 1080p display. The 5-series iPhones come in between 480p and 720p at their own unique 640p. Unique, but still considered HD. The resolution of a Retina iPad, however, is greater than 1080p: it clocks in at 1536p (2048×1536).
We’ve covered the ‘p’; now what about the ‘k’? Here’s where the naming gets sneaky: ‘k’ counts in the other direction. The ‘p’ number counts scan lines from top to bottom, while the ‘k’ number roughly counts pixels from left to right. A 4k display has around 4,000 columns of pixels across its width; the 4k UHD standard is 3840×2160, which in scan-line terms is ‘only’ 2160p. Quoting the bigger number is catchier. Yay marketing! Anything at 4k and up is considered ULTRA HD, so 4k and 5k are a whole new level of HD, while ‘2k’ (2048×1080) is essentially cinema-flavored 1080p.
The 5k iMac
This brings us to the 5k iMac. The average viewing distance between a pair of eyeballs and an iMac is about 20”. Therefore, to make extra sure they could call the display ‘Retina’, Apple threw a 5k panel in there. 5k is pretty aggressive considering UHD is still in its infancy, but is it necessary? Well, it depends on where you’re sitting. If you have two identically sized displays, one 1080p and one 4k, and you’re standing 10 feet away, you won’t be able to tell the difference. Both will be ‘Retina’ at that point, and you will be unable to discern any pixels. Move in to 5 feet, however, and the 4k display will look much more ‘HD’ and vibrant. At 5 feet, the 1080p display is no longer Retina, but the 4k still is. This is why the last generation of iMac was considered 1080p HD, but not Retina. At a close 20” viewing distance, the pixel density has to be very high to be considered Retina HD; hence, 5k.
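That 10-foot/5-foot thought experiment checks out numerically under the same one-arcminute acuity rule of thumb (my assumption, applied here to a pair of hypothetical 50-inch panels):

```python
import math

ONE_ARCMIN = math.radians(1 / 60)  # ~20/20 visual acuity

def retina_distance_ft(width_px, height_px, diagonal_in):
    """Distance in feet beyond which this display's pixels blend together."""
    ppi = math.sqrt(width_px**2 + height_px**2) / diagonal_in
    return (1 / ppi) / math.tan(ONE_ARCMIN) / 12

print(round(retina_distance_ft(1920, 1080, 50), 1))  # 1080p -> ~6.5 ft
print(round(retina_distance_ft(3840, 2160, 50), 1))  # 4k    -> ~3.3 ft
```

Past about 6.5 feet both look ‘Retina’; between roughly 3.3 and 6.5 feet, only the 4k set does, which is exactly the 5-foot scenario described above.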
Why 5k? I Thought 4k Was the New Thing
But why 5k and not 4k? 4k does seem to be the new norm for UHD, but that’s for watching video. The 5k iMac is meant for editing it. If you’re editing 4k footage, you need room to play the video at full resolution plus extra real estate for your editing program’s timeline and tools. A 5k panel (5120×2880) can show every pixel of a 4k frame (3840×2160) and still have room to spare, which makes the extra space necessary.
If you have a chance, get to an Apple store and check out the 5k iMac. Seeing it in person is the only way to truly experience 5k. …unless you already have a 5k resolution screen at home…
You can already buy an 8k television. Good luck finding 8k content, though.
Seen an IMAX movie lately? IMAX is considered 6k.
Has the Retina display in your iPhone or iPad stopped working? Shatter Buggy can fix that. Book today!
-Shatter Buggy, Denver