These are some photos comparing different macOS Retina scaling options. All use a non-vector test pattern (AAPM TG18) displayed at 100%. All photos were taken at roughly the same distance, using the same crop. Forgive the slight hand-held motion blur, and focus on these variables:
- Pixel size
- UI size
- Pixel perfection (has any interpolation taken place?)
- Real estate (how much of the image is displayed in this representative patch?)
27" 16:9 109 PPI
Pixel perfect. Small real estate. Typical size UI (same size as 109 PPI @ 100%, obviously).
40" 21:9 140 PPI
Pixel perfect. Relatively small real estate. Large size UI (same size as 70 PPI).
Not pixel perfect. Larger real estate. Typical size UI (same size as 105 PPI).
13.3" 16:9 220 PPI
Pixel perfect. Relatively small real estate (similar inch-for-inch to the option immediately above, given the similar UI size, but less overall because the panel is much smaller). Typical UI size (same size as 110 PPI).
Not pixel perfect. Larger real estate. Slightly smaller UI size (same size as 124 PPI).
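For reference, the "same size as X PPI" figures above come from a simple ratio: the effective UI density is the panel's native PPI multiplied by the ratio of the "looks like" resolution to the native resolution, and scaling is pixel perfect only when that ratio is exactly 1:1 or 1:2. Here is a minimal sketch of that arithmetic; the native panel resolutions (5120x2160 for the 40" 21:9, 2560x1440 for the 13.3") are assumptions inferred from the stated sizes and PPI, not stated in the post.

```python
def effective_ppi(native_ppi: float, native_width_px: int, looks_like_width_px: int) -> float:
    """Density the UI appears to be rendered at for a given 'looks like' setting."""
    return native_ppi * looks_like_width_px / native_width_px

def is_pixel_perfect(native_width_px: int, looks_like_width_px: int) -> bool:
    """True when the backing store maps to the panel without resampling
    (the 'looks like' resolution is exactly the native resolution or half of it)."""
    return native_width_px in (looks_like_width_px, 2 * looks_like_width_px)

configs = [
    # (label, native PPI, assumed native width, 'looks like' width)
    ('40" @ looks-like 2560x1080', 140, 5120, 2560),   # ~70 PPI, pixel perfect
    ('40" @ looks-like 3840x1620', 140, 5120, 3840),   # ~105 PPI, not pixel perfect
    ('13.3" @ looks-like 1280x720', 220, 2560, 1280),  # ~110 PPI, pixel perfect
    ('13.3" @ looks-like 1440x810', 220, 2560, 1440),  # ~124 PPI, not pixel perfect
]

for label, ppi, native_w, looks_like_w in configs:
    print(f"{label}: {effective_ppi(ppi, native_w, looks_like_w):.0f} PPI, "
          f"pixel perfect: {is_pixel_perfect(native_w, looks_like_w)}")
```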
Thanks for this. For the first two images, I don't quite understand how something that "looks like 1080p" with a large size UI allows for more real estate in the same physical area as something that "looks like 1440p" (is 1440p) with a typical size UI. Shouldn't it be the other way around? Are all of these photos taken from the same distance away from the screen?