Why EDID is not trustworthy for DPI

Adam Jackson ajax at redhat.com
Wed Oct 5 14:30:56 UTC 2011

On Tue, 2011-10-04 at 19:05 -0700, Adam Williamson wrote:

> 96dpi, however, is almost *never* correct, is it? So just taking a
> hardcoded number that Microsoft happened to pick a decade ago is hardly
> improving matters.

The X default used to be 72dpi.  Maybe it'll be something else in the
future, and then I can get bitched at more for having changed it yet
again by people still using a fundamentally unreliable API.

> It still seems to me that taking the EDID number if it seems reasonably
> plausible and falling back to 96dpi otherwise is likely a better option.

I reiterate: X gives you the actual sizes (as best as we can guess) on
the RANDR outputs.  The global "size" that we default to 96dpi is broken
to rely on in any event, because X simply has no mechanism for updating
it besides reconnecting to the display.

We could add a request to re-fetch the connection handshake block, but
if you're going to update all your apps to use that request, you might
as well update all your apps to use the existing RANDR geometry
information instead.

If the UI wants to be sensitive to DPI, then do me the favor of using
the DPI numbers that map 1:1 to actual monitors, instead of a single
number that can never be an accurate reflection of reality.
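The per-monitor arithmetic is trivial once you have the RANDR-reported numbers. A minimal sketch (the function name and the 509x286 mm panel size are illustrative, not a real API; RANDR reports physical size in millimetres and mode size in pixels):

```python
# Hypothetical per-output DPI computation from the values RANDR exposes
# per monitor: pixel dimensions plus physical size in millimetres.

MM_PER_INCH = 25.4

def output_dpi(width_px, height_px, width_mm, height_mm):
    """Return (horizontal, vertical) DPI for one output, or None when the
    reported physical size is the 0x0 that broken EDID gives you."""
    if width_mm <= 0 or height_mm <= 0:
        return None  # fall back to whatever default you trust
    return (width_px * MM_PER_INCH / width_mm,
            height_px * MM_PER_INCH / height_mm)

# A 1920x1080 panel reported as 509x286 mm works out to roughly 96 DPI:
dpi = output_dpi(1920, 1080, 509, 286)
```

Note the guard clause: the 0x0 case is not hypothetical, as the examples below show.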

> Your examples lean a lot on TVs and projectors, but are those really the
> key use cases we have to consider? What about laptops and especially
> tablets, whose resolutions are gradually moving upwards (in the laptop
> case despite the underlying software problems, in the tablet case
> because the underlying software doesn't have such a problem)? Is it
> really a great idea, for instance, if we put Fedora 17 on a 1024x600, 7"
> tablet and it comes up with zonking huge fonts all over the place?

I'm going to not mention the traditional monitors I've seen with bad
EDID.  I'm going to not mention the laptops I've seen that report 0x0
physical size, or something non-zero and fictitious.  I'm going to not
mention the laptops where you simply don't get EDID, you get some subset
buried in the video ROM, and you get to hope that it might have physical
size encoded in it.  I'm going to not mention that DPI is only
approximately what you want anyway, and that you actually need to know
dots per unit arc, which is a function of both display size and viewing
distance.
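The dots-per-unit-arc point can be sketched numerically. The function name and the panel sizes and viewing distances below are illustrative assumptions, not measurements:

```python
import math

def pixels_per_degree(px, size_mm, distance_mm):
    """Pixels per degree of visual angle: the quantity that actually
    matters for legibility, a function of size AND viewing distance."""
    angle_deg = math.degrees(2 * math.atan(size_mm / (2 * distance_mm)))
    return px / angle_deg

# Same 1920 horizontal pixels: a 344 mm-wide laptop panel at 600 mm
# vs. a 1210 mm-wide TV at 2.5 m.  The DPI differs by roughly 3.5x,
# but the angular resolution the viewer experiences is comparable.
laptop = pixels_per_degree(1920, 344, 600)    # ~60 px/degree
tv = pixels_per_degree(1920, 1210, 2500)      # ~71 px/degree
```

Which is exactly why a per-monitor DPI number, let alone a global one, still only approximates what the UI actually wants to know.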

I'm going to simply quote myself from another message in this thread:
How people use this information is entirely not my concern.  My job is
to get the pixels on the screen; it might be to try valiantly to tell
you how big they are; it is not to decide if they're big enough.

> I think it's worth considering that, even though Microsoft's crappiness
> with resolution independence has probably hindered the market
> artificially for a while, the 96dpi number which comes from the
> capabilities of CRT tubes circa 1995 bears increasingly little
> resemblance to the capabilities of modern displays, and assuming we can
> just keep hardcoding 96dpi and monitor technology will remain
> artificially retarded forever is likely not a great thing to do.

I don't believe that was a position I was defending.

I would caution you against thinking that there's some DPI revolution
right around the corner.  That's the same fallacy that rages against the
TV industry for "stalling" at 1080p.  Linear increases in DPI are
quadratic increases in link bandwidth, and maxed-out single-link DVI
(the source of the 1080p limit) is already a higher symbol rate than
gigabit ethernet.
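A back-of-the-envelope check on that claim. The DVI constants (165 MHz single-link pixel clock, TMDS encoding each 8-bit byte as a 10-bit symbol on each of three data channels) are from the DVI 1.0 specification; the rest is arithmetic:

```python
# Single-link DVI symbol rate vs. gigabit ethernet, and why DPI gains
# are expensive: doubling DPI doubles pixels in each dimension, so it
# quadruples the pixel count and (at fixed refresh rate) the bandwidth.

PIXEL_CLOCK_HZ = 165_000_000   # single-link DVI maximum pixel clock
TMDS_BITS_PER_SYMBOL = 10      # each 8-bit byte sent as a 10-bit symbol

# Bit rate on each of the three TMDS data channels:
per_channel_bps = PIXEL_CLOCK_HZ * TMDS_BITS_PER_SYMBOL  # 1.65 Gbit/s

def relative_bandwidth(dpi_scale):
    """Link bandwidth multiplier for a linear DPI increase."""
    return dpi_scale ** 2
```

So each TMDS channel already runs at 1.65 Gbit/s, comfortably past gigabit ethernet's line rate, and a mere 2x DPI bump demands 4x that.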

- ajax

More information about the devel mailing list