Why EDID is not trustworthy for DPI

Matthew Garrett mjg59 at srcf.ucam.org
Wed Oct 5 20:31:15 UTC 2011


On Wed, Oct 05, 2011 at 12:31:50PM -0700, Adam Williamson wrote:

> Like I replied to ajax, I suspect when the problem of assuming
> everything's 96dpi becomes simply too acute, instead of fixing
> everything really properly so that all displays correctly report their
> size and all desktops actually do resolution independence perfectly so
> it doesn't _matter_ if one of your displays is 98dpi and the other is
> 215dpi, everything still looks perfect, the industry will just wind up
> with a slightly more sophisticated bodge where we have a few 'standard'
> resolutions and just figure out which one your displays are closest to.
> But that's still going to require some kind of sensible handling of the
> case where one monitor is roughly 100dpi and the other is roughly
> 200dpi, unless we simply say 'you can't do that, all your displays have
> to be in the same DPI Category'.

Sure, in the future when we have font renderers that run in GPU shaders 
we can think about whether there's a plausible way to make applications 
work when they have to deal with multiple DPIs simultaneously. But we 
don't have any technology that can do any of that at the moment, so the 
simple fact is that right now the decision to have GNOME run at 96dpi 
regardless of the output is an entirely rational one. Anyone who argues 
otherwise gets to explain how all the difficult bits would work. The 
end.

-- 
Matthew Garrett | mjg59 at srcf.ucam.org

