Why EDID is not trustworthy for DPI

Nicolas Mailhot nicolas.mailhot at laposte.net
Thu Oct 6 19:22:22 UTC 2011


Le jeudi 06 octobre 2011 à 16:41 +0100, Matthew Garrett a écrit :
> On Thu, Oct 06, 2011 at 05:33:48PM +0200, Nicolas Mailhot wrote:
> > 
> > Le Jeu 6 octobre 2011 17:18, Matthew Garrett a écrit :
> > > The heuristic isn't the problem. The problem is that we have no
> > > technology that allows us to handle the complicated case of multiple
> > > displays, and solving it purely for the simple case makes the
> > > complicated case *worse*.
> > 
> > How does it make it worse? The heuristic does not solve the complicated case
> > *at all*. How could removing it possibly make it worse?
> 
> What heuristic?

The one you were writing about.

Me, I don't care which heuristic you were thinking of. Any
heuristic will annoy large numbers of users when you're dealing with
text. The rules should be simple and stupid:

A. on a single-screen system
 1. use the xorg-detected screen size to compute actual DPI, base font
sizes on it
 2. if autodetection didn't work or the results look wrong (because the
hardware is broken, or it's not designed to be used on a desk but is a
TV or a projector), ask the user to provide the screen size (displaying a
slider plus a ruler in the locale's length unit, with the length unit
displayed on screen too; users are smart enough to fake lengths if they
want to). If you want market forces to work, crowdsource the complaining
and tell the user his hardware is broken and he should take it up with
the manufacturer.
 3. save the results and reuse them each time the same screen is used
 4. propagate the resulting dpi so every toolkit can use them (ideally
propagate it down to the xorg level so every toolkit that uses xorg dpi
will just work)
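Rule A.1 is simple arithmetic: DPI is pixel count over physical size in inches. A minimal sketch, where the example panel dimensions (1920x1080 pixels on a 344 mm x 194 mm laptop screen) and the plausibility bounds in the A.2 sanity check are illustrative assumptions, not values from any real EDID:

```python
# Sketch of rule A.1: derive actual DPI from the xorg-detected
# physical size. All numeric thresholds here are illustrative.

MM_PER_INCH = 25.4

def compute_dpi(px_w, px_h, mm_w, mm_h):
    """Return (horizontal, vertical) DPI from pixel and mm dimensions."""
    return (px_w * MM_PER_INCH / mm_w, px_h * MM_PER_INCH / mm_h)

def looks_wrong(dpi_x, dpi_y, lo=50.0, hi=400.0):
    """Rule A.2: flag implausible results (broken EDID, TV, projector).

    Values far outside desktop-monitor range, or strongly non-square
    pixels, suggest the reported size can't be trusted.
    """
    return not (lo <= dpi_x <= hi and lo <= dpi_y <= hi
                and abs(dpi_x - dpi_y) / dpi_x < 0.1)

dpi_x, dpi_y = compute_dpi(1920, 1080, 344, 194)
print(round(dpi_x), round(dpi_y))  # 142 141
```

When `looks_wrong` fires, that is the point where the user would be shown the ruler-and-slider dialog instead of silently trusting the hardware.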

B. when a second screen is detected
 1. use the same rules to get its size
 2. if the computed dpi for the second screen is too different from the
first one, ask the user what to do (optimize for screen 1, for screen 2,
deoptimize both with a middle setting)
 3. save the results to apply them automatically the next time this
screen combination is encountered
 4. longer term, start thinking about how to apply different dpi to
different outputs, as screen densities have clearly been diverging for
some years, and the combination of fixed-size laptops and increasing
resolutions can only mean more divergence in the future
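The B.2 decision can be sketched in a few lines. The 20% divergence threshold and the geometric mean as the "deoptimize both" middle setting are my illustrative choices, not anything the rules above prescribe:

```python
# Sketch of rule B.2: decide whether two screens' DPIs differ enough
# to require asking the user, and compute the three options offered.
# The threshold and the geometric-mean compromise are assumptions.

def dpi_choices(dpi1, dpi2, threshold=0.20):
    """Return None if the DPIs are close enough to pick either,
    else the three options to present to the user."""
    if abs(dpi1 - dpi2) / min(dpi1, dpi2) <= threshold:
        return None  # no prompt needed, rule B.3 can save either value
    middle = (dpi1 * dpi2) ** 0.5  # slightly wrong on both screens
    return {"screen1": dpi1, "screen2": dpi2, "middle": middle}

print(dpi_choices(96, 100))                  # None: close enough
print(round(dpi_choices(96, 142)["middle"]))  # 117
```

Whatever the user picks would then be saved keyed on the screen pair, per rule B.3, so the question is asked only once per combination.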

C. for font sizes
 1. display them in points (pt) or pixels (px),
 2. display the unit you're using. Don't make the user guess what the
perverted font dialog author had in mind
 3. let the user specify them in points or pixels as he prefers,
regardless of the default display unit
 4. accept decimals, do not try to round sizes to integers
 5. do not try to invent a new unit. Yes, historical units are stupid and
badly thought out, but so are imperial units and the letter format, and
that's what some countries still use. Units are not there to be smart;
units are there so different people can agree on measurements. Points
are what users will see in every electronic document they fill in; no
new unit will be better enough to outweigh the hassle of the user having
to deal with a new one. If you have time to waste, go rewrite every
electronic document app and format on the market to use your unit, and
only afterwards inflict it on desktop users. And if you still think
being pedantic about font size units is a good idea, try to use only
kelvins when speaking to others about temperatures and see how they
react.
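The pt/px conversion behind rules C.1-C.4 is exact once the real DPI is known: a point is 1/72 inch, so px = pt × dpi / 72. A minimal sketch (the 96 and 142 DPI figures are just examples):

```python
# Sketch of rules C.1-C.4: convert between points and pixels using
# the real DPI. A point is defined as 1/72 inch, so this is exact.

def pt_to_px(pt, dpi):
    return pt * dpi / 72.0

def px_to_pt(px, dpi):
    return px * 72.0 / dpi

# The same nominal 10 pt font needs more pixels on a denser panel.
print(pt_to_px(10, 96))   # 13.33... px at a nominal 96 DPI
print(pt_to_px(10, 142))  # 19.72... px on a denser panel
```

Note that both results are fractional, which is exactly why rule C.4 says to accept decimals rather than rounding sizes to integers.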

-- 
Nicolas Mailhot


