Is there a way to set the DPI setting for GDM, and thus the X sessions called from it, without changing this in the individual user profiles?
On one system I have, F11 gdm was automatically set to 120DPI, which looked absolutely hideous at 1440x900 resolution. On another system, it set itself automatically to 94DPI, which looked better, but not ideal. I'd really like the default on both to be 96DPI (or perhaps 100DPI).
How do we do this, or are we just simply at the mercy of whatever gdm decides to do?
Cheers,
Chris
--
====================================================
"Patriotism is when love of your own people comes first; nationalism, when hate for people other than your own comes first."
--Charles de Gaulle
On Mon, 11 May 2009 15:50:03 -0600 Christopher A. Williams wrote:
How do we do this, or are we just simply at the mercy of whatever gdm decides to do?
I've been fighting with DPI a lot. My simplest solution is to switch from gdm to kdm and edit the kdmrc file to add -dpi 96 to the server args, but you can fool gdm into setting dpi for the duration of the login screen by copying your own ~/.gconf/desktop/gnome/font_rendering directory to ~gdm (and fixing ownership to be owned by gdm). (This assumes you have already set dpi to 96 in the fonts tab of the window appearance preferences).
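For anyone who wants to try it, the two tweaks look roughly like this (paths, the kdmrc section name, and gdm's home directory location are from memory, so treat this as a sketch rather than gospel):

    # kdm: force the DPI via the server arguments in /etc/kde/kdm/kdmrc
    [X-:*-Core]
    ServerArgsLocal=-nolisten tcp -dpi 96

    # gdm: hand gdm your own font settings (run as root; substitute
    # your own login for "youruser")
    mkdir -p ~gdm/.gconf/desktop/gnome
    cp -r ~youruser/.gconf/desktop/gnome/font_rendering ~gdm/.gconf/desktop/gnome/
    chown -R gdm:gdm ~gdm/.gconf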
For more on the DPI saga, see:
On Mon, 2009-05-11 at 18:40 -0400, Tom Horsley wrote:
On Mon, 11 May 2009 15:50:03 -0600 Christopher A. Williams wrote:
How do we do this, or are we just simply at the mercy of whatever gdm decides to do?
I've been fighting with DPI a lot. My simplest solution is to switch from gdm to kdm and edit the kdmrc file to add -dpi 96 to the server args, but you can fool gdm into setting dpi for the duration of the login screen by copying your own ~/.gconf/desktop/gnome/font_rendering directory to ~gdm (and fixing ownership to be owned by gdm). (This assumes you have already set dpi to 96 in the fonts tab of the window appearance preferences).
For more on the DPI saga, see:
Thanks. Interesting read to be sure. It would actually be comical if it wasn't so serious. Hopefully someone with decision authority in X.org will understand the folly of the "improvements" that have been made and provide a practical way to have a manual override.
Fortunately, I have an nVidia card on one of my systems. Unfortunately, my laptop has switchable graphics - with the choice of Intel or ATI chipsets. Guess there really isn't going to be much of a workable solution which covers all of the bases for that one...
Cheers,
Chris
--
=============================
"You see things as they are and ask, 'Why?' I dream things as they never were and ask, 'Why not?'"
-- George Bernard Shaw
Christopher A. Williams wrote:
Fortunately, I have an nVidia card on one of my systems. Unfortunately, my laptop has switchable graphics - with the choice of Intel or ATI chipsets. Guess there really isn't going to be much of a workable solution which covers all of the bases for that one...
I'd recommend setting it to Intel only (i.e. disable the ATI one) in the BIOS if possible, you'll save power and you'll get well-working Free drivers.
Kevin Kofler
On Tue, 2009-05-19 at 21:40 +0200, Kevin Kofler wrote:
Christopher A. Williams wrote:
Fortunately, I have an nVidia card on one of my systems. Unfortunately, my laptop has switchable graphics - with the choice of Intel or ATI chipsets. Guess there really isn't going to be much of a workable solution which covers all of the bases for that one...
I'd recommend setting it to Intel only (i.e. disable the ATI one) in the BIOS if possible, you'll save power and you'll get well-working Free drivers.
Already did that.
...and I'm staying with the Intel chipset until such time as the level of suckage with the ATI drivers for the Radeon HD3400 series reduces to reasonably tolerable levels.
However, because of how the chipsets are set up on this laptop, the ATI chipset actually does use less power overall.
Christopher A. Williams wrote:
On one system I have, F11 gdm was automatically set to 120DPI, which looked absolutely hideous at 1440x900 resolution. On another system, it set itself automatically to 94DPI, which looked better, but not ideal. I'd really like the default on both to be 96DPI (or perhaps 100DPI).
It defaults to the actual correct DPI value for your screen (monitor), unless your screen is broken and reports a nonsense DPI.
Kevin Kofler
On Tue, 2009-05-19 at 21:38 +0200, Kevin Kofler wrote:
Christopher A. Williams wrote:
On one system I have, F11 gdm was automatically set to 120DPI, which looked absolutely hideous at 1440x900 resolution. On another system, it set itself automatically to 94DPI, which looked better, but not ideal. I'd really like the default on both to be 96DPI (or perhaps 100DPI).
It defaults to the actual correct DPI value for your screen (monitor), unless your screen is broken and reports a nonsense DPI.
...Umm, no.
My TP T400 laptop defaulted to 120DPI, and my desktop at home with a GeForce Series nVidia chipset and a 24" Acer LCD panel defaulted to 94DPI.
Neither one of these was "optimal". At 120DPI (1440x900 native resolution), the T400 display looks absolutely crummy. Changing it to 96DPI made it crystal clear and sharp. Doing the same on my LCD panel on my desktop had exactly the same effect, albeit less so since it was pretty close already.
In every case where I have had the opportunity to load F10 and F11 (many different systems / hardware combinations), I have yet to see a "correct", as you put it, DPI set up as a default. It's not possible that all of those systems' displays were, again as you put it, broken. In fact, none of them were.
...and what about projector displays where there actually is no specific screen size? By the standard you give, all of them are "broken" by definition since you can't optimize DPI based on the screen size. What about laptops that are routinely connected to projectors for presentations, while simultaneously using the local display?
Regardless of all of that, there should always be a way to tell X what DPI you want anyway. Who said the manufacturer's "correct" setting is the best for you, and that's assuming they use a standard way of specifying that? Clearly different manufacturers do this differently. As such there is no standard per se. Even if there was, you should be able to modify this to taste with a reasonable level of safety.
Cheers,
Chris
On Tue, 19 May 2009 16:40:28 -0600 Christopher A. Williams wrote:
Regardless of all of that, there should always be a way to tell X what DPI you want anyway.
There is: The -dpi option of the X server, but gdm has made it impossible to set server startup options (which is why I switched my login manager to kdm).
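In other words, what you want is the moral equivalent of this server invocation, which gdm gives you no supported way to express:

    X :0 -dpi 96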
On Tue, 2009-05-19 at 16:40 -0600, Christopher A. Williams wrote:
On Tue, 2009-05-19 at 21:38 +0200, Kevin Kofler wrote:
Christopher A. Williams wrote:
On one system I have, F11 gdm was automatically set to 120DPI, which looked absolutely hideous at 1440x900 resolution. On another system, it set itself automatically to 94DPI, which looked better, but not ideal. I'd really like the default on both to be 96DPI (or perhaps 100DPI).
It defaults to the actual correct DPI value for your screen (monitor), unless your screen is broken and reports a nonsense DPI.
...Umm, no.
My TP T400 laptop defaulted to 120DPI, and my desktop at home with a GeForce Series nVidia chipset and a 24" Acer LCD panel defaulted to 94DPI.
Neither one of these was "optimal". At 120DPI (1440x900 native resolution), the T400 display looks absolutely crummy. Changing it to 96DPI made it crystal clear and sharp. Doing the same on my LCD panel on my desktop had exactly the same effect, albeit less so since it was pretty close already.
In every case where I have had the opportunity to load F10 and F11 (many different systems / hardware combinations), I have yet to see a "correct", as you put it, DPI set up as a default. It's not possible that all of those systems' displays were, again as you put it, broken. In fact, none of them were.
It sounds like you're not really following the concept of DPI. I'm not sure how you could possibly see different DPI settings as "sharp" or "not sharp", that just isn't the effect of changing DPI at all. All it does is cause characters to be rendered larger (high DPI) or smaller (low DPI).
Regardless of all of that, there should always be a way to tell X what DPI you want anyway. Who said the manufacturer's "correct" setting is the best for you, and that's assuming they use a standard way of specifying that?
OK, clearly you don't understand the concept.
There's no such thing as a DPI that's 'best for you'. DPI means dots per inch. The correct DPI is a pure mathematical calculation based on the size of the display and the resolution in use. There is no room for subjectivity.
The objective of setting the correct DPI is so that fonts will be rendered at the correct size. There is an absolute, standard definition of exactly how big - in real, physical units, not pixels - a twelve point letter A should be, for instance. When you set your display to the correct DPI, a 12 point letter A will be displayed at exactly the right size in actual physical terms (inches, centimetres, whatever it says on your ruler).
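The arithmetic behind that, for the record (one point is 1/72 of an inch):

    pixels = points * DPI / 72

    12 pt at  96 DPI -> 12 *  96 / 72 = 16 px
    12 pt at 120 DPI -> 12 * 120 / 72 = 20 px

Both renderings are nominally "12 point"; only at the correct DPI does the result actually measure 12/72 of an inch on your screen.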
If everyone set their DPI correctly (and all applications and desktops rendered fonts, and physical size-specified graphical items, correctly), then - no matter what particular display you happened to be looking at things on - they'd always look the same size, and that would be the size the creator intended.
There's several technical obstacles to this goal, which is why GNOME recently decided to forget about trying to respect this concept ('resolution independence') and just override everyone's DPI setting to 96. This is a regrettable but fairly sensible decision. However, if you don't understand the theory here, you're never going to get what's going on exactly right.
Clearly different manufacturers do this differently.
Manufacturers do not specify the correct DPI, nor do they 'do it differently'. Display panels provide their physical size via their EDID information. X uses this information combined with its knowledge of what resolution is in use to calculate the correct DPI. Some displays provide their size in stupidly boneheaded ways, but most of these special cases have been caught by now, and the values that were calculated for your displays both sound like they were correct, to me. If you're not confident in your monitor manufacturer, it's fairly trivial to figure out the correct DPI setting for your monitor with nothing but a ruler. Measure the display in inches horizontally, and divide the result into the horizontal resolution in pixels. That's the correct DPI setting for the display.
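For instance, assuming the T400 has the usual 14.1" 16:10 panel (I'm guessing at the size; check it against your ruler):

    width = 14.1 * 16 / sqrt(16^2 + 10^2) ~= 12.0 inches
    DPI   = 1440 / 12.0                   ~= 120

which suggests the 120 it chose was in fact the accurate value.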
As such there is no standard per se.
Yes, there is.
Adam Williamson wrote:
There's several technical obstacles to this goal, which is why GNOME recently decided to forget about trying to respect this concept ('resolution independence') and just override everyone's DPI setting to 96. This is a regrettable but fairly sensible decision. However, if you don't understand the theory here, you're never going to get what's going on exactly right.
Uh, I thought that was an old decision and now it got changed to actually respect the native DPI. Did they revert it again? (If so, that'd be regrettable, as everything else, including KDE, is honoring DPI these days.)
Kevin Kofler
On Wed, 2009-05-20 at 02:14 +0200, Kevin Kofler wrote:
Adam Williamson wrote:
There's several technical obstacles to this goal, which is why GNOME recently decided to forget about trying to respect this concept ('resolution independence') and just override everyone's DPI setting to 96. This is a regrettable but fairly sensible decision. However, if you don't understand the theory here, you're never going to get what's going on exactly right.
Uh, I thought that was an old decision and now it got changed to actually respect the native DPI. Did they revert it again? (If so, that'd be regrettable, as everything else, including KDE, is honoring DPI these days.)
Yeah, it ping pongs. :) If I'm keeping score correctly, the last state is that it got ping ponged back to '96dpi by default for everyone' again, but I may be off. It doesn't seem to be easily Google-able.
On Tue, 2009-05-19 at 17:06 -0700, Adam Williamson wrote:
In every case where I have had the opportunity to load F10 and F11 (many different systems / hardware combinations), I have yet to see a "correct", as you put it, DPI set up as a default. It's not possible that all of those systems' displays were, again as you put it, broken. In fact, none of them were.
It sounds like you're not really following the concept of DPI. I'm not sure how you could possibly see different DPI settings as "sharp" or "not sharp", that just isn't the effect of changing DPI at all. All it does is cause characters to be rendered larger (high DPI) or smaller (low DPI).
Hmm, actually, I just realized how this could possibly be...I *think* there's hinting for certain pixel sizes built in to some fonts, so they'll look better if they happen to be rendered at that exact pixel size. And that hinting tends to be done for the pixel sizes you get in common point sizes at 96dpi.
Still, doesn't invalidate the overall principle.
On Tue, 19 May 2009 17:06:49 -0700 Adam Williamson wrote:
There's no such thing as a DPI that's 'best for you'. DPI means dots per inch. The correct DPI is a pure mathematical calculation based on the size of the display and the resolution in use. There is no room for subjectivity.
It is utter and complete nonsense like this which leads to so many idiotic decisions in Linux. Caring about DPI having a formally correct definition is for those afflicted with OCD. Being able to read the damn characters most apps put on the screen is what virtually everyone in the real world would prefer.
There are two ways to make the characters readable:
1. Rewrite every app in creation to conform to yet another new complicated Xorg visibility layer for applying magnification factors computed from visual acuity of the user, and distance from display, combined with several years worth of human factors AI algorithms.
2. Lie about the DPI and achieve the mathematically identical effect without modifying a single app.
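"Mathematically identical" because the renderer only ever sees the product of the two numbers:

    pixel_size = point_size * DPI / 72

    9 pt    at a lied 144 DPI -> 9    * 144 / 72 = 18 px
    13.5 pt at a true  96 DPI -> 13.5 *  96 / 72 = 18 px

Same pixels either way, except the lie is applied once, globally, instead of requiring every app to grow a font size knob.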
In fact, quite a lot of monitors don't report the correct physical dimensions, so unless you can lie about the DPI, you can't even correct these display devices to show the absolute anally correct DPI, much less the DPI that makes characters visible on your 52DPI HD monitor you are sitting 3 feet away from.
On Tue, 2009-05-19 at 20:36 -0400, Tom Horsley wrote:
On Tue, 19 May 2009 17:06:49 -0700 Adam Williamson wrote:
There's no such thing as a DPI that's 'best for you'. DPI means dots per inch. The correct DPI is a pure mathematical calculation based on the size of the display and the resolution in use. There is no room for subjectivity.
It is utter and complete nonsense like this which leads to so many idiotic decisions in linux. Caring about DPI having a formally correct definition is for those afflicted with OCD.
...and anyone who ever, say, prints anything.
Being able to read the damn characters most apps put on the screen is what virtually everyone in the real world would prefer.
Ironically, that's exactly what resolution independence is intended to achieve.
There are two ways to make the characters readable:
1. Rewrite every app in creation to conform to yet another new complicated Xorg visibility layer for applying magnification factors computed from visual acuity of the user, and distance from display, combined with several years worth of human factors AI algorithms.
2. Lie about the DPI and achieve the mathematically identical effect without modifying a single app.
Er, how does this 'achieve the mathematically identical effect'?
I bought a new laptop a few months back. It has an 8", 1600x768 resolution screen. That's a DPI up around 180. If I set it to a DPI of 96, then it's almost impossible to read anything at default font sizes. Setting the wrong DPI and then doubling all my font sizes to compensate would be absurd. The correct solution is to set the correct DPI - 180 - so that characters get rendered at a sensible physical size without me having to go around changing font sizes to something ridiculous (like a default of 20 points) all over the place.
(If I were to set the default font size to 20 points in OpenOffice, so I could actually read a document I was typing, then when I printed it out, it'd be ridiculously large...)
If you go out and read the reviews for the Vaio P, lots of reviewers (running Windows, remember, which doesn't allow you to set an arbitrary DPI, and can't get above 120dpi with any hack) panned it because 'the fonts are too small'. Which is an absurd complaint caused by the limitations of using an arbitrary default DPI setting. As more and more very high resolution displays are released - which they will be - this will become an issue for more and more people. Proper resolution independence is the only sane solution. There's no other way you can properly work when a display's actual physical resolution may be 80dpi (some old 19" 1024x768 monitor) or 250dpi (a high-end monitor in a year's time, or an e-ink display...)
In fact, quite a lot of monitors don't report the correct physical dimensions, so unless you can lie about the DPI, you can't even correct these display devices to show the absolute anally correct DPI, much less the DPI that makes characters visible on your 52DPI HD monitor you are sitting 3 feet away from.
So? It's quite easy to exclude obviously wrong dimensions and just go with a default, or if they're only slightly wrong, the result won't be terrible (and likely no worse than the entirely arbitrary 96dpi).
On Tue, 19 May 2009 17:48:14 -0700 Adam Williamson wrote:
...and anyone who ever, say, prints anything.
Printing has absolutely nothing to do with viewing display devices. Precise dimensions are important on printed media for filling out forms, getting inside label boundaries, etc. None of those reasons apply to display devices. What you do want on a display device is to see the same proportions, to know how it will look when you print it, but same proportions are nothing at all like same absolute size.
Ironically, that's exactly what resolution independence is intended to achieve.
Not the way you define it. A 9 point font is perfectly readable on a piece of paper I'm holding in my hand. On a 1920x1080 42" HD monitor, lower case characters rendered in a 9 point font have about 4 pixels available to render the character. It does me absolutely no good to have the size be "correct". All I get are little indistinguishable blobs.
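Run the numbers (assuming a 16:9 panel; the figures are approximate):

    width    = 42 * 16 / sqrt(16^2 + 9^2) ~= 36.6 inches
    true DPI = 1920 / 36.6                ~= 52
    9 pt em  = 9 * 52 / 72                ~= 6.5 px

The x-height of a lower case letter is only about half the em, hence the roughly 4 pixel blobs.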
Lying about the DPI scales everything up so that the image of the characters on the screen has just about the same angular diameter from my viewing location as the printed characters on the page in my hand.
Unless I'm trying to read a secret message revealed by laying a piece of tracing paper on the screen to complete an image partially printed on the paper, I can't imagine any use for forcing screen DPI to be physically "correct".
On Tue, 2009-05-19 at 17:06 -0700, Adam Williamson wrote:
On Tue, 2009-05-19 at 16:40 -0600, Christopher A. Williams wrote:
On Tue, 2009-05-19 at 21:38 +0200, Kevin Kofler wrote:
Christopher A. Williams wrote:
On one system I have, F11 gdm was automatically set to 120DPI, which looked absolutely hideous at 1440x900 resolution. On another system, it set itself automatically to 94DPI, which looked better, but not ideal. I'd really like the default on both to be 96DPI (or perhaps 100DPI).
It defaults to the actual correct DPI value for your screen (monitor), unless your screen is broken and reports a nonsense DPI.
...Umm, no.
My TP T400 laptop defaulted to 120DPI, and my desktop at home with a GeForce Series nVidia chipset and a 24" Acer LCD panel defaulted to 94DPI.
Neither one of these was "optimal". At 120DPI (1440x900 native resolution), the T400 display looks absolutely crummy. Changing it to 96DPI made it crystal clear and sharp. Doing the same on my LCD panel on my desktop had exactly the same effect, albeit less so since it was pretty close already.
In every case where I have had the opportunity to load F10 and F11 (many different systems / hardware combinations), I have yet to see a "correct", as you put it, DPI set up as a default. It's not possible that all of those systems' displays were, again as you put it, broken. In fact, none of them were.
It sounds like you're not really following the concept of DPI. I'm not sure how you could possibly see different DPI settings as "sharp" or "not sharp", that just isn't the effect of changing DPI at all. All it does is cause characters to be rendered larger (high DPI) or smaller (low DPI).
Actually I do understand this quite well.
Regardless of all of that, there should always be a way to tell X what DPI you want anyway. Who said the manufacturer's "correct" setting is the best for you, and that's assuming they use a standard way of specifying that?
OK, clearly you don't understand the concept.
There's no such thing as a DPI that's 'best for you'. DPI means dots per inch. The correct DPI is a pure mathematical calculation based on the size of the display and the resolution in use. There is no room for subjectivity.
I could go on for a while here. I understand the concept of DPI a lot better than you attribute to me.
The objective of setting the correct DPI is so that fonts will be rendered at the correct size. There is an absolute, standard definition of exactly how big - in real, physical units, not pixels - a twelve point letter A should be, for instance. When you set your display to the correct DPI, a 12 point letter A will be displayed at exactly the right size in actual physical terms (inches, centimetres, whatever it says on your ruler).
Technically correct, but utterly useless from a practical perspective.
If everyone set their DPI correctly (and all applications and desktops rendered fonts, and physical size-specified graphical items, correctly), then - no matter what particular display you happened to be looking at things on - they'd always look the same size, and that would be the size the creator intended.
But that's the problem, isn't it. This doesn't work unless _everyone_ in the chain, from the developer through the manufacturer, sets this properly, and in the case of manufacturers, builds their equipment to support it. That simply doesn't happen in the real world, does it.
...And you conveniently omitted the issue of projector displays. How does your algorithm set the "correct" DPI when the screen size changes with the distance from the projector to the screen, which is arbitrary and can't be detected? Oh, and that distance is different every time a portable projector is used. And, assuming it were possible to account for the distance in setting that "correct" DPI, do you really expect people to read text rendered at that "correctly" calculated DPI from 30 feet away or more?
There's several technical obstacles to this goal, which is why GNOME recently decided to forget about trying to respect this concept ('resolution independence') and just override everyone's DPI setting to 96. This is a regrettable but fairly sensible decision. However, if you don't understand the theory here, you're never going to get what's going on exactly right.
Except that I _do_ understand the theory here - and very well. And the above points to the crux of my argument. As a practical matter, going this "correctly calculated DPI" route was an utterly stupid decision to begin with. If the above is true (I really hope it is), perhaps the GNOME developer community has at least partially come to their senses. Defaulting to 96 DPI, _and_ still letting you set X to something different if you wanted to, would count as credit for coming completely to their senses. The only regrettable part of it really is that it's taken this long for GNOME to figure this out. Trying to do things the way they did initially is a clear demonstration of the difference between "smart" and "wise". That's why so many people have basically been asking the question, "How could people who are so smart do something so stupid?"
Clearly different manufacturers do this differently.
Manufacturers do not specify the correct DPI, nor do they 'do it differently'. Display panels provide their physical size via their EDID information. X uses this information combined with its knowledge of what resolution is in use to calculate the correct DPI. Some displays provide their size in stupidly boneheaded ways, but most of these special cases have been caught by now, and the values that were calculated for your displays both sound like they were correct, to me. If you're not confident in your monitor manufacturer, it's fairly trivial to figure out the correct DPI setting for your monitor with nothing but a ruler. Measure the display in inches horizontally, and divide the result into the horizontal resolution in pixels. That's the correct DPI setting for the display.
Correct DPI? Again, maybe from a technical perspective, but NOT from a practical one, let alone what the manufacturer likely intended for that display. Even if it worked (and we have demonstrated this is not consistently the case), what is displayed is "correct" only in that you can use a ruler to measure the height of the characters, even if it means you have to interpolate between pixels on an LCD display and make everything look like smudged crap in the process. It also smacks of the GNOME developers arrogantly deciding they know more about how to make every manufacturer's display work "correctly" than the manufacturers do, let alone the person who is using that display.
And manufacturers do "do it" differently because, as you have already stated yourself, each can - and often does - report display sizes differently, sometimes far differently from what the actual display size is. They do it for their own purposes, but they do it.
I would also challenge your contention that most exceptions have been caught. New models of displays, with different ways of reporting EDID information, are coming out all the time. You can't possibly argue that you've caught those exceptions too, unless you're arguing that GNOME is somehow clairvoyant.
Besides, using a ruler to measure the accuracy of the height of the characters isn't the standard most people use to measure the quality of their display (which is really what I originally meant by "correct for the user"), is it...
As such there is no standard per se.
Yes, there is.
There may be a _technical_ standard, but its applicability to, and use in, the real world is such that it's not really followed. Reference your "boneheaded" comment about display sizes, among others. Thus there is, de facto, no standard.
--
====================================================
"In theory there is no difference between theory and practice. In practice there is."
--Yogi Berra
On Tue, 19 May 2009 19:26:34 -0600 Christopher A. Williams wrote:
Defaulting to 96 DPI, _and_ still letting you set X to something different if you wanted to, would count as credit for coming completely to their senses. The only regrettable part of it really is that it's taken this long for GNOME to figure this out.
Actually, I don't think it should have anything to do with gnome. It should be in X where it would work for all toolkits across the board, and allow you to establish different DPI values for different monitors in a multi-monitor setup.
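(For a single monitor you can at least force it globally from a running session - xrandr grew a --dpi option at some point, though I don't recall exactly when:

    xrandr --dpi 96

What's missing is being able to do that per output.)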
https://bugs.freedesktop.org/show_bug.cgi?id=20545
I'm seriously considering starting work on the EDID filter for X I proposed in there. It would be great for making X pretend it had EDID info even when the monitor isn't online when you bring up X (talking to another system on your KVM switch, perhaps).
Probably never be accepted though - everyone seems to hate allowing users to control any aspect of their systems :-).
On Tue, 2009-05-19 at 19:26 -0600, Christopher A. Williams wrote:
It sounds like you're not really following the concept of DPI. I'm not sure how you could possibly see different DPI settings as "sharp" or "not sharp", that just isn't the effect of changing DPI at all. All it does is cause characters to be rendered larger (high DPI) or smaller (low DPI).
Actually I do understand this quite well.
Regardless of all of that, there should always be a way to tell X what DPI you want anyway. Who said the manufacturer's "correct" setting is the best for you, and that's assuming they use a standard way of specifying that?
OK, clearly you don't understand the concept.
There's no such thing as a DPI that's 'best for you'. DPI means dots per inch. The correct DPI is a pure mathematical calculation based on the size of the display and the resolution in use. There is no room for subjectivity.
I could go on for a while here. I understand the concept of DPI a lot better than you attribute to me.
Fair enough. If you do, that's fine; I have nothing to add. However, the way your message was written didn't seem to imply a good understanding of the issue.
There are clearly problems with the current practical implementation of resolution independence, and the cited use cases (long viewing distances etc.) are some of them. That's (partly) why we don't have it already, making everyone super happy.
To answer the practical issues raised - as Felix said, it's certainly possible to configure the DPI at the X server level, but (again as he said) the way to do this varies depending on the driver in use. You used to be able to do it fairly definitively for any driver via /etc/X11/Xresources (there's an Xft.dpi setting in that file which is supposed to override the X server's DPI value), but this no longer appears to work consistently, unfortunately. I think a bug report requesting a consistent place to override the automatically calculated (or just arbitrarily chosen) DPI setting for X would certainly be valid.
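For reference, that approach looks like this (worth trying, but as I say, no longer reliable everywhere):

    ! in ~/.Xresources (or the system-wide /etc/X11/Xresources):
    Xft.dpi: 96

    # then reload the resource database:
    xrdb -merge ~/.Xresources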
When GNOME's not defaulting to 96 dpi, it automatically inherits X's setting, but if you override it via GNOME's font configuration dialog, it sets it in a private way and the changed setting applies only to GTK+ apps. I think KDE is the same way. It might be nice if this were all co-ordinated between X, GNOME and KDE so that you can choose a manual setting either directly in some config file, or the GNOME / KDE apps would just poke that config file. Then it'd be nice and consistent.
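(If you want to poke GNOME's private setting directly, it lives in gconf - something like this, from memory:

    gconftool-2 --type float --set /desktop/gnome/font_rendering/dpi 96

but, again, only GTK+ apps will take any notice.)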
But in the long run the issue isn't just going to go away, and arbitrarily defaulting to 96dpi on all displays isn't the answer. It's horrible for very high-resolution displays, which are already fairly easily available and will only become more so.
2009/5/20 Adam Williamson awilliam@redhat.com:
When GNOME's not defaulting to 96 dpi, it automatically inherits X's setting, but if you override it via GNOME's font configuration dialog, it sets it in a private way and the changed setting applies only to GTK+ apps. I think KDE is the same way. It might be nice if this were all co-ordinated between X, GNOME and KDE so that you can choose a manual setting either directly in some config file, or the GNOME / KDE apps would just poke that config file. Then it'd be nice and consistent.
In my experience, KDE uses a weird mishmash of the strictly accurate DPI (window contents) and something that I suspect is hard-coded 96DPI (window decoration, taskbar):
https://bugzilla.redhat.com/show_bug.cgi?id=468451
http://bugs.kde.org/show_bug.cgi?id=179962
Then Gnome apps run under KDE seem to do something different again, but I haven't worked that out exactly.
I've got acceptable-looking results on both of my high-DPI screens by setting the KDE fonts to 120DPI through the config tool, which gets things close enough, but that's not super elegant or satisfying.
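For the record, I believe all the config tool does is write a forceFontDPI key into ~/.kde/share/config/kcmfonts (from memory, so the exact file name may differ):

    [General]
    forceFontDPI=120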
MEF
On 2009/05/19 16:40 (GMT-0600) Christopher A. Williams composed:
On Tue, 2009-05-19 at 21:38 +0200, Kevin Kofler wrote:
Christopher A. Williams wrote:
On one system I have, F11 gdm was automatically set to 120DPI, which looked absolutely hideous at 1440x900 resolution. On another system, it set itself automatically to 94DPI, which looked better, but not ideal. I'd really like the default on both to be 96DPI (or perhaps 100DPI).
It defaults to the actual correct DPI value for your screen (monitor), unless your screen is broken and reports a nonsense DPI.
...Umm, no.
Umm, sorta. s/actual correct DPI/accurate DPI/ and he's absolutely right. OTOH, "correct DPI" really is open to interpretation, because the consequences of _not_ assuming, and applying, the "standard" 96 can be rather miserable.
My TP T400 laptop defaulted to 120DPI, and my desktop at home with a GeForce Series nVidia chipset and a 24" Acer LCD panel defaulted to 94DPI.
Neither one of these was "optimal". At 120DPI (1440x900 native resolution), the T400 display looks absolutely crummy. Changing it to 96DPI made it crystal clear and sharp. Doing the same on my LCD panel on my desktop had exactly the same effect, albeit less so since it was pretty close already.
What do you mean by "looks"? Ugly fonts? Other manifestations of ugly, like text that doesn't fit right in the allotted window or field space, or icons disproportionately sized compared to accompanying text?
The problem is that resolution independence is needed, but not available yet. Achieving resolution independence is no small task, but forcing 96 on everyone without their consent will make the achievement more difficult in several ways. Among them: fewer complaints mean less motivation to do the work.
The numbers of really high DPI displays in actual use hasn't reached critical mass yet. The developers who would do the work need to have this equipment in order to facilitate doing it, so it's a bit of a chicken/egg problem.
In other ways it's just a matter of developers breaking old habits, not the least of which is using px to specify object sizes.
Regardless of all of that, there should always be a way to tell X what DPI you want anyway....you should be able to modify this to taste with a reasonable level of safety.
As long as resolution independence is not yet fully implemented, absolutely! It used to be that you could, but currently, whether you can at all, and how you do it, often depends on the chip and driver your gfxcard uses, in addition to the particular version (and subversion) of the Xorg components involved. It's a mess. :-(
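Where the driver honors it at all, the traditional override is DisplaySize in the Monitor section of xorg.conf - values in millimeters, from which X derives the DPI at the current resolution. A sketch for a 24" 16:10 panel (measure your own; these numbers are only illustrative):

    Section "Monitor"
        Identifier  "Monitor0"
        DisplaySize 517 323    # mm; at 1920 wide: 1920 / (517 / 25.4) ~= 94 DPI
    EndSection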
Who said the manufacturer's "correct" setting is the best for you, and that's assuming they use a standard way of specifying that? Clearly different manufacturers do this differently. As such there is no standard per se.
Again, try not to confuse discussion of accurate DPI with "correct" DPI. The former is unequivocal, the latter, anything but.