On Thu, 15 Jul 2004, jludwig wrote:
> is really bugging me. I have an ATI RADEON 7200 card and a Sun 21"
> monitor, which is a Sony GDM-20E20. Here is the problem...
>
> In the windows manager I can set my resolution to 1600x1200@60. It works
> great. But when I reboot the box, the screen goes blank during both
> "start up" and the "login screen". Since I set it to do "auto login" I do
> get the desktop, but at 1152x864. This happens every time. I have to
> manually change the resolution back to 1600x1200. I am lost. Please help.
>
> Here is /etc/X11/xorg.conf......
>
> # Xorg configuration created by system-config-display
>
> Section "ServerLayout"
> Identifier "single head configuration"
> Screen 0 "Screen0" 0 0
> InputDevice "Mouse0" "CorePointer"
> InputDevice "Keyboard0" "CoreKeyboard"
> EndSection
>
> Section "Files"
>
> # RgbPath is the location of the RGB database. Note, this is the name of the
> # file minus the extension (like ".txt" or ".db"). There is normally
> # no need to change the default.
> # Multiple FontPath entries are allowed (they are concatenated together)
> # By default, Red Hat 6.0 and later now use a font server independent of
> # the X server to render fonts.
> RgbPath "/usr/X11R6/lib/X11/rgb"
> FontPath "unix/:7100"
> EndSection
>
> Section "Module"
> Load "dbe"
> Load "extmod"
> Load "fbdevhw"
> Load "glx"
> Load "record"
> Load "freetype"
> Load "type1"
> Load "dri"
> EndSection
>
> Section "InputDevice"
>
> # Specify which keyboard LEDs can be user-controlled (eg, with xset(1))
> # Option "Xleds" "1 2 3"
> # To disable the XKEYBOARD extension, uncomment XkbDisable.
> # Option "XkbDisable"
> # To customise the XKB settings to suit your keyboard, modify the
> # lines below (which are the defaults). For example, for a non-U.S.
> # keyboard, you will probably want to use:
> # Option "XkbModel" "pc102"
> # If you have a US Microsoft Natural keyboard, you can use:
> # Option "XkbModel" "microsoft"
> #
> # Then to change the language, change the Layout setting.
> # For example, a german layout can be obtained with:
> # Option "XkbLayout" "de"
> # or:
> # Option "XkbLayout" "de"
> # Option "XkbVariant" "nodeadkeys"
> #
> # If you'd like to switch the positions of your capslock and
> # control keys, use:
> # Option "XkbOptions" "ctrl:swapcaps"
> # Or if you just want both to be control, use:
> # Option "XkbOptions" "ctrl:nocaps"
> #
> Identifier "Keyboard0"
> Driver "keyboard"
> Option "XkbModel" "pc105"
> Option "XkbLayout" "us"
> EndSection
>
> Section "InputDevice"
> Identifier "Mouse0"
> Driver "mouse"
> Option "Protocol" "IMPS/2"
> Option "Device" "/dev/input/mice"
> Option "ZAxisMapping" "4 5"
> Option "Emulate3Buttons" "yes"
> EndSection
>
> Section "Monitor"
> Identifier "Monitor0"
> VendorName "Monitor Vendor"
> ModelName "Sony GDM-20SE2T5"
> HorizSync 30.0 - 96.0
> VertRefresh 48.0 - 160.0
> Option "dpms"
> EndSection
>
> Section "Device"
> Identifier "Videocard0"
> Driver "radeon"
> VendorName "Videocard vendor"
> BoardName "ATI Radeon 7200"
> EndSection
>
> Section "Screen"
> Identifier "Screen0"
> Device "Videocard0"
> Monitor "Monitor0"
> DefaultDepth 24
> SubSection "Display"
> Viewport 0 0
> Depth 24
> Modes "1600x1200" "1400x1050" "1280x1024" "1280x960"
> "1152x864" "1024x768" "800x600" "640x480"
> EndSubSection
> EndSection
>
> Section "DRI"
> Group 0
> Mode 0666
> EndSection
What I don't see here is the amount of video memory available. If the
window manager decides you don't have enough, it will drop to a lower
resolution.
Never ever put the VideoRam setting in the X server config file.
When that setting gets used, in almost all cases it is used
incorrectly and *causes* problems for the user. Every video
driver contains code to autodetect video memory, which is
automatically used, and in most of the video drivers it is
correct. In the Radeon driver it is correct 100% of the time,
and I have intentionally patched our driver to ignore the
VideoRAM option to prevent people from breaking their
configuration by overriding the amount of video memory.
The mga driver permits the VideoRAM option, because the proper
way to do memory detection for the G400 is unknown. For all other
Matrox hardware, memory detection should work correctly, and the
option should never be specified by the user.
The nv driver permits it for the Nvidia RivaTNT, as it isn't known
how to detect memory on that hardware either. All other Nvidia
hardware should autodetect properly, however.
The i810 driver permits this setting, but it uses it slightly
differently than most. In addition to being able to limit memory
to less than what the BIOS has configured, it is capable
of increasing the amount of memory stolen from the system - *but*
this feature only works on certain systems, and not at all on
others.
Some of the remaining drivers also have the odd board or two they
can't autodetect memory for, and so the VideoRAM setting can
be used to tell the driver the real amount of video memory on the
card - but it should only be used by experienced users who truly
know how much memory the card has and would bet their life
savings on the number they provide to the driver. Getting this
number wrong has one of two major consequences.
In the first case, you specify less memory than you really
have - this limits the amount of memory available and, depending
on how far low you go, it may cause certain video modes/refresh
rates to no longer be available, or it may cause the driver to
disable DRI as there isn't enough memory for textures, or it may
cause video overlays to stop working, or some other feature to
break. Generally the log file will state what has been disabled
and why.
In the worst case, you specify memory HIGHER than what you really
have, either because "the guy at the computer store told me this
was a 128Mb video card, but X says it's only 64Mb, so X is broken
and I'm fixing it", or for some other reason you are convinced
the video driver's video memory autodetection is wrong. In
almost 100% of bug reports received in which videoRam is being
used to override memory autodetection in this manner, the
"computer store guy" lied, or was otherwise wrong, or the person
was somehow misled into thinking they had a card with more
memory on it than it really has. A good test of how much
video memory is available is to boot into Microsoft Windows with
the manufacturer's official Windows drivers and see how much
video memory there is. The exact procedure to do so is left as
an exercise to the reader however. ;o)
If the video driver is told there is 128Mb of memory, it WILL TRY
TO USE IT ALL. This means that if you have a 64Mb card, it will
write to memory beyond 64Mb. What happens at that point depends
on how things are wired on the card and a number of other
factors. In some cases memory accesses wrap around to the
beginning of video memory, causing writes past the 64Mb point to
be written at the 0Mb point, etc. That will corrupt video memory,
pixmap memory, mouse pointer memory, buffers used by the 2D
and/or 3D engines, or cause any number of other problems.
Another problem arises if you're using a multihead card. The
VideoRAM option needs to be specified per head, and the total has
to be equal to or less than the total video memory that is really
available. People usually put the real total in both places, so
the video driver thinks each head has the total amount of memory,
which of course breaks things.
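For the rare chips that genuinely need the override (the dual-head
G400 mentioned above, say), a correctly split configuration might
look like the sketch below. The identifiers and amounts are purely
illustrative, and note that VideoRam takes a value in kilobytes:

```
# Illustrative dual-head split for a hypothetical 32Mb G400:
# each head gets half, and the two amounts sum to the real total.
Section "Device"
    Identifier  "Videocard0"
    Driver      "mga"
    Screen      0
    VideoRam    16384        # 16Mb for head 0 (value in kilobytes)
EndSection

Section "Device"
    Identifier  "Videocard1"
    Driver      "mga"
    Screen      1
    VideoRam    16384        # 16Mb for head 1 - NOT the 32Mb total
EndSection
```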
When people experience these odd problems, it usually appears as
if the video card is severely broken, or the video drivers are
just totally insanely broken. Bug reports come in which are
usually quite obscure, and unless the person reviewing them
notices VideoRAM being used and being a possible culprit, a great
number of hours or days of time could potentially be wasted on a
red herring bug report.
As such, when I see people suggesting the use of VideoRAM for a
particular problem, I feel compelled to take the soapbox and
spread the truth about this evil option. ;o)
On the good side of things however, I've disabled the VideoRAM
setting on the Radeon driver, so that it is impossible to
override the driver detected memory. The driver will refuse the
setting and log a comment to the log file stating:
(II) VideoRAM override ignored, this driver autodetects RAM
While I've made the Radeon driver sane, it's slightly trickier to
disable this setting in most other drivers, as each driver seems
to have one or two ancient cards that require this parameter. As
such, I decided to leave the other drivers be when I did the
Radeon patch, with plans to lobotomize this option on the other
drivers once I had time to google around to determine which chips
in each driver really do need the option - and then to disable
the option on all chips except those that truly do need it.
Nowadays though, most hardware works fine with autodetection, so
it might be a good idea to just disable the option across the
board except on hardware we already know needs it (like RivaTNT,
G400 and a few others), and then just wait for bug reports from
people who say "the video driver detects the wrong amount of
video memory". It would be easier to troubleshoot these cases
for people, and to update the driver to permit VideoRam overrides
on the chips it turns out to get wrong.
I believe this will make the drivers more robust against user
misconfiguration and will reduce the number of unnecessary bug
reports received both upstream and by us.
In the meantime however, the VideoRAM setting is not your
friend. It is there to make your life harder, and to allow
people to unintentionally and innocently break their setups.
Ignore the naysayers and repent! ;o)
</diatribe>
There, I feel better now. Please spread the knowledge of the
evils of VideoRAM around any time you hear someone suggest using
the option. Eventually it will lead to world peace.
;o)
I would try setting DefaultDepth to 16 bit color. It shouldn't
make a difference: a Radeon 7200 has enough video memory (32Mb
minimum) to use the largest resolution the card is capable of
displaying (2048x1536) at the highest color depth, with 3D
acceleration enabled. If changing the color depth does make a
difference, however, there is a bug somewhere. If so, please
bugzilla it and attach the X server log file and the config file
that was used during that log file's invocation.
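That claim is easy to sanity-check with a little shell arithmetic.
This is only a rough sketch - the driver's real accounting also
reserves memory for pixmaps, DRI textures, overlays and so on - but
the framebuffer itself is the dominant consumer:

```shell
# Framebuffer size for 2048x1536 at 32 bits (4 bytes) per pixel.
# Rough illustration only; real drivers reserve extra memory for
# pixmaps, DRI textures, overlays, etc.
width=2048 height=1536 bytes_per_pixel=4
fb_mb=$(( width * height * bytes_per_pixel / 1024 / 1024 ))
echo "framebuffer: ${fb_mb} Mb"   # prints "framebuffer: 12 Mb"
```

Even double-buffered that is only 24Mb, which still fits comfortably
in the card's 32Mb minimum.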
Thanks in advance!