We have guidelines for asking and answering questions. Linux questions only, please.
We make no guarantees about answers, but you can be anonymous on request.
See also: The Answer Gang's Knowledge Base and the LG Search Engine
From Chris Gianakopoulos
Answered By: Ben Okopnik, Robos
Hello Gang, how are all of you?
I picked up an ATI Radeon 7500 PCI card last month to use in my Linux machine. Much of the time, when I launched X (my login is console-based rather than GUI-based), I would get an indication that no monitor could be found. My card has an S-video connector, a panel connector (the big rectangular thing), and a VGA connector. My monitor is connected to the VGA connector.
[Robos] By the last sentence you mean, in reality (IIRC), the second VGA connector. The first "device" is the panel connector (DVI), which can become a VGA connector with an adapter (cheap and available everywhere). In most tests, the signal quality on that one is far better than what the second VGA connector puts out.
Looking at the Internet, it seems that some OEM boards have problems recognizing the monitor when it is connected to the VGA connector.
I think that I fixed it so that it works reliably all of the time. Here is a section of my XF86Config file. Note that I have seen other solutions to the problem, but I wanted to try this.
Section "Device"
    BoardName  "RV200 QW"
    BusID      "0:14:0"
    Driver     "radeon"
    Identifier "Device"
    Option     "CloneDisplay" "0"
    Option     "ForcePCIMode"
    Screen     0
    Option     "EnablePageFlip" "On"
    Option     "Rotate" "off"
    VendorName "ATI"
EndSection
Setting CloneDisplay equal to 0 appears to force the driver to default the monitor to a CRT rather than trying to probe for the monitor. Two days, and so far, so good.
Looking at /var/log/XFree86.0.log (the X log file) confirms that the monitor is no longer autoprobed and that a CRT is selected (Primary Display == Type 1). Looking at the driver source, radeon_driver.c, gives me the same indication that I did override the autoprobing.
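As a quick way to re-check this after future restarts, a grep over the log does the job. The sketch below runs against a sample log line, since the exact message wording is an assumption here and may vary by XFree86 version and driver build; point the grep at your real /var/log/XFree86.0.log instead.

```shell
# A sample line in the style the radeon driver logs (wording is an
# assumption; compare against your own /var/log/XFree86.0.log):
printf '(II) RADEON(0): Primary Display == Type 1\n' > /tmp/xlog.sample

# Pull out the detected display type:
grep 'Primary Display' /tmp/xlog.sample
```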
We'll see what I say after another month. Time and experimentation will validate my assumption.
[Ben] You might want to take a look at the X "radeon" man page. Since I use ATI's "fglrx" server, it doesn't help me much (I ran across it while researching my Radeon 9200), but you may find the option set and the explanations useful.
Yeah, that's what I wound up doing. It was a combination of the man page (the radeon one) and the log files that clued me in. glxgears gives me a maximum frame rate of 234 FPS. My NVIDIA GeForce2 MX400 gave me 670 FPS with glxgears, but I would get system lockups (the flashing keyboard lights and kernel panics when running scilab and Mozilla Firebird 0.7), so I got the Radeon board. I can live with less 3D performance, since I get a stable driver with my combination of motherboard and video card.
[Ben] Wow - I guess the 9200 is a pretty fair gadget in that regard, then.
ben@Fenrir:~$ glxgears
9073 frames in 5.0 seconds = 1814.600 FPS
9576 frames in 5.0 seconds = 1915.200 FPS
9562 frames in 5.0 seconds = 1912.400 FPS
9604 frames in 5.0 seconds = 1920.800 FPS
9570 frames in 5.0 seconds = 1914.000 FPS
9607 frames in 5.0 seconds = 1921.400 FPS
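For what it's worth, those per-interval figures are easy to average with awk. The snippet below works on a saved sample in the same output format (the sample data is made up for illustration; in a real run you'd redirect glxgears output to a file for a while, then kill it and run the awk line on that file).

```shell
# Two sample lines in glxgears' output format (illustrative values only):
cat > /tmp/gears.log <<'EOF'
9073 frames in 5.0 seconds = 1814.600 FPS
9576 frames in 5.0 seconds = 1915.200 FPS
EOF

# Average the FPS column (second-to-last field on each matching line):
awk '/FPS/ { sum += $(NF-1); n++ } END { printf "%.1f FPS average\n", sum/n }' /tmp/gears.log
```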
Anyway, my flight simulator (FGFS) works like a charm. That's what's important.
Ah, and to the other replies: glxgears is not really an indication of what a graphics card is capable of, by any means! Instead, use a demo program like Quake 3 in demo mode, or a screensaver, or something similar.
Cool. I did notice that torcs runs a little choppy compared to the nvidia card, though. Still, I'll heed your advice about glxgears. I'm kind of new to this 3D stuff; I had no idea that so much OpenGL stuff was available.
Thanks much. Chris G.