Linux and ATI drivers 8.22

General discussion about Celestia that doesn't fit into other forums.
Topic author
lumiwa
Posts: 2
Joined: 08.03.2006
With us: 18 years 8 months

Linux and ATI drivers 8.22

Post #1 by lumiwa » 08.03.2006, 01:16

Hi!

I have Linux, Celestia 1.4.1 and the ATI 8.22 drivers (fglrx). When I run Celestia I get:

Initializing ARB vertex programs . . .
Loading ARB vertex program: shaders/diffuse_arb.vp
Loading ARB vertex program: shaders/specular_arb.vp
Loading ARB vertex program: shaders/haze_arb.vp
Loading ARB vertex program: shaders/bumpdiffuse_arb.vp
Loading ARB vertex program: shaders/bumphaze_arb.vp
Loading ARB vertex program: shaders/shadowtex_arb.vp
Loading ARB vertex program: shaders/diffuse_texoff_arb.vp
Loading ARB vertex program: shaders/rings_arb.vp
Loading ARB vertex program: shaders/ringshadow_arb.vp
Loading ARB vertex program: shaders/night_arb.vp
Loading ARB vertex program: shaders/glossmap_arb.vp
Loading ARB vertex program: shaders/diffuse2_arb.vp
Loading ARB vertex program: shaders/haze2_arb.vp
Loading ARB vertex program: shaders/diffuse_texoff2_arb.vp
Loading ARB vertex program: shaders/specular2_arb.vp
Loading ARB vertex program: shaders/night2_arb.vp
Loading ARB vertex program: shaders/ell_galaxy_arb.vp
All ARB vertex programs loaded successfully.
render path: 3
Segmentation fault

When I uncommented "IgnoreGLExtensions..." I got this error instead:
render path 1
Segmentation fault
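
For reference, this is the IgnoreGLExtensions entry in celestia.cfg. In a stock install the commented-out example typically looks roughly like the line below (the exact extension name may differ on your system); uncommenting it makes Celestia skip the ARB vertex program path:

Code: Select all

# Tell Celestia to ignore a possibly broken extension so it falls back to a simpler render path.
IgnoreGLExtensions [ "GL_ARB_vertex_program" ]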

Thanks

Topic author
lumiwa
Posts: 2
Joined: 08.03.2006
With us: 18 years 8 months

Post #2 by lumiwa » 11.03.2006, 01:16

Now I have ATI drivers 8.23 and same problem. I cannot run Celestia.

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #3 by selden » 11.03.2006, 01:52

Evidently the new ATI drivers have developed a new bug :(

You have my sympathy.
Sorry I can't help.
Selden

giacomolg
Posts: 4
Joined: 24.04.2006
With us: 18 years 7 months
Location: Bologna (ITALY)

Post #4 by giacomolg » 24.04.2006, 23:26

lumiwa wrote:Now I have ATI drivers 8.23 and same problem. I cannot run Celestia.

I have the 8.24 ATI drivers on a Radeon 9200 Pro and I get more or less the same error:

Code: Select all

giacomo@ubuntu:~$ celestia
Initializing ARB vertex programs . . .
Loading ARB vertex program: shaders/diffuse_arb.vp
Loading ARB vertex program: shaders/specular_arb.vp
Loading ARB vertex program: shaders/haze_arb.vp
Loading ARB vertex program: shaders/bumpdiffuse_arb.vp
Error in vertex program shaders/bumpdiffuse_arb.vp, line 10: Error on line 12: malformed declaration (hint: '0.5')
render path: 1
Segmentation fault


I suppose something is wrong with the shaders :(

Hope you can fix it soon, I'm lost without Celestia :cry:

t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 8 months
Location: Hamburg, Germany

Post #5 by t00fri » 25.04.2006, 11:12

giacomolg wrote:
lumiwa wrote:Now I have ATI drivers 8.23 and same problem. I cannot run Celestia.
I have 8.24 ATI drivers on a Radeon 9200 Pro and I get more or less the same error

Code: Select all

giacomo@ubuntu:~$ celestia
Initializing ARB vertex programs . . .
Loading ARB vertex program: shaders/diffuse_arb.vp
Loading ARB vertex program: shaders/specular_arb.vp
Loading ARB vertex program: shaders/haze_arb.vp
Loading ARB vertex program: shaders/bumpdiffuse_arb.vp
Error in vertex program shaders/bumpdiffuse_arb.vp, line 10: Error on line 12: malformed declaration (hint: '0.5')
render path: 1
Segmentation fault


But that's an old story, really. It's not a CELESTIA bug but a familiar bug in ATI's proprietary drivers. I have a Radeon 9200SE in one of my desktops and know this segfault history well. Since you did not tell us what OS you are using, I have to stop here without more specific advice.

Anyhow, my DELL laptop performs nicely with Celestia 1.4.1 under /Windows/ with some fairly recent ATI driver (I don't recall its number now) and a Radeon Mobility card.

In Linux, however, the proprietary fglrx drivers always segfault with the 9200SE, while the ATI driver that ships with the recent X.org X servers works perfectly fine.

Hope you can fix it soon, I'm lost without Celestia :cry:


Definitely NOT, since it's an ATI bug.

Bye Fridger

giacomolg
Posts: 4
Joined: 24.04.2006
With us: 18 years 7 months
Location: Bologna (ITALY)

Thanks for answering

Post #6 by giacomolg » 25.04.2006, 14:48

Is there a way to disable shaders in Celestia?

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #7 by selden » 25.04.2006, 15:25

Please read the "Preliminary Celestia User's FAQ", which is a sticky near the top of the Users Forum.

In particular, see Q/A #1.
Selden

giavoltor
Posts: 2
Joined: 20.05.2006
With us: 18 years 6 months

Post #8 by giavoltor » 20.05.2006, 01:44

Problem:
--------
Celestia 1.4.1: segfault upon startup under Linux using ATI Binary drivers.

System info:
------------
Linux 2.6.16-gentoo-r7 #6 PREEMPT Sun May 14 17:51:46 EEST 2006 i686 Intel(R) Pentium(R) M processor 1500MHz GNU/Linux
01:00.0 VGA compatible controller: ATI Technologies Inc Radeon R250 Lf [FireGL 9000] (rev 02)
ati-drivers-8.24.8

Segmentation fault in:
----------------------
celestia-1.4.1/src/celengine/render.cpp, line 7115
glDrawArrays(GL_QUADS, 0, nStars * 4);

Possible reason:
----------------
Excessively deep recursion in some function in the ATI binary file atiogl_a_dri.so

Workaround:
-----------
Disable star rendering.
For example in GNOME:
start gconf-editor and uncheck apps->celestia->render->stars

Quick fix:
----------
celestia-1.4.1/src/celengine/render.cpp, line 162
starVertexBuffer = new StarVertexBuffer(64); //instead of 2048

Suggested fix:
--------------
The buffer capacity should be a configurable parameter
(e.g. placed in /usr/share/celestia/celestia.cfg)
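
A minimal sketch of what that could look like, assuming a hypothetical celestia.cfg entry and a hypothetical config field (neither exists in Celestia 1.4.1; only StarVertexBuffer and starVertexBuffer are real identifiers from render.cpp):

Code: Select all

// Hypothetical celestia.cfg entry (not in the stock 1.4.1 config):
//   StarVertexBufferCapacity  2048
//
// In render.cpp, replace the hard-coded constant with the configured value.
// config->starVertexBufferCapacity is an assumed field name, shown only to
// illustrate the idea.
unsigned int capacity = 2048;                      // current default
if (config != NULL && config->starVertexBufferCapacity > 0)
    capacity = config->starVertexBufferCapacity;   // e.g. 64 to dodge the fglrx bug
starVertexBuffer = new StarVertexBuffer(capacity);

Keeping 2048 as the default would preserve current behaviour; only users hit by the driver bug would need to set a smaller value.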

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 10 months
Location: Seattle, Washington, USA

Post #9 by chris » 20.05.2006, 02:44

giavoltor wrote:Quick fix:
----------
celestia-1.4.1/src/celengine/render.cpp, line 162
starVertexBuffer = new StarVertexBuffer(64); //instead of 2048

Suggested fix:
--------------
The buffer capacity should be a configurable parameter
(e.g. placed in /usr/share/celestia/celestia.cfg)


Decreasing the size of the star vertex buffer could significantly hurt performance when many stars are being rendered. In general, batching primitives is very important for attaining good performance with 3D graphics hardware. I'm reluctant to add a hackish workaround such as a config file parameter because of a broken driver. However, if the problem is widespread enough, I might consider it.
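
As a rough illustration (simplified, and not Celestia's actual rendering code), here is the difference between one large batched draw call and the many small draws that a 64-element buffer effectively forces:

Code: Select all

#include <GL/gl.h>

// Simplified sketch, not Celestia's real code: why a large batch is cheaper.
// One call submits every star quad at once; a tiny buffer repeats the
// per-draw-call driver overhead roughly nStars / capacity times.
void drawStarsOneBatch(int nStars)
{
    glDrawArrays(GL_QUADS, 0, nStars * 4);               // single driver round-trip
}

void drawStarsSmallBatches(int nStars, int capacity /* e.g. 64 */)
{
    for (int first = 0; first < nStars; first += capacity)
    {
        int count = (nStars - first < capacity) ? (nStars - first) : capacity;
        glDrawArrays(GL_QUADS, first * 4, count * 4);     // overhead repeats per batch
    }
}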

--Chris

giavoltor
Posts: 2
Joined: 20.05.2006
With us: 18 years 6 months

Post #10 by giavoltor » 20.05.2006, 08:58

chris wrote:Decreasing the size of the star vertex buffer could significantly hurt performance when many stars are being rendered. In general, batching primitives is very important for attaining good performance with 3D graphics hardware. I'm reluctant to add a hackish workaround such as a config file parameter because of a broken driver.


I would definitely appreciate it if the buffer size were made configurable in some way. That would spare me from patching Celestia every time a new update comes out. If the default value stays 2048, 99.9% of Celestia users will not notice any difference at all; but for the remaining 0.1% of us who run Linux and are stuck with an ATI card for some reason (for example, I cannot replace the Mobility Radeon 9000 that comes in my IBM ThinkPad T41), this parameter makes all the difference between an application that refuses to run and a great space simulator.

As for performance: the ATI binary Linux driver (if/when it works) performs drastically better than the X.org one. Although the open-source driver works, I can only reach a top frame rate of 5 (five) frames per second while looking at Earth with the Milky Way as a backdrop and Auto Magnitude set to maximum. With the proprietary drivers, after decreasing the buffer size to 64, Celestia no longer segfaults and the same scene reaches frame rates of at least 20 FPS. In both cases the scene is rendered with the textures bundled in the source tar.gz, i.e. without any add-ons.

Additionally, some performance is probably gained from the fact that compiling Celestia from source on Linux allows processor-specific and other optimizations; in my case, I compiled it explicitly for the Pentium M in my notebook.

t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 8 months
Location: Hamburg, Germany

Post #11 by t00fri » 20.05.2006, 11:44

Since I have been studying this Radeon 9200SE ATI driver problem for a while on one of my computers, I have also run a number of benchmarks comparing (well-configured and /recent/) X.org drivers against the latest ATI drivers. Note that the latest X.org drivers have improved a LOT. I get fps differences of around a factor of <=1.5, but certainly NOT a factor of 4! It could also be that your CPU is slow... mine is 2.8 GHz with 1 GB RAM.

My benchmarks also covered other OpenGL applications, and the overall statement holds there too.

Bye Fridger

