Major Artifacts in Celestia 1.3.1
I just upgraded to Celestia 1.3.1 (compiled from source, without GTK or KDE). The stars are causing severe artifacts... It's hard to explain...
Stars flicker/twinkle and get repainted directly in front of the camera as large triangles ... The next frame, they are back to normal... And the process repeats... Everything else seems to work fine, except for 100% CPU usage... The artifacting gets worse when I increase the number of stars...
I'll post screenshots of the problem if someone can find me webspace.
I am running Mandrake Linux kernel 2.4.19-16mdk with XFree86 4.3.0 and ATI FireGL 3.7.0 drivers. The videocard is an ATI Radeon 9600 made by FIC.
Confirmed with ATI 3.7.0 and 3.7.6. The effect is still present with the ATI 3.2.8 drivers, though it is lessened. It is only visible when the stars are set to fuzzy or disc. Approaching planets is OK for about 5 seconds, then they explode into gigantic cataclysmic pointy triangles of death. Thank God there's no collision detection.
I blame crap linux ATI drivers. I don't expect a fix.
A couple of suggestions ...
* Add the following line to the celestia.cfg file, under the existing "# Ignore" line (see the note after this list)...
Code: Select all
# IgnoreGLExtensions [ "GL_ARB_vertex_buffer_object" ]
* Try using the Ctrl+V keystroke combo to set the render path to Basic or Multitexture.
* Compile using the current CVS source code. Several display bugs have been fixed in 1.3.2 pre6, which is the current binary release.
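Regarding the IgnoreGLExtensions line above: as I understand it, Celestia simply treats any listed extension as if the driver never advertised it, so the renderer falls back to a code path that doesn't use it (plain vertex arrays instead of GL_ARB_vertex_buffer_object, in this case). Here is a rough sketch of the idea, with hypothetical names; this is not Celestia's actual code:
Code: Select all
#include <GL/gl.h>
#include <cstring>
#include <set>
#include <string>

// Extensions listed in celestia.cfg's IgnoreGLExtensions directive
// (hypothetical container; Celestia's real bookkeeping may differ).
static std::set<std::string> ignoredExtensions;

bool extensionSupported(const std::string& name)
{
    // Pretend a configured extension is absent, forcing the
    // renderer onto a safer fallback path.
    if (ignoredExtensions.count(name) != 0)
        return false;

    const char* exts = (const char*) glGetString(GL_EXTENSIONS);
    return exts != NULL && std::strstr(exts, name.c_str()) != NULL;
}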
-Don G.
My Celestia Scripting Resources page
Avatar: Total Lunar Eclipse from our back yard, Oct 2004. Panasonic FZ1 digital camera (no telescope), 36X digital zoom, 8 second exposure at f6.5.
selden wrote:However, don't forget to take out the # from in front of the "ignore" command. Otherwise it'll ignore you telling it to ignore that routine.
Duhhhh. Thanks Selden.
Mine is currently commented out while I test with recent CVS. I forgot to remove the comment indicator in the message.
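So, for anyone copying it from this thread, the active line should read:
Code: Select all
IgnoreGLExtensions [ "GL_ARB_vertex_buffer_object" ]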
-Don G.
My Celestia Scripting Resources page
Avatar: Total Lunar Eclipse from our back yard, Oct 2004. Panasonic FZ1 digital camera (no telescope), 36X digital zoom, 8 second exposure at f6.5.
Found the offending line
I found the offending line. Is this an ATI driver bug?
StarVertexBuffer::render() line 5335
Code: Select all
glDrawArrays(GL_QUADS, 0, nStars * 4);
ATI now has a place to submit driver feedback. If it is a bug, and enough people report it, they might fix it.
From the little I know, Chris added this feature for nVidia cards but has not yet determined how it needs to be coded (changed, etc.) for ATI and other cards. For ATI, I'm not sure if it's a bug, an omission, or whether it just requires different parameters. I'm sure Chris knows though -- or will find out.
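For what it's worth, a common way to avoid GL_QUADS on drivers that mishandle it is to submit the same four-vertex quads as indexed triangles. The following is only a sketch of that idea, not Chris's actual fix; the function name is made up, and it assumes the star vertex arrays are already bound:
Code: Select all
#include <GL/gl.h>
#include <vector>

// Sketch: draw nStars quads (4 vertices each, already set up via
// glVertexPointer and friends) as indexed triangles instead of GL_QUADS.
void renderStarQuadsAsTriangles(int nStars)
{
    std::vector<GLuint> indices;
    indices.reserve(nStars * 6);
    for (int i = 0; i < nStars; i++)
    {
        GLuint base = i * 4;
        // Split quad (0,1,2,3) into triangles (0,1,2) and (0,2,3).
        indices.push_back(base);
        indices.push_back(base + 1);
        indices.push_back(base + 2);
        indices.push_back(base);
        indices.push_back(base + 2);
        indices.push_back(base + 3);
    }
    glDrawElements(GL_TRIANGLES, (GLsizei) indices.size(),
                   GL_UNSIGNED_INT, &indices[0]);
}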
-Don G.
My Celestia Scripting Resources page
Avatar: Total Lunar Eclipse from our back yard, Oct 2004. Panasonic FZ1 digital camera (no telescope), 36X digital zoom, 8 second exposure at f6.5.
I am having the same problem, I think, with my Radeon 9600 and the 3.7.6 drivers for XFree86 4.3.0 and Celestia 1.3.0. However, I was also receiving an error/warning message when starting Celestia:
"celestia: libGL.so.1: no version information available"
(Or similar.) I have since solved this warning message (remotely), but have not been able to test the triangle problem again (I'm not at the console) to see if they are related, as I had previously suspected. I solved the warning message by recompiling Celestia AFTER installing the ATI GL libraries.
So, I don't know whether they are related, but I will report back after I test it, in case nobody beats me to it.
Jay
"celestia: libGL.so.1: no version information available"
(Or similar) I have since solved this warning message (remotely) but have not been able to test (i'm not at the console) the triangle problem again to see if they are related as I had previously suspected. I solved the warning message by recompiling Celestia AFTER installing the ATI GL libraries.
So, I don't know whether they are related but I will report back after I test it in case nobody beats me to it
Jay
Just to report back in on my testing...
First, fixing the libGL.so message doesn't make the ugly triangles go away, but it seems to improve things a little overall, including general performance. Apparently, Celestia can run without whatever it was looking for, but runs better when it finds it. No surprise there, really.
Next, the triangle problem is less annoying when you make the view window very large; at least the center of the window is usually pretty clean then.
Also, the Basic/Multitexture Render option has no noticeable effect on the triangles.
I haven't tried the configuration option suggested by Selden, I will do that soon and report back, since I don't see anyone else piping up about it.
And finally... I can also confirm that swapping out the ATI card for a GeForce FX 5200 cures the problem, not that anyone has any doubt.
Cheers,
Jay
Are you using Celestia 1.3.2 pre6 or pre7? Several display bugs were fixed in these newer versions.
-Don G.
My Celestia Scripting Resources page
Avatar: Total Lunar Eclipse from our back yard, Oct 2004. Panasonic FZ1 digital camera (no telescope), 36X digital zoom, 8 second exposure at f6.5.
don wrote:Are you using Celestia 1.3.2 pre6 or pre7? Several display bugs were fixed in these newer versions.
Not yet. I've got 1.3.0. I did try the config options, and they seem to have no effect on the "triangles of death", but I will try 1.3.2 pre7 as soon as I can (sometime this week) and report back.
Again, this only happens with my Radeon 9600, not my GeForce FX 5200.
Thanks again and stay tuned
Hello. I found it here
bmg300 wrote:Where can I download a prerelease binary? All I see is 1.3.1 on sourceforge.
http://www.spacesim.net/prerelease/celestia-win32-1.3.2pre7.exe
jeam
Catalogue des ajouts / Catalog for the Add-Ons in French
...PAGES LOST, SORRY
The bug still exists in the latest CVS.
The line of code that causes the problem appears to be src/celengine/render.cpp:5843
Code: Select all
glDrawArrays(GL_QUADS, 0, nStars * 4);
Of course I cannot confirm this, seeing that I know little about OpenGL, and commenting out that line has the same effect as disabling non-point stars anyway.
bmg300 wrote:The bug still exists in the latest CVS.
The line of code that causes the problem appears to be src/celengine/render.cpp:5843
Code: Select all
glDrawArrays(GL_QUADS, 0, nStars * 4);
Of course I cannot confirm this, seeing that I know little about OpenGL, and commenting out that line has the same effect as disabling non-point stars anyway.
I must be doing something wrong... I couldn't get CVS to grab the source, and the latest source available on SourceForge is 1.3.1. That Windows binary is nice, but I'm seeing the problem under Linux... If someone can point me at some source, I'd be happy to build it and test this problem, and I'd probably even tar up the binaries and sign them if anyone else wants them.
The fact that bmg300 reports the bug is still in CVS is a bit of a discouragement, because it implies that they may have built the code and seen it still not working. Of course, the possibility remains that the bug belongs to ATI...
Jay