Chris Please Read This!

General discussion about Celestia that doesn't fit into other forums.
Topic author
Don. Edwards
Posts: 1510
Joined: 07.09.2002
Age: 59
With us: 22 years 2 months
Location: Albany, Oregon

Post #21 by Don. Edwards » 20.03.2004, 01:51

Hey everyone,
Wow, I really opened a can of worms on this one. 8O
We have gone from what the "Compress Texture True" switch really does, to what is best to use for our textures, to how the colors are affected by the conversion and compression process.
Now, from what I have been reading so far, the only way to truly know whether we are losing color quality is to use a professional screen color calibrator and color-checking software. That means we need a graphic arts pro with the equipment to try this for us and put all this to bed once and for all. Hey, when in doubt, do what the pros would do to check the color values.
Anyone know someone with the equipment and the software needed? :wink:
I don't off hand.

Don. Edwards
I am officially a retired member.
I might answer a PM or a post if it's relevant to something.

Ah, never say never!!
Past texture releases? Hmm, let me think about it.

Thanks for your understanding.

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 22 years 9 months
Location: Altair

Post #22 by Rassilon » 20.03.2004, 02:05

I believe the loss of bit depth is present in all aspects of Celestia's rendering... A while back I used a television to display Celestia and noticed a significant loss in bit depth compared to other programs displayed on the same television... At first I concluded that it was the telly doing it, but after reading some of Selden's conclusions on textures I am inclined to believe it's in the way Celestia renders its graphics... It may be that Celestia is not in a true 24- or 32-bit rendering mode but in 16- or 8-bit...
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh, and they don't know how to spell it!

DBrady
Posts: 66
Joined: 14.07.2003
With us: 21 years 4 months
Location: Sydney

Post #23 by DBrady » 20.03.2004, 12:40

Hi Selden,
Your results are very interesting. It might be worthwhile repeating them in Anim8or, both in its OpenGL environment and with its software renderer. This may narrow the problem down to OpenGL in general rather than Celestia. You might also consider using 'GLDirect' to run Celestia through DirectX drivers and repeat the experiments. That may glean some further understanding of where the problem lies.

http://www.scitechsoft.com/products/ent/gld_home.php
Slan

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #24 by selden » 20.03.2004, 13:40

Maxim,

I'll admit I don't know the details of how Celestia's Windows screen dump function works. However, I also used Windows' Ctrl-PrintScreen to grab a copy of Celestia's full-screen display. A histogram of its colors showed exactly the same color reduction seen in Celestia's own screen snapshot.

No such color reduction happened when I did a Ctrl-PrintScreen of Windows' Picture Viewer displaying the png texture image file.

In other words, I'm pretty sure the color reduction is "real" and is happening in Celestia.

DBrady,

Anim8or does use OpenGL for its GUI, although not for its software rendering, of course. The color reduction in its 3D GUI display seems to be quite different from Celestia's. This suggests to me that at least some of the color reduction is happening in Celestia, and is not entirely due to OpenGL. I think there is reason to hope that Celestia's onscreen colors can be improved.

[Image: histogram showing Anim8or's color reduction]
Ctrl-PrintScreen was used to do a screen-grab that was saved in BMP format. The edges of the image were cropped to remove Anim8or's control buttons.
Selden

jim
Posts: 378
Joined: 14.01.2003
With us: 21 years 10 months
Location: Germany

Post #25 by jim » 20.03.2004, 20:19

Hi Selden,

There must be a bug in your colortest DXT histograms. All of the colortest DXT histograms should be identical. Only within Celestia should there be a difference between DXT1c and DXT3/5, and only on NVIDIA graphics cards! Can you check this?

I notice a lot of confusion here about the Celestia histograms. I will prepare a post (tomorrow) to explain some things. In general I can say that these histograms show that everything works correctly in Celestia. They further show that the color reduction of DXT-compressed textures is compensated by the true-color rendering in Celestia. :-)

I think most people see only all these ugly peaks without knowing the reason for them. ;-)

Bye Jens

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #26 by selden » 20.03.2004, 21:45

Jens,

I'm not sure what you're trying to say.

Please look at the Web page where I compared both the input dds and output Celestia colors.

The histograms show the number of times a particular value was seen in each of the red, green and blue channels. The red channel counts are colored red, counts of blue values are colored blue, and values seen in the green channel are colored green. The original histograms are 256 pixels wide: one bin per pixel, one bin for each of the 256 possible values that an 8-bit channel can have. (I expanded them 4x for visibility.)
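
For anyone who wants to count channel values the same way, here is a minimal sketch (it uses Pillow rather than the tool that produced these histograms, and "screenshot.png" is a hypothetical filename):

```python
# Count how often each 0..255 value occurs in each RGB channel.
# Pillow-based sketch; the filename is hypothetical.
from PIL import Image

img = Image.open("screenshot.png").convert("RGB")
for name, channel in zip(("red", "green", "blue"), img.split()):
    counts = channel.histogram()   # 256 bins, one per possible 8-bit value
    peak = max(range(256), key=counts.__getitem__)
    print(f"{name}: value {peak} occurs {counts[peak]} times")
```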

The dxt3 and dxt5 image files created by nvdxt are identical, but they are not the same as the dxt1c image. This is easily seen in the file sizes, in the color histograms, and by doing a pixel-by-pixel comparison of the images after they've been expanded into a non-compressed format (BMP in this case).

[Image: difference of the expanded DXT1c and DXT3 images (links to a full-size 2K JPEG)]

The image above was created by subtracting the expanded dxt1c image from the expanded dxt3 image. Both dds texture files were created from the same TGA file. Where the image is black, the expanded images were identical. It's colored where they were different. You can't see it in the thumbnail, but if you look at the 2K image, you'll see dots scattered throughout the image where the two dds files were different. (I used IrfanView to create the expanded BMP versions of the dds texture files.)
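
The same pixel-by-pixel comparison can be sketched in a few lines (a Pillow-based illustration with hypothetical filenames, not how the images above were actually made):

```python
# Subtract one expanded image from the other; black means identical.
from PIL import Image, ImageChops

a = Image.open("dxt1c_expanded.bmp").convert("RGB")   # hypothetical filenames
b = Image.open("dxt3_expanded.bmp").convert("RGB")
diff = ImageChops.difference(a, b)
print("identical" if diff.getbbox() is None else "the images differ")
diff.save("difference.png")   # non-black dots mark the differing pixels
```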

[Image: difference between the re-expanded DXT3 image and the original TGA image (thumbnail only; the 3 MB full-size image isn't needed to see the differences)]

Above is the difference between the re-expanded dxt3 image and the 2K TGA image it was created from. As you can see, the pictures are not the same: the difference is not black. This is what is meant by saying the dxt3 compression algorithm is lossy: it loses information that was present in the original image.

Celestia cannot restore the original colors (that were in the TGA file) that the dds file doesn't contain. The histogram of the colors of the on-screen display shows that it really doesn't try. That's what the spikes mean: they show that there are many pixels on the screen which have exactly the same color. The low values between those peaks show that there were relatively few pixels with colors in between the colors indicated by the peaks.

I'm not so much concerned about how well Celestia reproduces the colors that are in the lossy DXT texture files. What does bother me is that it does not accurately reproduce the colors that are in the lossless U888 and PNG texture files.

If Celestia's on-screen image had the same coloration as the original U888 or PNG image, there would be no such peaks. The counts of the on-screen colors would vary with smooth curves almost identical to the curves shown for the original texture files. Those curves wouldn't be exactly the same, though, because Celestia's window shows only a portion of the texture file.

The spiky histograms suggest that the differences in the colors shown by Celestia are comparable to the differences between the original TGA image and the DXT3 version of that image.
Selden

Harry
Posts: 559
Joined: 05.09.2003
With us: 21 years 2 months
Location: Germany

Post #27 by Harry » 21.03.2004, 11:58

Couldn't the spikes in the histograms be just a result of the normal lighting taking place?

E.g. if you start with an image containing pixel values in the range [0:255] and Celestia maps this to the range [20:230], some close values in the original range will be mapped to the same value in the final range, doubling the number of pixels with that value and thus producing spikes in the histogram. I hope you understand what I am trying to say ;)
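
A quick numeric check of that idea (an illustration with made-up numbers, not a Celestia measurement):

```python
# Squeeze the full 8-bit range [0,255] into [20,230] and round back to
# integers, as in the hypothetical remapping described above.
mapped = [round(20 + v * (230 - 20) / 255) for v in range(256)]
collisions = 256 - len(set(mapped))
print(f"{collisions} input values collide with a neighbour")  # collisions > 0 mean spikes
```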

Harald

jim
Posts: 378
Joined: 14.01.2003
With us: 21 years 10 months
Location: Germany

Post #28 by jim » 21.03.2004, 14:30

Hi Selden,

First, I know how a histogram works. ;-)

selden wrote: The dxt3 and dxt5 image files created by nvdxt are identical, but they are not the same as the dxt1c image. This is easily seen in the file sizes, in the color histograms, and by doing a pixel-by-pixel comparison of the images after they've been expanded into a non-compressed format (BMP in this case).

Selden, believe me, the only difference between DXT1c, DXT3 and DXT5 is the alpha channel. All these formats use exactly the same compression and need exactly 64 bits per 4x4 block for the RGB part of the texture. The alpha channel of DXT3/5 needs an additional 64 bits per 4x4 block, and therefore these formats are double the size of DXT1c.
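
A quick size check of that block arithmetic (the 2048x1024 dimensions are just an example):

```python
# DXT1c: 64 bits per 4x4 block; DXT3/5: an extra 64-bit alpha block.
def dxt_size_bytes(width, height, has_alpha_block):
    blocks = (width // 4) * (height // 4)
    bits_per_block = 64 + (64 if has_alpha_block else 0)
    return blocks * bits_per_block // 8

w, h = 2048, 1024
print("DXT1c: ", dxt_size_bytes(w, h, False), "bytes")  # 1048576
print("DXT3/5:", dxt_size_bytes(w, h, True), "bytes")   # 2097152, double
```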

Therefore these histograms must look identical:
[Images: the three colortest DXT histograms]
I made my own test that confirms this.

[Image: difference image showing the DXT compression error]
Now, this picture showing the error of DXT compression looks quite dramatic. But if you compare the compressed image with the original by eye, it's hard to see a difference. Let me try a little comparison: a man stands beside a loudspeaker in a disco and you stand 10 meters (30 feet) away. As long as the music is on you can't hear the man speak, but if the DJ turns the music off you can hear him very clearly. What I'm trying to explain is that a disturbance is unimportant as long as the main signal is strong enough to mask it. Or how about this: we can look at the blue sky during the day, but as long as the Sun is visible we can't see the stars, even though of course they are really there. ;-)
Selden, it's really hard to explain.

selden wrote: Celestia cannot restore the original colors (that were in the TGA file) that the dds file doesn't contain. The histogram of the colors of the on-screen display shows that it really doesn't try. That's what the spikes mean: they show that there are many pixels on the screen which have exactly the same color. The low values between those peaks show that there were relatively few pixels with colors in between the colors indicated by the peaks.

What you say is absolutely correct.

selden wrote: I'm not so much concerned about how well Celestia reproduces the colors that are in the lossy DXT texture files. What does bother me is that it does not accurately reproduce the colors that are in the lossless U888 and PNG texture files.

Selden, the question is: what is the reason for these peaks? It's really very simple. :-)
I've built a test image that has this histogram. Photoshop doesn't have such a nice histogram tool, therefore I show only the red channel.
[Image: red-channel histogram of the test image]

Now, after reducing the contrast by about 5%, it looks like this:
[Image: red-channel histogram after a 5% contrast reduction]

And if the contrast is reduced by about 10%:
[Image: red-channel histogram after a 10% contrast reduction]
Any idea how it would look with 20% or 40% contrast reduction?

But we can also increase the contrast by about 10%, and now get little gaps and two peaks at the edges:
[Image: red-channel histogram after a 10% contrast increase]

So, any idea where the peaks come from? To reduce the contrast and brightness of an image by about 10%, each channel of the image is multiplied by 0.9. But this produces a lot of fractional numbers which must be rounded back to 8 bits per channel, and this rounding causes the peaks. ;-)
Now I think it's clear that a lower contrast causes a loss of color information.
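
The rounding effect is easy to reproduce (a sketch that starts from an idealised flat histogram rather than the test image above):

```python
# Multiply every 8-bit value by 0.9 and round back to an integer.
from collections import Counter

hist = Counter(round(v * 0.9) for v in range(256))
print(len(hist), "distinct values remain out of 256")                  # gaps
print(sum(1 for n in hist.values() if n > 1), "values counted twice")  # spikes
```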

OK, I can already see the next question: why does Celestia reduce the contrast and brightness?
First, Celestia does not simply display textures; Celestia renders 3D objects with textures. That's a big difference! In order to show all those nice effects like reflections and bump mapping, it's necessary to reduce the contrast and brightness of all textures a bit. Otherwise it would not be possible to show these effects on a white texture!

I hope I could bring a bit of light to this topic?

Jens

Kolano

Post #29 by Kolano » 21.03.2004, 18:36

Might this be a good reason to add High Dynamic Range support to Celestia?

granthutchison
Developer
Posts: 1863
Joined: 21.11.2002
With us: 21 years 11 months

Post #30 by granthutchison » 21.03.2004, 19:48

It would be interesting to compare the histogram for an object in Celestia with the histogram for the same object with Emissive true set. This might illuminate some of the contrast issues Jens mentions.

Grant

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 9 months
Location: Seattle, Washington, USA

Post #31 by chris » 21.03.2004, 20:39

Celestia doesn't do any contrast adjustment of textures. For planet textures, the only calculation applied to texel values is:

T * (diff + ambient) + spec

T is the input texture color, diff is the diffuse lighting term (cosine of the angle between the light direction and surface normal), and spec is the specular lighting term (just zero for most planets.)
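
A minimal sketch of that calculation (the ambient and texel values below are illustrative, not Celestia's):

```python
# T * (diff + ambient) + spec, applied per channel and clamped for display.
def shade(texel, cos_angle, ambient=0.1, spec=0.0):
    diff = max(0.0, cos_angle)   # cosine of angle between light and normal
    return tuple(min(1.0, t * (diff + ambient) + spec) for t in texel)

print(shade((0.8, 0.6, 0.4), 1.0))   # a texel lit face-on
```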

When emissive is set to true, no lighting calculation is performed, and texel values are only modified by bilinear interpolation and mipmapping.

I don't know what the source of Selden's strange histograms might be. On my machine, I can't see any noticeable color quantization of deep sky object textures, and a histogram I generated for a screen shot of the Rosette Nebula doesn't hint at any quantization either.

One possibility is that the OpenGL driver is automatically converting textures to 16-bit. I'd be interested in seeing results from different machines.
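
A sketch of what such a conversion would do to the histograms (an illustration of the hypothesis, not a driver test):

```python
# Round-trip an 8-bit channel through 5 bits, as an R5G6B5 16-bit texture
# format does for red and blue (green keeps 6 bits).
def through_n_bits(value, bits):
    levels = (1 << bits) - 1
    return round(round(value / 255 * levels) / levels * 255)

survivors = {through_n_bits(v, 5) for v in range(256)}
print(len(survivors), "distinct red/blue values survive")  # 32: a spiky histogram
```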


--Chris

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #32 by selden » 21.03.2004, 20:55

While the DXT1 compression algorithm may be supposed to be the same as the one used for DXT3 and DXT5, and it does produce the same results for most of the texture, it certainly does not do so everywhere. If the algorithms were exactly the same, taking the difference between images created by them would produce a result of 0 (black) everywhere. It doesn't, as shown above. I could be persuaded that the differences are due to bugs in nvdxt.

I have to admit that I am not at all used to changes in contrast that do such severe damage to the colors.

As a result, I would greatly appreciate a way to specify to Celestia exactly how much contrast reduction (or enhancement) should be applied.

[Image: color histogram of the same model and PNG texture specified as an "Emissive true" SSC object]

(The same distribution of colors is seen when the model is face-on to the sun with "Emissive true" omitted.)

Since I manually positioned the viewpoint, there probably are slight differences from previous histograms, which were of the object as a Deep Space Nebula.
Selden

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #33 by selden » 21.03.2004, 21:00

Chris,

What software did you use to generate your color histograms?

I've been using pnmhistmap, which is available on SourceForge as part of the NetPBM package.

I get the same results for both my Ti4200 at home and FX5200 at work.

As best I can tell, their color settings are set to the defaults: brightness, contrast and gamma are right in the middle, while "image settings" are set for "quality".

Added slightly later: both are running 1600x1200 with 32bit color.
Selden

jim
Posts: 378
Joined: 14.01.2003
With us: 21 years 10 months
Location: Germany

Post #34 by jim » 21.03.2004, 22:27

chris wrote: I don't know what the source of Selden's strange histograms might be. On my machine, I can't see any noticeable color quantization of deep sky object textures, and a histogram I generated for a screen shot of the Rosette Nebula doesn't hint at any quantization either.


Chris, you are right. I've now made a little test and came to the same result. There is no color change on DSOs in Celestia. The histograms of the texture and of the Celestia shot are identical on my system. It seems something is wrong with Selden's system.

My system: GeForce3, 1024x768x32, no AA, 4x AF, quality, 32-bit textures

Bye Jens

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 9 months
Location: Seattle, Washington, USA

Post #35 by chris » 22.03.2004, 05:48

Selden,

Would you take a look at something in your display settings control panel? Click the Advanced button, select the GeForce tab, and then choose the OpenGL settings from the flyout menu. There should be a list box for selecting the default color depth for textures. I have it set to use the desktop color depth. Is yours set to 16-bit color? You may want to experiment with forcing 32-bit color.

--Chris

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #36 by selden » 22.03.2004, 12:51

There are no explicit OpenGL or DirectX entries on the flyout tab of the new drivers, v56.64, which is what I'm using at home with my Ti4200. I'll try it with the older drivers that are on the system at work with the FX5200.

[Image: Display Advanced Properties flyout tab]

However, I just discovered that there are no spikes in the color histogram if I save the screen image in JPEG format. They only appear if I save it in PNG format. I don't know if the JPEG format is smoothing them out or if there's a bug in the PNG routines. :(
Selden

Avatar
selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #37 by selden » 22.03.2004, 15:22

My system at work, with drivers v53.03, was set to "use desktop depth". I have my desktop set to 32 bits, but I changed the setting to "always use 32 bits" anyhow.

It made no difference.

I also used XP's Ctrl-PrintScreen to verify that the problem isn't related to Celestia's internal savescreen routines. I pasted the image into XP's Paint program and saved it in BMP, PNG and JPG formats. The BMP and PNG images show the spikes. The JPG does not.

However, I did verify that JPEG does indeed spread pure colors into a range of colors. That would disguise the problem.
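
That spreading is easy to demonstrate (an illustration with Pillow, not the actual screenshots):

```python
# JPEG's lossy encoding smears a hard two-color edge into many
# intermediate colors, which would fill in the histogram's gaps.
from PIL import Image

img = Image.new("RGB", (64, 64), (200, 30, 30))
img.paste((30, 30, 200), (0, 0, 32, 64))   # sharp vertical edge
img.save("edge.jpg", quality=85)
reloaded = Image.open("edge.jpg").convert("RGB")
print(len(set(img.getdata())), "colors before,",
      len(set(reloaded.getdata())), "colors after the JPEG round-trip")
```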

*sigh*
Selden

Harry
Posts: 559
Joined: 05.09.2003
With us: 21 years 2 months
Location: Germany

Post #38 by Harry » 22.03.2004, 17:32

Selden,
Comparing the histograms on your site, I noticed they indeed show different color ranges - the peaks in the Celestia screen grabs are shifted to the left, and the maximum intensity seems to be about 225 instead of 255 for the original image. So I guess we really should be looking for a cause of lowered contrast, and not a wrong texture color depth or problems with saving/loading images.

Could this be because of the specific model you used? Someone mentioned that setting emissive may change things; did you try that?

Harald

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #39 by selden » 22.03.2004, 17:42

Harry,

I first noticed the problem with a DSC Nebula object, which is what I've been concentrating on. As I understand it, all Nebula objects are drawn as if they had "Emissive true" specified. My one test with an SSC object with "Emissive true" showed the same problem (see above). It had a substantial reduction in the number of colors drawn on-screen: that's what the spikes are telling us.

I'm certainly not doing anything intentionally that would modify the contrast. The color correction graph in the Display Advanced Properties shows a straight line (see above).
Selden

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 9 months
Location: Seattle, Washington, USA

Post #40 by chris » 22.03.2004, 18:59

There's nothing that Celestia could be doing to reduce the contrast. I think you're just seeing the result of alpha blending, where the texture color is multiplied by the opacity. If the texture were completely opaque, the maximum intensity would be 255.
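
A sketch of that blending (illustrative numbers):

```python
# Texture color times opacity, composited over a black background.
def blend(src, alpha, dst=(0, 0, 0)):
    return tuple(round(s * alpha + d * (1 - alpha)) for s, d in zip(src, dst))

print(blend((255, 255, 255), 0.9))   # (230, 230, 230): white never reaches 255
```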

--Chris

