Chris Please Read This!

General discussion about Celestia that doesn't fit into other forums.
Topic author
Don. Edwards
Posts: 1510
Joined: 07.09.2002
Age: 59
With us: 21 years 9 months
Location: Albany, Oregon

Chris Please Read This!

Post #1 by Don. Edwards » 17.03.2004, 07:50

Chris,
Bear with me on this, please. I think this is going to hit home for some Celestia users.
There have been some questions going around about how Celestia is handling textures.
I may have put my foot in my mouth on a few things, but I think I and others could use some clarification on the subject.
The main questions are: do you know at what level NVIDIA video cards start to use DXT compression on textures? Does it kick in on its own, or does it have to be activated with the "CompressTexture true" line in our solarsys.ssc?
I have been going over some older info and I think this is how things are supposed to work. I will use the .PNG texture format as the standard image format.
So let's say we are using an older 32MB GeForce card. We want to use a 4k .png that is 24MB in size, along with a 1MB spec map, a 2MB bump map, and a 5MB cloud map, all for Earth. This adds up to 32MB of textures. Of course this isn't counting the other textures that are loaded, like the Sun, galaxies, etc.
Now, if I remember right, this should crash Celestia when we start it, because the amount of texture data is too much for the memory on the card.
Now if we go into the solarsys.ssc and add the line "CompressTexture true" under the main texture for Earth, this turns on texture compression on the fly, and now we can squeeze these textures into use.
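
For example, with placeholder filenames, the relevant fragment of a solarsys.ssc definition would look something like this (a sketch only; the orbit, radius, and the rest of the entry are omitted):

Code:

"Earth" "Sol"
{
    Texture "earth-4k.png"   # the base texture
    CompressTexture true     # compress it on the fly at load time
    BumpMap "earth-bump.png"
    # ... orbit, radius, atmosphere, etc. ...
}
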
This is all pretty much common knowledge amongst the Celestia old timers.
Now we move up to a more modern video card, say a GeForce4 Ti 4200 with 64MB of VRAM.
But now the question is: if someone is using the "CompressTexture true" switch on larger .png or .jpg files, say at the 8k or even 16k level, instead of the .dds format, and the textures are being compressed on the fly for the video card, are they losing visual quality? And if they are, what then is the difference between these compressed-at-load images and ones pre-compressed into the .dds format with the NVDXT compression tool? Also, is the compression being used here of the DXT format? If "CompressTexture true" does in fact enact this form of compression, then logically it would make sense to just use .dds textures from the beginning.
I have a feeling there are quite a few Celestia users who use the "CompressTexture true" switch and do not realize that they are just converting their nice 8k .png file into an 8k .dds texture by the time it gets to the screen. I think there are a lot of Celestia users who still use this switch in their .ssc files and have no idea how it really works. So in the end the question is: what is really going on behind the scenes with all these textures? Is the "CompressTexture true" switch an on-the-fly DXT compressor? And if it is, what is the difference between its output and what we would see just using a plain old .dds instead? If it is, I think there are going to be some very surprised Celestia users out there.
Hopefully you can understand where I am going with this.

Don. Edwards
I am officially a retired member.
I might answer a PM or a post if its relevant to something.

Ah, never say never!!
Past texture releases, Hmm let me think about it

Thanks for your understanding.

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 22 years 5 months
Location: Altair

Post #2 by Rassilon » 17.03.2004, 13:51

Don,

I think it may be a form of OpenGL mipmapping used as a compression utility in Celestia... but I could be wrong... I think this is similar to DDS compression, but not quite the same...

Here's an article on mipmapping:

http://www.codeguru.com/Cpp/G-M/opengl/texturemapping/article.php/c5589/
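
For reference, a mipmap chain in plain OpenGL is usually built with a call like the following (illustrative only, not Celestia's actual code). Note that mipmaps add roughly a third to a texture's memory footprint, so by themselves they are a filtering aid rather than a compression scheme:

Code:

// Build and upload a full mipmap chain for an RGB texture using GLU.
// This costs about 4/3 of the base texture's memory -- nothing here
// compresses the pixel data.
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, width, height,
                  GL_RGB, GL_UNSIGNED_BYTE, pixels);
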
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh and they don't know how to spell it!

jim
Posts: 378
Joined: 14.01.2003
With us: 21 years 5 months
Location: Germany

Post #3 by jim » 17.03.2004, 21:39

Hi Don,

The 'CompressTexture true' switch is a mystery to me. I think it has no function. I can definitely say that it does not do an "on the fly" DDS compression. How do I know this? There is a very simple test. DXT texture compression has big problems compressing colorful noise. Build a texture with such noise and save it to PNG. As an alternative, save this map as DXT1 or DXT3 and reopen it to see the difference. Use the PNG version in Celestia and you will see that no texture compression happens.
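
If you want to try the test yourself, here is a minimal sketch of a noise-image generator. It writes a binary PPM, which any image tool can convert to PNG; the filename and size are arbitrary choices of mine:

Code:

// noise.cpp -- write a 512x512 image of random colors to noise.ppm.
// DXT compressors handle this kind of content very badly, which makes
// it a good probe for hidden texture compression.
#include <cstdio>
#include <cstdlib>

int main()
{
    const int w = 512, h = 512;
    std::FILE* f = std::fopen("noise.ppm", "wb");
    if (!f)
        return 1;
    std::fprintf(f, "P6\n%d %d\n255\n", w, h);  // binary PPM header
    for (int i = 0; i < w * h * 3; i++)
        std::fputc(std::rand() & 0xff, f);      // one random byte per channel
    std::fclose(f);
    return 0;
}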

As far as I know, Celestia does the following things when opening a JPG/PNG/BMP texture:
- if the texture is a bump map, it builds a normal map on the fly (this is the reason why DXT texture compression does not work with bump maps!)
- it writes the texture into RAM as 32-bit uncompressed data (unless the driver setting for textures is 16-bit)
- it builds all mipmaps
- it cuts the texture into tiles that can be used by your graphics card (internally, Celestia has known about "virtual textures" for a long time).

That means a 4k x 2k JPG/PNG/BMP texture needs 42.7MB of RAM (32-bit, uncompressed, with mipmaps).
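
The arithmetic behind that figure, as a small sketch (the helper name is mine, not Celestia's):

Code:

#include <cstddef>

// Approximate RAM for an uncompressed 32-bit texture plus its mipmap
// chain; the chain adds about one third of the base size.
std::size_t textureBytes(std::size_t w, std::size_t h)
{
    std::size_t base = w * h * 4;  // 4 bytes per texel
    return base * 4 / 3;           // + mipmaps
}
// textureBytes(4096, 2048) -> 44,739,242 bytes, i.e. about 42.7MB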

I hope this helps.

Bye Jens

selden
Developer
Posts: 10190
Joined: 04.09.2002
With us: 21 years 9 months
Location: NY, USA

Post #4 by selden » 17.03.2004, 22:00

One point: the cutting up of textures for use with smaller texture buffers does not seem to be happening for Nebula objects. I dunno if this is a bug or just not implemented.

Specifically, Don G. is unable to use 4K Nebula textures with his ATI card.

Note: At the moment, VTs only work for Celestia's builtin spheres, so they can't be used for 3DS models used for Nebulas.
Selden

jim
Posts: 378
Joined: 14.01.2003
With us: 21 years 5 months
Location: Germany

Post #5 by jim » 17.03.2004, 22:16

selden wrote: One point: the cutting up of textures for use with smaller texture buffers does not seem to be happening for Nebula objects. I dunno if this is a bug or just not implemented.

Specifically, Don G. is unable to use 4K Nebula textures with his ATI card.

Note: At the moment, VTs only work for Celestia's builtin spheres, so they can't be used for 3DS models used for Nebulas.


Selden, you are right. I forgot to mention that this is the way Celestia maps a sphere. Nebula objects are 3D models, and there are several limitations for textured 3DS/CMS models. I think this problem will eventually be solved with the CMOD format.

Bye Jens

Topic author
Don. Edwards
Posts: 1510
Joined: 07.09.2002
Age: 59
With us: 21 years 9 months
Location: Albany, Oregon

Post #6 by Don. Edwards » 17.03.2004, 23:10

jim,
The "Compress Texture True" switch does work. How it is working at this point I don't know. Back when I first started using Celestia I had a lowly GeForce2 MX200 with only 32MB of SD VRAM. I had problems with using multiple textures on the Earth or even back when I was making my first Teraformed Mars textures. Celestia would either 1. Crash on load-up of the textures or 2. Would run so slow that you couldn't navigate around much at all. But using the "Compress Texture True" line made a big difference. Celeatia wouldn't crash with the use of multiple textures and navigating around was at least at tolerable frame rate. If it didn't I probably would have given up on Celestia all together. So there has to be a reason for the line to be added and it must be doing something in the background, some kind of texture compresion has be going on. I just would like to in what form it takes and does it effect the visual quality of those textures in what ever form, .jpg or .png, are being used. Just to say it doesn't have a working function is the wrong way to answer the question. As many of us now it does in fact have some kind of working function.

Don. Edwards
Last edited by Don. Edwards on 17.03.2004, 23:18, edited 1 time in total.
I am officially a retired member.
I might answer a PM or a post if its relevant to something.

Ah, never say never!!
Past texture releases, Hmm let me think about it

Thanks for your understanding.

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 5 months
Location: Seattle, Washington, USA

Post #7 by chris » 17.03.2004, 23:15

Setting CompressTexture to true will cause the texture to be compressed to DXTn format at load time. For large textures, this can be quite time-consuming and memory intensive. The algorithms used by the OpenGL driver for compressing textures are designed for speed, not quality. You're much better off compressing the textures ahead of time. The other disadvantage of CompressTexture is that it only applies to the base texture, not light maps, separate specular maps, or anything else. The easy rule of thumb is: don't use CompressTexture.
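
In plain OpenGL terms the two paths differ roughly like this (a generic sketch, not Celestia's actual code; width, height, pixels, imageSize, and ddsBlocks are placeholders):

Code:

// Load-time compression, the path CompressTexture enables: the driver is
// handed raw pixels plus a compressed internal format, and its fast,
// lower-quality DXT encoder runs during the upload.
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
             width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);

// Precompressed .dds: the DXT blocks were produced offline (e.g. by a
// slower, higher-quality tool such as NVDXT) and are uploaded unchanged.
glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                       width, height, 0, imageSize, ddsBlocks);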

--Chris

Topic author
Don. Edwards
Posts: 1510
Joined: 07.09.2002
Age: 59
With us: 21 years 9 months
Location: Albany, Oregon

Post #8 by Don. Edwards » 17.03.2004, 23:32

Chris,
Thanks for the answer. That is what I thought was being done all along with the switch enabled. Now we know for sure what is going on. If the "CompressTexture true" switch is used, you are telling Celestia to convert your .JPG or .PNG texture on the fly into a pseudo .DDS texture of questionable quality. 8O
So in essence we are better off using true .DDS textures instead of the "CompressTexture true" switch. That is good to know. :wink:
It is also helpful for those who might be using the switch to load a big .PNG or .JPG texture to know that they would in fact be better off using the .DDS texture format, if their video card supports it. Of course, those of you with video cards that have 256MB of VRAM need not apply. :wink:
I will be sticking to high quality .DDS textures personally.

Don. Edwards
I am officially a retired member.
I might answer a PM or a post if its relevant to something.

Ah, never say never!!
Past texture releases, Hmm let me think about it

Thanks for your understanding.

Darkmiss
Posts: 1059
Joined: 20.08.2002
With us: 21 years 10 months
Location: London, England

Post #9 by Darkmiss » 18.03.2004, 00:02

Every texture I have has been converted to high quality DDS files,
except for ring textures, which are in PNG.
CPU- Intel Pentium Core 2 Quad ,2.40GHz
RAM- 2Gb 1066MHz DDR2
Motherboard- Gigabyte P35 DQ6
Video Card- Nvidia GeForce 8800 GTS + 640Mb
Hard Drives- 2 SATA Raptor 10000rpm 150GB
OS- Windows Vista Home Premium 32

bh
Posts: 1547
Joined: 17.12.2002
With us: 21 years 6 months
Location: Oxford, England

Post #10 by bh » 18.03.2004, 01:12

Blimey!...another ESB please landlord!

selden
Developer
Posts: 10190
Joined: 04.09.2002
With us: 21 years 9 months
Location: NY, USA

Post #11 by selden » 18.03.2004, 03:08

I decided to try to find out what's happening with the colors for the various surface texture image file types.

See http://www.lns.cornell.edu/~seb/celestia/colortest.html for the disheartening results.

To put it bluntly, the colors are being trashed. I'm not sure where it's happening, but many colors are being lost. My guess is that it's due to the card's internal compressed color format. I was surprised that the default compression used for dxt5 produces essentially the same results as dxt1, jpeg and png. dxt3 is slightly better than the others.

But don't take my word for it.
Run your own tests.
I spelled out the steps on the Web page.

Note that the precise details of the color images used probably don't matter. The color histograms show spikes in the screendumps that are not in the original image. This characteristic damage cannot be missed.

sigh
Selden

TERRIER
Posts: 717
Joined: 29.04.2003
With us: 21 years 2 months
Location: West Yorkshire, England

Post #12 by TERRIER » 18.03.2004, 10:52

bh wrote: Blimey!...another ESB please landlord!

bh

You don't happen to have an Intel Centrino powered laptop, do you? :wink:

regards
TERRIER

PS
If this was me, I would be asking for....another Landlord* please ESB!
(providing the landlord's name was something like Edward Stuart Bloggs.)

*Timothy Taylor's Landlord is a very nice pint. Cheers!
1.6.0:AMDAth1.2GHz 1GbDDR266:Ge6200 256mbDDR250:WinXP-SP3:1280x1024x32FS:v196.21@AA4x:AF16x:IS=HQ:T.Buff=ON Earth16Kdds@15KkmArctic2000AD:FOV1:SPEC L5dds:NORM L5dxt5:CLOUD L5dds:
NIGHT L5dds:MOON L4dds:GALXY ON:MAG 15.2-SAP:TIME 1000x:RP=OGL2:10.3FPS

wcomer
Posts: 179
Joined: 19.06.2003
With us: 21 years
Location: New York City

nice work Selden

Post #13 by wcomer » 18.03.2004, 17:34

Hi all,

Given the results of Selden's test, can anyone tell me a good reason I shouldn't just convert all my surface textures to .jpgs? Especially the VT's.

cheers,
Walton

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 5 months
Location: Seattle, Washington, USA

Re: nice work Selden

Post #14 by chris » 18.03.2004, 19:22

wcomer wrote: Given the results of Selden's test, can anyone tell me a good reason I shouldn't just convert all my surface textures to .jpgs? Especially the VT's.


Because they require eight times as much video memory as DXT1 compressed textures and are slower to load. It's a quality versus speed/size tradeoff.
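
The factor of eight falls straight out of the per-pixel storage; a quick sketch of the arithmetic:

Code:

// Uncompressed texture in VRAM: 32 bits per pixel (RGBA8).
// DXT1: one 64-bit block per 4x4 pixels -> 64/16 = 4 bits per pixel.
// 32 / 4 = 8x the memory.
//
// Example, 4096x2048 base level:
//   RGBA8: 4096 * 2048 * 4 bytes = 32MB
//   DXT1:  4096 * 2048 / 2 bytes =  4MB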

--Chris

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 5 months
Location: Seattle, Washington, USA

Post #15 by chris » 18.03.2004, 19:33

selden wrote: To put it bluntly, the colors are being trashed. I'm not sure where it's happening, but many colors are being lost. My guess is that it's due to the card's internal compressed color format. I was surprised that the default compression used for dxt5 produces essentially the same results as dxt1, jpeg and png. dxt3 is slightly better than the others.


I'm not sure I understand--are you saying that the colors are getting messed up even for the losslessly compressed PNG textures?

--Chris

selden
Developer
Posts: 10190
Joined: 04.09.2002
With us: 21 years 9 months
Location: NY, USA

Post #16 by selden » 18.03.2004, 21:56

Chris,

You wondered
are you saying that the colors are getting messed up even for the losslessly compressed PNG textures


Yes.

The total color palette that's present in the texture image is not visible on the screen. Each color channel has been reduced from 8 bits to only 5 or 6. You can see this by counting the number of spikes in the histogram. What started out as a range of 256 color values has been reduced to fewer than 64.
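
That is exactly the collapse a 16-bit R5G6B5 stage produces (R5G6B5 is also the format of the color endpoints in DXT blocks). A hypothetical illustration of what it does to one channel:

Code:

// Collapse an 8-bit channel to 5 bits and expand it back to 8, as an
// R5G6B5 path effectively does. Only 32 of the original 256 levels
// survive, which is what shows up as isolated spikes in a histogram.
unsigned char quantize5(unsigned char v)
{
    unsigned char q = v >> 3;                     // keep top 5 bits: 0..31
    return (unsigned char)((q << 3) | (q >> 2));  // spread back over 0..255
}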

When they are drawn on the screen, since the pixels of a texture image don't map 1-to-1 to the screen pixels, the graphics hardware seems to interpolate between adjacent texture pixel color values to give a smooth range of color. As a result, you don't see banding, but if you look closely, you'll realize that the range of colors in the original picture is not present in what's shown on the screen.

I noticed this loss of color range some time ago when creating one of my Nebula addons. In the original pictures, circular images of stars have bright centers that fade to blackness at their edges. When those pictures are drawn on the screen by Celestia, however, the stars look like dried waterdrops on glass: round spots of constant color. I blamed it on other causes at the time.
Selden

maxim
Posts: 1036
Joined: 13.11.2003
With us: 20 years 7 months
Location: Nürnberg, Germany

Post #17 by maxim » 19.03.2004, 19:17

The NVIDIA DDS plugin can show some more comparisons. I've never tried this before, but I find it quite informative.

The popup dialog has the option of previewing all formats directly with 3D hardware:

[image]

The preview window allows you to compare all desired formats in 3D space from all directions:

[image]

In the normal view almost no difference is detectable, but when the option 'show differences' is enabled, the color differences are shown at a magnification of 10x.
For some textures they seem to be quite low, and DXT3/5 seem to be the best choice (this is hard to make out in the JPG picture, but all the frames that appear empty show certain errors in the original):

[image]

For other textures, however, the color differences are quite remarkable, and equally bad across the DXT formats:

[image]

Screenshots taken from this preview window show discrete color glitches in the histograms similar to those selden reported.

The conclusion is that there seems to be no 'best' texture format; it depends on the content and has to be fine-tuned for every case.

maxim

selden
Developer
Posts: 10190
Joined: 04.09.2002
With us: 21 years 9 months
Location: NY, USA

Post #18 by selden » 19.03.2004, 22:03

Maxim,

As I understand it, your tests compare the differences between the various types of texture image file formats that Celestia uses. They show the best possible color quality available with each format.

I've been doing some more measurements of the colors that Celestia actually displays on-screen. For each of the texture file formats (png, jpg, u888, dxt1c, dxt3 and dxt5) I created a histogram of the colors present in the texture file and of the colors present "on-screen." The "on screen" images were created by having Celestia save a snapshot of the screen in PNG format so that no color fidelity would be lost.

In all cases, there are far fewer on-screen colors. The histograms show that they've been reduced from 8 bits of color per channel to only 5 or 6 bits. This is even true for the U888 DDS format, which the graphics card should be able to use directly without modification. :(
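
If you want to reproduce the measurement, counting the distinct values per channel in a screenshot's raw RGB pixels is enough. A sketch (the PNG decoding is left to whatever tool you prefer):

Code:

#include <cstddef>

// Count the distinct values in one channel (0=R, 1=G, 2=B) of an
// interleaved RGB buffer. A full 8-bit channel can show up to 256
// values; a 5- or 6-bit path caps it at 32 or 64.
int distinctLevels(const unsigned char* rgb, std::size_t pixels, int channel)
{
    bool seen[256] = { false };
    for (std::size_t i = 0; i < pixels; i++)
        seen[rgb[i * 3 + channel]] = true;
    int n = 0;
    for (int v = 0; v < 256; v++)
        if (seen[v])
            n++;
    return n;
}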

I've updated http://www.lns.cornell.edu/~seb/celestia/colortest.html to show the histograms of colors for textures and screen for all 6 types of images.

[image: Colors in the surface texture image]

[image: Colors shown by Celestia]
Selden

jim
Posts: 378
Joined: 14.01.2003
With us: 21 years 5 months
Location: Germany

Re: nice work Selden

Post #19 by jim » 19.03.2004, 22:29

wcomer wrote: Given the results of Selden's test, can anyone tell me a good reason I shouldn't just convert all my surface textures to .jpgs? Especially the VT's.


Hi Walton,
I think you should know that JPG compression also causes ugly artifacts, sometimes worse than DDS compression, depending on the compression factor.

Jens

maxim
Posts: 1036
Joined: 13.11.2003
With us: 20 years 7 months
Location: Nürnberg, Germany

Post #20 by maxim » 20.03.2004, 00:06

Selden,

There could be a quality loss during screen capture. The data has to be transferred from video memory back to main memory and decompressed. Do you know how this is done?
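
For what it's worth, a GL screen grab is normally just a framebuffer readback, something like the sketch below (generic OpenGL, not Celestia's actual capture code). By that point the texture has already been decompressed and filtered into framebuffer pixels, so the readback itself should not add compression loss:

Code:

#include <vector>

// Read the visible framebuffer back into main memory as packed RGB.
std::vector<unsigned char> captureRGB(int width, int height)
{
    std::vector<unsigned char> shot(std::size_t(width) * height * 3);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);  // no row padding in the result
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, shot.data());
    return shot;
}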

maxim

