Experience with 32K (then 64K) normal map for earth
Posted: 05.01.2004, 05:32
by timcrews
Hello:
I've spent many hours learning how to produce high-res normal maps this weekend. I finally succeeded in producing a 32K VT normal map for Earth. I have all of the files I need to produce a 64K (scaled up from the 43K raw data), too, with the necessary scripts written. I think my machine might even be able to handle the task (that is, the next time my computer has twenty hours or so with nothing better to do.)
But I don't think it will be worth the effort. I have spent some time with the 32K VT, and I'm really disappointed. The 32K VT surface texture that I'm using already has very pronounced static shadows to the east. In fact, these shadows are so elongated, that when the scene is illuminated from the extreme west (so that the normal map would normally produce long shadows to the east), the dynamic shadows are still almost entirely subsumed by the static shadows.
Needless to say, with a scene illuminated from the extreme east (dynamic shadows to the west), the double shadows to the east and west are really unimpressive.
I was never able to see this before, because the blocky shadows produced by an 8K normal map over a 32K surface texture were such a mess that I couldn't even tell that the static shadows existed.
But my new opinion is that (for earth at least), I'm better off without any normal map, until a non-shaded surface texture becomes available. This is similar to the conclusion of Walton Comer in another thread. Does raw data exist for such a non-shaded image?
It might still be interesting to project a 32K or 64K normal map onto a uniformly-colored sphere, though, just to see the contours. I might try that as an AltSurface. This will also give me an opportunity to find any deficiencies in my process for producing a high-res normal map VT.
A non-shaded surface texture is already available for Mars (although it really has some serious color-resolution issues). I already have the 32K VT normal map for Mars. I think I would like to try my hand at 64K.
Tim Crews
Posted: 05.01.2004, 06:17
by timcrews
Oh! Oh! Oh! I'm so glad I took the time to try this! Cancel my previous lack of enthusiasm.
I defined a blank grey .png file, and created an AltSurface for earth with this file as the surface texture, but keeping the 32K VT spec map, the 32K VT borders overlay, the 32K VT night texture, Fridger's huge location file, and my recently-generated 32K VT normal map.
The result is actually very instructive and interesting to look at. Even without the colors of vegetation/desert/ocean, etc, the planet is really interesting to study in this way.
I can also say that I'm pretty sure I did a good job producing the normal map. With this blank-slate surface texture, I am able to study the normal map pretty carefully, and I don't see any seams or artificial contours as warned about in other posts.
So now I am actually psyched to go ahead and generate a 64K VT normal map, although jumping past the actual 43K resolution of the original DEM raw data will probably not give tremendously better results than I'm already getting with the 32K map.
Also, I can say that if you're going to use a normal map for earth, 32K is unquestionably better than 8K, even with the issues of the interaction with the static shadows in the main surface texture. For comparison, I created two AltSurfaces, one with the 8K normal map and one with the 32K normal map. As I have said before, when zoomed into the levels that are really interesting in the 32K surface texture, the 8K normal map just clobbers it with terribly ugly, blocky shadows. The 32K normal map doesn't do this.
But the effects of the dynamic shading of the 32K normal map on the grey surface are quite subtle compared to the static shading on the 32K Blue Marble texture. I think to use the 32K normal map with the 32K surface texture, the normal map's bump height will need to be exaggerated some (but how much?) when nm16 is first used to create the normal map. Unfortunately, that means starting over again from the first step in the process. Luckily, it's scripted.
Once I am sure I have done a good job (on further study), I will post a diary of the steps that I took to produce the normal map. I found it somewhat frustrating that there were so many posts in this forum about virtual textures, with very sparse information about how they were actually _doing_ it. Still, I'm very appreciative of the contributions all of these folks have made. I will try to do my part to contribute some of my own.
Even if these VTs are far too large to ever be hosted by anyone other than a multi-millionaire, perhaps we can optimize/automate the process of creating them enough, so that people can create them on their own, by downloading the raw images from a specified location, and then running pre-written scripts that will result in the VTs. I know it was certainly crucial on my end to "assembly-line" the process for the two VTs I created this weekend. I was even able to parallelize some of the work between multiple computers. If I can come up with a standard process that seems to meet all of the requirements, perhaps I can document this.
Tim Crews
Posted: 05.01.2004, 08:30
by galileo
wow, very cool man. i'm very much looking forward to this. keep it up.
Posted: 05.01.2004, 10:05
by Buzz
Great! I hope to do this too soon. Did you use Pixel's standard exe, a modified one, or something else?
Posted: 05.01.2004, 15:55
by timcrews
Buzz:
I used the Windows version of nm16 provided by Pixel for the initial generation of the normal map. [Later edit: WARNING: This produces inverted normal maps. I am currently testing a new version that might solve this problem.] But I also used a "re-normalization" utility provided by Walton Comer. In brief, my procedure was as follows:
0) Download 16 raw images (four rows of four tiles) from
http://www.ngdc.noaa.gov/seg/topo/gltiles.shtml
1) Create normal maps from each raw image using nm16 (specifying bump height of 100). This produces 16 ppm files, usable by the netpbm tools. Many posts have stated that it is important to generate the normal map before resizing, so that's what I did.
2) Stitch together the four rows of four tiles using pnmcat. This results in a 43200 x 21600 image.
3) Resize the result to 64Kx32K using pnmscale. (I did this in one step; I did not do the two-step stretch/compress suggested in another thread.) Keep that file. Also create a 32K, 16K, 8K, 4K, and 2K version of the file. Keep them all. (Note: I created each power-of-two file from the next higher power of two, i.e., the 4K file was produced by down-scaling the 8K file. This saved time compared to generating each lower-res file from the original 43K file, and it does not appear to have caused any problems.)
4) Re-normalize the resized files, using Walton Comer's normrgb tool. I'm not totally sure this step was necessary. I haven't tried not using it to see what the results look like. But Walton said that it was necessary, and I'm trying not to repeat anyone's mistakes.
5) Use Fridger's virtualtex script on each of the images (64K, 32K, 16K, 8K, 4K, and 2K), saving away the resulting set of numbered .ppm files in the appropriate sub-directory (level5, level4, level3, level2, level1, level0, respectively). I used 1K tiles.
6) Convert all of these .ppm files (thousands of them by now) into .png files. (I have not tried creating .DDS files yet for this fileset.)
7) Create the .ctx file that points to the parent directory of the output of step 6.
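For readers who want the gist of steps 1 and 4 without running the tools, here is a rough Python/numpy sketch of what a heightmap-to-normal-map conversion plus re-normalization computes. nm16's actual gradient and bump-height scaling conventions aren't documented here, so treat the scaling as an assumption:

```python
import numpy as np

def height_to_normal_map(height, bump_height=100.0):
    """Turn a heightmap (2-D array) into an RGB-encoded normal map.

    Only a sketch of what a tool like nm16 computes; the real tool's
    gradient and bump-height scaling conventions may differ.  Larger
    bump_height exaggerates the relief.
    """
    # Central-difference gradients of the height field (wrapping at
    # the edges, which suits an equirectangular world map in x).
    dz_dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) / 2.0
    dz_dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) / 2.0
    # The surface normal is proportional to (-k*dz/dx, -k*dz/dy, 1).
    k = bump_height / 255.0  # assumed scaling, not nm16's actual one
    nx, ny = -k * dz_dx, -k * dz_dy
    nz = np.ones_like(dz_dx)
    # Normalise to unit length.  Re-running just this step on a
    # resized map is what the separate "re-normalization" pass
    # (step 4, normrgb) restores.
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    n = np.stack([nx / length, ny / length, nz / length], axis=-1)
    # Pack [-1, 1] components into [0, 255] RGB bytes.
    return np.clip((n * 0.5 + 0.5) * 255.0, 0, 255).astype(np.uint8)

# A flat heightmap encodes straight-up normals, i.e. RGB ~(127, 127, 255).
demo = height_to_normal_map(np.zeros((4, 4)))
```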
I have a directory layout and a set of scripts that supports the above assembly-line. As you proceed through the steps, it is helpful to remove the intermediate files from the previous steps once you are done with them, otherwise you will need about 50G of disk storage to finish the job.
A zip file containing only the directories and the scripts would be easy to produce. You could then edit the scripts based on the original dimensions of the raw image, the desired bump height, the layout of the tiles in the original image, the desired virtual texture tile size, etc.
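Step 5's tiling pass amounts to cutting each map into fixed-size tiles; in Celestia's layout the tiles for each level end up as levelN/tx_<column>_<row>.png. A toy Python sketch of that slicing (the real virtualtex script drives ImageMagick instead of slicing arrays):

```python
import numpy as np

def split_into_tiles(image, tile_size=1024):
    """Cut a (height, width, channels) image into square tiles keyed
    by (column, row), mimicking what the virtualtex script produces
    before each tile is written out as, e.g., level5/tx_<col>_<row>.png.
    """
    h, w = image.shape[:2]
    tiles = {}
    for row in range(h // tile_size):
        for col in range(w // tile_size):
            tiles[(col, row)] = image[row * tile_size:(row + 1) * tile_size,
                                      col * tile_size:(col + 1) * tile_size]
    return tiles

# A 64K x 32K map cut into 1K tiles would give 64 x 32 = 2048 tiles;
# here a toy 4x2 "map" with tile_size=1 gives 8 tiles.
demo_tiles = split_into_tiles(np.zeros((2, 4, 3), dtype=np.uint8), tile_size=1)
```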
The tools that the scripts rely on are:
1) Cygwin under Windows, with bash shell
2) netpbm tools
3) ImageMagick (to support the virtualtex script)
4) Windows version of nm16
5) Walton's normrgb
6) Fridger's virtualtex
7) (eventually) something to create .dds files, probably nvdxt
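For step 7, the .ctx file itself is tiny. A rough example for a PNG virtual texture with 1K tiles (the ImageDirectory name is just a placeholder for wherever the levelN directories live):

```
VirtualTexture
{
    ImageDirectory "earth-vt"
    BaseSplit 0
    TileSize 1024
    TileType "png"
}
```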
Tim Crews
Posted: 05.01.2004, 18:33
by Buzz
Thanks a lot for your extensive description!
Posted: 05.01.2004, 23:56
by Buzz
By the way: you can save yourself the trouble of converting to dds: dxt1, 3 and 5 are ugly (blocky artifacts), 24 bits dds is very big and slows Celestia down. Png was the best option for me: smallest file size and great quality!
Posted: 05.01.2004, 23:58
by Guest
I think 64k earth maps would be a good candidate for Celestia Bittorrents.
some thoughts
Posted: 06.01.2004, 00:21
by wcomer
Tim,
Glad to see you are progressing with the 64k normal map.
1) I personally think it is better to rescale before creating a normal map because you keep the 16-bit resolution. However, if you are using the pnmscale tool then you must rescale after normalization, or else you will get terrible artifacts. Regardless, you should definitely do the two-step procedure, i.e. double the size to 86k then downscale to 64k. The reason is that pnmscale does not do interpolation during upscale; it just repeats rows rather than interpolating. Downscaling with pnmscale does do interpolation. Ultimately you may just want to write your own interpolating filter.
2) In my experience, 16k u888 is superior to 32k dxt[1-5], and 32k u888 is superior to 64k dxt[1-5]. 8k anything is completely useless; no one should bother with earth normal maps at that scale, as they do not do anything at all. So only build a 64k texture if you plan to use a lossless format; else stick to 32k. Again, this is entirely my opinion, but I did do thorough tests between the different formats.
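Walton's first point, that pnmscale repeats samples on upscale, can be sketched numerically: repetition gives blocky output, while linear interpolation (which the double-then-downscale trick approximates) gives a smooth ramp. An illustrative 1-D Python sketch, not pnmscale's actual code:

```python
def repeat_upscale(row, factor):
    """Nearest-neighbour upscale: what pnmscale reportedly does when
    enlarging, i.e. every sample just gets repeated (blocky)."""
    return [v for v in row for _ in range(factor)]

def linear_upscale(row, new_len):
    """Linearly interpolated upscale, for comparison."""
    out = []
    for i in range(new_len):
        pos = i * (len(row) - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, len(row) - 1)
        frac = pos - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

blocky = repeat_upscale([0, 10], 2)   # two hard steps
smooth = linear_upscale([0, 10], 4)   # a gradual ramp
```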
good luck, and I can't wait for some screen shots.
cheers,
Walton
Posted: 06.01.2004, 12:11
by maxim
Hmmm... it should be possible to remove the artificial shadowing. It might take some work to write a convenient script, though.
How would such an algorithm be outlined?
1. Find areas which are shaded.
2. Move these areas to a different layer by cutting them out.
3. Remove shades by pushing up brightness/contrast or RGB level balance until the colors match the lit areas.
4. Merge the two layers again.
Part three should be done manually.
Part one requires some careful thought about which filter could do such a job. Perhaps there is already one among the thousands of plugin filters for Photoshop or similar programs, or you may have to build your own. Right now I have no idea what it would have to look like; perhaps someone with a faster mind does? Perhaps knowledge about the direction of the shadowing could help?
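As a strawman for parts 1 and 3 of the outline above, here is a minimal Python sketch on a flat list of brightness values. The threshold and the uniform gain are made-up placeholders; a real filter would need to be much smarter about what counts as shadow:

```python
def deshadow(pixels, threshold=0.5):
    """Crude sketch: brightness values in [0, 1] below `threshold`
    count as shadowed (part 1) and get scaled up towards the mean
    brightness of the lit pixels (part 3).  Threshold and gain are
    placeholders, not a workable shadow detector.
    """
    lit = [p for p in pixels if p >= threshold]
    shaded = [p for p in pixels if p < threshold]
    if not lit or not shaded:
        return list(pixels)
    # Uniform gain that maps the mean shaded level to the mean lit level.
    gain = (sum(lit) / len(lit)) / (sum(shaded) / len(shaded))
    return [min(1.0, p * gain) if p < threshold else p for p in pixels]

result = deshadow([0.2, 0.8, 0.25, 0.75])  # dark pixels get brightened
```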
Well, as I've seen in another thread, a version of an unshaded hi-res earth has been found, so this isn't urgent anymore. Still, it could be a useful technique for the future.
maxim
Posted: 06.01.2004, 12:25
by maxim
The above post should have gone to the other thread too.
maxim
deshadowing
Posted: 06.01.2004, 15:00
by wcomer
Maxim,
It is possible to partially remove the shadows, but to do so you need to reverse the original shadow-filter methodology used to create them. This would have to be done with custom code. You would also need to know which lighting algorithm was originally used, as well as the original normal map and lighting position. From this you could get rid of most of the shadow effect, but I suspect you would still be left with some artifacts. It would be better to work with the unshadowed texture mentioned in another thread.
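For illustration only: if the bake had been a simple Lambert shade with a known light direction, undoing it would amount to dividing the shading factor back out. The shading model, ambient term, and light direction here are all assumptions; the texture's actual filter is unknown, which is exactly why residual artifacts are likely.

```python
def unshade_lambert(shaded_value, normal, light_dir, ambient=0.1):
    """Invert an assumed Lambert bake:
        shaded = value * (ambient + (1 - ambient) * max(0, N.L))
    by dividing that factor back out.  Purely illustrative; with the
    wrong model you would be left with artifacts.
    """
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    factor = ambient + (1.0 - ambient) * ndotl
    return shaded_value / factor

# A surface facing the light (N.L = 1) is unchanged by the round trip;
# a fully shadowed slope gets divided by just the ambient term.
flat = unshade_lambert(0.5, (0, 0, 1), (0, 0, 1))
steep = unshade_lambert(0.05, (0, 0, 1), (1, 0, 0))
```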
cheers,
Walton
Posted: 06.01.2004, 22:02
by timcrews
Hello:
I downloaded and converted the unshaded Blue Marble texture discussed in the "Is the Blue Marble texture artificially shaded" thread. The Blue Marble textures that have been made available by other Celestia users have indeed been based on an artificially shaded texture. But the new Blue Marble texture that I downloaded is not statically shaded. BTW, it is 1.2GB in PNG virtual texture format, with 1K tiles. On my machine, it takes about 5 hours of computation to produce the virtual texture from the raw data.
To go with that, I also generated a 64K normal map (640MB in PNG virtual texture, 1K tiles). Previously, I had generated a 32K map, but was only able to see it projected on a gray orb, because the surface texture I was using had static shadows that were more pronounced than the dynamic shadows that normal map could generate.
Now, with a 64K unshaded surface texture, and a 64K normal map, I have no such problem. The results are very satisfactory. I am seeing all kinds of interesting surface detail that I never noticed before.
The normal map takes about an extra hour to generate compared to the surface texture, because of two extra normalization steps.
As for viewing performance, on my 1.2GHz Athlon machine, with 1.25GB of RAM, GeForce 4 Ti4200 with 64MB of RAM, I am able to get a 40fps frame rate while viewing at the highest (64K) resolution.
There are many texture artists out there who would certainly be able to improve on my work. I have only done the mechanical translation from one file format to another. I look forward to seeing what the artists can do.
I do notice a bit of "blockiness" when I shift from the 32K normal map to the 64K normal map. It is not extreme, but it is noticeable. I don't think there is much more I can do to improve the generation process. I did follow Walton's advice for the two-step resizing. I don't see any artifacts such as contour lines, but the shadows are a little blocky. Again, this might be inherent in stretching a 43K texture to 64K.
Once I have caught my breath from four straight days of texture generation, I plan to make available a ZIP file containing a directory structure + script framework that will allow other people to download the raw data from the same place I did, and then run the scripts to produce the virtual texture normal map and surface map. I will also try to provide as much instructional information as possible in this forum. I see this as the only alternative to hosting 1.2G files. I think this might be the only way that the general Celestia user community ever gets to see these beautiful results. I don't have even a token web page, which is why you don't see any screen shots in this post.
Tim Crews
distributing 64k virtual textures!!!
Posted: 06.01.2004, 23:58
by guest
To Tim Crews, and others,
the best way to distribute 1.2G virtual textures is through BitTorrent. It's very easy to use, and you can download at 60k/s without any server at all! It just needs a few people to "seed" the file, and everyone can download it.
see
http://bitconjurer.org/BitTorrent/