pastabagel wrote: Regarding Celestia, I notice a lot of high-res textures, for Mars for example, are well in excess of 50MB per level. For example, John van Vliet's VT Mars Surface Map has a four-part level 4 texture that's 50MB per part!
The individual images that compose a VT texture are loaded only when you look at them, unlike all-in-one textures. If there's only one JPG or PNG image for the entire surface, the whole thing has to be loaded into main memory and decompressed. DDS image files don't have to be expanded, however.
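To get a feel for the difference, here's a rough Python sketch. It assumes the usual Celestia-style virtual texture layout (level k holds 2^(k+1) × 2^k tiles); the tile size, pixel format, and number of visible tiles are illustrative assumptions, not measurements:

```python
# Rough comparison of VT tile loading vs. loading a whole map.
# Assumptions: 1024x1024 tiles, 4 bytes/pixel (uncompressed RGBA),
# and only about a dozen tiles in view at once.
level = 4
tile_px = 1024
bytes_per_pixel = 4

tiles_total = 2 ** (level + 1) * 2 ** level       # 512 tiles at level 4
tile_bytes = tile_px * tile_px * bytes_per_pixel  # 4 MB per tile
visible_tiles = 12

loaded_mb = visible_tiles * tile_bytes // 2**20
whole_mb = tiles_total * tile_bytes // 2**20
print(f"{loaded_mb} MB loaded instead of {whole_mb} MB")  # 48 MB vs 2048 MB
```

Only the tiles actually on screen cost memory; the rest stay on disk until you look at them.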
In fact, I have a 16k texture on Mars (Don.Edwards's) and when I get remotely close to the planet, the screen freezes, the hard drive thrashes, and after about 2 minutes, I get Mars.
PNG and JPEG textures have to be decompressed into main memory before they can be loaded into the graphics card. If your card supports DDS, the system's CPU re-compresses them into that format before uploading them to the card.
Don't forget that a 16K image is 16K pixels wide by 8K pixels high = 128M pixels × 4 bytes/pixel = 512MB. An image that large can't fit into your main memory, so your system is "paging itself to death." Upgrading your system memory to 1GB or more would help performance more than upgrading the graphics card.
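The arithmetic, for anyone who wants to check it:

```python
# Uncompressed size of a "16K" surface texture (a 2:1 equirectangular map).
width = 16 * 1024            # 16384 pixels
height = 8 * 1024            # 8192 pixels
bytes_per_pixel = 4          # RGBA, 8 bits per channel

size_bytes = width * height * bytes_per_pixel
print(size_bytes // 2**20, "MB")   # 512 MB
```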
I currently have an nForce4 on-motherboard Nvidia GPU that uses 64MB of system RAM to do its thing.
So you really have only about 448MB, less system overhead: probably under 300MB actually available.
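In round numbers (the overhead figure is just a ballpark, not a measurement):

```python
total_ram_mb = 512      # installed system memory
gpu_share_mb = 64       # claimed by the on-board GPU
free_mb = total_ram_mb - gpu_share_mb
print(free_mb, "MB")    # 448 MB, before the OS takes its cut
# Subtract the OS and background processes and you're
# typically left with well under 300 MB for Celestia.
```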
So the question is, how does Celestia use video memory and system memory to handle textures? Does it load textures into video memory first, then into system memory?
Nope: the other way around. See above.
Or is video memory a sort of memory scratch pad for painting the display?
Graphics memory has to hold the (compressed) surface texture images and the 3D models, as well as the screen-buffer "scratch pad". It also has to hold the shader programs: the OpenGL routines that do pixel shading. Other features, like the Z-buffer, occupy graphics memory, too.
Does the amount of video memory determine the size of textures you can use (8k vs 16k for example) or does that depend more on the card's GPU chipset?
Modern Nvidia graphics chipsets are limited to 4K textures. Celestia cuts larger surface texture images into pieces that will fit onto its internal spherical models. It does not cut up textures used for any other 3D models. Main memory, not graphics memory, usually is the limiting factor.
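For example, cutting a 16K map into pieces no larger than the 4K limit (the exact cut geometry Celestia uses may differ; this just shows the scale):

```python
import math

tex_w, tex_h = 16384, 8192   # a "16K" surface texture
max_dim = 4096               # largest texture the chipset accepts

pieces = math.ceil(tex_w / max_dim) * math.ceil(tex_h / max_dim)
print(pieces, "pieces of at most", max_dim, "x", max_dim)  # 8 pieces
```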