I don't know where else to post this, since it pertains to hardware for running Celestia, but I'll put it here...
I currently have a PC with 512MB of memory and an on-board nForce 4 graphics chip (equivalent to a GeForce MX 4000 with 64MB, I think). I'm shopping for a new card.
Which matters more for Celestia: a faster GPU or more graphics card memory? In my budget range there's a 128MB GeForce 5500 and a 256MB GeForce 6200. I would think more memory would be better for displaying higher-res textures, but I could be wrong.
Thoughts?
Graphics Card for Celestia?
Topic author: pastabagel
- Posts: 7
- Joined: 04.01.2006
Unfortunately, the 6200 with TurboCache does not have 256MB of onboard memory. Instead it uses 256MB of your computer's main memory and a small on-board cache memory.
Since your system has 512MB, running programs will be limited to less than 256MB themselves. I suspect that will be more inconvenient than you'd like. If you load many large textures into Celestia, for example, it needs lots of main memory. If it has to use the paging file, things get painfully slow.
In contrast, the 5500 has its own 128MB of memory.
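To put rough numbers on it, here's a back-of-the-envelope sketch; the OS overhead figure is an illustrative guess, not a measurement:

```python
# Rough memory budget for a TurboCache card in a 512MB system.
SYSTEM_RAM_MB = 512
TURBOCACHE_BORROWED_MB = 256   # advertised capacity comes mostly from main memory
OS_OVERHEAD_MB = 150           # assumed allowance for Windows and background services

available_mb = SYSTEM_RAM_MB - TURBOCACHE_BORROWED_MB - OS_OVERHEAD_MB
print(f"Roughly {available_mb}MB left for Celestia and its textures")  # ~106MB
```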
Selden
Topic author: pastabagel
- Posts: 7
- Joined: 04.01.2006
This is the card I'm looking at:
http://www.evga.com/products/moreinfo.a ... ndow=specs
eVGA 256-A8-N341-LX Geforce 6200 256MB DDR AGP 4X/8X
According to the specifications, it has 256MB on board. It also isn't identified as TurboCache (other 6200s are).
Assuming it actually does have 256MB of memory, is more graphics memory better than a faster GPU, or vice versa?
It depends on what you'll be using it for.
My understanding is that games like Half-Life 2 will make good use of all of the graphics memory that's available.
Celestia's memory usage depends on how many high-resolution Addons you have loaded. The high-resolution version of Runar Thorvaldsen's "Journey Through Interplanetary Space" Addon might use more than 128MB if you step through all of the possible viewpoints. I don't know of any other Addons that need as much memory.
Selden
Topic author: pastabagel
- Posts: 7
- Joined: 04.01.2006
Thanks, Selden.
I don't really care much about games; I was asking specifically about Celestia. The only game I play is a Quake 3 mod, and the clunker I have now handles that fine.
Regarding Celestia, I notice that a lot of high-res textures, for Mars for example, run well in excess of 50MB per level. John van Vliet's VT Mars Surface Map, for instance, has a four-part level 4 texture that's 50MB per part! In fact, I have a 16k texture on Mars (Don.Edwards's), and when I get remotely close to the planet, the screen freezes, the hard drive thrashes, and after about two minutes, I get Mars. I currently have an nForce4 on-motherboard Nvidia GPU that uses 64MB of system RAM to do its thing.
So the question is: how does Celestia use video memory and system memory to handle textures? Does it load textures into video memory first and then into system memory? Or is video memory a sort of scratch pad for painting the display? Does the amount of video memory determine the size of textures you can use (8k vs. 16k, for example), or does that depend more on the card's GPU chipset?
I understand that Celestia doesn't release textures from memory as you move from one object to the next (which is fine, because you may revisit objects more than once).
pastabagel wrote: Regarding Celestia, I notice that a lot of high-res textures, for Mars for example, run well in excess of 50MB per level. John van Vliet's VT Mars Surface Map, for instance, has a four-part level 4 texture that's 50MB per part!

The individual images that compose a VT texture are only loaded if you look at them, unlike all-in-one textures. If there's only one JPG or PNG image for the entire surface, the whole thing has to be loaded into main memory and decompressed. DDS image files don't have to be expanded, however.
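To get a feel for why tile-at-a-time loading matters, here's a minimal sketch of the arithmetic, assuming Celestia's usual VT layout of 2^(level+1) x 2^level tiles per level; the tile resolution and bytes-per-pixel figures are illustrative assumptions, not measurements of Celestia's internals:

```python
# Sketch of per-level VT memory, assuming a 2^(level+1) x 2^level tile grid.
# Tile size and bytes/pixel are illustrative guesses.
TILE_PIXELS = 1024 * 1024      # assume 1k x 1k tiles
BYTES_PER_PIXEL = 4            # uncompressed RGBA

def level_tiles(level):
    """Total tiles in one VT level: 2^(level+1) columns x 2^level rows."""
    return 2 ** (level + 1) * 2 ** level

def level_mb(level, tiles_loaded=None):
    """Memory for the given number of tiles (the whole level by default)."""
    n = level_tiles(level) if tiles_loaded is None else tiles_loaded
    return n * TILE_PIXELS * BYTES_PER_PIXEL / 2**20

print(level_mb(4))      # all 512 level-4 tiles resident: 2048.0 MB
print(level_mb(4, 8))   # only the handful under your viewpoint: 32.0 MB
```

That gap is the whole point of VTs: only the tiles you actually look at ever get loaded.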
pastabagel wrote: In fact, I have a 16k texture on Mars (Don.Edwards's), and when I get remotely close to the planet, the screen freezes, the hard drive thrashes, and after about two minutes, I get Mars.

PNG and JPEG textures have to be loaded into main memory before they can be re-compressed and loaded into the graphics card. If your card supports DDS, the system's CPU is used to translate them into that format before loading them into the card.

Don't forget that a 16K image is 16K pixels wide by 8K pixels high = 128M pixels x 4 bytes/pixel = 512MB. An image that large can't fit into your main memory, so your system is "paging itself to death." Upgrading your system to 1GB or more would help performance more than upgrading the graphics card.
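Here's that arithmetic spelled out, with an assumed DXT1 (compressed DDS) ratio added for comparison:

```python
# Worked arithmetic for the 16K Mars texture above.
width, height = 16 * 1024, 8 * 1024   # a full-sphere texture is 2:1
pixels = width * height               # 134,217,728 pixels (~128M)

print(pixels * 4 / 2**20)             # uncompressed RGBA, 4 bytes/pixel: 512.0 MB

# For comparison (assumed format): DXT1-compressed DDS stays compressed on
# the card at 4 bits/pixel, i.e. one eighth the uncompressed size.
print(pixels * 0.5 / 2**20)           # 64.0 MB
```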
pastabagel wrote: I currently have an nForce4 on-motherboard Nvidia GPU that uses 64MB of system RAM to do its thing.

So you really have about 448MB, less system overhead: probably under 300MB available.
pastabagel wrote: So the question is: how does Celestia use video memory and system memory to handle textures? Does it load textures into video memory first and then into system memory?

Nope: the other way around. See above.
pastabagel wrote: Or is video memory a sort of scratch pad for painting the display?

Graphics memory has to hold the (compressed) surface texture images and the 3D models as well as the screen-buffer "scratch pad". It also has to hold the OpenGL routines that do pixel shading. Other features, like a Z-buffer, occupy graphics memory, too.
pastabagel wrote: Does the amount of video memory determine the size of textures you can use (8k vs. 16k, for example), or does that depend more on the card's GPU chipset?

Modern Nvidia graphics chipsets are limited to 4K textures. Celestia cuts larger surface texture images into pieces that will fit onto its internal spherical models. It does not cut up textures used for any other 3D models. Main memory is usually the limiting factor, not graphics memory.
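As an illustration of that cutting step, here's a minimal sketch assuming a 4096-pixel per-texture limit; the function is hypothetical, not Celestia's actual code:

```python
# Hypothetical sketch: cut an oversized surface texture into GPU-sized pieces.
MAX_GPU_TEXTURE = 4096   # assumed per-texture limit for these chipsets

def split_texture(width, height, max_size=MAX_GPU_TEXTURE):
    """Yield (x, y, w, h) pieces no larger than max_size on a side."""
    for y in range(0, height, max_size):
        for x in range(0, width, max_size):
            yield (x, y, min(max_size, width - x), min(max_size, height - y))

# A 16K x 8K map becomes a 4 x 2 grid of eight 4K x 4K pieces:
print(len(list(split_texture(16 * 1024, 8 * 1024))))   # 8
```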
Selden
Topic author: pastabagel
- Posts: 7
- Joined: 04.01.2006