
Windows XP System Requirements

Posted: 26.02.2003, 20:24
by Guest
What are the requirements? I have a P3 500MHz with a 16MB TNT card, and it works, but not really well. At least not very fluidly. Would upgrading only my graphics card to an nVidia GeForce 2 or something like that help?

Posted: 26.02.2003, 21:23
by bh
I would certainly upgrade. I recently upgraded my card to a GeForce2. A big improvement over my Intel onboard graphics chipset. Get the best card you can afford; GeForce cards are recommended. I'm running XP, 1.3 GHz, 256MB RAM.

Posted: 26.02.2003, 21:42
by Darkmiss
You didn't state how much main memory you have.
This, too, is very important.

Try to upgrade as many things as possible.
If you use SD-RAM, then where I am a 256MB stick costs £15.

If you have less than 256MB of RAM, try to at least double it.

Posted: 26.02.2003, 21:51
by chris
Make sure that you've gotten the latest drivers from http://www.nvidia.com/ . . . It's quite possible that you have the default Microsoft OpenGL drivers installed. The default drivers do rendering completely in software, and will turn Celestia into a slideshow.
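If you want to check which OpenGL driver you're actually getting, the renderer string gives it away: Microsoft's software implementation reports itself as "GDI Generic". Here's a quick probe, just a sketch in Python (assuming PyOpenGL and GLUT are installed; any language that can create a GL context works the same way):

    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER
    from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutCreateWindow, GLUT_RGB

    glutInit()                       # a GL context is needed before querying strings
    glutInitDisplayMode(GLUT_RGB)
    glutCreateWindow(b"gl-probe")    # creating the window makes its context current
    print(glGetString(GL_VENDOR))    # b'Microsoft Corporation' plus ...
    print(glGetString(GL_RENDERER))  # ... b'GDI Generic' means software rendering

If you see your card's vendor and model instead, the hardware drivers are active.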

A GeForce2 MX is still a great budget upgrade though . . . You can get a 64MB board for around US$40 now--the extra memory is essential for high-resolution planet textures.

--Chris

Posted: 26.02.2003, 21:55
by selden
Whomever...

The GeForce2 version of the TNT (the TNT-2), especially the 32MB version, probably would help some. However, you certainly should get the fastest with the mostest ;) that you can afford.

FWIW, in gaming benchmarks, the GF2 MX card is about 1.5 times as fast as the GF2 TNT-2 at a resolution of 1024x768. See http://www.romulus2.com/articles/guides/nvidiachipset/benchmark.htm
The GF4 MX 440 is maybe 1.5-2x the GF2 MX. See http://www.ultimatehardware.net/gf4mx/gf4mx.htm
and the GF4 Ti4200 is about 2x a GF4 MX.
Unfortunately, those benchmarks are not a very good match for how Celestia works.
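Still, taken at face value, those ratios compound. A back-of-the-envelope sketch (my own arithmetic, assuming the ratios simply multiply, with 1.75 as the midpoint of the 1.5-2x range):

    # Rough relative speeds, normalized to a TNT-2 = 1.0.
    tnt2 = 1.0
    gf2_mx = 1.5 * tnt2      # GF2 MX: ~1.5x a TNT-2
    gf4_mx = 1.75 * gf2_mx   # GF4 MX 440: ~1.5-2x a GF2 MX (midpoint)
    gf4_ti = 2.0 * gf4_mx    # GF4 Ti4200: ~2x a GF4 MX
    print(gf4_ti)            # ~5.25: a Ti4200 is very roughly 5x a TNT-2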

"Not very fluidly" suggests a low framerate. Celestia will display estimated framerates if you type the character "`" Framerates higher than 12fps are highly desirable. Equal to or better than your screen refresh rate is best.

Lots of things affect the framerate, though, and it's hard to separate them. Personally, I'm not familiar with the features of the various cards, so I'm probably not the best person to answer. But what the heck...

1) The most detailed textures are quite large. They take a while to load into the graphics card and then to render, which can slow the framerate quite a bit. You might try using the low-resolution textures to see how much that helps: type the letter "r" (lowercase) to use the lower-resolution texture maps, and "R" (uppercase) to switch to higher-resolution textures for comparison. Cards with more memory can keep the larger textures readily available so they don't have to be reloaded. (The first sketch after this list puts rough numbers on texture sizes.)

2) On some cards, the more sophisticated textures (like "nightlights") are not supported in hardware. They don't have the "shaders", for example. Software rendering by the main CPU is very slow by comparison. Turning on "nightlights" will slow framerates dramatically on those cards. (Like the ATI Rage series.)

3) The speed of the main CPU affects framerates in other ways, too, of course. It's used to calculate orbital positions, for example.

4) The speed of the AGP port will affect texture loading time. Celestia stops updating the screen while textures are being loaded, which causes a hiccup when you first approach a planet or when the lighted (or dark) side rotates into view for the first time. (The second sketch below ballparks those load times.)

5) I'm not sure if it's dominated by the CPU or the graphics card, but turning on star and galaxy rendering can make a big difference, too.
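To put rough numbers on (1): an uncompressed true-color texture needs width x height x 3 bytes of video memory. A quick sketch of the arithmetic (Python, purely for illustration; real usage varies with mipmaps and compression):

    # Uncompressed size of an RGB texture, in megabytes.
    def texture_mb(width, height, bytes_per_pixel=3):
        return width * height * bytes_per_pixel / (1024 * 1024)

    print(texture_mb(1024, 512))    # ~1.5 MB: a low-res map fits easily on a 16MB card
    print(texture_mb(4096, 2048))   # ~24 MB: a single hi-res map overflows a 16MB card

And for (4), a ballpark of how long such a texture takes to cross the AGP bus, assuming the textbook peak transfer rates (real systems never quite reach them):

    # Theoretical peak AGP rates, in MB/s, and the resulting upload times.
    agp_rates = {"AGP 1x": 266, "AGP 2x": 533, "AGP 4x": 1066}
    texture = 24.0  # MB: the 4096x2048 map from the sketch above
    for port, rate in agp_rates.items():
        print(f"{port}: {texture / rate * 1000:.0f} ms")  # 90 / 45 / 23 ms: the "hiccup"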

I hope this helps a little.

on xp

Posted: 01.03.2003, 10:24
by John Van Vliet
Hi, I run Win XP with an nVidia GeForce2 MX 400 (a 64MB card) and the 41.09 driver, with 0.78GB of RAM, and I have no problems using four 4096x2048 textures on one planet (bump, texture, spec, and clouds),
although I bog down when using three 8192x4096 textures in .png or .dds format.
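The bog-down makes sense if you do the arithmetic (rough numbers; the key point is that a DXT1-compressed .dds stays compressed in video memory, while a .png is decompressed to full size when loaded):

    # Video-memory footprint of one 8192x4096 map.
    w, h = 8192, 4096
    rgb_mb = w * h * 3 / 2**20    # ~96 MB: what a .png decodes to
    dxt1_mb = w * h // 2 / 2**20  # ~16 MB: the same map as a DXT1 .dds (DXT3/5 is twice that)
    print(rgb_mb, dxt1_mb)        # three uncompressed 8k maps swamp a 64MB card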

Posted: 01.03.2003, 14:14
by Psykotik
For my part, I'm running a GeForce2 MX 200 (only 32MB!) on an Athlon 1.4 GHz with 512MB RAM; no problem using 8k textures, even big night textures coupled with a big cloud texture. Granted, I don't get the highest framerate, but I can cross the galaxy just fine.

IMHO, get the cheapest Nvidia card you can (while trying to avoid the "MX" versions); it's worth it. There's no point buying a card as expensive as a moon rock, which will be out of date 2 months later... Anyway, it's your choice; a choice that depends on your purse :lol:

Posted: 01.03.2003, 18:15
by Don. Edwards
selden,
The GeForce generation of cards is a totally different chip than the TNT/TNT2. The TNT chips are based on the older RIVA chip and don't have many of the functions found in the GeForce line of cards. The GeForce was built on top of the features found in the TNT2 chipset, plus they added the T&L engine and a better memory interface. After the GeForce was released, NVidia fragmented the TNT line into several different versions. The TNT2 Ultra remained as the runner-up in the product line, but there was also a neutered version: it had only a 64-bit memory bus versus the 128-bit bus of the TNT/TNT2. This chip is known as the TNT/TNT2 M64. There was also a version that became known as the Vanta. I believe the Vanta is based on the older TNT chip with only a 64-bit memory bus and lower amounts of memory, not exceeding 8MB, I think.
When the GeForce2 GTS came out, NVidia fragmented the GeForce line into two chipsets: the GTS chip and the MX chip. The GTS retained the 128-bit memory bus, but the MX and MX 200 got a smaller memory interface at 64 bits. A few months later an intermediate chip was released, the GeForce2 MX 400; it was basically a downclocked GTS with slower memory but the 128-bit memory bus. So as we can see, there are many different versions of the GeForce chip as well as the TNT chip.

Some features that came along for the GeForce got passed down to the TNT2 through drivers, namely anti-aliasing, but this only worked on the top-of-the-line TNT2/TNT2 Ultra cards. Trying to follow all the bends and curves NVidia has taken over the last few years needs a chart to keep track of what's been going on. Maybe I will make one as a reference for everyone, so they can get an idea of the performance level to expect from their cards.

Being that I work part time in a local computer shop, I have access to many older NVidia cards, so I can create a running list of which features work and don't work on the various cards. At present I have a Riva TNT 16MB AGP, a TNT2 Ultra 32MB AGP, a GeForce2 MX 200 32MB AGP, and a GeForce2 MX 400 64MB PCI, and I have access to a GeForce4 MX 440 64MB AGP, and of course my GeForce4 Ti4600 128MB AGP. If I can come up with a GeForce3 Ti, a GeForce2 GTS, and a GeForce2 MX 400 AGP, I will be able to add those to the list as well. I think this is a great idea.

Of course I have other video cards I can add to the list too. I have a few ATI cards, a few S3 Savage based cards, and I can get ahold of a few Matrox G200s & G400s as well. If I come across any other cards that I think might run Celestia in at least basic mode, I will put those in too. And as other brands of cards come through the shop, I can list them and test them for their ability to run Celestia. Any additional ideas?

Posted: 01.03.2003, 18:37
by selden
Don,

Thanks a lot for the Nvidia history lesson! My personal experience with PC graphics cards is a *lot* more limited: I'm only on my 3rd. (Originally I was using 3dfx cards, but then Celestia forced me to upgrade to an Nvidia Ti4200)

Posted: 01.03.2003, 19:34
by Don. Edwards
I didn't use all these cards; I just accumulated them over time working at the shop. Before I got my GeForce2 MX 200 32MB, I too used a 3Dfx card: a Creative Blaster Banshee 16MB PCI. Before that I used a Matrox Millennium 4MB with a VooDoo2 12MB. Boy, that is a far cry from today. I remember when that VooDoo2 was cutting edge and I was thinking of getting a second one to run in %$^&&... I forgot what they called it when you ran two cards side by side. Oh, now I have it: SLI mode.
And don't even get me started on my adventure in getting my Ti4600.