nvidia cards
-
Topic author: Frank
nvidia cards
I am planning to buy a new graphics card, especially for running Celestia. Can someone tell me what is the best nVidia card at present for Celestia?
-
- Posts: 1510
- Joined: 07.09.2002
- Age: 59
- With us: 22 years 2 months
- Location: Albany, Oregon
It of course depends on how much you are willing to spend. You can get an NVidia GeForce4 Ti4200 with 128MB of DDR VRAM on the lower side, or you can go hog wild and buy a GeForce FX 5800 with 128MB of DDR VRAM. I don't think anyone in the forum as of yet has purchased a GeForce FX card of any level, but they should run Celestia very well. If you look around you can probably find the GeForce4 Ti4600 128MB card for between $199 and $219. You may even find it cheaper. I haven't shopped for one in months, as I already have one. With the Ti4600 you can set the antialiasing to 4x and get smooth frame rates and fantastic imagery.
But I think you need to steer clear of any 64MB cards. If you want to use really large textures, you are going to want to have the headroom for them. As always, the more RAM the better, even when it comes to video cards. Good luck with your search and purchase.
I am officially a retired member.
I might answer a PM or a post if it's relevant to something.
Ah, never say never!!
Past texture releases, Hmm let me think about it
Thanks for your understanding.
Don. Edwards wrote:It of course depends on how much you are willing to spend. You can get an NVidia GeForce4 Ti4200 with 128MB of DDR VRAM on the lower side, or you can go hog wild and buy a GeForce FX 5800 with 128MB of DDR VRAM. I don't think anyone in the forum as of yet has purchased a GeForce FX card of any level, but they should run Celestia very well. If you look around you can probably find the GeForce4 Ti4600 128MB card for between $199 and $219. You may even find it cheaper. I haven't shopped for one in months, as I already have one. With the Ti4600 you can set the antialiasing to 4x and get smooth frame rates and fantastic imagery.
But I think you need to steer clear of any 64MB cards. If you want to use really large textures, you are going to want to have the headroom for them. As always, the more RAM the better, even when it comes to video cards. Good luck with your search and purchase.
I have a Ti4200 w/64MB, and it runs Celestia very smoothly (at least compared to my previous POS Radeon 7000...)
-
Topic author: Frank
nvidia
Thanks for the reply, Don. Will I see any noticeable difference in Celestia between the Ti4200, 4400, 4600 and 4800 cards, given that they all have 128MB DDR VRAM? Does Celestia take advantage of all the features on the more expensive cards, or do they provide features and performance in areas that Celestia does not use? How much better can we expect Celestia to operate with the FX cards?
Hi Frank,
All these suggestions forget one important thing: modern graphics cards (GF3 and above) need a lot of power and don't work on every mainboard. Check your power supply; 300 watts may not be enough (depending on the manufacturer). An example of a problematic mainboard is the widely used K7S5A.
My tip: a GF3 is very fast and available second-hand on eBay. A GF4 can't do anything better in Celestia, unless you find 45fps too slow. Sure, 128MB is nice, but a 64MB card also works very nicely with all the large textures. It's more important that you have enough RAM (512MB or more). Fridger has an old GF2 GTS with only 32MB and still uses the largest textures with all effects.
Bye Jens
jim,
While the GeForce3 is a capable video card, it doesn't have HARDWARE VERTEX SHADERS. It uses software vertex shaders, and as such isn't as fast when using vertex shader programming. All GeForce4 Ti model cards and the GeForce FX cards have both hardware vertex and pixel shaders. That is a definite one-up on the GeForce3. The other thing is there are no GeForce3 cards with 128MB of VRAM. As we keep adding features to Celestia, that extra 64MB of VRAM is going to have a great deal of importance. Just because you can use a 64MB card with 16k textures on it doesn't mean a 128MB card wouldn't be better. Chris himself stated that a 128MB card is a much better choice, and I think of anybody here he is the one person to listen to most of all. I find it amazing that Fridger is able to get these huge textures to work on his GeForce2 32MB card, but he has stated in the past that the frame rates do suffer.
Besides, Frank was asking about a new video card, not someone's used video card that may or may not work right. How do you know the person that owned that card wasn't overclocking it the whole time they used it? I feel that when it comes to NVidia cards, buying new is always the better option. Sure, there are plenty of good cards out there on eBay, but if I am going to purchase something like this, I want to know that I am the first person to use that card. Or at least I think I am. The GeForce3 is now 2 generations old, and he is looking for a newer card.
Frank, as I said before: go for the GeForce4 Ti series of cards. As long as your system has a good 300 watt power supply and can support AGP 4X, you're good to go.
-
- Site Admin
- Posts: 4211
- Joined: 28.01.2002
- With us: 22 years 9 months
- Location: Seattle, Washington, USA
Anonymous wrote:jim,
While the GeForce3 is a capable video card, it doesn't have HARDWARE VERTEX SHADERS. It uses software vertex shaders, and as such isn't as fast when using vertex shader programming. All GeForce4 Ti model cards and the GeForce FX cards have both hardware vertex and pixel shaders. That is a definite one-up on the GeForce3.
Not true--the GeForce 3 does have hardware vertex shaders. It's the GeForce4 MX that lacks them. A GeForce4 Ti 4200 is still a better card, but if you can find a cheap GeForce3, buy it! It will make a great card for Celestia.
--Chris
Hi Guest (Don?)
The main problem is that we don't know anything about Frank's computer, or whether he wants to play the latest 3D first-person shooters or just wants a good solution for Celestia. I think it makes no sense to use a card faster than a GF4 Ti 4200 on a 1GHz system or slower. I own a GF3 Ti 200 and a Duron 900MHz with 512MB DDR RAM, and at the moment the Duron is too slow to get the full performance out of my GF3.
Now, what is the difference between GF3 and GF4? A GF4 is only an improved GF3, nothing else!
In detail:
- additional multi-monitor function "nView" (for Celestia not relevant)
- "Lightspeed Memory Architecture II" for better memory performance
- "nFiniteFX II" engine for better pixel and vertex shader performance
That's all!
What about the GF4 MX cards? The GF4 MX is not an improved GF4 but a revised GF2 MX! All the MX cards have only two rendering pipelines, while all real GF cards (GF2 through GF FX) have 4 rendering pipelines. OK, the GF4 MX has some other improvements, and therefore you can compare its performance with a GF2. But who really uses antialiasing on a slow MX card? I would never buy an MX card.
Graphics power and Celestia? The question is: how many fps do I need for Celestia? Celestia is not a first-person shooter where a high frame rate is necessary for quick reactions. I think if the frame rate never drops below 20, it's fast enough for Celestia.
Bye Jens
Frank wrote:Can someone tell me what is the best nvidia card at present for celestia.
Probably the FX5800 (NV30), though it's expensive and really noisy.
I'm looking for a new card just for Celestia as well. I've done a bit of research and haven't yet decided between a Ti4200 (NV28) or an FX5200 (NV34).
The FX5200 is about $100AU cheaper.
Both have 128MB of RAM.
I haven't found any good comparisons between the two cards, but I think that the FX5200 would be a little behind the Ti4200.
From the nVidia website.
GeForce4 Ti 4200 vertices per second: 113 million
GeForce FX 5200 vertices per second: 81 million
Any opinions on the performance between the two? (as it relates to celestia)
I saw a passively cooled nVidia reference FX5200; I hope some of the manufacturers follow suit.
Or would it be worth waiting for the 256MB FX5600 (NV31) to become available in Australia? At what texture sizes would Celestia make use of the 256MB of RAM?
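For a rough sense of when 256MB would actually matter, here's a back-of-the-envelope sketch (my own estimate, not from Celestia's code; real usage also depends on driver overhead and whatever else is resident in VRAM):

```python
def texture_mb(width, height, bytes_per_pixel, mipmaps=True):
    """Estimate VRAM for one texture; a full mipmap chain adds roughly one third."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size *= 4 / 3
    return size / (1024 * 1024)

# An 8k x 4k surface map, uncompressed RGB (3 bytes/pixel, e.g. loaded from .png)
print(round(texture_mb(8192, 4096, 3)))     # 128 (MB)

# A 16k x 8k surface map as DXT1-compressed .dds (0.5 bytes/pixel)
print(round(texture_mb(16384, 8192, 0.5)))  # 85 (MB)
```

By this arithmetic, a single uncompressed 16k map (around 512MB with mipmaps) wouldn't even fit in 256MB, so the extra RAM mostly buys headroom for keeping several large compressed textures (surface, night, clouds, normal map) resident at once.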
jim wrote:- additional multi-monitor function "nView" (for Celestia not relevant)
Not quite true. I currently run a dual-monitor setup with a GF2 MX400, and Celestia looks great spanned across both monitors.
jim wrote:An example for a problematic mainboard is the often used K7S5A.
Damn, I own one of these; looks like I'll have to upgrade the motherboard and CPU too. I know the early K7S5A boards copped a lot of flak, but I've got a later version, so I might be OK here. It was a nice cheap board, though.
Marc Griffith http://mostlyharmless.sf.net
In the "for what it's worth" department, my system at work just had its video card upgraded from a
32MB ATI Radeon 7000
to a
128MB NVIDIA FX 5200 lite
The Radeon locked up the system almost every time I opened a full-screen Internet Explorer window, which made it easy to get an upgrade. The FX works just fine. The card seems to be just a "reference design" with none of the quality improvements that eVGA has been known for on previous cards -- like better heat sinks.
Here are some performance numbers:
These were all "measured" using the standard medium resolution textures.
viewing Earth in full screen mode at an altitude of about 11,000 km with all features enabled and time running at 1000x, it does 30fps.
viewing Earth in full screen mode at an altitude of about 30,000 km with all features enabled and time running at 1000x, it does 60fps.
viewing Earth in windowed mode at an altitude of about 30,000 km with all features enabled and time running at 1x, it does 75fps.
System:
512MB 2.4GHz Pentium 4, WinXP Pro, SP1
128MB Nvidia FX 5200, AGP 4X, Detonator 44.03, OpenGL 1.4.0
Celestia v1.3.1pre3
Selden
- John Van Vliet
- Posts: 2944
- Joined: 28.08.2002
- With us: 22 years 2 months
re
Hi, I am using a Dell 8200 with 512MB RAM and an nVidia MX400 with 64MB VRAM on Win XP (44.03_win2kxp_english driver), and have no problems running Earth with these .png maps:
4kEarth.png
4kEarthNight.png
4kEarthNormal.png
4kEarthClouds.png
or
8kEarth.dds
8kEarthNight.dds
8kEarthNormal.dds
2kEarthClouds.dds
although the computer does bog down (very, very, very slow) when I use all 8k .png maps.
Here are comparison numbers for a Ti4200 on my much slower system at home.
These were all "measured" using the standard medium resolution textures.
viewing Earth in full screen mode at an altitude of about 11,000 km with all features enabled and time running at 1000x, it does 23fps.
viewing Earth in full screen mode at an altitude of about 30,000 km with all features enabled and time running at 1000x, it does 23fps.
viewing Earth in windowed mode at an altitude of about 30,000 km with all features enabled and time running at 1x, it does 23fps.
At least it's consistent
I forgot to mention in my previous posting that the screen on my system at work is running at 1600x1200 at 75 Hz. That's the same as I use on this system at home.
Also, the limiting stellar magnitude in both cases was set to 6.6. More stars visible slows my home system dramatically.
System:
256MB 500MHz Pentium 3 (dual processor), WinXP Pro, SP1
128MB Nvidia Ti 4200, AGP 2X, Detonator 43.45, OpenGL 1.4.0
Celestia v1.3.1pre3
I am not running 44.03 at home because it goes into a CPU-bound loop shortly after the system boots, and XP complains bitterly. So do I
Selden
Thanks for the comparison, Selden. I still haven't got that new video card. Due to the constant frame rate, the bottleneck on your home system would have to be the CPU/motherboard, don't you think? It would be interesting to know how the Ti4200 goes in your work system.
The two cards seem fairly similar. I'll probably end up making the decision based on the card that has the biggest heatsink and is likely to be the quietest.
Marc Griffith http://mostlyharmless.sf.net
Marc,
I'm not sure where the bottleneck is for this particular test.
I actually was quite surprised that they all came out the same.
I wiggled the earth's position just to reassure myself that the frame rate would change. (It did get slower while I was doing that.)
It has to be due to something that's the same among the three different sets of graphics conditions but I don't know enough about what Celestia's doing to know what that might be. Numeric calculations? Data being transferred over the AGP bus? (Are there any tools to measure AGP datarates?)
I can make more sense of the different speeds shown by the FX card: fewer pixels are changing as you get farther from the Earth, and fewer are involved when you draw it in a window.
Certainly there's a significant pause on my home system whenever it has to load a new texture. It's present but not at all as obvious on the system I have at work. I'd expect that to be related to the CPU & AGP speeds. I didn't try to measure that.
The 4200 has a fan built into the graphics chip's heatsink, but the 5200 has no fan at all, if that matters to you.
I won't be taking the 4200 to work to test, sorry. I'm not about to risk zapping my home system by disassembling it, small though that risk might be. Maybe when I do my next major system upgrade, but that won't be for quite a while. I consider the current performance quite adequate. Maybe when the next generation after the 5900 becomes available.
Selden
selden wrote:I won't be taking the 4200 to work to test, sorry. I'm not about to risk zapping my home system by disassembling it, small though that risk might be.
I wasn't expecting you to. Maybe there is someone else out there who can do the test.
I just had an idea: with a 'getframerate' method and a Lua script, a benchmark demo could be made that would display various performance 'scores'. It could travel around, record various frame rates, and summarise the results at the end.
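The score-summarising half of that idea could look something like this (just a sketch, in Python rather than Lua, with made-up fps numbers; it assumes the script has already collected frame-rate samples at each waypoint of the tour):

```python
def summarize(samples):
    """samples: dict mapping scene name -> list of fps readings.
    Returns per-scene min/avg plus an overall average score."""
    report = {
        scene: {"min": min(fps), "avg": sum(fps) / len(fps)}
        for scene, fps in samples.items()
    }
    overall = sum(s["avg"] for s in report.values()) / len(report)
    return report, overall

# Made-up readings from three hypothetical waypoints
demo = {
    "Earth close-up": [28, 31, 30],
    "Saturn rings": [22, 24, 23],
    "star field": [60, 58, 59],
}
report, overall = summarize(demo)
for scene, s in report.items():
    print(f"{scene}: min {s['min']} fps, avg {s['avg']:.1f} fps")
print(f"overall score: {overall:.1f} fps")
```

Reporting the minimum as well as the average matters here: by jim's earlier rule of thumb, it's the worst-case dips below ~20fps that make Celestia feel slow, not the average.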
A passively cooled card would be ideal. I was aware that nVidia had built one without a fan, but wasn't sure if they were commercially available. It's unfortunate that the market demands a fancy-looking (and usually inefficient) heatsink with a loud fan just because they look cool. You said that the manufacturer of your FX card is eVGA?
I'm pretty fussy about noise, and have burnt out a few power supplies when 'de-rating' the fans to 5 volts. Though it's all worth it when you have a computer where the only annoying sound is the keyboard.
Marc Griffith http://mostlyharmless.sf.net
Pictures of eVGA's 5200 series are at http://www.evga.com/articles/public.asp?AID=133
The one I have at work looks like the 2nd one down, with the black heatsink.
see http://www.evga.com/articles/public.asp?AID=133&page=3 for a closeup.
It's listed by NewEgg for U.S. $80
No games were included, though. *shrug*
(I think that's what they mean by "optional slim box (light version)" )
Selden