My new NVIDIA FX-5900 Ultra/256 MB DDR super card!
-
Topic author: t00fri
- Developer
- Posts: 8772
- Joined: 29.03.2002
- Age: 22
- With us: 22 years 7 months
- Location: Hamburg, Germany
My new NVIDIA FX-5900 Ultra/256 MB DDR super card!
Hi all,
over the weekend, I extensively tested NVIDIA's latest super card:
FX-5900 Ultra/256MB DDR
Although it is a bit of overkill for my present CPU, I now own it and thought some of you might be interested to know what my impressions are.
The plan is, of course, to replace my present system (PIII 1 GHz/512MB CL2) very soon with a new 4 GHz Pentium 4 environment with 2GB of fast memory. [For her needs, my wife will be quite satisfied with my present setup, including my GF2 GTS/32 MB card, which works very well within its limitations.]
Of course I looked at many tests on the net before getting 'involved'. There seems to be a consensus that this new card is certainly a major improvement over previous FX models, notably the 'somewhat unlucky' 5800. The FX-5900 Ultra is also (for the moment) the fastest card available on the market, and it is quite expensive (street price ~$500). Since my dealer has delivered thousands of PCs & servers to my lab, we normally get very favourable conditions, notably with respect to testing hardware beforehand, returns and, of course, price. However, money was no argument in this story, so let's just forget about it...
For those interested in some technical details:
--There is now a 256-bit memory bus (compared to 128 before)
--The bandwidth is huge: 27.2 GB/sec, compared e.g. to 8.0 GB/sec for a GF4 Ti 4200
--new 0.13 µm process, DX9 native, well + 256 MB DDR SGRAM!
--AGP 3.0, 8x compatible
--absolutely crisp 2d quality! <==================
--aggressive new cooling technology
--surprisingly QUIET!!! <==================
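Just to show where such bandwidth figures come from (my own back-of-the-envelope check, not an official spec sheet): with a 256-bit bus and the 850 MHz effective memory clock of the 3d stage quoted below, the arithmetic works out exactly. A tiny Python sketch; the ~500 MHz effective clock for the Ti 4200 is simply what its 8.0 GB/sec figure implies:
Code: Select all
# Rough check of the quoted memory bandwidths (illustrative only).
# A 256-bit bus moves 32 bytes per transfer; DDR memory at an
# effective 850 MHz does 850e6 transfers per second (425 MHz DDR).
bus_width_bits = 256
effective_clock_hz = 850e6
bandwidth = bus_width_bits / 8 * effective_clock_hz
print(f"FX-5900 Ultra: {bandwidth / 1e9:.1f} GB/sec")        # -> 27.2 GB/sec

# Same formula for a GF4 Ti 4200: 128-bit bus, ~500 MHz effective clock
print(f"GF4 Ti 4200:   {128 / 8 * 500e6 / 1e9:.1f} GB/sec")  # -> 8.0 GB/sec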
My card was produced by a neat German firm, TERRATEC, which originally earned a very good reputation with its high-end sound cards. What was most important to me were these specific aspects:
-- The Terratec card is virtually identical to the NVIDIA reference design. This implies a regulated fan (!) and different clocking stages in 2d (300/800 MHz) and 3d (450/850 MHz); the result is that the card cannot be heard when I do 'serious work' in 2d ;-). In 3d the fan ramps up to maximum in 3 stages, and it is still only about as loud as my old GF2!
-- Due to the mandatory European CE norm requirements (electromagnetic compatibility), the card's output signal has to undergo 'castration' through L, R, C filters, where a lot of signal quality is often lost for cost-saving reasons. The original NVIDIA design is simply much superior here!
-- Usually I had cards from ASUS, but the corresponding ASUS model this time has much worse 2d quality and, notably, an unregulated fan => noisy.
Finally, the benchmarks and Celestia...?;-)
Just believe me, they are great, despite the fact that I presently lack the CPU power to reach the quoted frame rates. But if I plot 3DMark2001/03 frame rates as a function of CPU power from their archive, I fit in perfectly (a little sketch of that kind of check is given below).
Under Linux, glxgears moved up from 2400 to 10500 fps, for example...
If there is interest in frame rates, I can certainly post some tests.
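For anyone who wants to play with the same kind of scaling check, here is a minimal Python sketch of the idea; the (CPU clock, fps) pairs are purely made-up placeholders and would have to be replaced by real values from the 3DMark archive:
Code: Select all
import numpy as np

# Hypothetical (CPU clock in GHz, 3DMark fps) pairs standing in for the
# archive data -- these numbers are placeholders, not real measurements.
clock_ghz = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
fps       = np.array([40.0, 62.0, 85.0, 105.0, 128.0])

# Fit a straight line fps(clock) and see where a given system should fall.
slope, intercept = np.polyfit(clock_ghz, fps, 1)
my_clock = 1.0                                  # e.g. a PIII at 1 GHz
print(f"expected fps at {my_clock} GHz: {slope * my_clock + intercept:.0f}")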
I wrote a little while ago that I shall never return to non-tiled textures ;-). Well, that statement only referred to my 32MB GF2 GTS. With 256MB of DDR RAM I can now load 32k textures easily and get overall much smoother performance. Chris will soon tune the tile loading further, but at present, with my fairly 'slow' CPU and 2k tiles, single-texture performance seems to be better with this card due to its huge storage capability.
Presently, Celestia runs silky smooth on a high-quality 21" analog monitor, full screen at 1600x1200 px and 32-bit color...
WOW
Bye Fridger
Last edited by t00fri on 26.08.2003, 20:49, edited 3 times in total.
-
- Posts: 69
- Joined: 13.01.2003
- With us: 21 years 10 months
- Location: Osijek, Croatia
Finally someone from nVidia decided to produce a 256 MB version of the card...
Congratulations, I hope you're gonna enjoy Celestia more than any of us... for some time, of course.
I'm very curious about the extra large textures.
When you make screenshots of 32k textures combined with other textures, post them here so we can all enjoy the view, even in a screenshot...
Bye!
Hi Fridger,
You sound like one very happy customer!
Now you have all this power in your hands I hope you don't forget the less fortunate of us who still have our steam powered cards!:)
It would be a real pity to see some of your excellent work and know it won't run on lower-end machines.
Have fun and thanks again for all the fine work you've done.
Regards,
-
- Posts: 1510
- Joined: 07.09.2002
- Age: 59
- With us: 22 years 2 months
- Location: Albany, Oregon
jamarsa,
The GeForce4 Ti series of cards do have hardware anti-aliasing built in. Even the lowly GeForceMX cards have hardware anti-aliasing. Just thought I would point that out.
Don.
I am officially a retired member.
I might answer a PM or a post if it's relevant to something.
Ah, never say never!!
Past texture releases, Hmm let me think about it
Thanks for your understanding.
Don, thanks for your information. Somehow I picked up the notion that 8x antialiasing was only available on FX cards from this post:
http://www.shatters.net/forum/viewtopic.php?t=2839
Looking at it again, it seems that chris talked about 'decent' performance of antialiasing rather than a 'lack' of it.
Anyway, the prices of the Ti4200 and the FX-5600 are very close on the site I looked at, but the FX has double the memory. The Ti4200, however, is listed with slightly better performance in some areas in the nVidia charts.
I'm not going to buy anything until late fall (my skiing apartment is top priority in my budget right now), but I like to think about these 'capricious' purchases thoroughly. I think I'll go to the FX side.
Thanks for sharing Fridger.
Rarely has a piece of software made me think about spending money to upgrade my computer. Presently I run an NVidia GeForce2 MX (64MB) on top of an AMD Athlon 1.25GHz, with a very suspicious Via VT8363 clone motherboard and 512MB SDRAM. Celestia operates OK, but it can't avoid pauses when travelling to an 8k-textured celestial body. I don't even dare to download and install 16k textures.
I was already planning to drop the AMD for a P4 and that ugly motherboard of mine for a DDR-capable one, for different reasons. But quite frankly, a graphics card was the least of my concerns. It performs OK in current games and I'm not that picky.
But Celestia? Wow!... With Celestia my whole upgrade concept changed. I have to get a better card. I want to see it in smooth, full technicolor.
The $500 price tag is a little too much, though. I may stay with a GeForce4 Ti 4200 or 4600 (stretching it already).
Mario
-
- Posts: 862
- Joined: 07.04.2003
- With us: 21 years 7 months
- Location: Born in Argentina
- Contact:
I think I'm having trouble with my GeForce2. I think the problem is the cooler, but my parents don't know it. I won't do anything and will wait; if it breaks, I'll have to buy another (and maybe better) one.
---------X---------
EL XENTENARIO
1905-2005
My page:
http://www.urielpelado.com.ar
My Gallery:
http://www.celestiaproject.net/gallery/view_al ... y-Universe
Thanks Fridger.
After reading that review, I think this will be the video card I'll be aiming for by the end of this year.
CPU- Intel Pentium Core 2 Quad ,2.40GHz
RAM- 2Gb 1066MHz DDR2
Motherboard- Gigabyte P35 DQ6
Video Card- Nvidia GeForce 8800 GTS + 640Mb
Hard Drives- 2 SATA Raptor 10000rpm 150GB
OS- Windows Vista Home Premium 32
I may have saved my dimes and nickels but I feel like I would be cutting off my arm to pay for that FX...
gimme another month or so and I'll have one
85 fps might just make it worth it
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh and they don't know how to spell it!
Re: My new NVIDIA FX-5900 Ultra/256 MB DDR super card!
Howdy Fridger,
t00fri wrote: The plan is, of course, to replace my present system (PIII 1 GHz/512MB CL2) very soon by a new 4 GHz Pentium 4 CPU environment with 2GB fast memory.
Wow, sounds like you're jumping from the stone age into the space age all in one big jump!
Hope it all goes well.
By the way, the graphics card sounds nice too!
You probably already know this, but if you want to retain any semblance of quiet in your home/office, when you get the 4 GHz CPU, get a case that has a large ducted fan with its speed controlled by CPU temperature. You'll be glad you did. I went from a six-fan case (1.3 GHz AMD CPU) to a single large-fan, ducted case (3.06 GHz P4) and LOVE IT! Good luck!
-Don G.
-
Topic author: t00fri
- Developer
- Posts: 8772
- Joined: 29.03.2002
- Age: 22
- With us: 22 years 7 months
- Location: Hamburg, Germany
Re: My new NVIDIA FX-5900 Ultra/256 MB DDR super card!
don wrote: Wow, sounds like you're jumping from the stone age into the space age all in one big jump!
Hope it all goes well.
By the way, the graphics card sounds nice too!
You probably already know this, but if you want to retain any semblance of quiet in your home/office, when you get the 4 GHz CPU, get a case that has a large ducted fan with its speed controlled by CPU temperature. You'll be glad you did. I went from a six-fan case (1.3 GHz AMD CPU) to a single large-fan, ducted case (3.06 GHz P4) and LOVE IT! Good luck!
-Don G.
Thanks for your advice, Don. I always buy good things /if/ I buy. In other words, I replace things at longer intervals than others might like to. Same with cars etc.
It has nothing to do with 'stone age jumps' ;-)
My present equipment used to be 'high-end' when I bought it. I have assembled my computers myself from components for more than 15 years, so I know a few tricks for getting them quiet ;-).
Thermodynamics is elementary physics ;-), and unfortunately many dealers do not know too much about it...
Six fans is in no way a powerful and quiet cooling solution, so you decided very well to reduce their number. The point is rather to place the fan(s) such that a steady, well-defined flow of cool air is generated that can carry away the heat emerging locally from your hardware... and one has to keep the fans' RPM as low as possible. 'Big and slow', in other words, as you emphasized.
Also, it is very wise to buy a new CPU or graphics GPU at the beginning of a new 'die size' vs 'clock cycle' chain rather than towards the end, when the clock rate, and hence the heat produced, has been pushed towards a maximum. That is e.g. why I bought the FX-5900 (new, reduced 0.13 µm process!) and not the FX-5800...
Bye Fridger
-
Topic author: t00fri
- Developer
- Posts: 8772
- Joined: 29.03.2002
- Age: 22
- With us: 22 years 7 months
- Location: Hamburg, Germany
Rassilon wrote: I may have saved my dimes and nickels but I feel like I would be cutting off my arm to pay for that FX...
gimmie another month or so and Ill have one
85 fps might just make it worth it
Hi Rass',
let me remind everyone, however, that this sort of card really needs a powerful CPU as a partner in order to reach its outstanding performance.
At present in Celestia, the fps (for my PIII/1 GHz CPU at least) are largely determined by the limiting magnitude I set for the star display rather than by the foreground texture of a planet, say! The former is largely determined by the CPU, while only the latter reflects the high power of the FX!
Bye Fridger
Re: My new NVIDIA FX-5900 Ultra/256 MB DDR super card!
Howdy Fridger,
t00fri wrote: In other words, I replace things at longer intervals than others might like to. Same with cars etc. It has nothing to do with 'stone age jumps'.
I understand. My wife and I are the same way, usually. Our new Dell PC is the first pre-configured PC we've bought since the 286 was introduced, which was our first IBM-compatible PC. Before that we had an Apple ][, bought in 1978. We decided to go with a Dell because these days, trying to properly match components has become a difficult task without a LOT of research, and pre-configured system prices have come way down from where they used to be.
Oh, and our ranch tractor is a 1959 Massey Ferguson, and one of our daily driver vehicles is a 1986 automobile. It can be much more cost effective to properly maintain what you already have than to replace it every other year like some folks do. However, our other daily driver is a beautiful 1998 Chevy 4X4 pickup truck.
t00fri wrote: Six fans is in no way a powerful and quiet cooling solution.
That's an understatement! It is the noisiest PC we've ever had. But it is a full tower chassis with a 450 watt power supply, every slot in the motherboard filled, and every 3-1/2" and 5.25" bay filled. So it generates a LOT of heat. It's now our "game machine" for flying and driving/racing games.
t00fri wrote: 'Big and slow', in other words, as you emphasized.
Yep, that's the ticket! Like I said, "You probably already know this", and you did.
t00fri wrote: Also, it is very wise to buy a new CPU or graphics GPU at the beginning of a new 'die size' vs 'clock cycle' chain rather than towards the end, when the clock rate, and hence the heat produced, has been pushed towards a maximum.
Ahhhhh, very wise indeed! I had not thought of this minor detail before. Good point Fridger.
-Don G.
Howdy Fridger,
t00fri wrote: let me remind everyone, however, that this sort of card really needs a powerful CPU as a partner in order to reach its outstanding performance.
Thank you for pointing this out Fridger. Yes, FPS is a "relative" number that depends on your CPU speed, your graphics card, what you want to display, how you want to display it (Chase, Track, etc.), and how fast the camera is moving, if at all. It's not just simply the graphics card.
In other words, with my 3.06 GHz CPU and ATI 9700 Pro graphics card, I can get well over 120 FPS in Celestia -- but with only minimal items being displayed, no object "holds", and the camera not moving.
-Don G.
-
Topic author: t00fri
- Developer
- Posts: 8772
- Joined: 29.03.2002
- Age: 22
- With us: 22 years 7 months
- Location: Hamburg, Germany
don wrote: Yes, FPS is a "relative" number that depends on your CPU speed, your graphics card, what you want to display, how you want to display it (Chase, Track, etc.), and how fast the camera is moving, if at all. It's not just simply the graphics card.
In other words, with my 3.06 GHz CPU and ATI 9700 Pro graphics card, I can get well over 120 FPS in Celestia -- but with only minimal items being displayed, no object "holds", and the camera not moving.
-Don G.
Don,
could you do me a little favour and run the following test in Celestia with your equipment:
-- select an area in the sky with only stars.
for the default Celestia (no add-ons)
-- turn off AutoMag and read off the fps for
* mag=0
* mag=6
* mag=12
settings, using the [ and ] keys for adjustment... and reading off the respective flash output.
Oh yes, and please turn off FSAA and AF and use 32-bit color for definiteness, in a window whose size I leave to your preference (but I need to know it).
Thanks &
Bye Fridger
t00fri wrote: Don,
could you do me a little favour and run the following test in Celestia with your equipment:
-- select an area in the sky with only stars.
for the default Celestia (no add-ons)
-- turn off AutoMag and read off the fps for
* mag=0
* mag=6
* mag=12
settings, using the [ and ] keys for adjustment... and reading off the respective flash output.
Oh yes, and please turn off FSAA and AF and use 32-bit color for definiteness, in a window whose size I leave to your preference (but I need to know it).
Thanks &
Bye Fridger
Howdy Fridger,
Sure, my pleasure. Celestia's FPS is going to depend on a LOT of settings, so I've tried to include everything I think might be relevant ...
Code: Select all
PC Setup:
* XP Pro SP-1, clean install, 532 MHz FSB (4x133), 3.06 GHz, 1 GB PC-1066 RDRAM,
L2 Cache: 512 KB ECC sync ATC, 3.4 GB virtual memory
* Viewsonic VX-2000 20" flat panel @ 1024x768x32 bit, 60 Hz
* AGP Version 2, 4X, 128 MB Aperture, side band and fast writes both enabled
* ATI Radeon 9700 Pro, dual head, 128 MB DDR SGRAM/SDRAM, using DV output
Celestia OpenGL Reports:
- Renderer: RADEON 9700 PRO x86/SSE2
- Version: 1.3.3842 WinXP Release
- Max simultaneous textures: 8
- Max texture size: 2048
* OpenGL settings:
- Main: Optimal Performance (middle setting)
- Anti-aliasing (2X to 6X): On, set to "Application Preference"
- Anisotropic Filtering (2X to 16X): On, set to 8X
- Texture Preference: High Quality
- Mipmap Detail Level: High Quality
- Wait for Vertical Sync: Off
- TRUFORM(tm): Always Off
* Soft Boot (Restart)
Systray Apps:
* Zone Alarm loaded then exited
* Norton Anti-Virus loaded and disabled
* Pop-Up Blocker loaded and exited
* Internal home network (Intel Pro/100M) present but disabled
Apps/Processes:
* File Explorer
* Celestia (1.3.1 pre9) add-ons: Black Hole, Voyager 1/2, std stars.dat
* 40 background processes, with a CPU usage of 0% to 1%
Celestia Setup:
* Display Mode: Windowed / Single View
* Render/Show: Stars and Planets
* Orbits/Labels: None
* Filter Stars Distance: 1,000,000 (default)
* Render/Locations: None
* Auto-Mag: Off
* Star Style: Fuzzy Points
* Ambient Light: None
* Render/Antialiasing: Off
* FOV: 25° 44' 45.0" (1.0x)
* No object selected
* Time: Real Time, UTC
I'm not sure what you mean by "FSAA and AF", but I hope it is covered in the above info.
Celestia Location:
cel://Freeflight/2003-08-31T18:29:20.91859?x=qPlR6K18HnLMDA&y=y7eY1kJvAQ&z=aJL64GtD8QAG&ow=0.939654&ox=-0.327777&oy=-0.026231&oz=-0.094473&fov=25.745911&ts=1.000000<d=0&rf=2051&lm=49152
Test Results:
Render
Magnitude Planets FPS
--------- ------- ----
0.99 ON 455
0.99 OFF 1089
5.99 ON 313
5.99 OFF 550
11.99 ON 151
11.99 OFF 194
It's interesting to note the BIG HIT the frame rate takes when render Planets is turned on, even though there are no planets visible on the display.
Switching the Star Style to "Points" results in nearly the same results, and switching to "Scaled Discs" reduces the FPS by only 2-3 frames.
Being a little more realistic: with Earth selected; 31,890.701 km away; in Follow mode; with rendering of Atmospheres, Clouds, Eclipse Shadows, Galaxies, Night Side Lights, Planets, Ring Shadows, and Stars; Ambient Light set to Low; all Systray apps running; DSL modem connected and on-line; Outlook Express running; here's what I get ...
Follow Earth...
Code: Select all
Render
Magnitude Planets FPS
--------- ------- ----
0.99 ON 139
0.99 OFF 980
5.99 ON 52
5.99 OFF 70
11.99 ON 46
11.99 OFF 59
On my system, Follow, Chase and Lock are absolute FPS killers. With the items above being rendered (Planets ON), at mag of 5.99, Lock Earth/Sol gives me 26 FPS, Chase gives me 11 FPS -- neither of which is even good enough to record an AVI file <sigh>.
When I turn Follow off (Esc), but the Earth is still on-screen and moving, the FPS jumps from 52 to 94 FPS at mag=5.99, which means Follow eats up 42 FPS on my system!
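A rough way to put that 42 FPS in perspective is to convert the frame rates into per-frame times; here is a quick, purely illustrative Python calculation using the mag=5.99, Planets-ON numbers above:
Code: Select all
# Convert the two frame rates into per-frame times to see the real cost
# of Follow mode (numbers from the mag=5.99, render-Planets-ON case).
fps_follow_on  = 52.0
fps_follow_off = 94.0

ms_on  = 1000.0 / fps_follow_on                 # ~19.2 ms per frame
ms_off = 1000.0 / fps_follow_off                # ~10.6 ms per frame
print(f"Follow mode costs about {ms_on - ms_off:.1f} ms per frame")   # ~8.6 ms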
I'm sure this is much more information than you wanted, but it gives you (and others) some varied scenarios and results to play with / compare to your own systems <smile>.
Have fun!
-Don G.
-
Topic author: t00fri
- Developer
- Posts: 8772
- Joined: 29.03.2002
- Age: 22
- With us: 22 years 7 months
- Location: Hamburg, Germany
don wrote: Test Results:
Code: Select all
Render
Magnitude Planets FPS
--------- ------- ----
0.99 ON 455
0.99 OFF 1089
5.99 ON 313
5.99 OFF 550
11.99 ON 151
11.99 OFF 194
Have fun!
-Don G.
Don, thanks a lot, indeed!
This was very useful, notably also knowing all your detailed settings. I have now repeated a very similar test on my system, and the results clearly reflect
the CPU clock ratio, Don/me = 3.06 (see below),
in the case where many stars have to be worked on by the CPU [mag=12]. In that case you are a little better off than the CPU clock ratio alone would suggest, since you have twice as much RAM... In the opposite extreme, the ratio is more favourable for me, ~2.x, since the CPUs have less work to do and my graphics card with its large storage helps here.
Code: Select all
Render
Magnitude Planets FPS Don/me
--------- ------- ---- -----------
0.99 ON 187 2.43
0.99 OFF 520 2.10
5.99 ON 111 2.82
5.99 OFF 202 2.72
11.99 ON 45 3.35
11.99 OFF 60 3.23
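For anyone who wants to reproduce the Don/me column, here is a minimal Python sketch (purely illustrative) using Don's numbers and mine from the two tables:
Code: Select all
# Reproduce the Don/me ratio column from the two sets of measured frame rates.
cases   = ["0.99 ON", "0.99 OFF", "5.99 ON", "5.99 OFF", "11.99 ON", "11.99 OFF"]
don_fps = [455, 1089, 313, 550, 151, 194]       # 3.06 GHz P4, Radeon 9700 Pro
my_fps  = [187,  520, 111, 202,  45,  60]       # 1 GHz PIII, FX-5900 Ultra

cpu_clock_ratio = 3.06 / 1.0
for case, d, m in zip(cases, don_fps, my_fps):
    print(f"{case:9s}  Don/me = {d / m:.2f}   (CPU clock ratio = {cpu_clock_ratio:.2f})")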
These important tests clearly underline that, for mag >= 6, a large part of Celestia's 'fps performance' is related to star calculations done by the CPU, while the graphics card's performance only comes 'on top of' this...
Thanks again &
Bye Fridger
t00fri wrote: ... the results clearly reflect the CPU clock ratio, Don/me = 3.06 (see below), in the case where many stars have to be worked on by the CPU [mag=12]. In that case you are a little better off than the CPU clock ratio alone would suggest, since you have twice as much RAM... In the opposite extreme, the ratio is more favourable for me, ~2.x, since the CPUs have less work to do and my graphics card with its large storage helps here.
What an amazing comparison! Thanks Fridger.
t00fri wrote: These important tests clearly underline that, for mag >= 6, a large part of Celestia's 'fps performance' is related to star calculations done by the CPU, while the graphics card's performance only comes 'on top of' this...
Excellent conclusion, and it proves the point. Based on these tests, you could literally pre-calculate what Celestia FPS you will get with your 4 GHz CPU, or pretty darn close (a rough back-of-the-envelope version is sketched below).
t00fri wrote: Thanks again
You're very welcome Fridger. Thanks for your testing and comparison too, providing us all with a definitive answer to the age-old question of "should I get a faster CPU or a faster graphics card?" -- which is BOTH! -- or, it depends on what you want to see.
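Just as a rough illustration of that back-of-the-envelope idea (assuming fps scales linearly with CPU clock in the CPU-bound regime, which is of course only an approximation):
Code: Select all
# Very rough extrapolation: in the CPU-bound case (mag=12, Planets ON)
# Fridger measured 45 fps on a 1 GHz PIII.  If fps scaled linearly with
# CPU clock, a 4 GHz CPU would give roughly:
current_fps, current_clock, target_clock = 45.0, 1.0, 4.0   # clocks in GHz
print(f"~{current_fps * target_clock / current_clock:.0f} fps")      # ~180 fps
# Architecture, memory and graphics-card headroom will of course shift this.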
Bye for now,
-Don G.