Time for an upgrade...
Posted: 15.03.2004, 19:18
by Rassilon
I suspect it's about time I start buying, piece by bloody piece, my new computer...I am this time gonna be as anal as humanly possible in order to get the highest quality products on the market...
First round of buying is a new video card...A lot of peeps around have recommended I grab the ATI 9800...Now, is there the same support in Celestia for this beast, or am I better off sticking with GeForce?
Also, I want to know what is the best brand for the buck in the GeForce community these days...I have an MSI right now and have had no issues to my knowledge...I've noticed a lot of bargains around for the FX, but that sets off some alarms in my head when I see an FX for under 200 bones...
What I am looking for is the best possible performance with roughly 256 megs of video RAM...I don't mind if it's 200-300 American, but when it's over 400 it tends to be a bit much...
Let me know what you all think...
Posted: 15.03.2004, 20:47
by selden
Ras',
The situation really has not changed much over the past couple of years.
If you want the best consumer graphics card for Celestia, then you should get a top-of-the-line Nvidia card: as big and fast an FX card as you can afford.
If you want the best card for games, then maybe you should get a top-of-the-line ATI Radeon card -- but remember that they really aren't that much faster than Nvidia's cards (~10%?), and their performance advantage is different for different games.
Unlike the Ti4xxx models, which only differed in their clock speeds, the FX cards differ in the types of hardware enhancements they provide. I dunno if they'll be providing explicit OpenGL access to those features, though. The least expensive model is the FX5200. 128MB versions are available for less than $100 (but be sure to get one that has a fan). The FX 5950 is about 2-3x as fast as the 5200, depending on the benchmark.
Chris has just started adding shadow improvements that can only be seen if you have an FX card. My understanding is that the proprietary NV OpenGL routine that he's using has an equivalent ARB version and he may be able to make a generic version available RSN.
I'm sure Don Edwards will want to comment on the quality of cooling you should be providing for whatever card you get.
Posted: 15.03.2004, 20:56
by Rassilon
For proper cooling please modify your respective refrigerator for waterproofing and insert case and be done with it already
Yeah, I will probably want to hear Don's input here soon...I'm thinking PNY Technologies might be a good start...
http://www.pny.com/products/verto/geFor ... 0ultra.asp
Posted: 15.03.2004, 21:06
by selden
My personal preference is Asus, but some think they "gold plate" things too much. They provide over-clocking and temperature monitoring software, too.
Posted: 15.03.2004, 23:33
by Paddywak
Hello Rassilon,
If you want it to run Celestia in IMAX, get yourself a Matrox Parhelia...
Sure, it won't be quite as fast as an ATI 9800 or even an Nvidia FX ... but those cards can't do three screens .... take a look ...
http://www.celestiaproject.net/forum/viewtopic ... 3e229ba775
Cheers,
Paddywak
Posted: 16.03.2004, 03:50
by Mikeydude750
I surrender to that
But seriously...if you play games in addition to using Celestia, the best bet would be the 9800 Pro, but if you use Celestia most of the time, get an nVidia card instead...
Posted: 16.03.2004, 04:15
by Don. Edwards
Hey Rass,
Well, as Selden predicted, I am here with some info.
The new video card I have, thanks to Frank, is an eVGA GeForce FX 5900 SE with 128MB of DDR VRAM. This is a fairly fast card. It is a full-fledged 5900 FX card, just slightly de-clocked for the budget minded among us. The card was $189.00 from Tiger Direct. It has a fairly good cooler on board. I have yet to see my card go above the 70 degree Centigrade mark, and it has a 140 degree threshold. There are other video card makers selling this model of card as well. The only real negative I can see is the VRAM size of only 128MB. But I have not seen the real need as of yet to move up to the 256MB level. I am sure that the added memory does allow for even larger or more textures to be loaded, but since I use .dds textures almost exclusively now, this is not so much an issue.
As for compatibility, Selden hit it on the mark. If you want 100% compatibility for Celestia then go for the FX series from NVidia. There is still only about a 90% compatibility level with an ATI card at this time. Things are getting better, but we must face the facts. Celestia and NVidia based cards go together like peanut-butter and jelly. ATI cards just can't do the job right. Now I know there will be a few ATI card users chime in saying that this feature or that works just fine on my ATI card. Well, that's fine, but Celestia was built up from the beginning with the GeForce family of cards in mind. It has been pure luck as of late that ATI has finally got their act together and implemented a more compatible version of their OpenGL ICD. They are headed in the right direction. But you can see that anyone using an ATI card is at the pure whim of ATI's driver writers. Some people have seen features that long failed on their ATI cards suddenly start working. This tells me it isn't Chris' or Celestia's fault, but ATI's, and the way their drivers make OpenGL calls to whatever program is using them. It also means that ATI has not been following proper procedures for creating their OpenGL ICD drivers.
In a perfect world, all video card and chipset manufacturers would be releasing OpenGL ICD drivers that were 100% compatible with the present OpenGL standards. This is not the case, and really never has been. Each and every chipset manufacturer has decided what features to support and how to do it. This of course means that these driver writers can do whatever they want. Also, let us not forget Microsoft's role in this. It was their intent from the dawning of DirectX to put the final nail in OpenGL's coffin. Well, they thankfully haven't succeeded, and it looks like OpenGL is here to stay for the long term.
OK back on track.
Rass, if you do some occasional gaming along with your work with Celestia like I do, then the GeForce FX series is for you. If you are a very hard core gamer, then the ATI 98xx series is for you, but you will lose some Celestia functionality. If you want to stay under the $200 limit, then this puts the 5900SE series cards in your reach. There is also a new FX 5500 series of cards, but they tend to run a little slower than the 5900 and may have a few things turned off in the chipset. A 5900 series card with 256MB of VRAM is probably going to shoot you way over the $300 mark. There really are not any cards that fall in between the 5900SE and, say, the 5950 series cards; the main differences are clock speed and more VRAM.
I hope this has helped you out and not caused even more confusion. You can always PM me for more details on the card I have. So far eVGA has been very good about a few things that were missing from my box when it was delivered. They had the missing articles to me within 7 days. I was very impressed with their speed. So I can give eVGA some very high marks for this.
If you have any other questions, then just ask in here or, as I said, PM me.
Don. Edwards
Posted: 16.03.2004, 05:56
by don
Howdy Rassilon,
I'm a happy multi-PC ATI card user (Radeon 9700 Pro - 128 MB and others). That is, I was before Celestia came along with its OpenGL requirements -- the first time I had ever used OpenGL.
I agree with everything in the above posts. Celestia is written for OpenGL and specifically for nVidia's OpenGL capabilities, and more recently, specifically for the FX series of nVidia cards. Special functions or features for other cards are secondary and come later.
ATI's mainstay has been DirectX. nVidia's mainstay has been OpenGL.
So, if you play Windows games, get an ATI. If you use Celestia more than games, get an nVidia. Simple as that. Then, find the right model for your budget.
Posted: 16.03.2004, 15:22
by maxim
Don. Edwards wrote:... go together like peanut-butter and jelly.
That's American taste, for sure.
maxim
Posted: 16.03.2004, 15:30
by TERRIER
maxim wrote:Don. Edwards wrote:... go together like peanut butter and jelly
That's what I was thinking
Posted: 16.03.2004, 15:40
by Rassilon
I see I have a lot of options...I might give the SE a whirl or just wait some time until the upper end cards come down a tad...I prefer going up in video memory over sticking with the same...
And now that I think about it using dds textures might solve my current crashing problems with emissives?
Problem is dds never really looks as good as uncompressed jpeg...
So eVGA is a good brand...Anyone have feedback on the PNY brand cards?
Posted: 16.03.2004, 16:37
by selden
Ras',
Chris posted in the thread about Saturn-related crashes that he thinks he's found and fixed the bug related to emissives. It should be fixed in pre7, to be available RSN. I'm guessing it's probably not related to the type of texture file, but I suppose it wouldn't hurt to try a change.
Posted: 16.03.2004, 17:14
by t00fri
Ras'
here is a little advice from a 'poor' FX 5900Ultra /256MB user (with a too slow CPU;-)). I must say that I value the 256MB very highly, notably with my slow CPU (PIII 1 GHz/512MB only) given my 'monster' texture projects.
I want to stress an issue that has been ignored so far in this discussion: 2d quality!
It is a fact that notably cheaper cards often produce miserable 2d quality at high resolutions and bandwidths like 1600x1200x85Hz. You might want to check for yourself for a moment how much time of the day you are staring at your 2d screen... it's quite a bit!
A main reason for problems in Europe, at least, is the so-called CE norm. It requires any electrical appliance to be shielded as to its emitted electro-magnetic radiation!
The consequence of this is that over here, at least, a number of card-producing firms use very cheap and low quality R-L-C filters at the output of their cards to cut down the electro-magnetic emission! This is often the main reason for bad 2d.
I have no idea whether all this is a real issue in the US, but it may be worth inquiring. My card (TERRATEC, Germany) uses more or less exactly the NVIDIA reference design and indeed has excellent 2d output quality! ATI cards also have a reputation for good 2d output.
You might have a look into 'Tom's hardware' tests online...
Bye Fridger
PS Say hello to 'the lady'!
Posted: 16.03.2004, 20:48
by Don. Edwards
Rass,
About PNY, that is the vendor of my ill-fated GeForce4 Ti 4600. I am not saying they build bad cards but, once bitten twice shy, as the saying goes.
Both PNY and eVGA do not stray far from the NVidia reference card design that Fridger mentioned. As for 2D quality, it was very good on my Ti 4600 and is every bit as good, possibly better, on the new FX 5900 SE.
Another thing to keep in mind is the advance in memory speeds that has occurred over the last year in video cards. My GeForce4 Ti 4600 had a memory bandwidth of about 10GB per sec, while my new card can do 23GB per sec. That's over double the memory bandwidth. This is a very important feature when we are using these big textures. It speeds load times up.
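For what it's worth, those bandwidth figures fall out of the memory clock and bus width. A rough sketch of the arithmetic (the effective clock values used here are assumed typical numbers for these cards, not taken from the post):

```python
def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective DDR clock (MHz) x bus width (bytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GeForce4 Ti 4600: ~650 MHz effective DDR on a 128-bit bus
ti4600 = mem_bandwidth_gb_s(650, 128)    # ~10.4 GB/s
# FX 5900 SE: ~700 MHz effective DDR on a 256-bit wide bus
fx5900se = mem_bandwidth_gb_s(700, 256)  # ~22.4 GB/s
print(f"Ti 4600: {ti4600:.1f} GB/s, FX 5900 SE: {fx5900se:.1f} GB/s")
```

The doubling comes mostly from the wider 256-bit bus; the clock bump only adds a little on top.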
As far as any big change with the next generation of video cards, it will be PCI Express. But that will mean an all new motherboard, power supply, case, etc., and it is still a few months away. AGP and PCI Express will co-exist for the foreseeable future. But if you are seriously planning on blowing some big money on a system, I would consider waiting to see what the newer cards from NVidia will do in both AGP and PCI Express, and then make the choice of a major platform change or a simple upgrade on the system you have.
I personally have bumped up against the end of the road for the Athlon XP series processors. I am at the 3200XP now and there is no replacement in the wings, so I have to make the same choice as to which way to go. At this point we are standing at the crossroads of the PC platform. All roads now lead to a totally new motherboard form factor and a totally new add-in card interface. We saw this back in the early 90's with the advent of regular PCI, and then again at the birth of AGP. But this is a total change from what we have known in the past. If you look around the web you can find all sorts of info on where this is all going.
Now as for the .dds over .jpg topic. There is some degradation of the textures using the .dds format, but if you use DXT3 or DXT5 you don't see this as much. Sure, the textures end up larger in size, but they are holding more visual data than DXT1. Try playing around with DXT1, DXT3 and DXT5 .dds textures. I think that if you do a little comparing, you can see a little difference between the versions and their comparative output. When I have a choice I always try to release a texture in the DXT3 format. But this isn't always possible when it comes to the 16k textures.
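As a rough illustration of the size trade-off described here: DXT1 stores 4 bits per pixel while DXT3 and DXT5 store 8, so the higher-quality formats come out exactly twice as large. The 16k x 8k dimensions below are just an assumed example for a planet-sized texture:

```python
def dxt_size_mb(width, height, bits_per_pixel):
    """Base-level size of a block-compressed texture in MB (mipmaps not included)."""
    return width * height * bits_per_pixel / 8 / 2**20

# A 16k texture (16384 x 8192), sizes per compression format:
for fmt, bpp in [("DXT1", 4), ("DXT3", 8), ("DXT5", 8), ("raw RGB", 24)]:
    print(f"{fmt}: {dxt_size_mb(16384, 8192, bpp):.0f} MB")
```

That puts a 16k DXT3 texture at 128 MB, which is why fitting one is tight on a 128MB card and DXT1 sometimes has to do.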
In my work on the HD 28185 system I have found that having multiple 4k textures on all the moons tends to bring even my system to a grinding halt. I decided that as each texture is finished and I plug it into the add-on, it gets converted to .dds. This allows me to view all my work and zoom around the system a lot faster. If I was using .jpg or .png textures, I wouldn't even be able to smoothly zoom in to check on the progress.
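A back-of-envelope sketch of why .dds helps with a system full of moons: a decoded .jpg or .png sits in video memory at 24 bits per pixel, while a DXT3 .dds stays compressed at 8. The moon count and texture dimensions here are assumed purely for illustration:

```python
def texture_mb(width, height, bits_per_pixel):
    """In-memory size of one texture in MB."""
    return width * height * bits_per_pixel / 8 / 2**20

moons = 8          # hypothetical number of textured moons in the system
w, h = 4096, 2048  # a typical 4k texture

uncompressed = moons * texture_mb(w, h, 24)  # decoded .jpg/.png as RGB
compressed = moons * texture_mb(w, h, 8)     # same textures kept as DXT3
print(f"{uncompressed:.0f} MB uncompressed vs {compressed:.0f} MB as DXT3")
```

On a 128MB card the uncompressed set alone would overflow VRAM and force texture swapping, which matches the "grinding halt" described above.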
I think that covers things for now.
Don. Edwards