Celestia and Radeon 8500LE

General discussion about Celestia that doesn't fit into other forums.
Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

Celestia and Radeon 8500LE

Post #1by D.Edwards » 14.07.2002, 06:24

Help!
All you guys who know anything about ATI's Radeon cards: I just got an 8500LE 128MB DDR instead of another NVidia card because it's supposed to render OpenGL better. But I can't even get basic Celestia features working, and then the system locks up. I never had problems like this with NVidia cards. Am I missing something? Shouldn't ATI's installer load the OpenGL ICD for me? I am at a complete loss. I honestly wanted to give this card a try after being burned by my new NVidia GeForce4 440MX, which was screwed up from the factory. (Hint: don't buy an MSI-branded NVidia card. They come pre-overclocked, you can't change the clock rate, and they won't render 3D at all.) I will keep checking around for information, but I am ready to chuck the thing back to the store, give up, and just keep my old GeForce2 MX200 32MB.
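One quick way to tell whether ATI's ICD actually got installed is to create a GL context and read back the implementation strings. A minimal sketch, assuming a C++ compiler and GLUT are available (a generic diagnostic, not Celestia code):

Code:
// icd-check.cpp -- print which OpenGL implementation is driving the context.
// If GL_VENDOR reports "Microsoft Corporation" and GL_RENDERER reports
// "GDI Generic", the vendor ICD never loaded and you are running on the
// unaccelerated software fallback.
#include <GL/glut.h>
#include <cstdio>

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("icd-check");   // a GL context is current after this call

    const char* vendor   = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));
    const char* version  = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    std::printf("GL_VENDOR  : %s\nGL_RENDERER: %s\nGL_VERSION : %s\n",
                vendor, renderer, version);
    return 0;
}

On a correctly installed 8500LE the vendor string should mention ATI; the generic Microsoft renderer means the driver install is broken, and no amount of fiddling inside Celestia will help.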

Mikeydude750
Posts: 169
Joined: 31.01.2002
With us: 23 years 2 months
Location: Wisconsin

Celestia and Radeon 8500LE

Post #2by Mikeydude750 » 14.07.2002, 11:17

D.Edwards wrote:Help!
All you guys who know anything about ATI's Radeon cards: I just got an 8500LE 128MB DDR instead of another NVidia card because it's supposed to render OpenGL better. But I can't even get basic Celestia features working, and then the system locks up. I never had problems like this with NVidia cards. Am I missing something? Shouldn't ATI's installer load the OpenGL ICD for me? I am at a complete loss. I honestly wanted to give this card a try after being burned by my new NVidia GeForce4 440MX, which was screwed up from the factory. (Hint: don't buy an MSI-branded NVidia card. They come pre-overclocked, you can't change the clock rate, and they won't render 3D at all.) I will keep checking around for information, but I am ready to chuck the thing back to the store, give up, and just keep my old GeForce2 MX200 32MB.


There's only one solution right now: a Ti 4200, 4400, or 4600 (preferably the 4600), baby!

Celestia hates ATI's drivers, so it doesn't work very nicely.

Chris, once the ATI R300 comes out, you'd better add support for it. 512MB of video RAM, 3-4 (maybe even 5) times the power of the Ti 4600, full hardware support for displacement mapping (which will make Celestia exponentially cooler and better once most people have cards that can do it), faster memory, and a faster, more efficient GPU.

Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

Re: Radeon

Post #3by D.Edwards » 14.07.2002, 12:04

I finally came to that conclusion. I had bought a GeForce4 MX 440 64MB DDR, but the manufacturer had it overclocked way beyond NVidia's specs, so every time you tried to launch a 3D app it locked up the computer. I took it back and was going to get a different brand of NVidia card, but my brother and another guy talked me into this darn thing. Now get this: this card won't work with the motherboard chipset cooling fan running, and it won't work right if my Seagate Barracuda 7200 RPM hard drive is powered up. If either of these is running, the Radeon distorts like hell. I should have gone with my first instincts the other day and bought a Leadtek WinFast GeForce instead. I was going to buy the Ti4200, but the prices have been hiked up in my area, so they're not cost-effective for me. The GeForce4 MX440 will have to do, and it's still miles ahead of my lowly GeForce2 MX200 with only a 64-bit memory bus. Now I have a 90-mile round trip to Fry's to take this ATI card back. Oh well, we all live and learn, and I will be getting $10 back for the price difference.

Raul.
Posts: 40
Joined: 04.06.2002
With us: 22 years 10 months
Location: Oviedo, Spain

Post #4by Raul. » 14.07.2002, 12:45

I wouldn't buy a GeForce4 MX. Despite the nice 4 in the name, it's not a real GeForce4; it's just a faster GeForce2. It doesn't support the GF4 features (especially vertex and pixel shaders). I'd get a true GeForce3, or maybe a GeForce4 Ti4200. Even my old overclocked GeForce1 DDR kicks every MX's butt :P

Take a look at this article:

http://www17.tomshardware.com/graphic/0 ... index.html


And pay attention to the GF4 section:

http://www17.tomshardware.com/graphic/0 ... e4-06.html



Regarding the ATI problems with OpenGL: it's not Celestia's fault. ATI is well known for buggy drivers, and its OpenGL ICD is no exception. The best OpenGL implementation in consumer-level cards is NVIDIA's, PERIOD :twisted:

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 23 years 2 months
Location: Altair

Post #5by Rassilon » 14.07.2002, 13:51

Raul. wrote:Regarding the ATI problems with OpenGL: it's not Celestia's fault. ATI is well known for buggy drivers, and its OpenGL ICD is no exception. The best OpenGL implementation in consumer-level cards is NVIDIA's, PERIOD :twisted:


I agree 110%. Don't waste your money on anything ATI... Trust me, you'll be glad you didn't... I had an ATI 128 Pro and had NOTHING but problems with it... Also, steer 1 billion ly away from ANYTHING generic... Buy only the best... VisionTek Xtasy lol
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh, and they don't know how to spell it!

Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

RE: GeForce4 MX 440

Post #6by D.Edwards » 14.07.2002, 21:40

Actually, I have researched this quite thoroughly, and the only thing the MX440 is missing is one pixel shader and the shadow shader. It has both pixel and vertex shaders; it just doesn't have the second pixel shader that the Ti series does. If I could get a Ti4200 for the price they sold at a few weeks ago I would, but demand in my area has prices rising. Even the MX440's price has gone up. I may consider a GeForce3 if they have one priced low enough.
The main things I want working are specular lighting and antialiasing.
Later, when I have more money, that won't be an issue. I have a very limited budget at the moment but really need the upgrade. My GeForce2 MX200 is a bottleneck, and it has to go.

Raul.
Posts: 40
Joined: 04.06.2002
With us: 22 years 10 months
Location: Oviedo, Spain

RE: GeForce4 MX 440

Post #7by Raul. » 14.07.2002, 22:19

D.Edwards wrote:Actually, I have researched this quite thoroughly, and the only thing the MX440 is missing is one pixel shader and the shadow shader. It has both pixel and vertex shaders; it just doesn't have the second pixel shader that the Ti series does.


NO, NO, NO, you're wrong. The GF4 MX440 doesn't support pixel or vertex shaders at all. Take a look at the features directly from NVIDIA:

http://www.nvidia.com/view.asp?PAGE=pg_20020201403631


It's just a faster GF2 MX400 (270MHz vs 200MHz core speed, and 540Mpix/s vs 400Mpix/s fill rate). So if the graphics card is the bottleneck, it won't get much better with a GF4 MX440. You've been warned :lol:. As Johnny Rotten said right after he bought a GF4 MX440: "ever feel like you've been cheated?" :twisted: :twisted: :twisted:


Hope this helps :wink:

Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

RE: Gf 440MX

Post #8by D.Edwards » 14.07.2002, 23:15

Ah! :?
My lowly GeForce2 MX 200 with 32MB of SDRAM has primitive pixel and vertex shaders of a sort. I don't know where you are getting your info from, but if the GeForce4 MX440 is just a souped-up GeForce2, then that will be fine for me. I read the article. It has the old GeForce2 pixel and vertex rasterizers, and I know the GeForce2 has them because I am able to use specular lighting effects. I know the card is basically just a GeForce2 with the new antialiasing engine built in. But for me it has 64MB of DDR versus 32MB of SDRAM. It's a hell of a lot faster than my old card. It will make me happy for the moment. Take a look at the specs of my present card:

Specification:
NVIDIA GeForce2 MX GPU:
350MHz RAMDAC
2.8 GB/s memory bandwidth
20 million triangles/sec
Fill rate: 700 million texels/sec
Digital Vibrance Control
NVIDIA Shading Rasterizer (NSR)
High-Definition Video Processor (HDVP)
AGP 4X with Fast Writes support
32-bit color
32-bit Z/stencil buffer
Cube environment mapping
DirectX and S3 texture compression
Memory: 32MB on-board 128-bit SDRAM on a 64-bit memory bus

Now the specs for the MX 440

Specification:
NVIDIA GeForce4 MX440 GPU features:
Graphics core: 256-bit
Memory interface: 128-bit DDR
Memory bandwidth: 6.4 GB/sec
Fill rate: 1.1 billion texels/sec
Triangles/sec: 34 million
Effective memory clock rate: 400 MHz
RAMDAC: 350 MHz

Hardware features:
NVIDIA's latest nView technology
NVIDIA Lightspeed Memory Architecture (LMA) II
NVIDIA Accuview Antialiasing
NVIDIA Video Processing Engine (VPE)
Integrated dual 350MHz RAMDACs
Integrated TV encoder
2 dual-rendering pipelines
4 texels per clock cycle
Cube environment mapping
64MB high-speed 128-bit DDR RAM
High-performance 2D rendering engine
AGP 4X with Fast Writes
NVIDIA Shading Rasterizer
AGP 4X/2X support
Integrated hardware transform engine
Integrated hardware lighting engine
True-color hardware cursor
High-quality HDTV/DVD playback
True, reflective bump mapping
Multibuffering (double, triple, quad) for smooth animation and video playback
DirectX and S3TC texture compression support
OpenGL 1.3 ICD support
32-bit color with 32-bit Z/stencil buffer
Cooling solution: on-board active heat-sink cooling fan
AGP standard: AGP 2.0 slot support
TV-out: up to 1024x768 resolution

This doesn't sound like any GeForce2 card I ever saw for sale. The old card can still do specular lighting; the proof is in the pudding. I will get the card, do some screenshots, and we will see just how everything looks. Besides, this card is only a stopgap for me. Then I sell it to my brother and get the card I really want. We all have to take baby steps first, and this is mine. It's not like I have the fastest computer by a long shot. I just reached the 1GHz mark, and the graphics card is the bottleneck. As I said in previous posts, I had planned on buying the Ti4200, but the prices here shot up $50, and that put it out of range for me. I had a fixed $150 budget, and when I saw the card a month ago it was selling for that.
It was just the 64MB version, but that was fine. Now the card is $199, and that killed it for me. The Radeon was/is a very nice card, but as was said earlier, Chris hasn't built in support for ATI's rendering engine. So now I have only one choice. I will get the MX440, and if I can swing the extra cash in the next 30 days, I will take the card back to Fry's and get the Ti4200. I like to screw Fry's anyway. They have done it to me enough times. :D

Raul.
Posts: 40
Joined: 04.06.2002
With us: 22 years 10 months
Location: Oviedo, Spain

Post #9by Raul. » 15.07.2002, 08:30

That's exactly what I meant: it's basically a GF2 card (with minor features added) but lacks the cool GF3 features (pixel & vertex shaders). I don't have a GF3 myself, but I recall Celestia uses pixel & vertex shaders if available and looks much better. You should ask Chris about this.

Well, if you can screw Fry's later then go for it :twisted:

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 23 years 2 months
Location: Altair

RE: GeForce4 MX 440

Post #10by Rassilon » 15.07.2002, 16:36

Raul. wrote:As Johnny Rotten said right after he bought a GF4 MX440: "ever feel like you've been cheated?" :twisted: :twisted: :twisted:


You don't mean Dave Rotten of Repulse Records, do you? :mrgreen:
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh, and they don't know how to spell it!

Raul.
Posts: 40
Joined: 04.06.2002
With us: 22 years 10 months
Location: Oviedo, Spain

RE: GeForce4 MX 440

Post #11by Raul. » 15.07.2002, 20:49

Rassilon wrote:You don't mean Dave Rotten of Repulse Records, do you? :mrgreen:


Nope, it was just a bad joke inspired by Johnny Rotten's (the Sex Pistols' singer) words. I don't remember the place (San Francisco, maybe?), but the Sex Pistols were on their first (and last) US tour, and that day, after playing JUST ONE SONG, Rotten said "ever feel like you've been cheated?" and left the stage :twisted:.


As Neil Young would say, "The king is gone but he's not forgotten, this is the story of Johnny Rotten" :lol:

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 23 years 2 months
Location: Seattle, Washington, USA

Post #12by chris » 15.07.2002, 22:31

Celestia will work with a Radeon, though there are certain driver versions out there that have caused problems.

The GeForce4 MX does not have DX8-compliant pixel shaders. However, it does have some limited shading functionality through register combiners, and that is all Celestia currently uses. In the future, I may add some rendering features to Celestia that require DX8-level pixel shaders, but it's more likely I'll skip directly to DX9-level features (R300 and NV30), as there's much more programmability there (it's going to be amazing!)
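As a rough illustration of how a renderer of this era decides which shading path to take, here is a minimal sketch that probes the extension string for the register combiner extension. It is generic C++ with GLUT, not Celestia's actual code, and the printed path names are illustrative:

Code:
// path-check.cpp -- probe the extension string before choosing a
// shading path, the way an early-2000s OpenGL renderer would.
#include <GL/glut.h>
#include <cstdio>
#include <cstring>

// Anchored substring match: one extension name can be a prefix of
// another, so a bare strstr() is not enough.
static bool hasExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (all == nullptr)
        return false;
    const std::size_t len = std::strlen(name);
    for (const char* p = all; (p = std::strstr(p, name)) != nullptr; p += len)
    {
        const bool startsClean = (p == all) || (p[-1] == ' ');
        const bool endsClean   = (p[len] == ' ') || (p[len] == '\0');
        if (startsClean && endsClean)
            return true;
    }
    return false;
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("path-check");  // need a current context before glGetString

    if (hasExtension("GL_NV_register_combiners"))
        std::printf("Register combiner path available.\n");
    else
        std::printf("No combiners; fixed-function fallback only.\n");
    return 0;
}

The match is anchored on spaces because extension names are space-separated and a naive strstr() check can false-positive on a name that is a prefix of a longer one.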

The GeForce4 MX has software-emulated vertex shaders; cards with hardware vertex shaders will be noticeably faster for vertex shader effects. In my opinion, the best card to buy right now is the GeForce4 Ti 4200: reasonable price, full DX8 functionality, twin hardware vertex shaders. A 128MB version is better if you want to use large textures.

I doubt I'll support any features specific to the ATI R300. I do work for nVIDIA, after all, but more important is that Celestia is free software: I can't afford to go out and spend $400 of my own money per new graphics chipset to support all the new features. I will support anything that is standardized by the OpenGL ARB (standard vertex shaders are coming soon!), and I'll also support nVIDIA-specific extensions because that's the card brand I happen to use. Of course, anyone else is welcome to add support for ATI-specific features to Celestia. And in an ideal world, the OpenGL ARB will incorporate new functionality into the spec quickly enough that supporting vendor-specific extensions is no longer an issue at all.

Finally, a couple of corrections about the R300 . . . It's not going to be 3-4 times more powerful (whatever that means) than a Ti 4600; I'd expect more like a 50% framerate increase. Displacement mapping is useful in some cases, but probably not in Celestia until hardware gets faster--it will still be necessary to fake the appearance of detail with bump mapping instead.

D.Edwards: can you run OpenGL apps besides Celestia? Or do you have problems running anything?

--Chris

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 23 years 2 months
Location: Altair

RE: GeForce4 MX 440

Post #13by Rassilon » 16.07.2002, 01:58

Raul. wrote:Nope, it was just a bad joke inspired by Johnny Rotten's (the Sex Pistols' singer) words. I don't remember the place (San Francisco, maybe?), but the Sex Pistols were on their first (and last) US tour, and that day, after playing JUST ONE SONG, Rotten said "ever feel like you've been cheated?" and left the stage :twisted:.

As Neil Young would say, "The king is gone but he's not forgotten, this is the story of Johnny Rotten" :lol:


LoL, J. Rotten of PiL and of course the Pistols. How could I forget him ;)
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh, and they don't know how to spell it!

Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

RE: Radeon and GeForce4 MX 440

Post #14by D.Edwards » 16.07.2002, 06:27

Hi Chris and everyone else who wrote in on this topic.
The Radeon, when I finally got it to work, looked beautiful, but I was unable to get any kind of specular lighting in Celestia. I tried many driver versions to no avail. The Radeon also wasn't working properly on my new main motherboard. To get it to do anything at all, I had to install it on my USB-damaged Abit motherboard. It worked fine on that board but not on my replacement board of another brand with the same chipset. So the card went back to Fry's yesterday, and I picked up the WinFast GeForce4 MX440.
Guess what: it won't render 3D of any kind on my main board, but if it's installed in the Abit it works fine. I had read the specs, and it said it required a 300-watt power supply, so I headed over to my friend's little computer store here in town and tried everything with a new 300-watt power supply. No dice. We came to the conclusion that the motherboard isn't allowing the video card to pull the current needed for 3D rendering, and that's why it keeps locking up. As soon as we put the card in his benchtop system, it ran without a hitch.
So needless to say, I am taking the card back, since I can't use it, and am getting an Athlon 1800XP CPU and a motherboard that Fry's has on sale. This, of course, isn't going to help with the 3D rendering of Celestia or anything else, but I have had the worst luck ever with this recent hardware upgrade. Later this month, or early next month, I am going to get a GeForce4 Ti4200 or maybe a Ti4400. I will just wait and bide my time. At least I will be getting the system I originally wanted in the first place, and I can still work on textures for Celestia with my old tried-and-true GeForce2 MX200 32MB. I just wish it had 64MB instead. Hey, wait a minute: my friend at the computer shop just got a GeForce2 MX400 with 64MB of SDRAM in on a trade. Maybe I can talk him into letting me use it till I can get the Ti.
Wish Me Luck!!

Robert.G

Radeon 8500

Post #15by Robert.G » 01.08.2002, 19:54

I use a Radeon 8500/64MB with driver 4.13.01.9039 and have absolutely no problems! Celestia works just fine.

Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

RE: Radeon

Post #16by D.Edwards » 01.08.2002, 23:57

Are you getting speculars on Earth's oceans?
I never said the card didn't work; I just stated that it is unable to do speculars. I never even got as far as testing for bump mapping. I had enough input in one day to make my choice, and that was to go right back to my first choice: NVidia.
You didn't state which operating system you're using: Win9x, Win2k, or XP.
I use XP, and the drivers gave no specular support whatsoever. I tried several different driver versions with no luck. If you're happy with your card as is, that's fine by me, but I refuse to lose a feature I had on an older video card, and I will not ever go back to an older operating system.
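For anyone who wants to rule Celestia out of the equation, the driver's specular path can be exercised with a tiny standalone test. A minimal sketch, assuming a C++ compiler and GLUT (no Celestia code involved); if the highlight is missing here too, the driver is at fault rather than Celestia:

Code:
// specular-test.cpp -- draw a lit sphere with a strong white highlight
// using nothing but fixed-function OpenGL.
#include <GL/glut.h>

static void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // A bright white specular term with a tight exponent; the default
    // material has zero specular, so this is exactly what's under test.
    const GLfloat white[] = { 1.0f, 1.0f, 1.0f, 1.0f };
    glMaterialfv(GL_FRONT, GL_SPECULAR, white);
    glMaterialf(GL_FRONT, GL_SHININESS, 64.0f);

    glutSolidSphere(0.8, 48, 48);   // supplies its own normals
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("specular-test");

    glEnable(GL_DEPTH_TEST);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);            // default LIGHT0: white light from +Z

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

A working driver shows a bright hotspot near the center of the sphere; a flat gray ball means the specular term is being dropped somewhere between the API and the hardware.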

TVTExtreme4
Posts: 45
Joined: 30.01.2002
With us: 23 years 2 months

Radeon 8500

Post #17by TVTExtreme4 » 02.08.2002, 03:22

Robert.G wrote:I use a Radeon 8500/64MB with driver 4.13.01.9039 and have absolutely no problems! Celestia works just fine.


I have a Radeon 7200/64MB, Windows XP, a 750MHz AMD Duron, and 576MB of RAM, and it works great. I had a GeForce2 card before, but the Radeon works MUCH BETTER! :D (there... I told my OS, and no, I don't get specular ocean effects)
Last edited by TVTExtreme4 on 02.08.2002, 14:34, edited 5 times in total.

Topic author
D.Edwards
Posts: 136
Joined: 25.05.2002
With us: 22 years 10 months
Location: Oregon/USA

RE: Radeon

Post #18by D.Edwards » 02.08.2002, 05:33

Listen, you guys. You are not telling me what OS you're using. I may give the Radeon another chance, but I have to know what Windows version you're running. I use Windows XP exclusively. If you tell me your card is working just great but you're using Windows 98 or ME, that's not going to convince me to go back and try the card again. I need to hear from users of Windows XP, any version, that they are in fact getting specular lighting effects, and what version of the drivers they're using. Just coming into the forum and saying it works great for you is one thing, but Win9x and WinXP use different kernels and different drivers. So if the Win9x drivers give you speculars and the WinXP drivers don't, you can see where I'm coming from. In that case I'm going to stick to buying the NVidia Ti4200.
You have to come into this forum with at least a little bit of common sense when you're posting about an issue like this. You have to give all the information that is needed so the rest of us can make a well-thought-out decision. I know for a fact that NVidia cards can do specular lighting effects, but so far I have no evidence from anyone that any Radeon-based video card can do the same thing in Celestia at this point. Even Chris says that there are known issues with Radeon cards and Celestia. If anyone knows more about this topic and about what I am talking about, it's going to be Chris.
So let's try this again.
If anyone is having any luck getting specular lighting effects with any Radeon video card, please give all the pertinent information about your operating system version, driver version, and anything else you think might help anyone else in here.
Maybe taking a screenshot of Earth with speculars turned on might help convince us as well. Most of us hardcore Celestia users use NVidia cards, so it's up to you Radeon users to try and convince us otherwise.

