NVIDIA's OpenGL 1.4

The place to discuss creating, porting and modifying Celestia's source code.
Topic author
Raul.
Posts: 40
Joined: 04.06.2002
With us: 22 years
Location: Oviedo, Spain

NVIDIA's OpenGL 1.4

Post #1by Raul. » 18.09.2002, 08:39

Yesterday I saw that NVIDIA released the Detonator 40.41 Beta drivers a few weeks ago. Apparently these drivers feature OpenGL 1.4 (along with a claimed performance increase of up to 25%). I decided to give them a try (on my GeForce1 DDR), but unfortunately Celestia doesn't work properly. All textures and polys are messed up (it's quite a funny show actually :twisted: ). Has anyone else tried Detonator 40.41 Beta?

http://www.nvidia.com/view.asp?IO=winxp-2k_40.41


I've been reading about all the new OpenGL 1.4 features, and some of them sound pretty cool:

http://www.opengl.org/developers/docume ... nGL14.html

selden
Developer
Posts: 10190
Joined: 04.09.2002
With us: 21 years 10 months
Location: NY, USA

Post #2by selden » 18.09.2002, 11:22

Unfortunately, the Detonator 40 libraries are very much "beta" quality. I've seen one report of them causing crashes, and I did a quick test of them myself.
When I saw cloud patterns that were the same shape as the continents, I decided to switch back to v30 until a debugged version is ready.
Selden

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 22 years 5 months
Location: Altair

Post #3by Rassilon » 18.09.2002, 14:29

Yeah, anything beta is a roll of the dice...
~
Depth Textures and Shadows - Depth textures define a new texture internal format, DEPTH, normally used to represent depth values. Applications include image-based shadow casting, displacement mapping, and image-based rendering. Image-based shadowing is enabled with a new texture application mode defined by the parameter TEXTURE_COMPARE_MODE.
~
I wonder if this will introduce a faster way to produce planetary landscapes... It will be nice to see the outcome...
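
For reference, the depth-texture piece of that boils down to only a few GL calls once a driver exposes 1.4. A minimal sketch, assuming OpenGL 1.4 headers (<GL/gl.h> plus <GL/glext.h>); the 512x512 size and copying the depth buffer from a light's-eye render pass are just illustrative, not anything Celestia currently does:

    // Create a depth texture and turn on the OpenGL 1.4 shadow-compare mode.
    GLuint shadowTex;
    glGenTextures(1, &shadowTex);
    glBindTexture(GL_TEXTURE_2D, shadowTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, 512, 512, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Texture lookups now return a 0/1 shadow comparison result
    // instead of a raw depth value.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);

    // After rendering the scene from the light's point of view,
    // grab the depth buffer into the texture:
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 512, 512);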
I'm trying to teach the cavemen how to play Scrabble; it's uphill work. The only word they know is Uhh and they don't know how to spell it!

chris
Site Admin
Posts: 4211
Joined: 28.01.2002
With us: 22 years 5 months
Location: Seattle, Washington, USA

NVIDIA's OpenGL 1.4

Post #4by chris » 18.09.2002, 17:11

Raul. wrote:Yesterday I saw that NVIDIA released the Detonator 40.41 Beta drivers a few weeks ago. Apparently these drivers feature OpenGL 1.4 (along with a claimed performance increase of up to 25%). I decided to give them a try (on my GeForce1 DDR), but unfortunately Celestia doesn't work properly. All textures and polys are messed up (it's quite a funny show actually :twisted: ). Has anyone else tried Detonator 40.41 Beta?
I've tried them on a GeForce4 MX system and they seem to work fine with Celestia 1.2.5pre3. I don't doubt there are still some problems with 40.41, though . . . Unfortunately, I work on the Direct3D drivers and not OpenGL, so I can't directly fix the problem :)

I've been reading about all the new OpenGL 1.4 features, and some of them sound pretty cool:

http://www.opengl.org/developers/docume ... nGL14.html

The best thing about OpenGL 1.4 is a standard vertex program extension . . . I'll be moving to this for Celestia 1.2.6, so ATI Radeon owners will finally be able to experience vertex shader effects too. Of course, I'll also be taking advantage of nVIDIA's new fragment shader extension for some truly mind-blowing effects (think real-time procedural planet textures) on the latest hardware.
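
(For reference, and not Celestia code: a minimal sketch of what the standard ARB_vertex_program path looks like, assuming the extension's entry points have already been resolved with wglGetProcAddress / glXGetProcAddressARB, and that <cstring>/<cstdio> are included for strlen/printf.)

    // A trivial ARB vertex program: transform the position by the
    // modelview-projection matrix and pass the vertex color through.
    static const char* vpSource =
        "!!ARBvp1.0\n"
        "TEMP pos;\n"
        "DP4 pos.x, state.matrix.mvp.row[0], vertex.position;\n"
        "DP4 pos.y, state.matrix.mvp.row[1], vertex.position;\n"
        "DP4 pos.z, state.matrix.mvp.row[2], vertex.position;\n"
        "DP4 pos.w, state.matrix.mvp.row[3], vertex.position;\n"
        "MOV result.position, pos;\n"
        "MOV result.color, vertex.color;\n"
        "END\n";

    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei) strlen(vpSource), vpSource);
    if (glGetError() != GL_NO_ERROR)
        printf("Vertex program error: %s\n",
               (const char*) glGetString(GL_PROGRAM_ERROR_STRING_ARB));
    glEnable(GL_VERTEX_PROGRAM_ARB);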

--Chris

abiogenesis
Posts: 104
Joined: 07.06.2002
With us: 22 years
Location: Redmond, WA

Post #5by abiogenesis » 18.09.2002, 20:43

Chris wrote:I'll also be taking advantage of nVIDIA's new fragment shader extension for some truly mind-blowing effects (think real-time procedural planet textures) on the latest hardware.


What does this mean exactly? Will it just be run-time created textures, or will we be able to do animations? I don't imagine that the average Celestia user will be able to write a Fragment Shader to implement, say, an animated cloud texture.

I was reading about the NV30's CinemaFX stuff. The article mentioned using a Shader to generate a texture as opposed to a simple map. The example given was wood grain. How would a shader be better than a mapped image in this case? It sounds exciting, but I just don't get it.

- a b i o g e n e s i s -

etrepum
Developer
Posts: 29
Joined: 29.05.2002
With us: 22 years 1 month
Location: New York, NY

NVIDIA's OpenGL 1.4

Post #6by etrepum » 19.09.2002, 04:06

chris wrote:The best thing about OpenGL 1.4 is a standard vertex program extension . . . I'll be moving to this for Celestia 1.2.6, so ATI Radeon owners will finally be able to experience vertex shader effects too. Of course, I'll also be taking advantage of nVIDIA's new fragment shader extension for some truly mind-blowing effects (think real-time procedural planet textures) on the latest hardware.

--Chris


Have you seen Apple's new OpenGL Shader Builder (in the dev kit for 10.2)? It parses ARB Vertex Program source code and displays the results in real time as you type... it's pretty slick. It also does ATI Fragment Shaders and some Nvidia stuff, but I haven't had a chance to play with that since I've been using my laptop (the OpenGL implementation in 10.2 has a decently optimized software ARB Vertex Program implementation if your hardware can't do it). It even spits out C source for you once you're happy with what you've got.

-bob

etrepum
Developer
Posts: 29
Joined: 29.05.2002
With us: 22 years 1 month
Location: New York, NY

Post #7by etrepum » 19.09.2002, 04:17

abiogenesis wrote:I was reading about the NV30's CinemaFX stuff. The article mentioned using a Shader to generate a texture as opposed to a simple map. The example given was wood grain. How would a shader be better than a mapped image in this case? It sounds exciting, but I just don't get it.


I think that they used shaders and/or a vertex program to do real-time fur in one of the newer demos, generating it on the fly and mapping it onto low-poly fins... I could be wrong, I didn't look very hard at it.

There are a few benefits to procedural textures, primarily having to do with memory. You don't need a whole lot of memory to generate textures on the fly, just GPU power. Imagine an extreme close-up of a planet: without procedural texturing you'd likely need a few gigantic bitmaps for all the maps, but with procedural texturing you could emulate those megabytes of prerendered images with maybe 10k of code, and it'd likely look a lot better.

Of course, it's not a whole lot of good for rendering planets we have good pictures of, but for objects like stars and the planets we don't have high-res photos of, it'd be great.
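
(A toy CPU-side illustration of that point only, not how an NV30 fragment shader would actually be written: a few octaves of value noise summed into one brightness value, which a shader could just as well evaluate per pixel on the fly. The function names are made up for the example.)

    #include <cmath>

    // Cheap, repeatable pseudo-random value for an integer lattice point
    // (the classic integer-hash noise).
    static float latticeNoise(int x, int y)
    {
        int n = x + y * 57;
        n = (n << 13) ^ n;
        return 1.0f - ((n * (n * n * 15731 + 789221) + 1376312589) & 0x7fffffff)
                       / 1073741824.0f;
    }

    // Bilinearly interpolated value noise.
    static float smoothNoise(float x, float y)
    {
        int ix = (int) std::floor(x), iy = (int) std::floor(y);
        float fx = x - ix, fy = y - iy;
        float a = latticeNoise(ix, iy),     b = latticeNoise(ix + 1, iy);
        float c = latticeNoise(ix, iy + 1), d = latticeNoise(ix + 1, iy + 1);
        float top    = a + fx * (b - a);
        float bottom = c + fx * (d - c);
        return top + fy * (bottom - top);
    }

    // Fractal sum ("fBm"): each octave adds finer detail at lower amplitude,
    // so zooming in keeps revealing structure without any stored bitmap.
    float fbm(float x, float y, int octaves)
    {
        float sum = 0.0f, amplitude = 1.0f, frequency = 1.0f;
        for (int i = 0; i < octaves; i++)
        {
            sum += amplitude * smoothNoise(x * frequency, y * frequency);
            amplitude *= 0.5f;
            frequency *= 2.0f;
        }
        return sum;
    }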

abiogenesis
Posts: 104
Joined: 07.06.2002
With us: 22 years
Location: Redmond, WA

Post #8by abiogenesis » 19.09.2002, 04:42

I guess that makes sense then. A wood grain texture of a huge wall would need a huge texture map, but the grain pattern is probably really simple to code up algorithmically. I can get that.
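
(Roughly what "simple to code up" means here, as a sketch: concentric rings taken from the distance to the log's axis, perturbed by a little noise so they aren't perfect circles. wobble() is just a stand-in; any smooth 2D noise, like the fbm() sketch above, would do.)

    #include <cmath>

    // Stand-in noise; a real implementation would use something smoother.
    static float wobble(float x, float y)
    {
        return std::sin(x * 12.9898f + y * 78.233f);
    }

    // Wood grain as a procedural texture: brightness at point (x, y).
    float woodGrain(float x, float y)
    {
        const float ringsPerUnit = 12.0f;
        float r = std::sqrt(x * x + y * y);       // distance from the log's axis
        r += 0.1f * wobble(4.0f * x, 4.0f * y);   // perturb so rings aren't perfect
        float ring = std::sin(r * ringsPerUnit * 6.2831853f);
        return 0.5f + 0.5f * ring;                // map to [0, 1] brightness
    }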

It might be hard to algorithmically generate the surface of a planet, though, unless maybe using fractals. Hmm... You might be able to pull off very dramatic LOD effects with a run-time fractal.

Still, to make the most of Celestia's texture-developing community, the shaders would have to be really simple to program. Or a simple, high-level interface could be built into Celestia itself.

- a b i o g e n e s i s -

JLP
Posts: 41
Joined: 31.01.2002
With us: 22 years 5 months
Location: Slovenia

NVIDIA's OpenGL 1.4

Post #9by JLP » 20.09.2002, 20:18

chris wrote:The best thing about OpenGL 1.4 is a standard vertex program extension . . . I'll be moving to this for Celestia 1.2.6, so ATI Radeon owners will finally be able to experience vertex shader effects too. Of course, I'll also be taking advantage of nVIDIA's new fragment shader extension for some truly mind-blowing effects (think real-time procedural planet textures) on the latest hardware.

--Chris


Finally, more support for ATI Radeon owners. What about using other stuff that Radeons can do? I think that Truform would be great for a lot of the round stuff in Celestia.
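
(Truform is exposed through the GL_ATI_pn_triangles extension; a minimal sketch of how it's usually switched on, assuming glPNTrianglesiATI has already been fetched with the usual GetProcAddress dance. Not something Celestia does today.)

    // Let the hardware tessellate curved surfaces from the existing triangles.
    glEnable(GL_PN_TRIANGLES_ATI);
    glPNTrianglesiATI(GL_PN_TRIANGLES_POINT_MODE_ATI,
                      GL_PN_TRIANGLES_POINT_MODE_CUBIC_ATI);
    glPNTrianglesiATI(GL_PN_TRIANGLES_NORMAL_MODE_ATI,
                      GL_PN_TRIANGLES_NORMAL_MODE_QUADRATIC_ATI);
    glPNTrianglesiATI(GL_PN_TRIANGLES_TESSELATION_LEVEL_ATI, 5);

    // ...draw the sphere geometry as usual...

    glDisable(GL_PN_TRIANGLES_ATI);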
Live long and prosper!

