List of features for next release of Celestia
-
Topic author: kikinho
- Posts: 330
- Joined: 18.09.2004
- With us: 20 years 2 months
- Location: Eden, a planet in Etheral Universe
List of features for next release of Celestia
When will we have a list of features for the next release of Celestia?
-
- Site Admin
- Posts: 4211
- Joined: 28.01.2002
- With us: 22 years 10 months
- Location: Seattle, Washington, USA
Re: List of features for next release of Celestia
kikinho wrote:When will we have a list of features for the next release of Celestia?
I don't know.
Here's the list of changes so far:
* New stars.dat format with HD catalog numbers and parallax errors omitted
* Added catalog cross indexes for HD and SAO catalogs
* Fixed bugs in parsing catalog numbers
* Added white dwarf temperatures and spectral types DA, DB, DC, DO, DQ, and DZ
* Added handling for partial spectral types where one or both of the subclass and luminosity class are unknown
* Permit extended star attributes in .stc files: mesh, texture, rotation elements, semiaxes
* Support multiple star systems of any complexity with orbits and barycenters
* Changed .stc loader so that a star definition with a duplicate catalog number replaces the previous definition
* Made catalog numbers optional in .stc files; stars can be defined without having to create fake catalog numbers
* Star colors now chosen based on temperature; either 'classic' or black body colors may be selected
* Bound % key to toggle between star color tables
* Fixed black holes: infinite radius bug gone, rendered just as a plain black sphere
* Created new command line tools for building binary star databases
* Removed limitation that two or more solar systems couldn't be shown simultaneously
* Show light from multiple nearby stars illuminating planets
* Implemented new GLSL render path; NVIDIA combiners and GeForceFX paths deprecated
* Display UTF-8 superscript digits in some star names
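Several of the .stc changes above fit together: barycenters, optional catalog numbers, and the new multiple-star-system support. As a rough sketch of what an .stc definition using them might look like (the names and orbital numbers below are illustrative only, and the exact field syntax may differ in detail from the final release):

```
# A barycenter plus two orbiting components, defined without
# fake catalog numbers -- illustrative values, not real astrometry.
Barycenter "Example AB"
{
	RA 14.66         # hours
	Dec -60.83       # degrees
	Distance 4.4     # light years
}

"Example A"
{
	OrbitBarycenter "Example AB"
	SpectralType "G2V"
	AbsMag 4.4

	EllipticalOrbit {
		Period        80     # years
		SemiMajorAxis 23     # AU
		Eccentricity  0.5
	}
}

"Example B"
{
	OrbitBarycenter "Example AB"
	SpectralType "K1V"
	AbsMag 5.7

	EllipticalOrbit {
		Period        80
		SemiMajorAxis 23
		Eccentricity  0.5
	}
}
```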
Anonymous wrote:Implemented new GLSL render path; NVIDIA combiners and GeForceFX paths deprecated
Does this mean that ATI card owners of sufficient power will get all the shader effects now?
The FX code isn't fully implemented (the only apparent benefit is with the ring shadows) and now it will be deprecated in favor of the GeForce 6?! It's not fair; I don't have the money to buy a next-generation video card now (GeForce 6 or Radeon X800). I bought the GeForce FX mainly because of Celestia, and now you are saying that my purchase was useless? Only in 6 months or even 1 year will I be able to buy a new video card...
Daniel,
I think you misread something.
"Deprecated" means "no more enhancements." It does not mean "deleted". See http://www.hyperdictionary.com/computing/deprecated
It's not that Nvidia cards won't be supported, it's just that he's not planning to enhance the old code that works only on Nvidia cards. I suspect the old code won't go away for quite some time.
The new shadow code that Chris is writing is more general, so it should work on more cards. The new shadow effects will work fine on Nvidia FX 5nnn cards.
The new code will not work on Nvidia GF4 Ti4nnn, MX cards and other old models, though.
s.
Selden
-
- Site Admin
- Posts: 4211
- Joined: 28.01.2002
- With us: 22 years 10 months
- Location: Seattle, Washington, USA
danielj wrote:Anonymous wrote:Implemented new GLSL render path; NVIDIA combiners and GeForceFX paths deprecated
Does this mean that ATI card owners of sufficient power will get all the shader effects now?
The FX code isn't fully implemented (the only apparent benefit is with the ring shadows) and now it will be deprecated in favor of the GeForce 6?! It's not fair; I don't have the money to buy a next-generation video card now (GeForce 6 or Radeon X800). I bought the GeForce FX mainly because of Celestia, and now you are saying that my purchase was useless? Only in 6 months or even 1 year will I be able to buy a new video card...
As Selden said, the new GLSL code will run fine on a GeForce FX. It will also work on ATI Radeon 9500 and above. Given this, there doesn't seem to be much reason to continue developing the GeForce FX-specific code. The NVIDIA combiners path--the one that doesn't use vertex shaders--is unnecessary now because all NVIDIA OpenGL drivers released in the last year have supported vertex programs, albeit through software emulation for older cards. I can't think of a good reason to keep it around.
--Chris
-
- Posts: 1386
- Joined: 06.06.2003
- With us: 21 years 5 months
-
- Site Admin
- Posts: 4211
- Joined: 28.01.2002
- With us: 22 years 10 months
- Location: Seattle, Washington, USA
Evil Dr Ganymede wrote:The new code will not work on Nvidia GF4 Ti4nnn, MX cards and other old models, though.
That's an arse; I have an Nvidia GF4 Ti4200.
So what's the ideal card going to be for Celestia? I used to use ATI but found they were too flaky and so switched to NVidia Geforce.
The ideal card depends on your budget. Here's what I'd recommend from NVIDIA:
$500: GeForce 6800 Ultra
$400: GeForce 6800 GT
$300: GeForce 6800
$200: GeForce 6600 GT
$150: GeForce 6600
< $150: GeForce FX 5700
The GeForce 6600 cards are great if you don't want to spend ridiculous amounts of money, but they're currently only available for PCI Express systems. An AGP version should be available soon, however.
--Chris
Wait, let me understand this!!!
Are you saying that if I download the next version of Celestia, my GeForce 4 TI 4200 card will not display the program and I will simply not see anything, or are you saying that it will still work fine but I will not be able to see some of the new features that are coming out?
Please clarify. If the worst is true, then I have to agree with Daniel and others that this becomes a real $&&*@##. We are working hard to boost the use of Celestia to the world community and introduce it to as many people as possible. I just applied for a $50,000 government grant to introduce Celestia into Astronomy education in high schools all around America. If new features render it useless on even one-year-old video cards, then far fewer people will be able to use it. As it is, I bought NVidia GeForce FX 5200 cards with 128 MB video RAM for my 30 school computers, just a year ago. Already, version 1.3.2 is maxing them out when using textures that displayed fine in 1.3.1.
I hope I am not going to have to request my high school to buy another 30 new Nvidia cards after just one year of using the old ones. I can tell you their answer now. It will be NO! That will force me to stop updating Celestia to the latest version. Frankly, that would suck!
Please ... please ... keep older card versions in mind when revising the code.
Frank
Frank,
No currently existing functionality will be lost.
New shading functionality (e.g. multiple light sources and the shadows they cast) uses the OpenGL extensions "GL_ARB_shading_language_100" and "GL_ARB_fragment_shader". The shading_language extension is available on the Ti4200, but fragment_shader isn't.
The new shading functionality works fine on FX5200 cards.
Last edited by selden on 22.10.2004, 11:17, edited 1 time in total.
Selden
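As an aside, whether an extension like "GL_ARB_fragment_shader" is present can be checked against the space-separated list returned by glGetString(GL_EXTENSIONS). A minimal sketch of such a check (my own illustration, not Celestia's actual code; note that a naive substring search would give false positives on longer extension names):

```cpp
#include <cstring>

// Whole-token search of a space-separated extension list such as the
// string returned by glGetString(GL_EXTENSIONS).  A plain strstr()
// would wrongly report "GL_ARB_fragment_shader" as present if only a
// longer name beginning with it were listed, hence the token walk.
bool hasExtension(const char* extList, const char* ext)
{
    const std::size_t len = std::strlen(ext);
    const char* p = extList;
    while ((p = std::strstr(p, ext)) != nullptr)
    {
        bool startOk = (p == extList) || (p[-1] == ' ');
        bool endOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startOk && endOk)
            return true;
        p += len;
    }
    return false;
}
```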
Selden:
WHEW! Thanks for the good news. I thought my use of Celestia in the classroom was in danger.
By the way, I am teaching 160 kids this year, and they are so captivated with using Celestia as a means of getting out of the classroom and into space that they are staying after school to continue cruising in it. THAT is unusual, believe me! They unanimously consider it ... "COOL!"
Frank
-
- Site Admin
- Posts: 4211
- Joined: 28.01.2002
- With us: 22 years 10 months
- Location: Seattle, Washington, USA
Anonymous wrote:Wait, let me understand this!!!
Are you saying that if I download the next version of Celestia, my GeForce 4 TI 4200 card will not display the program and I will simply not see anything, or are you saying that it will still work fine but I will not be able to see some of the new features that are coming out?
Stop worrying. Any graphics card that works with Celestia 1.3.2 will also work with future versions. I'm adding a new high quality rendering path that will work only on newer hardware, but the old modes will still be there and still be supported. In fact, I've even made some improvements to the old render paths so that they can show planets illuminated by multiple suns.
As it is, I bought NVidia GeForce FX 5200 cards with 128 MB video RAM for my 30 school computers, just a year ago. Already, version 1.3.2 is maxing them out when using textures that displayed fine in 1.3.1.
GeForce FX 5200 cards will be able to use the new rendering features, including improved shadows.
Do you have a specific case where 1.3.2 is slower than 1.3.1?
--Chris
Chris:
I appreciate your invitation to highlight my problem with 1.3.2 and FX 5200 cards. I have to set up the situation and get some screenshots to post, so I'll answer in more detail tomorrow. However, my school FX cards are now running poorly since I installed 1.3.2. Their frame rates have dropped to 1 or 2 fps in more than a few cases, whereas my frame rate for the same textures and identical scene on my GeForce4 TI 4200 at home is 28-30 fps. The school's computers are good ... Celeron 2.5 GHz CPUs, FX 5200 cards with 128 MB video RAM, the latest Nvidia drivers, 512 MB RAM, and only one other application running (MS Word). The cfg file and all texture, model and extras add-ons are identical to what I am using at home.
I have to drop vertex shading to Basic or Multitexture on my school computers to get the frame rates back up to a reasonable level, at least for the scenes that are giving me trouble. This never happened with 1.3.1, using the same computers and essentially identical add-ons. In fact, as a test, I loaded the identical add-ons and textures into my version of 1.3.1 at home and ran it on my home computer. The identical scene that causes a 1-2 fps rate at school produces 28 fps in 1.3.1 on my home computer. Of course, I have the TI 4200, but ....
Since I have changed add-ons a bit, I will also do some experimentation before I respond further tomorrow. I will also move this post to the BUGS topic section, if that is OK with you. I'll post it separately so as not to take up folks' time here.
Till tomorrow
Frank
-
- Site Admin
- Posts: 4211
- Joined: 28.01.2002
- With us: 22 years 10 months
- Location: Seattle, Washington, USA
fsgregs wrote:Chris:
I appreciate your invitation to highlight my problem with 1.3.2 and FX 5200 cards. I have to set up the situation and get some screenshots to post, so I'll answer in more detail tomorrow. However, my school FX cards are now running poorly since I installed 1.3.2. Their frame rates have dropped to 1 or 2 fps in more than a few cases, whereas my frame rate for the same textures and identical scene on my GeForce4 TI 4200 at home is 28-30 fps.
Frank,
Was XP SP2 installed on the school computers? Some people have reported that installing SP2 replaces their graphics drivers with in-the-box drivers that lack hardware OpenGL support. You should copy and paste the information from the OpenGL Info window on one of the school machines so I can see if it's using software OpenGL. There's absolutely no reason that you should see such a performance drop in moving from 1.3.1 to 1.3.2.
--Chris
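The software-OpenGL case Chris describes is usually easy to spot in the OpenGL Info window: Windows' built-in software renderer reports vendor "Microsoft Corporation" and renderer "GDI Generic". A small sketch of a check along those lines (my own illustration based on those known strings, not code from Celestia):

```cpp
#include <cstring>

// Heuristic: Windows' built-in software OpenGL 1.1 implementation
// reports vendor "Microsoft Corporation" and renderer "GDI Generic".
// The two arguments would come from glGetString(GL_VENDOR) and
// glGetString(GL_RENDERER); matching both strongly suggests that
// hardware acceleration is missing.
bool isSoftwareOpenGL(const char* vendor, const char* renderer)
{
    return std::strstr(vendor, "Microsoft") != nullptr &&
           std::strstr(renderer, "GDI Generic") != nullptr;
}
```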
I can help.
GeForce FX 5200
=> http://www.nvidia.com/content/drivers/drivers.asp
=> download
Version: 61.77
Release Date: July 27, 2004
http://download.nvidia.com/Windows/61.77/61.77_win2kxp_international.exe
Celestia 1.3.2: 30-60 fps
Windows 10, DirectX 12 version
Celestia 1.7.0, 64-bit
I have a general disability of 80%, and it takes a lot of effort for me to take part in the community and express myself; thank you for being understanding.