1 billion star ESA Gaia catalogue in Celestia?
But does it actually work?
Just because a program has been built with 64-bit compiler and linker options does not mean that it can handle large arrays. Someone has to carefully go through all of the code to make sure that the array definitions and subscript calculations actually do the right things.
Selden
Actually, I believe the issue here is the catalog number, not memory size.
The catalog number is a uint32, and the stars are accessed via a sequence of pointers, which are size agnostic.
There is nothing memory-wise to prevent the x64 version from loading and running with it.
Besides, of course, physical RAM limits.
What concerns me at the moment is the catalog number itself.
It is possible that Celestia uses the catalog number while implicitly assuming it is also the HIP#, which would break addons.
This is not something I have delved into yet.
I am currently downloading the DR2 formatted catalog of 97M stars.
My intent is to use the buildstar.pl script as a basis for converting DR2 to celestiaDB.
Two rounds: the first winnows the DR2 format down to the hip_main datafile format.
The second, assuming the first actually works, goes ahead and uses the existing Perl script.
If that fails for some reason, I won't be using Perl, since I am neither good with it nor do I like it.
Same goes for Python.
More than likely I will use Lua or PHP, perhaps Pascal.
Once I get it converted, I will try it and see what happens.
If the catalog number is implicitly the HIP#, then the celestia code will need to be adapted.
Once I have it converted, I will upload it so others can play with it as well.
Janus.
Janus,
If you can, I suggest writing the software in Lua 5.1, primarily because it's included in Celestia. That'd ensure that anyone with Celestia could, in principle, run your catalog translation software on any platform without having to install some other OS-specific compiler and whatever dependencies it might have.
Selden
@selden
A Lua interpreter is not included with Celestia; it is built into the program itself.
There is no way to run a Lua script standalone.
Besides, file access in Lua is horrid.
My plan is to convert the DR2 data into stars_97m.dat, then load it instead.
Doing the same thing as the 2M star catalog, only with 97M instead.
If it ever finishes downloading that is.
I will post when I am able to start making some progress.
Janus.
Janus,
Lua 5.1 programs (scripts) just need to be given the extension .CELX in order to run them from Celestia.
As you wrote, the Lua 5.1 interpreter is built into Celestia, which is why I suggested using it.
Personally, I write most of my one-offs in Fortran 77.
My understanding is that the full DR2 dataset is over 550 GB, so it would take quite a while to download. I haven't taken the time to investigate either it or the abbreviated 97 GB version. So far I've just been extracting and downloading very small subsets using the GAIA version of SQL.
Selden
Okay, I have finally gotten the 9.1G set, and extracted it.
Oh man, what a mess.
I am at the pseudo code stage, but here is what I have so far.
The log file in 48-adp-040000-0.5-0.125 is where you start; it gives you an idea of what you have.
Then look inside metadata.bin.
It starts with a big-endian uint32, bytes 00 00 24 E0, which is 0x24E0 ($24E0), or 9440 in base 10.
That is followed by 9440 individual 76-byte (0x4C) headers for the particle files.
Still working on the header.
The particle files, minus the opening header, are relatively easy.
If you can call a variable length structure easy.
Here is a sample of the parsing code used by the Gaia Sky project.
Warning 1: Sample is in java instead of a real language.
Warning 2: I HATE java, and I freely admit to being biased against it.
Not saying it doesn't work, just that I hate it.
Code: Select all
// Excerpt from Gaia Sky's catalog loader (Java).  data_in is a
// DataInputStream over a particle file; Star, Vector3d, Vector3 and
// GlobalConf are Gaia Sky's own classes.  Record layout:
// name_length, name, appmag, absmag, colorbv, r, g, b, a,
// ra[deg], dec[deg], dist[u], (double) x[u], (double) y[u],
// (double) z[u], mualpha[mas/yr], mudelta[mas/yr],
// radvel[km/s], pmx[u/yr], pmy[u/yr], pmz[u/yr], id, hip,
// tychoLength, tycho, sourceCatalog, pageid, type
int nameLength = data_in.readInt();
StringBuilder sb = new StringBuilder();
for (int i = 0; i < nameLength; i++) {
    sb.append(data_in.readChar());
}
String name = sb.toString();
float appmag = data_in.readFloat();
float absmag = data_in.readFloat();
float colorbv = data_in.readFloat();
float r = data_in.readFloat();
float g = data_in.readFloat();
float b = data_in.readFloat();
float a = data_in.readFloat();
float ra = data_in.readFloat();
float dec = data_in.readFloat();
float dist = data_in.readFloat();
double x = data_in.readDouble();
double y = data_in.readDouble();
double z = data_in.readDouble();
float mualpha = data_in.readFloat();
float mudelta = data_in.readFloat();
float radvel = data_in.readFloat();
float pmx = data_in.readFloat();
float pmy = data_in.readFloat();
float pmz = data_in.readFloat();
long id = data_in.readLong();
int hip = data_in.readInt();
int tychoLength = data_in.readInt();
sb = new StringBuilder();
for (int i = 0; i < tychoLength; i++) {
    sb.append(data_in.readChar());
}
String tycho = sb.toString();
byte source = data_in.readByte();
long pageId = data_in.readInt();
int type = data_in.readInt();
if (appmag < GlobalConf.data.LIMIT_MAG_LOAD) {
    Vector3d pos = new Vector3d(x, y, z);
    Vector3 pmSph = new Vector3(mualpha, mudelta, radvel);
    Vector3 pm = new Vector3(pmx, pmy, pmz);
    float[] cc;
    if (Float.isNaN(colorbv)) {
        colorbv = 0.62f;
        cc = new float[] { 1.0f, 0.95f, 0.91f, 1.0f };
    } else {
        cc = new float[] { r, g, b, a };
    }
    Star s = new Star(pos, pm, pmSph, appmag, absmag, colorbv, name, ra, dec, id, hip, tycho, source);
    // ... the excerpt ends here; the Star is then added to the loaded set
}
That record then needs to be translated to a hip_main-compatible form.
Sadly, it looks like either Perl or C/C++ is going to be the language of choice for this.
The former because I have a template to go by.
The latter for memory/performance reasons.
It may be necessary to modify the star class in Celestia to add separate hip, tycho & gaia index values.
I have not traced everything all the way through, but it appears so far that catalog index doubles as the hip#.
Not a huge change by itself, but one with a lot of ripples.
I will try to keep everyone up to date on my progress.
Janus.
Whoa. This is the crown jewel: 601 million stars, 56 GB data.
It is finally online.
How the hell can Gaia Sky handle this much data?
https://zah.uni-heidelberg.de/institutes/ari/gaia/outreach/gaiasky/downloads/#dr2catalogs
john71 wrote:Whoa. This is the crown jewel: 601 million stars, 56 GB data.
It is finally online.
How the hell can Gaia Sky handle this much data?
https://zah.uni-heidelberg.de/institutes/ari/gaia/outreach/gaiasky/downloads/#dr2catalogs
Gaia Sky's servers are Supercomputers. Done.
"The tomorrow we're trying to reach is not a tomorrow you had decided on!"
- Simon the Digger
"Nothing is impossible for me, as long as I'm determined to keep moving forward!"
"If other people aren't going to do it, I'm going to do it myself!"
- Me (Gurren)
Current major projects:
- Aur Cir
- Cel+
- Project Sisyphus
- Populating the Local Group
- A galaxy generator
601 million stars! Amazing! There are star "corridors" on the other side of the Milky Way!
Added after 9 hours 24 minutes:
Gurren Lagann wrote:Gaia Sky's servers are Supercomputers. Done.
I meant the Gaia Sky software!
It handles flawlessly more than 90GB of star data!!!!
No RAM issues, no performance problems.
Celestia should use a similar method.
By the way, Gaia Sky is open source software, so it is possible to integrate parts of its code into Celestia... that way Celestia would be able to use Gaia Sky data directly.
Added after 4 minutes 41 seconds:
Also in this Space Engine forum thread (http://forum.spaceengine.org/viewtopic.php?f=10&t=297&sid=39d63c7b7b8d0ab0db1efa8e8571e765&start=15) it is mentioned that 2.7 GB of data is sufficient to create a usable "stripped down" version of the Gaia stars.
john71, yes, this is an interesting idea. Perhaps this can only really be done by hiring programmers. I postponed the idea of collecting donations for Celestia, but I plan to return to it.
Admin of celestia.space
PC: Intel Core i7-8700 @ 3.20GHz, SSD, 16 Gb RAM, NVIDIA GeForce GTX 1080, Creative Sound Blaster ZxR. Windows 10 x64.
Phone: iPhone Xs 256 Gb. iOS 14.