
How can Celestia handle 20GB Gaia star data?

Posted: 11.08.2016, 07:23
by john71
There was an interesting topic on the Space Engine forum (I copied it down below). It is about handling very large data sets in Space Engine.

Is it possible to develop something similar in Celestia?

How much work is it?

Can it be borrowed from Space Engine? :wink:

***

http://en.spaceengine.org/forum/8-332-1

Transferring data to the user can work like texture streaming in Google Earth. The database is organized as an octree, which allows it to handle hundreds of billions of stars and to interactively choose the visible ones, as I've done in SpaceEngine.

The galaxy is subdivided into 1000 pc blocks. Each is subdivided into 8 children of 500 pc each, each of those into 8 children of 250 pc each, and so on. SpaceEngine has 10 levels of octree. Each octree block has a star array associated with it. The magic is in the range of absolute magnitude for each octree level: level 0 (the 1000 pc blocks) handles stars brighter than -5m abs mag, level 1 handles -5...0m stars, level 2 handles 0...+2.5m, and so on. The step in magnitude is chosen so that each block contains approximately the same number of stars.
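The level scheme described above can be sketched in a few lines. This is an illustrative mock-up, not SpaceEngine code: the post only gives the band edges for levels 0-2, so the edges beyond that are made-up placeholders, and all names are mine.

```python
import math

# Absolute-magnitude band edges per octree level, as described in the post:
# level 0 holds stars brighter than -5m, level 1 holds -5...0m, level 2 holds
# 0...+2.5m. Edges beyond level 2 are HYPOTHETICAL placeholders for a 10-level
# tree; the post only says the step keeps the star count per block roughly even.
LEVEL_MAG_UPPER = [-5.0, 0.0, 2.5, 4.5, 6.0, 7.5, 9.0, 11.0, 14.0, math.inf]

def octree_level_for(abs_mag):
    """Return the octree level whose magnitude band contains abs_mag."""
    for level, upper in enumerate(LEVEL_MAG_UPPER):
        if abs_mag < upper:
            return level
    return len(LEVEL_MAG_UPPER) - 1

def block_size_pc(level, root_size_pc=1000.0):
    """Block side length at a given level: 1000 pc at the root, halved per level."""
    return root_size_pc / (2 ** level)
```

So a star of absolute magnitude -6 lands in a 1000 pc level-0 block, while a +1 star lands in a 250 pc level-2 block: brighter stars live in coarser blocks, since they are visible from farther away.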

Rendering (or querying for download) is based on a limiting visual magnitude. For example, with a limiting visual magnitude of +6m, the engine traverses the octree and chooses the blocks whose brightest star has a visual magnitude less than +6m. At limiting magnitude +6m and an infinite culling radius (sky render mode), the engine needs ~6000 stars in the Sun's neighborhood; at +7m, ~50000 stars. This is just several tens of blocks, which can be downloaded (or generated) very quickly. The octree with magnitude culling avoids querying distant dwarf stars that are already invisible. Once blocks are downloaded from the server, they may be saved in a disk cache. (For procedural stars, SpaceEngine doesn't use a disk cache, because generating blocks is much faster than loading them from disk.) Of course this system makes no sense for the HIPPARCOS catalog with only 120k stars, but for Gaia it will be necessary (however, Gaia will be released in 2018-2019, and our computers may have terabytes of RAM by then :) )
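The magnitude-culled traversal described above can be sketched as follows. This is a minimal mock-up, not SpaceEngine's actual code: it assumes the standard distance-modulus relation m = M + 5·log10(d / 10 pc), and the `Block` structure and all names are illustrative.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Block:
    center: tuple              # block centre in pc
    half_size: float           # half the side length in pc
    brightest_abs_mag: float   # abs mag of the brightest star stored here
    children: list = field(default_factory=list)

def nearest_distance_pc(block, observer):
    """Distance from the observer to the nearest point of the block's cube."""
    return math.sqrt(sum(
        max(abs(o - c) - block.half_size, 0.0) ** 2
        for o, c in zip(observer, block.center)))

def apparent_mag(abs_mag, distance_pc):
    """m = M + 5*log10(d / 10 pc); tiny distances clamped to avoid log(0)."""
    return abs_mag + 5.0 * math.log10(max(distance_pc, 1e-6) / 10.0)

def visible_blocks(block, observer, limit_mag, out=None):
    """Collect blocks whose brightest star could be brighter than limit_mag."""
    if out is None:
        out = []
    d = nearest_distance_pc(block, observer)
    if apparent_mag(block.brightest_abs_mag, d) > limit_mag:
        return out  # even the brightest star is too faint: cull whole subtree
    out.append(block)
    for child in block.children:
        visible_blocks(child, observer, limit_mag, out)
    return out
```

The key property is the early return: if a block's brightest star is fainter than the limit even at the block's nearest point, every star in that subtree must be too, so whole branches of distant dwarf stars are skipped without ever being loaded.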

*

Posted: 11.08.2016, 09:49
by selden
That's how it's already done in Celestia.

Posted: 11.08.2016, 11:28
by john71
Thank you for your answer!

Whoa. Does that mean that if we rearrange the Gaia data in a specific way, Celestia can use it right away?

Posted: 11.08.2016, 12:14
by selden
Up to the maximum data space that Celestia can handle, yes.

Unfortunately, Celestia uses 32-bit addressing, so it can't exceed 4GB total under any circumstances. A substantial, careful rewrite would be needed for it to safely use 64-bit addressing. I'm sure it uses 32-bit integers for array subscripts in many places, for example.
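Some back-of-envelope arithmetic shows why the 4GB ceiling matters here. The record size below is an assumption for illustration: roughly 20 bytes per star, about what an entry in Celestia's binary stars.dat occupies.

```python
# ASSUMPTION: ~20 bytes per star, roughly the size of a Celestia stars.dat
# record (a 32-bit catalog number, three 32-bit position floats, and two
# 16-bit fields for magnitude and spectral type).
BYTES_PER_STAR = 20
GAIA_SOURCES = 1_000_000_000          # order of magnitude of the Gaia catalog

total_bytes = BYTES_PER_STAR * GAIA_SOURCES        # ~20 GB for the full catalog
max_stars_in_4gb = (4 * 2**30) // BYTES_PER_STAR   # what fits in a 4GB space
```

Even with such a compact record, a 4GB address space holds only a couple hundred million stars, a fraction of the Gaia catalog, which is why streaming blocks on demand (as in the octree scheme) matters more than raw capacity.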

Posted: 11.08.2016, 12:24
by john71
Could we somehow use five or six 4GB data chunks to simulate 20GB of 64-bit addressing?

Posted: 11.08.2016, 12:27
by Fenerit
As far as I know, there is a tiny program that patches Win32 executables to gain 2GB of extra virtual memory on 64-bit systems (4GB total instead of 2GB). It might help with handling not-so-huge files.

Posted: 11.08.2016, 12:30
by john71
As I understand it, the problem is not memory-related, but the sheer size of the Gaia database...