16,000 light year limit with 128 bit integers
Posted: 30.11.2005, 18:22
Celestia Coordinate Calculation Attempts Using 128 Bit Integers
Written by a GlobeMaker, not a developer, November 30, 2005
Boulder Creek, California, USA
Goal
Calculate the possible size of the Celestia Universe when a 128 bit integer defines a coordinate in one dimension.
Assumption
Assume that the smallest location coordinate increment is 10 nanometers. This feature size was estimated during a past experiment observing a .3ds molecule model in Celestia. The simulated molecule had a 10 nm radius, and the jittering of its position appeared to span 10 nm to 1000 nm. This jitter is believed to be caused by quantization of the position coordinates: the least significant bit flips between two adjacent representable positions. This 10 nm feature size assumption can be revised to a different size someday.
Find how many meters in a light year.
c = 3 x 10^8 meters per sec
n = number of seconds per year = 365 x 24 x 60 x 60 = 3.15 x 10^7, approximately 3.2 x 10^7 sec/year
1 light year in meters = c x n = 3 x 3.2 x 10^(8+7) = 9.6 x 10^15, rounded to 10^16 meters per light year
so 1 light year is about 10 trillion kilometers.
This result matches values I have memorized in the past.
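The arithmetic above can be checked in a few lines of Python, a quick sketch using the same rounded speed of light as the text:

```python
# Meters per light year from the speed of light and seconds per year.
c = 300_000_000               # speed of light in meters per second (rounded)
n = 365 * 24 * 60 * 60        # seconds per year = 31,536,000 ~ 3.2 x 10^7
meters_per_ly = c * n         # 9.4608 x 10^15, rounded to 10^16 in the text
print(meters_per_ly)          # 9460800000000000
```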
The Celestia 16,000 light year limit calculation.
Why is there a 16,000 light year limit in Celestia for star positions?
The radius is 16,000 light years, so the diameter is 32,000 light years.
How many bits are needed to represent 32,000 light years, using a 10 nanometer feature size?
Maybe the numeric representation is not simple binary. Maybe it is binary coded decimal (BCD) integers.
Goal
Find how many bits of a BCD encoded integer are needed for a 16,000 light year radius.
Each decimal digit uses 4 bits.
A unit length is assumed to be 10nm.
The maximum count M of unit lengths will be
32,000 light years / 10 nanometers = M
M = (3.2 x 10 ^ 4 light years) x ( 10^16 meters per light year ) / ( 10 ^ -8 meters )
M = 3.2 x 10 ^28 (no units)
This count needs 29 decimal digits, and each BCD digit uses 4 bits.
4 bits per digit x 29 digits = 116 bits
116 bits in BCD.
This is close to the 128 bits used to meet the goal of 16,000 light years radius.
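The unit count and BCD bit count above can be verified directly, since Python integers have arbitrary precision (a sketch using the rounded 10^16 meters per light year from the text):

```python
# Count of 10 nm unit lengths across the 32,000 light year diameter,
# then the BCD digits and bits needed to hold that count.
meters_per_ly = 10**16              # rounded meters per light year
diameter_m = 32_000 * meters_per_ly # 3.2 x 10^20 meters
M = diameter_m * 10**8              # dividing by 10 nm = 10^-8 m multiplies by 10^8
digits = len(str(M))                # decimal digits in M
bcd_bits = 4 * digits               # 4 bits per BCD digit
print(M)                            # 32000000000000000000000000000
print(digits, bcd_bits)             # 29 116
```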
If a simple binary encoding were used instead of BCD, the same 128 bits would span about 340 trillion light years of diameter, raising the 16,000 light year radius limit to roughly 170 trillion light years.
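The binary reach follows from 2^128 distinct 10 nm positions, again checkable with integer arithmetic:

```python
# How far a plain 128 bit binary integer would reach in 10 nm units.
units = 2**128                     # about 3.4 x 10^38 distinct positions
meters = units // 10**8            # each unit is 10 nm = 10^-8 m
span_ly = meters // 10**16         # rounded meters per light year
print(f"{span_ly:,} light years")  # 340,282,366,920,938 light years
```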