Sphere collision detection....
Posted: 08.07.2005, 03:23
This question may or may not be answerable by anyone but Chris, but I was wondering whether this came up in Celestia in the early days of development. I know you use the radius of a sphere to do simple bounding-sphere collision, comparing it against the distance (x^2 + y^2 + z^2)^(1/2), but let's say that isn't accurate in this scenario. Would float or double precision play a role here, using the observer and the center of the sphere as the two points for that distance? I currently use double.
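In case it helps, here is roughly the distance check I'm doing (simplified for this post; Vec3d and the function name are just placeholders, not my actual code):

#include <cmath>

struct Vec3d { double x, y, z; };

// Distance from the observer to the sphere's surface, in double precision.
// A negative result would mean the observer is inside the sphere.
double distanceToSurface(const Vec3d& observer, const Vec3d& center, double radius)
{
    double dx = center.x - observer.x;
    double dy = center.y - observer.y;
    double dz = center.z - observer.z;
    double d  = std::sqrt(dx * dx + dy * dy + dz * dz);  // (x^2 + y^2 + z^2)^(1/2)
    return d - radius;
}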
To make a long story short: I fly into a planet the way I would in Celestia, and in some places the readout says I am 100 km from the surface while I am passing into the planet, and in other places it says I am 1 km from the surface when I am obviously far away. That leads me to believe the center of the sphere is not being plotted accurately; it must shift drastically from one side of the sphere to the other, which makes me suspect a precision problem.
In the program where I am doing all this, the observer is always at (0,0,0). I keep a global world coordinate as my location in space (the camera never moves) and move everything around me by subtracting my world coordinates from each object's true coordinates; a simplified sketch of that setup is below. I do not see the jittering effect that happens when you lose precision (like when you use float), not even up close. So my real question is: does anyone know what could cause this inaccuracy other than precision?
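Here is a simplified version of how I keep the camera at the origin (again, the names are just placeholders for this post, not my actual code):

struct Vec3d { double x, y, z; };

Vec3d observerWorldPos;  // my global world coordinate for where I am in space

// Everything gets shifted by the observer's position in double precision
// before rendering; the camera itself always stays at (0,0,0).
Vec3d toCameraRelative(const Vec3d& objectWorldPos)
{
    return Vec3d{ objectWorldPos.x - observerWorldPos.x,
                  objectWorldPos.y - observerWorldPos.y,
                  objectWorldPos.z - observerWorldPos.z };
}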
If it is precision, what does Celestia do to accurately compute the distance from the observer to a planet?