stars not rendered properly when changing magnitude filter

Posted: 29.06.2006, 15:47
by tec
I have noticed that several stars are not drawn when I adjust the magnitude filter with the [] keys, even though their apparent magnitudes are below the filter threshold. I am using points as the star style. Here is an example.

Render only stars and center on star HIP 1099. Set the star magnitude filter to 8.5 and you can see HIP 1099. Then use the [] keys to lower the magnitude threshold and watch for HIP 1099 to disappear. It vanishes when the filter reaches 7.9, even though the star has an apparent magnitude of 6.14. Why does it disappear?

Several other stars behave the same way: HIP 46768, 25708, 82898, 93543, and 37843.

This is a problem for me because I am reading the screen pixels and running the image through my star browser's pattern-matching algorithm. The algorithm blows up when I center on one of these stars, because the star is not there.

Thanks
Tim

Posted: 05.07.2006, 15:44
by chris
This sounds like it could be a bug in the octree algorithm that's responsible for culling stars that are either too faint to see or outside the field of view. Thanks for the detailed bug report--knowing specific stars that demonstrate the problem will help me debug this.

--Chris

Posted: 05.07.2006, 18:16
by t00fri
I agree with Chris. It's related to the octree parameters. We had plenty of such effects with galaxies, and I probably even know the place where it happens.

Bye Fridger

Getting the fixes through email

Posted: 06.07.2006, 12:55
by tec
Thanks Chris,

I have been trying to find this bug myself for a few days. My boss needs some charts/reports, but I need this issue fixed before I can write the report. Is it possible to get the changes sent to me when you find the bug? My email is tec@knology.net. I don't mind helping track down the issue, but I will need some guidance. I have stepped through it in the debugger, and I cannot find where the stars are culled.

Thanks
Tim