Linux version: As a "trip" to, e.g., Proxima Centauri illustrates, the background stars closest to Proxima appear brightest, as if they were illuminated by this powerful source of light.
This feature is physically incorrect. In fact, the visibility of background stars around close-up stars should be reduced due to the glare.
Fridger
close-up of stars and background stars
Glare and brightness
The background stars near a bright star appear brighter because the glare is simply being added linearly to their brightness.
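A minimal sketch of that mechanism (the function name and the 0..1 brightness scale are illustrative assumptions, not Celestia's actual code):

```python
# Hypothetical sketch: glare added linearly to a pixel's brightness,
# with displayable values in the range 0.0..1.0.
def render_brightness(star_brightness, glare_at_pixel):
    """Linear addition, clamped to the top of the displayable range."""
    return min(star_brightness + glare_at_pixel, 1.0)

faint_star = 0.05   # barely visible on its own
glare = 0.30        # halo contribution from a nearby bright star
result = render_brightness(faint_star, glare)  # roughly 0.35
```

Because the glare is simply summed in, the faint star ends up brighter than before rather than being washed out, which is exactly the unrealistic effect described above.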
I suppose it's open to question whether the glare is supposed to represent some artifact of the viewing system (like "lens flare") or the corona of the star. But in either case, it seems to me that it does make sense to add brightnesses.
The reason this unrealistically increases the stars' visibility rather than drowning them out is a more fundamental problem that has been mentioned elsewhere in these forums: it's impossible to represent the tremendous range of brightnesses of astronomical objects on a CRT (not that you'd want to, since you'd end up burning the user's eyes out half the time -- I've heard it said that most things in the universe are either too dim to see or too bright to look at).
Vision is logarithmic and adaptive. A faint object like a distant star will stand out against the dark night sky, but will become invisible if there is a bright object like a nearby sun in the foreground. The same thing applies to photographic and video equipment if it is properly operated. You don't see stars in photos of spacewalking astronauts, because the (usually sunlit) astronaut is so much brighter than the stars are-- the aperture and exposure are adjusted so that the stars disappear entirely, so as not to overexpose the subject.
Perhaps Celestia could have an optional mode to simulate this: the overall brightness of the scene would adjust itself to compensate for the brightest objects in the scene. That would go a long way toward conveying the tremendous differences between bright and dim objects. It would also help answer the people who think the moon landing photos are fake because there are no stars in the sky!
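One way such a mode could work, sketched here as an assumption (this is not an existing Celestia feature): pick a single exposure scale from the brightest object in view and apply it to everything, the way a camera stops down for a sunlit astronaut.

```python
# Hypothetical auto-exposure sketch: scale the whole scene so the brightest
# object maps to the top of the displayable range. Luminances are in
# arbitrary linear units.
def auto_expose(luminances, display_max=1.0):
    """Scale every luminance by one factor chosen from the brightest object."""
    peak = max(luminances)
    scale = display_max / peak
    return [lum * scale for lum in luminances]

# A sunlit astronaut vastly outshines a distant star:
exposed = auto_expose([1e5, 1e-2])
# The star lands at about 1e-7 of full brightness -- effectively invisible,
# just as in real spacewalk photographs.
```

Applying one global exposure like this is the simplest possible version; a real implementation would presumably adapt gradually over time, as both eyes and auto-exposure cameras do.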