Another question for you guys.
-
Topic author: Draconiator
I was looking at the aurora last night, and after a few minutes I ended up staring at the stars instead, heh. What I want to know is: does Earth get any radiant warmth from the stars at night, no matter how feeble? I think so... And also, why don't stars tend to damage your eyesight when you look at them? Does the radiation from them progressively get weaker, or what?
-
- Developer
- Posts: 1863
- Joined: 21.11.2002
- With us: 22 years
You get a tiny amount of radiant energy from the stars - a sky-full of stars delivers something like a microwatt per square metre.
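A rough order-of-magnitude check on that figure. The assumptions here are mine, not the post's: an integrated starlight illuminance of about 2×10⁻⁴ lux (a commonly quoted value for a moonless starlit sky) and the photopic conversion of 683 lumens per watt, which somewhat understates broadband starlight:

```python
# Sketch: convert a typical starlight illuminance into radiant flux.
# Both constants are assumptions, not figures from the post.
STARLIGHT_LUX = 2e-4        # quoted illuminance of a starlit sky (assumed)
LUMENS_PER_WATT = 683.0     # photopic peak luminous efficacy at 555 nm

flux_w_per_m2 = STARLIGHT_LUX / LUMENS_PER_WATT
print(f"{flux_w_per_m2:.1e} W/m^2")  # a few tenths of a microwatt per m^2
```

That lands within an order of magnitude of the "microwatt per square metre" figure above.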
The surface brightness of a nearby star doesn't get any less as you begin to move away from it - the star just gets smaller in the sky and so sheds less light. So you're right, it seems as if every star in the sky should be a tiny but eye-damagingly intense point.
What stops this happening is diffraction - light passing through an aperture is always slightly spread by its interaction with the edges of the aperture. So for the human eye, the size of our maximally dilated pupils determines how tightly we can focus the image of a star onto our retinas - it turns out to be about 1 minute of arc. So once a star gets far enough away for it to subtend just a minute of arc at your eye (and for the Sun that's only just beyond the distance of Neptune), its image inside your eye stops getting smaller and starts diminishing in surface brightness instead. So stars seen from interstellar distances are never intense enough to cause any damage.
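The "just beyond Neptune" claim is easy to sanity-check with the small-angle approximation; the 1-arcminute figure comes from the post, while the solar diameter and AU conversion are standard values I'm supplying:

```python
import math

# At what distance does the Sun subtend 1 arcminute?
SUN_DIAMETER_KM = 1.392e6   # standard solar diameter (assumed, not from post)
KM_PER_AU = 1.496e8         # one astronomical unit in km

one_arcmin = math.radians(1.0 / 60.0)       # ~2.9e-4 radians
distance_km = SUN_DIAMETER_KM / one_arcmin  # small-angle approximation
distance_au = distance_km / KM_PER_AU
print(f"{distance_au:.1f} AU")              # ~32 AU; Neptune orbits at ~30 AU
```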
Grant
-
selden wrote:I think you've left out the 1/r^2 effect.
Not really - it cancels itself out.
Let's say you double your distance from the Sun - the inverse square law says its brightness decreases fourfold. But it is now only half its previous diameter in the sky - so it covers only one fourth of the angular area it covered previously. One fourth the light is coming from one fourth the area, so the surface brightness is exactly the same. Since it's the energy per unit area directed to your retina that causes the damage, you still deliver a damaging intensity of radiant energy to your retina ... just over a smaller area. If these two inverse-square laws continued to offset each other like this all the way out to astronomical distances, then you'd find yourself frying individual molecular components of your retina with each speck of starlight.
What prevents that happening is the diffraction limit.
(I'm assuming that Draconiator was aware of the simple fact that stars get dimmer as you get farther away from them, and was asking a more subtle question.)
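The cancellation argument above can be sketched numerically: flux falls as 1/d², but so does the star's solid angle, so surface brightness (flux per unit solid angle) stays constant. The solar luminosity, radius, and the two sample distances are illustrative values I've chosen, not figures from the post:

```python
import math

def surface_brightness(luminosity_w, radius_m, distance_m):
    """Surface brightness of a star treated as a uniform disc (W/m^2/sr)."""
    flux = luminosity_w / (4 * math.pi * distance_m**2)  # falls as 1/d^2
    angular_radius = radius_m / distance_m               # small-angle approx.
    solid_angle = math.pi * angular_radius**2            # also falls as 1/d^2
    return flux / solid_angle

L_SUN, R_SUN = 3.828e26, 6.957e8                 # standard solar values (assumed)
b_near = surface_brightness(L_SUN, R_SUN, 1.496e11)      # at 1 AU
b_far = surface_brightness(L_SUN, R_SUN, 2 * 1.496e11)   # at 2 AU
print(b_near / b_far)                            # → 1.0: the 1/d^2 terms cancel
```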
Grant
-
Evil Dr Ganymede wrote:Am I just looking up at the wrong time or something? How long do they usually last?
Bit variable ... but you're talking over an hour for a decent display to build to a maximum and then dissipate. Keep an eye on the northern sky an hour or so either side of local (solar) midnight, but good displays can blow up at other times, too.
At least you're at a longitude that's more likely to see the aurora ... in Europe you need to be farther north to have the same chance of a view.
Grant
-
- Posts: 1386
- Joined: 06.06.2003
- With us: 21 years 6 months
Oddly enough, a lot of the photos that show up on spaceweather.com are taken around sunset...
I'm not sure if we're in the ideal place to see aurorae here though (at about 48 degrees N on Vancouver Island) - apparently we're quite a bit further from the magnetic pole than the Maritimes on the eastern seaboard. I'll have to check that...
-
Evil Dr Ganymede wrote:I'm not sure if we're in the ideal place to see aurorae here though (at about 48 degrees N on Vancouver Island) - apparently we're quite a bit further from the magnetic pole than the Maritimes on the eastern seaboard. I'll have to check that...
Yes - the geomagnetic pole is in the strait between Greenland and Canada, so the Maritimes are going to be closer, as you say.
Grant