I posted this in the Celestia development forums, but I wonder whether it is perhaps just a misconception of mine, so I am asking it here.
Is it true that absolute magnitude should be proportional to a star's temperature; that is, are hotter stars always brighter?
Is it true that absolute magnitude should be proportional to a star's temperature; that is, are hotter stars always brighter?
These are two different questions. The first part (about proportionality) is asking whether absolute magnitude M is related to temperature T according to a relationship M = kT where k is a constant.
The second part asks whether there exists a relationship between M and T such that an increase in T results in a decrease in M (since magnitude decreases with increasing brightness). A proportional relationship with negative k would satisfy this, but so would many other, non-proportional relationships, e.g. an exponential relationship M = exp(k*T) with negative k. Monotonic and proportional mean different things!
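To make the distinction concrete (a small derivation, using the standard definition of bolometric magnitude together with the blackbody luminosity formula given below): since M = M_sun - 2.5*log10(L/L_sun) and L = 4*pi*r^2 * sigma*T^4, at fixed radius r this reduces to

M = const(r) - 10*log10(T)

so M does decrease monotonically as T rises, but it is linear in log T, not proportional to T itself.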
Since you are referring to absolute magnitude, I'll assume that by "brighter" you mean "having a greater total luminosity". Consider a red dwarf, a red giant and the Sun. The Sun is hotter and more luminous than the red dwarf, but the red giant is cooler and more luminous than the Sun. Temperature is not the only parameter affecting luminosity - the size of the star is also relevant!
Treating a star as a spherical blackbody with radius r and uniform temperature T, the luminosity L is given by
L = 4*pi*r^2 * sigma*T^4

where sigma is the Stefan-Boltzmann constant.
So hotter stars are not always brighter.
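A quick numerical sketch of this point. The solar values are the IAU nominal ones; the red-dwarf and red-giant radii and temperatures are illustrative round numbers chosen only to show the trend, not measurements of any particular star:

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26         # nominal solar luminosity, W (IAU 2015)
R_SUN = 6.957e8          # nominal solar radius, m (IAU 2015)
M_BOL_SUN = 4.74         # absolute bolometric magnitude of the Sun (IAU 2015)

def luminosity(radius_m, temp_k):
    """Luminosity of a spherical blackbody: L = 4*pi*r^2 * sigma*T^4."""
    return 4.0 * math.pi * radius_m**2 * SIGMA * temp_k**4

def abs_bol_mag(lum_w):
    """Absolute bolometric magnitude from total luminosity."""
    return M_BOL_SUN - 2.5 * math.log10(lum_w / L_SUN)

# Illustrative (radius, temperature) pairs for the three stars above.
stars = {
    "red dwarf": (0.2 * R_SUN, 3000.0),
    "Sun":       (1.0 * R_SUN, 5772.0),
    "red giant": (50.0 * R_SUN, 4000.0),
}

for name, (r, t) in stars.items():
    lum = luminosity(r, t)
    print(f"{name:9s}  T = {t:6.0f} K   L = {lum / L_SUN:9.3f} L_sun   "
          f"M_bol = {abs_bol_mag(lum):+6.2f}")
```

Running this, the cooler red giant comes out a few hundred times more luminous than the Sun, while the hotter-than-nothing red dwarf sits at a few thousandths of a solar luminosity - so temperature alone does not order the magnitudes.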
t00fri:
I hope you also took the bolometric corrections into account in your plot?
Absolute magnitudes M_V usually refer only to the visual band (V), since they have typically been computed, e.g., from the apparent visual magnitudes m_V. For the total luminosity of a star one has to integrate over the full spectral range. That is what the bolometric correction to the visual magnitude effectively provides, as a function of the star's luminosity class.
These are tabulated, e.g., in Celestia's star.cpp source file and are correctly taken into account there.
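For anyone unfamiliar with it: the correction is just an additive term, M_bol = M_V + BC, where BC depends on the star's spectrum. A minimal sketch with rough, illustrative BC values for main-sequence stars (these are not Celestia's actual tables, which live in star.cpp and are indexed more finely):

```python
# Bolometric correction: M_bol = M_V + BC.  The values below are rough,
# illustrative figures for main-sequence spectral types, for sketching only.
BC_BY_SPECTRAL_TYPE = {
    "O5": -4.4,   # very hot: most flux is emitted in the ultraviolet
    "B0": -3.0,
    "A0": -0.3,
    "G2": -0.08,  # roughly solar
    "M0": -1.2,
    "M5": -2.7,   # very cool: most flux is emitted in the infrared
}

def bolometric_magnitude(m_v: float, spectral_type: str) -> float:
    """Convert a visual absolute magnitude to a bolometric one."""
    return m_v + BC_BY_SPECTRAL_TYPE[spectral_type]

# The Sun: M_V ~ 4.83 -> M_bol ~ 4.75
print(bolometric_magnitude(4.83, "G2"))
```

Note how large BC gets at both temperature extremes: a V-band magnitude alone badly underestimates the total luminosity of the hottest and coolest stars.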
Bye Fridger
The thing is that the flux I was calculating is luminous, not radiometric; that is, not just surface area times the integral of the spectrum, but the integral of the spectrum multiplied by the CIE 1931 y-bar color matching function. I believe this is what causes the residual scatter. However, the degree of linearity is good enough for my purposes.
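To illustrate why that weighting scatters the relation: the ratio of y-bar-weighted to radiometric flux drifts with temperature. A sketch using Planck's law and a crude single-Gaussian stand-in for the CIE 1931 y-bar curve (the peak wavelength of 555 nm is standard; the width is an assumption chosen by eye, so use the real tabulated curve for anything quantitative):

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law), W sr^-1 m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0)

def ybar(wavelength_m):
    """Crude Gaussian stand-in for the CIE 1931 y-bar curve:
    peak 1.0 at 555 nm, 45 nm width (an assumption, not the real table)."""
    nm = wavelength_m * 1e9
    return math.exp(-0.5 * ((nm - 555.0) / 45.0) ** 2)

def fluxes(temp_k, lo_nm=360.0, hi_nm=830.0, steps=1000):
    """Radiometric vs. y-bar-weighted flux over the visible band,
    by midpoint-rule integration (arbitrary units)."""
    dlam = (hi_nm - lo_nm) * 1e-9 / steps
    radio = lumin = 0.0
    for i in range(steps):
        lam = (lo_nm + (i + 0.5) * (hi_nm - lo_nm) / steps) * 1e-9
        b = planck(lam, temp_k)
        radio += b * dlam
        lumin += b * ybar(lam) * dlam
    return radio, lumin

# The luminous/radiometric ratio varies with temperature, which is the
# temperature-dependent offset that scatters a luminous-flux-vs-T plot
# away from the purely radiometric relation.
for t in (3000.0, 5772.0, 10000.0, 20000.0):
    radio, lumin = fluxes(t)
    print(f"T = {t:7.0f} K   luminous/radiometric = {lumin / radio:.3f}")
```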