Are hotter stars always brighter?

Posted: 30.01.2007, 15:02
by makc
I posted this in the Celestia development forums, but I wonder whether it might just be a misconception on my part, so I am asking it here as well.

Is it true that absolute magnitude is proportional to star temperature; that is, are hotter stars always brighter?

Posted: 30.01.2007, 15:15
by ajtribick
makc wrote:Is it true that absolute magnitude is proportional to star temperature; that is, are hotter stars always brighter?

These are two different questions. The first part (about proportionality) is asking whether absolute magnitude M is related to temperature T according to a relationship M = kT where k is a constant.

The second part is asking whether there exists a relationship between M and T such that an increase in T results in a decrease in M (since magnitude decreases with increasing brightness). While a proportional relationship with negative k would satisfy this, so do many other relationships that are not proportional, e.g. an exponential relationship M = exp(k*T) with negative k. Monotonic and proportional mean different things!
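
To make the distinction concrete, here is a minimal Python sketch (the value of k is arbitrary and purely illustrative) showing that M = exp(k*T) with negative k decreases monotonically in T even though M/T is not constant:

Code:
import math

k = -1e-3  # arbitrary negative constant, purely illustrative

for T in (3000, 6000, 10000, 20000):  # temperatures in K
    M = math.exp(k * T)  # decreases as T increases (monotonic)
    print(f"T = {T:6d} K   M = {M:.3e}   M/T = {M / T:.3e}")

# M/T varies from line to line, so M is not proportional to T,
# even though M always decreases as T increases.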

Since you are referring to absolute magnitude, I'll assume that by "brighter" you mean "having a greater total luminosity". Consider a red dwarf, a red giant and the Sun. The Sun is hotter and more luminous than the red dwarf, but the red giant is cooler and more luminous than the Sun. Temperature is not the only parameter affecting luminosity - the size of the star is also relevant!

Treating a star as a spherical blackbody with radius r and uniform temperature T, the luminosity L is given by

L = 4*pi*r^2 * sigma*T^4

where sigma is the Stefan-Boltzmann constant.
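
As a quick sanity check of the red dwarf / Sun / red giant comparison above, here is a short Python sketch applying this formula; the radii and temperatures are rough, illustrative ballpark values, not catalogue data:

Code:
import math

SIGMA = 5.670374e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
R_SUN = 6.957e8      # solar radius, m
L_SUN = 3.828e26     # solar luminosity, W

def luminosity(T, r):
    """Blackbody luminosity L = 4*pi*r^2 * sigma*T^4."""
    return 4.0 * math.pi * r**2 * SIGMA * T**4

# Rough illustrative parameters, not precise catalogue values:
stars = [
    ("red dwarf", 3000.0, 0.2 * R_SUN),
    ("Sun",       5772.0, 1.0 * R_SUN),
    ("red giant", 3500.0, 50.0 * R_SUN),
]

for name, T, r in stars:
    print(f"{name:9s}  T = {T:6.0f} K   L = {luminosity(T, r) / L_SUN:8.3f} L_sun")

# The red giant is cooler than the Sun yet hundreds of times more
# luminous, because L scales with r^2 as well as T^4.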

So hotter stars are not always brighter.

Posted: 30.01.2007, 17:34
by makc
Okay, thanks, that seems to make sense.

P.S. I changed my plot a bit, so now it looks more correct.

Posted: 31.01.2007, 09:53
by makc
I have fixed my code, so now the results are exactly as expected.

[image: plot]

Thank you again.

Posted: 31.01.2007, 10:51
by t00fri
makc wrote:I have fixed my code, so now the results are exactly as expected.

[image: plot]

Thank you again.


I hope you also took the bolometric corrections into account in your plot?

Absolute magnitudes M_V usually refer only to the visual band (V), e.g. when they have been computed from the apparent visual magnitudes m_V. For the total luminosity of a star, one has to integrate over the full spectral range. That is what the bolometric correction to the visual magnitude effectively provides, as a function of the star's luminosity class.
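
For illustration, a minimal Python sketch of this conversion (M_bol = M_V + BC, then total luminosity from the bolometric magnitude); the BC values in the example calls are hypothetical placeholders, not the table from Celestia's star.cpp:

Code:
M_BOL_SUN = 4.74  # conventional solar bolometric magnitude

def total_luminosity(M_V, BC):
    """Return L / L_sun from an absolute visual magnitude M_V
    and a bolometric correction BC: M_bol = M_V + BC."""
    M_bol = M_V + BC
    return 10.0 ** (0.4 * (M_BOL_SUN - M_bol))

# Hypothetical example values, not Celestia's actual tables:
print(total_luminosity(4.83, -0.07))  # roughly solar
print(total_luminosity(-0.30, -3.0))  # a hot star with a large BC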

These corrections are tabulated, for example, in Celestia's star.cpp source file and are correctly taken into account there.

Bye Fridger

Posted: 08.02.2007, 11:00
by makc
The thing is that the flux I was calculating is luminous, not radiometric; that is, not just the surface area times the integral of the spectrum, but the integral of the spectrum multiplied by the CIE 1931 y color matching function. I believe this is what causes the residual scatter. However, the degree of linearity is good enough for my purposes.
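
For reference, a rough Python sketch of this distinction, using a single Gaussian as a crude stand-in for the CIE 1931 y-bar color matching function; the Gaussian parameters are an assumption for illustration, and real code should use the tabulated CIE data:

Code:
import math

H = 6.626e-34    # Planck constant, J s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2.0 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * T)) - 1.0)

def y_bar(lam_nm):
    """Crude Gaussian stand-in for the CIE 1931 y-bar function,
    peaking near 555 nm; use tabulated CIE values in real code."""
    return math.exp(-0.5 * ((lam_nm - 555.0) / 45.0) ** 2)

def fluxes(T, lo_nm=380.0, hi_nm=780.0, steps=400):
    """Trapezoid-rule integrals over the visible band: the plain
    (radiometric) spectrum and the y-bar-weighted (luminous) one."""
    dlam = (hi_nm - lo_nm) / steps
    radiometric = luminous = 0.0
    for i in range(steps + 1):
        lam_nm = lo_nm + i * dlam
        w = 0.5 if i in (0, steps) else 1.0
        b = planck(lam_nm * 1e-9, T)
        radiometric += w * b * dlam
        luminous += w * b * y_bar(lam_nm) * dlam
    return radiometric, luminous

for T in (3000, 6000, 10000):
    r, l = fluxes(T)
    print(f"T = {T:5d} K   luminous/radiometric = {l / r:.3f}")

# The ratio varies with temperature, which is why a y-bar-weighted
# (luminous) flux scatters around a purely radiometric one.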