Are hotter stars always brighter?

General physics and astronomy discussions not directly related to Celestia
Topic author
makc
Posts: 18
Joined: 09.01.2007
With us: 17 years 11 months

Are hotter stars always brighter?

Post #1 by makc » 30.01.2007, 15:02

I posted this in the Celestia development forums, but I wonder if, by chance, it is my own misconception, so I am asking it here.

Is it true that absolute magnitude should be proportional to star temperature; that is, are hotter stars always brighter?

ajtribick
Developer
Posts: 1855
Joined: 11.08.2003
With us: 21 years 4 months

Post #2 by ajtribick » 30.01.2007, 15:15

makc wrote:Is it true that absolute magnitude should be proportional to star temperature; that is, are hotter stars always brighter?


These are two different questions. The first part (about proportionality) is asking whether absolute magnitude M is related to temperature T according to a relationship M = kT where k is a constant.

The second part is asking whether there exists a relationship between M and T such that an increase in T results in a decrease in M (since magnitude decreases with increasing brightness). While a proportional relationship with negative k would satisfy this, so would many other relationships which are not proportional, e.g. an exponential relationship M = exp(k*T) with negative k. Monotonic and proportional mean different things!

Since you are referring to absolute magnitude, I'll assume that by "brighter" you mean "having a greater total luminosity". Consider a red dwarf, a red giant and the Sun. The Sun is hotter and more luminous than the red dwarf, but the red giant is cooler and more luminous than the Sun. Temperature is not the only parameter affecting luminosity; the star's size is also relevant!

Treating a star as a spherical blackbody with radius r and uniform temperature T, the luminosity L is given by

L = 4*pi*r^2 * sigma*T^4

where sigma is the Stefan-Boltzmann constant.

So hotter stars are not always brighter.
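
To make this concrete, here is a minimal Python sketch of that formula (the radii and temperatures below are rough illustrative values, not measurements of any particular star):

Code: Select all

import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
R_SUN = 6.957e8          # solar radius, m
L_SUN = 3.828e26         # solar luminosity, W

def luminosity(radius_m, temp_k):
    # Spherical blackbody: L = 4*pi*r^2 * sigma*T^4
    return 4.0 * math.pi * radius_m**2 * SIGMA * temp_k**4

# Rough illustrative parameters: (radius in solar radii, temperature in K)
stars = {
    "red dwarf": (0.3, 3300.0),
    "Sun":       (1.0, 5772.0),
    "red giant": (50.0, 4000.0),
}

for name, (r, t) in stars.items():
    ratio = luminosity(r * R_SUN, t) / L_SUN
    print(f"{name:10s} T = {t:5.0f} K   L = {ratio:8.3f} L_sun")

With these numbers the red giant, though cooler than the Sun, comes out hundreds of times more luminous, while the hotter Sun easily outshines the red dwarf.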

Topic author
makc
Posts: 18
Joined: 09.01.2007
With us: 17 years 11 months

Post #3 by makc » 30.01.2007, 17:34

Okay, thanks, that seems to make sense.

P.S. I changed my plot a bit, so now it looks more like it should.
Last edited by makc on 31.01.2007, 09:54, edited 1 time in total.

Topic author
makc
Posts: 18
Joined: 09.01.2007
With us: 17 years 11 months

Post #4 by makc » 31.01.2007, 09:53

I have fixed my code, so now the results are exactly as expected.

[plot image]

Thank you again.

t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 8 months
Location: Hamburg, Germany

Post #5 by t00fri » 31.01.2007, 10:51

makc wrote:I have fixed my code, so now the results are exactly as expected.

[plot image]

Thank you again.


I hope you also took the bolometric corrections into account in your plot?

Absolute magnitudes M_V usually refer only to the visual band (V), since they are typically computed from apparent visual magnitudes m_V. For the total luminosity of a star one has to integrate over the full spectral range; that is what the bolometric correction to the visual magnitude effectively provides, as a function of the star's luminosity class.

These corrections are tabulated, for example, in Celestia's star.cpp source file and are correctly taken into account there.
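
As a rough sketch of what the correction does (in Python, with placeholder BC values rather than the actual table from star.cpp):

Code: Select all

M_BOL_SUN = 4.74  # absolute bolometric magnitude of the Sun (IAU 2015 zero point)

def bolometric_luminosity(m_v, bc):
    # M_bol = M_V + BC, then L/L_sun = 10^((M_bol_sun - M_bol) / 2.5)
    m_bol = m_v + bc
    return 10.0 ** ((M_BOL_SUN - m_bol) / 2.5)

# Illustrative values only: real BCs depend on spectral type and
# luminosity class (this is NOT the table from Celestia's star.cpp).
print(bolometric_luminosity(4.83, -0.08))  # roughly the Sun: ~1 L_sun
print(bolometric_luminosity(9.00, -2.00))  # a cool dwarf: the BC raises L well above the V-band estimate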

Bye Fridger

Topic author
makc
Posts: 18
Joined: 09.01.2007
With us: 17 years 11 months

Post #6 by makc » 08.02.2007, 11:00

The thing is that the flux I was calculating is luminous, not radiometric; that is, not just surface area times the integral of the spectrum, but the integral of the spectrum multiplied by the CIE 1931 y color-matching function. I believe this causes the residual scatter. However, the "degree of linearity" is good enough for my purposes.
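
For illustration, here is a minimal Python sketch of the difference, using a crude Gaussian stand-in for the CIE y-bar curve rather than the real tabulated function: the luminous-to-radiometric ratio of a Planck spectrum varies with temperature, which is consistent with the residual scatter described above.

Code: Select all

import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    # Spectral radiance of a blackbody, W sr^-1 m^-3
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0)

def y_bar(wavelength_nm):
    # Crude Gaussian stand-in for the CIE 1931 y-bar curve (peak at
    # 555 nm); NOT the tabulated function used in the actual plot
    return math.exp(-0.5 * ((wavelength_nm - 555.0) / 45.0) ** 2)

def fluxes(temp_k, lo_nm=360.0, hi_nm=830.0, steps=470):
    # Trapezoid-rule integrals of the spectrum over the visible band:
    # unweighted (radiometric) and weighted by y_bar (luminous)
    dl = (hi_nm - lo_nm) / steps
    radiometric = luminous = 0.0
    for i in range(steps + 1):
        wl_nm = lo_nm + i * dl
        w = 0.5 if i in (0, steps) else 1.0
        b = planck(wl_nm * 1e-9, temp_k)
        radiometric += w * b * dl
        luminous += w * b * y_bar(wl_nm) * dl
    return radiometric, luminous

for t in (3300.0, 5772.0, 9000.0):
    rad, lum = fluxes(t)
    print(f"T = {t:6.0f} K  luminous/radiometric ratio = {lum / rad:.3f}")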

