It is my understanding that the absolute magnitude M and the (total) flux F emitted by a star are related by M = -2.5 log F + const.
That is, if you take the temperature T, compute the black-body spectrum and integrate it across visible wavelengths, and plot the resulting values against M, you should see some sort of noisy line.
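For reference, here is a minimal sketch of the computation I describe, assuming a visible band of roughly 380–750 nm and simple trapezoidal integration (the band limits and grid size are my own choices, not anything standard):

```python
import numpy as np

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m / s
k = 1.380649e-23     # Boltzmann constant, J / K

def planck(wavelength, T):
    """Planck spectral radiance B(lambda, T) in W / (m^3 sr)."""
    return 2.0 * h * c**2 / wavelength**5 / np.expm1(h * c / (wavelength * k * T))

def visible_flux(T, lam_min=380e-9, lam_max=750e-9, n=2000):
    """Black-body radiance integrated over the visible band (trapezoid rule)."""
    lam = np.linspace(lam_min, lam_max, n)
    y = planck(lam, T)
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(lam))

def pseudo_magnitude(T):
    """-2.5 log10 of the visible-band flux; the additive constant is omitted."""
    return -2.5 * np.log10(visible_flux(T))
```

A hotter star should then come out brighter in this band, i.e. `pseudo_magnitude(10000.0)` should be smaller than `pseudo_magnitude(5000.0)`.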
However, when I actually do that for 2000 random stars, I see this:
Can someone please explain it?