Is it true that absolute magnitude should be proportional to star temperature; that is, are hotter stars always brighter?
These are two different questions. The first part (about proportionality) asks whether absolute magnitude M is related to temperature T by a relationship M = k*T, where k is a constant.
The second part asks whether there is a relationship between M and T such that an increase in T results in a decrease in M (since magnitude decreases with increasing brightness). A proportional relationship with negative k would satisfy this, but so would many other relationships that are not proportional, e.g. an exponential relationship M = exp(k*T) with negative k. Monotonic and proportional mean different things!
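To make the distinction concrete, here is a minimal numerical sketch (the constant k = -0.001 is purely an illustrative assumption, not a physical value) showing that M = exp(k*T) with negative k decreases as T increases, yet the ratio M/T is not constant, so the relationship is monotonic but not proportional:

```python
import math

k = -0.001  # illustrative negative constant, not a physical value

# Evaluate M = exp(k*T) at a few temperatures
for T in (3000, 6000, 9000):
    M = math.exp(k * T)
    print(f"T = {T}: M = {M:.4f}, M/T = {M / T:.2e}")

# M falls as T rises (a monotonic decrease), but M/T changes,
# so M is not proportional to T.
```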
Since you are referring to absolute magnitude, I'll assume that by "brighter" you mean "having a greater total luminosity". Consider a red dwarf, a red giant and the Sun. The Sun is hotter and more luminous than the red dwarf, but the red giant is cooler and more luminous than the Sun. Temperature is not the only parameter affecting luminosity: the size of the star is also relevant!
Treating a star as a spherical blackbody with radius r and uniform temperature T, the luminosity L is given by
L = 4*pi*r^2 * sigma*T^4
where sigma is the Stefan-Boltzmann constant.
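As a rough numerical sketch of this (the radii and temperatures below are illustrative assumptions chosen to represent the three stars above, not measured values), we can compare a red dwarf, the Sun, and a red giant with the blackbody formula:

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
R_SUN = 6.957e8         # solar radius in metres

def luminosity(radius_m, temperature_k):
    """Blackbody luminosity L = 4*pi*r^2 * sigma*T^4."""
    return 4 * math.pi * radius_m**2 * SIGMA * temperature_k**4

# Illustrative (approximate) radii and effective temperatures
stars = {
    "red dwarf (0.2 R_sun, 3200 K)": (0.2 * R_SUN, 3200),
    "Sun (1 R_sun, 5772 K)": (1.0 * R_SUN, 5772),
    "red giant (50 R_sun, 4000 K)": (50.0 * R_SUN, 4000),
}

for name, (r, T) in stars.items():
    print(f"{name}: L = {luminosity(r, T):.2e} W")

# The red giant comes out cooler than the Sun but far more luminous,
# because luminosity scales with r^2 as well as T^4.
```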
So hotter stars are not always brighter.