Paolo wrote:
Please clarify your idea; otherwise the discussion will continue to
follow imaginative but inconsistent paths.
Do you mean that we are able to collect the same image of
e.g. a nebula (same orientation and resolution) acquired at different
wavelengths? Then we would be able to create by hand, using e.g.
the GIMP, a set of perfectly superimposable alternative
textures for the same nebula.
If so, this would be really easy to implement.
....
Paolo & others,
I doubt that the real challenges are that simple. Of course we
could keep ourselves busy with all sorts of "10-working-hour
bricolages". What has kept me thinking for quite a
while, however, are possible approaches that realize a
maximum of the exciting possibilities of the
multi-wavelength (filter) concept in Celestia.
Since I am basically convinced that not much will really
happen on that front before Chris is hopefully back (if at all),
there is no particular reason to hurry.
So I am taking my time and exploring the different possible
avenues one might take.
Yes indeed: as to deep-sky objects, there is now a steady
influx of new high-resolution IR (Spitzer, 2MASS, ...) and UV (GALEX, ...)
imaging, besides the visual-light images we got e.g. from
Hubble. Clearly, the respective objects may be rescaled,
mapped, aligned and superimposed in various ways by
"texture people".
If we want to achieve outstanding, "revolutionary" display
solutions in this field, we first of all need a convincing
logistical concept:
1) What do we intend to achieve through multi-wavelength
displays? Who are the people we want to address with such
new features?
2) Which objects do we select and which do we skip? It makes little
sense to just start somewhere at random.
3) How are we going to exploit the available multi-wavelength
imaging information systematically and with a
uniform color-mapping concept? How do we weigh
"mere beauty of display" against "scientific information &
content"?
...
I have followed up on quite a number of concrete ideas for possible
realizations. Once they are more mature and satisfying, at least
to myself, I shall certainly report on them. I have already
spent quite a lot of time exploring various possibilities.
A really important issue in my view is a well-chosen
color-mapping scheme that is both
maximally informative and most intuitive to grasp (and,
not to forget, beautiful). [It is of secondary
importance to me at this point whether we want to allow
users to modify it or not.]
So I have done a number of experiments here as well.
Let me illustrate something that I kind of fancy right now.
++++++++++++++++++++++++++++
Background info:
Human eyes are "measuring instruments" for light, with
different kinds of photopigments used to sense three
color bands (B = blue, G = green and R = red (actually yellow)),
according to known spectral sensitivity profiles. The
human brain then composes these different
filtered signals into what we perceive as "visual color".
Altogether, we are able to "see" a wavelength window (0.4 -
0.7 micron), centered around green (~0.55 micron), as illustrated in this figure.
"Natural" Color Mapping:
The IR Spitzer Space Telescope provides
various filtered images in a shifted
wavelength window, centered e.g. in the (deep) infrared
around 8 microns! Typically, three IR reference wavelengths are used:
1) 3.5 microns
2) 8.0 microns
3) 24.0 microns
[More frequently, however, they use 3-4 "colors" between 3.5 and 8 microns.]
Suppose we interpret these "short", "medium" and "long" IR
shots as "blue", "green" and "red" input to our modified "eye",
now thought to be sensitive in a shifted window centered at
~8 microns (instead of 0.55 microns). Our brain will then
again provide an intuitive "visual" color
composition/translation of the three IR filter images. For
the beautiful example of the M81 spiral galaxy, this would
look like this:
[Image sequence for M81:
0.5 microns <-> RGB visual;
shifting to 3.5 microns -> "blue";
8.0 microns -> "green";
24 microns -> "red";
and the resulting RGB "brain translation" (GIMP)]
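The channel assignment above can be sketched in a few lines of code. This is purely my own illustration (not Celestia code); it assumes the three IR shots are already aligned, equally sized and normalized to [0, 1], and the tiny constant "bands" only stand in for real Spitzer data:

```python
import numpy as np

def ir_to_rgb(b_short, g_mid, r_long):
    """Stack three grayscale (H, W) float arrays in [0, 1] into an
    (H, W, 3) uint8 false-color image: the longest IR wavelength is
    shown as red, the medium one as green, the shortest as blue."""
    rgb = np.stack([r_long, g_mid, b_short], axis=-1)
    return (np.clip(rgb, 0.0, 1.0) * 255).astype(np.uint8)

# Tiny synthetic example standing in for the 3.5/8/24 micron shots:
h, w = 2, 2
band_3p5 = np.full((h, w), 0.2)   # -> blue channel
band_8p0 = np.full((h, w), 0.5)   # -> green channel
band_24  = np.full((h, w), 0.9)   # -> red channel
composite = ir_to_rgb(band_3p5, band_8p0, band_24)
print(composite[0, 0])  # -> [229 127  51]
```

The resulting array could then be saved as a texture, or the same mapping reproduced by hand in the GIMP by loading the three shots as R/G/B channels.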
AHA! not bad...
Next comes a mosaic of how we could nicely interpolate with a
slider in Celestia. Based on this discrete imaging input,
the five images (top left to bottom) display what we would
"see" of M81 when shifting the center sensitivity of our
"virtual eye" from the actual 0.55 microns
to 1.86 microns, 3.7 microns, 5.6 microns and finally to the
Spitzer central 8-micron IR wavelength!
The IR spectral ranges should of course be properly
normalized (which I skipped for simplicity).
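The slider idea could be approximated by a simple linear cross-fade between the visual composite and the IR false-color composite. Again a hypothetical sketch of my own, assuming both composites are already properly normalized 8-bit RGB arrays of the same size:

```python
import numpy as np

def blend(visual_rgb, ir_rgb, t):
    """Linearly cross-fade between the visual composite (t = 0)
    and the IR false-color composite (t = 1), as a slider in
    Celestia might do when sweeping the 'virtual eye' center
    sensitivity from 0.55 microns toward 8 microns."""
    v = visual_rgb.astype(np.float64)
    i = ir_rgb.astype(np.float64)
    return ((1.0 - t) * v + t * i).astype(np.uint8)

# Five slider stops, mimicking the five panels of the mosaic:
visual = np.zeros((2, 2, 3), dtype=np.uint8)        # dark placeholder
ir = np.full((2, 2, 3), 200, dtype=np.uint8)        # bright placeholder
frames = [blend(visual, ir, t) for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
print(frames[2][0, 0])  # halfway point -> [100 100 100]
```

A smarter scheme would interpolate per wavelength band before composing the RGB image, but the cross-fade already conveys the visual effect of the slider.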
Bye Fridger