Missed Chances without Wavelength Filters in Celestia

The place to discuss creating, porting and modifying Celestia's source code.
Topic author
t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 6 months
Location: Hamburg, Germany

Missed Chances without Wavelength Filters in Celestia

Post #1 by t00fri » 28.04.2005, 20:25

Hi,

here are some recent, striking examples of what we are missing by not embarking on the wavelength filter concept for Celestia:

The Galaxy Evolution Explorer (GALEX), launched on a Pegasus rocket on April 28, 2003, has been observing the sky with detectors sensitive to ultraviolet light. At the other end of the spectrum, we have the 2MASS infrared survey. Let's compare these various views with the corresponding visual-light images.

Below, I display two instructive examples from GALEX (and 2MASS IR), illustrating how strikingly different galaxies look in two to three different wavelength bands: UV, visual and IR.


Image
Image
and a few more GALEX images in the UV:
Image

It's a shame to miss that opportunity...


Bye Fridger
Last edited by t00fri on 28.04.2005, 23:16, edited 2 times in total.

ElChristou
Developer
Posts: 3776
Joined: 04.02.2005
With us: 19 years 7 months

Post #2 by ElChristou » 28.04.2005, 20:36

Sooner or later wavelength filters will be implemented... it's inevitable...

Just for info: would a body be described with one image per wavelength, or is there a way to make a single image with enough data to be used with a real filter system?
Image

essee
Posts: 24
Joined: 30.12.2004
With us: 19 years 9 months
Location: CO, USA

Post #3 by essee » 29.04.2005, 06:50

Thumbs up
from an American hitching a ride through the ever more "realistic" :wink: Celestia. The threads being what they've been, I thought y'all could use some encouraging words, plus the following aside... I know the observer changes the observed, but from this perspective (my eyes only?) Celestia seems constantly changing, evolving, reaching outward...

sc

maxim
Posts: 1036
Joined: 13.11.2003
With us: 20 years 10 months
Location: Nürnberg, Germany

Post #4 by maxim » 01.05.2005, 12:12

IMHO there are several specifications to be made before we can go about implementing the wavelength filter feature. Besides the will to do so, we haven't worked out anything formal. When thinking about the topic, some questions come to my mind, and I think we shouldn't go on without having fully answered them.

Question 1: How should the filter graininess be defined?
Well, we could say that's simple, and specify three bands: IR, VIS and UV. Or we could define bands every 500 nm, say - in fact the decision will probably depend on the available source material, or rather on the typical instrumental waveband channels used in scientific astronomy.
Conclusion: A summary of standard instruments, usual sensor specifications and the derived source material would help. Problems would arise if there were no standardisation in this area, and every instrument/sensor were a unique design, not comparable to any other instrument/sensor (and if the same were true of the derived data).

Question 2: How should the visual data be stored?
There have been proposals to define (or use) a multichannel texture format. That would be very compact, but would incur a significant performance penalty at runtime. Other proposals use one separate texture per waveband - but how should that be organized? Fixed folders? File-naming conventions like 'limit-of-knowledge'?
Conclusion: The performance penalty seems to be a strong issue. The organisation of separate textures has to be defined by someone. The definition variants seem equally good - one simply has to be fixed.

Question 3: How should the visual data be processed/displayed?
A lot of the existing material, currently regarded as visual, is already a mixture of VIS/IR or VIS/UV that was preprocessed for general publication. Should there be attempts to separate these data again? Or should we simply classify them as non-separable multiband representations? If there are different textures for different wavebands, expanding the 'alt-surface' definition may be a good approach.
Conclusion: Expanding 'alt-surface' or inventing 'alt-waveband' may be a first step. That would make it urgently necessary to be able to define 'alt-surfaces' for stars, galaxies and nebulae as well. Additionally, the new 'alt-surface' should handle alternate atmosphere definitions - or a generalization of them - a hull (spherical hulls?) of gas or thin matter, common to planets, stars and galaxies.
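
To make the 'alt-waveband' idea a bit more concrete, here is a minimal sketch of what such records might look like on the engine side - all names are invented for illustration; nothing like this exists in Celestia's code yet:

Code: Select all

// Hypothetical sketch only - Celestia has no AltWaveband type; all names invented.
#include <string>
#include <vector>

struct AltWaveband
{
    std::string name;            // e.g. "GALEX NUV"
    double      minWavelengthNm; // lower band edge, nanometres
    double      maxWavelengthNm; // upper band edge, nanometres
    std::string texturePath;     // texture to substitute while this band is active
};

struct ObjectBands
{
    std::string defaultTexture;         // the usual visual-light texture
    std::vector<AltWaveband> wavebands; // zero or more per-band alternatives

    // Texture to draw for a requested wavelength; falls back to visual.
    const std::string& textureFor(double wavelengthNm) const
    {
        for (const AltWaveband& band : wavebands)
            if (wavelengthNm >= band.minWavelengthNm &&
                wavelengthNm <= band.maxWavelengthNm)
                return band.texturePath;
        return defaultTexture;
    }
};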

Question 4: How should multiband intermix be handled on the screen?
How are multiple objects on the screen handled if some of them are defined only in a certain waveband or only in visible light? Display only objects of the selected waveband? Display the selected waveband plus everything visible? Display different wavebands only for the currently selected object? - It may be easy to do that via a special textures menu.
Adjacent question: Is there an easy way to extend the creation rules for default objects (stars, galaxies) to allow a common, generalized method of display for different wavebands?

Question 5: What functionality should the filter UI present to the user?
Switching between bands, of course. Merging bands as well? Is there a clean way to compute merged bands quickly? Are there any standard algorithms already defined/used in astronomy? Seamlessly sliding along a continuous band? - that seems far too complicated for a first pass.
Conclusion: Merging bands may be a goal if it's simple and cheap to compute. Otherwise skip it until a future release.

Question 6: Could it be a first goal to just expand the default texture classes by waveband definitions - similar to 'limit-of-knowledge'?
A very first step like that could force display material to be collected, processed and finally displayed in Celestia - and thus force the next development steps.


If all these questions can be REALLY answered, not just speculated or dreamed about, we will be ready for our next step towards wavelength filters.

maxim

Slalomsk8er
Posts: 128
Joined: 26.07.2004
With us: 20 years 2 months
Location: Earth 7.593358long / 47.582393lat

Post #5 by Slalomsk8er » 01.05.2005, 17:12

http://celestiaproject.net/forum/viewtopic.php ... highlight=

1. Logarithmic, from 1 pm to 10 Mm (pm, nm, um, mm, m, km, Mm), with 3 digits after the point (e.g. 333.333 nm).

2. One- and three-band images, raw data, color tables and sensor data. A one-band image is 8-bit grayscale (with alpha: 16-bit WA or 32-bit RGBA) and a three-band image is 24-bit RGB (with alpha: 32-bit RGBA). "Raw" means the source bands: 16-bit grayscale and the other formats Celestia cannot show at the moment. Color tables define the mix of bands that forms the actual texture (RGB or RGBA); more than one mix per file is possible. Sensor data contains the data of the sensors for a one-band or three-band image, plus the mix information of the three-band image.

Code: Select all

Tree structure:
[extra]
  [object]
    [texture] | [mesh] | xxx.ssc
      [333.333_nm-433.333_nm] | [740_nm-380_nm] | colortables
        texture | raw | sensor  ||  texture | raw1 | raw2 | raw3 | rawX | sensors

Legend:
[folder]
file
|  separator
|| next folder


3. Mixed into the actual texture as defined in the color tables (RGBA); the results could be stored in the object's texture folder (if the user has this option on). This would result in one texture per color-table mix. The textures generated from the color tables show up in the "alt-waveband" menu. The sensor info can be displayed as text in the "Info Text".

4. As defined via the RMB menu: all objects in visual bands plus the selection in the active band; only objects with a texture in the active band; all objects in their "normal" bands if not in the active band; best match of the active band.

5. Alpha blending?

6. I am not sure what you mean, but I think it is better to implement it fully and not as a preview.


What are you guys thinking?
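
As a proof of concept for the folder-naming convention above, here is a small sketch that parses a band-folder name such as "333.333_nm-433.333_nm" into wavelengths in metres. This is illustrative only; no such parser exists in Celestia:

Code: Select all

// Illustrative only: parse the proposed band-folder names.
#include <cstdio>
#include <map>
#include <sstream>
#include <string>

// Scale factors to metres for the seven proposed unit suffixes.
static const std::map<std::string, double> unitToMetres = {
    {"pm", 1e-12}, {"nm", 1e-9}, {"um", 1e-6}, {"mm", 1e-3},
    {"m", 1.0},    {"km", 1e3},  {"Mm", 1e6}};

// Parse one "<value>_<unit>" token into a wavelength in metres.
bool parseWavelength(const std::string& token, double& metres)
{
    const std::string::size_type sep = token.find('_');
    if (sep == std::string::npos)
        return false;
    std::istringstream value(token.substr(0, sep));
    double v = 0.0;
    if (!(value >> v))
        return false;
    const auto unit = unitToMetres.find(token.substr(sep + 1));
    if (unit == unitToMetres.end())
        return false;
    metres = v * unit->second;
    return true;
}

int main()
{
    const std::string folder = "333.333_nm-433.333_nm";
    const std::string::size_type dash = folder.find('-');
    double lo = 0.0, hi = 0.0;
    if (dash != std::string::npos &&
        parseWavelength(folder.substr(0, dash), lo) &&
        parseWavelength(folder.substr(dash + 1), hi))
        std::printf("band: %g m to %g m\n", lo, hi);
    return 0;
}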
ASUS A7N8X Deluxe
AMD Athlon XP 2800+ (2.08GHz)
1GB DDR RAM 333MHz
NVIDIA GeForce FX 5600 AGP 8X

maxim
Posts: 1036
Joined: 13.11.2003
With us: 20 years 10 months
Location: Nürnberg, Germany

Post #6 by maxim » 02.05.2005, 21:19

1. Logarithmic, from 1 pm to 10 Mm (pm, nm, um, mm, m, km, Mm), with 3 digits after the point (e.g. 333.333 nm).
A new texture every 0.001 nm? You don't propose to store about 400,000 textures for the visual spectrum alone, do you?
What I meant was the common widths of typically used bands, like:

Code: Select all

Far IR (FIR)  ~15um - 1000um
LWIR          ~8um - 15um
MWIR          ~3um - 8um
SWIR          ~1400nm - 3000nm
Near IR (NIR) ~700nm - 1400nm
Visual (VIS)  ~380nm - 750nm
UVA           ~315nm - 380nm
UVB           ~280nm - 315nm
UVC           ~100nm - 280nm
VUV           ~50nm - 190nm
EUV           ~12nm - 20nm
Soft XR       ~2400pm - 4400pm
I'm not sure which parts or typical bands of these are used in astronomy.

5. Alpha blending?
Hm, I'm not sure multiband pictures are made in such a simple way.

6. I am not sure what you mean, but I think it is better to implement it fully and not as a preview.
This is not a commercial project, where you can just acquire a team and set a deadline of three months. As we are already experiencing, in an open-source project you have to start slowly, or nothing will happen at all (that was my meaning).

maxim

Slalomsk8er
Posts: 128
Joined: 26.07.2004
With us: 20 years 2 months
Location: Earth 7.593358long / 47.582393lat

Post #7 by Slalomsk8er » 02.05.2005, 23:35

maxim wrote:
1. Logarithmic, from 1 pm to 10 Mm (pm, nm, um, mm, m, km, Mm), with 3 digits after the point (e.g. 333.333 nm).
A new texture every 0.001 nm? You don't propose to store about 400,000 textures for the visual spectrum alone, do you?
Nope, this says that we would have a grain size of 1/1000 at the current scale (pm - Mm). That makes it possible to define a band of at most 000.000_pm-999.999_Mm (all known wavelengths), or just half of that (000.000_pm-500.000_Mm), or the smallest single band (000.001_pm), or the biggest single band (999.999_Mm), or ...
If I had them and needed to work with them, why not? But I was thinking of this more as a naming convention for the band's folder ;)
What I meant was the common widths of typically used bands, like:

Code: Select all

Far IR (FIR)  ~15um - 1000um
LWIR          ~8um - 15um
MWIR          ~3um - 8um
SWIR          ~1400nm - 3000nm
Near IR (NIR) ~700nm - 1400nm
Visual (VIS)  ~380nm - 750nm
UVA           ~315nm - 380nm
UVB           ~280nm - 315nm
UVC           ~100nm - 280nm
VUV           ~50nm - 190nm
EUV           ~12nm - 20nm
Soft XR       ~2400pm - 4400pm
I'm not sure which parts or typical bands of these are used in astronomy.
I think this will be defined by the names of the band folders. Every band that has a folder in the extras folder will show up in the menu. This puts a lot more power in the user's hands: users can define as many wavebands as they like, even overlapping ones :twisted:
5. Alpha blending?
Hm, I'm not sure multiband pictures are made in such a simple way.
Not that simple at all ;) A visual-band (RGB) image is normally made from 3 grayscale bands. You need the info from every sensor plus an understanding of grayscale/RGB normalisation. We have the sensor information in the sensor file and the band information in the texture, and we could add a blending table for the actual blending function to Celestia itself (if needed) - see the sketch at the end of this post.
6. I am not sure what you mean, but I think it is better to implement it fully and not as a preview.
This is not a commercial project, where you can just acquire a team and set a deadline of three months. As we are already experiencing, in an open-source project you have to start slowly, or nothing will happen at all (that was my meaning).

maxim

OK, but let us implement stuff from pre-release to pre-release and not from release to release 8O
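
Here is a rough sketch of what the color-table mixing could amount to: per output channel, one weight per source band. The ColorTable layout below is an assumption for illustration, not any defined Celestia format:

Code: Select all

// Sketch: combine three grayscale band rasters into one RGB texture using
// per-channel weights. The ColorTable layout is invented for illustration.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct ColorTable
{
    double weight[3][3]; // weight[channel][band], channels in R,G,B order
};

std::vector<std::uint8_t> mixBands(const std::vector<std::uint8_t> bands[3],
                                   const ColorTable& table)
{
    std::vector<std::uint8_t> rgb(bands[0].size() * 3);
    for (std::size_t i = 0; i < bands[0].size(); ++i)
        for (int c = 0; c < 3; ++c)
        {
            double v = 0.0;
            for (int b = 0; b < 3; ++b)
                v += table.weight[c][b] * bands[b][i];
            rgb[i * 3 + c] =
                static_cast<std::uint8_t>(std::min(255.0, std::max(0.0, v)));
        }
    return rgb;
}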
ASUS A7N8X Deluxe

AMD Athlon XP 2800+ (2.08GHz)

1GB DDR RAM 333MHz

NVIDIA GeForce FX 5600 AGP 8X

Gerbil94
Posts: 12
Joined: 26.04.2005
With us: 19 years 5 months
Location: Manchester, UK

Post #8 by Gerbil94 » 03.05.2005, 09:14

maxim wrote:IMHO there are several specifications to be made before we can go about implementing the wavelength filter feature. Besides the will to do so, we haven't worked out anything formal. When thinking about the topic, some questions come to my mind, and I think we shouldn't go on without having fully answered them.

Absolutely.

Question 1: How should the filter graininess be defined?

I don't like the "every 500nm" possibility, or any system based purely on centre wavelength (although I'm not suggesting you should care :) ). First problem: there are many bands that overlap but have different shapes, especially in the optical. You might have a set of optical data taken with a line filter, and a second set taken with a broadband filter. If both filters are centred on the same wavelength, a system that uses only the centre wavelength might conclude that they are identical, when in fact they could be providing quite different information (depending on the shape of the source's spectrum).

Linear steps are not very useful. Your first band would be at 0nm(!), the second in the optical at 500nm, the third and many following in the infrared from 1000nm; eventually you would reach radio, but there would be many unused steps and the gap between different radio bands would be very great. There would also be no X-ray or gamma-ray support. I don't think regular steps are very useful in general; a very dense step size would lead to one filter covering multiple steps, while too sparse a step size would shift filters about in wavelength. Most filters don't have centre wavelengths separated by nice round numbers.

I think a pure wavelength system should at least take bandwidth into account, or alternatively each filter should have a unique code (that could be generated based on wavelength information). The code system would let us distinguish line filters from ramp filters and other filters with odd shapes.
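
A minimal sketch of that point: comparing filters by centre wavelength and width keeps a narrow line filter and a broadband filter apart even when their centres coincide. The field names and tolerance are illustrative choices, not an existing scheme:

Code: Select all

// Sketch: identify filters by centre wavelength AND width, so a narrow line
// filter and a broadband filter with the same centre stay distinct.
#include <cmath>
#include <string>

struct Filter
{
    std::string code; // unique identifier, possibly generated from the data below
    double centreNm;  // central wavelength, nanometres
    double widthNm;   // full width at half maximum, nanometres
};

bool sameFilter(const Filter& a, const Filter& b)
{
    const double tol = 0.5; // nm; an arbitrary illustrative tolerance
    return std::fabs(a.centreNm - b.centreNm) < tol &&
           std::fabs(a.widthNm  - b.widthNm)  < tol;
}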

Conclusion: A summary of standard instruments, usual sensor specifications and the derived source material would help. Problems would arise if there were no standardisation in this area, and every instrument/sensor were a unique design, not comparable to any other instrument/sensor (and if the same were true of the derived data).

Although different sensors may work in quite different ways, I think there are going to be two outputs that people are interested in. First, catalogues of brightness (for stars). Stars tend to look like points no matter the instrument; the only thing that changes is brightness/magnitude (more below). The other is images. Data from even radio telescopes and gamma-ray telescopes can be processed into an image.

Question 3: How should the visual data be processed/displayed?

If each observing setup got its own code -- instead of classification based purely on wavelength -- then multi-wavelength data would get a unique code and there would be no problem. I don't think it would be practical to separate combined data; certainly it wouldn't be desirable.

Conclusion: Expanding 'alt-surface' or inventing 'alt-waveband' may be a first step.

It seems to me that starting with galaxies and deep-sky objects may be less useful than starting with stars. Since all but a few stars are normally unresolved (point-like), the main effect is a simple change in brightness/magnitude. Isn't that a good deal easier to incorporate than special textures, just to start with? You could change waveband and see how different stars become prominent from red to blue. This would be an optical, infrared and UV thing only (stars aren't very bright outside these bands), so the amount of data to worry about would be restricted and simplified. It would allow the waveband UI to be prototyped. The most labour-intensive step that I can see, as far as data is concerned, would be cross-correlating stars in infrared catalogues (say) with Hipparcos stars.
Is this approach sensible?

Question 4: How should multiband intermix be handled on the screen?

I think it would be sensible to start by displaying only objects of the selected band.

Question 5: What functionality should the filter UI present to the user?
Switching between bands, of course. Merging bands as well? Is there a clean way to compute merged bands quickly? Are there any standard algorithms already defined/used in astronomy? Seamlessly sliding along a continuous band? - that seems far too complicated for a first pass.

You can interpolate brightness between known bands if you are brave (and if you have some knowledge of the source spectrum). But you can't interpolate morphology, and without the source spectrum and instrument responses it's no better than guesswork.

Conclusion: Merging bands may be a goal if it's simple and cheap to compute. Otherwise skip it until a future release.


What is the merged band supposed to represent? What do you want it to tell you about objects? Answering that will identify the best way to merge them.

Sirius
Posts: 31
Joined: 03.04.2002
With us: 22 years 5 months
Location: Germany

Post #9 by Sirius » 03.05.2005, 17:37

Starting with Stars:

Stars are relatively easy to model spectrally, as they can be approximated pretty well as "black-body radiators", i.e. the intensity of the emitted radiation per wavelength is an analytical expression depending only on the temperature. As Celestia (and all astronomers) at the moment do just this (getting colour information from the temperature or, in the astronomers' case, vice versa), it should be easy to "move" another part of the spectrum into visible light, so we can display it on the monitor.

This is essentially a function lambda = a*(lambda' + x) (like that)

Continuing with planets & deep-space objects:

_Very_ difficult to implement in a seamless manner.
"alternate textures" would allow only a very limited number of _fixed_ transformations (lambda = ...) ; and to specify one filter or one transformation (you realize that you see "false colour" pictures whenever someone shows you an IR picture?) would limit the displayable textures to that one! A "blending" would be IMHO impossible from the ready-available _pictures_ (lack of information).

What would be possible: filters for all "calculable", mostly unstructured objects, i.e. stars, a bit of the Galaxy, dust clouds (once they are implemented in visible light :-) - i.e. you mostly couldn't use textures but would have to use background information. Looking at something as complicated as Earth would require lots of information about images at different wavelengths, at best ones taken with monochromatic filters (those could be mergeable).

Stars could be a good starting point; I think I have seen an encapsulation of this (colour finding) in render.cpp, and for anyone interested I suggest looking at Planck's law of radiation (or Planck's radiation formula), which describes black-body radiation.
<physics>
WARNING: if you translate a spectrum, remember that the intensity is in fact a probability P(lambda) d(lambda), i.e. if you apply a transformation lambda -> lambda' you have to replace d(lambda) -> d(lambda'). This is only important if you want to scale or otherwise skew the spectrum.
</physics>
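
For anyone who wants to play with this, here is a small self-contained sketch (not Celestia code) that integrates Planck's law over a wavelength band, comparing how much brighter a hot star is than a cool one in a UV band versus a near-IR band:

Code: Select all

// Sketch: integrate Planck's law over a band to compare black-body
// brightness of a hot and a cool star. SI units throughout.
#include <cmath>
#include <cstdio>

const double h = 6.62607015e-34; // Planck constant, J s
const double c = 2.99792458e8;   // speed of light, m/s
const double k = 1.380649e-23;   // Boltzmann constant, J/K

// Spectral radiance B(lambda, T), W sr^-1 m^-3
double planck(double lambda, double T)
{
    return (2.0 * h * c * c) / std::pow(lambda, 5.0) /
           (std::exp(h * c / (lambda * k * T)) - 1.0);
}

// Crude midpoint integration over [lo, hi] metres.
double bandRadiance(double lo, double hi, double T, int steps = 1000)
{
    const double dl = (hi - lo) / steps;
    double sum = 0.0;
    for (int i = 0; i < steps; ++i)
        sum += planck(lo + (i + 0.5) * dl, T) * dl;
    return sum;
}

int main()
{
    // 10000 K star vs. 3500 K star, in a UV band and in a near-IR band:
    const double uv  = bandRadiance(200e-9, 300e-9, 10000.0) /
                       bandRadiance(200e-9, 300e-9, 3500.0);
    const double nir = bandRadiance(1000e-9, 1400e-9, 10000.0) /
                       bandRadiance(1000e-9, 1400e-9, 3500.0);
    std::printf("hot/cool radiance ratio: UV %.3g, near-IR %.3g\n", uv, nir);
    return 0;
}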

Topic author
t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 6 months
Location: Hamburg, Germany

Post #10 by t00fri » 03.05.2005, 18:18

Sirius wrote:Starting with Stars:

Stars are relatively easy to model spectrally, as they can be approximated pretty well as "black-body radiators", i.e. the intensity of the emitted radiation per wavelength is an analytical expression depending only on the temperature. As Celestia (and all astronomers) at the moment do just this (getting colour information from the temperature or, in the astronomers' case, vice versa), it should be easy to "move" another part of the spectrum into visible light, so we can display it on the monitor.

This is essentially a function lambda = a*(lambda' + x) (like that)

Celestia has all this (and more) in its code. Just have a look first... We use black-body radiation whenever there are no direct sources for the star colors. In my opinion, /discrete/ stars are totally uninteresting in the present context.

Continuing with planets & deep-space objects:

_Very_ difficult to implement in a seamless manner. "Alternate textures" would allow only a very limited number of _fixed_ transformations (lambda = ...); and specifying one filter or one transformation (you realize that you are seeing "false colour" pictures whenever someone shows you an IR picture?) would limit the displayable textures to that one! "Blending" would IMHO be impossible from the readily available _pictures_ (lack of information).


Seamless is just asking too much for a start. Astronomers do not use "seamless" filters in reality either. In my view we should start with a discrete set that implements the most common wavelength bands used in astronomy/space exploration.

If we considered seamless blending, we would need a whole army of "texture creators" normalizing the available textures in the different bands to each other.

Certainly, I have actual imaging data in mind, not theoretical calculations of spectral properties.

As long as Chris does not reappear and get involved, or another 3D genius speaks up loudly ;-), I will just not tire myself any further with "dreaming" about this issue. We simply need 3D expertise to seriously discuss feasible renderings.

No doubt, there are many cases of great interest (I am really tired of repeating myself...), like

-- the SOHO Sun, which is continuously and simultaneously updated in various bands;

-- imaging of our galaxy and others from the UV (GALEX) via visible light to the IR (2MASS) (see e.g. the images above and in the other threads that I initiated) - hydrogen emission regions etc.;

-- Titan (visible, near IR, radar), Saturn's atmosphere... Earth (IR)... Venus (visible, radar).

My preference would be to start with a restricted (but impressive) implementation, yet with a framework that is sufficiently expandable.

The easiest implementation in practice would be just a switch that can be activated to toggle through the available image information for an object - not dissimilar to the 'I' key that toggles clouds on|off, or Ctrl+V, which leads us through the various render paths... I am sure Chris would have lots of good ideas on how to do such things really well...

Another possible scenario would be to utilize Celestia's split-window facility to display images of different wavelength bands next to each other.
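
To make the toggle idea concrete, here is a tiny sketch of a per-object band cycle that a key handler could call. The names are invented for illustration; this is not Celestia's actual key-handling code:

Code: Select all

// Sketch of the toggle idea - invented names, not Celestia's key handling.
#include <cstddef>
#include <string>
#include <vector>

struct BandSet
{
    std::vector<std::string> bandTextures; // index 0 = visual-light texture
    std::size_t active = 0;

    // Called from a key handler; wraps back to the visual texture.
    const std::string& cycle()
    {
        active = (active + 1) % bandTextures.size();
        return bandTextures[active];
    }
};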

Bye Fridger

Slalomsk8er
Posts: 128
Joined: 26.07.2004
With us: 20 years 2 months
Location: Earth 7.593358long / 47.582393lat
Contact:

Post #11 by Slalomsk8er » 04.05.2005, 00:41

I did not think about the stars!
That would be a great point to start, but I think we then need one more thing to do it right: the source spectrum of every star in the catalog. IMHO getting the source spectra of the stars is a task for the science team, and not that easy given the binary-star errors in the catalogs.

But textures will be more interesting for the general user than changes in the color and brightness of the stars. It is not just about images of galaxies and nebulae, but about images of our Sun (SOHO) and of planets like Venus (no ground visible) and Jupiter too.
ASUS A7N8X Deluxe

AMD Athlon XP 2800+ (2.08GHz)

1GB DDR RAM 333MHz

NVIDIA GeForce FX 5600 AGP 8X

Slalomsk8er
Posts: 128
Joined: 26.07.2004
With us: 20 years 2 months
Location: Earth 7.593358long / 47.582393lat

Post #12 by Slalomsk8er » 04.05.2005, 01:14

Seamless is just asking too much for a start. Astronomers do not use "seamless" filters in reality either. In my view we should start with a discrete set that implements the most common wavelength bands used in astronomy/space exploration.
I think there are no such things as common wavelength bands to start with. The users will fill the menu up really quickly with their own bands if we allow it. Do not limit the bands; just make it possible for them to emerge ;)

I am for a flexible system that shows in the menu whatever you put in the folders. This will be more or less easy. Next we need a way to change all objects that have a band inside the active one to their best-match band. Matching the stars would be the next logical step. And at the end we can think about some complex band mixing ;)
_Very_ difficult to implement in a seamless manner.
"Alternate textures" would allow only a very limited number of _fixed_ transformations (lambda = ...); and specifying one filter or one transformation (you realize that you are seeing "false colour" pictures whenever someone shows you an IR picture?) would limit the displayable textures to that one! "Blending" would IMHO be impossible from the readily available _pictures_ (lack of information).

Color tables, source spectra etc. can help us with merging. But I think for a really good image it needs to be done by hand and outside of Celestia; we just have to make it easy to define bands and to put them in the menu (automatic at best). Blending is possible in Celestia for grayscale (raw) bands via a color table, letting users define their own false-color mixes.
ASUS A7N8X Deluxe

AMD Athlon XP 2800+ (2.08GHz)

1GB DDR RAM 333MHz

NVIDIA GeForce FX 5600 AGP 8X

Sirius
Posts: 31
Joined: 03.04.2002
With us: 22 years 5 months
Location: Germany

Post #13 by Sirius » 04.05.2005, 11:24

@Slalomsk8er:

All stars emit a well-understood spectrum, known as black-body radiation:
http://en.wikipedia.org/wiki/Black_Body

This spectrum _only_ depends on the temperature, so the information present in Celestia - and the implementation, as t00fri said - is sufficient to produce images in _arbitrary_ parts of the spectrum.

The interesting thing - in which stars differ - is the absorption-line spectrum, but that does not influence the way they look, even with sharp filters. That is something completely different, which could one far-off day be implemented in Celestia...

@t00fri:

You are right: stars would not really be very interesting. Just my 5 cents...

No doubt, there are many cases of great interest (I am really
tired repeating myself...), like


It's like that in a forum: old threads die, and no one spends any time looking for them or reading them (me, for one :-( ).
I really think a development Wiki, where things like this can be set up and discussed (MediaWiki discussion pages come to mind), would be much better than the present situation with only a forum.

maxim
Posts: 1036
Joined: 13.11.2003
With us: 20 years 10 months
Location: Nürnberg, Germany

Post #14 by maxim » 04.05.2005, 13:58

So, what I've learned so far is that image data is captured only in greyscale, and that we need color lookup tables to deal with the false coloration of IR, UV, ... images - another object to implement in Celestia. Therefore we will perhaps also need something like 'color themes', to deal with (different?) good, meaningful colorisations - another object class.
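
For illustration, a minimal sketch of such a color lookup table applied to one grayscale band image. The black-red-yellow-white ramp and all names are my own invention, not an existing Celestia structure:

Code: Select all

// Sketch: apply a color lookup table ("color theme") to a grayscale band.
// The ramp is an arbitrary IR-style palette.
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

using RGB = std::array<std::uint8_t, 3>;

std::array<RGB, 256> makeHeatPalette()
{
    std::array<RGB, 256> lut{};
    for (int i = 0; i < 256; ++i)
        lut[i] = {static_cast<std::uint8_t>(std::min(255, i * 3)),
                  static_cast<std::uint8_t>(std::min(255, std::max(0, (i - 85) * 3))),
                  static_cast<std::uint8_t>(std::min(255, std::max(0, (i - 170) * 3)))};
    return lut;
}

std::vector<RGB> applyPalette(const std::vector<std::uint8_t>& gray,
                              const std::array<RGB, 256>& lut)
{
    std::vector<RGB> out(gray.size());
    for (std::size_t i = 0; i < gray.size(); ++i)
        out[i] = lut[gray[i]];
    return out;
}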

I would like to see a list of the common bands that should be implemented first. And I do not understand how 3D rendering is directly involved in the filtering issue (except if you want to display 3D ionosphere effects).

Sorry, just some quick short notes, 'cause I'm already off (to France) for the next three weeks - 'til then,

maxim

Topic author
t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 6 months
Location: Hamburg, Germany

Post #15 by t00fri » 05.05.2005, 09:06

Here is another spectacular NEW multi-wavelength result: the famous Sombrero galaxy (M104) as imaged by the Spitzer Space Telescope (IR) (lower right), its visual appearance (Hubble, lower left), and an amazing combination of visual AND IR (top image):


Image

It was taken from here:

http://www.spitzer.caltech.edu/Media/releases/ssc2005-11/ssc2005-11a.shtml

The Spitzer Space Telescope is of course another rich (and ongoing) source of incredible IR images.

So here is another four-color composite of invisible IR light, showing emission at wavelengths of 3.6 microns (blue), 4.5 microns (green), 5.8 microns (orange) and 8.0 microns (red). These wavelengths are roughly 10 times longer than those seen by the human eye.

Image

Clearly, if Celestia is ever to involve multi-wavelength displays, we should contemplate a standard false-color encoding to be applied to all IR imaging in Celestia. The same goes for UV. Then people will quickly memorize the color-to-wavelength correspondence and can immediately read off the dominant emissions from the images... no matter whether galaxies, nebulae, planets, moons or stars are on display.



Bye Fridger

Topic author
t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 6 months
Location: Hamburg, Germany

Post #16 by t00fri » 05.05.2005, 10:30

By the way...

with two such equal-sized M51 images in the visual and IR bands, one can play some very revealing "image-manipulation games". I used the king-size (3000x2400) versions of the M51 images, and aligned and overlaid them in two revealing modes:

1) Subtract mode (see the GIMP online manual for an explanation!):

Here is a small part of the resulting overlay, emphasizing the satellite galaxy of M51 (red!):

Image

2) Multiply mode (see the GIMP online manual for an explanation):

Exactly the same part as in Subtract mode above, but now with the overlaid pixels in Multiply mode:

Image
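
For reference, the per-pixel arithmetic behind those two GIMP layer modes is simple; here is a sketch for 8-bit channels, assuming GIMP's classic definitions (Subtract clamps at zero, Multiply rescales the product):

Code: Select all

// Per-pixel arithmetic of the two GIMP modes, for 8-bit channels.
#include <algorithm>
#include <cstdint>

// Subtract mode: lower layer minus upper layer, clamped at zero.
std::uint8_t blendSubtract(std::uint8_t base, std::uint8_t overlay)
{
    return static_cast<std::uint8_t>(std::max(0, base - overlay));
}

// Multiply mode: product rescaled back into 0..255.
std::uint8_t blendMultiply(std::uint8_t base, std::uint8_t overlay)
{
    return static_cast<std::uint8_t>(base * overlay / 255);
}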

Enjoy,

Bye Fridger

Topic author
t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 6 months
Location: Hamburg, Germany

Post #17 by t00fri » 05.05.2005, 10:36

...and another fantastic visual-versus-IR comparison from the Spitzer Space Telescope site:

Image

Just to refute somebody's remark that there might be only a little appropriate image material at different wavelengths ;-)



Bye Fridger

Slalomsk8er
Posts: 128
Joined: 26.07.2004
With us: 20 years 2 months
Location: Earth 7.593358long / 47.582393lat

Post #18 by Slalomsk8er » 05.05.2005, 13:34

Clearly, if Celestia is ever to involve multi-wavelength displays, we should contemplate a standard false-color encoding to be applied to all IR imaging in Celestia. The same goes for UV. Then people will quickly memorize the color-to-wavelength correspondence and can immediately read off the dominant emissions from the images... no matter whether galaxies, nebulae, planets, moons or stars are on display.

I think the user should be able to use whatever color he likes for his bands (color tables). But we need to define a standard for the color mapping in the Celestia files that are bundled with Celestia.
With the law of the octave you can hear a planet's orbit or the sonar of bats and dolphins; can we use something like that to map the non-visual bands into the visual?
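
As a toy interpretation of that octave idea (purely illustrative, not a proposal for real code): fold any wavelength into an octave-wide window approximating the visible band by repeated doubling or halving, the way a pitch is transposed by octaves. Amusingly, Spitzer's 8-micron band then lands at about 500 nm, i.e. green:

Code: Select all

// Toy sketch: transpose a wavelength by "octaves" into a window that
// approximates the visible band. The window [375, 750) nm is exactly one
// octave wide, which guarantees the loop terminates.
#include <cstdio>

double foldToVisible(double nm)
{
    while (nm < 375.0)  nm *= 2.0;
    while (nm >= 750.0) nm /= 2.0;
    return nm;
}

int main()
{
    std::printf("8 um -> %.0f nm\n", foldToVisible(8000.0)); // prints 500 nm
    return 0;
}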

Some mixing modes like add, subtract, divide and multiply would be cool to have besides blend ;)
ASUS A7N8X Deluxe

AMD Athlon XP 2800+ (2.08GHz)

1GB DDR RAM 333MHz

NVIDIA GeForce FX 5600 AGP 8X

Paolo
Posts: 502
Joined: 23.09.2002
With us: 22 years
Location: Pordenone/Italy

Post #19 by Paolo » 05.05.2005, 16:21

t00fri wrote:Just to refute somebody's remark that there might be only a little appropriate image material at different wavelengths ;-)


Please clarify your idea; if not, the discussion will continue to follow dreamy but inconsistent paths.

Do you mean that we are able to collect the same image of e.g. a nebula (same orientation, resolution) acquired at different wavelengths? Then we would be able to create by hand, using e.g. the GIMP, a set of perfectly superimposable alternative textures for the same nebula.

If so, the feature is really easy to implement. It would be enough to extend the DSC file format to include something like this:

Code: Select all

{
...
    Alt_text {
        Name "Visible NOAO"
        Bandwidth "xxx-yyy nm"
        Note "After UV post processing. Courtesy of NASA"
        Texture "Trifid_Nebula_NOAO.jpg"
    }
...
}


The coding for the DSC enhancement would be trivial and easy. A copy-paste activity.
Loading the alternative texture would be the same. A copy-paste activity.
Displaying alternate textures for the selected object would be the same. A copy-paste activity.
Displaying the additional info in verbose mode on the HUD would be the same. A copy-paste activity.
Additional 3D-OGL-STL capabilities required: none.
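Indeed, the in-memory mirror of that block could be as small as this - the field names are copied from the sketch above, not an existing DSC structure:

Code: Select all

// Names copied from the Alt_text sketch above; not an existing DSC structure.
#include <string>

struct AltText
{
    std::string name;      // "Visible NOAO"
    std::string bandwidth; // "xxx-yyy nm"
    std::string note;      // "After UV post processing. Courtesy of NASA"
    std::string texture;   // "Trifid_Nebula_NOAO.jpg"
};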

The time estimated to get a running copy of Celestia with these basic capabilities is <10 full coding hours. Debugging is another matter.

The amount of work necessary to prepare the textures... depends. A high-quality result will require a lot of manual image-processing work.

So if you agree with all this, by the end of the week you or someone else will surely find the time, and we will have a new pre-release with a wavelength-filter capability for deep-sky objects. :wink:
Remember: Time always flows, it is the most precious thing that we have.
My Celestia - Celui

Topic author
t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 6 months
Location: Hamburg, Germany

Post #20 by t00fri » 07.05.2005, 12:41

Paolo wrote:
Please clarify your idea; if not, the discussion will continue to follow dreamy but inconsistent paths.

Do you mean that we are able to collect the same image of e.g. a nebula (same orientation, resolution) acquired at different wavelengths? Then we would be able to create by hand, using e.g. the GIMP, a set of perfectly superimposable alternative textures for the same nebula.

If so, the feature is really easy to implement.
....


Paolo & others,

I doubt that the real challenges are that simple. Of course we could keep ourselves busy with all sorts of "10-working-hour bricolages". What has made me think for quite a while, however, are possible approaches that realize a maximum of the exciting possibilities based on the multi-wavelength (filter) concept in Celestia.

Since I am basically convinced that nothing much will really happen on that front before Chris is hopefully back (if at all), there is no particular reason to hurry.

So I am just taking my time and exploring the different possible alleys one may take.

Yes indeed, as to deep-sky objects, there is now a steady influx of new hi-res IR (Spitzer, 2MASS...) and UV (GALEX, ...) imaging, besides the visual-light images we have e.g. from Hubble. Clearly, the respective objects may be rescaled, mapped, aligned and superimposed in various modes by "texture people".

If we want to achieve outstanding, "revolutionary" display solutions in this field, we first of all need a logistic concept that catches on:

1) What do we intend to achieve through multi-wavelength displays? Who are the people we want to address with such new features?

2) Which objects should we select and which should we skip? It makes little sense to just start somewhere at random.

3) How are we going to exploit the available multi-wavelength imaging information systematically and with a uniform color-mapping concept? How do we weigh "mere beauty of display" against "scientific information & content"?
...

I have followed up quite a number of concrete ideas for possible realizations. When they are more mature and satisfying, at least to myself, I shall certainly report on them. Certainly I have already spent quite a lot of time exploring the various possibilities.

A really important issue in my view is a well-chosen color-mapping scheme, which should be both maximally informative and most intuitive to grasp (and, not to forget, beautiful ;-) ). [It is of secondary importance to me at this point whether we want to allow users to modify it or not.]

So I did a number of experiments here as well.

Let me illustrate something that I kind of fancy right now.

++++++++++++++++++++++++++++

Background info:

Human eyes are "measuring instruments" for light, with different kinds of photopigments used to sense 3 types of color (B=blue, G=green and R=red (yellow, actually)), according to known spectral-sensitivity profiles. The human brain then composes these differently filtered signals into what we perceive as "visual color".

Altogether, we are able to "see" a wavelength window (0.4 - 0.7 micron), centered around green (~0.55 micron), as illustrated in this figure:

Image

"Natural" Color Mapping:

The IR Spitzer Space Telescope, is providing
various filtered images in a shifted
wavelength-window, centered e.g. in the (deep) infrared
around 8 micron! Typically at three IR reference wavelengths

1) 3.5 microns
2) 8.0 microns
3) 24.0 microns

[more frequently, they use 3-4 "colors" between 3.5 and 8 micron, however]

Suppose we interpret these "short", "medium" and "long" IR
shots as "blue" "green" and "red" input to our modified "eye",
now thought to be sensitive in a shifted window with
center ~8 microns (instead of 0.55 microns), our brain will
again provide an intuitive "visual" color
composition/translation of the three IR filter images. This
would look like this for the beautiful example of the M81
spiral galaxy:

0.5 microns <-> RGB visual:
-----------------------------------
Image

---shifting ----shifting ----shifting----

3.5 microns -> "blue":
-------------------------
Image

8.0 microns -> "green":
--------------------------
Image

24 microns -> "red":
-------------------------
Image

RGB-brain translation (GIMP) ;-)
------------------------------------------------

Image

AHA! not bad...


Next comes a mosaic showing how we could nicely interpolate with a slider in Celestia. Based on this discrete imaging input, the 5 images display (top left to bottom) what we would "see" of M81 when shifting the center sensitivity of our "virtual eye" from the actual 0.55 micron to 1.86 microns, 3.7 microns, 5.6 microns and finally to the Spitzer central 8-micron IR wavelength!

The IR spectral ranges should of course be properly normalized (which I skipped for simplicity).

Image
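
Such a slider could be approximated by cross-fading between the two band images whose centre wavelengths bracket the slider position. A rough sketch, assuming an invented API and at least two registered, equal-sized band images sorted by centre wavelength:

Code: Select all

// Sketch: cross-fade between the two band images bracketing the slider.
// Interpolating in log-wavelength keeps the steps perceptually even.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

struct BandImage
{
    double centreMicrons;
    std::vector<std::uint8_t> pixels;
};

std::vector<std::uint8_t> imageAt(const std::vector<BandImage>& bands,
                                  double sliderMicrons)
{
    // Find the pair of neighbouring bands that brackets the slider.
    std::size_t i = 0;
    while (i + 2 < bands.size() && bands[i + 1].centreMicrons < sliderMicrons)
        ++i;
    const BandImage& a = bands[i];
    const BandImage& b = bands[i + 1];
    double t = (std::log(sliderMicrons)   - std::log(a.centreMicrons)) /
               (std::log(b.centreMicrons) - std::log(a.centreMicrons));
    t = std::min(1.0, std::max(0.0, t));
    std::vector<std::uint8_t> out(a.pixels.size());
    for (std::size_t p = 0; p < out.size(); ++p)
        out[p] = static_cast<std::uint8_t>((1.0 - t) * a.pixels[p] +
                                           t * b.pixels[p]);
    return out;
}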

Bye Fridger
Last edited by t00fri on 07.05.2005, 22:48, edited 2 times in total.

