
Human gesture-based user interface for Celestia

Posted: 08.03.2011, 17:54
by rymix
Hi all,

I'm new here. Love Celestia! I only discovered it a few weeks ago and I must say I'm very impressed, not only with the software but also with the community that supports it.

I'm studying for a Masters at a UK university and I'm trying to choose my final-year project. I would like to build a human-gesture-based user interface for Celestia - think Microsoft Kinect. My goal is to make the software as accessible as possible as a learning tool for junior school children. I aim to do this by removing from view as much of the 'scary' stuff as possible, allowing users to find their way around using a simple, novel user interface.

I hope to build a UI overlay which listens for various gesture inputs (probably using the Freenect open-source Kinect drivers, together with one of the gesture middleware libraries currently available - OpenNI, Freenect, possibly even Microsoft's). I envisage a graphical menu system which appears on top of Celestia and provides basic controls such as starting tours (scripts), jumping to objects and, if I can work it out, gesture-based 3D manipulation. Imagine holding your hands out in front of you and 'rotating' a planet, zooming in and out using 'bigger' and 'smaller' gestures, adjusting time using a slider, and perhaps flying around using a virtual joystick (or perhaps body position detection).
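Very roughly, the kind of plumbing I have in mind for mapping recognised gestures to Celestia actions looks like the sketch below. Everything in it is a placeholder - none of the names are real Celestia or OpenNI code:

```cpp
// Rough sketch of the overlay's gesture dispatch. Gesture names and
// actions are made up for illustration only.
#include <functional>
#include <map>
#include <iostream>

enum class Gesture { SwipeLeft, SwipeRight, PushForward, SpreadHands, PinchHands };

class GestureDispatcher {
public:
    // Register what should happen when a gesture is recognised.
    void bind(Gesture g, std::function<void()> action) { actions_[g] = std::move(action); }

    // Called by the middleware layer whenever it recognises a gesture.
    void onGesture(Gesture g) {
        auto it = actions_.find(g);
        if (it != actions_.end()) it->second();
    }

private:
    std::map<Gesture, std::function<void()>> actions_;
};

int main() {
    GestureDispatcher dispatcher;
    // Placeholder actions standing in for real Celestia commands.
    dispatcher.bind(Gesture::SpreadHands, [] { std::cout << "zoom in\n"; });
    dispatcher.bind(Gesture::PinchHands,  [] { std::cout << "zoom out\n"; });
    dispatcher.bind(Gesture::SwipeRight,  [] { std::cout << "next tour stop\n"; });

    dispatcher.onGesture(Gesture::SpreadHands);   // would zoom the view in
    return 0;
}
```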

Lots of ideas, none particularly well defined yet, but hopefully you can see what I'm getting at.

I wondered if anyone here has worked/is working on anything similar? How feasible do you think my idea of a graphical menu overlay is (seeing as I'm not that familiar with the Celestia source yet)? And, gut feeling, do you think I should build my gesture recognition into Celestia itself or have it as an external handler (i.e. a remote 'key-presser')?
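To make the second option concrete, by a remote 'key-presser' I mean a separate process that just injects keystrokes into the Celestia window, something like the Windows-only sketch below (the window title is an assumption on my part):

```cpp
// Minimal sketch of the "external key-presser" approach on Windows:
// a separate process that focuses the Celestia window and injects
// keystrokes with SendInput. The window title is assumed.
#include <windows.h>

bool sendKeyToCelestia(char key)
{
    HWND hwnd = FindWindowA(NULL, "Celestia");   // assumed window title
    if (!hwnd) return false;
    SetForegroundWindow(hwnd);                   // key events go to the focused window

    INPUT in[2] = {};
    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wVk = static_cast<WORD>(VkKeyScanA(key) & 0xFF);
    in[1] = in[0];
    in[1].ki.dwFlags = KEYEVENTF_KEYUP;          // release after the press
    return SendInput(2, in, sizeof(INPUT)) == 2;
}

int main()
{
    sendKeyToCelestia('g');   // e.g. Celestia's "go to selection" shortcut
    return 0;
}
```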

Thanks for your help, and I look forward to working with you all over the coming months!

Kind regards,
Steve

Re: Human gesture-based user interface for Celestia

Posted: 08.03.2011, 20:39
by selden
Steve,

No one before you has mentioned working on a gesture-based interface for Celestia. The current code on SourceForge does support several different OS-specific GUIs, though, for Windows, MacOS and Linux. Perhaps their code will provide examples to help you.

The current hope is that the next major release of Celestia will have a single cross-platform (conventional, non-gesture) GUI based on Qt, and some initial code for that is included, too. It'd be nice if your gesture interface were cross-platform, too, but it'd be understandable if that turns out to be too large a project for the time you have available.

Re: Human gesture-based user interface for Celestia

Posted: 09.03.2011, 08:05
by rymix
Thanks for your reply Selden.

I will probably be mainly working with the Windows version. I have the Celestia source working in all three OS versions, and the same is true for my Kinect hacks, but news that Microsoft is going to officially support PC Kinect drivers (and perhaps release some middleware) leads me to concentrate on Windows - probably! I have until May to decide. Hopefully the picture will be clearer by then.

By 'next major release' do you mean Celestia 1.7.0 or Celestia 2.0? I would certainly be interested in the plans for cross-platform GUI standardisation. However, as you've already pointed out, I might not have the time to do a 'proper' job. The nature of my studies is experimental, and I have to remove from the critical path as many external factors as possible. This isn't to say I don't want to contribute to the Celestia project, though. If my gesture-based interface proves to be a success then I will gladly work it up into a Celestia add-on or maybe even core if it's relevant - but this might have to wait until next year.

Thanks again,
Steve

Re: Human gesture-based user interface for Celestia

Posted: 09.03.2011, 09:23
by John Van Vliet
--- edit ---

Re: Human gesture-based user interface for Celestia

Posted: 09.03.2011, 12:06
by rymix
John Van Vliet wrote: I gave up on Microsoft about 5 years ago so I build only the Linux version

Thanks John.

Yes, so did I, but my place of work had a strategy change a couple of years ago and now it's Microsoft for everything. To be fair, they are doing a far better job these days than they used to, but it's still frustrating. The Kinect stuff is amazing, though. Late last year Microsoft issued an open letter discouraging the 'Kinect hackers'. Then PrimeSense - the gesture specialist company that Microsoft hired for all the 1st gen Kinect games - issued an open letter to Microsoft pleading with them to release PC drivers and middleware to the open source community...and Microsoft agreed! Unprecedented decision, and hopefully an indication of a shift in attitude by the big boys.

Re: Human gesture-based user interface for Celestia

Posted: 09.03.2011, 18:32
by Fenerit
Just my thumbs up for your enterprise here. 8)

Re: Human gesture-based user interface for Celestia

Posted: 09.03.2011, 21:54
by selden
rymix wrote:By 'next major release' do you mean Celestia 1.7.0 or Celestia 2.0?
Yes. ;)

Some time ago Chris was considering calling the next major release v1.7 if it gets released without Qt, and v2.0 if the Qt interface is ready. There are quite a few major changes which are only partially implemented in the svn code and others which have not yet been uploaded at all. Given Chris' other commitments, it seems likely that it'll be quite a while (another year?) before either 1.7 or 2.0 becomes available. In the meantime, v1.6.1 is expected to be available RealSoonNow, and the current svn code is easily built and is quite stable, even though not all of the new features are ready.

Re: Human gesture-based user interface for Celestia

Posted: 11.03.2011, 10:37
by rymix
Thanks everyone for your comments.

At the moment it seems like my best approach will be to take a branch of 1.6.1, convert it to VS2010 (because that's required for Freenect to work), then build my gesture interface directly into Celestia, on top of the existing GUI. Freenect definitely seems to be the easiest in terms of integration, even if it has the least amount of useful middleware of the existing gesture libraries.
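For reference, the bare-bones Freenect loop everything would hang off looks roughly like this, based on libfreenect's C API as I understand it (check the header of the version you build against; newer versions may also want freenect_set_depth_mode before starting the stream):

```cpp
// Bare-bones sketch of grabbing Kinect depth frames with libfreenect.
// The gesture recognition (or a middleware layer) would be fed from the
// depth callback; nothing Celestia-specific appears here.
#include <libfreenect.h>
#include <cstdint>
#include <cstdio>

static void depth_cb(freenect_device *dev, void *depth, uint32_t timestamp)
{
    // 'depth' is one raw depth frame; hand tracking would start from here.
    (void)dev; (void)depth;
    std::printf("depth frame at %u\n", timestamp);
}

int main()
{
    freenect_context *ctx = nullptr;
    freenect_device  *dev = nullptr;

    if (freenect_init(&ctx, nullptr) < 0) return 1;
    if (freenect_open_device(ctx, &dev, 0) < 0) return 1;   // first Kinect found

    freenect_set_depth_callback(dev, depth_cb);
    freenect_start_depth(dev);

    // In the real integration this loop would run on its own thread,
    // posting recognised gestures to the Celestia UI layer.
    while (freenect_process_events(ctx) >= 0) { /* spin */ }

    freenect_stop_depth(dev);
    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return 0;
}
```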

Aside from the specific requirements of Celestia, I believe the trickiest part will be the gesture recognition itself, so I hope that whatever work I do can be easily merged into the Qt GUI when it's a bit further developed. I foresee a series of semi-transparent screen overlays with a grid of event triggers which start various routines (see the sketch below). In other words, ostensibly quite simple and probably not very flexible or intelligently designed (from a software architecture perspective).
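As a sketch of what I mean by a grid of event triggers - the tracked hand position, projected to screen coordinates, selects a cell and fires its action (all placeholder names, not Celestia code, and a real version would add dwell time before firing):

```cpp
// Hypothetical overlay grid: screen space is divided into cells,
// each cell holds an action fired when the hand cursor is over it.
#include <functional>
#include <vector>
#include <iostream>

struct OverlayGrid {
    int cols, rows, screenW, screenH;
    std::vector<std::function<void()>> triggers;   // one per cell, row-major

    OverlayGrid(int c, int r, int w, int h)
        : cols(c), rows(r), screenW(w), screenH(h), triggers(c * r) {}

    int cellAt(int x, int y) const {
        return (y * rows / screenH) * cols + (x * cols / screenW);
    }

    // Called each frame with the projected hand position.
    void update(int handX, int handY) {
        if (handX < 0 || handY < 0 || handX >= screenW || handY >= screenH) return;
        auto &t = triggers[cellAt(handX, handY)];
        if (t) t();
    }
};

int main() {
    OverlayGrid grid(3, 2, 1920, 1080);
    grid.triggers[0] = [] { std::cout << "start tour script\n"; };
    grid.update(100, 100);   // hand over the top-left cell -> start tour
    return 0;
}
```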

I need to build a functioning prototype to prove or disprove a theory. If it turns out to be something that others would benefit from too, then I'll gladly put in the effort to make it acceptable for a Celestia release.

Re: Human gesture-based user interface for Celestia

Posted: 17.07.2011, 14:09
by duds26
The operating system and drivers are what actually transform human gestures into actions, and the graphical user interface elements need to be adapted for this. Since the Celestia developers work with a graphical toolkit, you're going to have to wait for that toolkit to be adapted.