Human gesture-based user interface for Celestia
Posted: 08.03.2011, 17:54
Hi all,
I'm new here. Love Celestia! I only discovered it a few weeks ago and I must say I'm very impressed, not only with the software but with the community that supports it.
I'm studying for a Master's at a UK university and am trying to choose my final-year project. I would like to build a human-gesture-based user interface for Celestia - think Microsoft Kinect. My goal is to make the software as accessible as possible as a learning tool for junior school children. I aim to do this by removing from view as much of the 'scary' stuff as possible, allowing users to find their way around using a simple, novel user interface.
I hope to build a UI overlay which listens for various gesture inputs (probably using the Freenect open-source Kinect drivers, together with one of the middleware gesture libraries currently available - OpenNI, Freenect, possibly even Microsoft's). I envisage a graphical menu system which appears on top of Celestia and provides basic controls such as starting tours (scripts), jumping to objects and, if I can work it out, gesture-based 3D manipulation. Imagine holding your hands out in front of you and 'rotating' a planet, zooming in and out using 'bigger' and 'smaller' gestures, adjusting time with a slider, and perhaps flying around using a virtual joystick (or body position detection).
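To make the overlay idea a bit more concrete, here's the kind of mapping layer I'm picturing. This is purely illustrative - the Gesture enum and onGesture() callback are made-up stand-ins for whatever events the middleware actually delivers:

// Illustrative sketch only: the Gesture values and onGesture() entry
// point are invented; real events would come from OpenNI/Freenect.
#include <iostream>

enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, HANDS_APART, HANDS_TOGETHER };

// Translate a recognised gesture into the action the overlay should
// perform. For now just print; eventually this would drive the menu
// or send a command through to Celestia.
void onGesture(Gesture g)
{
    switch (g)
    {
    case SWIPE_LEFT:     std::cout << "previous menu item\n"; break;
    case SWIPE_RIGHT:    std::cout << "next menu item\n";     break;
    case HANDS_APART:    std::cout << "zoom in\n";            break;
    case HANDS_TOGETHER: std::cout << "zoom out\n";           break;
    }
}

int main()
{
    onGesture(HANDS_APART);  // pretend the middleware just saw a 'bigger' gesture
    return 0;
}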
Lots of ideas, none particularly well defined yet, but hopefully you can see what I'm getting at.
I wondered if anyone here has worked/is working on anything similar? How feasible do you think my idea of a graphical menu overlay is (seeing as I'm not that familiar with the Celestia source yet)? And, gut feeling, do you think I should build my gesture recognition into Celestia itself or have it as an external handler (i.e. a remote 'key-presser')?
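For what it's worth, the external 'key-presser' route looks simple enough to prototype. On Linux, something like the following - a rough sketch using the X server's XTest extension, assuming Celestia has keyboard focus - would let a separate gesture-recognition process drive Celestia through its existing keyboard shortcuts:

// Minimal sketch of the 'external key-presser' approach: inject a
// synthetic keystroke via the XTest extension, so the focused window
// (Celestia, we assume) sees it as an ordinary key press.
// Build with: g++ keypress.cpp -lX11 -lXtst
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>
#include <X11/keysym.h>

// Send one press/release pair for the given keysym.
void sendKey(Display* dpy, KeySym sym)
{
    KeyCode code = XKeysymToKeycode(dpy, sym);
    XTestFakeKeyEvent(dpy, code, True, 0);   // press
    XTestFakeKeyEvent(dpy, code, False, 0);  // release
    XFlush(dpy);
}

int main()
{
    Display* dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;
    sendKey(dpy, XK_g);  // 'G' = go to selected object in Celestia
    XCloseDisplay(dpy);
    return 0;
}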
Thanks for your help, and I look forward to working with you all over the coming months!
Kind regards,
Steve