VR portation?

Discussion forum for Celestia developers; topics may only be started by members of the developers group, but anyone can post replies.
Topic author
Arktos
Posts: 1
Joined: 30.01.2018
With us: 6 years 9 months

VR portation?

Post #1 by Arktos » 30.01.2018, 00:26

Hey Guys!

Way back in the early 2000s I first came across Celestia and immediately fell in love with it. Some time later I couldn't (for whatever reason) get it running on my new PC, and so I forgot about it. Until recently, that is, when I was playing "Google Earth VR" on the Vive. The feeling of hovering above Earth in virtual reality reminded me strongly of how it felt the first time I did that in Celestia. And then, following the obvious path, I discovered that there is still quite an active community here.
So I wondered if anyone has plans to port Celestia to a VR system, now that these are pretty good and affordable. With a growing variety of VR systems (Vive, Oculus, PSVR, etc.) there should be one that fits. And I bet "Celestia VR" would be an almost guaranteed hit. If I think about the educational value (let alone the entertainment factor)... boy, that would be awesome!

Arktos

Janus
Posts: 537
Joined: 13.08.2016
With us: 8 years 3 months

Post #2 by Janus » 30.01.2018, 17:27

Depending on how you choose to look at it, Celestia is already a type of VR system.

Amusingly, Celestia already contains the core code to display this.
It is the split-screen option: View > Split Vertically.
You should be able to use this for poor people's VR, or what I call cross-eyed VR, though I have not quite managed it yet.
Split the screen, then, using a script, position the two views at the same place.
Offset the left one to the left and the right one to the right, by as close to the same amount as possible.
Defocus your eyes the same way you do for the stereogram posters you see in malls, the ones that show fish or giraffes or spaceships.
In this case, if you are looking at Earth, you will see three Earths: two on the sides out of focus, and a middle one that looks very focused, and curved.
Then issue the same movement commands to both.
If anyone can solve the positioning problems in a script, it might be amusing.
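Here is a rough, untested sketch of what such a script might look like, assuming the CELX 1.6.x observer methods splitview(), getposition(), setposition(), getorientation(), setorientation() and center() behave as documented, and that getobservers() returns the two views in left/right order (an assumption I have not verified):

[code]
-- poor_mans_vr.celx -- untested sketch of the split-screen idea above
obs = celestia:getobserver()
obs:splitview("v")                   -- View > Split Vertically

views = celestia:getobservers()      -- should now hold two observers
left, right = views[1], views[2]

-- Put both views in the same place, facing the same way.
right:setposition(left:getposition())
right:setorientation(left:getorientation())

-- Issue the same movement command to both so they stay in lockstep.
earth = celestia:find("Sol/Earth")
left:center(earth, 5)
right:center(earth, 5)
wait(5)
[/code]

The actual left/right eye offset is the geometry described below.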

This is the essence of VR.
A single focus point is the start.
Just like normal, it has an XYZ absolute position.
Then it has its own facing direction, like the observer in CELX scripts, or the observer data shown on screen, for example.
Then two display windows are made instead of one.
Each is offset a couple of inches to the left and right, in whatever the local unit of measurement is, based on the orientation of the base view.
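To make that concrete without leaning on any particular Celestia call, here is a small plain-Lua sketch of the geometry: one base position, one facing direction as a quaternion, and two eye positions offset along the view's local right axis. The separation value and the axis convention are illustrative assumptions only.

[code]
-- Rotate vector v = {x,y,z} by unit quaternion q = {w,x,y,z}:
-- t = 2 * (q_vec x v);  v' = v + w*t + q_vec x t
local function rotate(q, v)
  local tx = 2 * (q.y * v.z - q.z * v.y)
  local ty = 2 * (q.z * v.x - q.x * v.z)
  local tz = 2 * (q.x * v.y - q.y * v.x)
  return { x = v.x + q.w * tx + (q.y * tz - q.z * ty),
           y = v.y + q.w * ty + (q.z * tx - q.x * tz),
           z = v.z + q.w * tz + (q.x * ty - q.y * tx) }
end

-- Base view: one absolute position and one facing direction.
local base_pos = { x = 0, y = 0, z = 0 }
local base_rot = { w = 1, x = 0, y = 0, z = 0 }   -- identity orientation

-- Local "right" axis of the base view, then two eyes offset along it.
local half_sep = 0.032                            -- half of ~6.4 cm, in metres
local r = rotate(base_rot, { x = 1, y = 0, z = 0 })

local left_eye  = { x = base_pos.x - r.x * half_sep,
                    y = base_pos.y - r.y * half_sep,
                    z = base_pos.z - r.z * half_sep }
local right_eye = { x = base_pos.x + r.x * half_sep,
                    y = base_pos.y + r.y * half_sep,
                    z = base_pos.z + r.z * half_sep }
-- Both eyes keep base_rot as their orientation; only the position differs.
[/code]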

You end up with two displays from one set of data, just like the split screen, only on two different screens.
The hard part is syncing those displays for display timing and data contents.
This requires a dual rendering pipeline, which is tough.
The simplest and best way is to use in-memory multicasting, which only specialty hardware supports.

Beware though: if the displays change alternately rather than in sync, the viewer will experience a phantom spinning motion and/or other symptoms.
It is the visual equivalent of the Coriolis effect combined with a roller coaster ride, and it is unpleasant.
These symptoms can include headaches, nausea, seizures, and general disorientation.

If you want to see what it is like, it is simple, though not pleasant.
Place a single light behind you that illuminates the wall behind your display but not the display itself; incandescent or LED, not fluorescent.
Then put a fan to one side on low, so the flicker from the blades is visible in one eye but not the other.
Now try to watch a movie on a large monitor with lots of movement.
Though not identical, it conveys the basic idea.
As one eye reacts to the flicker at the edge of its vision, the other will anticipate it and react in advance as well.
Overcoming these issues is where most of the work is at.

The other problem is overcoming event sync issues; it is not complicated, but it is a lot of work.
Doing that work results in display delays.
Display delays are the visual equivalent of ping, only instead of milliseconds, it is measured in display frames.
What happens is that the hardware renders one side and records it.
Then it renders the other side and records it.
It then copies those renders to frame buffers in the goggles.
The goggles then swap both buffers at the same time.
This results in a frame for render one, a frame for render two, a frame for copy one, a frame for copy two, and a frame for the swap.

You can still have 60fps quite easily.
However, depending on rendering time, you can have a display delay of two or three, and up to five, frames, which will vary based on the conditions being rendered.
Thus one frame is 1/60 s ~= 17 ms, which means roughly 34 to 51 ms, with display delays of up to 85 ms.
These variable delays are why people wearing face-hugger displays move funny: their eyes are lagging.
Also, most of the rendering systems try for higher frame rates, but the higher the frame rate, the twitchier the system is.
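Putting rough numbers on that, a few lines of Lua reproduce the arithmetic (the 90 fps figures are only there because headset systems aim for higher rates; the 17 ms per frame above is rounded, so the exact values come out slightly lower):

[code]
-- Display delay in milliseconds for a delay measured in whole frames.
local function delay_ms(frames, fps)
  return frames / fps * 1000
end

for _, frames in ipairs({ 1, 2, 3, 5 }) do
  print(string.format("%d frame(s): %.0f ms at 60 fps, %.0f ms at 90 fps",
                      frames, delay_ms(frames, 60), delay_ms(frames, 90)))
end
[/code]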

There is also a type that does a dual-centered single display.
Kind of a dual-render single display.
It does the same thing as Celestia's vertical split screen, to reduce delays.
That way it only has to copy a single frame.
This approach has its own problems however, and its own limitations.

While I have avoided using all the right terminology, because jargon alone does not help newcomers grasp the underlying ideas, I hope I have conveyed the basic idea of what VR is.
If you want a different version, there is an explanation that fits here even better.
VR is the reflection of parallax.

Right now we measure the offsets of stars six months apart to determine how far away they are.
In VR, we calculate the offset of objects from two viewpoints inches apart.
In parallax, the closer a star is to being in the same place in two pictures taken six months apart, the further away it is.
The same goes for VR: the closer a thing is to being in the same place on the two screens, the further away the viewer 'sees' it.
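To put the analogy in numbers, here is a small plain-Lua sketch (not any Celestia API): the classic parallax rule, distance in parsecs equals one over the parallax in arcseconds, next to the standard stereo triangulation rule, depth equals baseline times focal length over disparity. The specific figures are illustrative only.

[code]
-- Astronomical parallax: distance in parsecs is the reciprocal of the
-- parallax angle in arcseconds (the angle corresponding to a 1 AU baseline).
local function distance_pc(parallax_arcsec)
  return 1 / parallax_arcsec
end

-- Stereo "reverse parallax": pinhole-camera triangulation.  The smaller the
-- disparity between the two images, the further away the point appears.
local function depth_m(baseline_m, focal_px, disparity_px)
  return baseline_m * focal_px / disparity_px
end

print(distance_pc(0.768))          -- Proxima Centauri: ~1.3 parsecs
print(depth_m(0.064, 1000, 4))     -- 6.4 cm baseline, 4 px disparity  -> 16 m
print(depth_m(0.064, 1000, 64))    -- same baseline, 64 px disparity   ->  1 m
[/code]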

So, if you want to look at it this way, our telescopes are busy interpreting a VR display known as reality, while here we take our observations of the universe and see how close we come to reproducing the real thing.


Janus.

Alexell M
Site Admin
Posts: 303
Joined: 07.10.2010
Age: 30
With us: 14 years 1 month
Location: Moscow, Russia

Post #3 by Alexell » 29.05.2018, 14:56

In fact, the idea is good. Janus is right: VR can already make use of the vertical split of the Celestia screen that is already implemented. The rest can be done with a CELX script, which would control the process and show the object on both screens with the necessary shift.

But if we talk about VR in general, it would be nice to implement a special VR mode in Celestia, one that would not only split the screen vertically, but also disable separate control of the two views, still let you travel manually, and automatically show on both halves of the screen an image with the necessary shift.
Admin of celestia.space
PC: Intel Core i7-8700 @ 3.20GHz, SSD, 16 Gb RAM, NVIDIA GeForce GTX 1080, Creative Sound Blaster ZxR. Windows 10 x64.
Phone: iPhone Xs 256 Gb. iOS 14.

