At this year’s GDC, independent developer Opaque Multimedia announced and demonstrated a tech demo of their first-person astronaut game Earthlight. You can watch the teaser trailer they released above; footage of the tech demo should be coming soon.
Earthlight is a PC game set aboard the ISS (International Space Station) that supports VR via the Oculus Rift and motion controls via Kinect For Windows v2 using Opaque Multimedia’s own Kinect 4 Unreal (K4U) software.
You can read more details in the following press release and interview:
MELBOURNE, AUSTRALIA, 24th February, 2015 – Opaque Multimedia today officially announced Earthlight, a tech demo designed to demonstrate the immersive power of the Oculus Rift coupled with the Kinect 2. Earthlight will be available at the Opaque Multimedia booth at GDC 2015, with a public release to follow.
Earthlight is a first-person exploration game in which the player steps into the shoes of an astronaut on the International Space Station (ISS) and, from 431 kilometres above, looks down on the Earth from within the comfort of their own spacesuit.
Featuring the most realistic depiction of the International Space Station used in an interactive VR setting to date, Earthlight demonstrates the very limits of what is visually achievable in consumer-oriented VR experiences.
In Earthlight, players interact with the environment entirely through a natural motion interface, pushing and pulling themselves along the surface of the ISS as they navigate a network of handles and scaffolds.
The player’s motion is captured through a Microsoft Kinect 2 and powered by the company’s Kinect 4 Unreal technology.
“Visiting the International Space Station, floating freely in space, is a dream of many but a reality experienced by only a few,” said Professor Christopher Fluke, Coordinator of the Virtual Reality Theatre at the Centre for Astrophysics and Supercomputing at Swinburne University, at a pre-release demonstration of Earthlight.
“Earthlight is a genuinely engaging experience. Within moments of putting on the Oculus Rift headset, I was ready to climb out onto the surface of the Space Station and begin exploring. I am excited by the potential for interactive virtual experiences to play a greater role in bringing dreams of space exploration to life.”
“It’s been a real challenge to create a pleasant UX when you’re trying to combine VR with kinetic input,” says Chris Mackenzie, one of the team members responsible for creating the sophisticated body-mapping system that underlies Earthlight. “We have to pay close attention to a number of different factors that you don’t normally have to account for in conventional VR projects, particularly proprioception.”
“Proprioception is the awareness of the position of your body and joints,” Chris explains. “When we replace people’s bodies with a virtual avatar that mimics their movement, we need to ensure that what they’re seeing matches what they think their limbs are doing; otherwise it makes for a deeply disturbing experience.”
“We have really pushed both the software and the hardware to their limits,” says Norman Wang, Earthlight Project Lead and Art Director at Opaque Multimedia. “We really did put the Kinect 2 through its paces. The Kinect is already pretty good normally, but in a VR setting where you’re staring at a pair of virtual arms that are centimetres from the camera, we had to eliminate even the slightest bit of jitter while keeping the simulation responsive.”
Earthlight 123KINECT Interview
I recently interviewed Earthlight’s Project Lead and Art Director Norman Wang along with the K4U Lead Russell Grain about Earthlight and working with Kinect 2:
123KINECT: What exactly can the player do in the current Earthlight tech demo? What are the Kinect controls, and is there an objective, or is it currently simply an experience for exploring the ISS?
Norman Wang: Currently the player follows a set course we planned out on the ISS, using their hands to grab onto handles and position themselves in the world. The hands are tracked with the Kinect, and the input is piped into Unreal Engine 4 with our Kinect 4 Unreal plugin. The current version is designed very much as an exhibition experience, meaning that it is failure-free and you can’t just fly off by letting go.
In our public release, we will give players that freedom, should they choose to ignore all the safety precautions.
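For the curious, the grab-and-pull locomotion Norman describes could be modelled roughly as follows. This is a hedged C++ sketch of the general idea only, not Opaque’s actual Unreal code; every name and value here is our own illustration:

```cpp
// Illustrative sketch, not Opaque's implementation: while the tracked hand is
// closed around a handle, hand movement is applied inversely to the player,
// so pulling the hand toward the chest moves the body toward the handle.
struct Vec3 { double x, y, z; };

struct GrabLocomotion {
    Vec3 playerPos{0.0, 0.0, 0.0};
    bool gripping = false;
    Vec3 lastHandPos{0.0, 0.0, 0.0};

    // handPos: hand joint in the player's local space.
    // handClosed: the Kinect's open/closed hand state for that hand.
    void update(const Vec3& handPos, bool handClosed) {
        if (handClosed && !gripping) {
            gripping = true;        // grab: anchor to the current hand pose
            lastHandPos = handPos;
        } else if (!handClosed) {
            gripping = false;       // release: in the demo, the player stays put
        }
        if (gripping) {
            // Hand motion while gripping translates the body the opposite way.
            playerPos.x -= handPos.x - lastHandPos.x;
            playerPos.y -= handPos.y - lastHandPos.y;
            playerPos.z -= handPos.z - lastHandPos.z;
            lastHandPos = handPos;
        }
    }
};
```

Closing the hand anchors it to a handle; pulling the hand 20 cm toward the chest then moves the body 20 cm along the station, which also explains why letting go in the exhibition build simply leaves you where you are.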
123KINECT: Is there anything you can reveal about what the full game will eventually be? (Like a realistic adventure like the movie Gravity, a sci-fi with futuristic/alien elements or maybe even a horror like Alien: Isolation?.)
Norman Wang: As for the style of game Earthlight will be, we can’t say much at the moment, but I can certainly say that we’ve considered all of those options and everything is on the table.
123KINECT: How does the Kinect 2 compare to other motion control solutions in general, and would you say it’s a good solution for VR? (I’ve noticed that a lot of, if not most, interactive VR demos/games seem to use either the Kinect 1 or 2.)
Norman Wang: With regards to Kinect for VR – we have something in the pipes that will demonstrate the viability of using multiple Kinects as a VR interface. It enables us to do device-less full-body tracking in a large volume, which is a unique capability when it comes to VR interfaces. It does, however, have a slight problem with latency, meaning that it is not as responsive as, say, the Razer Hydra or the Vive, but so far we have found it to be satisfactory. We will be demonstrating a reference implementation of it in the near future.
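Opaque hasn’t said how their multi-Kinect setup combines data, but a common approach to sensor fusion is to transform each sensor’s skeleton into a shared world frame and then blend the per-joint estimates, weighted by tracking confidence. A minimal, hypothetical sketch (the struct names and the weighting scheme are our assumptions, not their method):

```cpp
#include <vector>

// One speculative way to fuse a single joint seen by several Kinects:
// each sensor's estimate has already been calibrated into a shared world
// frame; here we just take a confidence-weighted average of the estimates.
struct Vec3 { double x, y, z; };

struct SensorSample {
    Vec3 jointWorld;    // joint position in the shared world frame
    double confidence;  // e.g. higher when the joint is tracked, not inferred
};

// Weighted average of the available estimates; returns the origin if the
// joint is not visible to any sensor.
Vec3 fuseJoint(const std::vector<SensorSample>& samples) {
    Vec3 sum{0.0, 0.0, 0.0};
    double totalWeight = 0.0;
    for (const auto& s : samples) {
        sum.x += s.jointWorld.x * s.confidence;
        sum.y += s.jointWorld.y * s.confidence;
        sum.z += s.jointWorld.z * s.confidence;
        totalWeight += s.confidence;
    }
    if (totalWeight <= 0.0) return {0.0, 0.0, 0.0};
    return {sum.x / totalWeight, sum.y / totalWeight, sum.z / totalWeight};
}
```

A scheme like this would also explain the latency Norman mentions: waiting for frames from several sensors, then blending, adds a processing step that a single direct controller such as the Hydra doesn’t have.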
123KINECT: Is Earthlight Kinect For Windows only or is an Xbox One version planned?
Norman Wang: At the moment we’re sorting through our various offers to work out if there are going to be any exclusivity arrangements. We have plans for expanding Earthlight, some of which are Kinect-less, but Kinect/Oculus is still our primary focus. We’re working as quickly as possible on our end, but any announcement won’t be imminent.
123KINECT: I know this is obviously some unknown time in the (presumably near) future, but have you given any thought to developing Earthlight for the Microsoft HoloLens?
Norman Wang: HoloLens – we can’t say much, but we’ll be all over it in a big way. Announcements will come as per Microsoft’s schedule.
123KINECT: Is there any video of Earthlight in action (beyond just the teaser trailer), showing the Kinect controls?
Norman Wang: We will be producing a post-GDC promo video, which we will release soon!
123KINECT: In regards to the Kinect 2, I was just wondering how easy or hard it was to develop for in general, what the difficulties are, and how it compares to the Kinect 1 if you have worked with that too.
Russell Grain: We worked with the Kinect 1 on an earlier project. My favourite responses were “Painful” and “Imagine a Kinect 2 that doesn’t work”; hopefully that gives you some idea of what working with the Kinect 1 was like. On the plus side, though, you could simulate another person in the room by shaking a curtain next to you.
The Kinect 2, by comparison, is fantastic. The general impression from our time working with it is that it is much more developer-friendly. All the functions you want output a simple vector, a float, or occasionally a texture, which makes integrating these inputs into other game engines much easier. It’s not particularly descriptive to say that everything is great, but, with one or two small exceptions, it worked exactly as intended.
One noteworthy difficulty we had was with the Kinect assigning arbitrary numbers as player body IDs, so we implemented a series of nodes that track player bodies by physical location in the Kinect’s view frustum; that problem is solved now. Similarly, for avateering we had problems with jitter introduced by forward kinematics (each rotation is a little off, and the error compounds down the chain into shaky hands), which necessitated building the ability to smooth the outputs, making the information a lot more usable. For Earthlight we took the next step: an inverse kinematic solution that allowed us to get rid of the jitter entirely.
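The kind of smoothing Russell mentions could be as simple as an exponential moving average applied per joint. The sketch below is a minimal, hypothetical illustration of that idea, not Kinect 4 Unreal’s actual smoothing node:

```cpp
// Illustrative only: an exponential moving average damps per-frame joint
// jitter at the cost of some lag -- the smoothness/responsiveness trade-off
// the interviewees describe.
struct Vec3 { double x, y, z; };

class JointSmoother {
public:
    // alpha in (0, 1]: lower = smoother output but more perceptible lag.
    explicit JointSmoother(double alpha) : alpha_(alpha) {}

    // Blend the raw sample toward the running smoothed state.
    Vec3 update(const Vec3& raw) {
        if (!initialised_) {
            state_ = raw;           // first sample: no history to blend with
            initialised_ = true;
        } else {
            state_.x = alpha_ * raw.x + (1.0 - alpha_) * state_.x;
            state_.y = alpha_ * raw.y + (1.0 - alpha_) * state_.y;
            state_.z = alpha_ * raw.z + (1.0 - alpha_) * state_.z;
        }
        return state_;
    }

private:
    double alpha_;
    Vec3 state_{0.0, 0.0, 0.0};
    bool initialised_ = false;
};
```

With alpha = 0.5, a joint that jumps from 0 to 1 is reported at 0.5, then 0.75, and so on: high-frequency noise is damped, but a real movement takes a few frames to fully register, which is exactly the latency concern that makes smoothing tricky in VR.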
There are also some things that are much harder to solve: for the time being, the Kinect has trouble telling the front and back of a hand apart. Without heavy image analysis there isn’t a tremendous amount we can do about this beyond creating systems to intelligently guess which way the hand should be facing. Microsoft might have some answers, and we are trying to weasel them out of them. So development continues.
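One plausible heuristic for guessing hand facing, purely our speculation rather than anything Opaque or Microsoft has described, would use the wrist, hand-tip, and thumb joints the Kinect 2 already tracks: the cross product of the wrist-to-hand-tip and wrist-to-thumb vectors gives a rough palm normal, whose direction relative to the camera hints at front versus back.

```cpp
// Hypothetical heuristic: estimate which way a palm faces from three tracked
// joints. Whether "normal toward camera" means palm or back of hand depends
// on which hand it is, so the caller must flip the result for one side.
struct Vec3 { double x, y, z; };

Vec3 sub(const Vec3& a, const Vec3& b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}

// True when the estimated palm normal points along the camera's view axis,
// taken here as +z in camera space.
bool palmNormalTowardCamera(const Vec3& wrist, const Vec3& handTip,
                            const Vec3& thumb) {
    const Vec3 n = cross(sub(handTip, wrist), sub(thumb, wrist));
    return n.z > 0.0;
}
```

A guess like this would still fail when the thumb is occluded or mis-tracked, which may be why Russell describes the problem as needing systems that “intelligently guess” rather than a clean solution.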
Anyway, that’s a bit about the many ups and occasional downs of developing with the Kinect 2.
Are you excited to see another first-person astronaut Kinect game being developed? Let us know in the comments below.