We’re using hand tracking tech to recreate the experience of the Apollo moon missions as accurately and accessibly as we can. So yes, it’s slightly crazy for a small team like ours to take this on, but I can explain. Our engineer, Blayke, will also jump in and give the juicy details of how we’re doing it.
We had already been working with Oculus VR when hand tracking suddenly became a possibility, and our jaws dropped. This was absolutely the technology we needed. The catch was that it was the most unstable technology we had ever worked with. It was a gamble, and when we started, it didn’t seem like it would pay off.
The first problem was that there was a dearth of information on this new hand tracking functionality, even on Oculus’ own Developer Portal. Even before working with this new technology, I knew I’d be facing a lot of hurdles: not only did I need to implement it on my own, I also needed to design it so I could fall back to other input systems, such as normal VR controllers, traditional mouse and keyboard, and touchscreen input for mobile devices.
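To make the fallback idea concrete, here is a minimal sketch of an ordered chain of input backends. Apollo itself is built in Unity/C#, so this Python version (and the `InputSource`/`InputRouter` names) is purely my illustration of the design, not the actual implementation:

```python
from abc import ABC, abstractmethod

class InputSource(ABC):
    """One possible input backend: hand tracking, VR controller, mouse, touch."""

    @abstractmethod
    def is_available(self) -> bool:
        """Whether this backend can currently provide input."""

    @abstractmethod
    def read_grab_strength(self) -> float:
        """0.0 (open) to 1.0 (fully grabbing), however the backend defines it."""

class InputRouter:
    """Walks the sources in priority order and uses the first available one,
    so hand tracking can silently fall back to controllers, then mouse, etc."""

    def __init__(self, sources):
        self.sources = sources  # ordered by priority, hand tracking first

    def active_source(self):
        for source in self.sources:
            if source.is_available():
                return source
        return None  # no input available at all this frame
```

The point of the router is that the rest of the game only ever asks for "the active source" and never cares which backend is behind it.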
Additionally, I knew it would be hard to properly fit hand tracking into this experience. Matts and I talked about wanting the ability to push and pull ourselves around the LEM using our hands, exploiting Newton’s first law of motion and the zero-gravity nature of the experience. However, hand tracking in its current state is an extremely unreliable form of input that also requires the hands to be within roughly a 180-degree field of view of where you are currently looking.
This means designing around cases where you are holding onto the LEM while looking in another direction, and accepting that the tracking system could stop recognizing the hands at any moment (bad lighting conditions, hand poses the system simply doesn’t understand, etc.) or misjudge where the hands actually are.
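To show what the push-and-pull locomotion means mechanically, here is a toy one-dimensional model (Python, illustrative only, not the actual Unity code): while you grab a fixed handhold, moving the hand one way moves the body the other way, and on release the body keeps its last velocity and coasts, which is the inertia part:

```python
class ZeroGBody:
    """Toy 1D model of zero-g hand locomotion: pulling your hand toward you
    while grabbing a fixed handhold pushes your body the opposite way, and
    letting go leaves you coasting at whatever speed you had."""

    def __init__(self):
        self.position = 0.0
        self.velocity = 0.0
        self.grabbing = False
        self._last_hand = None

    def grab(self, hand_pos):
        self.grabbing = True
        self._last_hand = hand_pos
        self.velocity = 0.0  # anchored to the handhold

    def move_hand(self, hand_pos, dt):
        if self.grabbing:
            delta = hand_pos - self._last_hand
            self.position -= delta        # hand moves back, body moves forward
            self.velocity = -delta / dt   # remember how fast the body is moving
            self._last_hand = hand_pos

    def release(self):
        self.grabbing = False  # velocity is kept: the body coasts from here

    def step(self, dt):
        if not self.grabbing:
            self.position += self.velocity * dt  # no gravity, no drag
```

For example, pulling the hand back 10 cm over 0.1 s and releasing leaves the body drifting forward at 1 m/s until it grabs something else.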
But I knew that as the tech matured (which it assuredly will; take a look at the latest published research by Facebook!), these problems would largely diminish or disappear entirely, so we figured it was best to lay whatever groundwork we could.
Alas… we circle back to the main issue at hand: figuring out exactly how to get the tech working for Apollo.
I decided to build on top of the great open source Virtual Reality Toolkit (VRTK), as it should provide a framework in which Apollo can work with the wide range of inputs I wanted. Unfortunately for me, this is yet another place where the documentation is nigh non-existent. The project has been in a pre-release state for quite some time, and though the software is stable, the only places I can look to understand how it works are its own source code and the hours-long developer brain-dump videos on YouTube.
So I set out by making a test scene in Unity where I could play with VRTK and try to accomplish some basic zero-gravity movement with hand tracking as my input.
This was an incredibly time-consuming and laborious task, with admittedly not much to show for it. The majority of it involved digging deep into VRTK’s source code, combing through all the developer entries, trying to make sense of the different components on offer, and wrapping my head around this new method of Unity development that consists almost solely of hooking components together in the Inspector, with Unity Events serving as the glue between them.
Unfortunately, with this style of development it can be incredibly difficult to read the flow of data and see where and when the processing happens, because the Unity Inspector is hierarchical in nature, while the numerous interconnected pieces would hugely benefit from a node graph representation (like Unity’s newly released Bolt; maybe the project can be moved to it one day… perhaps I will do it? 🤔). It took me days to trace out some of the pre-built interactions offered by VRTK’s developers so I could modify them to Apollo’s needs.
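For readers unfamiliar with this style: the pattern is essentially the observer pattern. Here is a tiny Python stand-in (not Unity's actual UnityEvent class, and the component names are made up) showing why the data flow is hard to trace: the grab component never references the body mover directly, so the subscription wired up "in the Inspector" is the only link between them:

```python
class UnityEventLike:
    """Minimal stand-in for a UnityEvent: listeners get wired up externally,
    then the owning component invokes them without knowing who they are."""

    def __init__(self):
        self._listeners = []

    def add_listener(self, fn):
        self._listeners.append(fn)

    def invoke(self, *args):
        for fn in self._listeners:
            fn(*args)

class GrabInteractable:
    """Fires an event when grabbed; has no idea what reacts to it."""
    def __init__(self):
        self.on_grabbed = UnityEventLike()

class BodyMover:
    """Attaches the player body to whatever was grabbed."""
    def __init__(self):
        self.attached = False
    def attach(self):
        self.attached = True

# The "Inspector wiring": this one line is the entire connection between
# the two components, which is why tracing data flow takes so long.
grab = GrabInteractable()
mover = BodyMover()
grab.on_grabbed.add_listener(mover.attach)

grab.on_grabbed.invoke()  # grabbing now moves the body
```

In a real scene there are dozens of these connections, each buried in a different component's Inspector panel, which is where a node graph view would shine.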
I had some incredibly annoying bugs that took a few days to squash. For one, my momentum wasn’t being preserved when letting go of objects; it came down to an API call that, as I eventually learned from Oculus’ API documentation, does not behave the same way with hand tracking input as it does with Touch controller input. I also spent a considerable amount of time retooling Unity’s Gizmo debug tools to be properly visible in VR, so I could do useful visualizations while testing, such as the distances between the bones of my fingers and the momentum vectors upon ungrasping the environment.
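The actual fix for the momentum bug was Oculus-specific, but one common way to get a stable release velocity out of jittery tracked hand positions (not necessarily what ended up in Apollo) is to estimate it over a short window of recent samples rather than a single frame. A sketch, again in Python with illustrative names:

```python
from collections import deque

class VelocityEstimator:
    """Estimates release velocity from a short history of (time, position)
    samples, smoothing out the per-frame jitter of tracked hand positions."""

    def __init__(self, window=8):
        # keeps only the most recent `window` samples
        self.samples = deque(maxlen=window)

    def add_sample(self, t, pos):
        self.samples.append((t, pos))

    def velocity(self):
        """Average velocity across the window; 0.0 if we lack history."""
        if len(self.samples) < 2:
            return 0.0
        (t0, p0) = self.samples[0]
        (t1, p1) = self.samples[-1]
        return (p1 - p0) / (t1 - t0)
```

On release, you read `velocity()` once and hand that to the physics body, so a single bad tracking frame right at the moment of letting go can't ruin the throw.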
Where we are now: we have a more or less working input model, integrated into the main Apollo scene, that lets me move through zero gravity by pushing and pulling on objects in my environment, and it works with traditional VR controllers. Seeing and using your hands in the LEM is really fucking cool! It only further solidified that this was something I really want to make sure I get right.
Unfortunately, there is still one flaw I have not yet figured out how to resolve, and from what I’ve been able to gather, it comes down to the finicky nature of hand tracking: I cannot produce reliable, consistent behaviour when moving the “virtual body” around with hand tracking. It works 100% correctly with motion controllers, but not with the hands.
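One mitigation worth sketching (purely an idea, not something in Apollo yet) is to gate body motion on the tracker's per-frame confidence: when confidence drops, hold the last good pose instead of trusting a bad guess, so a mistracked frame can't yank the body around. In illustrative Python:

```python
class TrackedHand:
    """Gates hand-driven motion on tracking confidence: below the threshold,
    the last trusted pose is held rather than using a likely-wrong guess."""

    def __init__(self, threshold=0.7):
        self.threshold = threshold
        self.last_good_pose = None

    def update(self, pose, confidence):
        """Returns (pose_to_use, trusted_this_frame)."""
        if confidence >= self.threshold:
            self.last_good_pose = pose
            return pose, True
        # tracking is unreliable right now: freeze on the last good pose
        return self.last_good_pose, False
```

The caller would then only feed body movement from frames flagged as trusted, accepting a brief freeze over an unpredictable lurch.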
But I am confident that I will eventually figure it out like I always do. Just need time!
We have two goals for our game: Accuracy and Accessibility. We want people to live through the Apollo missions as if they were there - BUT - we also want this game to be accessible for as many people as possible to learn from and enjoy.
Those goals can sometimes be at odds, and walking the tightrope between them is what makes working on this project important. We could make a super-accurate simulator that takes weeks to learn and never deviates from history, but it would be as dry as a textbook. We could also make an unrealistic action-movie retelling of the history, but it wouldn’t have nearly the same impact as sticking to what really happened.
So our first hurdle was the controller. Even with simple mechanics, we saw unfamiliar players struggle with the hardware and get demoralized. This was especially heartbreaking for older audiences who wanted to reconnect to the history, as well as younger players who wanted to learn it.