Lessons Learned From Our First Month Coding With The HoloLens

Since we started working with the HoloLens, we've seen it ignite people's imaginations.

It certainly has ignited ours. We're very impressed by how this ready-to-use device synchronizes its complex hardware -- the stereo camera, camera position, gaze, spatial tracking, speech recognizer, and gestures.

We used to track and code these key elements ourselves, at a low level and across a myriad of devices, so we've come to really appreciate the simplicity of the API. While the gestures feel a bit simple to me -- especially if you've worked with Leap Motion devices -- I don't mind the constraints for now.
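To give a sense of that simplicity, here's roughly what a scene-wide tap listener looks like with the GestureRecognizer API in Unity, which is what we build with (more on that below). This is a minimal sketch, not our production code; the GestureManager name just follows a convention I'll describe later in this post.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // Unity 5.x namespace; later versions moved this to UnityEngine.XR.WSA.Input

// A minimal scene-wide listener for the air-tap gesture.
public class GestureManager : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            Debug.Log("Air tap detected.");
        };
        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```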

With the complications of hardware synchronization out of the way, we were able to focus on the holographic experience.

We are focusing our attention on four applications right now:

The Restaurant Experience allows a user to browse and customize a restaurant menu through holograms placed in front of them. The main controls are clicking and saying "next plate" (more on this in the video below; a minimal sketch of the voice command follows this list of apps).

Philips Hue lights control lets you change the color of a Philips Hue light or turn it off completely with gestures (also in the video).

Home Concert is a prototype that immerses you in a concert experience and lets you control the instruments playing.

Space Invaders in 3D is our popular augmented reality game remade for the HoloLens.
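As promised, here's roughly what the "next plate" command looks like, built on Unity's KeywordRecognizer. A minimal sketch, not our actual code; the MenuManager and ShowNextPlate names are placeholders.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Hypothetical manager that advances the menu when the user says "next plate".
public class MenuManager : MonoBehaviour
{
    private KeywordRecognizer keywordRecognizer;

    void Start()
    {
        // Register the voice command and start listening.
        keywordRecognizer = new KeywordRecognizer(new[] { "next plate" });
        keywordRecognizer.OnPhraseRecognized += OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        if (args.text == "next plate")
        {
            ShowNextPlate();
        }
    }

    private void ShowNextPlate()
    {
        // Placeholder: swap the current dish hologram for the next one.
        Debug.Log("Advancing to the next plate.");
    }

    void OnDestroy()
    {
        if (keywordRecognizer != null && keywordRecognizer.IsRunning)
        {
            keywordRecognizer.Stop();
            keywordRecognizer.Dispose();
        }
    }
}
```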

We decided to build our applications with Unity, which we worked with at the Holographic Academy last fall.

To get started, I did all of the tutorials available on the Holographic Academy website, except the one on sharing holograms.

Initially, I thought that the HoloToolkit would cover most of the use cases I would need. I quickly realized that it was still very early and missing a lot, though it was a great base to get started on the right track.

I found myself adding scripts from the different tutorials to my own copy of the HoloToolkit. For example, one of the tutorials included a simple prefab that handled the scene lighting nicely.

We quickly fell into a naming convention for our scripts: global scripts and objects are suffixed Manager, and scripts that get attached to a hologram and process incoming events are prefixed Action.
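In practice the split looks something like this sketch (class names are illustrative): a Manager recognizes an event and forwards it to the hologram under the gaze, and an Action script on that hologram reacts.

```csharp
using UnityEngine;

// A Manager (like the GestureManager above) finds the hologram under the
// user's gaze and forwards events to it, for example with:
//   focusedObject.SendMessage("OnSelect");

// An Action script lives on the hologram itself and reacts to those events.
public class ActionRotate : MonoBehaviour
{
    // Invoked via SendMessage when the user taps this hologram.
    void OnSelect()
    {
        transform.Rotate(Vector3.up, 15f);
    }
}
```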

Initially we deployed remotely, but we quickly moved to deploying over USB, which is much faster.

This is what I've got working so far:

I spent a lot of time figuring out how to move holograms around using the gaze with a drag-and-drop gesture, and I'm getting pretty good at it. I dissected a script from the tutorial and stripped it down to the essence of the mechanic, then rebuilt it from there.
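Stripped to its essence, the mechanic looks something like the sketch below: capture the hologram's distance from the head when the drag starts, then keep it on the gaze ray at that distance every frame. This is my simplified reconstruction, not the tutorial script itself, and the OnDragStart/OnDragStop hooks are assumed to be called by a gesture Manager.

```csharp
using UnityEngine;

// Essence of gaze-based drag-and-drop: while dragging, the hologram
// follows the gaze ray at the distance it had when the drag started.
public class ActionDrag : MonoBehaviour
{
    private bool dragging;
    private float distance; // distance from the head, captured at drag start

    // Called by a gesture Manager when a tap-and-hold begins on this hologram.
    void OnDragStart()
    {
        distance = Vector3.Distance(Camera.main.transform.position, transform.position);
        dragging = true;
    }

    void OnDragStop()
    {
        dragging = false;
    }

    void Update()
    {
        if (dragging)
        {
            // Keep the hologram on the gaze ray at its original distance.
            Transform head = Camera.main.transform;
            transform.position = head.position + head.forward * distance;
        }
    }
}
```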

For some time, I had an annoying bug where objects jumped toward me the moment I started moving them, but I resolved it. If that happens to you, drop me an email and I can explain why.

Once I got past the initial jumpiness, there was still an issue with our holograms' general stability: when you walk and move around them, they can sometimes be a little shaky. The gaze stabilizer did not help. I believe I've worked through the issue and will report more in a future post.
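In the meantime, one documented lever for exactly this symptom is the stabilization plane: telling the device, each frame, which point in space to prioritize when it reprojects the image. The sketch below is a generic illustration of that idea (not necessarily the fix I'll describe in that post), and the StabilizationManager name and public target field are mine.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA; // Unity 5.x namespace; later versions use UnityEngine.XR.WSA

// Each frame, set the stabilization plane on the hologram we care about,
// which helps reduce wobble as the user moves around it.
public class StabilizationManager : MonoBehaviour
{
    public Transform target; // hologram to stabilize, assigned in the Inspector

    void LateUpdate()
    {
        if (target == null) return;

        // Orient the plane toward the user so reprojection favors this depth.
        Vector3 normal = -Camera.main.transform.forward;
        HolographicSettings.SetFocusPointForFrame(target.position, normal);
    }
}
```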

I'm still looking into how we could make a standard positioning prefab similar to what the Holographic Windows Interface offers.
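The behavior I mean is tap-to-place: the hologram follows your gaze, snaps onto real surfaces via the spatial mapping mesh, and a second tap pins it. Here's a rough sketch of that mechanic, not a finished prefab; the ActionTapToPlace name and the layer-mask setup are my assumptions.

```csharp
using UnityEngine;

// Rough tap-to-place: tap to pick the hologram up, it follows the gaze and
// sticks to spatial-mapping surfaces, tap again to drop it.
public class ActionTapToPlace : MonoBehaviour
{
    public LayerMask spatialMappingLayer; // layer the spatial mapping mesh lives on
    private bool placing;

    // Called by a gesture Manager when this hologram is tapped.
    void OnSelect()
    {
        placing = !placing;
    }

    void Update()
    {
        if (!placing) return;

        Transform head = Camera.main.transform;
        RaycastHit hit;

        // Stick the hologram to the real-world surface the user is gazing at.
        if (Physics.Raycast(head.position, head.forward, out hit, 5f, spatialMappingLayer))
        {
            transform.position = hit.point;
            transform.rotation = Quaternion.LookRotation(hit.normal);
        }
    }
}
```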

In short, our first month with the glasses was awesome and exactly what we expected. In the past few weeks, I've seen very fast improvements to the HoloToolkit, which in my opinion is the way to go. Between the HoloToolkit and tutorials 210 through 240, you have everything you need to start developing for the HoloLens.

In my next post, I'll talk more about our first full application: controlling a Philips Hue light from your HoloLens.

Are you developing an application or have an idea for ways to work with mixed reality? Get in touch with me -- I'm happy to answer any questions.