
“It’s really hard.”

Now that our friends in Redmond have been more open with peeks into the HoloLens program and those involved are free to speak (a little) more openly, I’ve been reading a ton of tech blogs regarding the experience. Unfortunately, it seems many of these writers have only a small idea of what software engineers or platform architects do. I say that, because I have read a lot of “It can map your environment, and that is really hard to do.” Well folks – letting your buddy have the last pizza roll is “really hard.” This is actually… “5 solid years of NASA/MIT/American Mathematical Society research and development, nearly impossible, success means a change in the way we think about computers from the ground up…” hard. And they did it. Nearly as impressive as “Spatial Mapping” are the “Spatial Sound” capabilities – the device’s ability to broadcast a sound seemingly from your left, right, or behind you with pinpoint accuracy. These two core pieces of functionality are the backbone of what HoloLens offers and, so far, what no one can match. Without them, all you really have is a 3D object floating in space. With them, you have the world’s first holographic environmental experience. What’s the early summary? Totally worth developing for, totally going to change the way we see the world, and totally going to change the way we interact with technology. (totally).
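At its core, that “sound from your left or behind you” trick is directional audio math. As a rough illustration only – HoloLens itself uses full head-related transfer functions (HRTFs), which also handle front/back and elevation cues – here is a minimal equal-power panning sketch (in Python, for readability) that maps a source’s azimuth to left/right channel gains:

```python
import math

def equal_power_gains(azimuth_deg):
    """Map a source azimuth (-90 = hard left, +90 = hard right)
    to left/right channel gains using equal-power panning."""
    # Normalize the clamped azimuth to a pan position in [0, 1].
    pan = (max(-90.0, min(90.0, azimuth_deg)) + 90.0) / 180.0
    # Equal-power law: the gains trace a quarter circle, so
    # left^2 + right^2 == 1 at every pan position (constant loudness).
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right

# A source directly ahead is split evenly between both ears.
l, r = equal_power_gains(0)
```

This only covers left/right placement on a stereo pair; the “behind you” effect requires the per-ear filtering an HRTF provides.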

“But How?”

The HoloLens hardware is still locked down – but there are some things we do know, and some inferences we can make. We know that the world mapping, gaze tracking, and holographic vision capabilities come from technology derived from KINECT – as in, “binocular (tri-nocular?) KINECT sensors” processing images and mapping the environment in real time. While the device’s sensor loadout (Wi-Fi, Bluetooth, accelerometer, gyroscope, etc.) is not public either, it makes sense that the HoloLens uses all four of those heavy hitters. Wi-Fi in particular is logical for one epic reason – Azure Compute Power. Rumors abound of the new flagship phone (Lumia 940?) having an octa-core processor so, HoloLens should have one too, right? Probably – but the amount of data being processed and rendered would still be too much for that tiny powerhouse to handle.

Enter Azure: we know that new Xbox One games can spin up their own Azure compute instances to render hundreds of thousands of particles and physics effects in real time. Why not a HoloLens app? Speaking of apps, with tools like Unity and Visual Studio as a foundation, the HoloLens (basic) application build-and-deployment process seems pretty straightforward. Using the tools firsthand, it was not much different than building out any other Unity scene, and the code-behind was straight C# with events triggered and bubbled up in the same fashion as every other Windows 10 demo we saw at //Build.
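That “events triggered and bubbled up” pattern is the same observer pattern found in any event-driven UI. As a hypothetical sketch only (the class and event names here are invented for illustration, not HoloLens API; written in Python for brevity), a scene object raising a tap event to scene-level code-behind looks like this:

```python
class EventSource:
    """Tiny observer helper standing in for C# events."""
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def raise_event(self, sender, args):
        # Notify every subscriber, in registration order.
        for handler in self._handlers:
            handler(sender, args)

class HologramButton:
    """A hypothetical scene object that raises a Tapped event."""
    def __init__(self, name):
        self.name = name
        self.tapped = EventSource()

    def on_gaze_tap(self):
        # Triggered by the input system; bubbles up to subscribers.
        self.tapped.raise_event(self, {"source": self.name})

# Scene-level code-behind subscribes just like in any other app.
log = []
button = HologramButton("launch-pad")
button.tapped.subscribe(lambda sender, args: log.append(args["source"]))
button.on_gaze_tap()
```

The point is that nothing about the wiring is exotic: the holographic part changes what triggers the event (gaze plus gesture instead of mouse or touch), not how your code handles it.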

“But Why?”

So why would you want an app built as a hologram? Well, first of all – because it’s cool. Aside from the coolness factor though, Vectorform has been researching and exploring how, over the evolution of computers, humans interact with their digital environment. We’ve seen how user experience has evolved to become more natural, using voice, touch, and gesture. We have been pioneering the concept of natural user environment (NUE) and HoloLens makes great strides in coalescing a digital/natural environment future.

With HoloLens – the computer is the room. Everything. In. The. Room. Yep, handling and controlling digital objects naturally in “your” environment is the evolution of user experience. Thinking of a standard “long list” on a vertical plane, stuck to a wall, isn’t doing the experience justice – it isn’t using your environment. How about representing those items as crystal cylinders in small holders organized across the floor? How about, rather than clicking an item, you take a crystal cylinder and place it in a holder on a podium and all your content is launched, cascading around the room – Fortress of Solitude style? See? Superman had it right. In the coming weeks and months, the UI/UX experts at Vectorform will be hard at work defining what makes a holographic experience pop and creating interaction models for the common control types found in modern mobile applications.

“But What?”

Consumer, Education, or Enterprise? Apps or Games? Yes, yes, yes, yes, and yes. HoloLens certainly lends itself more toward one side of that scale than the other – but nothing should be off the table. As a developer, I can see myself building for all of these scenarios. The experience and interaction model is certainly tailored for completing a given task – a new tool for work, and we’ve seen this example played out via most of the available demos. Or as a teaching tool/group collaboration device (probably the rest of the demos you’ve seen). However, as a gaming device there are genres and mechanics that can benefit from environmental interactions; think bowling, Skee-Ball, tennis, ping pong, even one-on-one fighters (can you beat Ryu?), or first person shooters.

The area where HoloLens may have trouble is the consumer app market. HoloLens, while comfortable, is too bulky for all day use, and no one would really reach for it to check the weather when they can quickly glance at their lock screen. Where it can make headway in the consumer app market is with experiential applications – applications you spend time in. Music and entertainment are obvious wins, and possibly a variety of exercise and fitness uses. I believe, however, it will be the apps that take time to explore creativity in the user experience and user interface that will shine through. Personal productivity and communication applications fall into this category of “experience” applications. While I might not want to use HoloLens to get a weather report, I might want a weather module included in my note taking and research app. Perhaps the smallest niche with the potential for the largest impact is HoloLens integration with IoT devices – everything from a depth camera broadcasting 3D holograms of the baby’s room to my HoloLens, to visually flying with (or controlling) a drone through 3D space. As a developer, the world really is at my fingertips now. I can actually “build” anything.

“But When?”

Well – now. Devices are nearly production-ready: wire-free, self-contained, durable, and great to look at – a solid beta. Tools are working and, if not completely bug-free, moving in that direction. Again – a solid beta.

But until release:

  1. Get a copy of Unity 5 and get familiar with the layout of a 3D scene.
  2. Get a copy of Visual Studio and start playing around with Universal applications and the Windows SDK.
  3. Most importantly – start thinking. Think about how your favorite app would be better in 3D space, how it would look in any given room… World news in a list…? Nope. World news on a globe! How would you walk around it? What components can be “touched?” How can you talk to it? Take the red pill, free your mind.
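To make that last point concrete: pinning a flat list of headlines onto a globe is, underneath the magic, just a coordinate transform. A minimal sketch (in Python for illustration; it assumes each story carries a latitude/longitude, which your news feed would have to supply) converting those to 3D positions on a sphere:

```python
import math

def latlon_to_xyz(lat_deg, lon_deg, radius=1.0):
    """Convert latitude/longitude to Cartesian coordinates on a
    sphere of the given radius (y is up, matching Unity's axes)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.sin(lat)
    z = radius * math.cos(lat) * math.sin(lon)
    return x, y, z

# A story pinned at the equator/prime meridian sits on the x-axis.
x, y, z = latlon_to_xyz(0, 0)
```

In a Unity scene you would feed these coordinates to each headline object’s transform, offset by wherever the globe floats in the room – the interesting design work is everything after the math: how you walk around it, touch it, and talk to it.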
