Experimenting with Augmented Reality in Unity

Augmented reality has become a lot more accessible to developers in recent years. The introduction of ARKit in iOS 11 and ARCore for Android 7.0 (Nougat) and above means it's easier than ever to create amazing, mind-bending augmented reality experiences.

Augmented reality has been around for a long time, but it seems as though the technology will become much more prevalent in the coming years thanks to the backing of Apple, Google, and Facebook (see AR Studio).

Having previously used Unity3D to develop a number of interactive displays for museums, I was very pleased to hear that Unity 2017.1.0 supports ARKit via a dedicated plugin. The plugin interfaces with the native ARKit SDK, which makes the process of creating AR apps fairly straightforward for Unity developers.
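As a rough illustration of how little code is involved, here's a minimal sketch of a tap-to-place script using only core Unity APIs. It assumes the plugin has generated colliders for the detected planes (as its example scenes do); the class and field names are my own:

```csharp
using UnityEngine;

// Minimal sketch: tap the screen to place a model on a detected surface.
// Assumes the ARKit plugin has generated plane GameObjects with colliders.
public class TapToPlace : MonoBehaviour
{
    public Transform model; // the model to position, assigned in the Inspector

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the camera through the touch point into the scene.
        Ray ray = Camera.main.ScreenPointToRay(touch.position);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
        {
            // Move the model to where the ray meets a detected plane.
            model.position = hit.point;
        }
    }
}
```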

Below are a couple of demos I created in Unity using the ARKit plugin, running on an iPad (5th generation).

The example above shows how AR could be used to create a simple virtual racing game. The car model (or in this case, duck model) can be driven around a flat surface mapped onto the physical environment. Even with this basic setup the possibilities for AR gaming are easy to see, especially for multiplayer games where all the players share the same physical space.

In an earlier version of this remote-control car demo I used one button for 'forward' and another for 'reverse', but the controls didn't feel intuitive and this distracted from the gameplay itself. From this early experiment I've found that when manipulating a 3D model in AR space, fewer and simpler UI controls tend to yield a better user experience.
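One way to collapse the two buttons into a single touch is sketched below. This is an illustration rather than the demo's exact script: the `speed` and `turnSpeed` values, and the idea of steering from the touch's horizontal position, are my own assumptions:

```csharp
using UnityEngine;

// Minimal sketch of a single-touch drive control: hold a finger anywhere on
// screen to drive forward, steering by how far the touch sits left or right
// of centre. Values and steering scheme are illustrative assumptions.
public class SimpleDriveControl : MonoBehaviour
{
    public float speed = 0.5f;    // forward speed in metres per second
    public float turnSpeed = 90f; // degrees per second at full steering lock

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);

        // Map the touch's horizontal position to a steering value in [-1, 1].
        float steer = (touch.position.x / Screen.width) * 2f - 1f;

        transform.Rotate(0f, steer * turnSpeed * Time.deltaTime, 0f);
        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```

Collapsing drive and steer into one gesture means the player's eyes stay on the model in the physical space rather than on the on-screen buttons.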

This second example shows how the orientation of a 3D model can be manipulated in real time using augmented reality. This demo is fairly crude, but it does give us an insight into how AR can be used to visualise large-scale structures within real environments, which could be useful in several industries, especially construction, where catching mistakes in the design phase can prevent costly manufacturing defects.

From a UX design point of view it raises the question: how should users interact with 3D models in AR space? One finger to reposition, two fingers to rotate? Should the rotate gesture be like turning a dial, similar to how someone would interact with a dimmer switch or analogue volume control?
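To make the dial idea concrete, here's a minimal sketch of how that gesture could work using Unity's standard touch input: track the angle of the line between the two touches and apply the frame-to-frame change as a rotation about the vertical axis. Again, this is an illustration of the interaction rather than a polished implementation:

```csharp
using UnityEngine;

// Minimal sketch of a 'dial' rotate gesture: rotate the model about the
// vertical axis by the frame-to-frame change in angle of the line drawn
// between two touches, much like turning a dimmer switch.
public class TwoFingerDialRotate : MonoBehaviour
{
    private float previousAngle;
    private bool rotating;

    void Update()
    {
        if (Input.touchCount != 2)
        {
            rotating = false;
            return;
        }

        // Angle (in degrees) of the line between the two touch points.
        Vector2 line = Input.GetTouch(1).position - Input.GetTouch(0).position;
        float angle = Mathf.Atan2(line.y, line.x) * Mathf.Rad2Deg;

        if (rotating)
        {
            // DeltaAngle handles the wrap-around at +/-180 degrees; the sign
            // flip makes the model follow the direction of the twist.
            float twist = Mathf.DeltaAngle(previousAngle, angle);
            transform.Rotate(0f, -twist, 0f, Space.World);
        }

        previousAngle = angle;
        rotating = true;
    }
}
```

Using `Mathf.DeltaAngle` keeps the rotation continuous when the gesture crosses the +/-180 degree boundary, so the model doesn't snap as the user twists past the half-turn point.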

I intend to explore UX design for augmented reality apps in more detail in a future post, but for now these simple experiments have given me a good insight into how the technology can be used beyond Snapchat filters!