Augmented reality has become a lot more accessible to developers in recent years. The introduction of ARKit in iOS 11 and ARCore for Android means it’s easier than ever to create amazing, mind-bending augmented reality experiences. Augmented reality has been around for a long time, but it seems the technology will become much more prevalent in the coming years thanks to the backing of Apple, Google, and Facebook (see AR Studio).
Having used Unity3D for a number of projects, mainly to develop interactive displays for museums, I was very pleased to hear that Unity included support for ARKit at the launch of iOS 11. Augmented reality in Unity is made possible by a plugin that interfaces with the ARKit SDK, making the process of creating AR apps fairly straightforward for developers already familiar with Unity.
Lured by the prospect of a shallow learning curve, I decided to experiment, both from a programming and a user experience design point of view, to gain a better understanding of what could be possible with AR.
Below are a couple of demos I’ve created in Unity using the ARKit SDK plugin, running on iPad (5th generation).
The video above shows a simple remote-control car augmented reality app. The model can be driven around a flat surface using forward (accelerate) and left/right controls – exactly like a real remote-controlled car. Driving this little duck-car around your desktop is actually great fun (even if it is completely pointless), so the potential for AR gaming is easy to see. I imagine multiplayer AR gaming would be an immensely entertaining experience, especially if your fellow players are with you in the same room.
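The driving logic itself is simple to sketch in Unity. Below is a minimal, hypothetical MonoBehaviour (the class and field names are my own, not from the demo), assuming the on-screen buttons set `accelerate` and `steer` each frame:

```csharp
using UnityEngine;

// Hypothetical sketch of the remote-control car logic.
// Attach to the car model once it has been placed on a detected plane.
public class ARCarController : MonoBehaviour
{
    public float moveSpeed = 0.5f;   // metres per second in AR space
    public float turnSpeed = 90f;    // degrees per second

    // Set by the UI button handlers: accelerate is true while the
    // forward button is held; steer is -1 (left), 0, or 1 (right).
    public bool accelerate;
    public float steer;

    void Update()
    {
        // Turn around the car's up axis, like steering a real RC car...
        transform.Rotate(0f, steer * turnSpeed * Time.deltaTime, 0f);

        // ...then move along its own forward vector while accelerating.
        if (accelerate)
            transform.Translate(Vector3.forward * moveSpeed * Time.deltaTime);
    }
}
```

Rotating first and then translating along `Vector3.forward` (which `Translate` interprets in the object's local space by default) is what gives the car-like feel: the model always drives in the direction it is facing.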
In an earlier version of the remote-control car demo I used separate buttons for ‘forward’ and ‘reverse’, but the controls didn’t feel as intuitive; on several occasions I had to glance down at my fingers to relocate the reverse button after a mishit. So from this early experiment I’ve found that when manipulating a 3D model in AR-space, fewer, simpler UI controls yield a better user experience.
This second video shows an early experiment using AR to place and manipulate a 3D model in physical space (in this case a nearby high street, because my office was too small to get sufficient perspective). The demo is still fairly crude at this point, but it shows how augmented reality can be used to visualise large-scale structures within real environments, which could be useful in several industries – especially construction, where mistakes identified in the design phase can prevent costly manufacturing defects. From a UX design point of view, it raises the question of how users should interact with 3D models once they are positioned in space. One finger to reposition, two fingers to rotate? Should the rotate gesture work like turning a dial, similar to how someone would interact with a dimmer switch or an analogue volume control?
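To make the dial idea concrete, here is one way that gesture could be sketched in Unity – a hypothetical script of my own, not the demo's actual implementation. It assumes the script is attached to the placed model and that no other touch handling interferes:

```csharp
using UnityEngine;

// Hypothetical sketch of a two-finger "dial" rotate gesture.
// Attach to the placed 3D model.
public class ARModelRotator : MonoBehaviour
{
    private float previousAngle;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch a = Input.GetTouch(0);
        Touch b = Input.GetTouch(1);

        // Angle of the line between the two fingertips, like the
        // pointer on a dial or volume knob.
        Vector2 delta = b.position - a.position;
        float angle = Mathf.Atan2(delta.y, delta.x) * Mathf.Rad2Deg;

        if (a.phase == TouchPhase.Began || b.phase == TouchPhase.Began)
        {
            // Start of the gesture: just record the initial angle.
            previousAngle = angle;
            return;
        }

        // Rotate the model around the world's vertical axis by however
        // far the finger pair has twisted since the last frame.
        transform.Rotate(0f, previousAngle - angle, 0f, Space.World);
        previousAngle = angle;
    }
}
```

Tracking the change in angle frame-to-frame (rather than the absolute angle) means the model rotates smoothly from wherever it currently is, which matches the dimmer-switch feel described above.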
I intend to explore the area of UX design for augmented reality apps in more detail in a future post, but for now these simple augmented reality experiments have given me a good insight into how the technology can be used beyond Snapchat-style filters and effects!