How AR Instruction Manuals Drive Superior Product Engagement
"Instruction manual." The words conjure up images of thick, boring black-and-white tomes, underused, maybe even untouched for years in our homes. But no longer.
Augmented reality has “blown the dust” off the instruction manuals of yesteryear.
Augmented reality (AR) is maturing as a technology, and its adoption on mobile devices is growing steadily. According to research and consulting firm Tractica, expanding use cases for mobile AR will drive growth from 343 million unique monthly active users (MAUs) globally in 2016 to nearly 2 billion MAUs by 2022. Thanks to the mix of virtual and real-world experiences it provides, AR offers users a more engaging, interactive experience, and it becomes even more powerful when combined with artificial intelligence (AI) and computer vision capabilities.
Forward-thinking enterprises are using visual AR to enhance the entire customer journey. These companies are harnessing the power of AR to create an effective user engagement tool that connects customers to their products. Every day, new opportunities for enterprise AR usage are discovered.
Interactive Visual Guidance: The Value Proposition
Today’s AR-based instruction manuals bear no resemblance to the printed booklets of the past. AR provides an effective digital interface that displays content visually during initial setup, configuration, troubleshooting, regular maintenance, or demonstrations of proper usage. AR delivers an immersive experience, enabling users to feel self-reliant, empowered, and in control, which translates directly into a positive customer experience (CX).
From the enterprise standpoint, AR manuals deliver better results, significantly alleviating the pressure on the service operation. There are fewer calls to customer service asking for help, a reduced need to dispatch technicians to customers’ homes, and fewer no-fault-found (NFF) returns caused by a lack of customer knowledge about using the product.
AR Manuals: Effective Throughout the Product Life Cycle
Unboxing and installation
One of the most popular applications of AR has been assistance with setup and installation. A typical AR unboxing guide provides superimposed instructions on top of a video representation of the physical product to be assembled. For example, IKEA’s AssembleAR app – built on Apple’s ARKit – uses the original diagrams and layouts of the paper IKEA manual but overlays them with animations and life-size references that simulate and clarify the self-assembly process. TechSee’s Virtual Technician uses augmented reality to accurately instruct customers on how to properly connect any electronic device, from a modem to a coffee machine.
Enterprises have also made good use of AR capabilities by giving users an overview of their new product’s functionality via a visually immersive experience. Car manufacturers were early adopters of AR manuals, allowing consumers to use a smartphone or tablet to get acquainted with every element on their new vehicle’s dashboard, with some manuals also including how-to information for basic repairs and maintenance, such as checking oil or refilling wiper fluid. Today, many consumer-facing products make use of engaging AR-based user guides, which transform the initial interaction with appliances, electronic devices, and consumer packaged goods into an easier, more enjoyable experience.
Troubleshooting
The lengthy troubleshooting process can be performed independently with the help of an intelligent, interactive visual guide that presents instructions in a clear, step-by-step format. Troubleshooting is a complex, non-linear process: identifying the problem, determining the correct sequence of actions for the particular device or circumstance, communicating the solution to the customer, and testing that the resolution was effective. To manage this process successfully, the AR manual must be backed by mature, advanced computer vision and AI deep learning capabilities.
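The non-linear flow described above can be sketched as a simple guided loop: try the next candidate step for the diagnosed symptom, then verify whether it resolved the issue. This is a minimal, illustrative Python sketch; the symptom names, playbook, and callbacks are all hypothetical stand-ins, not any vendor's actual implementation.

```python
# Illustrative sketch of a step-by-step troubleshooting flow:
# identify the problem, apply the playbook steps for that symptom,
# and test after each step. All names here are hypothetical.

# Hypothetical knowledge base mapping a symptom to ordered repair steps.
PLAYBOOKS = {
    "no_internet": ["check_cable", "reboot_router", "reset_modem"],
    "no_power":    ["check_outlet", "check_power_cable"],
}

def troubleshoot(symptom, apply_step, verify_fix):
    """Run the steps for a symptom until verify_fix() reports success."""
    for step in PLAYBOOKS.get(symptom, []):
        apply_step(step)          # e.g. show the AR overlay for this step
        if verify_fix():          # test whether the resolution worked
            return step           # the step that resolved the issue
    return None                   # exhausted: escalate to a human agent

# Usage: simulate a user whose issue is fixed by the second step.
state = {"steps_done": 0}
def apply(step):
    state["steps_done"] += 1
result = troubleshoot("no_internet", apply, lambda: state["steps_done"] >= 2)
print(result)  # reboot_router
```

A real AR guide would replace `apply_step` with rendering the visual instruction and `verify_fix` with a camera-based or user-confirmed check.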
Technological Requirements of an AR Manual
Computer vision uses object and motion recognition algorithms to identify images and objects in the user’s physical environment. This capability enables an AR app to recognize a product, device, or part thereof, using advanced image processing — or, when available, barcode and serial number reading — to identify the object. The stronger its identification capabilities, the more the app can do: recognizing the device itself, its different parts and ports, auxiliary parts such as cables and screws, and relevant elements in the surrounding physical space, such as wall sockets or jacks.
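The identification fallback described above — prefer an exact barcode or serial read when one is visible, otherwise fall back to image-based recognition — can be sketched as follows. This is a hypothetical Python outline in which the barcode reader and classifier are stubs standing in for real computer vision components.

```python
# Hypothetical sketch: identify a product from a camera frame,
# preferring a barcode/serial read when one is visible and falling
# back to image-based recognition otherwise. The recognizers are
# stubs standing in for real computer-vision models.

def read_barcode(frame):
    # Stand-in for a barcode/serial-number reader; returns None
    # when no code is visible in the frame.
    return frame.get("barcode")

def classify_image(frame):
    # Stand-in for an image classifier returning (label, confidence).
    return frame.get("guess", ("unknown", 0.0))

def identify(frame, min_confidence=0.8):
    code = read_barcode(frame)
    if code is not None:
        return ("exact", code)        # exact model identification
    label, conf = classify_image(frame)
    if conf >= min_confidence:
        return ("visual", label)      # confident visual match
    return ("unknown", None)          # low confidence: ask the user

# Usage with mock frames instead of real camera input.
print(identify({"barcode": "SN-12345"}))           # ('exact', 'SN-12345')
print(identify({"guess": ("router_x200", 0.93)}))  # ('visual', 'router_x200')
```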
To align interactions between real and virtual objects in an AR environment, so that an overlay stays in position even as the camera or object moves, the app must maintain a continuous, precise, real-time understanding of the shape and location of the real objects in the environment.
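In essence, keeping an overlay anchored means re-projecting the tracked 3D position of the real object into 2D screen coordinates every frame. The minimal sketch below uses a simple pinhole-camera projection with illustrative intrinsic values (focal length and principal point); real AR frameworks handle this continuously with full pose tracking.

```python
# Minimal sketch of keeping an AR overlay "anchored": each frame, the
# tracked 3D position of the real object (in camera space) is
# re-projected into 2D screen coordinates with a pinhole-camera model,
# so the overlay follows the object as it or the camera moves.
# The intrinsics below are illustrative, not from a real device.

def project(point3d, focal=800.0, cx=640.0, cy=360.0):
    """Project a camera-space 3D point onto the image plane."""
    x, y, z = point3d
    return (cx + focal * x / z, cy + focal * y / z)

def overlay_positions(tracked_poses):
    """One screen position per frame for the anchored overlay."""
    return [project(p) for p in tracked_poses]

# Usage: the object drifts right and away from the camera; the overlay
# moves with it instead of staying at a fixed screen position.
frames = [(0.0, 0.0, 2.0), (0.1, 0.0, 2.0), (0.2, 0.0, 2.5)]
print(overlay_positions(frames))
```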
Deep learning, the most advanced form of AI, enables systems to learn independently from massive data sets. Unlike classic methods, in which a human expert must define features (rules and attributes), deep learning learns directly from data without human intervention. Powered by deep learning, an AR app can learn from each consumer interaction to continuously improve the process flow for future customer interactions. Data collected from each interaction may include an image bank of devices and models, relevant sequences of actions, possible results, and so on.
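The feedback loop described above — record the outcome of each interaction, then use it to improve future flows — can be illustrated without any deep learning at all. This hypothetical Python sketch ranks troubleshooting steps by observed success rate; a production system would replace the counters with learned models, but the loop of record-then-reorder is the same idea.

```python
# Illustrative sketch of improving the process flow from interactions:
# record which step resolved each session, then rank steps by observed
# success rate so future guides try the most effective step first.
# A real system would use learned models; this counter-based version
# only demonstrates the feedback loop. All names are hypothetical.

from collections import Counter

class FlowOptimizer:
    def __init__(self, steps):
        self.steps = list(steps)
        self.successes = Counter()
        self.attempts = Counter()

    def record(self, step, resolved):
        """Log one interaction outcome for a troubleshooting step."""
        self.attempts[step] += 1
        if resolved:
            self.successes[step] += 1

    def ordered_steps(self):
        """Highest observed success rate first; ties keep original order."""
        def rate(step):
            tried = self.attempts[step]
            return self.successes[step] / tried if tried else 0.0
        return sorted(self.steps, key=rate, reverse=True)

# Usage: after a few sessions, "reboot_router" proves most effective,
# so it moves to the front of the recommended flow.
opt = FlowOptimizer(["check_cable", "reboot_router", "reset_modem"])
for resolved in (False, True, True):
    opt.record("reboot_router", resolved)
opt.record("check_cable", False)
print(opt.ordered_steps()[0])  # reboot_router
```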
What was once science fiction has become everyday reality. AR has found practical use as an instructional tool across many industries. As an ideal interface for visual guidance, AR is considered the most suitable technology to meet customer demand for excellent consumer-product CX, especially in light of the growing number of IoT devices flooding the market.
With the right features and capabilities, AR is poised to serve as the underlying technology for improved consumer-product engagement — or, simply put, the future of the user manual — while conserving the environment by saving plenty of paper along the way.