Talking to Machines: The UX of Virtual Conversation

How AI is improving the way we talk to machines.

by Adobe Communications Team

posted on 05-15-2018

Saturday Night Live’s spoof of curmudgeons calling Amazon’s Alexa by the wrong name raises a meaningful point for UX developers: the next big opportunity in user experience design goes beyond creating visual experiences alone. Add auditory, or aural, interactions to your job scope.

Voice-activated technology began with IBM’s Audrey in the 1950s, then matured enough for use by the U.S. Department of Defense in the 1970s, and grew its vocabulary to thousands of words in the 1980s. Today’s mass-market access to Siri, Alexa, Cortana, and other voice assistants raises an opportunity and a challenge for UX developers.

“A voice assistant that merely replaces the push of a few buttons is passé,” says Dave Bilbrough, manager of Enterprise Architecture and Innovation at Adobe. “A growing number of people are asking for help with their workflow.”

An Adobe study found that 82 percent of U.S. employees say technology improves workplace communication and collaboration, but some are wishing for more.

These and other findings from “The Future of Work: More Than a Machine” study demonstrate growing demand for more conversational interactions with machines, beyond search.

“As technology evolves, we get better at understanding what people say and the intent behind their words. We get better at turning knowledge into commands. And we’re learning that people like to talk to machines, but they don’t necessarily like to listen to machines,” says Dave.

Among the most common problems with today’s AI-powered voice assistants are recognizing different accents and parsing unfamiliar or polysyllabic words. One popular video demonstrates this well: its producers asked people with a variety of accents (American, Scottish, Irish, German, Australian, British, Japanese, and Italian) to pose questions or commands to the voice assistants, each involving complex or unfamiliar words like “Worcestershire” or “Benedict Cumberbatch.” The voice assistants succeeded some of the time, but were stumped just as often.

Voice-operated assistants promise to revolutionize a variety of tasks that we face in daily life, but without continual improvement in the user experience, voice technology won’t live up to its potential.

Dave points out that if a new device is able to make life recognizably easier or simpler, it proves its value. And as UX developers learn more about how people try to interact with voice interfaces, they’re able to add more and more complex tasks to a device’s repertoire.

Improving the UX of voice assistants

Gathering and cataloging data. Developers have spent the last five years gathering data on voice assistant usage and learning how users communicate and interact with the technology. Gathering and analyzing the data requires a significant amount of time and energy, but AI is capable of rapidly collecting and interpreting the information and identifying trends.

“One of the key data points we can help brands collect is the frequency and the nature of errors from the interaction with an assistant, like asking a question that the assistant doesn’t understand,” says Colin Morris, director of product management for Adobe Analytics.

“Designers need to have an understanding of whether they’re getting people through a certain ‘flow’ with as much efficiency as possible, and then design, with context, all the different scenarios that could be queried on that next step.”

Retaining memory in search functions. Beyond understanding intent and context, voice assistants must have a degree of memory retention to be successful. Usually, when a person refines a search, the engine doesn’t take previous questions into account, treating each request as its own entity. Machine learning algorithms, however, can remember information and apply it to later searches.

“Try doing a search for something while you’re driving and you don’t have your hands available to type — it’s maddening,” says Michael Scherotter, principal technical evangelist at Microsoft.

Instead, voice technology should be able to understand context to make the interaction a better experience. Michael shares an example:

Me: “Find me a flight to Utah for my meeting on Wednesday.”

Voice assistant: “I found a flight on Tuesday night since you prefer evening flights. Would you like me to book the 8 p.m. flight to Salt Lake City?”

Me: “Yes.”

Voice assistant: “Should I book a nearby hotel that you usually stay at as well?”

“The most helpful assistants use AI to review your previous flight history, find your airport and time preferences, then cross-reference that information with available flights,” says Michael.
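The cross-referencing Michael describes can be boiled down to a small preference model. The Python sketch below is a deliberately crude illustration (the flight data and the averaging heuristic are invented for the example): learn the user’s typical departure hour from past bookings, then rank available flights by how close they come to it.

```python
from datetime import datetime

def learn_preferred_hour(history: list[str]) -> int:
    """Average the departure hour of past bookings -- a crude preference model."""
    hours = [datetime.fromisoformat(t).hour for t in history]
    return round(sum(hours) / len(hours))

def rank_flights(flights: list[tuple[str, str]], preferred_hour: int):
    """Sort flights so the one closest to the learned preferred hour comes first."""
    return sorted(flights,
                  key=lambda f: abs(datetime.fromisoformat(f[0]).hour - preferred_hour))

# Hypothetical data: past bookings were evening flights...
history = ["2018-03-02 19:30", "2018-04-10 21:00"]
# ...so of these two options, the 8 p.m. departure should rank first.
flights = [("2018-05-16 08:00", "SLC"),
           ("2018-05-15 20:00", "SLC")]
best = rank_flights(flights, learn_preferred_hour(history))[0]
```

A real assistant would weigh far more signals (airline, airport, price, calendar conflicts), but the shape is the same: learned preferences turn a generic search into a personal recommendation.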

Communicating like a human. In addition to understanding the purpose of the technology and the context necessary to complete tasks, voice assistants must be able to communicate using the correct tone.

“People want to feel like they are talking to another person and receive information with speed and efficiency,” says Dave. AI algorithms step in to help voice assistants “learn” how to act and react like a human would.

Michael notes that each of the most prominent voice assistants on the market today has a distinct personality. They can make jokes, use sarcasm, and express apologies. Imbuing digital assistants with human traits makes users more comfortable with the machine and helps the technology fade into the background.

Using voice technology at work

Voice technology, even in its imperfect present state, takes care of mundane everyday tasks that would otherwise take up more time than necessary. And, its use isn’t limited to consumer applications. Using voice assistants like Siri, Alexa, or Cortana in the workplace could improve productivity by allowing employees to get quick answers to pressing questions.

Introducing voice technology to an internal business structure raises some questions, however. A company-vetted, voice-based assistant is yet one more technology for the IT department to manage, so the productivity gains need to offset the costs of implementation and maintenance. In addition, many companies may want the voice to be consistent with their own brand, which raises questions about customization and even opens new possibilities for training the assistant on company-specific lingo.

As developers, designers, engineers, and industry leaders incorporate AI to help develop smarter solutions, we’ll get to witness the full potential of voice technology in a variety of experiences. Menial tasks will be handled by an efficient and ever-smarter machine, freeing up time for more rewarding work — whether in our personal lives, at work, or interacting with brands. The more developers understand about the way people talk and listen to voice devices, the better the technology becomes.

To learn more about the future of AI and the benefits it offers, read more articles in our Humans and Machines collection.

Topics: Design, Digital Transformation