I don’t like talking. I’m the type of person who’s perfectly happy to sit silently in a group. So I felt a bit uncomfortable when “voice” was being touted as THE next big mode in interactive design. Speaking to an app sounded like a chore to me, but I jumped into the verbal side of conversational user experience (UX) design with Susan Miller’s Astrology Zone for Amazon Alexa. Now? I’m excited about what I’ve learned and what can be done with this new mode of interaction.
Conversational UX = Having a Dialogue
Part of a designer’s job is to tell a story to the user. In designing a conversational UX, you’re telling your story through a direct dialogue, whether it’s via voice with a virtual assistant or via text with a chatbot. You do that by creating a script in a developer-friendly syntax that defines the user experience. A few things to remember:
- Conversations aren’t straight lines. Gone are the days of touch-tone phone tree interactions—there is no single path to follow. That means you have to think about the different ways a conversation could go and anticipate as many variables in your design as you can.
- Responses will vary. No two users talk the same way, so your design should include some flexibility in how to respond. How this works depends heavily on the technology you use and how much natural language it can recognize. In the case of Astrology Zone, we were building specifically for Amazon Alexa, which has a particular structure and recognition pattern. Google Assistant draws upon more than a decade of search data to enable a deep natural language recognition system, so it will understand variations in phrasing more easily. (Siri performs similarly.)
- Users want to feel they are talking to a person. Think about it: Wouldn’t it be awkward if somebody you’re talking to suddenly sounded automated? Maintain a normal conversational tone across each interaction. Google Assistant can even add in small talk, which keeps the user engaged. With Astrology Zone, we added the sign-off messaging Susan Miller uses to address her followers.
- “Voice” doesn’t always mean talking out loud. Remember when I said that I don’t like talking? It turns out many people feel that way. Google, Amazon and Apple have responded by adding more visual components to their virtual assistants. For example, Google Assistant recently added text input alongside voice on the phone, which is very helpful when I can’t speak over my babbling baby. Whether a user is literally speaking or conversing via text, though, the same design approach applies. On mobile, virtual assistants provide an opportunity to enhance the experience with visual elements like images, search suggestions, links and app content.
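A minimal Python sketch of the first three principles above, with hypothetical intent and reply names (platforms like Alexa and Dialogflow do this matching for you, with far more sophistication): several user phrasings funnel into one intent, and each intent carries more than one phrasing of its reply so repeat interactions don’t sound canned.

```python
import random

# Hypothetical mini intent table (illustrative only): several sample
# utterances map to one intent, and an intent can answer with more than
# one reply so it doesn't sound robotic on repeat visits.
INTENTS = {
    "GetHoroscope": {
        "samples": [
            "what is my horoscope",
            "read my horoscope",
            "tell me about my sign",
        ],
        "replies": [
            "Here is what the stars say today...",
            "Let's see what today holds for you...",
        ],
    },
    "Goodbye": {
        "samples": ["goodbye", "bye", "that's all"],
        "replies": ["Goodbye, and keep looking up!"],
    },
}

def match_intent(utterance):
    """Return the intent name whose samples contain the normalized utterance."""
    text = utterance.lower().strip().rstrip("?!.")
    for name, intent in INTENTS.items():
        if text in intent["samples"]:
            return name
    return None  # unrecognized; a real skill would re-prompt here

def respond(intent_name):
    """Pick one of several phrasings so repeated replies vary."""
    return random.choice(INTENTS[intent_name]["replies"])
```

Real recognizers match far looser variations than this exact-string lookup, but the shape is the same: many utterances in, one intent out, varied replies back.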
Conversational UX Guidelines Aren’t Fully Baked Yet
Read the guidelines, but bear in mind they may be incomplete. We’re in the early days of conversational design and documentation is still evolving. While my team was working on Astrology Zone for Alexa, we read everything we could get our hands on and spent plenty of time interacting with Alexa. Still, we quickly hit roadblocks with development. Why? I’d written our phrases according to what the guidelines said we could do—but those guidelines didn’t say what we couldn’t do.
I’m sure the conversational UX guidelines will become more thorough over time. In the meantime, if you have time to test and iterate, you can experiment with how phrases should be set up. If not, stick to the exact wording provided in the guidelines to be safe. Be careful with verbs and connecting words. There are only a limited number available right now (fewer than we thought). And watch out for possessives—they’re a pain. We never could get Alexa to understand “yesterday’s” correctly.
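One workaround for troublesome forms like “yesterday’s” is to normalize them in your own handler before matching slot values. This is a sketch under an assumption: you can’t patch the platform’s recognizer itself, but you can canonicalize whatever text or slot value reaches your code.

```python
# Hypothetical normalization table for tricky possessive variants.
# (You can't change what the recognizer hears, but you can canonicalize
# the text or slot value it hands to your own handler before matching.)
TIMEFRAME_SYNONYMS = {
    "yesterday's": "yesterday",
    "yesterdays": "yesterday",
    "today's": "today",
    "tomorrow's": "tomorrow",
}

def normalize_timeframe(phrase):
    """Collapse possessive variants into canonical slot values."""
    words = phrase.lower().split()
    return " ".join(TIMEFRAME_SYNONYMS.get(word, word) for word in words)
```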
Further Exploration Reveals New Insights
As I got more interested in conversational design, I kicked the tires of multiple virtual assistant options. I had the most fun with Google Assistant. It’s by far the most robust of our new robotic-voiced friends. Its app development tool, Actions on Google, is amazing and enjoyable to use (I wish every tool worked the same way).
After being introduced to Actions at this year’s Google I/O, I dove in and was able to create a demo in no time. Actions allowed me to focus on writing interactions and possible responses rather than formatting and more technical aspects of design because the system trains the action for variants with every input. On the other hand, writing for Alexa feels like diagramming sentences.
This exploration led me to see how fast I could make a basic app to tell facts about my baby, and even add in some personality and expressiveness. I also tinkered with making a demo Guardians of the Galaxy experience, and learned that in the long run, you don’t want invocations and responses to be exactly the same. (“I am Groot!” followed by an “I am Groot” response… and another “I am Groot!”… Trust me when I say this doesn’t work out well.)
Conversational UX Design Tips
Here are a few tips from my time on the Astrology Zone project and my other explorations:
- Provide natural guidance. Create an introduction that tells the user what the app can do and provides some simple suggestions for interactions. Users discover and explore conversational apps differently—they can’t just tap around a screen to find features—so you have to help them out.
- Keep it short and sweet, and let users be brief as well. For example, to initiate a conversational experience, it’s a good idea to allow users to say only the “invocation” (the app name) and some parameters. A user can initiate Astrology Zone on Alexa by simply saying “Astrology Zone Pisces Today.”
- Let the user mess up and guide them back. If an app only states that something went wrong, the user can’t tell whether the problem was something they did or a fault in the app itself. That’s frustrating. Instead, provide an error message with the reason for the error (wherever possible), along with options the user can select to get back on track.
- Test with multiple people who have different accents and speech patterns. You want to make sure users can comfortably converse with your UI.
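Putting the middle two tips together, here’s a sketch of a skill-side handler that accepts a terse invocation like “Astrology Zone Pisces Today” and, when a slot is missing, returns an error message that names the problem and suggests a way back on track. All names here are hypothetical, not Alexa’s actual API:

```python
# Hypothetical invocation parser (not Alexa's actual API): pull sign and
# timeframe "slots" out of a terse invocation, and on failure return a
# reason plus a suggested phrasing so the user can recover.
VALID_SIGNS = {
    "aries", "taurus", "gemini", "cancer", "leo", "virgo", "libra",
    "scorpio", "sagittarius", "capricorn", "aquarius", "pisces",
}
VALID_TIMEFRAMES = {"today", "tomorrow"}

def parse_invocation(utterance):
    """Return ((sign, timeframe), None) on success, (None, help_text) on failure."""
    words = utterance.lower().replace("astrology zone", "").split()
    sign = next((w for w in words if w in VALID_SIGNS), None)
    timeframe = next((w for w in words if w in VALID_TIMEFRAMES), None)
    if sign is None:
        # Name the problem and offer a way forward, not just "something went wrong."
        return None, ("I didn't catch a zodiac sign. "
                      "Try something like 'Astrology Zone Pisces today.'")
    return (sign, timeframe or "today"), None
```

Letting the timeframe default to “today” keeps the happy path short, while the failure branch tells the user exactly what was missing and how to phrase a retry.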
Conversational Design Could Improve Accessibility
Users with blindness or visual impairments rely on screen readers to understand and interact with digital devices. On mobile, these readers are built into the operating systems—VoiceOver on iOS, TalkBack on Android devices. Conversational design can help enhance these current features by scripting the user’s experience with the app.
Last year, I worked on a proof of concept for a hospital mobile solution that would make indoor mobile wayfinding accessible for users with blindness or limited vision. I had to think through and write the experience—in this case, the user’s dialogue with the app was in the form of gestures. Imagine if this was pushed further with the use of a virtual assistant. Users would be able to navigate hospital facilities through natural conversation, without the cognitive load of dealing with a standard app UX.
We could also harness these discoveries and disciplines to design for any situation where visual intake of information might not be possible or advisable. For example, interacting with a visual UI on your phone is really not a good idea when driving—in fact, it’s illegal in many places. In the near future, apps may switch to voice-only conversational UI when driving is detected.
After all of this exploration and experience, I’ve come to believe that voice and conversational design will soon be an essential part of UX. I look forward to building even more in the future. In the meantime, check out Susan Miller’s Astrology Zone for Amazon Alexa.