Language Tools for Voice
There are thousands of voice apps available today (over 250,000 Alexa skills, and almost 1,800 Google Assistant apps, the last time I checked). But figures from the Alexa Skills Store show that only 3% of users are still active by the second week. The novelty is wearing off, and people are beginning to demand more useful, more effective voice apps.
As creators and makers, we need to move past doing voice just because it’s the next cool thing, and rather think about how to do voice. How is our voice interface unique? What is its purpose? How does it use language to effectively communicate? The ins and outs of language can often get overlooked (talking is easy, right? We do it all the time), but a few simple things, like structure and variation, can help to make sure your voice interface is intuitive and easy to use, encouraging people to actually come back.
Let’s break it down into 3 things:
- Sounding natural
- Handling commands
- Gathering information
1. Sounding Natural
Sounding natural means users have less to learn.
If you’re creating a voice app, your goal is to help users achieve their goals. The way you use language will affect how quickly and how comfortably users move through the experience. Using natural, everyday language makes your voice app more intuitive and easy to use from the get-go. So, here are some best practices:
Get rid of jargon and keep things simple. This doesn’t necessarily mean short, but avoid overly complicated words. It’s like a website going from “authentication error” (what?) to “wrong password”. Tell users exactly what they need to do, or in this case say, to move forward. Voice apps have no visual interface with things like ‘back’ or ‘next’ buttons, so it’s up to language to move things along.
Questions are one of the best ways to help users move forward because they prompt people with what to say. Here’s an example, which shows why rephrasing a line into a question works:
- Voice app says, “I can give you some more options”. It’s not very clear when it’s the user’s turn to speak.
- Voice app says, “To hear more options, you can say ‘more options’”. Pretty robotic, a bit à la touch-tone, no?
- Voice app says, “Would you like to hear more options?”. Naturally prompts the user to say “yes” or “no”. 🎉
Give and take variations
Think about all the ways a person could say something, and try to account for as many of them as possible. This not only makes things more natural, but it will also make your voice app a bit more inclusive. Imagine, for example, a person for whom English is a second language: they may have different ways of phrasing things.
Give variation in your responses too. Keep things from getting boring!
The language and variations you use will get across your brand’s personality. It’s a great opportunity to create something recognisable as your own (though keep functionality front of mind).
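Both sides of this give-and-take can be sketched in a few lines of Python. This is a minimal illustration, assuming a hypothetical “more options” intent; the phrasings, response lines, and function names are all invented for the example, not any platform’s real API:

```python
import random

# Hypothetical phrasings a user might use to ask for more options.
MORE_OPTIONS_UTTERANCES = {
    "more options",
    "what else have you got",
    "show me more",
    "anything else",
}

# A few variants of the same response, so repeat users don't hear
# the identical line every time.
MORE_OPTIONS_RESPONSES = [
    "Sure, here are a few more.",
    "Okay, let's look at some others.",
    "No problem. How about these?",
]

def matches_more_options(utterance: str) -> bool:
    """Very rough matching: normalise the utterance, then compare it
    against the known phrasings."""
    return utterance.strip().lower().rstrip("?.!") in MORE_OPTIONS_UTTERANCES

def more_options_response() -> str:
    """Pick a response variant at random to keep things from getting boring."""
    return random.choice(MORE_OPTIONS_RESPONSES)
```

A real voice platform would do fuzzier intent matching than this exact-string check, but the principle is the same: many inputs map to one intent, and one intent maps to varied outputs.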
2. Handling Commands
Help users quickly and easily complete tasks.
That’s the whole point of a command-based model. Your voice app can’t be harder to use than pulling out a mobile phone and tapping or swiping through the same task. So, here are a few tips on handling commands well:
Explain what your voice app can and can’t do. A quick introduction when your app first starts up, letting people know they can book flights but not order pizza will save them from frustration down the line.
In face-to-face conversations with humans, body language like nods and eyebrow raises, or “mmhmms” and “no ways” show someone is actually listening to you.
A voice user interface has only speech, and maybe a small light on the hardware, to communicate with, so verbal confirmations are the way to let the user know everything is on track. There are two types of confirmation, implicit and explicit. Consider this scenario: booking a flight with a voice assistant.
With an implicit confirmation, the assistant repeats what it heard to check that it’s correct. If it’s not, the user can say, “No, stop that.” Implicit confirmation works here because if the assistant does happen to list the wrong option, it’s not too late to go back. And if it is correct, the user isn’t frustrated by having to verbally confirm something at every step.
But when it’s time for the user to finalise the booking and hand over the money, the assistant switches to an explicit confirmation and asks the user to verbally agree before going ahead. After all, the consequences of getting it wrong are much bigger.
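The decision between the two styles can be sketched as a single function. This is a minimal sketch; the function name, the `high_stakes` flag, and the wording are all assumptions made for illustration, not a real assistant API:

```python
def confirmation_prompt(heard: str, high_stakes: bool) -> str:
    """Choose a confirmation style based on the cost of getting it wrong.

    Low-stakes steps get an implicit confirmation: repeat back what was
    heard and carry on, so the user can interrupt if it's wrong.
    High-stakes steps, like taking payment, get an explicit yes/no question.
    """
    if high_stakes:
        return f"Just to confirm: {heard}. Shall I go ahead?"
    return f"Okay, {heard}."

# Mid-flow, implicit: keeps the conversation moving.
mid_flow = confirmation_prompt("a flight to Berlin on Friday", high_stakes=False)

# Finalising the booking, explicit: the user must verbally agree.
payment = confirmation_prompt("book this flight and charge your card", high_stakes=True)
```

The design choice to encode is simply: the bigger the consequence of a misheard command, the more explicit the confirmation should be.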
Consider primary functions
If the user needs help or needs to stop at any time, let them. Primary functions, or known commands, will become more and more useful as more people adopt voice apps and assistants. “Help” is a good one. As are “next” and “stop”.
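A minimal way to honour these commands is to check for them before any task-specific handling, so “help” and “stop” work everywhere in the app. The handler names and responses below are made up for the sketch (real platforms like Alexa ship built-in intents for this):

```python
# Hypothetical global handlers that should work at any point in the app.
def handle_help() -> str:
    return "You can say 'book a flight', 'next', or 'stop'."

def handle_stop() -> str:
    return "Okay, goodbye!"

GLOBAL_COMMANDS = {
    "help": handle_help,
    "stop": handle_stop,
    "cancel": handle_stop,  # a common synonym for stop
}

def route(utterance: str, task_handler):
    """Check global commands first, then fall through to the current task."""
    command = utterance.strip().lower()
    if command in GLOBAL_COMMANDS:
        return GLOBAL_COMMANDS[command]()
    return task_handler(utterance)
```

Routing global commands before the task handler is what makes them reliable: the user never has to wonder whether “stop” will work on this particular screen of the conversation.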
3. Gathering Information
Gather information from the user to help them with their task.
This will involve more of a conversational model, in which the user and the VUI take turns speaking. Here are some important things to incorporate:
Remember the context
Context is a huge component of language: it’s the way everything works together to convey a particular meaning. Context moves a conversation forward, so if a voice app doesn’t remember what was said before, it becomes frustrating to use. Work with developers to build context into your app.
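One simple shape this can take in code is a per-conversation session that remembers the details (“slots”) filled in earlier turns, so the app only asks for what’s still missing. This is a sketch under invented names; real platforms such as Alexa or Dialogflow provide their own session and context objects:

```python
class Session:
    """Minimal per-conversation memory: slots filled in earlier turns
    stay available in later ones."""

    def __init__(self):
        self.slots = {}

    def remember(self, name: str, value: str) -> None:
        self.slots[name] = value

    def missing(self, required: list) -> list:
        """Which required slots does the app still need to ask about?"""
        return [slot for slot in required if slot not in self.slots]

session = Session()
session.remember("destination", "Paris")  # turn 1: "I want to fly to Paris"

# Turn 2: the app still knows the destination, so it only asks for the date.
still_needed = session.missing(["destination", "date"])
```

Without this memory, the app would re-ask for the destination on every turn, which is exactly the frustration described above.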
Strategise your structure
This is where we talk about questions again. Questions not only prompt the user with what to say, but also when to say it. So, put questions at the end of your prompts as a cue for users to speak.
Add conversational markers
Conversational markers are words or phrases that help us connect and organise what we say. They might be used to start or end a conversation, keep it moving, change the subject, or even express an attitude.
Conversational markers will help the experience feel more natural.
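As a tiny sketch, a response builder might prefix each reply with a marker matching the conversational move being made. The moves and phrases below are invented examples, not a standard taxonomy:

```python
# Invented example markers, grouped by the conversational move they signal.
MARKERS = {
    "acknowledge": "Got it.",                # show the app heard the user
    "move_on": "Alright, next up:",          # keep the conversation moving
    "wrap_up": "Great, that's everything.",  # signal the end
}

def with_marker(move: str, body: str) -> str:
    """Prefix a response with the marker for the given conversational move."""
    return f"{MARKERS[move]} {body}"

# e.g. with_marker("acknowledge", "Searching for flights to Paris.")
```

In practice you would combine this with the response variation discussed earlier, so the same move doesn’t always open with the same marker.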
Natural, helpful, and easy to use
These considerations are just a few of the things you can do to make your voice user interface more intuitive and easy to get along with. As more and more people adopt voice, language will be an essential tool for telling the good apps from the bad, and for spotting the ones that are here to stay.