More than 80 percent of today’s customer service calls originate from a mobile phone. Some Nuance customers say that more than half of their contact center interactions – voice and data – now involve a mobile device. Those trends are two reasons why so many organizations have made mobile customer service a priority this year.

Meanwhile, consumers increasingly say they prefer self-service options when interacting with a company. But that preference doesn’t mean that just any mobile customer service offering will do. Instead, here are five factors to keep in mind when developing and executing a mobile customer service strategy. They’re all based on questions that merchants, government agencies and other consumer-facing organizations have asked since the debut of Nina.

1) Unify the design process. Multimodality allows smartphone app users to provide input via speech, text and touch. It’s also a new concept for many developers. That’s why it’s important to design in a new way as well: audio, dialog and visual components all must be designed to complement one another and work together in a tightly time-coordinated user experience.

A traditional IVR can handle disambiguation by asking the caller to say yes or no to each of a series of options. Those dialogs can be lengthy and hard to remember, so options are typically grouped, and users may ask for prompts to be repeated or ask to start over if they lose track. In contrast, a multimodal design can use the screen to reinforce the dialog, and even defer some choices to touch only. True multimodality – with fluid mode switching – conveys the necessary information in every modality, letting users attend to whichever modality is most convenient at the instant they need it.
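To make that concrete, here’s a rough sketch of the idea. The types below are hypothetical, invented purely for illustration; they are not part of the Nina SDK. The point is that a single disambiguation step can accept a spoken phrase, typed text or a tap and resolve them all to the same choice:

```swift
import Foundation

// Hypothetical model of one multimodal disambiguation step.
// These types are illustrative only; they are not Nina SDK classes.

enum InputMode { case speech, text, touch }

struct Choice {
    let id: String
    let spokenForms: [String]   // phrases the recognizer should accept
    let label: String           // the tappable option shown on screen
}

struct DisambiguationStep {
    let prompt: String          // spoken and displayed, so each modality reinforces the other
    let choices: [Choice]

    // Resolve the user's answer regardless of which modality delivered it.
    func resolve(_ input: String, via mode: InputMode) -> Choice? {
        switch mode {
        case .touch:
            return choices.first { $0.id == input }   // input is the tapped choice's id
        case .speech, .text:
            let normalized = input.lowercased()
            return choices.first { choice in
                choice.spokenForms.contains { normalized.contains($0) }
            }
        }
    }
}

// Usage: "Which account?" can be answered by saying "checking" or by tapping a button.
let step = DisambiguationStep(
    prompt: "Which account would you like to pay from?",
    choices: [
        Choice(id: "chk", spokenForms: ["checking"], label: "Checking •1234"),
        Choice(id: "sav", spokenForms: ["savings"],  label: "Savings •5678")
    ]
)
print(step.resolve("my checking account", via: .speech)?.label ?? "not understood")
print(step.resolve("sav", via: .touch)?.label ?? "not understood")
```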

2) Sync feedback. In a multimodal environment, feedback must be tightly synced. For example, the non-verbal audio (tones) and the animations that indicate a state change must be timed to coincide. Research shows that if they drift out of sync by even a few hundred milliseconds (roughly 300 ms), users may become confused.
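As a simple illustration of what “tightly synced” looks like in practice, assume a stock iOS app: the calls below are standard UIKit and AudioToolbox APIs, and the sound ID and view are placeholders rather than anything Nina-specific. Firing the tone and the state-change animation from the same call site keeps them starting together instead of drifting apart:

```swift
import UIKit
import AudioToolbox

// Illustrative only: trigger the non-verbal audio cue and the state-change
// animation from the same place so they begin within a frame of each other.
final class ListeningIndicator {
    private let orb = UIView()                        // placeholder for the app's listening-animation view
    private let listeningTone: SystemSoundID = 1113   // example system sound ID; a real app would ship its own tone

    func showListening() {
        AudioServicesPlaySystemSound(listeningTone)   // audio cue
        UIView.animate(withDuration: 0.25) {          // visual cue, started in the same run-loop pass
            self.orb.transform = CGAffineTransform(scaleX: 1.2, y: 1.2)
            self.orb.alpha = 1.0
        }
    }
}
```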

3) Use widgets. A widget structure might seem like an unnecessary duplication of functionality that’s already in the app, and that redundancy might appear to add development and maintenance work.

In reality, widgets improve the user experience because the user’s focus is entirely on the transaction. The information necessary to complete the transaction is presented in a data-driven format. Widgets provide a template to which the developer can match the requisite data model. Without that widget template, the developer would have to create dozens or even hundreds of screens to accommodate every possible response.

For example, paying a bill is typically a four-‘slot’ (data element) transaction: To, From, Date, and Amount. Any given financial institution may reduce this to three or even two slots, depending on whether its business logic pre-selects the funding account for a given biller. Nina’s widget templates make it easier and faster for developers to accommodate those kinds of requirements.
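Here’s a minimal sketch of the slot idea, using a hypothetical BillPaySlots type that is not Nina’s actual template API. The widget renders from a small data model, and business logic that pre-fills a slot simply leaves fewer fields for the user to supply:

```swift
import Foundation

// Hypothetical slot model for a bill-pay widget; not the actual Nina template.
struct BillPaySlots {
    var to: String?        // payee
    var from: String?      // funding account
    var date: Date?
    var amount: Decimal?

    // The widget only needs to prompt for whatever is still missing.
    var missingSlots: [String] {
        var missing: [String] = []
        if to == nil     { missing.append("To") }
        if from == nil   { missing.append("From") }
        if date == nil   { missing.append("Date") }
        if amount == nil { missing.append("Amount") }
        return missing
    }
}

// A bank that ties each biller to a default funding account pre-fills two slots,
// turning a four-slot transaction into a two-slot one.
let payment = BillPaySlots(to: "City Power & Light", from: "Checking •1234", date: nil, amount: nil)
print(payment.missingSlots)   // ["Date", "Amount"]
```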

4) Know your states. Nina features three types of states: standard (e.g., the app is listening, the app is speaking, the user is speaking), emphasis indications (e.g., an alert, a completion notification) and vanity animations, which carry no information but are useful for branding the app and can add personality to the interaction.

Animations also are a great way to alert users when the app is listening and thus when they can speak. Many speech-enabled apps lack such animations, forcing users to wonder whether the app is doing something or whether they can go ahead and speak a request.
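One way to picture the distinction is an enum like the one below. It’s illustrative only, not drawn from the Nina SDK, but it makes explicit which cues carry information the user needs to act on and which are purely decorative:

```swift
// Illustrative grouping of the three state types; not taken from the Nina SDK.
enum AgentState {
    // Standard states: the core turn-taking cues.
    case listening        // the app is listening, so the user may speak now
    case speaking         // the app is speaking
    case userSpeaking     // the user is speaking

    // Emphasis indications: short, attention-getting signals.
    case alert
    case completed

    // Vanity animations: carry no information, but add brand personality.
    case idleBlink

    // Only the listening state should invite the user to speak.
    var invitesUserSpeech: Bool {
        if case .listening = self { return true }
        return false
    }
}
```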

5) Give your app a personality. Nina’s appearance is designed to give an app humanlike characteristics without the need for an avatar. For example, the animations can deliver “emotional” content, such as occasionally blinking so users don’t have to wonder whether the app has gone to sleep. The key to good animations is to make them blend into the background, so users can pick up their information with peripheral vision instead of having to look directly at them. This keeps the native app in the user’s focus. Speech interaction is new and exciting – and it augments the customer experience rather than replacing it.
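For example, a vanity ‘blink’ can be as simple as an occasional, low-contrast fade on the assistant’s indicator view. The sketch below assumes a stock UIKit app and an indicator view of our own invention; it is not a Nina SDK call:

```swift
import UIKit

// Illustrative vanity animation: an occasional, subtle blink so the assistant
// never looks frozen, yet stays in the user's peripheral vision.
final class IdleBlink {
    private let eye = UIView()    // placeholder for the assistant's indicator view
    private var timer: Timer?

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 4.0, repeats: true) { [weak self] _ in
            guard let eye = self?.eye else { return }
            UIView.animate(withDuration: 0.12, animations: { eye.alpha = 0.2 }) { _ in
                UIView.animate(withDuration: 0.12) { eye.alpha = 1.0 }
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```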

This blog is the first in a series that will answer questions that enterprises and organizations have asked about mobile customer service and how to design for a user experience your customers will appreciate. In the meantime, for more information about Nina, click here.


About Elizabeth Dykstra-Erickson

This was a contributed post by Elizabeth Dykstra-Erickson. To see more content like this, visit the Customer experience section of our blog.