Ethical concerns for voice assistants
Oct 19, 2021
7 MIN READ

The Role of Ethics in Voice Assistant Design

As voice assistant usage expands to include greater audience sizes, varying age groups, and users in diverse regions, the ethics of voice technology have become important considerations for brands and voice AI providers alike. With the proliferation of voice interfaces, we are likely to see greater scrutiny of how and when data is collected, how it’s used, and how brands are working to ensure their voice assistants meet emerging ethical standards. If you haven’t started having the conversation in your organization, it might be wise to start now and avoid being surprised. 

What’s happening in the voice AI space isn’t unique. Typically, after a new technology gains popularity and widespread acceptance, the public begins to question its unintended consequences: whether the technology is being used in the public’s best interest, and whether elements such as data collection and tracking are having inadvertent, negative effects.

Responsible companies typically respond quickly once they’re made aware of these issues and take measures to stay ahead of any concerns. But before they can act, companies need to understand the possible concerns and have a way to address them. As with anything, transparency is often the key: the more a brand communicates, the more its users will trust it.

While these issues are continuously evolving, and each brand should evaluate what works best for its users, there are general ethical considerations that arise when designing a voice assistant. Whether a brand already has a voice assistant and is looking to improve it, or is in the first stages of planning one, addressing ethical concerns now will lead to a stronger voice AI design.

Here are 4 ethical concerns you should consider when designing a voice assistant:

  • Privacy and data collection
  • Suggestive language
  • Child users
  • Cultural biases

Voice assistant privacy and data collection

The practices and designs of early voice assistants created fear and wariness among consumers concerned about privacy, eavesdropping, and data collection. This apprehension eroded trust between brand and user, to the point where some consumers decided to avoid adopting voice-enabled devices altogether. In fact, a survey by Voicebot.ai revealed that 33% of U.S. adults cited fear of voice assistants recording them as their top reason for not purchasing one.

33% of U.S. adults state that the fear of voice assistants recording them is their top reason for not purchasing the device. 

Voicebot.ai

Some solutions to alleviate user concerns about privacy, listening in, and data collection include: 

  • Being transparent about how data is used and collected
  • Allowing users to opt in or out of data collection
  • Creating a brand that is trusted by users
  • Using an edge or embedded voice assistant

Being transparent about how data is used and collected creates trust with the user, and giving customers the option to opt in or out of data collection provides them with a sense of control. When deciding whether to build, buy, or partner to create a voice assistant, it’s essential to evaluate whether users trust the brand you want to partner with.
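As a minimal sketch of what opt-in data collection could look like in practice, the Python below gates transcript retention behind an explicit consent flag. The class and field names are illustrative assumptions, not part of any real voice SDK:

```python
# Hypothetical sketch: gating analytics behind an explicit opt-in flag.
# All names here are illustrative, not from any real voice platform.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    data_collection_opt_in: bool = False  # default to opted out

@dataclass
class VoiceSession:
    settings: PrivacySettings = field(default_factory=PrivacySettings)
    _log: list = field(default_factory=list)

    def record_utterance(self, transcript: str) -> None:
        # Only retain transcripts when the user has explicitly opted in;
        # otherwise, fulfill the request and discard the transcript.
        if self.settings.data_collection_opt_in:
            self._log.append(transcript)

session = VoiceSession()
session.record_utterance("turn on the lights")   # nothing stored
session.settings.data_collection_opt_in = True   # user opts in
session.record_utterance("play some jazz")       # retained for analytics
```

Defaulting the flag to opted-out keeps the user in control from the first interaction, rather than asking them to hunt for a setting after data has already been collected.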

Implementing a voice assistant owned by one of the large tech companies with a history of listening in and data collection could dissuade consumers from using your voice assistant. Custom, branded voice assistants give your company the opportunity to form your own relationships, extend your brand, and determine and communicate your data collection and usage policies.

When the use case doesn’t require information from the cloud, manufacturers can choose other connectivity options, such as edge or embedded voice assistants. With no cloud connection, or only a limited one for pushing updates and information, sensitive data stays local on the device, and data collection becomes far less of a concern. These options are especially useful for voice assistants that store sensitive information, such as passwords.
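As an illustration of the edge-first idea, the sketch below routes intents that can be fulfilled on-device away from the cloud entirely. The intent names and handler functions are hypothetical, not part of any particular platform:

```python
# Hypothetical sketch of an edge-first routing policy: intents that can be
# fulfilled on-device never leave the device; only the rest go to the cloud.
LOCAL_INTENTS = {"set_alarm", "toggle_lights", "unlock_with_passphrase"}

def handle_locally(intent: str, slots: dict) -> str:
    return f"Handled '{intent}' on-device; nothing left the device."

def handle_in_cloud(intent: str, slots: dict) -> str:
    return f"Sent '{intent}' to the cloud for fulfillment."

def route(intent: str, slots: dict) -> str:
    if intent in LOCAL_INTENTS:
        return handle_locally(intent, slots)  # sensitive data stays local
    return handle_in_cloud(intent, slots)

print(route("unlock_with_passphrase", {"passphrase": "secret"}))  # local
print(route("weather_forecast", {"city": "Berlin"}))              # cloud
```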

Privacy and data collection concerns aren’t limited to the voice AI industry. However, brands with voice assistants in their products, services, and mobile apps will want to stay ahead of these issues, plan for greater transparency, and communicate their policies to build trust with their users.

Suggestive language and voice assistants

Voice is one of the qualities that make us human. Because of that, many users anthropomorphize their voice AI, attributing human characteristics and behavior to their devices and treating them like a friend. When this relationship causes frustration, those same users may choose to use inappropriate language with their voice assistant. Designers and developers are often focused on creating the best experience with their product or service, but planning for these kinds of interactions is an important aspect of voice design. When designing a voice AI, brands should plan ahead for how their voice assistant should respond to suggestive or inappropriate language.

Another consideration is whether users express courtesy and respect toward their voice assistants. A poll of over 5,000 people by Android Authority revealed that 24% of users never say please or thank you to voice assistants, while 25% do all the time and 51% do sometimes. The issue becomes more pressing with child users, who may be practicing bad manners by barking commands at their voice assistants. One possible solution is to have the voice assistant respond with additional phrases when a please or thank you is used; individual companies may want to consider how their branded voice assistant responds, even adding a “You’re welcome” at the end of a response, as in the sketch after the quote below.

24% of users never say please or thank you to voice assistants, while 25% do all the time and 51% do sometimes. 

Android Authority
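As one way to picture the “please and thank you” idea, this minimal Python sketch appends a courteous acknowledgment when politeness markers appear in the transcript. The marker list and phrasing are illustrative assumptions:

```python
# Hypothetical sketch: appending a courteous acknowledgment when the user
# says "please" or "thank you". Markers and phrases are illustrative.
POLITE_MARKERS = ("please", "thank you", "thanks")

def respond(transcript: str, answer: str) -> str:
    text = transcript.lower()
    if any(marker in text for marker in POLITE_MARKERS):
        if "thank" in text:
            return f"{answer} You're welcome!"
        return f"{answer} Thanks for asking so nicely!"
    return answer

print(respond("What's the weather, please?", "It's sunny today."))
print(respond("Thank you!", "Glad I could help."))
```

A small positive reinforcement like this can reward good manners without lecturing users who skip the pleasantries.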

The issue of language can become much more extreme, though, with suggestive or inappropriate language, especially when directed at female-voiced assistants. With the majority of voice assistants defaulting to female voices, many users have asked them inappropriate questions, some outright rude or abusive.

The voice assistant’s answers can either reinforce gender bias by signaling that such questions are acceptable, or discourage those questions and promote equality. While there are no simple answers to this ethical question, having a diverse team at the development stage can help produce an inclusive voice assistant.

Children and voice AI

Voice assistants can have many benefits for children: making playtime or learning more engaging through voice-enabled toys and tablets, offering endless stories or songs on voice-enabled devices, or providing easier access to smart TVs without the need to navigate complex remotes. In fact, a survey by Common Sense and SurveyMonkey found that 60% of parents say their young children interact with a voice assistant. However, there are ethical concerns that companies will want to consider when designing voice assistants that children could be using.

60% of parents say their young children interact with a voice assistant.

Common Sense and SurveyMonkey

When a voice-enabled device is connected to the cloud, it has access to a wealth of information. Some of that information, such as explicit songs, movies, or answers to questions, may not be suitable for children. And because children’s speech can be imprecise, the voice assistant may accidentally surface something inappropriate. A possible solution is to enable parental controls on the voice assistant so that such material can’t be accessed by children.
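As a rough sketch of how such a parental-control gate might work, the Python below refuses content flagged as explicit when a kids mode is active. The catalog, the explicit flag, and the function names are all hypothetical:

```python
# Hypothetical sketch of a parental-control gate: when kids mode is on,
# content flagged as explicit is refused with a child-friendly response.
CATALOG = {
    "lullaby playlist": {"explicit": False},
    "uncensored comedy album": {"explicit": True},
}

def play(request: str, kids_mode: bool) -> str:
    item = CATALOG.get(request)
    if item is None:
        return "Sorry, I couldn't find that."
    if kids_mode and item["explicit"]:
        return "That content is blocked by parental controls."
    return f"Playing {request}."

print(play("uncensored comedy album", kids_mode=True))  # blocked
print(play("lullaby playlist", kids_mode=True))         # allowed
```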

There is also the concern of data collection and privacy for voice assistants specifically designed for children, or ones that children happen to use. The Children’s Online Privacy Protection Act (COPPA) requires the Federal Trade Commission to issue and enforce regulations concerning online privacy for those under 13 years old. Companies will need to ensure they follow these regulations. Communicating compliance with these regulations, and letting customers know you are taking steps to protect the privacy and safety of their children, will reassure parents while building brand loyalty.

Cultural biases in voice user interfaces

When designing a voice user interface, it’s especially important to consider cultural biases, including those related to race, gender, accent, age, and region. Creating an inclusive voice assistant opens the door to more users in more areas and reinforces the brand’s message of inclusivity and equality.

Regardless of whether your voice assistant is intended for a global audience or for specific regions, you’ll want to consider the level of respect and tone the assistant uses, as well as the cultural norms and accents of your intended audience. Some cultures are more comfortable with a casual tone, while others expect more formality from their voice assistant.
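One lightweight way to act on this is to key the assistant’s register off the user’s locale. The sketch below is purely illustrative; the locale-to-register mapping is an assumption for demonstration, and real choices should come from user research with each audience:

```python
# Hypothetical sketch: choosing a response register per locale from a
# simple table. The mapping is illustrative, not a cultural recommendation.
RESPONSE_TEMPLATES = {
    "casual": "Done! Anything else?",
    "formal": "Your request has been completed. May I assist you further?",
}

LOCALE_REGISTER = {
    "en-US": "casual",
    "ja-JP": "formal",
}

def confirmation(locale: str) -> str:
    register = LOCALE_REGISTER.get(locale, "formal")  # default to formal
    return RESPONSE_TEMPLATES[register]

print(confirmation("en-US"))  # casual
print(confirmation("ja-JP"))  # formal
```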

Understanding accents or imprecise speech from children or the elderly is also important to ensure that the voice assistant is as accurate and responsive as possible for your users. Excluding specific accents or age groups could be taken as a sign of discrimination or simply as a message that those people aren’t an important part of your target audience. Inclusivity not only reflects well on your brand, but it also gives your company the opportunity to expand its user base.

Having a development team that is diverse in race, gender, age, and location can help overcome cultural biases in a voice assistant. Brands should also recruit a diverse range of people for user testing.

As voice assistants become a part of our daily lives, responsible brands should consider ethical concerns when designing voice user interfaces, including privacy and data collection, suggestive language, child users, and cultural biases. Companies that want to stay ahead should plan for these before their users turn to voice-enabled competitors that are more transparent.

At SoundHound, we have all the tools and expertise needed to create custom voice assistants and a consistent brand voice. Explore SoundHound’s independent voice AI platform at SoundHound.com or speak with an expert or request a demo below.

Kristen is a content writer with a passion for storytelling and marketing. When she’s not writing, she’s hiking, reading, and spending time with her nieces and nephew.

Interested in Learning More?

Subscribe today to stay informed and get regular updates from SoundHound Inc.
