As voice assistants rise in popularity, so should attention to data privacy. The market is now saturated with voice assistants, from Siri and Google Assistant to Facebook Portal and Amazon Alexa. In January, Amazon announced that more than 100 million Alexa-enabled devices have been sold globally, and Google boasted that more than 1 billion devices, including Android phones, have the Google Assistant built in.
Whether it’s your alarm clock, Bluetooth speaker, or even microwave, voice assistants can be found in practically every room of a home or office. But more and more people are starting to think twice before asking Alexa for the daily forecast. According to a recent PwC survey, 38 percent of participants chose not to purchase a smart device because they “don’t want something listening in on [their lives] all the time.” Additionally, 28 percent of respondents are “concerned about privacy issues with [their] data/security.”
Voice assistants are not only considered a household item in today’s tech-centric society; they are also pervasive in offices and companies. By June 2019, 85 percent of businesses will use voice technology, such as Microsoft’s Cortana voice-activated assistant, to communicate with customers, and 44 percent of retailers say they will implement IoT devices like Amazon Alexa and Google Home in stores by December 2019. However, the stakes created by minimal corporate transparency are much higher for enterprise users, with the average data breach estimated to cost businesses $3.86 million.
Are voice assistants feeding their algorithms or invading your privacy?
Consumers are concerned that voice assistants are eavesdropping on their conversations, and sometimes rightfully so. For some this seems like a far-fetched worry, but it’s closer to reality than one might think. Like many consumer electronics and technologies, voice assistants ship with their listening features enabled by default. By design, the device is always monitoring for its wake word and only starts recording once you issue a command. Those recordings are stored in the device’s companion app along with other information from your Google or Amazon account.
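To make that distinction concrete, here is a minimal, purely illustrative Python sketch of the always-listening pattern. The wake word, helper names, and simulated audio below are hypothetical and are not taken from any vendor’s implementation; the point is simply that the device idles until it matches its trigger phrase, and only the snippet that follows is captured.

```python
# Conceptual sketch only (not any vendor's actual code): a wake-word loop
# that stays passive until it hears its trigger, then records one command.
# The audio source is simulated with strings so the example runs anywhere.

WAKE_WORD = "alexa"  # hypothetical trigger phrase


def listen(audio_stream):
    """Yield only the snippets spoken after the wake word."""
    recording = False
    for snippet in audio_stream:
        if not recording:
            # While idle, the device only matches against the wake word;
            # nothing here is stored or sent upstream.
            if WAKE_WORD in snippet.lower():
                recording = True
        else:
            # After the trigger, the command is captured; in a real device
            # this is what gets uploaded and kept in the companion app.
            yield snippet
            recording = False


if __name__ == "__main__":
    simulated_audio = [
        "so anyway, about dinner tonight",
        "alexa",
        "what's the weather tomorrow?",
        "back to our conversation",
    ]
    for command in listen(simulated_audio):
        print("recorded command:", command)
```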
Storing those recordings is critical to the machine learning behind voice assistants: if your voice assistant mishears you and you give it feedback, it will be more likely to understand you the next time around. However, the technology can make mistakes, and for now consumers are wary that this machine learning is more of a privacy concern than a tool for improvement.
Fear of these devices is often amplified by sensational stories. For example, Amazon’s Alexa once recorded a Portland woman’s private conversation and sent the recording to her husband’s employee in Seattle. In December, Amazon sent 1,700 Alexa voice recordings to the wrong user following a data request.
The tech giant explained that these incidents were miscommunications with the device, not an indication of privacy invasion, but consumers remain concerned about voice assistants’ trustworthiness. Oftentimes in security, evidence of the few cracks in the system that demonstrate privacy risks is emphasized more heavily than the laundry list of safeguards companies implement to keep data secure. As such, it is important for companies to lean into data governance and be transparent with customers about the steps in place to keep their data safe.
Protect consumer data
Due to privacy concerns and monetary impact, legislation on data protection and cybersecurity is burgeoning across the world, requiring organizations to establish policies that safeguard personal information, manage the risk related to their data, and address their legal responsibilities. In California alone, there are more than 25 state privacy and data security laws.
In 2016, the European Union adopted the General Data Protection Regulation (GDPR), which took effect in May 2018 and requires companies to follow data privacy guidelines and protect individuals’ personal data. At the rate with which data privacy is becoming a widespread concern, it would not be surprising if the United States implemented similar regulations in the not-so-distant future. With these increased regulations, it will be important for companies to understand the laws and how to comply with new directives.
Currently, the number of companies that are GDPR compliant is still very low. In fact, a large number of companies aren’t putting the resources, commitment, and priority behind a data privacy program. However, the push for privacy regulation is not going away; it’s only going to intensify. If companies can become compliant with complex regulations like the EU’s GDPR, they have a strong model to support new statewide or future federal legislation.
Fortunately, companies can start self-regulating before data protection becomes federally mandated. Self-regulation can take the form of putting in place standards and rules around data-related issues such as accessibility and quality. This builds the foundation of a data governance program before one is required, giving companies a competitive advantage when the time to comply arrives, because the processes that win and maintain consumer trust are already in place.
Voice assistants are ubiquitous in today’s marketplace, and despite some mistrust, the privacy risk they present remains largely unregulated. As the industry continues to grow, the push for legislation and threat prevention will only increase. I recommend bringing data governance programs to the forefront of your business conversations, as they support both data protection compliance and the broader value of your organization’s data. And now that you know she’s listening, consider asking, “Alexa, why should I care about data governance?”