Alexa Just Became Less Private: What The March 28 Amazon Changes Mean

Amazon's decision to remove a key privacy control from its popular smart assistant Alexa has sparked widespread concern among security and privacy experts. As of March 28, the "Do Not Send Voice Recordings" setting, which let users prevent their voice recordings from being sent to Amazon's cloud for analysis, is no longer available.

According to an email sent to some users in mid-March, the setting was disabled as part of Amazon's expansion of its AI capabilities, specifically Alexa Plus. As a result, users' voice recordings are now automatically sent to the cloud, where they are analyzed and transcribed into text, raising serious concerns about user privacy.

Amazon has maintained that the changes are designed to protect customer data and support the company's focus on providing a secure and trusted experience for its users. Experts, however, have expressed skepticism about this claim, citing the removal of local processing options as a step backward for user trust.

The Impact on User Trust

"If you don't trust Amazon, then the removal of the local processing option will likely make you trust it even less," said Paul Bischoff, security and privacy advocate at Comparitech. "Amazon says that voice recordings uploaded to the cloud are encrypted and deleted after they've been transcribed into text. However, I would have liked to see Amazon expand local transcription to more devices, not remove it."

Erich Kron, security awareness advocate at KnowBe4, echoed Bischoff's concerns, stating that "many people already have concerns about the intrusive nature of marketing and advertising" and that this change is not helping to build trust with customers. He warned that for those who are truly privacy-conscious, this may be the last straw when it comes to keeping these devices in the home.

The Experts Weigh In

"With AI hoovering up more data than ever and the goldfish effect in full swing, it feels like we're rewinding the clock," said Ray Walsh, digital privacy expert at Comparitech. "Amazon quietly removing local processing and storage from Alexa without consent, without alternatives, and without compensation for users who object is a textbook case of a safeguard being stripped away."

"This change undermines the trust that users have in Amazon to protect their personal data," said Bischoff. "It's a step backward in terms of user privacy and security, and it's a clear example of how companies like Amazon can be overly aggressive in their pursuit of new technologies without properly considering the consequences for their customers."

The Future of Alexa and User Privacy

As Amazon continues to expand its AI capabilities and push forward with Alexa Plus, experts are calling on the company to prioritize user privacy and security. With the removal of local processing options, users may be forced to reevaluate their trust in Amazon and consider alternative smart devices that better protect their personal data.

"For those who value their online privacy, this change is a clear warning sign," said Kron. "It's time for companies like Amazon to take user consent seriously and provide alternatives that respect users' boundaries. Anything less is just not good enough."