How Alexa is listening

It’s a running joke among people who use home artificial intelligence (AI) assistants that the device is “always listening.”

Picture someone telling their voice-activated speaker, “Alexa, play ‘Dance’ by the Rolling Stones.” Later, an ad for Rolling Stones poster art shows up in their Instagram feed.

Coincidence? Most likely it’s just targeted advertising at work, but it can leave people uneasy at the thought of a device surveilling them.

These devices aren’t always listening, but there are aspects of the technology consumers should be aware of, according to Umar Iqbal, an assistant professor of computer science and engineering at the McKelvey School of Engineering at Washington University in St. Louis.

“Consumers should be enabled to capitalize on the advantages that generative AI offers,” Iqbal stated. “The challenge is how to ensure that these technologies align with user expectations, especially regarding privacy and security.”

Iqbal has previously studied how Amazon uses smart speaker interaction data to infer user interests, then uses those interests to serve targeted ads.

He explained that smart speaker-based AI assistants are activated by specific trigger words, such as “Alexa” or “Siri,” which Amazon’s FAQ page calls “wake words.”
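The control flow this implies can be sketched in a few lines. This is an illustrative toy only: a real wake-word detector runs a small acoustic model on streaming audio, not text matching, and the function name and behavior here are assumptions, not Amazon's implementation. Audio is simulated as transcribed text to show when anything leaves the device.

```python
# Toy sketch of wake-word gating (not Amazon's actual implementation).
# Real devices run an on-device acoustic model; here "audio" is
# simulated as already-transcribed text to show the control flow.

WAKE_WORDS = ("alexa", "siri")  # example wake words from the article

def handle_audio(transcript: str) -> str:
    """Decide what a wake-word device does with a snippet of audio."""
    words = transcript.lower().split()
    if not words or words[0] not in WAKE_WORDS:
        # No wake word heard: the snippet is discarded locally and
        # nothing is transmitted anywhere.
        return "discarded locally"
    # Wake word detected: the rest is treated as a command and, under
    # Amazon's current policy, sent to cloud servers for processing.
    command = " ".join(words[1:])
    return f"sent to cloud: {command}"

print(handle_audio("what's for dinner tonight"))
print(handle_audio("Alexa play Dance by the Rolling Stones"))
```

The point of the gate is that ordinary conversation never crosses the network boundary; only speech that follows a detected wake word does.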

Recently, some Amazon users were notified of a change in how the company handles voice commands given to its AI assistant, Alexa. Users no longer have the option to process voice commands on the device itself; instead, all commands are sent to Amazon’s cloud servers for processing.

Historically, Amazon and other technology companies have used voice command data to personalize advertising for each user. Amazon now also wants to use these commands to improve Alexa+, the next generation of its AI assistant. Until recently, owners of three earlier Echo models could opt out of having their voice commands processed server-side, though only a very small fraction of English-speaking customers chose that option, according to Amazon.

The notification sent to Amazon users and shared with the press said that if a customer’s Echo device was set to “Don’t save recording,” the Voice ID feature would not work. Customers who opted out of saving recordings would lose many of the personal assistant-like features Voice ID provides.

Iqbal said the core issue is a lack of transparency. Users should have control over how their data is handled and ultimately deleted.

“From my perspective, the lack of transparency leads to a deficiency in trust,” he remarked.

Being less forthcoming about what happens to those recorded requests only amplifies existing fears about the technology, he added. Iqbal compared Alexa to chatbots like ChatGPT, noting that Alexa uses large language models to interpret requests. It is essentially a voice-driven chatbot.

Alexa runs on smart speaker hardware such as the Echo and on mobile devices. These devices lack the computing power to run large AI models, which is why Amazon sends recordings to the cloud for processing. However, Iqbal said there are other ways to deploy large language models that give users more control over what happens to their data.

For example, Apple products offer stronger protections around selectively sending data to the cloud and more control over data tracking.

“There are alternatives that are more favorable to users and their privacy,” he remarked.

As AI assistants take on more responsibilities for consumers, such as planning vacations and scheduling work events, the risk to data security grows as different algorithms exchange user information, potentially exposing users to data manipulation. Iqbal and his colleagues at McKelvey Engineering have built tools to mitigate these risks. One approach, called “IsolateGPT,” keeps external tools isolated from one another while still running within the system, letting AI assistants do their jobs while safeguarding user data.
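The actual IsolateGPT design is more involved than can be shown here, but the core idea of execution isolation can be sketched in miniature. In the toy below (all class and method names are hypothetical, not from the paper), each external tool runs in its own sandbox with its own private data, and a trusted router passes along only the specific request, so one tool can never read another’s state.

```python
# Toy illustration of execution isolation between assistant tools.
# This is NOT the IsolateGPT implementation; names are hypothetical.

class ToolSandbox:
    """A tool with its own private data store no other tool can reach."""

    def __init__(self, name: str):
        self.name = name
        self._private_log = []  # tool-local data, never shared

    def handle(self, request: str) -> str:
        self._private_log.append(request)  # stays inside this sandbox
        return f"{self.name} handled: {request}"


class Router:
    """Trusted core that dispatches requests but never shares tool state."""

    def __init__(self):
        self._tools = {}

    def register(self, tool: ToolSandbox) -> None:
        self._tools[tool.name] = tool

    def dispatch(self, tool_name: str, request: str) -> str:
        # Only the named request crosses the isolation boundary;
        # tools never receive references to one another.
        return self._tools[tool_name].handle(request)


router = Router()
router.register(ToolSandbox("calendar"))
router.register(ToolSandbox("travel"))
print(router.dispatch("travel", "book flight to Chicago"))
```

The design choice mirrors the article’s description: a compromised or malicious travel tool could manipulate its own responses, but it has no channel through which to read or alter the calendar tool’s data.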

The post How Alexa is listening appeared first on The Source.

