The era of the AI agent is upon us. Artificial personalities such as Siri, Cortana and Alexa are finding their way into our lives. Most of the time, they help out by giving us information or completing tasks at our request. But sometimes they act up a bit and do something we didn't anticipate, like order a dollhouse.
That's what Amazon's Alexa, a personal assistant program found on Amazon Echo and related devices, reportedly did for some television viewers in San Diego, California. During a recent morning report on the CW6 News Station, anchors talked about a 6-year-old Texas girl who used her parents' Echo to order a dollhouse and 4 pounds (1.8 kilograms) of cookies. Then the trouble began.
While the anchors reported on the story, they apparently triggered several Echo devices owned by people watching on television.
The affected devices dutifully put in an order for a dollhouse. The dollhouse real estate market experienced a brief boom. There's no word on how many, if any, of those orders went through as actual purchases.
The humorous story illustrates the challenges companies face when designing voice-activated digital assistants. Amazon enables online purchases through Alexa by default, which makes sense from Amazon's perspective. Owners can change the settings on their Alexa-enabled devices to require authentication before a purchase goes through, but that responsibility falls to the owner, not Amazon.
I can report that Google Home sometimes reacts to audio from television, too. I own a Google Home device, and it has piped up a few times while I was watching something on TV. Fortunately, it has yet to place any orders. Considering that I've been watching a lot of “It's Always Sunny in Philadelphia,” I am thankful for that. The worst I've experienced is Google Home protesting that it didn't understand what it thought I was saying.
A more sobering aspect of this story is the realization that these devices are always listening to what is happening within an environment. They have to listen in order to respond when you give the command phrase. Questions remain as to how much of our conversations are cached in the memory of these devices and how secure the gadgets are from hackers. It's not hard to imagine a poorly secured AI agent effectively transforming into a microphone that records everything going on in your house. Wired ran a great story on this very issue in December, in case you're interested in doing some further reading.
Perhaps in the future more companies will enable authentication by default to prevent accidental purchases and other unintended actions. It would be awkward, for example, if your AI assistant ordered a cab for you every time it heard someone on TV do so. Or maybe we'll see improvements to voice recognition technology so that these devices can tell the difference between their owners' voices and everyone else's.
In the meantime, it's a good idea to research any device that uses voice activation before you adopt it. For some people, the potential impact on privacy might be too great for comfort. For others, a few surprise dollhouses might be a small price to pay for digital assistance.