What happens when the computers around you all but disappear?

Tiny sensors built into walls, household products, what you’re wearing, and perhaps your own body will make computers invisible to the eye, but responsive to a gesture, a voice command, and perhaps even your movement as you walk into a room.

It is still very early, but the era of ambient computing is slowly taking shape, whether in the form of the voice-driven smart speaker on your kitchen countertop, or via the Internet of Things (IoT) devices and appliances that are designed to blend into the background. It’s a vision fueled by advances in artificial intelligence, speech recognition, natural language processing, machine learning, and cloud computing.

More: The real cost of setting up a smart home

Stakeholders include the giants of tech: Amazon, Apple, Google, IBM, Microsoft, and Samsung. But disruption may also come from companies not yet on the public’s radar.

“The interesting ones will be… the Ubers of the IoT and the ambient world,” says Daryl Cromer, vice president of subsystem research at Lenovo Research. “And that’s what we’re still looking for.”

No one is suggesting that screens and keyboards are going to go away entirely, or that you’ll stop reaching for the smartphone.

“We still believe devices will play a huge part. They do certain tasks better than anything else, (and) provide a level of privacy, convenience and security that cannot be matched,” Cromer says.

But some of the regular features of our daily life may become computer-driven, without the tap of a finger. Imagine this:

Your autonomous car pulls into your driveway and the garage door opens, the front door unlocks, and the lights inside the house flip on. The temperature is already set to your liking, and the ideal music for the moment starts to play, tuned to your mood. You’re reminded of a conference call you have to jump on an hour later, and are told it’s time to take your medicine.

Invisible sensors, feeding your movements and routines into cloud-computing servers where artificial intelligence systems absorb the data and refine the directions they give to smart devices, will help make such scenarios happen.

The computers are watching

This ambient computing future is straight out of the world sci-fi writers envisioned for us decades ago — from leaps in communication and medicine to the potential for Big Brother-type surveillance.

Facebook is working on tech that will let you “hear” with your skin, an advance that could help people with hearing disorders.

Inventor and futurist Ray Kurzweil of Google predicts that in the 2030s, “we will have devices that are as powerful as your cellphones today that are the size of blood cells,” to keep us healthy.

There’s a perilous side, too. Tesla and SpaceX CEO Elon Musk and physicist Stephen Hawking have raised alarms that this artificial intelligence-driven future could lead to World War III, a dark cataclysm for human civilization.

It’s imperative that the companies pushing ambient computing pay heed to privacy and security. The specter of a government or organization exploiting these smart, data-hungry devices looms large.

Ambient intelligence is likely to expand through the continuing widespread deployment of sensors into devices capable of not only gathering information but also reporting it back to systems run by the tech giants.

Vast amounts of data will reside in the cloud, while devices need to have “local” intelligence as well.

Battery life is crucial. “If you have to think about charging something it becomes less habitual,” says Dave Limp, senior vice president for devices & services at Amazon.com. “In an ambient computing world, the place where it works best is where it’s always on. That’s why I don’t think we’ve quite figured out ambient computing on mobile yet.”

Eventually, more devices and sensors will talk to one another, and begin to understand your “intent” or objective. Services constructed around such objectives will presumably follow.

Amazon’s push into an ambient intelligent environment is built around Alexa, the digital voice inside the company’s Echo-branded speakers.

More: Amazon’s Alexa is seemingly everywhere — except an Amazon phone

More: Microsoft-Harman answer to Amazon Echo is promising, pricey, and plays catch-up

More: Google Home, Amazon Echo, Apple HomePod — or all 3? How to choose a smart speaker


“We do envision this world where Alexa can be everywhere—in devices we make, in devices third parties make, in homes, in coffeemakers, (and) dishwashers,” says Toni Reid, vice president of Alexa Experience & Echo Devices at Amazon.com. “We definitely think that voice is the future of how we control technology.”

Google is pursuing a similar strategy around the Google Assistant and Google Home product line. Apple’s path includes Siri, an upcoming HomePod smart speaker, and the company’s HomeKit smart-home platform. Samsung owns the SmartThings line of smart home products and has teamed up through its Harman Kardon subsidiary with Microsoft on a speaker that uses Microsoft’s Cortana digital assistant.

“I think ambient in the home is going to take a lot of different forms but it’s going to be driven, especially for the next several years, around smart speakers and their extensions,” said Bob O’Donnell, the president and chief analyst at TECHnalysis Research.

Voice by its nature is invisible, but the results of your vocal request need not be. Amazon is also placing Alexa into products with screens, including the company’s Echo Show and Echo Spot speakers, as well as its Fire TV streaming devices and tablets.

Removing screens poses difficult computer science problems, says Rishi Chandra, a Google vice president who is general manager of Google Home products.

“The advantage of a UI on a phone is it tells you what it can do (and) it sets constraints… The moment you take that away it’s like, ‘Oh great, this thing can do anything I want it to do,’ and you start throwing things at it. We’re finding people asking crazy questions and it’s great because that’s the bar that we have to hit.”

Making AI assistants more human

Another goal is to make our exchanges with the digital assistants more conversational.

Machines “need to be able to look at my face and say, ‘Am I happy or sad?’ and based on that decide what is the right thing to do,” said Jamshid Vayghan, the global chief technology officer at IBM Global Business Services.

Does that mean Alexa will get testy if you’re short with her? Probably not. But “you can imagine scenarios where based on your level of frustration maybe our responses change,” Amazon’s Reid says.

Amazon recently launched Alexa Routines, a series of customizable actions you can create from the Alexa app on your phone. A simple way to start is with an “Alexa, start my day” command that may include having the speaker report the weather and traffic, play the news, and turn on the lights in your smart home.

Besides their main chore of vacuuming your floors, some of iRobot’s Roomba robots can automatically create a map of your home. For now the map helps the Roomba clean, but eventually it could factor in lights, sensors, and the like, paving the way for smart home services. “So we can automatically do that logical programming for you in a way that the consumer is not burdened,” says iRobot vice president of technology Chris Jones. Jones insists user privacy is protected. For now you can control some Roombas by voice by linking them to Amazon Alexa or the Google Assistant.

iRobot CEO Colin Angle took some heat for a July Reuters story suggesting the company could sell map data to Apple, Amazon or Alphabet (Google). iRobot disputed that account, which was corrected. iRobot says it does not share customer mapping data.

Ambient computing extends well beyond the home, too. Samsung, for example, is working with partners on pilot projects to embed cameras and sensors in brick-and-mortar stores, with the goal of closing the knowledge gap between a shopper’s journey online and one in a physical location. Without capturing personally identifiable information, Samsung can track shoppers as they move through a store, determining where they spend their “dwell time,” what products they’re looking at, and how they’re interacting with the physical space. Through facial recognition, Samsung can determine a shopper’s gender and approximate age.

“If you walk up to a wall of shoes, it doesn’t know which shoes you took off the shelf, but it does know that you’re standing in front of running shoes versus hiking boots or dress shoes versus flip flops,” says Ted Brodheim, a Samsung vice president for vertical business.

Tech may be out of sight but it should never be too far out of mind.
