
Google has announced that it will expand the capabilities of a new feature for the Nest Hub and Hub Max that started rolling out in November. It uses ultrasonic presence sensing to detect when people are around so the devices can automatically customize their user interfaces and the information they display.
The new feature uses the built-in speakers on the Nest Hub and Hub Max to emit high-frequency sounds and then listens for the reflections with their microphones (echolocation, in other words). With this capability, the Nest Hub and Hub Max can detect people up to five feet away and adapt what’s shown on-screen, such as automatically increasing font sizes when you’re farther away or exposing touch controls as you approach.
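To make the mechanism concrete, here is a minimal sketch of how that kind of echo ranging can work in principle. Google hasn’t published its implementation, so the chirp parameters, sample rate, and the simulated echo below are assumptions for illustration, not the Nest Hub’s actual signal processing.

```python
import numpy as np

FS = 96_000             # sample rate in Hz (assumed); must exceed twice the chirp frequency
SPEED_OF_SOUND = 343.0  # metres per second at room temperature

def make_chirp(duration=0.01, f0=18_000, f1=22_000):
    """A short sweep just above the range most people can hear (frequencies assumed)."""
    t = np.arange(int(duration * FS)) / FS
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t ** 2 / (2 * duration)))

def estimate_distance(chirp, recording):
    """Cross-correlate the mic signal with the emitted chirp, find the echo peak
    relative to the direct speaker-to-mic path, and convert the delay to metres."""
    corr = np.correlate(recording, chirp, mode="valid")
    direct = int(np.argmax(corr))                  # strongest peak: direct path
    corr[max(direct - 100, 0):direct + 100] = 0    # suppress it so the echo stands out
    echo = int(np.argmax(corr))
    delay_s = (echo - direct) / FS
    return delay_s * SPEED_OF_SOUND / 2            # halve: sound travels out and back

# Simulate a reflection from someone standing about 1.2 m away: the "recording"
# is the direct chirp plus a delayed, attenuated copy of it.
chirp = make_chirp()
delay = int(2 * 1.2 / SPEED_OF_SOUND * FS)
recording = np.zeros(len(chirp) + delay + 1_000)
recording[:len(chirp)] += chirp                      # direct speaker-to-mic path
recording[delay:delay + len(chirp)] += 0.5 * chirp   # echo off a nearby person
print(f"estimated distance: {estimate_distance(chirp, recording):.2f} m")
```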
Since the ultrasonic data isn’t detailed enough for something like identifying specific people, the devices can’t tailor the experience to who is using them. The sensing is, however, precise enough to tell when someone is in front of the display and roughly how far away they are. “If you’re close, the screen will show you more details and touch controls, and when you’re further away, the screen changes to show only the most important information in larger text,” Google explains.
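In practice, the behaviour Google describes amounts to a coarse presence-and-distance reading driving the layout. Here is a hypothetical sketch of that logic, assuming the roughly five-foot (about 1.5 m) detection range mentioned above and a made-up near/far threshold; the mode names and cutoffs are not Google’s.

```python
def pick_layout(distance_m):
    """Choose a UI mode from an estimated distance in metres (thresholds are assumptions)."""
    if distance_m is None or distance_m > 1.5:   # nobody detected within ~5 ft
        return "ambient"        # e.g. clock or photo frame
    if distance_m < 0.8:        # within reach: full detail plus touch controls
        return "detailed"
    return "glanceable"         # in the room but at a distance: large text, essentials only

for d in (0.4, 1.2, 3.0, None):
    print(d, "->", pick_layout(d))
```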
So far, the ultrasonic sensing capabilities baked into the Nest Hub and Hub Max are mostly limited to timers. For instance, if you set a timer to remind you when your food has finished cooking and glance at the device from across the room, it will automatically make that information easier to read. Google also says the feature works for commute times and weather, but we haven’t witnessed that for ourselves yet.
Going forward, the company plans to extend ultrasound sensing to reminders, appointments, and alerts. In the coming weeks, expect those parts of the Nest Hub and Hub Max interface to start adjusting themselves, surfacing the important information when you’re far away and the finer details when you’re up close.
Google describes the goal in its own words: “We wanted to create a better experience for people who have low vision. So we set out to create a way for more people to easily see our display from any distance in a room, without compromising the useful information the display could show when nearby. The result is a feature we call ultrasound sensing.

“We needed to find a sensing technology that could detect whether you were close to a device or far away from it and show you the right things based on that distance, while protecting people’s privacy. Our engineers landed on one that was completely new to Google Assistant products, but has been used in the animal kingdom for eons: echolocation.”