Alongside the renewed Nest Learning Thermostat and Google TV Streamer launched today, Google quietly revealed details about new improvements coming to the Google Home platform. They all include Gemini in one form or another. Nest security cameras will get a new camera intelligence ability, while Google Home routines should become easier to manage than they are now. The Google Assistant is also getting a minor update, despite its shrinking footprint on the Android side of the ecosystem.
For current Nest Cameras, Gemini will soon enable the devices to generate more accurate descriptions of what’s happening in a clip before you press play. The demonstration animation on Google’s blog shows the ability to type in a search query like “Did the kids leave their bikes in the driveway?” The Google Home app will then check whether there’s footage of that particular action in the recent archives. It’s more convenient than the current method of slowly sifting through each clip to find the one with the thing that happened—sometimes, I’m scrubbing through 30 seconds of video before I realize it’s not the clip I was looking for. I’m curious how specific I can get with the queries. I reached out to Google to ask if this ability is coming to older Nest security cameras—the ones launched before the 2021 refresh—and will update when I hear back.
Rather than thinking up routines yourself, you can use a new capability in the Google Home app that asks Gemini to generate one instead. If you type in something like “lock the doors and turn off all the lights at bedtime,” an example offered by Google, Gemini will turn that request into an automated routine. Again, I am interested in how specific Gemini will allow its users to get with these generated routines.
Google also gave us a glimpse of where the Google Assistant is heading, and it doesn’t seem to be getting phased out any time soon, which is good news if you’ve been getting frustrated with your Google-led smart home, as I have. The Assistant is getting a voice update later this year that should make it more contextual and able to pick out specific commands within a conversation. This language-model upgrade should enhance what the Assistant can currently do. The Verge reports the Assistant won’t need “specific nomenclature” to work and that it will be able to understand “ums” and “uhs,” in the sense that it’ll wait for you to finish a thought.
These new features aren’t available in the smart home just yet. Google will roll them out later this year in the Public Preview, and only to Nest Aware subscribers. If you’re not paying for the subscription, you won’t get the prompt to try them once they’re available. At the very least, Google has a specific plan, so we’re not entirely in the dark about what’s next for the Assistant. We have yet to see how it will live alongside the Gemini chatbot integrated into Android.