Google Lens vs. Apple Visual Intelligence: Has Google Already Won?

Google might have left Apple’s Visual Intelligence in the dust before it even exists on our phones. Let me explain. Hi, I’m Lexi. Welcome to your weekly dose of all things mobile.

When Apple announced Visual Intelligence at the iPhone 16 event, a lot of us were thinking the same thing: it sounds a lot like Google Lens, right? You press the Camera Control button, hold the iPhone up to something you’re curious about, and your phone will tell you all about it. It can add an event to the calendar, bring up ChatGPT for more help, or search Google if you see, say, a bike that you might want to buy. Some of this is similar to what you’ve been able to do with Google Lens for a while now. But Google recently added a range of new features to the Google app, like voice input and video search, that go beyond what Apple has shown us so far with Visual Intelligence. And get this: these features work on both the iOS and Android versions of the Google app. Even better, you can get this functionality right now on the iPhone, without having to wait for Apple to officially roll out Visual Intelligence in iOS 18.2, or whatever version it finally hits our iPhone 16s on. I’ll show you how to get a similar experience using the Google app on any iPhone, complete with a custom button press, in just a second.

OK, I know what you’re thinking: we don’t exactly know how Visual Intelligence will work yet beyond what Apple showed us at the keynote, because it’s not out and it’s not even in a beta release yet. And yes, I 100% agree, there are a lot of unknowns here. There’s also another element at play. Apple has something called Visual Look Up within the Photos app. You can take a photo or a video of a landmark, plant or pet and then tap that Visual Look Up button. How will this be any different from Visual Intelligence, or will Visual Intelligence tie into this existing system? So many unanswered questions. But given that I can do so much with the existing Google app right now on the iPhone, I’m not holding my breath for game-changing Visual Intelligence features.

On top of the Google Lens functionality that you might already know about, including identifying things that you take photos of and finding stuff to buy, here are the new things it can do. Voice input lets you hold the phone up to something, press and hold the shutter button on screen and ask a question. So if I have a random item in front of me, I can ask, “What is this?” and it will pull up an AI Overview using Gemini that gives me an idea of what it is, plus any other information I might want. You can also record a video and feed that into Google Lens to learn more about something, and even narrate a question while you do it, just like you would with a still capture. This one is available if you’re enrolled in the “AI Overviews and more” experiment within Labs; you can do that by tapping the Labs icon in the top left of the Google app and turning it on under AI Overviews. So say, for example, I’m trying to play a friendly game of pool, but I am hypercompetitive. I can pull out my phone, record a little video and ask, “Which one should I try and sink first?”

All right, so you’re ready to get some of that Visual Intelligence life on the iPhone right now.
So here is the formula to get Google Lens on the iPhone in a way that simulates what it might be like to use Visual Intelligence when it finally rolls out. First, make sure you have the Google app on the iPhone. Then open the Shortcuts app, tap the plus icon and type “Google” into the search actions bar; Google Lens should show up. Select it and then tap Done. Now you can go into your Action Button settings in the Settings app, swipe over to the Shortcut option and make sure to select the shortcut you’ve just made.

If you don’t have an iPhone with the Action Button, all is not lost: you can use Back Tap to start that same shortcut. Go to Accessibility, then Touch, then Back Tap, select either Double Tap or Triple Tap, depending on which one you prefer, and scroll down to choose the same shortcut. Ta-da.

One housekeeping note: if you try to run this shortcut from the lock screen, you might get a prompt asking if you want to run it, and you’ll need to unlock your iPhone. But if you have Face ID turned on, you’re probably looking at the phone anyway when you pull it out, and it should just unlock automatically. And one final trick: you can even run this shortcut from Control Center. Swipe down, long press, tap to add a control, search for Shortcuts and then add the shortcut you just made.

I’m curious to know whether searching with video or voice is something you’d find helpful. Would you even use it, or are you feeling more like these tech companies are falling over themselves to bring new AI features to market without a strong enough use case? For me, it’s still early days with all of these new features, but I can tell you for sure that my iPhone’s Action Button has well and truly been mapped to that Google Lens shortcut. Now, I really hope there is something more to Visual Intelligence than meets the eye when it finally rolls out. So game on, Apple; time to show us what you’ve got.

Thanks so much for watching. I hope you enjoyed the episode. Make sure to drop me a comment with your thoughts about Visual Intelligence, Google Lens and all things mobile. I’ll catch you next time.
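One extra tip for the tinkerers: the Shortcuts app has a documented run-shortcut URL scheme, so the same shortcut can also be launched from a plain link, for example shortcuts://run-shortcut?name=Google%20Lens saved as a Safari bookmark. If you write your own apps, here is a minimal Swift sketch of the same idea; it assumes you named the shortcut “Google Lens”, and the function name is just for illustration.

import UIKit

// Minimal sketch: launch the user-created "Google Lens" shortcut via the
// Shortcuts app's run-shortcut URL scheme. The name must match whatever
// you called the shortcut when you created it.
func runGoogleLensShortcut() {
    // Percent-encode the name so the space survives in the URL.
    guard let name = "Google Lens"
            .addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed),
          let url = URL(string: "shortcuts://run-shortcut?name=\(name)")
    else { return }
    // Hands off to the Shortcuts app, which runs the shortcut.
    UIApplication.shared.open(url)
}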
