Apple partners with third parties, like Google, on iPhone 16’s visual search | TechCrunch

Apple’s relationship with Google as its search partner is taking a new turn with the introduction of visual search, or “Visual Intelligence,” as the iPhone maker dubbed it Monday during its “It’s Glowtime” event. Already, Alphabet pays Apple roughly $20 billion per year to make Google the default search engine in the Safari browser. Now, iPhone 16 users will be able to access Google’s search engine, along with its visual search capabilities, with a click of the device’s new Camera Control button.

OpenAI’s ChatGPT, which is becoming accessible via Siri, was also shown as a third-party partner: in one demo, a user aimed the iPhone’s camera at class notes and, with a click of the button, got help understanding a concept or problem.

Apple explained that Camera Control lets users quickly take a photo or record video, and that they’ll be able to slide a finger across the button to frame a shot and adjust options like zoom, exposure, or depth of field in a new camera preview experience. However, the button also gives iPhone 16 users access to Apple’s new Visual Intelligence search feature, which is where the Google partnership comes in.

When first introduced, the iPhone 16’s Camera Control seemed like Apple lingo for “shutter button,” but as the event continued, Apple explained that the new hardware can do more. With Visual Intelligence, the button isn’t just an easy way to learn about the things in the camera’s view; it’s also another way to access third-party services without launching standalone apps.

Apple described Visual Intelligence, essentially a visual search feature similar to Google Lens or Pinterest Lens, as a way to instantly learn about everything you see. Across a few examples, Apple demonstrated how you could click the Camera Control button to pull up information about a restaurant you spotted while out in town, or to identify the breed of a dog you saw on your walk. The feature could also transform an event poster tacked on a wall into a calendar entry with all the details included.

Apple’s Senior Vice President of Software Engineering Craig Federighi then casually mentioned that the feature could be used to access Google search, too.

“The Camera Control is also your gateway to third-party tools, making it super fast and easy to tap into their specific domain expertise. So, if you come across a bike that looks exactly like the kind you’re in the market for, just tap to search Google for where you can buy something similar,” he said.

Image Credits: Apple

The demo showed a person tapping the Camera Control button while aiming their iPhone at a bike, then reviewing an array of similar options available for purchase in a pop-up window overlaid on the camera’s view. The grid of images and descriptions of the matching bikes was followed by a smaller onscreen button that read “More results from Google,” indicating you could continue your Google search with another tap.

What Apple didn’t explain is how or when a push of the Camera Control button would know to turn to a third-party partner for an answer rather than a built-in Apple service — like Apple Maps, which was shown in the demo about the restaurant. Nor did the company fully explain how users would be able to control or configure this feature. Instead, Federighi said, somewhat vaguely, “Of course, you’re always in control of when third-party tools are used.”

Reached for comment, a Google spokesperson said the company didn’t have anything to share on its partnership at this stage. Apple didn’t respond to a request for comment. However, we understand the deal is a part of the two companies’ existing relationship and does not involve Google’s Gemini AI.

What’s interesting about this feature is that it presents a new paradigm for interacting with software and services beyond those that Apple ships with the iPhone. And it arrives at a time when the concept of an App Store has begun to feel dated.

With AI technology, users can ask questions, perform productivity tasks, get creative with images and video, and more. Consumers used to turn to apps for those things, but they can now do them from a new interface: talking and texting with an AI assistant.

Instead of rushing to build its own competitor to ChatGPT, Apple is presenting itself as the platform to reach third-party services, including AI technologies, search services, and likely other providers in the future. What’s more, it can make these connections by way of behind-the-scenes deals with partners — like its partnership with OpenAI on select AI features — instead of tapping into transactions taking place inside the apps as a means of generating revenue.

It also smartly keeps Apple’s reputation from taking a hit when a third party, like ChatGPT, gets things wrong (as AIs tend to do) or when a Google Search doesn’t yield helpful results.

Editor’s note, 1:15 p.m. ET: An earlier version of this article reversed Apple and Alphabet in the intro paragraph about payments. This has been corrected.
