Apple held its big event yesterday, and during the iPhone portion of the keynote, it introduced Apple Visual Intelligence, a feature that lets you take photos of anything and use Apple Intelligence for local search, shopping, homework help, and more.
Apple is using Google, ChatGPT, and I believe possibly Yelp and OpenTable with Apple Maps for these integrations. And yes, this looks a lot like Google Lens…
This part of the keynote begins at around the 57-minute mark, but let me share some screenshots.
Here is a man taking a photo of a restaurant’s storefront to learn more about the restaurant – I believe via Yelp and OpenTable?
Here are the results:
Then this one shows taking a photo of a bike to search Google for that product and its pricing:
The results look tailored:
And then getting help with homework using ChatGPT:
Here is the video embed at the start time, if you want to watch:
So Apple is implementing AI as tools, essentially as built-in apps.
Thoughts?
Forum discussion at X.