Apple WWDC 2021: Alongside announcements of upcoming updates to iOS, iPadOS, tvOS and watchOS, Apple on Monday announced at its annual WWDC developer conference that it is adding improved Spotlight search and a new Live Text feature, both powered by on-device intelligence.

Live Text is a new OCR-like feature that will let users point their camera at an object or scene and immediately interact with any text visible on the screen. For example, a user can frame a cafe’s sign in the camera viewfinder and tap the contact information to place a call. Apple says you can simply copy text from a whiteboard by pointing your camera at it, then paste it into a mail app and send it to someone.

Spotlight now searches photos by location, people, scenes, and objects. (Apple)

Meanwhile, Visual Look Up on iOS 15 and iPadOS 15 will let users point their cameras at landmarks, plants and flowers, popular art, pet breeds and even books to look them up. The feature sounds rather similar to Google Lens, which already offers this capability. Apple also says Spotlight is getting a lot smarter, with the ability to find text and handwriting in your existing photos, which is rather neat.


While Google Lens has offered features like these for a while now, Apple is touting the privacy benefits of Live Text and Visual Look Up for identifying text on the camera and in existing images. Since all the processing happens on the iPhone or iPad itself, using on-device intelligence, none of the information about those images, or about what you see in your camera viewfinder, ever leaves your device.
