AI has
become an integral part of every tech company’s pitch to consumers. Fail to
hype up machine learning or neural networks when unveiling a new product and
you might as well be hawking hand-cranked calculators.
This can
lead to overpromising. But judging by its recent WWDC performance, Apple has
adopted a smarter and quieter approach. Sprinkled throughout Apple’s
announcements about iOS, iPadOS, and macOS were a number of features and
updates that have machine learning at their heart.
Some weren’t
announced onstage, and some features that almost certainly use AI weren’t
identified as such, but here’s a quick recap of the more prominent mentions
that we spotted:

- Facial recognition for HomeKit. HomeKit-enabled smart cameras will use photos you’ve tagged on your phone to identify who’s at your door and even announce them by name.
- Native sleep tracking for the Apple Watch. This uses machine learning to classify your movements and detect when you’re sleeping. The same mechanism also allows the Apple Watch to track new activities like dancing and...
- Handwashing. The Apple Watch not only detects the motion but also the sound of handwashing, starting a countdown timer to make sure you’re washing for as long as needed.
- App Library suggestions. A folder in the new App Library layout will use “on-device intelligence” to show apps you’re “likely to need next.” It’s small but potentially useful.
- Translate app. This works completely offline, thanks to on-device machine learning. It detects the languages being spoken and can even do live translations of conversations.
- Sound alerts in iOS 14. This accessibility feature wasn’t mentioned onstage, but it will let your iPhone listen for things like doorbells, sirens, dogs barking, or babies crying.
- Handwriting recognition for iPad. This wasn’t specifically identified as an AI-powered feature, but we’d bet dollars to donuts it is. AI is fantastic at image recognition tasks, and identifying both Chinese and English characters is a fitting challenge.
There are absences in this
list — most notably Siri, Apple’s perennially disappointing digital assistant.
Although Siri is AI-heavy, it mostly got cosmetic updates this year (oh, and
“20 times more facts,” whatever that means). A new interface is a welcome
change for sure, but it’s small fry when you compare Siri’s overall performance
with that of other AI assistants.
What these
updates do show, though, is Apple’s interest in using machine learning to
deliver small conveniences rather than some grand, unifying “AI” project.
Other tech companies have promised exactly that sort of project with their own
digital assistants, claiming they will seamlessly improve your life by
scheduling your calendar, preempting your commute, and so on.
That grander project was always going to be a
failure, because AI, for all its prowess, is basically just extremely good
pattern-matching software. This has myriad uses — and some are incredibly
unexpected — but it doesn’t mean computers can parse the very human
complexities of something as ordinary as your calendar appointments, a task
that relies on numerous unspoken rules about your priorities, routine, likes
and dislikes, and more.
The best
example of Apple’s approach is the new handwashing feature on the Apple Watch,
which uses AI to identify when you’re scrubbing your mitts and starts a timer.
It’s a small and silly feature, but one that asks little of the user while
delivering a useful function.
This is a
strong tactic for Apple that plays to the company’s long-held reputation —
deserved or not — for delivering software that “just works.” It also avoids the
sort of iterative, tech-for-tech’s-sake update that can fall flat with the
average consumer, like Samsung’s Bixby.