AI as manifested in assistants and chat bots is the interface, the surface. What's happening deep inside the silicon is something else entirely...
I've mentioned before how I think what's going on behind the scenes with Artificial Intelligence, beyond the assistant- and chat-based interfaces, is every bit as exciting as what we're already seeing. Apple's kept quiet on their own plans in the area for a long, long time. This week, though, Tim Cook let just a little bit out.
Talking with Nikkei, Cook said:
Apple intends to capitalize on AI in various ways, in cooperation with Japanese companies. AI is "horizontal in nature, running across all products" and is used "in ways that most people don't even think about."
"We want the AI to increase your battery life" and recommend music to Apple Music subscribers, he continued. As another example, he said AI could "help you remember where you parked your car."
It's the "increase your battery life" part that's especially interesting here. There's no way to know what Cook is referring to specifically, but some reasonable extrapolations can be made, given Apple's products and services to date.
Apple holds a massive and growing advantage in silicon: they make their own chipsets, from the A10 Fusion that drives the iPhone and, soon, the iPad, to the M10 motion coprocessor, the S2 system-in-package for Apple Watch, the W1 wireless chip in AirPods, and even the timing controller for the 5K iMac, with more on the way.
That means they can build the atoms specifically to support the bits, and vice versa. It's why the iPhone can perform better with fewer cores and deliver greater power efficiency even with a smaller battery.
We've seen Apple's AI, Machine Learning (ML), and Computer Vision (CV) efforts to date in everything from Siri to Proactive to Photos search. That's all in the software. What happens when it's baked into the silicon?
Previously we could launch apps with icons. Now we can launch actions with 3D Touch. What happens when iPhone has a reasonable chance of predicting what app and what action you're going to launch next — at the chipset level?
What happens when it can intelligently manage power not just based on what you are doing, but what it's learned you will be doing?
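To make the idea above concrete, here's a minimal sketch of learned launch prediction. This is purely illustrative, a first-order Markov model over app launches in Python; Apple hasn't said how (or even whether) they model this, and the app names and probabilities are invented for the example:

```python
from collections import Counter, defaultdict

class LaunchPredictor:
    """Toy predictor: learns which app usually follows the current one."""

    def __init__(self):
        # transitions[a][b] counts how often app b was launched right after app a
        self.transitions = defaultdict(Counter)
        self.last_app = None

    def record_launch(self, app):
        if self.last_app is not None:
            self.transitions[self.last_app][app] += 1
        self.last_app = app

    def predict_next(self):
        """Return the most likely next app given the current one, or None."""
        if self.last_app not in self.transitions:
            return None
        follower, _ = self.transitions[self.last_app].most_common(1)[0]
        return follower

predictor = LaunchPredictor()
for app in ["Mail", "Safari", "Mail", "Safari", "Mail", "Music"]:
    predictor.record_launch(app)

# After another "Mail" launch, the model expects "Safari" next: a system
# with this signal could pre-warm that app, or decline to spin up
# high-power cores for anything else.
predictor.record_launch("Mail")
print(predictor.predict_next())
```

The payoff for power management is that a prediction like this lets the scheduler prepare only for the likely next action instead of keeping everything ready, which is exactly the kind of win that shows up as battery life.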
Worst case, if it messes up or you do something out of the ordinary, it's a wash. Best case, efficiency improves significantly.
Cook is remarkably straightforward when he speaks. When he says AI is "horizontal in nature, running across all products," take him at his word and consider the implication, then consider it again from a company that has full vertical integration from server to interface to device chipset, and will eventually have AI running horizontally across all of it.
A10 Fusion and Siri would have nothing on that stack.
And best of all, it fits in perfectly with Apple's stance on security and privacy. They still won't need to harvest your data to feed their machine. They can learn from your data locally, combine what's needed from the public cloud, and give you all the benefits.
It's an incredible advantage, if Apple can seize it. Because that's the equally big challenge here, and it's something the company has struggled with since ramping up its AI efforts with Siri.
Google, Amazon, and Facebook don't go down to the metal, at least not yet. But Apple needs to not only reach for those clouds — they need to nail them.