Google has Tensor cores. Could Apple soon have AI cores of its own?

Apple is already driving mobile silicon in a way few, if any, other companies have been able to keep up with. Now, a new report claims Apple will use its considerable chipset chops to accelerate artificial intelligence as well.

Mark Gurman, writing for Bloomberg:

Apple is working on a processor devoted specifically to AI-related tasks, according to a person familiar with the matter. The chip, known internally as the Apple Neural Engine, would improve the way the company's devices handle tasks that would otherwise require human intelligence -- such as facial recognition and speech recognition.

If true, this should surprise no one. Arguably, Apple's biggest advantage in computing right now is that the company custom crafts the complete stack, from atom to bit to pixel. That includes an ever-increasing amount of custom silicon.

A few years ago, Apple offloaded motion tracking from its main A-series system-on-a-chip to an M-series sensor fusion hub. Later, it added onboard natural language parsing to the M-series to enable always-on "Hey Siri" in as power-efficient a way as possible.

The latest A-series chip, the A10 Fusion, combines two high-efficiency cores with two high-performance cores to try to eke out as much battery life as possible while pushing pixels and bits as fast as possible.

Anything that makes sense to offload for greater efficiency and performance, you can bet Apple is working on offloading. Including, and especially, AI, since it's such a hot topic these days.

While Siri gave Apple an early advantage in voice recognition, competitors have since been more aggressive in deploying AI across their product lines, including Amazon's Echo and Google's Home digital assistants.

This part reads oddly to me. Nothing against Mark, but it repeats a particularly bad narrative that's pervasive across U.S. media.

Google Home is a single product. Echo is a growing line of similar, home-based products. Apple has pushed Siri from iPhone to and through iPad, Apple Watch, Apple TV, Mac, CarPlay, HomeKit, and accessories like AirPods. Apple has also expanded Siri across dozens of languages, including Chinese, Hebrew, and Arabic, as well as dozens of regions.

It's fair to say no other vendor has yet been as aggressive as Apple in deploying virtual assistants across product lines or to customers around the world.

"Two of the areas that Apple is betting its future on require AI," said Gene Munster, former Apple analyst and co-founder of venture capital firm Loup Ventures. "At the core of augmented reality and self-driving cars is artificial intelligence."

This also reads oddly to me. Almost everything Apple does already requires AI. AI is going to be everywhere in computing the same way math is everywhere.

There's nothing unique about special projects like augmented reality or cars. Machine learning, computer vision, and related technologies are already powering everything from battery efficiency to photo tagging, sequential inference to computational photography.

Just because Google made it sound like it invented AI on the I/O 2016 stage, and AI is now parroted alongside AR and VR the way mobile/social/local was a decade ago, doesn't mean companies like Apple (and Google itself) haven't been working on it for years.

What's interesting isn't that the big tech giants are working on AI; it's how they'll use it to solve real problems in the information and computing space and, in Apple's case, how the company will solve those problems while protecting our security and privacy at the same time.