Pixel 4 event was bad — but Apple could still learn something from Google

It was like watching Pitch Meeting, with Studio Guy asking how they were going to fill the runtime. With amazing new products and services? And Writer Guy responding: no, with excruciatingly slow design slam poetry and on-stage interviews. Will it be hard to have an event and show almost nothing off? Super easy, barely an inconvenience. About an hour in, we'll just say we have so much more to show you, but we've simply run out of time. And then cut off the event.

It was like watching Game of Thrones Season 8, offering them more episodes, begging them for more episodes, and just watching them mic drop and end it… like that.

Anyway. The products and technology that did manage to somehow sneak out on stage were so good that they almost make up for the obvious lack of planning and organization that went into the event, and the extreme disrespect shown to the audience, both live and streaming. Almost.

But, I'll complain later. Right now, I want to focus on the positive. That cool tech and what I very much think Apple can learn from it.

Face Unlock

We got to see more of the Pixel phones than we did the Pixelbook. Poor lappy little bastard. Whatever. The Pixel 4 uses similar, if not the same, facial geometry scanning tech the iPhone has had since 2017.

There are two main differences, though. First, there's a radar chip, formerly called Project Soli, now called Motion Sense, and it's all housed in a huge forehead assembly.

It remains to be seen what, if any, difference that makes. Google committed the classic blunder of introducing new chipsets, not feature sets.

Like, Apple didn't say anything about the U1 spatial positioning chip in the iPhone 11 at the September event because there was nothing yet to say. When/if the Tags get announced, they'll focus on that.

Google showed tickle-me-Pikachu and Eevee, which aligns with my personal interests but is a pure stunt at this point. And, actually, I think I'll be more excited about it when it comes to devices without displays, where it might be a primary control mechanism, like voice.

Anyway, the second difference is a setting that lets you choose if you want to see the lock screen or not.

With the iPhone, you have no choice. You see the lock screen unless and until you swipe up, which makes the whole Face ID system feel slow.

Letting us set it to open on unlock, if we want it that way, would be a terrific option and make Face ID feel as fast as it is.
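For what it's worth, that instant-unlock feel already exists at the app level. Here's a minimal Swift sketch using Apple's LocalAuthentication framework, where a successful Face ID match fires a callback immediately and you can jump straight to content, no extra swipe required (showUnlockedContent() is a hypothetical placeholder for your own navigation):

```swift
import LocalAuthentication

func unlockWithFaceID() {
    let context = LAContext()
    var error: NSError?

    // Make sure biometrics (Face ID or Touch ID) are available and enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your content") { success, _ in
        guard success else { return }
        // The match already happened; no extra swipe or tap needed.
        DispatchQueue.main.async {
            showUnlockedContent() // hypothetical: whatever your app shows next
        }
    }
}
```

System-wide, that's exactly the option Apple should offer: match the face, skip the lock screen, done.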

Super Zoom

Google does this weird thing with Pixel camera presentations, where they go out of their way to mention they don't need certain commonly adopted camera hardware because their software is so good… only to add the exact hardware a year later.

Optical image stabilization, the camera bump, a second camera. No one would even notice if they didn't make such a big deal about not needing them… when it's obvious they do. This year it was the ultra-wide angle, so one guess as to what we'll see next year?

But, what they do with it, that's magic.

In between slagging Apple for "catching up" in semantics and fusion, while at the same time catching up to Apple in virtual lens modeling and portrait depth data, Google still found time to show off some really damn impressive digital zoom technology.

It builds on last year's Super Res Zoom software by adding in the optical zoom of the new telephoto lens. And it looks terrific.
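To make the idea concrete, here's a heavily simplified, hypothetical Swift sketch of the multi-frame trick behind Super Res Zoom: several slightly offset low-resolution frames (the offsets come from natural hand shake) get mapped onto a finer grid and averaged. Google's real pipeline adds robust alignment, kernel regression, and much more; this assumes grayscale frames and already-estimated sub-pixel offsets, and shows only the accumulation step:

```swift
// Merge several low-res frames into one higher-res image.
// frames: array of height x width grayscale images
// offsets: per-frame sub-pixel shifts (dy, dx), in low-res pixels
// scale: upsampling factor for the output grid
func mergeFrames(frames: [[[Double]]],
                 offsets: [(dy: Double, dx: Double)],
                 scale: Int = 2) -> [[Double]] {
    let h = frames[0].count, w = frames[0][0].count
    let H = h * scale, W = w * scale
    var acc = Array(repeating: Array(repeating: 0.0, count: W), count: H)
    var weight = acc

    for (frame, offset) in zip(frames, offsets) {
        for y in 0..<h {
            for x in 0..<w {
                // Map each low-res sample to its nearest high-res cell.
                let hy = min(max(Int(((Double(y) + offset.dy) * Double(scale)).rounded()), 0), H - 1)
                let hx = min(max(Int(((Double(x) + offset.dx) * Double(scale)).rounded()), 0), W - 1)
                acc[hy][hx] += frame[y][x]
                weight[hy][hx] += 1
            }
        }
    }

    // Average wherever samples landed; untouched cells stay zero.
    for y in 0..<H {
        for x in 0..<W where weight[y][x] > 0 {
            acc[y][x] /= weight[y][x]
        }
    }
    return acc
}
```

The sub-pixel offsets are the whole game: each frame lands on slightly different high-res cells, so together they recover detail no single frame contains.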

Home

Google has gone and done a lot of what I'd been hoping Apple would do for the home. What Apple typically does for a lot of things: Integrate. I got kinda confused over all the Home and Nest branding, but what Google basically did was combine their mesh Wi-Fi product with their home assistant and speaker product to make something that really is the best of many worlds.

Not only does it mean you have less stuff to buy, but it means the stuff you buy works better. Your assistant isn't on the network, it is the network, and your router isn't hidden away, it's out on display.

I still think Apple killing, instead of evolving, the AirPort router was one of their biggest mistakes in the modern era. I'm going to do a whole video on this. Again. Because it's just so frustrating. But an Apple mesh router system that, sure, you could buy independently but would also come in every HomePod and every Apple TV, and could pair seamlessly with an iPad while charging it for bedroom or kitchen use would just be so killer. Like planet killer.

I don't want a Google box as the endpoint of my home internet connection. But right now, Apple, the privacy company, isn't doing anything to help with that crucial bit of infrastructure.

And this is one of the very few times I'll armchair spend their money and resources to say they absolutely should be.

Ambient computing

The beginning of Google's event was a complete structural mess, with them jumping around from product to product, as though they thought the narrative was tailing them and they wanted to lose it as quickly as possible.

But the overall theme was one very near and dear to me: Ambient computing.

It's going to be a huge part of the future, both far-field, with the aforementioned room speakers, and near-field, with wireless headsets.

And Google is winning here because they're winning at the core assistant technology that will drive it. Actually, Google and Amazon both.

Some will say Apple's privacy focus prevents them from collecting the data necessary to be competitive when it comes to assistants.

Bunk.

Apple essentially abandoned Siri after Steve Jobs died and Scott Forstall left, and they've only recently picked it up again with the hire of John Giannandrea.

There will be room for multiple assistants in the future, and owning the device means owning the default assistant, with the opportunity to re-onboard users at every new hardware and software release.

But, if we're going to get to SiriOS, a voice and AI layer that abstracts away a lot of everything else, there's precious little time to waste.

Siri is still inconsistent between devices and from one moment to the next, and just like Apple invested heavily in getting to the best silicon in the world last decade, they need to invest even more heavily in getting to the best AI in the world right now.

I mean, as long as I'm spending their money and resources.
