It's been over a year now since Siri launched alongside the iPhone 4S in October of 2011. When I first saw Siri, it seemed to have enormous potential as: 1) a natural language interface that may one day do to multitouch and graphical interfaces what they did to the command line; 2) thanks to that interface, a way for Apple to intermediate and broker search away from Google and towards partner content; and 3) by virtue of that intermediation and brokerage, a gateway into customer insight analytics.
On the client side, I've enjoyed the kind of results Siri delivers, both in terms of content and presentation, enough to wish Apple would: 1) hook it into Spotlight so I could still use it when talking would be impossible or inappropriate, or the natural language parser wasn't available; and 2) fix it so the natural language parser wasn't so frequently unavailable. (Purple-dot-purple-dot-purple-dot-nothing is the mouse only randomly getting its food.)
Since then, Apple has brokered deals for sports, restaurant, and movie knowledge bases in Siri, including the ability to start table reservations and, soon, movie ticket purchases right from within the service. However, also since then, Google has launched their competing Google Now service. And Google knows services the way Apple knows hardware and software. Google Now offers on-device voice parsing, Google's industry-leading backend infrastructure, and goes a step beyond Siri by attempting to predictively provide information and answer questions before you even ask them.
Now, Apple has started hiring people away from Amazon to help with the service and, in the wake of a management re-organization, Siri has been given to Apple's "fixer", senior vice-president Eddy Cue, to help set, or reset, its course going forward.
Because Siri is only as useful as its weakest server and slowest response, and both those things are going to need some serious attention.
Speed and reliability
It's tough to argue against the idea that the biggest problem Siri faced at launch, and continues to face today, is that it sometimes doesn't work, and oftentimes when it does, it's annoyingly slow. Part of that is due to the network. Literally everything you send to Siri needs to go to Apple's servers for parsing and back to your device before you get a response. That's certainly understandable if the result set includes information stored on the internet, but for local tasks like setting an alarm, it's a single point of unnecessary congestion and, all too frequently, failure.
Google switched to on-device voice parsing for Android 4.1, and that should be at the top of Apple's Siri list for iOS 7. Moving all of that on-device is no doubt non-trivial, but removing the burden of the cloud from where it's not needed has so many benefits that it's absolutely worth the effort. That way, not just setting alarms but anything involving local or locally cached data in apps, most especially dictation (say goodbye to purple-dot-purple-dot-purple-dot-nothing!), becomes not only nearly instantaneous, but immune to outages.
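To make the point concrete, here's a minimal sketch of the hybrid routing idea: intents that need no live internet data get handled on-device, and only the rest fall through to the cloud. The names and intent list here are entirely hypothetical, not Apple's actual architecture.

```python
# Hypothetical hybrid intent router: purely local tasks stay on-device,
# cloud-dependent ones go to the backend when the network allows.
LOCAL_INTENTS = {"set_alarm", "set_timer", "dictate_text", "play_song"}

def route_intent(intent, network_available):
    """Return which backend should service a parsed intent."""
    if intent in LOCAL_INTENTS:
        return "on_device"          # instant, immune to outages
    if network_available:
        return "cloud"              # sports scores, Wolfram|Alpha, etc.
    return "fail_gracefully"        # tell the user, don't spin forever
```

The key design choice is that a network outage only degrades the cloud-dependent intents instead of taking the whole assistant down with it.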
Sports scores, movie listings, Wolfram|Alpha queries, restaurant table bookings, and anything else that absolutely had to hit the internet would still be slower and riskier, but even local map and point-of-interest data could be cached locally, greatly reducing the dependency on Apple's backend. And about that backend...
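The local caching idea can be sketched in a few lines: check a locally cached copy first, and only hit the backend when the cache is cold or stale. Class and parameter names here are illustrative, not any real Siri internals.

```python
import time

class POICache:
    """Toy local cache for point-of-interest lookups with a time-to-live."""

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self.store = {}  # query -> (timestamp, result)

    def get(self, query, fetch_remote):
        entry = self.store.get(query)
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]                  # served locally, no network hit
        result = fetch_remote(query)         # fall back to the backend
        self.store[query] = (time.time(), result)
        return result
```

Repeat queries within the TTL never touch the network, which is exactly the dependency reduction described above.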
The elephant in Apple's room, the wrench in their reliability, is their server-side infrastructure and its glass jaw. Siri and its issues since launch are just one example. Game Center has infamously gone down, perhaps due to the launch of just one popular game. iMessage has had its ups and downs. So has the App Store (in fact, as I write this, App Store downloads aren't working). iOS 6 Maps feels more like a data aggregation, cleansing, and quality assurance issue than an infrastructure one at this point; I haven't seen maps "go down". The Apple Online Store has to go offline simply to be updated (even if there's marketing value to a stunt like taking the store down, there's real-world value to live updates on ecommerce engines).
Google and other competitors like Facebook and Amazon come from the clouds. Their infrastructure isn't as old as Apple's WebObjects past, and has been their singular focus since their respective launches. As good as Apple is at hardware and software, that's as good as Google, Facebook, and Amazon are at the data centers, servers, and services that comprise their clouds.
For Apple to re-create their backend architecture in a way that's more modern and advanced, or even as modern and advanced, as Google's, Facebook's, and Amazon's will be non-trivial. One look at Microsoft's valiant efforts to date in that respect shows just exactly how non-trivial it is to turn an old, stubborn aircraft carrier into a new, shiny helicarrier.
Maybe Apple is already doing that. They're slowly but surely pushing their Objective-C based development platforms forward; maybe they're doing the same thing with their cloud infrastructure. Maybe something just as good as what runs on Macs and iOS devices is being worked on to run iCloud and all of Apple's ancillary services.
If not, however, there should be. And soon. And with massive, billion-dollar efforts spent not on data centers alone but on the next generation of software to run them.
Google, Facebook, and Amazon are buying up apps, developers, and designers to address their cultural weaknesses, and the Sofa, Sparrow, Snapseed, and other teams are hard at work making sure every new generation of native app they release is less embarrassing than the last. And it's working.
Apple has a much harder problem to fix, but that just means they have to work harder at fixing it. Whether it's buying Nuance or former OS X head Bertrand Serlet's new startup (if what it does is even appropriate), or raiding Google, Facebook, and Amazon (again) for every cloud engineer they can, they need to get it done.
Otherwise Game Center, App Store, iTunes, iMessage, iCloud, and yes, Siri will suffer.
While engineers and architecture are vitally important to Apple, APIs are what matter to developers. And developers have wanted Siri APIs since they first saw it announced at the iPhone 4S event. And it still seems unlikely.
Guy English of Kicking Bear explained how Apple's internal secrecy made even the integration of an Apple app, Find my Friends, convoluted. Here's the crux (but read the whole thing, the linen-play is killer):
Guy also talks about hand-off collisions in the first episode of Debug, where different apps offer up potentially overlapping knowledge sources, and the Siri AI has to try to figure out which one gets what and when. A pop-up requester, the kind Siri already uses to offer up different contacts or locations, could handle the obvious stuff, but not everything is obvious. Good natural language parsing is all about subtlety, context, and yes, nuance.
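One way the arbitration could work in principle: each registered knowledge source scores its own confidence for a query; a clear winner handles it automatically, and ties bubble up to the user as a pop-up requester. This is entirely hypothetical; Apple has published no such API, and the margin threshold is made up for illustration.

```python
def arbitrate(query, providers, margin=0.2):
    """providers: list of (name, score_fn) pairs. Returns a single
    provider name when one wins by a clear margin, otherwise a list
    of candidates to present to the user in a pop-up requester."""
    scored = sorted(((fn(query), name) for name, fn in providers),
                    reverse=True)
    best_score, best_name = scored[0]
    runner_up = scored[1][0] if len(scored) > 1 else 0.0
    if best_score - runner_up >= margin:
        return best_name                       # obvious case: auto-route
    return [name for score, name in scored     # ambiguous: ask the user
            if best_score - score < margin]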
A Siri API wouldn't just have the potential for conflicting app hand-offs, but for conflicting with Apple's partnership deals. That goes back to Apple using Siri as a way to intermediate and broker search. An API intermediates Apple. What value would a content deal have between Apple and Yelp, or Apple and OpenTable or Fandango or anyone else, if any app could hook into an API "for free"? Right now Apple seems to want to handle Siri access the way they handle Apple TV access, through closed partnering deals rather than open access.
That sucks for developers, and may or may not suck for users. Apple might feel controlling access provides a better, saner experience, even if many power users would disagree -- the classic conflict.
Either way, I'd argue fixing Siri's speed and reliability, fixing iCloud's backend infrastructure, and adding in predictive functionality should all be done way before Apple even considers taking on the responsibility of a Siri API.
Beyond speed and reliability, architecture and API, for end users Siri is still a mixed bag when it comes to functionality. Even with Apple's built-in apps, there's a lot of inconsistency. For example, Siri can compose both emails and messages, but can only read incoming messages, not emails. That Siri will tell you it can't do certain things shows the natural language and contextual parsing knows what you want to do; the ability to do it simply hasn't been implemented, turned on, or allowed. Over a year later, Siri still doesn't provide basic Settings toggle functionality.
Kontra, questioning whether Siri is Apple's future on Counternotions, points out the advantage that contextually aware, targeted search has over Google's traditional, linear search algorithms. Here's an excerpt, and again, go read the whole thing:
This is even true with the excellent Google Search iOS app. In terms of speed and, so far, reliability, it positively schools Siri. Yet it remains trapped in Google's traditional search paradigm.
But not so Google Now. In my experience Google Now isn't (yet) the contextual equal of Siri, but it does something Apple hasn't (yet) been willing to do with Siri: predictive response.
The idea isn't new. Roger McNamee, back when Elevation Partners still owned Palm, pitched the idea that your phone, because it knows where you are, what you have scheduled, and who your contacts are, could alert you if traffic became such that you could no longer make your meeting downtown, and prepare messages to send the people you were supposed to meet to excuse your tardiness. Instead of a static alarm, it could remind you to leave for a meeting only a few minutes ahead of time if it was down the hall, or hours ahead if it was across town and there'd been an accident. Rather than you asking about the weather, it would know you had a trip planned for Banff and, when snow started to fall, alert you to bring your jacket. It could anticipate, and instead of making you ask for information, it could bring the information to you.
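The mechanics of that pitch are simple arithmetic. A toy version, with made-up inputs (the hard part in reality is the live travel-time estimate, not this math):

```python
from datetime import datetime, timedelta

def leave_alert_time(meeting_start, travel_minutes, buffer_minutes=5):
    """When to fire the 'leave now' alert: travel time plus a small
    buffer before the meeting, instead of a static alarm."""
    return meeting_start - timedelta(minutes=travel_minutes + buffer_minutes)

def should_warn_late(now, meeting_start, travel_minutes):
    """True if, even leaving immediately, the user can't make it --
    time to offer pre-drafted apology messages to the attendees."""
    return now + timedelta(minutes=travel_minutes) > meeting_start
```

Feed `should_warn_late` a traffic-adjusted estimate and you get McNamee's "you're going to be late, want me to tell them?" behavior.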
And forget asking to have Wi-Fi or LTE toggled off or on. It could know when you entered or left a trusted network or planned location and just do it for you. Not to mention, "it's 48 hours until your anniversary, dumbass, and you haven't made dinner reservations yet, would you like to see a list of romantic restaurants with seating for 2 still available?"
Add automatic search widening to that as well, so if the perfect restaurant is 11 miles away instead of 10, or there's nothing Italian available but there is something French, you don't get zero results back, and the future starts to become much more interesting and convenient.
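Search widening is easy to sketch too: rather than return zero results, progressively relax the radius, then drop the cuisine filter before giving up. The data shape and thresholds below are invented for illustration.

```python
def widen_search(restaurants, cuisine, max_miles, radius_step=5, cap=25):
    """restaurants: list of (name, cuisine, miles) tuples. Widens the
    radius in steps; if the cuisine never matches, falls back to any
    restaurant within the original radius instead of returning nothing."""
    radius = max_miles
    while radius <= cap:
        hits = [r for r in restaurants
                if r[1] == cuisine and r[2] <= radius]
        if hits:
            return hits
        radius += radius_step
    # nothing in that cuisine at all: offer anything nearby instead
    return [r for r in restaurants if r[2] <= max_miles]
```

So the perfect Italian place 11 miles out still shows up when you asked for 10, and if there's no Italian at all, you at least get the French bistro down the street.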
Google Now is doing some of that already, and with a nice looking interface, and creepy as it is, it's convenient enough that many of us probably wouldn't be bothered by the privacy issues any longer than it took us to agree to the access requester.
While Siri was ahead of Google in terms of personal search, Google is getting ahead of Siri in terms of predictive search, and if it takes until iOS 7, presumably in the fall of 2013, for Apple to respond, Google Now will likely be even further ahead.
The bottom line
Whether it was the command line with the Apple II, the GUI with the Mac, or multitouch with the iPhone, Apple has been at the forefront of every major mainstream computing interface revolution in modern memory. They're not with Siri. Siri is the Apple I. The Lisa. The unreleased Safari Pad before the iPhone. Apple needs the Apple II, the Mac, the iPhone version of Siri, or they'll cede the next great interfaces to the likes of Google Now or Microsoft Kinect, or whatever else comes next.
Services have never been Apple's forte, so the coming revolution could well favor the competition. But that just means Apple has to be bolder and fight harder to win this next battle for the future.
(Seriously, and not trying to be a pain, but we talk about a lot of this with Loren Brichter of Tweetie and Letterpress fame in this week's episode of Debug so check it out if you haven't already.)
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.
Very thorough breakdown Rene. Siri has been a mixed bag for me. While I enjoy using it to compose and listen to texts while driving, a lot of its other features are just too inconsistent for me to use on a regular basis. More often than not, I find myself typing more than talking.
It's always had its ups and downs. For me, it's most useful when I'm going to the station, walking as fast as I can. It's just nice to have features within your voice's range, such as "What time is it?" or "Play <song title>". The only time Siri bugs me is when I pronounce a song title badly and it makes me repeat it again and again and again.
Great article. I personally like Siri and that is why Apple still has my vote. I have never liked the Android / Google interface. I hope Microsoft is able to catch up as I like the different twists they put on things. Kinect is awesome.
If you have honestly ever used Siri and the new Google voice search side by side, I don't think you could prefer Siri. I'm rooting for Apple to catch up, but as long as most of their resources are devoted to making thinner desktops rather than better software and services, I don't see it happening.
When the earlier article this week (http://www.imore.com/are-risks-apples-stock-serious-some-investors-think) spoke of concerns with Apple for the future, my immediate reaction was that those are the wrong concerns, and they were easily defended against. You hit on the right concerns to have. I know they've been mine in wanting to see Apple focus on its ecosystem.
Ritchie provides plenty of grist for the mill. So far this week we've had two great articles on Siri: this one and an excellent piece from Counternotions here: http://counternotions.com/2012/11/12/siri-future/. Both are excellent reads and worthy of archival.
I've found I mostly use Siri to set reminders, and she does pretty well at that, although I still can't get her to set a reminder for the location of just a city - "Remind me to get gas when I arrive at Roseville, CA" doesn't work; she can't figure out what I mean. And my wife really enjoys using Siri to play songs - "Play songs by Blake Shelton" is easier than navigating to the Music.app, then to the Artists tab, then to the artist she wants to play. I'm really looking forward to future enhancements to Siri.
One question I have always had about Siri is how it corrects issues with its voice recognition. There is a restaurant chain in the Los Angeles area called "El Pollo Inka". I was able to find it via Voice Search and VLingo on my DroidX for at least a year and a half. When the 4S came out I had my friend try to find it, and Siri totally mangled the name and came up with all sorts of weird translations. My 5 does the same thing. If Siri thinks it got what you asked, how will it ever learn that it's wrong? There doesn't seem to be any way to report bugs like this to Apple. I hope they do something to correct situations like this in the future, such as adding a way to report problems.
You can tap on the text that Siri thought you said and manually correct it. I'm not sure if this notifies Apple to update Siri's recognition algorithm though (I would hope so).
What are the reasons (if any) why throwing some of the billions they have in the bank at these problems won't help solve them? They are in a position, with their massive resources, where I don't understand why they're still struggling to improve Siri! What problems are there that money can't help solve?
The biggest problem is finding the world-class talent needed to build and develop AI and contextual services that get to the level of (or beat) Google. The very best either already work at Apple, at Apple's competitors, or at start-ups. That's on the software / services side. Then there's the hardware side, which is the services infrastructure. I was listening to an episode of the Hypercritical podcast just recently, and in there John Siracusa was mentioning how Apple's cloud infrastructure is built using third-party solutions like MS Azure and Amazon. He said what Apple needs to do is follow in Google's footsteps and make cloud a core competency by hiring the right people to build a "server culture" within Apple and build their own custom cloud infrastructure. This is the same philosophy Apple uses to build their devices - a do-it-yourself mentality. Doing cloud services / infrastructure is very hard and this is where Google & MS have a leg up on Apple (even Facebook, but they don't really compete with Apple).
Siri and I definitely have a love hate relationship. lol
Excellent article, Rene. Love/hate Siri as well.
I don't use Siri and IMO it just sounds like a robot. It would be a lot better if it sounded like a human. Plus there's no point in using it if it doesn't work the way it should.