Recently there’s been quite a buzz — and new evidence — about a close partnership between Apple and Nuance. Speech technology is not new, and neither is the fact that computer operating systems have incorporated it to some extent (mainly text-to-speech, or TTS). Remember when Steve Jobs introduced the very first Mac on January 24, 1984? He let the Mac greet the audience itself (see the 3:30 mark):
The robotic TTS doesn’t sound as sexy as Mac OS X’s Agnes, Kathy, Vicki, or Victoria, but the crowd went wild nonetheless. Back then it was pretty impressive for such a tiny computer to be able to speak. Even today, Apple has stayed true to its core philosophy of delivering the most user-friendly products, and part of that means adopting I/O interfaces that are natural to humans: speech, handwriting, and gestures.
Speech has been part of the Mac since that 1984 introduction, and it lives on in Mac OS X and, to a more limited extent, in iOS as Voice Control.
Handwriting recognition (or, as some may ridicule, the lack thereof) was one of the highlighted features of the Apple Newton, the grandfather of PDAs introduced in 1993. The technology is present in Mac OS X and iOS as well — for example, I’m able to input Chinese characters by writing on the trackpad or iPhone screen.
Gesture input became prevalent with the iPhone. We’re all familiar by now with the tapping, swiping, and pinching gestures used on the iPhone and other smartphones. Apple has even brought these gestures to Macs with multi-touch trackpads. This input method is by far the most natural — even toddlers “get it.”
But I digress. This article is about how the Apple-Nuance partnership could impact customer service technologies.
So, what really caught my attention was a piece from TechCrunch:
In digging into the information about the relationship between the two companies, we had heard that Apple might actually already be using Nuance technology in their new (but yet to be officially opened) massive data center in North Carolina. Since then, we’ve gotten multiple independent confirmations that this is indeed the case. And yes, this is said to be the keystone of a partnership that Apple is likely to announce with Nuance at WWDC next month.
More specifically, we’re hearing that Apple is running Nuance software — and possibly some of their hardware — in this new data center. Why? A few reasons. First, Apple will be able to process this voice information for iOS users faster. Second, it will prevent this data from going through third-party servers. And third, by running it on their own stack, Apple can build on top of the technology, and improve upon it as they see fit.
We already know Apple is interested in better speech tech: it bought Siri and has been recruiting speech engineers. If TechCrunch is right, this would be Apple’s first foray into cloud-based speech technology. The company certainly has the money and know-how to grow another massive online service alongside iTunes, the App Store, and MobileMe.
Proven Nuance speech technology hosted on Apple’s massive infrastructure? That’s a dream come true for a customer service app developer!
A company with a speech-enabled IVR could develop an iOS app that’s also speech-enabled, without burdening its own speech servers. If the app delivers a better user experience through speech, imagine how many calls to the IVR could be eliminated. Phone calls and speech licenses are expensive, so divert those interactions to the app, carried over the user’s own monthly data plan and handled by Apple’s servers in N.C.
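The economics of that diversion can be sketched with a quick back-of-envelope calculation. All the figures below (call volume, deflection rate, per-call and per-session costs) are made-up placeholders for illustration, not real telephony or licensing rates — plug in your own numbers.

```python
# Back-of-envelope estimate of IVR call-deflection savings.
# All figures are illustrative assumptions, not actual pricing.

def monthly_savings(calls_per_month, deflection_rate,
                    cost_per_ivr_call, cost_per_app_session):
    """Monthly savings from diverting a fraction of speech-IVR
    calls to a speech-enabled mobile app."""
    deflected = calls_per_month * deflection_rate
    ivr_cost_avoided = deflected * cost_per_ivr_call
    app_cost_incurred = deflected * cost_per_app_session
    return ivr_cost_avoided - app_cost_incurred

# Example: 100,000 calls/month, 20% deflected to the app,
# $0.75 per IVR call (telephony + speech licensing),
# $0.05 per app session (hosting/bandwidth).
print(monthly_savings(100_000, 0.20, 0.75, 0.05))  # → 14000.0
```

Even at a modest 20% deflection rate, the hypothetical numbers above yield $14,000 a month in savings — and that’s before counting the speech-port capacity freed up during peak call times.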
Plenty of companies already have their own iOS apps, but few are focused on customer service. It’s time to think about the next version of the Company App — with speech.
Of course, this all hinges on Apple opening those speech services to developers. All the more reason to keep an eye on this year’s WWDC announcements — that is, if you’re passionate about better customer service, speech tech, and mobile apps.