WWDC: The evolution of Apple’s Siri

News analysis
May 22, 2018 | 6 mins
Apple | Cloud Computing | Mobile

A look at the many Siri enhancements Apple has introduced at WWDC over the years.

HomePod lights
Credit: Michael Brown

Siri became a built-in element of iOS way back in October 2011, when Apple announced its inclusion in the iPhone 4S and iOS 5. Apple’s AI has seen numerous improvements since then, so I thought it would be interesting to explore the evolution of Siri at WWDC since launch.

WWDC is important for the evolution of Siri

Apple’s big developer conference isn’t just about wowing the crowd with consumer-focused improvements; it’s also about sowing new seeds developers can choose to use to build their own solutions and businesses.

In recent years, Apple has provided developers with new ways to integrate Siri into their products, and there’s no reason to think the company won’t continue to do so. That’s why it’s highly probable we’ll see enhancements to the service at WWDC 2018.

So, what Siri improvements has Apple introduced since 2011?

WWDC 2012: More database integrations and more

You may recall that Siri opened WWDC 2012. Available in iPhones for only a few months, Siri took a few moments to poke fun at Google and Samsung, before admitting its “emotions hadn’t been coded yet.”

Apple then announced a series of Siri improvements in iOS 6, mainly driven by access to new sources of data: sports scores, restaurants (with Yelp and OpenTable integration), and movie information through integration with the Rotten Tomatoes database. Siri also gained new capabilities, such as posting to social media, reading incoming messages and notifications, and opening apps.

Siri was now available in 15 languages and in hands-free mode. That international flavor extended to local search results, which worked outside of the U.S. for the first time.

WWDC 2013: A new design, better voice, more system controls

Following the 2012 ouster of Scott Forstall, Apple lost little time redesigning Siri’s interface. Beyond that change of image, Apple’s next collection of Siri improvements introduced more natural-sounding male and female voices.

Siri also gained the ability to perform more system-level actions, such as turning Bluetooth or Wi-Fi on or off, and the capacity to interact with more databases in response to search requests. Apple introduced what was then called “iOS in the Car,” an extension of Siri Hands-free designed to work with in-car systems.

WWDC 2014: Hey Siri, Shazam, more languages

The introduction of the now-familiar “Hey Siri” support was perhaps the biggest Siri-related news at WWDC 2014, matching the “OK Google” feature previously introduced in Google Chrome. This always-listening mode let you use Siri hands-free.

Apple also introduced the capacity to recognize music using Siri and Shazam, the ability to buy music through iTunes, and support for 22 new languages. Siri gained the ability to control HomeKit-enabled devices.

WWDC 2015: Faster and a little smarter

While Apple’s ability to introduce AI-driven proactive intelligence has been hampered by its desire to protect customer privacy, WWDC 2015 saw the company launch its own take on Google Now.

Siri became able to provide contextual advice based on what it learned about what you were doing. This meant it became more capable of automatically handling useful tasks, such as adding invitations to your calendar or figuring out who new phone numbers might belong to. Siri also began recommending news stories, music, and apps.

Another nice improvement meant a U.S. user could set Siri to speak in Australian English if they chose. Most importantly, Apple claimed Siri had become 40 percent faster at listening and responding to instructions and 40 percent more accurate than before. “An industry best,” it said.

WWDC 2016: Siri begins reaching out

Siri announced the dates of WWDC 2016 days before Apple’s official announcement.

As it transpired, the event saw the introduction of what developers had been hoping for since Siri hit the platform: the SiriKit API. This integration was limited to apps for messaging, phone calls, photo search, ride booking, personal payments, and workouts, as well as some in-car apps. (You could book a ride using your voice and Apple Watch, for example.)
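To give a flavor of what that first SiriKit release asked of developers, here is a minimal sketch of an Intents extension handling the messaging domain. The class name and the decision to resolve only the message body are my own assumptions for illustration, not Apple sample code, and a real app also needs an Intents extension target with the appropriate Info.plist entries.

```swift
import Intents

// A minimal sketch of a SiriKit Intents extension for the messaging domain
// introduced with iOS 10. Class name is hypothetical.
class MessagingIntentHandler: INExtension, INSendMessageIntentHandling {

    // Return the object that will handle the incoming intent; here, the
    // extension handles it itself.
    override func handler(for intent: INIntent) -> Any {
        return self
    }

    // Resolve the dictated message body before the intent is confirmed.
    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(INStringResolutionResult.success(with: text))
        } else {
            completion(INStringResolutionResult.needsValue())
        }
    }

    // Hand the confirmed intent to the app's own messaging code (omitted here).
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```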

Apple also announced its use of differential privacy, enabling Siri to analyze large quantities of customer data without eroding individual users’ privacy; this should prove a seminal technology for the future of machine learning on Apple’s platforms.
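Apple hasn’t published Siri’s exact mechanisms, but the underlying idea can be illustrated with the classic randomized-response technique. This is a textbook toy example, not Apple’s implementation:

```swift
import Foundation

// Toy illustration of randomized response, one of the classic differential
// privacy mechanisms. Noisy individual reports still yield useful aggregates.
func randomizedResponse(truth: Bool) -> Bool {
    // Half the time report honestly, half the time report a coin flip,
    // so any single report is plausibly deniable.
    return Bool.random() ? truth : Bool.random()
}

// Recover an estimate of the true "yes" rate from the noisy reports:
// P(report yes) = 0.5 * trueRate + 0.25, so trueRate ≈ 2 * observed - 0.5.
func estimateTrueYesRate(from reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * observed - 0.5
}
```

Apple’s production mechanisms are more sophisticated than this coin-flip scheme, but the principle of trading individual accuracy for deniability is the same.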

The assistant also gained new skills, including intelligent scheduling, integration with the QuickType keyboard, and the ability to react to your text conversations and make useful suggestions about what you might want to write next. Siri also became much better at searching through your photos. Apple also announced Siri support in macOS, added YouTube search to Siri on Apple TV, and widened HomeKit support.

Eagle-eyed Apple watchers noted the company began increasing Siri-related hiring at the end of 2016.

WWDC 2017: Siri becomes a core technology

Siri-related recruitment noticeably increased from the end of 2017 as the company embedded machine learning more deeply in its products.

That recruitment followed WWDC 2017’s introduction of big improvements to the proactive features first shown in 2015. On-device learning became a core component, and contextual analysis recommendations were also improved.

Developers got support for more domains (such as to-do lists, notes, and payments) in SiriKit, while Apple also introduced CoreML, an enabling technology developers can use to build and deploy smarter machine learning experiences.
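As a rough sketch of what CoreML made possible, the snippet below runs an image classifier through the Vision framework. The model file name and the surrounding function are hypothetical; a real app would bundle a compiled .mlmodelc it trained or converted itself.

```swift
import CoreML
import Vision

// A minimal sketch of on-device inference with Core ML and Vision (iOS 11+).
// "Classifier.mlmodelc" is a hypothetical compiled model bundled with the app.
func classify(_ image: CGImage) throws {
    guard let modelURL = Bundle.main.url(forResource: "Classifier",
                                         withExtension: "mlmodelc") else { return }
    let coreMLModel = try MLModel(contentsOf: modelURL)
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    // Vision handles image scaling and color conversion before inference.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        for observation in results.prefix(3) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```

Running the model on-device like this fits the privacy stance described above: no photo or audio data has to leave the phone for inference.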

Apple revealed that Siri was being used on over 375 million devices each month, in more countries and languages than any other voice assistant. Capitalizing on this international reach, the company introduced the capacity to use Siri to translate from English to Chinese, French, German, Italian, and Spanish. Siri’s voices became even more naturalistic in iOS 11, and voice recognition became more accurate.

The Siri-controlled HomePod speaker system was introduced at WWDC, though it didn’t actually ship until the following year, when the iOS 11.2.5 update introduced deeper music understanding, Apple said.

WWDC 2018: Ask Siri

The year that began with the introduction of Business Chat was punctuated with the addition of new Siri jokes.

WWDC 2018 will inevitably see Apple widen and improve its use of machine learning intelligence across its products, particularly in light of its hiring of Google’s former search and AI unit chief, John Giannandrea.

WWDC 2018 improvements seem likely to be focused on more intelligent on-device learning, better voice recognition, access to new datasets to deliver more refined search results, human augmentation, and improvements to the SiriKit API.

At the time of writing, a series of stories claimed Siri was hinting at big improvements to be introduced at WWDC 2018. These referred to a series of stock responses Siri had added in advance of WWDC 2017. Apple quickly changed those responses, proving one very important thing about Siri: Apple improves it throughout the year.

Google+? If you use social media and happen to be a Google+ user, why not join AppleHolic’s Kool Aid Corner community and get involved with the conversation as we pursue the spirit of the New Model Apple?

Got a story? Please drop me a line via Twitter and let me know. I’d like it if you chose to follow me there so I can let you know about new articles I publish and reports I find.