WWDC Week of Wants, Vol 6

Today's Ramblings: Siri

Apple was the first company to put voice control in a phone with the iPhone 3GS. Siri took over for Voice Control with the iPhone 4S, and was a vast improvement.

Then everything slowed down. Siri requires Apple to integrate each data source into the system itself. There are a few partners (Wolfram Alpha, Bing, Yahoo for weather, and some sports data providers), but beyond that, Siri is still limited to telling you what’s already on your phone.

Amazon’s Alexa has an API, and Google’s Home will offer one this fall. Siri needs an API. Allowing other developers to insert data into the conversational interface stream is imperative for Siri to be competitive.

Spotlight Meets Siri

Spotlight on the phone and on the Mac should be integrated with Siri. Saying “Show me Word documents from last week” should show me those documents. Asking “What’s the weather?” should show me the forecast. “What’s Taco Bell’s stock price?” should show me that. Siri is a system-level component that finds information. Spotlight is a system-level component that finds information. Why are there two places for this?
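Half of this unification already exists on the developer side: Core Spotlight (iOS 9) lets apps push their content into Spotlight’s on-device index. A Siri that shared that index could answer from the same data. A minimal sketch, with made-up identifiers and titles:

```swift
import CoreSpotlight
import MobileCoreServices

// Describe the item so Spotlight can display and rank it.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
attributes.title = "Q2 Budget.docx"
attributes.contentDescription = "Word document edited last week"

// Wrap it in a searchable item and hand it to the on-device index.
let item = CSSearchableItem(uniqueIdentifier: "doc-q2-budget",
                            domainIdentifier: "documents",
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

Everything an app indexes this way is already queryable by Spotlight; wiring Siri’s natural-language layer on top of the same index is the missing piece.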

Textual Healing

Why do I have to talk to Siri? Let me type, especially if/when Siri comes to the Mac. There are tons of situations where speaking out loud isn’t appropriate. I’m dreading the day people start talking to their computers in my open-plan office.

Hardware

There’s the Echo, and there’s Google Home. I want a Siri box in my house. In Volume 4 of this series, I talked about the eero system that I would love to see emulated with Siri boxes. I also wrote about wanting Siri to high-five Beats and make a Sonos-style system I can put all over the house. That’s the real dream.

Smarter Siri

Siri on the Apple TV seems smarter than the other Siris. There is a better chain of logic. The example I show people is “Show me James Bond movies,” then “Just the George Lazenby ones,” and I see On Her Majesty’s Secret Service. Siri remembers its place, and it’s great. Not so much on my phone.

Siri needs chain logic to remember its place (yes, I call Siri an “it”; the US-centric female voice is not universal). Siri needs to remember which app you use for what, and the APIs should support that. Saying “Add Walk the dog to my to do list” should elicit a response from Siri similar to “I have several apps for your to do list. Which would you like to use?” and when I say “OmniFocus,” it should remember that forever, or until I say “Siri, move my to do list to Reminders.”
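To make the idea concrete, here’s a purely hypothetical sketch of what that could look like. None of these types exist in Apple’s SDKs; this just imagines a third-party app registering as a to-do handler and Siri remembering the user’s choice:

```swift
// Hypothetical API sketch only. TaskIntentHandler and SiriTaskRouter
// are invented names, not part of any real Apple framework.

protocol TaskIntentHandler {
    var appName: String { get }
    func addTask(_ title: String)
}

struct SiriTaskRouter {
    private var handlers: [String: TaskIntentHandler] = [:]
    private var preferredApp: String?   // remembered across requests

    mutating func register(_ handler: TaskIntentHandler) {
        handlers[handler.appName] = handler
    }

    // Handles "Add Walk the dog to my to do list."
    mutating func handle(addTask title: String) -> String {
        guard let app = preferredApp, let handler = handlers[app] else {
            let names = handlers.keys.sorted().joined(separator: ", ")
            return "I have several apps for your to do list (\(names)). Which would you like to use?"
        }
        handler.addTask(title)
        return "Added to \(app)."
    }

    // Handles "OmniFocus" or "Siri, move my to do list to Reminders."
    mutating func setPreferredApp(_ name: String) {
        if handlers[name] != nil { preferredApp = name }
    }
}
```

The key design point is the stored preference: the disambiguation question gets asked once, and every later “add to my to do list” routes silently to the remembered app until the user explicitly moves it.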

Also, can we please name Siri something else? I use the British male voice and call my Siri “Jarvis.” But I can only do that anecdotally; Siri won’t answer to Jarvis. But it should.

Siri started the voice assistant revolution, but it hasn’t evolved much in five years. Siri 2.0 needs to be a huge leap in openness, functionality, and utility to have any hope of staying competitive. Apple’s notorious privacy stance already puts it at a disadvantage to companies like Google that have massive datasets for machine learning. I hope that Apple’s acquisition of VocalIQ last year really pays off in the next generation of OSes.