Siri Shortcuts and the Shortcuts App

Kicking off my collection of writing on WWDC 2018, I’m going to talk about Siri Shortcuts and the Shortcuts app. As soon as I saw them, I knew they would be among my favourites from the whole event.

The announcement was received by most people as “now we know what the Workflow team have been up to”, and I’m not complaining; I posted the same thing myself. It’s probably the best way the Workflow acquisition could have gone, because now it’s completely tied into the OS. It may have a different name, but it will always be Workflow.

The features announced were really about how the OS interacts with shortcuts, and how Siri becomes more intelligent because of them. Not the voice Siri, but the computational Siri that can understand you and suggest things.

It will, of course, require developers to expose the different user actions in their apps, which will allow Siri to analyse how they’re used, suggest them later on, and maybe also let users build with them in the Shortcuts app.
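To make that a bit more concrete, here’s a rough sketch of what “exposing” an action looks like on the developer side, as I understand it from the beta: donating an NSUserActivity that Siri can learn from and later suggest. The activity type, titles, and phrase below are made up for illustration, and details may well change before release.

```swift
import Intents
import UIKit

// Hypothetical "order coffee" screen: after the user completes the
// action, donate it so Siri can learn the pattern and suggest it later.
final class OrderViewController: UIViewController {

    func donateOrderActivity() {
        // The activity type and strings here are placeholders.
        let activity = NSUserActivity(activityType: "com.example.coffee.order")
        activity.title = "Order my usual coffee"
        activity.isEligibleForPrediction = true          // allow Siri suggestions (iOS 12)
        activity.suggestedInvocationPhrase = "Coffee time"

        // Attaching the activity to the responder chain and marking it
        // current is what actually donates it to the system.
        userActivity = activity
        activity.becomeCurrent()
    }
}
```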

There were some intriguing demos of the suggested actions, which isn’t something I say often, because demos are usually based around unrealistic or ideal situations. But because Siri is in your phone, it knows about you, what you’re like, and the environment around you, which is why it can suggest turning on Do Not Disturb when you go to the cinema.

They also showed an example of a regularly occurring event, such as buying a coffee in the morning. Maybe not everyone buys a coffee from an app on their phone every day, but I use the Starbucks app every time I go, and that could easily be at least three times a week. So if Siri learned (or just used Maps to find) the location of Starbucks and recognised it was associated with that action, that would be very helpful! It’s certainly something I feel it would be capable of, and unlike the usual Siri features, it isn’t something that’s nice to think about but never used.

It does get more advanced though, and that’s with the Shortcuts (Workflow) app. I conceptualise it as being similar to Scenes in HomeKit, where you can say a phrase such as “Good morning” and Siri will perform a bunch of tasks to set you up for the day. Maybe it sort of encompasses the automation side of HomeKit?
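From the developer sessions, the phrase side of this appears to surface as an “Add to Siri” screen that apps can present, so a user can record their own trigger phrase for a donated shortcut. Here’s a rough sketch of what that might look like; the view controller and activity type are placeholders, and this is just my reading of the beta APIs.

```swift
import IntentsUI
import UIKit

// Rough sketch of an "Add to Siri" flow: the user records a phrase
// (e.g. "Good morning") that will trigger a shortcut this app donated.
final class SettingsViewController: UIViewController, INUIAddVoiceShortcutViewControllerDelegate {

    func presentAddToSiri() {
        // Reuse the same illustrative activity type as the donation above.
        let activity = NSUserActivity(activityType: "com.example.coffee.order")
        activity.title = "Order my usual coffee"

        let shortcut = INShortcut(userActivity: activity)
        let addShortcutVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
        addShortcutVC.delegate = self
        present(addShortcutVC, animated: true)
    }

    // The user recorded a phrase and saved the shortcut.
    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        controller.dismiss(animated: true)
    }

    // The user backed out without saving.
    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true)
    }
}
```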

I’ve already been playing around with the iOS 12 beta, and while I’ve been suggested some actions, like enabling an alarm, messaging my girlfriend, and even adding a new to-do in Things, we don’t have the Shortcuts app yet. That will come in a later update via the App Store, so I will definitely have to write more about it in the future. But from the keynote, it looks like they’ve added the Apple style to Workflow, which should make it feel easier to use for general users.

One of my questions, though, is how well suited this is to a general user. I will be very keen to see whether it becomes a widely adopted feature, and even if the Shortcuts app with custom actions doesn’t, I can see the Siri suggestions being a big hit.


Read more of my coverage of WWDC here.