Well, in this update, Text Case is now accessible via Siri! And you can also set a different accent colour in the app.
There’s not that much UI in Text Case, as it’s been intentionally kept rather simple. However, there are now six colours that will be used for the navigation bar, switches, and buttons:
With iOS 12 now released, Siri has become a whole lot more powerful, and you can expect quite a lot of apps to try to make use of it. Which meant I just had to add support to Text Case.
There’s only one function in Text Case, and that is to convert text into various formats, which makes it really easy to use. The chosen format is applied to any text you have copied, and the formatted text replaces it on the clipboard, ready for you to paste anywhere.
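As an illustration of the kind of transformation involved (this is a hypothetical sketch, not Text Case’s actual code, and these two formats are just examples), converting copied text into a format might look like:

```python
# Hypothetical sketch of clipboard-style text formatting.
# Neither function is Text Case's real implementation; they just
# illustrate the "take text, return a reformatted copy" idea.

def title_case(text: str) -> str:
    """Capitalise the first letter of every word."""
    return " ".join(word.capitalize() for word in text.split())

def snake_case(text: str) -> str:
    """Lowercase everything and join words with underscores."""
    return "_".join(text.lower().split())

clipboard = "hello siri shortcuts"
print(title_case(clipboard))  # Hello Siri Shortcuts
print(snake_case(clipboard))  # hello_siri_shortcuts
```

In the app, the input would come from the system pasteboard and the result would be written back to it; the transformation itself is the whole of the logic.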
In Settings, you will find the “Add to Siri” option at the bottom. Tap this, and you can then select a format and record a custom phrase.
That’s not the only way you can use Text Case with Siri. As you use the app, iOS will learn which formats you use, and begin to suggest them to you.
These suggestions will also appear in the new Siri Shortcuts app, where you will be able to automate them with everything else!
Kicking off my collection of writing on WWDC 2018, I’m going to talk about Siri Shortcuts, and the Shortcuts app. As soon as I saw it, I knew it would be one of my favourites from the whole event.
The announcement was received by most people as “now we know what the Workflow team have been up to”. I’m not complaining; I posted the same thing. It is probably the best way the Workflow acquisition could have gone, because now it’s completely tied into the OS. It may have a different name, but it will always be Workflow.
The features announced were really about how the OS interacts with shortcuts, and how Siri is more intelligent because of it. Not the voice Siri, but the computational Siri that can understand you and suggest things.
It will, of course, require developers to expose the various user actions in their apps, which will allow Siri to analyse their usage, suggest them later on, and perhaps let users build with them in the Shortcuts app.
There were some intriguing demos of the suggested actions, which is not something I often say, because demos are usually built around unrealistic or ideal situations. But Siri lives on your phone: it knows about you, what you’re like, and the environment around you. Which is why it can suggest you turn on Do Not Disturb when you go to the cinema.
They also showed an example of a regularly occurring event, such as buying a coffee in the morning. Maybe not everyone buys a coffee from an app on their phone every day, but I use the Starbucks app every time I go. And that could easily be at least three times a week. So if it learned (or simply used Maps to find) the location of Starbucks and recognised that it was associated with that action, that would be very helpful! It’s certainly something I feel it would be capable of, and unlike the usual Siri feature, it’s not something that’s nice to think about but never used.
It does get more advanced though, and that’s with the Shortcuts (Workflow) app. I conceptualise it as being similar to Scenes in HomeKit: you say a phrase such as “Good morning”, and Siri performs a bunch of tasks to set you up for the day. Maybe it sort of encompasses the automation side of HomeKit?
I’ve already been playing around on the iOS 12 beta, and while I’ve already been suggested some actions, like enabling an alarm, messaging my girlfriend, and even adding a new to-do in Things, we don’t have the Shortcuts app yet. That will come in a later update via the App Store, so I will definitely have to write more about it in the future. But from the keynote, it looks like they’ve added the Apple style to Workflow, which should make it feel easier to use for general users.
One of my questions, though, is how well suited this is to a general user. I will be very keen to see if it’s a widely adopted feature, and even if the Shortcuts app with custom actions might not be, I see the Siri suggestions being a big hit.
While all of these assistants can turn things on, turn them off, move things up and down, and such, they can only do those things now. I can turn on the lights now. I can open the garage door now.
It makes so much sense for this to be supported. Sure, you might be able to schedule actions inside an app. But if voice is an official method of input, you should be able to do everything with it.
There’s not even a particularly high barrier to creating a delay/schedule system. The simplest method I can think of is this: when a voice assistant hears a request with an associated time, it stores the exact request (even plain text is fine) along with the date/time. The system then sets its own reminder, and at that time it simply performs the request automatically and deletes it from the queue.
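The scheme described above can be sketched in a few lines. This is just a toy illustration of the idea (the class name and the two sample requests are mine, not any assistant’s real design): store the raw request text with its due time, then replay whatever has come due.

```python
import heapq

class RequestQueue:
    """Toy sketch of the delay/schedule system described above:
    keep (due_time, request_text) pairs in a min-heap, and hand
    back any request whose time has passed."""

    def __init__(self):
        self._queue = []  # min-heap ordered by due_time

    def schedule(self, request_text: str, due_time: float) -> None:
        """Store the request exactly as heard, with its date/time."""
        heapq.heappush(self._queue, (due_time, request_text))

    def due_requests(self, now: float) -> list:
        """Pop and return every request that is now due,
        removing each one from the queue as it is performed."""
        ready = []
        while self._queue and self._queue[0][0] <= now:
            ready.append(heapq.heappop(self._queue)[1])
        return ready

# Usage: two hypothetical spoken requests, stored as plain text.
q = RequestQueue()
q.schedule("turn on the lights", due_time=100.0)
q.schedule("open the garage door", due_time=50.0)
print(q.due_requests(now=60.0))   # ['open the garage door']
print(q.due_requests(now=120.0))  # ['turn on the lights']
```

A real assistant would replace the timestamps with parsed dates and feed each popped request back through its normal command pipeline, but nothing about the queue itself is complicated.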
Developers are previewing new Siri integrations for payments, messaging apps and more, creating new experiences using “Hey Siri” to simplify everyday tasks using just your voice. Siri can already help you send an iMessage to a friend, but with the introduction of SiriKit for developers, messaging apps can now tap into the power of Siri. You can use your voice to do things you couldn’t do before, like ask Siri to send a secure payment without ever opening an app.
I’m really happy to see Monzo (formerly Mondo) on the list; they’ve proven to be a really forward-thinking (and now officially a bank) service!
There are a few more previews in the article as well, which aren’t for payments, but show some future third-party integration with Siri.
With all of these integrations coming to Siri, it may finally become useful.