Over 880 days ago, on 24th April 2015, I received my original Apple Watch (recently dubbed the Series 0), and I have worn it every day since. I use it mainly for fitness and media playback control, but little else.
The main reason is that my Apple Watch was too slow to use most apps, and even the ones I did use had performance issues – for example, the timer in the Workout app struggled to update smoothly on screen.
watchOS 4 has been a huge improvement – performance seems much better, apps launch quicker, the interface is more fluid, and everything feels more reliable. My app Departure Board launches faster and updates its location much more quickly. I’d even argue that it has crossed the threshold from unusable to usable. watchOS 4 has really given a new lease of life to my aging, scratched watch.
Since I got my AirPods in December, I have lost the ability to control the volume from my headphones (I had been using a Jabra Revo Wireless for years), so I have been hoping the Now Playing screen would become available as a complication. The Now Playing screen lets you control the device volume with the Digital Crown. Having it as a complication would let me access it with a single tap, rather than pressing the side button, scrolling to find it, and selecting it.
With watchOS 4, Apple went one step further and displays the Now Playing screen whenever media is playing on your iPhone. As soon as I start media playback on my phone, the watch face is replaced with the Now Playing screen.
I was almost convinced that I would be upgrading to a Series 3 with LTE this year. However, these performance improvements may help me defer that purchase for another year, until the Series 4. In addition, my carrier, Three, is not yet supported, and as far as I know podcast playback over LTE is not available yet either.
Last week, Apple bought Workflow. I have listened to many podcasts on the topic this week, and there seems to be an option that was overlooked – the potential of SiriKit integration.
With iOS 10, Apple introduced SiriKit, which allows users to interact with apps without those apps presenting their own UI. The user speaks their intent request to Siri, which can then be handled by the app. The set of intents that can be handled in iOS 10 is very limited. However, I assume this will be opened up over time to allow all apps and any task to be supported by Siri, much like Alexa’s skills.
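As a rough illustration of what this looks like for a developer, here is a minimal sketch of a SiriKit intent handler using the messaging domain (one of the few iOS 10 supports). The protocol and response types are real parts of the Intents framework; the class name and the always-succeed behaviour are placeholders of my own:

```swift
import Intents

// Minimal sketch of a SiriKit intent handler, as it would live in an
// Intents app extension. Siri parses the user's speech into an
// INSendMessageIntent and hands it to the extension – no app UI involved.
// The class name is hypothetical.
class MessageIntentHandler: NSObject, INSendMessageIntentHandling {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would send the message through its own code here,
        // then report the outcome back to Siri.
        completion(INSendMessageIntentResponse(code: .success,
                                               userActivity: nil))
    }
}
```

An automation tool like Workflow could, in principle, invoke these same handlers programmatically rather than via speech, which is the possibility discussed below.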
As a first-party citizen of the platform, Workflow will be able to access the index of all intent handlers available across all installed apps. For example, my security camera app and personal budgeting app will never be supported by Workflow directly; they are too niche to be worth the Workflow developers’ time. However, once the developers of those individual apps add SiriKit support, actions like these become possible:
- get current view of my garden camera
- get last time a disturbance was detected in my hallway
- get remaining weekly budget
- add expense of £7.50 for lunch
This will make the possibilities of Workflow (or an Apple automation tool) so much more powerful than Workflow could ever have been as its own entity. All of these intents would also be available verbally through Siri, without a URL scheme used anywhere, as Federico hypothesised on this week’s Connected.
This approach to problems is not uncommon on iOS. Up until iOS 8, each developer who wanted to share beyond Messages, Facebook and Twitter had to support every single service individually in their app. With iOS 8 and the introduction of app extensions, individual apps could declare their ability to share to their own services. This allowed the service owner to create the sharing experience once and every other app to use it, leading to much more powerful and polished experiences. I can see the same happening with Workflow (or an Apple automation tool).