» Twitch Extensions & musing on interactive livestreams

This piece, “Inside the Future of Twitch: Watching Is the New Playing,” caught my attention this weekend, with Michael Andronico of Tom’s Guide covering the ways Twitch is changing video games and livestreaming thanks to deeper interactivity from viewers.

Twitch for iPhone

Watching people game online has been around for years, but I hadn’t thought much about where platforms like Twitch and YouTube Gaming could take the experience next.

Not only will people be viewing, but they’ll be part of the gameplay too and have more to do while they’re tuned in thanks to Twitch Extensions:

As of this writing, there are roughly 150 Twitch Extensions, and according to Twitch, more than 2,000 developers have signed up to create more.

Some extensions consist of simple stat overlays that let you get a better look at a streamer’s performance in games like Fortnite and Destiny 2. Others, like Darwin Project’s Spectator Experience, allow viewers to become active participants in the games they’re watching. But they all share the common goal of making Twitch more than just a place to seek out passive entertainment.

“I think, at the end of the day, we want every game to have an official extension,” Shevat said, adding that a lot of the content you see on a streamer’s Twitch page — including links to social media channels and personal websites — will become more interactive over time.

There are already a few live examples of these types of add-ons, including a Spotify extension that lets you see what music a broadcaster is rocking or an Amazon extension that makes it easy to buy your favorite streamer’s preferred PC parts right from their channel.

The most intriguing part comes at the end, where he frames playing with interactive viewers against the progression of computers up to now (emphasis mine):

“There is — and this is a very conservative approximation — 20 times more people watching people play, than people playing any game,” said Darveau.

“Playing without viewers involved will eventually feel like it does nowadays when you go on a computer and there’s no internet.”

Anyone up for a Workflow livestream?

Writing about Workflow (and HomePod) on the Web

I’ve had the privilege to write for The Sweet Setup the last few months and now iMore, so I wanted to share some of the links here.

Primarily I’ve been writing about Workflow, trying to get some of the ideas in my head out into the world so other people can take better advantage of the app – especially now that it’s free. But I’m also dabbling in product reviews & photography, a new challenge that’s proving lots of fun and hard work.

Things for task management

I started by writing about three workflows for Things templates, meant to act as quick ways to copy items into the task management app. They’re also good examples of using the Split Text action.

I followed it up with a deep dive into Things for iOS’s new URL scheme, which enables a huge set of automation capabilities for optimizing the capture and review processes for my productivity system. I tried to write about it in a way that people new to deep linking and automation might be able to learn as they go, partly echoing the way I wrote the documentation for Workflow.

HomePod in the house

After that I did my first product review, trying to capture the experience of what it’s like to own a HomePod and use it with Siri in the house. I also produced 30 photos for the review, taking way too much time but leaning into my other side business of product photography.

I really enjoyed taking the time to think about how the new product category fits into a consumer’s life, and I’m hoping HomePod gets better soon because I want to push it further. I’ve got a few articles in production about how I use HomePod beyond the practical parts of using the smart speaker, and I’m eagerly waiting for AirPlay 2.

How-To’s for Workflow

Since then I’ve published two more articles for The Sweet Setup, starting by explaining how to set up your workflows to operate across both the widget and the share extension. I explained a bit about my “input check” method using the If action, which leads nicely into the second article about Using device details with Workflow.

In there I shared a cool Brightness by Battery workflow1 that dims your screen according to your power level, and a few others for tweaking your system settings programmatically. These are great for using with Run Workflow in the middle of other workflows, like little mid-automation widgets you can reuse across your different workflows.
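The Brightness by Battery idea boils down to one mapping. Workflow expresses it with the Get Battery Level and Set Brightness actions; here is a small Python sketch of the same logic, purely as an illustration (the function name and clamping behavior are my own, not part of the workflow):

```python
def brightness_for_battery(battery_percent: float) -> float:
    """Return a screen brightness level (0.0-1.0) matching the battery level.

    Mirrors the Brightness by Battery workflow: the dimmer your battery,
    the dimmer your screen. The clamp just guards against odd inputs.
    """
    battery_percent = max(0.0, min(100.0, battery_percent))
    return battery_percent / 100.0
```

In Workflow itself this is a Calculate action (battery level divided by 100) feeding Set Brightness, which is what makes it handy as a reusable mid-automation piece.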

Finally, today I published my first post for iMore detailing step-by-step instructions for 5 different workflows related to the Reminders app. I show you where to find the actions, explain how to place the actions & tweak the parameters to get it right, and include links to each of mine so you can get them yourself and follow along.

It’s interesting listing out steps this way, and the documentation probably could have used some similar formatting to make it easier to scan.

In that spirit, here are the articles mentioned above (links will open in a new window):

If you’d like to see me cover more or different topics, let me know on Twitter at @mattcassinelli and I’ll add it to my notes.

  1. I honestly may have gotten this idea from somewhere else, so if you know more don’t hesitate to let me know and I’ll credit the originator. 

Controlling your HomePod volume with iTunes and a simple Mac app

If you’ve picked up Apple’s HomePod in the past few weeks and tried to use iTunes on your Mac to AirPlay something to the speaker, you probably got blasted with the music playing at full volume.

This occurs because HomePod uses iTunes’ in-app volume slider to adjust its levels rather than your Mac’s system volume, and iTunes is usually at 100% because the hardware keys are used to control the computer’s overall sound instead1. Plus, if I want to change the volume on HomePod after the music starts, I have to go into iTunes and drag the slider – you can’t turn it down that quickly.

Screenshot of iTunes Volume Control running in a Mac menu bar

To get around this, I installed a Mac app called iTunes Volume Control that’s available on GitHub. Created by Andrea Alberti, it’s an app that lives entirely in your menu bar and changes the Mac’s hardware volume keys to control iTunes instead. When it’s running, it can entirely take over mute, volume up, and volume down – or, you can set it so you have to hold a modifier key like Command before hitting the keys. I use the latter option, so I can control my Mac volume with the keys normally and then use ⌘ + or ⌘ – to adjust iTunes when I need to.

Once you’ve installed the app, you’ll find it’s a much better experience playing music from iTunes with HomePod as your speaker. I set iTunes Volume Control to launch at login, so it’s basically always running when I use my computer and I never have to turn it on when I need it2. I’ll usually open iTunes, use ⌘ – to turn down the volume, then pick my song and AirPlay to my HomePod.

iTunes Volume Control also provides an option to change the step size for each press, so the volume can be changed in more specific intervals – you can set it to go up 3% each time, for example, rather than the default 10% at a time. This gives you fine-grained control of the HomePod volume, right from your keyboard.3

I could see improving this setup with AppleScript – you could set up a command that launches iTunes already set to 30% volume and ready to AirPlay to the HomePod, avoiding the setup process each time you want to listen from your Mac on your HomePod. However, I have no experience there and that’s a project for another day.
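For anyone who wants to explore that idea, here’s a rough sketch of the volume half of it: a tiny Python wrapper that shells out to macOS’s osascript tool with an AppleScript one-liner setting iTunes’ in-app volume. This is an assumption-laden illustration, not a tested setup – picking the AirPlay target by script is a separate (and fiddlier) problem I’m leaving out.

```python
import subprocess


def itunes_volume_script(volume_percent: int) -> str:
    """Build the AppleScript one-liner that sets iTunes' in-app volume."""
    return f'tell application "iTunes" to set sound volume to {volume_percent}'


def set_itunes_volume(volume_percent: int) -> None:
    """Run the snippet via osascript. Only works on a Mac with iTunes."""
    subprocess.run(
        ["osascript", "-e", itunes_volume_script(volume_percent)],
        check=True,
    )
```

Saved as a script and run before you hit play, this would drop iTunes to a sane level without touching the slider.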

The best part of this setup is that iTunes Volume Control is entirely free to download and use. Check out the documentation first, but use this link to get the app and start controlling your HomePod from your Mac.



  1. Instead of adjusting the levels in iTunes and on your Mac separately, it’s much more common to leave iTunes at 100% and change the volume on the whole computer instead. 
  2. I normally hide it in the menu bar using Bartender, so I can click on the Bartender icon to reveal it but keep it away from view otherwise. 
  3. I do the same thing with HomePod normally by using my Apple Watch. Once you change the source in Control Center on your iPhone to the HomePod, the Now Playing controls show up on Apple Watch and let you control the smart speaker from your wrist. 

Quickly Saving Web Pages to my Notes

I’ve been doing more research on iOS lately as my iPhone is the device I use the most, so capturing full web pages quickly saves me a lot of time. While I really like Apple Notes’ latest iterations, it’s not easy to clip websites there – so I adopted Bear for notes, which has support for Markdown, images, and a handy Get URL function.

Bear’s ability to download websites as a note is killer, but for most people it’s only easily available via the action extension. Rather than limiting my access to the share sheet, I’ve been taking advantage of the Workflow action Get Bear Note From URL1 to save web pages from anywhere on iOS.

I use a workflow called Save Page to Bear either from the action extension, or by copying a link and running it from the app, the widget, Spotlight, or Launch Center Pro. I choose which way to start the workflow depending on the moment, so it’s designed to accept different types of inputs even if it’s started in a different spot2.

I usually add this flexibility to my workflows by counting whether there’s a Workflow Input to determine where it’s being run – using Count and If input is less than 1, then Get Clipboard otherwise Get Variable > Workflow Input.

If the workflow is run as an action extension, there will be content coming in as the Workflow Input and Count will return 1, so Get Variable retrieves that input and passes it along. However, if it’s not run from the action extension there won’t be any input, so the Count will be 0 and the workflow grabs your clipboard instead.
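The input-check pattern above is easiest to see written out. Here it is as a short Python sketch – the function and parameter names are mine, standing in for Workflow’s Count, If, Get Clipboard, and Get Variable actions:

```python
def resolve_input(workflow_input: list, clipboard: str) -> str:
    """Mimic the Count/If input check described above.

    The action extension hands the workflow one or more input items;
    every other launch point (app, widget, Spotlight) hands it none.
    """
    # Count action: how many items did the workflow receive?
    if len(workflow_input) < 1:
        # No input: launched from the app/widget/Spotlight, so fall back
        # to whatever link was copied to the clipboard beforehand.
        return clipboard
    # Input present: launched from the action extension, use it directly.
    return workflow_input[0]
```

Once a workflow starts with this check, the rest of its actions can stay identical no matter where it was launched from.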

Workflow’s Content Engine will intelligently extract any links from whichever content is output by the End If action, since the Create Bear Note from URL action is only set to accept URLs as input3. Bear will download the web page and its images into a note, then return to Workflow with the unique identifier for that Bear note.
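As the footnote mentions, under the hood that action is just calling Bear’s x-callback-url scheme. A rough sketch of the URL it builds might look like this – the exact callback parameters Workflow passes are an assumption on my part, but the grab-url endpoint and url parameter come from Bear’s scheme:

```python
from urllib.parse import quote


def bear_grab_url(page_url: str) -> str:
    """Build a Bear x-callback-url that saves the given web page as a note."""
    # Percent-encode the whole page URL so its slashes and query string
    # survive being nested inside another URL.
    return f"bear://x-callback-url/grab-url?url={quote(page_url, safe='')}"
```

Opening that URL on iOS hands the page to Bear, which downloads it and calls back with the new note’s identifier.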

The workflow places that unique ID into the template for Bear note links, then copies the new deep link to my note to the clipboard in case I want to save it elsewhere like in the notes of a Things task.

Now, if I want to grab a web page and save it to my notes, I can either:

  1. Share a link from the extension
  2. Copy a link and search for the workflow from Spotlight
  3. Copy a link & run the workflow from the widget
  4. Copy a link and run the workflow from Launch Center Pro

Try the workflow yourself: Save Page to Bear.

Quick Links:

  1. This action really just uses the Bear URL scheme for /grab-URL, which you can learn about here.
  2. Apple Watch workflows can’t display custom UI, so they’re usually uniquely designed to run on the watch instead of integrated like the rest. I bet I could devise a method for detecting whether it’s run from the watch and change behavior if so.
  3. Tap on the icon of any action in Workflow to see more details, like the description, what types of content it accepts and outputs, and any unique characteristics of the action or its parameters.







Get real-time BART departures in your widget

(Image modified from original source: https://goo.gl/yuBhyf)

Getting around on BART is one of the great benefits of living in the Bay Area. Whether you’re in San Francisco, headed across the Bay to Berkeley or Oakland, or coming in from close by, many choose to take the train instead of toughing it out in the nation’s second-worst commute.

Getting rapid transit info

When I decide to take BART, I pick a nearby station, look up when the next train arrives, then decide when to leave based on how long it takes to walk there. Usually I just want to know if I need to leave now or if I should take the next train.

But BART doesn’t have an app, and their mobile website is too slow to navigate quickly. You can access the information in Apple Maps, but the data takes too many taps to retrieve: the Maps Transit widget only shows the “status” of your favorite transit lines, it takes a few steps to actually reach the station data, and then arrivals are displayed as a time of day instead of the minutes remaining.

Station data is buried a few layers deep in Apple Maps

With no quick way to view the data how I want, I built a tool to handle it myself – using Workflow, the free automation app bought by Apple earlier this year.

Enter Workflow

BART has their data freely available online, so I put together a workflow that lets you choose from the 46 stations in the Bay Area and get real-time arrivals sent right to your phone. Workflows combine a series of actions on your device into automation tools, like IFTTT but living on your phone and unlocking more of its capabilities as a computer in your pocket.
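For the curious, the request behind a workflow like this is simple. BART serves real-time estimated departures from its open API; this Python sketch builds the URL the workflow would fetch. The demo key below is the one BART lists publicly in its developer docs – swap in your own registered key for anything serious, and note the station code (“EMBR” for Embarcadero) is just an example.

```python
from urllib.parse import urlencode

BART_ETD_ENDPOINT = "https://api.bart.gov/api/etd.aspx"


def bart_departures_url(station_code: str,
                        api_key: str = "MW9S-E7SL-26DU-VV8V") -> str:
    """Build the URL for real-time departures at one BART station, as JSON."""
    query = urlencode({
        "cmd": "etd",          # "estimated time of departure" command
        "orig": station_code,  # four-letter station abbreviation
        "key": api_key,
        "json": "y",           # ask for JSON instead of the default XML
    })
    return f"{BART_ETD_ENDPOINT}?{query}"
```

Fetching that URL returns each upcoming train with its destination and minutes-until-departure, which is exactly what the workflow formats and shows in the widget.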

You can put the workflow in the widget area of your Notification Center too, so to use it you just swipe over from the Home Screen, tap the workflow, choose a station, and it will display departure times right there without even opening an app.

See up to 24 workflows in the widget by tapping “Show More”

I’ve entered the codes for every station into the workflow, so you can delete the ones you don’t need and set up a shortlist of favorites. The workflow may look complicated, but don’t worry – just run it and it should work great.

Now, whenever you’re looking to catch BART, you can check departures for your favorite station in just a few seconds.

Get the workflow here. Safe travels!

If you’re looking to learn more about Workflow, check out the documentation on the website, MacStories’ extensive coverage, and the community on Reddit. Plus, I’ll be writing more here soon.