One type of app that makes the Mac more useful than the iPad for many people is the clipboard manager.
Instead of copying & pasting one thing at a time, tools like Alfred, Pastebot, and Copied let Mac users copy lots of information in batches and reuse it later (often applying special formatting or pasting via keyboard shortcuts).
On iOS, the problem isn’t nearly as well solved – since apps don’t have constant access to your clipboard, they can’t capture everything you’re cutting & pasting on your iPhone or iPad.
However, Copied does provide a solution that works across the Apple device line, letting you save things to its database, sync them across iPhone, iPad, and Mac, and share them elsewhere.
And, with support for URL scheme actions on iOS, it’s possible to use Copied in conjunction with an app like Shortcuts. You can create shortcuts that clip the contents of your clipboard or the share sheet, save them into your Copied lists for organization, and much more.
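To illustrate how such a shortcut hands text off to another app, iOS automation apps like Copied typically accept x-callback-url-style links. The host and parameter names below (`save`, `text`, `title`) are assumptions modeled on that convention rather than Copied’s confirmed scheme, so check the app’s own documentation before relying on them. A minimal Python sketch of building the link:

```python
from urllib.parse import quote

def copied_save_url(text, title=None):
    """Build a URL-scheme link asking Copied to save a clipping.
    The 'copied://x-callback-url/save' form and its parameter names
    are assumptions based on common x-callback-url conventions."""
    url = "copied://x-callback-url/save?text=" + quote(text, safe="")
    if title:
        url += "&title=" + quote(title, safe="")
    return url

# A shortcut would open a URL like this to hand the text to Copied.
print(copied_save_url("Hello, world", title="Greeting"))
```

In Shortcuts, the equivalent is an Open URLs action fed with a URL built this way from the clipboard or share-sheet input.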
Even though I've had the iPhone 7 Plus and iPhone X, I haven't taken nearly full advantage of the 2x zoom lens on either. I default to the iPhone's wider lens since that's what I've always had before, and years of training against digital zoom make it feel unnatural to zoom in with a phone.
Instead, I've been trying to switch to the 2x camera lens right away each time so I could get better use of it and see if there were any places I hadn't realized it would be helpful beyond Portrait Mode.
Here are a few situations where the dual-lens iPhones make getting the right shot easier:
Taking pictures of tiny text: getting into tight spaces is easier when you zoom in, plus you don't lose quality – for example, taking a photo of the lid of my AirPods in order to capture the serial number (which inspired this post1).
Capturing documents: instead of leaning over and holding the phone up close to frame the paper, zooming in and simply pointing the phone down can help you get through a lot of pages without breaking your back.
Getting shots that are out of your reach: if your arms are fully extended and you're trying to get a photo that's above your head or on top of something, the 2x lens can help you get that additional bit of perspective that you might otherwise miss. I've found it can be super handy to stick your arm up and get a zoomed in photo of what's just out of view.
Taking photos that match your eye's perspective: the default 28mm lens on the iPhone is much wider than the way you see things normally – the 2x zoom's 56mm lens is closer to the perspective we see with our own eyes (albeit more cropped in).
The wider lens can also distort vertical lines, especially up close. Shooting with the longer lens avoids much of that warping, although you may need to stand further back. That said, it doesn't work very well in low light.
Taking sample photos for a bigger shoot later: when I was preparing the photography for my HomePod review, I went around first with my iPhone X to scope out how I wanted my photos to look without needing to lug around my full camera.
The 2x lens more closely matched the "in your home" perspective I was trying to achieve, plus I could zoom in and out further to mimic the full range of my 12-60mm lens. I got sample shots so I could properly integrate the imagery into how I wrote the piece, then later did a proper photoshoot with lights and my camera to get the highest quality photography.
Some of these aren't particularly innovative ways to use a camera, but if you hadn't thought of one before it might be helpful2.
Many of the shots won't have the crisp, clear focus of high-quality photography, but for quick memories and productive use cases the 2x lens does the job well.
Next time you open up the camera app on an iPhone X/Plus, try switching to 2x and just looking through the viewfinder for a while – it may help you see things in a different way.
No, that's not the complete serial number of my AirPods. ↩︎
If you have any other suggestions, let me know on Twitter and I'll add them here & credit you. ↩︎
As a writer who generally focuses on complicated processes for using technology, I can find it tempting to default to lazy language that over-simplifies for me, but tends to make things confusing for new users. If something is difficult for everyone else and I describe it as “simple”, I’ve just lost many people who might’ve otherwise made it through.
For example, when I wrote the Workflow documentation, I took care not to assume anything was obvious – making sure the directions given were as straightforward as possible and could always be understood by someone without any technical training (like me).
Using these words in an explainer context is now banned from all of my writing.
Nothing with iOS automation or the technical details of how something works is easy, simple, or clear – at some point, it was explained to you. Not everyone knows, you don’t “just” do something because there’s a verb for that action, and complex things are rarely obvious at first.
I want to avoid alienating anyone who reads my writing or wants to learn more about how to use technology – the goal is to empower, not educate from above.
If you see me using this language, don’t hesitate to call me out.
But my favorite part was the overall angle – you can just use other people’s shortcuts and never actually build one yourself. Shortcuts is a fantastic visual programming tool, but if you don’t give a shit about that and just want to save time instead of learning how it works, you totally can.
(This post has been updated to refer to Shortcuts instead of Workflow, now that the app has been converted by Apple. All of these still work as quick shortcuts, but not everything operates fully from Siri.)
If you’re in the content creation business, time is of the essence. While you’re busy working on your craft and trying to put your work out into the world, the last thing you want to do is spend a lot of time on mundane tasks.
Watching people game online has been around for years, but I hadn’t thought much about where platforms like Twitch and YouTube Gaming could take the experience next.
Not only will people be viewing, but they’ll be part of the gameplay too and have more to do while they’re tuned in thanks to Twitch Extensions:
As of this writing, there are roughly 150 Twitch Extensions, and according to Twitch, more than 2,000 developers have signed up to create more.
Some extensions consist of simple stat overlays that let you get a better look at a streamer’s performance in games like Fortnite and Destiny 2. Others, like Darwin Project’s Spectator Experience, allow viewers to become active participants in the games they’re watching. But they all share the common goal of making Twitch more than just a place to seek out passive entertainment.
“I think, at the end of the day, we want every game to have an official extension,” Shevat said, adding that a lot of the content you see on a streamer’s Twitch page — including links to social media channels and personal websites — will become more interactive over time.
There are already a few live examples of these types of add-ons, including a Spotify extension that lets you see what music a broadcaster is rocking or an Amazon extension that makes it easy to buy your favorite streamer’s preferred PC parts right from their channel.
The most intriguing part comes at the end, where he frames playing with interactive viewers against the progression of computers up to now (emphasis mine):
“There is — and this is a very conservative approximation — 20 times more people watching people play, than people playing any game,” said Darveau.
“Playing without viewers involved will eventually feel like nowadays when you go on a computer, and there’s no internet.”
I’ve had the privilege to write for The Sweet Setup the last few months and now iMore, so I wanted to share some of the links here.
Primarily I’ve been writing about Shortcuts/Workflow, trying to get some of the ideas in my head out and into the world so other people can take better advantage of the app – especially now that it’s free. But I’m also dabbling in product reviews & photography, a new challenge that’s proving lots of fun and hard work.
If you’ve picked up Apple’s HomePod in the past few weeks and tried to use iTunes on your Mac to AirPlay something to the speaker, you probably got blasted with the music playing at full volume.
This happens because HomePod uses iTunes’ in-app volume slider to adjust its levels rather than your Mac volume, and iTunes is usually at 100% because the hardware keys are used to control the computer’s overall sound instead1. Plus, if you want to change the volume on HomePod after the music starts, you have to go into iTunes and drag the slider – you can’t turn it down quickly.
To get around this, I installed a Mac app called iTunes Volume Control that’s available on GitHub. Created by Andrea Alberti, it’s an app that lives entirely in your menu bar and changes the Mac’s hardware volume keys to control iTunes instead. When it’s running, it can entirely take over mute, volume up, and volume down – or, you can set it so you have to hold a modifier key like Command before hitting the keys. I use the latter option, so I can control my Mac volume with the keys normally and then use ⌘ + or ⌘ – to adjust iTunes when I need to.
Once you’ve installed the app, you’ll find it’s a much better experience playing music from iTunes with HomePod as your speaker. I set iTunes Volume Control to launch at login, so it’s basically always running when I use my computer and I never have to turn it on when I need it2. I’ll usually open iTunes, use ⌘ – to turn down the volume, then pick my song and AirPlay to my HomePod.
iTunes Volume Control also provides an option to change the step size for each press, so the volume can be changed in more specific intervals – you can set it to go up 3% each time, for example, rather than the default 10% at a time. This gives you fine-grained control of the HomePod volume, right from your keyboard.3
I could see improving this setup with AppleScript – you could set up a command that launches iTunes with the volume already at 30% and AirPlay routed to the HomePod, avoiding the setup process each time you want to listen from your Mac on your HomePod. However, I have no experience there and that’s a project for another day.
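For reference, a minimal AppleScript sketch of that idea – "HomePod" below is a placeholder for whatever your speaker is named in iTunes, and the `current AirPlay devices` property only appears in newer iTunes scripting dictionaries, so verify it in Script Editor first:

```applescript
-- Sketch: start iTunes at 30% volume, routed to a HomePod.
-- "HomePod" is a placeholder for your speaker's name in iTunes.
tell application "iTunes"
	activate
	set sound volume to 30
	-- requires a recent iTunes scripting dictionary
	set current AirPlay devices to {AirPlay device "HomePod"}
end tell
```

Saved as an app or run from a launcher, this would skip the manual volume-and-AirPlay dance each time.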
The best part of this setup is that iTunes Volume Control is entirely free to download and use. Check out the documentation first, but use this link to get the app and start controlling your HomePod from your Mac.
Instead of adjusting the levels in iTunes and on your Mac separately, it’s much more common to leave iTunes at 100% and change the volume on the whole computer instead. ↩
I normally hide it in the menu bar using Bartender, so I can click on the Bartender icon to reveal it but keep it away from view otherwise. ↩
I do the same thing with HomePod normally by using my Apple Watch. Once you change the source in Control Center on your iPhone to the HomePod, the Now Playing controls show up on Apple Watch and let you control the smart speaker from your wrist. ↩
I’ve been doing more research on iOS lately as my iPhone is the device I use the most, so capturing full web pages quickly saves me a lot of time. While I really like Apple Notes’ latest iterations, it’s not easy to clip websites there – so I adopted Bear for notes, which has support for Markdown, images, and a handy Get URL function.
Bear’s ability to download websites as a note is killer, but for most people it’s only available via its action extension. Rather than being limited to the share sheet, I’ve been taking advantage of the Shortcuts action Get Bear Note From URL1 to save web pages from anywhere on iOS.
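Under the hood, both routes into Bear go through its x-callback-url scheme; the `grab-url` action and its parameters below follow Bear’s published scheme, though you should confirm the exact names against the current documentation. A small Python sketch of building such a link:

```python
from urllib.parse import quote

def bear_grab_url(page_url, tags=None):
    """Build a bear:// link asking Bear to clip a web page as a note.
    'grab-url' and its parameters follow Bear's published x-callback-url
    scheme; confirm against the current documentation."""
    link = "bear://x-callback-url/grab-url?url=" + quote(page_url, safe="")
    if tags:
        link += "&tags=" + quote(",".join(tags), safe="")
    return link

# Opening this URL on iOS would hand the page to Bear for clipping.
print(bear_grab_url("https://example.com/post", tags=["reading"]))
```

A shortcut can build this URL from any page passed in via the share sheet or clipboard, then open it to trigger the clip.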
Getting around with BART is one of the great benefits of living in the Bay Area. Whether they’re in San Francisco, headed across the Bay to Berkeley or Oakland, or coming in from nearby, many choose to take the train instead of toughing it out in the nation’s second-worst commute.