Categories
Links

Apple is Hiring a Home Screen UI Engineer »

From Careers at Apple (line breaks added for clarity):

“The Home Screen team is responsible for many of the iconic system experiences on Apple devices.

The Team is responsible for the Home Screen, Control Center, Status Bar, Volume HUD and various other components on iOS and iPadOS.

As an engineer on the Home Screen team, your responsibilities will range from prototyping new user interface paradigms and implementing new features to defining API, fixing bugs, and improving performance.

You should have an excellent understanding of software design, good debugging skills and an eagerness to work hard and learn a lot.

You should have an eye for detail and a feel for making user interactions feel fluid and fun.

As our team works cross-functionally with many other groups across Software Engineering, Hardware Engineering and Design, you should have a good understanding of systems and excellent communication and collaboration skills!”

Very intriguing – I’d love to see things evolve for the future.

If this sounds like you and you’re qualified, please join Apple and set us up for the next generation of the Home Screen experience 🙏.

Check out the full job listing from Apple.

Categories
How To Links

How to Fix Images From Your Blog Not Showing Up in Posts on X (Twitter) »

When sharing from my blog, I’ve been having the tiniest problem for ages – for about one out of every ten posts, the images from my blog will simply not show up on Twitter, like the image below:

Thankfully blogger Deepak shared this solution on his site:

I discovered that adding a random query parameter at the end of the URL, that you’re going to share on X, works perfectly. X treats this as a new URL and re-fetches the card image. For example, you can add something like ?test=1 like below example, or whatever random query parameter you like. And then share this URL on X.

But don’t worry, adding the query parameter isn’t going to break the URL or affect the content shown on the page. It’s perfectly safe to use this.

THANK GOD. After a bit of testing, I’m going with ?refresh=true at the end of blog post URLs that are being troublesome1 – makes sense to me, and works well.
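If you wanted to automate the trick rather than edit URLs by hand, here’s a minimal Python sketch of the idea – the parameter name (refresh) and value are my own arbitrary choices, as the post notes any unused query key will make X treat it as a new URL:

```python
# Append a cache-busting query parameter to a URL so X (Twitter)
# re-fetches the link card image. The key/value are arbitrary.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_cache_buster(url: str, key: str = "refresh", value: str = "true") -> str:
    """Return the URL with one extra query parameter appended."""
    parts = urlsplit(url)
    # Keep any existing query parameters intact.
    query = parse_qsl(parts.query, keep_blank_values=True)
    query.append((key, value))
    return urlunsplit(parts._replace(query=urlencode(query)))

print(add_cache_buster("https://example.com/blog/post"))
# -> https://example.com/blog/post?refresh=true
```

The same approach works in a Shortcuts Text action, of course – this just shows the URL handling explicitly.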

View the post from DeepakNess and follow my full feed account @CassinelliMedia.

  1. I’ve integrated the URL pattern into the shortcut from the header image – coming soon for members (or, you can just build it yourself from the photo).
Categories
Apps Links

Ulysses adds Find Sheet, Find Group actions for Shortcuts; Substack support »

Ulysses, my blogging app of choice, has updated to version 39 with the following changes:

  • Liquid Glass support
  • Menus reworked
  • Revision Mode is just another tab
  • Editor Focus is now “Hide Interface”
  • Swipe Gestures
  • Swipe Actions
  • Improved keyboard navigation
  • Share/import projects
  • Copy for Substack and Basecamp
  • Theming
  • App Intents
  • Other improvements
  • Bugfixes

Naturally, I want to highlight the section on their new Shortcuts actions and Spotlight for Mac support in particular:

App Intents

We revamped our integration with Spotlight and the Shortcuts app. On macOS 26, actions can now be triggered directly from Spotlight. New actions include:

“Find Sheet” and “Find Group”, with configurable filters.

“Import File”, to convert any given text file into a sheet.

“Search in Ulysses”, to open the in-app search with a pre-defined search term.

Also:

Copy for Substack and Basecamp

You can now copy your texts for use on Substack and Basecamp.

Most formatting is supported; some limitations apply (images won’t work).

On Substack, we even support buttons (via raw source) and Spotify previews (via text links).

This will be good for my newsletter, which I don’t want to write within the Substack browser – I prefer a dedicated editor experience like Ulysses.

Check out the full page of Release Notes for Ulysses and get Ulysses from the App Store.

Categories
Gear Links

FYI: Reset Your Standing Desk By Lowering It All The Way Down »

Today, my standing desk started to make a clicking sound and only showed zeroes on the display. It wouldn’t raise, but it would move down – there was clearly nothing wrong with the motor, but I didn’t want to continue lowering it in case it wouldn’t raise back up. While looking for replacements, I thankfully found this video that explains what to do – reset the motors:

If your desk loses power, it also loses its memory – which means it no longer remembers which position it is in. Therefore, your desk needs to find a reference point – that being the lowest position.

Pretty simple, but something I hadn’t run into before – I was not happy with the idea that I’d have to replace the entire controller for ~$400.

The video also highlights the importance of moving anything out of the way too – standing desk motors are powerful and won’t stop if something is in the way. I’ve had my desk lift itself off the ground because it caught on the drawers nearby, which could have been more serious if my desk was fully automated1.

Watch the full video on YouTube.

  1. My standing desk is older so I have to hold down buttons to move it to its presets. ↩︎
Categories
Links

Apple Intelligence Shortcuts for Real Life »

Stephen Robles has 17 new real-life use cases for putting the Use Model action for Apple Intelligence to work in Shortcuts:

18 NEW Apple Intelligence Shortcuts, with AI-generated packing lists, dynamic birthday texts, smart “remember anything” to Notes, Apple Music automations, and more!

Watch the video on YouTube.

Categories
Links

Dictate To Poke with the Action Button on Apple Watch Ultra »

From Eli Kia, GP at Fortify VC:

My @interaction Poke shortcut for Apple Watch Ultra’s action button.

Dictate Text + Send Message – nice & simple, and a neat use of the Action Button (on either iPhone or Apple Watch).

For anyone who doesn’t know, Poke is “your proactive AI assistant that turns your emails into action” and is known on X for their AI’s cheeky-but-well-done personality as well as smooth integrations.

There are new Find Conversation and Find Messages actions for Shortcuts that I haven’t seen anyone play with yet – those could be an interesting addition to this workflow.

Also, the Messages automation could be fun – you could flash the lights when you get a Poke message, for example.1

View the original on X.

  1. These are expanded notes from my quote tweet.
Categories
Developer Links

Ideally, Apple Intelligence Could Query Your Personal Context »

Jason Snell on the Upgrade podcast:

It’s the idea that there’s a personal data trove that you have. And then you’re asking the model to query the contents. […] You know about all this stuff, now do stuff with it.

And if you can do that on device, it’s really powerful. [I]t’s hard to do that in the cloud because you would actually need to upload it to Private Cloud Compute. And that’s a lot of data. So what you want is some parsing to happen on device.

But that’s the dream, right? Is that your phone knows about your stuff and then the LLM that’s running can make uh distinctions based on your stuff.

And ideally the model could potentially, and I know this is wild, I don’t think they’ve talked about it, but ideally the model could query your personal context, get a bunch of related data out, and then send that in a query to the private cloud and have it process it, right? You could have a kind of cherry picking your personal data model on device that then kicks it to the more powerful model to process it and intuit things about it.

There’s lots of ways you could do this. It’s a great idea. It was a great idea in 2024 when they showed it, but they got to do it – is the challenge there.

In reply to Jason, cohost Myke Hurley said the following:

So, I’m just going to make a little prediction. These[…] things that I’ve spoken about, we will see iOS 27 before [they] ship.

I believe they will have stuff – like, I believe they will have [releases] in the spring like it has been rumored, but I don’t think all of these things.

I think particularly the Personal Context thing… we may never see that.

For what it’s worth, Apple has done this and named it Queries. Shortcuts users might better understand this as the Find actions, which allow actions to find and filter data from apps before using it in their shortcuts.

Introduced for developers alongside the App Intents API in 2022, Queries are how intents/actions retrieve entities/data from apps. In their most recent session “Get to know App Intents” from 2025, they explicitly say the following – a phrase that caught my attention in regards to the “new Siri” we’ve been waiting for:

Queries are the way the system can reason about my entities

Apple has also been building out their ability to index and query these entities through their Spotlight support, as well as now Visual Intelligence.
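To make the pattern concrete, here’s a rough Python analogy of what a query does – Apple’s real API is Swift’s EntityQuery in the App Intents framework, so the names here (NoteEntity, NoteQuery, find) are purely illustrative, not Apple’s:

```python
# Illustrative analogy of the App Intents "query" pattern: a query object
# resolves entities by identifier and can find/filter them, which is how
# the system reasons about an app's data before an action runs.
from dataclasses import dataclass

@dataclass
class NoteEntity:
    id: str
    title: str
    body: str

class NoteQuery:
    """Resolves and filters note entities on the system's behalf."""
    def __init__(self, store: dict[str, NoteEntity]):
        self.store = store

    def entities(self, identifiers: list[str]) -> list[NoteEntity]:
        # Resolve specific entities by identifier (akin to entities(for:)).
        return [self.store[i] for i in identifiers if i in self.store]

    def find(self, contains: str) -> list[NoteEntity]:
        # The "Find" action: filter entities with a predicate.
        return [n for n in self.store.values()
                if contains.lower() in n.title.lower()]

store = {
    "1": NoteEntity("1", "Groceries", "milk, eggs"),
    "2": NoteEntity("2", "Ideas", "blog topics"),
}
query = NoteQuery(store)
print([n.title for n in query.find("idea")])  # -> ['Ideas']
```

In the real framework the system calls these query methods itself – from Shortcuts, Spotlight, or Visual Intelligence – which is what makes entities addressable outside the app.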

You can learn more about Entity Queries & Indexed Entities, and watch the developer sessions for Get to Know App Intents & Explore new advances in App Intents.

Check out Upgrade #588, follow the show on Apple Podcasts, or watch the video on YouTube.

Categories
Gear Links

Marvel 3D Movies on Apple Vision Pro “Look Better Than Anyone Has Ever Seen Them Before” »

With the news today that Marvel just updated The Fantastic Four: First Steps for 3D on Apple Vision Pro, I was reminded of an old thread from Marvel VFX supervisor Evan Jacobs where he made the following claim:

The 3D Marvel films on the AVP look better than anyone has ever seen them before. The capabilities of the VisionPro are really unique and we remastered all the films for this format.

And:

Our goal was to match the color and brightness of the 2D HDR versions but for 3D. The Vision Pro delivers basically perfect stereo contrast so no ghosting, HDR color, UHD resolution and we did some work on the older titles as well.

Two Reddit threads reference the post, but it appears Jacobs left Twitter and his X account no longer exists – however, I found a direct quote from this Apple Vision Pro forum.

In Disney’s press release at the time, they also said the following:

With 3D movies, Disney’s storytelling will also leap off the screen like never before with remarkable depth and clarity for an unprecedented in-home 3D experience on Disney+ with Apple Vision Pro.

Check out the forum post, view the original press release from Disney, and see how F4 looks on Vision Pro from Ben Geskin on X:

Categories
Developer Links

How to integrate your app with Visual Intelligence »

From the Apple Developer documentation (break added):

With visual intelligence, people can visually search for information and content that matches their surroundings, or an onscreen object.

Integrating your app with visual intelligence allows people to view your matching content quickly and launch your app for more detailed information or additional search results, giving it additional visibility.

And:

To integrate your app with visual intelligence, the Visual Intelligence framework provides information about objects it detects in the visual intelligence camera or a screenshot. To exchange information with your app, the system uses the App Intents framework and its concepts of app intents and app entities.

When a person performs visual search on the visual intelligence camera or a screenshot, the system forwards the information captured to an App Intents query you implement. In your query code, search your app’s content for matching items, and return them to visual intelligence as app entities. Visual intelligence then uses the app entities to display your content in the search results view, right where a person needs it.

To learn more about a displayed item, someone can tap it to open the item in your app and view information and functionality. For example, an app that allows people to view information about landmarks might show detailed information like hours, a map, or community reviews for the item a person taps in visual search.

Browse the full documentation from the Apple Developer site and learn how to use Visual Intelligence for iPhone.

 

Categories
Gear How To Links

How to use Visual Intelligence on iPhone »

From Apple Support:

Use visual intelligence to quickly learn more about what’s in front of you, whether in your physical surroundings or on your iPhone screen.

To learn more about your physical surroundings using your iPhone camera on models that have the Camera Control, just click and hold it to do things like look up details about a restaurant or business; have text translated, summarized, or read aloud; identify plants and animals; search visually for objects around you; ask questions; and more. […You can also] access visual intelligence by customizing the Action button or Lock Screen, or opening Control Center. See Alternate options to using the Camera Control.

To learn more about the content on your iPhone screen across your apps, simply press the same buttons you use to take a screenshot. You can search visually, ask questions, and take action, like turning a flyer or invite into a calendar event.

I’ve been learning more about this now that developers can integrate their apps with Visual Intelligence.

View the full piece on the Apple Support site and read more about the Developer documentation.

 

Categories
Links

Mark Gurman on TBPN: How Siri Will Be Powered By Google’s Gemini »

In an effort to put my TBPN shortcuts to good use, I turned on today’s stream for Monday, November 3rd – and happened upon a segment with Mark Gurman, Managing Editor and Chief Correspondent at Bloomberg News:

They discussed how Gurman got started covering Apple, the stories resurfacing this week around the Siri update being powered by Google’s Gemini, and the iPhone 17 lineup.

View the clip on YouTube.

 

Categories
Links

Shortcuts Showdown on The Vergecast »

From Stephen Robles:

David Pierce, host of The Vergecast often complains that Shortcuts is too complicated and not useful. Equally as often, I tell him he’s wrong on social media, but this time I got to do it live! My thanks to David for inviting me on The Vergecast, and I’m pretty sure I won this round.

Here are the chapters:

Check the post on Stephen’s site Beard.FM, check out the episode of The Vergecast (and follow the show on Apple Podcasts), and see the full video on YouTube.

Categories
Links

Creative Neglect: What About the Apps in Apple? »

Joe Rosensteel, writing for Six Colors:

One of the things that I think about from time to time is Apple’s collection of apps. Some are the crown jewels, like Apple’s pro apps, and others help an everyday consumer to tackle their iLife. All are pretty starved for attention and resources, outside of infrequent updates aligned with showing off the native power of Apple Silicon, Apple Intelligence, or demos of platform integration that never quite get all the way there.

Three things really brought this up to the surface for me recently: The neglect of Clips and iMovie, the radio silence regarding Pixelmator/Photomator, and Final Cut Pro being trotted out for demos but not shipping appropriate updates.

I agree with Joe’s sentiment, but direct it more towards—you guessed it—the Shortcuts app than Pixelmator, which I’ve been saying is within a reasonable window for updating – anything they’re working on could only feasibly ship after an entire yearly cycle.

Shortcuts, on the other hand, has been out for over 5 years and still hasn’t evolved too far beyond its original Workflow UX – Six Colors’ own Jason Snell just talked about how Shortcuts is not really that friendly on Monday of this week.

Read the whole story on Six Colors.

 

Categories
Links

OpenAI acquires Software Applications Incorporated, maker of Sky »

From the OpenAI company blog:

AI progress isn’t only about advancing intelligence—it’s about unlocking it through interfaces that understand context, adapt to your intent, and work seamlessly. That’s why we’re excited to share that OpenAI has acquired Software Applications Incorporated, makers of Sky.

And:

“We’ve always wanted computers to be more empowering, customizable, and intuitive. With LLMs, we can finally put the pieces together. That’s why we built Sky, an AI experience that floats over your desktop to help you think and create. We’re thrilled to join OpenAI to bring that vision to hundreds of millions of people.” —Ari Weinstein, Co-Founder and CEO, Software Applications Incorporated

Incredible run by my former teammates – first selling Workflow to Apple, and now Sky to OpenAI.

I’m super excited to see their deep talent and passion reflected in the ChatGPT app.

Read the full blog post from OpenAI.

 

Categories
Links

Jason Snell: Shortcuts Is Not Really That Friendly »

From Jason Snell, on Upgrade: An LLM in the Woods:

“It’s like me saying, oh, you know, Shortcuts does a pretty good job of being a consumer user scripting utility.

It’s like, well, yeah, but also really no.”

Plus, later:

“I mean, that’s the bottom line is it’s a great idea. And like I said about Misty Studio1, all things considered, it does a pretty good job of being kind of a friendly face to building an AI model, but in the end, it’s like Shortcuts in that it’s not really that friendly.”

Fair enough – if it truly was, I’d have been out of a job for a long time.

Check out the Upgrade podcast on Apple Podcasts and YouTube.2

  1. For reference, they explained Misty Studio earlier:
    > “Misty Studio is a demo that Apple did for the M5. Misty Studio runs an open-source model locally”
  2. P.S. I apologize in advance to Jason for the URL slug 🙂

 

Categories
Developer Links News

Apple’s Foundation Models Framework Unlocks New App Experiences Powered by Apple Intelligence »

From Apple Newsroom:

With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.

The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.

You can now add intelligence to your apps for free on Apple platforms – and while it’s relatively simple today… that’s only for now.

View the full article.

Categories
Links News Shortcuts

What’s New in Shortcuts for iOS 26 »

From Apple Support:

New in iOS 26, iPadOS 26, macOS 26, watchOS 26, and visionOS 26

This update includes enhancements to the Shortcuts app across all platforms, including new intelligent actions and an improved editing experience. Shortcuts on macOS now supports personal automations that can be triggered based on events such as time of day or when you take actions like saving a file to a folder, as well as new integrations with Control Center and Spotlight.

New Actions (Editor’s note: shortened for sake of space)

  • Freeform
  • Image Playground, requires Apple Intelligence*
  • Mail
  • Measure
  • Messages
  • Screen Time
  • Sports
  • Photos
  • Reminders
  • Stocks
  • Use Model, requires Apple Intelligence*
  • Visual Intelligence, requires Apple Intelligence*
  • Voice Memos
  • Weather
  • Writing Tools, requires Apple Intelligence*

Updated Actions

For those building custom shortcuts, some actions have been updated:

  • “Calculate Expression” can now evaluate expressions that include units, including real time currency conversion rates, temperature, distance, and more
  • “Create QR Code” can now specify colors and styling
  • “Date” can now specify a holiday
  • “Find Contacts” can now filter by relationship
  • “Transcribe Audio” performance has been improved
  • “Show Content” can now display scrollable lists of items, like calendar events, reminders, and more

Shortcut Editor

For those building custom shortcuts, updates have been made to the shortcut editor:

  • Improved drag and drop and variable selection
  • Over 100 new icon glyphs are now available, including new shapes, transportation symbols, and more
  • Rich previews of calendar events, reminders, and more
  • The ability to choose whether shortcuts appear in Spotlight Search

macOS Improvements

Spotlight

Shortcuts can now accept input, like selected text from an open document, when being run from Spotlight.

Automations

Shortcuts can now be run automatically based on the following triggers:

  • Time of Day (“At 8:00 AM, weekdays”)
  • Alarm (“When my alarm is stopped”)
  • Email (“When I get an email from Jane”)
  • Message (“When I get a message from Mom”)
  • Folder (“When files are added to my Documents folder”)
  • File (“When my file is modified”)
  • External Drive (“When my external drive connects”)
  • Wi-Fi (“When my Mac joins home Wi-Fi”)
  • Bluetooth (“When my Mac connects to AirPods”)
  • Display (“When my display connects”)
  • Stage Manager (“When Stage Manager is turned on”)
  • App (“When ‘Weather’ is opened or closed”)
  • Battery Level (“When battery level rises above 50%”)
  • Charger (“When my Mac connects to power”)
  • Focus (“When turning Do Not Disturb on”)

Control Center

Shortcuts can be added as controls to Control Center and the menu bar, including Run Shortcut, Open App, and Show “Menu Bar” Collection.

View the full release notes from Apple Support.

Categories
Links

New Apple Intelligence Features Are Available Today »

From Apple Newsroom:

Search and Take Action with Updates to Visual Intelligence

Visual intelligence, which builds on Apple Intelligence, now helps users learn and do more with the content on their iPhone screen. It makes it faster than ever for users to search, take action, and answer questions about the content they’re viewing across their apps.

Users can search for the content on their iPhone screen to find similar images across Google, as well as apps that integrate this experience, such as eBay, Poshmark, Etsy, and more. If there’s an object a user is interested in learning about, like a pair of shoes, they can simply press the same buttons used to take a screenshot and highlight it to search for that specific item or similar objects online. And with ChatGPT, users can ask questions about anything they’re viewing onscreen.

Continue playback of video: Visual Intelligence on iPhone 17 Pro

Updates to visual intelligence help users learn and do more with the content on their iPhone screen.

Visual intelligence enables users to summarize and translate text, as well as add an event from a flyer on their iPhone screen to their calendar, with a single tap.

Users can also take advantage of these capabilities by using visual intelligence with their iPhone camera through Camera Control, the Action button, and in Control Center.

And:

Build Intelligent Shortcuts

Shortcuts help users accomplish more faster, by combining multiple steps from their favorite apps into powerful, personal automations. And now with Apple Intelligence, users can take advantage of intelligent actions in the Shortcuts app to create automations, like summarizing text with Writing Tools or creating images with Image Playground.

Users can tap into Apple Intelligence models, either on device or with Private Cloud Compute to generate responses that feed into the rest of their shortcut, maintaining the privacy of information used in the shortcut. For example, users can create powerful Shortcuts like comparing an audio transcription to typed notes, summarizing documents by their contents, extracting information from a PDF and adding key details to a spreadsheet, and more.

View the full story from Apple Newsroom.

Categories
Apps Gear Links

Stream Deck 7 Adds Virtual Decks, Key Logic, Weather, and App Status »

My friends at Elgato have updated the Stream Deck app for Mac to version 7.0 with new features for creating virtual Stream Decks on your computer, Key Logic so each key has multi-tap abilities, weather updates in a new plugin, and quality-of-life features like showing whether an app is currently open.

Here’s how they describe the updates:

🎛️ Virtual Stream Deck — your on-screen workspace controller

Create unlimited virtual keys, customize actions and layouts, then pin them in place or summon to your cursor. It’s your OS sidekick, making every workflow fast and effortless. It’s Stream Deck on your computer, anywhere you go.

[…]

👇 Key Logic — multi-tap abilities

Assign up to three different actions to a single key using Key Logic. Perform a unique action based on how the key is pressed:

  • Press
  • Double press
  • Press and hold

For example, press to play/pause music, double press to skip tracks, or press and hold to go to the previous track.

[…]

⛅ Weather plugin – stay ahead of the forecast

The new Weather Plugin for Stream Deck puts live weather updates and forecasts at your fingertips, with minimal setup and configuration. Instantly see the sky’s latest mood and plan your day without ever picking up your phone or opening a browser.

[…]

🛠️ Improvements and bug fixes

The Open Application action now displays a green dot when the selected app is running.

You can now configure the Open Application action to either do nothing, close, or force quit the selected app when long-pressed.[…]

Virtual Stream Decks are extremely cool.

Check out the Elgato Stream Deck 7.0 Release Notes and get the Stream Deck from Elgato – be sure to use my discount code ZZ-CASSINELLI for 5% off.

Categories
Links

Apple News+ introduces Emoji Game 🍎📰➕ 😀🧩 »

From the Apple Newsroom:

Today, Apple News+ debuted Emoji Game, an original puzzle that challenges subscribers to use emoji to complete short phrases. Emoji Game is now available in English for Apple News+ subscribers in the U.S. and Canada.

“Emoji Game is the perfect addition to the Apple News+ suite of word and number puzzles, turning the emoji we use every day into a brainteaser that’s approachable and fun,” said Lauren Kern, editor-in-chief of Apple News.

More Apple News shortcuts incoming in 3… 2… 1…

View the full story from Apple.