Federico Viticci

9467 posts on MacStories since April 2009

Federico is the founder and Editor-in-Chief of MacStories, where he writes about Apple with a focus on apps, developers, iPad, and iOS productivity. He founded MacStories in April 2009 and has been writing about Apple ever since. Federico is also the co-host of AppStories, a weekly podcast exploring the world of apps, and Dialog, a show where creativity meets technology.

He can also be found on his two other podcasts on Relay FM – Connected and Remaster.

Mastodon: @viticci@macstories.net

Instagram: @viticci

Fiery Feeds for iOS Added an In-App Split View Mode That I Wish More iPhone Apps Offered

Vertical split view in Fiery Feeds.

A few weeks ago on Mastodon, I shared a simple feature request: a split-screen mode for iPhone RSS readers that would allow me to scroll headlines in one half of the screen and preview actual articles in the other.

If this sounds familiar, you’re not alone: back in 2007, Steve Jobs demoed pretty much the same thing for the first version of the Mail app for iPhone OS 1.0. That layout mode never shipped, and probably rightfully so at the time given the limited screen real estate of the first iPhone.



S-GPT 1.0.2 Brings Date and Time Awareness, Integration with macOS Services Menu, Passthrough Mode, Better HomePod Support, and More

S-GPT 1.0.2.

I just published version 1.0.2 of S-GPT, the shortcut I released last week to have conversations with OpenAI’s ChatGPT and integrate it directly with native features of Apple’s OSes. You can find the updated download link at the end of this post, in the original article, and in the MacStories Shortcuts Archive; before you replace version 1.0.1 of S-GPT, save your existing OpenAI API key somewhere as you’ll have to paste it again in the shortcut later.

I’m going to include the full changelog for S-GPT 1.0.2 below, but long story short: S-GPT is now aware of the current date and time, and I’ve heard all the requests about improving interactions with the HomePod and Siri, so I made that part much better. S-GPT can now perform a variety of date/time calculations with natural language, and you can end a conversation by saying “no” or “stop”.
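S-GPT itself is built visually in Shortcuts, so there’s no code to quote here, but as a minimal sketch of the underlying technique – assuming OpenAI’s standard chat completions endpoint – date and time awareness amounts to injecting the current date into the system prompt on every request:

```typescript
// Minimal sketch, not S-GPT itself: giving a ChatGPT API client awareness
// of the current date by injecting it into the system prompt.
// Assumes Node 18+ (built-in fetch) and an OPENAI_API_KEY env variable.
async function ask(question: string): Promise<string> {
  const now = new Date().toString(); // e.g. "Thu Apr 13 2023 10:30:00 GMT+0200"
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        // The model has no clock of its own; without this line it can't
        // answer "what day is it?" or compute "three weeks from today".
        { role: "system", content: `The current date and time is: ${now}.` },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```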



S-GPT 1.0.1

I just released a small bug fix update for S-GPT, my shortcut to integrate OpenAI’s ChatGPT large language model with the Shortcuts app on all Apple platforms.

Version 1.0.1 of S-GPT is a relatively minor update that comes with an initial round of improvements based on early feedback I’ve received for the shortcut, including:

  • A proper error-checking alert that tells you what went wrong with a request to the ChatGPT API (see the sketch after this list);
  • Better summarization of Safari webpages passed from the share sheet, which no longer “hallucinates” results but actually summarizes text extracted from any web article via Safari’s Reader technology;
  • A new behavior for text input on watchOS, which now defaults to dictation rather than keyboard input. I’ve covered this in more detail in today’s issue of MacStories Weekly for Club members.
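S-GPT implements the error checking above with visual Shortcuts actions rather than code, but the underlying idea is simple: the ChatGPT API reports failures as a JSON error object, and the alert surfaces that object’s message instead of failing silently. A minimal sketch (the example messages in the comments are typical of the API):

```typescript
// Sketch of ChatGPT API error checking: failed requests return a JSON
// body shaped like { "error": { "message", "type", "code" } }.
interface ChatCompletionError {
  error?: { message: string; type: string; code?: string | null };
}

// Returns a human-readable description of the failure, or null on success.
function describeFailure(status: number, body: ChatCompletionError): string | null {
  if (status === 200 && !body.error) return null;
  // Typical messages: "Incorrect API key provided" or "You exceeded your
  // current quota" – exactly the detail worth showing in an alert.
  return `ChatGPT request failed: ${body.error?.message ?? `HTTP ${status}`}`;
}
```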

I also realized that the usage tips S-GPT displayed every time it asked you to enter some text may have been nice the first three times you used the shortcut, but they became annoying very quickly. That was especially true when using S-GPT with Siri in a voice context, since they would be read aloud every time. For these reasons, I removed the tips and simplified the shortcut’s questions to “What do you want to ask?” and “Want to follow up?”.

In case you missed my introduction of S-GPT earlier this week, you can read the original story here and find out more about how the shortcut works and what it does. I updated the links to the S-GPT and S-GPT Encoder shortcuts in the story to the latest version; you can also find the updated shortcuts in the MacStories Shortcuts Archive.

S-GPT

S-GPT is a shortcut to have conversations with OpenAI’s ChatGPT assistant on your iPhone, iPad, and Mac. The shortcut supports both text conversations as well as voice interactions when used inside Siri. S-GPT comes with native system integrations on Apple platforms including the ability to process text from your clipboard, summarize text found in photos, export conversations to Files and Finder, and even create playlists in the Music app. The shortcut requires an OpenAI API token and a helper shortcut called S-GPT Encoder that needs to be downloaded separately.

Get the shortcut here.

S-GPT Encoder

This is a helper shortcut for S-GPT that needs to be downloaded and installed separately. Without this shortcut, S-GPT won’t work.

Get the shortcut here.


Introducing S-GPT, A Shortcut to Connect OpenAI’s ChatGPT with Native Features of Apple’s Operating Systems

S-GPT for Shortcuts.

Update, April 13: I’ve updated S-GPT to version 1.0.2. You can read the full changelog here. All download links have been updated.

Update, April 13: For Club MacStories+ and Premier members, I’ve published Part 1 of an extensive ‘Making Of’ series about S-GPT. This is a technical deep dive for my Automation Academy series. You can find it here and sign up for or upgrade to a Premier account using the buttons below.

Update, April 7: For Club MacStories members, I’ve shared some optional prompts to add different personalities to S-GPT, including two inspired by Roy Kent and Steve Jobs. You can get the prompts and read more here; the main S-GPT shortcut is and will remain free to use for everyone, of course.

Update, April 7: I’ve updated S-GPT to version 1.0.1. You can read more details here. All download links to the shortcuts have been updated to the latest version.


It’s the opening week of the second annual Automation April, and to celebrate the occasion, I’ve been working on something special: today, I’m introducing S-GPT, an advanced conversational shortcut for ChatGPT that bridges OpenAI’s assistant to native system features of iOS, iPadOS, macOS, and watchOS.

S-GPT (which stands for Shortcuts-GPT) is free to use for everyone, but it requires an OpenAI account with an associated pay-as-you-go billing plan since it takes advantage of OpenAI’s developer API, which has a cost. S-GPT was built with the latest ChatGPT API, and it can be used with either the existing GPT-3.5 model or – if you have access to it – the GPT-4 API.
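As a rough sketch of what that model choice amounts to in API terms (the identifiers below are OpenAI’s; GPT-4 access is granted separately):

```typescript
// Sketch: the same chat completions request body works with either model;
// only the identifier changes. GPT-4 requires separate API access and has
// higher per-token pricing on the pay-as-you-go plan.
type ChatModel = "gpt-3.5-turbo" | "gpt-4";

function chatRequestBody(model: ChatModel, prompt: string) {
  return {
    model,
    // Billing is metered per token, so longer prompts and longer replies
    // both add to the cost of a conversation.
    messages: [{ role: "user", content: prompt }],
  };
}
```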

While the shortcut is free for MacStories readers, I will be publishing a detailed, in-depth Automation Academy class soon for Club MacStories Plus or Premier members to explain the techniques and strategies I used to build this shortcut. I genuinely think that S-GPT is, from a technical perspective, my best and most advanced work to date; I hope my Academy class will help others learn some useful tips for Shortcuts and, in return, make even better automations for our contest.

With that said, let’s look at what S-GPT is and what you can do with it.



Using ‘Phone Link’ Between iOS and Windows for iPhone Notifications and iMessage

I enjoyed this explanation by The Verge’s Tom Warren on how Microsoft’s Phone Link app – which has long allowed Android users to connect their smartphones to a Windows PC – has been updated to support iOS notifications and sending texts via iMessage. From the story:

The setup process between iPhone and PC is simple. Phone Link prompts you to scan a QR code from your iPhone to link it to Windows, which automatically opens a lightweight App Clip version of Phone Link on iOS to complete the Bluetooth pairing. Once paired, you have to take some important steps to enable contact sharing over Bluetooth, enable “show notifications,” and allow system notifications to be shared to your PC over Bluetooth. These settings are all available in the Bluetooth options for the device you paired to your iPhone.

And:

Microsoft’s Phone Link works by sending messages over Bluetooth to contacts. Apple’s iOS then intercepts these messages and forces them to be sent over iMessage, much like how it will always automatically detect when you’re sending a message to an iPhone and immediately switch it to blue bubbles and not the green ones sent via regular SMS. Phone Link intercepts the messages you receive through Bluetooth notifications and then shows these in the client on Windows.

I got access to the updated version of Phone Link on my PC today, and this integration is pretty wild and it actually works, albeit with several limitations.

First, the setup process is entirely based on an App Clip by Microsoft, which is the first time I’ve seen and used an App Clip in real life. Essentially, my understanding is that this works similarly to how an iPhone can pair with an old-school Bluetooth car system: the iPhone and PC pair via Bluetooth, and you can then give the PC access to your notifications and contacts from iOS’ Bluetooth settings. This is the same UI I have for my Kia Sportage’s system, which uses regular Bluetooth to pair with my iPhone and can also display contacts and missed calls.

The setup process based on an App Clip.

The difference between my car and Phone Link, of course, is that with Phone Link you can type text messages from a PC and they will be sent as iMessages on iOS. This bit of dark magic comes with a lot of trade-offs (check out Warren’s full story for the details), but it works for individual contacts. I’ve been able to start a conversation with John, reply to his messages from Windows notifications, and even send him URLs[1], and they were all correctly “intercepted” by iOS and sent over as iMessages. I’ve also been impressed by the ability to clear notifications from a PC and have them go away on iOS’ Lock Screen immediately.

The Phone Link app paired with my iPhone.

This was then sent as an iMessage.

The limitations of Phone Link for iPhone users mean you’ll always have to fall back to the actual iOS device for something – whether it’s posting in an iMessage group or sending a photo or acting on notifications – but for quick messages, glancing at notifications, and clearing them, I think this integration is more than good enough.


  [1] Fun fact: raw URLs sent from Windows are delivered as rich links in iMessage, but the card’s preview doesn’t load by default on the recipient’s device.

Apple Releases iOS and iPadOS 16.4 with New Emoji, Notifications for Web Apps on the Home Screen, Voice Isolation for Cellular Calls, New Shortcuts Actions, and More

iOS 16.4 brings new emoji, push notifications for web apps on the Home Screen, Mastodon link previews, and more.

Today, Apple is releasing iOS and iPadOS 16.4, the fourth major update to the OSes that introduced support for the customizable Lock Screen and Stage Manager, respectively, last year.

Ahead of tomorrow’s debut of Apple Music Classical, and just a few months before a WWDC that’s rumored to focus on the company’s upcoming headset and a relatively small iOS 17 update, 16.4 comprises two big additions to iOS and iPadOS (new emoji and push notifications for web apps on the Home Screen) alongside a variety of smaller but notable improvements, such as new Shortcuts actions, Mastodon link previews in iMessage, tweaks to Podcasts and Music, and more.
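For a sense of what push notifications involve on the web app’s side, here’s a hedged sketch using the standard Notification and Push APIs; on iOS 16.4, this flow is only available to web apps that have been added to the Home Screen, and permission has to be requested from a user gesture:

```typescript
// Sketch of a web app subscribing to push notifications with the standard
// web APIs. On iOS 16.4, this only works in web apps launched from the
// Home Screen, and the permission prompt requires a user gesture.
async function enablePush(vapidPublicKey: string): Promise<PushSubscription> {
  const permission = await Notification.requestPermission();
  if (permission !== "granted") {
    throw new Error("User declined notifications");
  }
  // The service worker is what receives pushes and displays notifications.
  const registration = await navigator.serviceWorker.ready;
  return registration.pushManager.subscribe({
    userVisibleOnly: true, // every push must show a visible notification
    applicationServerKey: vapidPublicKey, // base64url-encoded VAPID key
  });
}
```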

Let’s take a look.



The Mac and iPad Pro Are on a Collision Course

Jason Snell, in an excellent column for Macworld:

Sometimes I look back at all the effort Apple has made with the iPad Pro and wonder if it was worth it. All the additions of Mac-ish features have added complexity that’s probably lost on most users of iPadOS, and the power users for whom they were intended are probably well aware of all the ways they don’t really match up to the Mac features they’re duplicating.

I want to see what happens when the walls come down. Today’s iPad Pro is powered by the same chip that’s in the MacBook Air. Would it be such a cataclysm if I could simply reboot that iPad into macOS or run macOS inside a virtual machine?

Likewise, what if the Mac had a touchscreen and Apple Pencil support and came in shapes that weren’t the traditional laptop? What if the Mac began to offer the ergonomic flexibility that iPadOS is so good at? What if I ripped the keyboard off a MacBook and had the option to switch to a touch-based mode that was essentially iPadOS?

I love this story, which I recommend reading in its entirety, because it feels as if Jason stared directly into my soul and wrote about something I’ve been feeling for the past several months.

From my perspective, Stage Manager’s failure to reinvent multitasking and iPadOS’ perennial lack of pro features (Jason mentions a proper audio subsystem in his story, and I agree; I wrote this four years ago, and nothing has improved) were the final straw that convinced me to start looking elsewhere for a convertible computer in my life. I could buy a MacBook Air, but I don’t want to be stuck with a laptop that doesn’t have a touchscreen and whose keyboard you can’t detach.

I fear that I’m going to have to wait a couple of years for the Apple computer I want to exist, and I’m not sure anymore that iPadOS can evolve in meaningful ways in the meantime.


Apple Frames 3.1.1 with Support for Passthrough Mode

The ‘Shortcut Result’ variable, used as an image variable in a shortcut that calls Apple Frames.

I just released a small update to Apple Frames 3.1, which came out earlier this week, with a new output command: &passthrough. With this output command for the Apple Frames API, you’ll be able to generate a framed image (from whatever source you like) and simply pass its result to the next action in a shortcut as a native image variable.

I wrote about this as part of my Extension column in MacStories Weekly today, where I also covered the ability to run Apple Frames from the command line on macOS. Here’s the excerpt about version 3.1.1 of Apple Frames and the new passthrough mode:

As I was researching this column for Weekly, I realized there was an obvious candidate for an output command I did not include in Apple Frames 3.1: a passthrough command to, well, pass framed images along as input for the next action of a shortcut.

Here’s what I mean: when you run Apple Frames from a helper shortcut using the ‘Run Shortcut’ action, that action produces an output variable called ‘Shortcut Result’. If you’re running Apple Frames as a function, thus turning it into a feature of another workflow, it can be useful to take the framed images it produces and use them as a native variable in other actions of the shortcut. The problem is that the output commands I launched with Apple Frames 3.1 all involved “storing” the framed images somewhere, whether it was Files or the system clipboard.

This is no longer the case with the &passthrough output command I added to Apple Frames 3.1.1, which you can redownload from the MacStories Shortcuts Archive or directly from this link. If you run the Apple Frames API with this command, framed images will be passed along as native output of the shortcut, which you can reuse as a variable elsewhere in a shortcut that’s invoking Apple Frames.

And:

Any shortcut or longer workflow that involves running Apple Frames in the background and retrieving the screenshots it frames can take advantage of this method, allowing you to bypass the need to store images in the clipboard, even if temporarily. Essentially, passthrough mode turns Apple Frames into a native action of the Shortcuts app that returns a standard image variable as its output.
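Apple Frames is built from visual Shortcuts actions rather than code, but conceptually, the difference between the existing output commands and &passthrough is the difference between writing a result to shared state and returning it directly. A toy sketch:

```typescript
// Conceptual sketch only – Apple Frames is a Shortcuts workflow, not code.
type FramedImage = { filename: string; bytes: Uint8Array };

// The pre-3.1.1 output commands "store" the result somewhere (Files, the
// clipboard), and the calling shortcut has to fetch it back out.
let clipboard: FramedImage | null = null;
function frameAndCopy(image: FramedImage): void {
  clipboard = image; // side effect; no direct output
}

// &passthrough instead returns the framed image directly, so the caller's
// 'Shortcut Result' is a native image variable it can use immediately.
function frameAndPassThrough(image: FramedImage): FramedImage {
  return image;
}
```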

This is the only change in version 3.1.1 of Apple Frames, and I’m excited to see how people will take advantage of it to chain Apple Frames with other shortcuts on their devices. You can download the updated version of Apple Frames below.

Apple Frames

Add device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.

Get the shortcut here.


Apple Frames 3.1: Extending Screenshot Automation with the New Apple Frames API

Apple Frames 3.1 comes with a lightweight Apple Frames API to extend its automation capabilities.

Update, March 3: Version 3.1.1 of Apple Frames has been released with support for a new passthrough output command. This post has been updated to reflect the changes. You can redownload the updated shortcut at the end of this post.


Today, I’m happy to introduce something I’ve been working on for the past couple of months: Apple Frames – my shortcut to put screenshots captured on Apple devices inside physical device frames – is getting a major upgrade to version 3.1. In addition to support for more devices that I missed in version 3.0, as well as some bug fixes, Apple Frames 3.1 brings a brand new API that lets you automate and extend the Apple Frames shortcut itself.

By making Apple Frames scriptable, I wanted to allow power users – such as designers and developers who rely on this shortcut to frame hundreds of images each week – to save valuable time without compromising the accessible nature of Apple Frames for other people. This is why all of the new advanced features of Apple Frames are optional and hidden until you go look for them specifically. Furthermore, even if you do want to use the Apple Frames API, you’ll see that I designed it in the spirit of Shortcuts: it does not require any code and it’s entirely powered by simple, visual ‘Text’ actions.

I’m incredibly excited about what Apple Frames can do in version 3.1, so let’s dive in.
