Federico Viticci

9531 posts on MacStories since April 2009

Federico is the founder and Editor-in-Chief of MacStories, where he writes about Apple with a focus on apps, developers, iPad, and iOS productivity. He founded MacStories in April 2009 and has been writing about Apple ever since. Federico is also the co-host of AppStories, a weekly podcast exploring the world of apps, and Unwind, a fun exploration of media and more.

He can also be found on his other podcasts on Relay FM: Connected and Remaster, two shows about Apple and videogames, respectively.


How ChatGPT Changed Tech Forever

I thoroughly enjoyed this story from a couple of weeks ago by David Pierce, writing for The Verge about OpenAI’s ChatGPT turning one and how it created a revolution in the tech industry that no one saw coming:

We definitely seem to like being able to more quickly write business emails, and we like being able to ask Excel to “make this into a bar graph” instead of hunting through menus. We like being able to code just by telling ChatGPT what we want our app to do. But do we want SEO-optimized, AI-generated news stories to take over publications we used to love? Do we want AI bots that act like real-life characters and become anthropomorphized companions in our lives? Should we think of AI more as a tool or a collaborator? If an AI tool can be trained to create the exact song / movie / image / story I want right now, is that art or is that dystopia? Even as we start to answer those questions, AI tech seems to always stay one step and one cultural revolution ahead.

At the same time, there have been lawsuits accusing AI companies of stealing artists’ work, to which multiple US judges have said, essentially: our existing copyright laws just don’t know what to do with AI at all. Lawmakers have wrung their hands about AI safety, and President Joe Biden signed a fairly generic executive order that instructed agencies to create safety standards and companies to do good and not evil. There’s a case to be made that the AI revolution was built on immoral and / or illegal grounds, and yet the creators of these models and companies continue to confidently go ahead with their plans, while saying it’s both impossible and anti-progress to stop them or slow them down.

This all gets really heady really fast, I know. And the truth is, nobody knows where all this will be even 12 months from now, especially not the people making the loudest predictions. All you have to do is look at recent hype cycles — the blockchain, the metaverse, and many others — for evidence that things don’t usually turn out the way we think. But there’s so much momentum behind the AI revolution, and so many companies deeply invested in its future, that it’s hard to imagine GPTs going the way of NFTs.

I recommend reading the whole piece on The Verge. I quoted these paragraphs because they get right to the heart of the conflict that I also feel whenever I think about ChatGPT and similar tools. On the one hand, they were (largely? partially?) built with data sets stolen from artists and creators (including this very website); on the other, the practical benefits of, say, using ChatGPT to help me proofread my articles are undeniable.

I’ve been thinking about these issues a lot, perhaps because I make a living out of, well, creating content for the Internet. Is there a way to enjoy the power of LLMs without feeling weird and conflicted about how they were made in the first place? Will it even matter years from now? I don’t know the answer, but I’m hoping Apple will have one.

Permalink

Improving the Copy and Paste Prompts on iOS

I couldn’t agree more with all the suggestions proposed by Matt Birchler, who envisions a more flexible permission flow for clipboard access on iOS that is entirely in line with Apple’s current privacy prompts for other personal data.

Apple could even hide the “always allow…” option until the user had allowed an app to see the clipboard like 5 times in a row. That would avoid giving full access to apps that you don’t want to give it to, and it even helps keep the number of apps with this always access down. After saying “allow paste” in Parcel 100+ times in the past few years and never hitting no, it might be safe to let me just say “always allow” at this point, but maybe an app where I paste once in a blue moon doesn’t need it.

They could go the other way as well: if you deny an app a few times in a row, there could be a new option the next time that asks if you want to block this app from the clipboard forever.

And as they’ve done recently with location, photos, and calendar access, it could make sense to occasionally show an alert that tells the user that this app has access to your clipboard and how often it’s used that access in the last X days.

I strongly disliked the redesigned clipboard prompts in the first version of iOS 16 (a perfect example of user experience dictated by security engineers rather than designers at Apple), and I was relieved when the company improved the system with per-app clipboard settings in 16.1. Still, these clipboard prompts feel antiquated, user-hostile, and not intelligent at all. For starters, they should be consistent – like Matt suggests – with Apple’s other privacy prompts. Second, they should learn from user habits in terms of granting access or reminding people to review their apps with clipboard access.

Third, I can’t believe it’s still not possible for third-party developers to make a proper clipboard manager for iOS and iPadOS – a software category that continues to thrive on macOS. I was writing about this stuff 13 (!) years ago, and it’s wild that nothing has changed.
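
For context on what developers are working with today, here’s a minimal Swift sketch (illustrative names, not Apple’s internal logic) of how the current system behaves: metadata checks like hasStrings don’t trigger the paste prompt, while actually reading the pasteboard does. And since apps can’t observe the pasteboard from the background at all, a macOS-style clipboard manager still isn’t really possible.

```swift
import UIKit

// A minimal sketch of current UIPasteboard behavior, not Apple's prompt logic.
// Checking metadata (hasStrings, hasURLs) does not show the "Allow Paste"
// alert; reading the actual contents does, unless the user has chosen
// "Allow" for this app in Settings (the per-app option added in iOS 16.1).
func importFromClipboard(handler: (String) -> Void) {
    let pasteboard = UIPasteboard.general

    // Safe: only asks whether text exists; no permission alert is shown.
    guard pasteboard.hasStrings else { return }

    // This read is what surfaces the system paste prompt.
    if let text = pasteboard.string {
        handler(text)
    }
}
```

(UIPasteControl, introduced in iOS 16, is the prompt-free alternative: the user’s tap on the button itself counts as consent. But it only covers explicit, in-app paste buttons, not the background monitoring a real clipboard manager would need.)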

Permalink

Workflow Co-Founders Want to Bring AI to the Desktop

When I read earlier this year that Ari Weinstein – one of the co-founders of Workflow, the automation app Apple acquired and turned into Shortcuts – had left the company, I had a feeling he’d team up soon enough with Conrad Kramer, another Workflow co-founder. I was right. Alex Heath, writing for The Verge, has some initial details on Software Applications Incorporated, the new venture by Weinstein, Kramer, and Kim Beverett, another Apple vet you may remember from the original Siri Shortcuts demo at WWDC 2018:

In their first interview since leaving Apple to start something new, the trio tells me that their focus is on bringing generative AI to the desktop in a way that “pushes operating systems forward.” While they don’t have a product to show off yet, they are prototyping with a variety of large language models, including OpenAI’s GPT and Meta’s Llama 2. The ultimate goal, according to Weinstein, is to recreate “the magic that you felt when you used computers in the ’80s and ’90s.”

“If you turned on an Apple II or an Atari, you’d get this basic console where you could type in basic code as a user and program the computer to do whatever you wanted,” he explains. “Nowadays, it’s sort of the exact opposite. Everybody spends time in very optimized operating systems with pieces of software that are designed to be extremely easy to use but are not flexible.”

An example he gives: “Sometimes you’ve got a browser window open with a schedule on it, and you just want to say, ‘add this to my calendar,’ and somehow, there’s no way to do that… We think that language models and AI give us the ingredients to make a new kind of software that can unlock this fundamental power of computing and make everyday people able to use computers to actually solve their problems.”

They don’t have a product to show yet, but I’ll say this: if there’s anyone out there who can figure out how to turn generative AI into something more than a text prompt or writing assistant for Word and Notion – something that can be truly integrated with your computer, your data, and, well, your workflow – it’s this trio. I absolutely can’t wait to learn more about what they’re working on.
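
To make Weinstein’s ‘add this to my calendar’ example concrete, here’s a purely hypothetical Swift sketch (not a preview of their product): assume some LLM – GPT, Llama 2, whatever – has already parsed the schedule on screen into a structured event, and EventKit then handles the part that actually touches your calendar.

```swift
import EventKit
import Foundation

// Hypothetical sketch of the "add this to my calendar" idea, not
// Software Applications Incorporated's actual product. Assume an LLM
// has already extracted this structured form from on-screen text.
struct ExtractedEvent {
    let title: String
    let start: Date
    let end: Date
}

func addToCalendar(_ extracted: ExtractedEvent) async throws {
    let store = EKEventStore()

    // iOS 17 / macOS 14 API; earlier systems use requestAccess(to:).
    guard try await store.requestFullAccessToEvents() else {
        throw CocoaError(.userCancelled)
    }

    let event = EKEvent(eventStore: store)
    event.title = extracted.title
    event.startDate = extracted.start
    event.endDate = extracted.end
    event.calendar = store.defaultCalendarForNewEvents

    try store.save(event, span: .thisEvent)
}
```

The hard, unsolved part is everything upstream of that function: reliably turning messy on-screen text into that structured form, which is presumably where the language models come in.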

Also worth noting: the company’s website (great domain, too) is a delightfully retro, browser-based emulation of Mac OS 8.

Permalink

Using the iPad Pro as a Portable Monitor for My Nintendo Switch with Orion, a Capture Card, and a Battery Pack

Tears of the Kingdom on my iPad Pro.

Those who have been reading MacStories for a while know that I have a peculiar obsession with portable setups free of the constraints typically involved with working at a desk or playing games in front of a TV.

It’s not that I don’t want to have a desk or dislike my 65” OLED TV: it’s that I don’t want those contexts to be my only options when it comes to getting work done or playing videogames. This is why I’ve spent the better part of my career fine-tuning my iPad-first lifestyle and why I’m so excited at the prospect of a giant screen that can always be with me. Modularity, portability, and freedom from a desk or TV are the driving factors in everything I use or buy these days.

For these reasons, it shouldn’t come as a surprise that I embraced the ability to use the iPad Pro as a portable monitor for videogame consoles thanks to UVC support. As I covered in my iPadOS 17 review, this feature was primarily conceived to let iPad users connect external webcams to their computers, but that hasn’t stopped developers from reusing the same underlying technology to create apps that allow you to display a video feed from any accessory connected via USB.
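
For developers, the relevant hook is AVFoundation’s support for external video devices in iPadOS 17, which treats a USB capture card as just another camera. A rough sketch of the idea (not what Orion or Genki Studio actually ship, and minus error handling and the required camera permission) might look like this:

```swift
import AVFoundation
import UIKit

// A rough sketch of iPadOS 17's external-device support in AVFoundation:
// discover a UVC device (e.g. an HDMI capture card) over USB-C and
// render its video feed full-screen on the iPad's display.
final class CaptureViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // .external is the new iOS 17 device type for UVC accessories.
        let discovery = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.external],
            mediaType: .video,
            position: .unspecified
        )
        guard let captureCard = discovery.devices.first,
              let input = try? AVCaptureDeviceInput(device: captureCard),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Show the incoming feed.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspect
        view.layer.addSublayer(previewLayer)

        // Start the session off the main thread, per AVFoundation guidance.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```

Apps like Orion and Genki Studio presumably layer scaling, aspect-ratio controls, and latency tweaks on top of a pipeline along these lines.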

It’s a very intriguing proposition: the 12.9” iPad Pro has a gorgeous mini-LED display; what if you could use that to give yourself a little extra screen real estate when playing Super Mario Bros. Wonder or Tears of the Kingdom without having to pack a separate portable monitor with you?

In my review, I mentioned the Genki Studio app, which I used to play games with my Nintendo Switch and ROG Ally and output their video feeds to the iPad Pro’s display. Today, I want to explain how I took my setup a step further by enhancing the picture quality of the Nintendo Switch when viewed on the iPad Pro and, most importantly, how I created a fully portable setup that allows me to play Switch games on the iPad Pro anywhere I am.

Read more


Apple Announces Pricing and Availability Dates for New MacBook Pros and iMac with M3 Chips

Spooky Ternus.

At this evening’s ‘Scary Fast’ Apple event, the company announced its new generation of M3 chips – the M3, M3 Pro, and M3 Max – which will power three new models of MacBook Pro and an updated iMac, the latter receiving its first refresh since the M1-based redesign introduced in 2021.

We’ll have more detailed overviews of the new computers up on MacStories soon, but in the meantime, here’s a breakdown of when you can expect to get your hands on these new machines:

Read more


iOS 17.2 Beta’s Sticker Reactions Need a Different Approach

The first developer beta of iOS 17.2 was released earlier today, and among a variety of new features (I’ve been sharing some of the highlights on my Mastodon), there’s the highly anticipated expansion of Tapbacks with custom sticker reactions.

The problem is that, put simply, this feature just isn’t good enough in this first version of iOS 17.2. And since I’m always told to “file feedback early in the process to make sure things get seen”, and since blogging about iOS feature requests on my website makes me feel better than begrudgingly filing actual feedbacks about them, here we are.

Jason Snell, writing for Six Colors:

This new feature has no connection at all with the fun double-tap gesture that’s synonymous with Tapbacks. I didn’t expect stickers to be a peer to Apple’s classic collection of six Tapback icons, but I did sort of assume that at the very least, performing the Tapback gesture would also give you the option of choosing a sticker. (And the right thing for Apple to do would be to display recently used stickers alongside the Tapback icons.)

Instead, to send a sticker response you have to tap and hold on a message and then choose Add Sticker from the resulting contextual menu, then choose a sticker or emoji. It’s an extra step that really shouldn’t be necessary and makes stickers feel like an afterthought, which they apparently are.

I get why Apple doesn’t want to let users customize the default roster of “official” Tapbacks. iMessage is used by hundreds of millions of people every day, and they don’t want to overcomplicate an established feature with too many options. However, I think a much better compromise would be the following:

  • Align custom sticker reactions with regular Tapbacks in the message bubble so they don’t cover text;
  • Make the ‘Add Sticker’ button appear when you double-tap a message instead of requiring a long-press.

That’s it. I really like this feature, but the design isn’t quite there yet. Hopefully, there’s enough time (and willingness on Apple’s part) to change it.

Permalink

With Version 1.2, Matter Now Supports over 20 Device Types

Jennifer Pattison Tuohy, writing for The Verge on the latest update to Matter, the interoperable home automation standard:

Matter — the IOT connectivity standard with ambitions to fix the smart home and make all of our gadgets talk to each other — has hit version 1.2, adding support for nine new types of connected devices. Robot vacuums, refrigerators, washing machines, and dishwashers are coming to Matter, as are smoke and CO alarms, air quality sensors, air purifiers, room air conditioners, and fans. It’s a crucial moment for the success of the industry-backed coalition that counts 675 companies among its members. This is where it moves from the relatively small categories of door locks and light bulbs to the real moneymakers: large appliances.

And:

While it’s possible today to get your lights to flash when your laundry is done, turn a light red when your fridge’s temperature rises, or shut off the HVAC system if the smoke alarm goes off, it can be complicated to set up and often wholly unreliable. You need to download multiple apps, maybe buy a sensor or two, deal with laggy cloud integrations, and worry about whether your washer is even compatible with your smart home app in the first place. With Matter support, this type of simple command and control should be much easier to implement in any ecosystem.

I spent the past year connecting as many parts of my new home as possible (our apartment is powered by KNX; I had this bridge installed to bring lights, temperature sensors, and shutters into HomeKit), so, as you can imagine, I’m very much on board with the idea of my accessories being compatible with multiple ecosystems at once. We live in a “mixed assistant” household (we use Siri in English and Alexa in Italian), so the idea behind Matter is exactly the kind of technology we’re looking for.

The problem, at least from my perspective, is that I have very little hope that Apple will support the new Matter device types in its Home app anytime soon.

Apple’s Home app is, by far, the UI I use most for manually controlling my smart home, whether from the Home app itself or Control Center. It’s leagues beyond the terrible design of the Alexa app, and its integration with automations and the Shortcuts app is incredible. But let’s be honest: the Home app already struggles to fully support device types that were added years ago, such as sprinklers; realistically, how long is it going to take Apple to integrate with robot vacuums and air purifiers?

The way I see it, any connected home standard is only as useful as the UI that lets you control its accessories. So while I’m excited about Matter and strongly believe in the initiative, the weak link for me remains Apple’s Home app.

Permalink

Apple Releases iOS and iPadOS 17.1 with New Apple Music Features, Small iPad Enhancements, and More

iOS 17.1.

Today, Apple released iOS and iPadOS 17.1 – the first major updates to the operating systems that launched (and I reviewed) in September. I’ll cut to the chase: these are not big updates, and they don’t come with new emoji. Instead, iOS and iPadOS 17.1 bring a variety of previously announced (and then delayed) features such as AirDrop over the Internet and new cover art templates in Music, but they don’t complete the full list of functionalities Apple originally announced for this OS cycle last June.

Let’s take a look.

Read more


Macintosh Desktop Experience: No Mac Is an Island

Among the perks of a Club MacStories+ and Club Premier membership are the special columns published periodically by John and me. In this week’s Macintosh Desktop Experience column, John explained how widgets in macOS Sonoma are the glue between apps and services that makes the Mac feel even more like part of an integrated ecosystem of platforms and devices:

The Mac’s place in users’ computing lives has changed a lot since Steve Jobs returned to Apple and reimagined the Mac as a digital hub. Those days were marked by comparatively weak mobile phones, MP3 players, camcorders, and pocket digital cameras that benefitted from being paired with the Mac and Apple’s iLife suite.

The computing landscape is markedly different now. The constellation of gadgets surrounding the Mac in Jobs’ digital hub have all been replaced by the iPhone and iPad – powerful, portable computers in their own right. That’s been a seismic shift for the Mac. Today, the Mac is in a better place than it’s been in many years thanks to Apple silicon, but it’s no longer the center of attention. Instead, it sits alongside the iPhone and iPad as capable computing peers.

What hasn’t changed from the digital hub days is the critical role played by software. In 2001, iLife’s apps enabled the digital hub, but in 2023, the story is about widgets.

Stay until the end of the story and don’t miss the photo of John’s desk setup, which looks wild at first, but actually makes a lot of sense in the context of widgets.

Macintosh Desktop Experience is one of the many perks of a Club MacStories+ and Club Premier membership and a fantastic way to recognize the modern reality of macOS and get the most out of your Mac thanks to John’s app recommendations, workflows, and more.


Permalink