There’s something about the design of this machine that I can’t escape. The footprint is pretty similar between the two notebooks, but in my backpack, there’s a huge difference. Don’t get me wrong: I am thrilled that the MacBook Pro has beefed up to be a better computer, but I’m drawn to the clean, simple look of the Air. I know the Pro is a better match for my workflows, but the Air can do everything I need — if just a little bit slower. And I don’t care about that speed difference any time I pick up the Air to take it with me. Something about it just clicks with me in a way I didn’t anticipate.
I completely understand where Stephen is coming from on this. On paper, the MacBook Pro’s advantages are undeniable, but they’re also expected. It’s a bigger, heavier ‘pro’ computer with fans, after all.
In contrast, the M2 Air feels like magic, despite the M1 version that preceded it. The performance boost from the M2 SoC and features like a bigger, brighter screen and a higher memory option are part of it, but so, too, is the fact that the new Air even looks like a MacBook Pro. Yet the M2 MacBook Air remains as svelte and silent as the laptop it replaces, which feels improbable if not impossible. Like Stephen's, my heart has been captured by the MacBook Air, and I don’t see myself switching to a different Mac laptop anytime soon.
It’s not unusual for Apple keynotes to feature gaming. Sometimes it’s about Apple Arcade, and other times it’s a demo of a third-party title coming to one of the company’s platforms. However, this year’s WWDC keynote was a little different, sprinkling developer-focused gaming announcements throughout the presentation and focusing on the upcoming release of No Man’s Sky and Resident Evil Village on the Mac. With Metal 3, controller functionality that continues to be extended, and an emphasis on titles with name recognition, many came away wondering if Apple is trying to position its latest Macs as legitimate challengers to high-end gaming PCs.
Just slightly larger than an Xbox Series S by volume and with ultra-low power consumption, the Mac Studio is unlike any other high-performance PC. Digital Foundry came away impressed with the technical details of the M1 Ultra SoC, which held its own against high-end Intel CPUs and was in the ballpark in comparison to top GPUs:
The M1 Ultra is an extremely impressive processor. It delivers CPU and GPU performance in line with high-end PCs, packs a first-of-its-kind silicon interposer, consumes very little power, and fits into a truly tiny chassis. There’s simply nothing else like it. For users already in the Mac ecosystem, this is a great buy if you have demanding workflows.
These results are really just for evaluating raw performance though, as the Mac is not a good gaming platform. Very few games actually end up on Mac and the ports are often low quality. If there is a future for Mac gaming it will probably be defined by “borrowing” games from other platforms, either through wrappers like Wine or through running iOS titles natively, which M1-based Macs are capable of. In the past, Macs could run games by installing Windows through Apple’s Bootcamp solution, but M1-based chips can’t boot natively into any flavour of Windows, not even Windows for ARM.
Gaming on Mac has historically been quite problematic and that remains the case right now - native ports are thin on the ground and when older titles such as No Man’s Sky and Resident Evil Village are mooted for conversion, it’s much more of a big deal than it really should be. Perhaps it’s the expense of Apple hardware, perhaps it’s the size of the addressable audience or maybe gaming isn’t a primary use-case for these machines, but there’s still the sense that outside of the mobile space (where it is dominant), gaming isn’t where it should be - Steam Deck has shown that compatibility layers can work and ultimately, perhaps that’s the route forward. Still, M1 Max and especially M1 Ultra are certainly very capable hardware and it’ll be fascinating to see how gaming evolves on the Apple platform going forward.
Digital Foundry’s results highlight that tech specs are necessary but not sufficient for videogame industry success. The Mac hasn’t been in the same league as high-end gaming PCs for a long time, and tech specs historically were just one of the issues. Given Apple’s lackluster history in desktop gaming, it’s fair to be skeptical about whether the company can attract the developers of current-generation, top-tier games to the Mac. Still, for the optimists in the crowd, the power of the M1 Ultra has brought the Mac a long way from where it stood during the Intel-based days as a gaming platform. Personally, I’m a skeptical optimist with one foot in each camp. The hardware is heading in the right direction, but the jury’s still out on the software and Apple’s business plan to attract game developers.
For these reasons, as you can imagine, when Apple got in touch with me last November asking if I wanted to try out one of the new MacBook Pros with the M1 Max chip, I welcomed their suggestion with a mix of surprise, trepidation, and, frankly, genuine curiosity. What could I, a longtime iPad user, even contribute to the discourse surrounding the comeback of the Mac lineup, the performance of Apple silicon, and the reality of modern Mac apps?
But I was intrigued by the proposal regardless, and I said yes. I was very skeptical of this experiment – and I told Apple as much – but there were a few factors that influenced my decision.
First and foremost, as many of you have probably noticed, I’ve grown increasingly concerned with the lack of pro software (both apps and OS features) in the iPad Pro lineup. As I wrote in my review last year, iPadOS 15 was, by and large, a quality-of-life update that made iPadOS more approachable without breaking any new ground for existing pro users of the platform. As much as I love the iPad, at some point I have to face its current reality: if Apple thinks iPadOS isn’t a good fit for the kind of functionality people like me need, that’s fine, but perhaps it’s time to try something else. If my requirements are no longer aligned with Apple’s priorities for iPadOS, I can switch to a different computer. That’s why I believe 2022 – and the upcoming WWDC – will be a make-or-break year for iPad software. And I don’t think I’m the only iPad user who has felt this way.
Second, the arrival of Shortcuts on macOS Monterey gave me an opportunity to expand and rethink another major area of coverage for MacStories, which is automation. Along with iPad and iOS, I consider Shortcuts the third “pillar” of what I do at MacStories: with the Shortcuts Archive, Shortcuts Corner and Automation Academy on Club MacStories, and Automation April, I’m invested in the Shortcuts ecosystem and I know that our readers depend on us to push the boundaries of what’s possible with it. With Shortcuts on macOS, I felt a responsibility to start optimizing my shortcuts for Mac users. That meant learning the details of the Shortcuts app for Mac and, as a result, using macOS more. From that perspective, Apple’s review unit couldn’t have come at a better time.
Third, and perhaps most important to me and least helpful for you all, is one of my greatest fears: becoming irrelevant in what I do. As a writer, I guess I shouldn’t say this; I should say that I write for me, and that I would write regardless, even if nobody read my stuff. But as a business owner and someone who’s gotten used to having a medium-sized audience, that would be a lie. I love the fact that I can write for my readers and get feedback in return. I love that I can write something that is wrong and be corrected by someone. I don’t want to lose that. Do you know what’s a really easy way to make it happen? Grow into someone who’s stuck in their ways, only writes about a certain topic, and doesn’t think anything else is worth trying or even remotely considering. In my case, I don’t want to look back at MacStories in 10 years and regret I didn’t at least try macOS again because I was “the iPad guy” and I was “supposed to” only write about a specific topic. I make the rules. And the rule is that curiosity is my fuel and I was curious to use macOS again.
So that’s my context. For the past six months, I’ve been using my MacBook Pro instead of the iPad Pro to get my work done on a daily basis. I’ve kept using the iPad Pro to test my shortcuts, read articles, and write in places where I didn’t have enough room for a MacBook, but, by and large, I’ve lived the macOS lifestyle for half a year by now.
As we head into WWDC, here’s my story on how this experiment went.
Thursday is Global Accessibility Awareness Day. To mark the occasion, Apple has announced a long list of accessibility features coming to its products later this year and shared other ways it is celebrating the day through its apps and services.
Using advancements across hardware, software, and machine learning, people who are blind or low vision can use their iPhone and iPad to navigate the last few feet to their destination with Door Detection; users with physical and motor disabilities who may rely on assistive features like Voice Control and Switch Control can fully control Apple Watch from their iPhone with Apple Watch Mirroring; and the Deaf and hard of hearing community can follow Live Captions on iPhone, iPad, and Mac. Apple is also expanding support for its industry-leading screen reader VoiceOver with over 20 new languages and locales. These features will be available later this year with software updates across Apple platforms.
Door Detection will be part of the Magnifier app later this year. The feature helps blind and low vision users find and navigate doors when they arrive somewhere. The feature will judge the distance to the door using LiDAR, describe the door’s attributes, like whether it opens by pushing or using a doorknob, and read signs and symbols next to doors.
Apple Watch Mirroring.
The Apple Watch will add several Physical and Motor accessibility features too. Apple Watch Mirroring, which is built in part on AirPlay, will allow users with physical and motor disabilities to control their Watches from an iPhone using Voice Control, Switch Control, voice commands, sound actions, head tracking, and Made for iPhone switches. The Apple Watch will also add a new double pinch gesture for actions like answering and ending phone calls and taking photos.
Apple Watch will add a new double pinch gesture.
For Deaf and hard of hearing users, Apple will add Live Captions on the iPhone, iPad, and Mac. Apple says captioning will work with video calling apps like FaceTime, streaming video services, video conferencing apps, and in-person conversations. Live Captions occur on-device to preserve privacy, and on the Mac, users will be able to type a response that will be spoken aloud.
VoiceOver will get an update, too, adding the following languages:
Mandarin (Liaoning, Shaanxi, Sichuan)
VoiceOver on the Mac will also gain Text Checker, a tool that discovers formatting issues.
Additional upcoming accessibility features.
Apple previewed several other upcoming accessibility features across its products, including:
Buddy Controller, the ability for someone to use a second game controller to assist with playing a game as though the two controllers were one
Siri Pause Time, which will allow users to customize the period Siri waits before responding to a user
Voice Control Spelling Mode, for dictating words letter-by-letter
Customizable sound recognition of the sounds in your environment
New themes and text adjustments in the Books app for a more accessible reading experience
Apple apps and services are celebrating Global Accessibility Awareness Day, too.
Also, Apple has announced that Global Accessibility Awareness Day is being celebrated with Apple Store sessions, an Accessibility Assistant shortcut in the Shortcuts app, special Fitness+ sessions and Time to Walk or Push episodes, an accessibility-oriented Maps guide, and highlighted content on the App Store and in Apple Books, Apple Podcasts, Apple Music, and Apple TV.
To mark Star Wars Day, Apple released a special Behind the Mac video featuring the sound designers of Skywalker Sound, who rely heavily on Apple devices. The film, which runs just under 17 minutes, is a fascinating look at how the Skywalker Sound team captures and mixes real-world sounds that they transform into otherworldly sound effects for the Star Wars movies and other films.
The Behind the Mac video, which is currently featured on Apple’s homepage, focuses on Skywalker Sound’s production process from start to finish. Behind the scenes, the group’s mixing and editing are backed by a lot of Apple hardware. Sound editor Ryan Frias is featured in the video touring Skywalker Sound’s central machine room, which is filled with racks of Mac Pros. According to Apple’s press release:
With the power of approximately 130 Mac Pro racks, as well as 50 iMac, 50 MacBook Pro, and 50 Mac mini computers running Pro Tools as their main audio application, along with a fleet of iPad, iPhone, and Apple TV devices, Skywalker is advancing sound artistry and reshaping the industry.
That’s a lot of gear, but the video’s primary focus is on the creative process of the professionals at Skywalker Sound, who have been pushing the boundaries of sound design for many years. If you enjoy getting a behind-the-scenes look at creative people doing cutting-edge work, I recommend checking out Behind the Mac: Skywalker Sound.
Longtime MacStories readers may be familiar with UI Browser, an incredible scripting tool for macOS created by Bill Cheeseman. UI Browser lets you discover the AppleScript structure of an app’s menu system, taking advantage of Apple’s Accessibility APIs to make it easier to script UI, which is not – how do I put this – normally “fun”, per se. Cheeseman, having turned 79 years old, has decided it is now time to “bring this good work to a conclusion”, and the app will be retired in October.
Long story as short as possible: “Regular” AppleScript scripting is accomplished using the programming syntax terms defined in scriptable apps’ scripting dictionaries. If you’ve ever merely tinkered with writing or tweaking AppleScript scripts, this is almost certainly what you know. But as an expansion of accessibility features under Mac OS X, Apple added UI scripting — a way to automate apps that either don’t support AppleScript properly at all, or to accomplish something unscriptable in an otherwise scriptable app. UI scripting is, basically, a way to expose everything accessible to the Accessibility APIs to anyone writing an AppleScript script. It isn’t an API per se, but a way to automate the things you — a human — can do on screen.
A great idea. The only downside: scripting the user interface this way is tedious (very verbose) at best, and inscrutable at worst. Cheeseman’s UI Browser makes it easy. Arguably — but I’ll argue this side — “regular” AppleScript scripting is easier than “UI” AppleScript scripting, but “UI” AppleScript scripting with UI Browser is easier than anything else. UI Browser is both incredibly well-designed and well-named: it lets you browse the user interface of an app and copy the scripting syntax to automate elements of it.
I first covered UI Browser in 2019, when I published a story on how I could control my Mac mini from the iPad Pro using Luna Display and some AppleScript, which I was able to learn thanks to UI Browser. I then mentioned UI Browser twice last month for Automation April: it was thanks to the app that I managed to create shortcuts to toggle the Lyrics and Up Next sidebars in the Music app for Monterey. Maybe it’s silly, but I think there’s something beautiful in the fact that the last thing I did with UI Browser was bridging the old world of AppleScript with the modern reality of Shortcuts.
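To make the verbosity concrete, here’s a minimal sketch of the kind of System Events incantation UI Browser helps you discover. It attempts to toggle the Lyrics sidebar in Music via its View menu; the exact menu path is an assumption on my part and may differ across macOS versions, and running it requires granting Accessibility permission to whatever executes the script:

```applescript
-- Hedged sketch: UI scripting via System Events.
-- The "Lyrics" menu item path is illustrative and may vary by macOS version.
-- Requires Accessibility permission for the script runner.
tell application "System Events"
	tell process "Music"
		set frontmost to true
		click menu item "Lyrics" of menu "View" of menu bar 1
	end tell
end tell
```

Writing that nested `menu item … of menu … of menu bar 1` chain by hand is exactly the tedium UI Browser spares you: you browse to the element and copy the generated reference instead.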
Gruber argued that Apple should acquire UI Browser and make it part of their built-in scripting tools for macOS; while I don’t disagree, I think it’s more realistic to hope another indie developer or studio picks up UI Browser and continues developing it for as long as possible. There’s nothing else like it on the market, and I’d like to thank Bill Cheeseman for his amazing work on this application over the years. It’ll be missed.
For the past week, I’ve been rethinking my approach to time tracking with the Timery app with a focus on simplicity and automation. I appreciate the insights into my habits and patterns afforded by time tracking and Timery’s excellent Reports view, but lately I’ve felt like my setup with projects, tags, and sub-tasks was too convoluted since it was based on a structure I designed years ago.
My daily routine is different now – and it’ll continue to change in 2022 – and I wanted to get rid of the overhead caused by a time tracking system that was too granular. For time tracking to be effective, you need to remember to start a timer whenever you’re working on something; too much friction in the process – such as having to carefully pick from a list of similar projects – defeats the whole purpose of it. There’s also the opposite problem – forgetting to stop a long-running timer – which John explained and fixed in a separate story for Automation April.
So I went back to the drawing board of my Timery projects and reorganized everything with simplicity and ease of activation in mind. I cleaned up my saved timers and the shortcut that activates them, which I can now trigger system-wide via Raycast on the Mac and the Shortcuts widgets on iPad. I split my work projects into three main areas – MacStories, Club, and podcasts – removed redundant sub-tasks, and grouped related activities under the same tags for more reliable filtering.
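For the Mac side of this setup, one way to wire a launcher like Raycast to a shortcut is through the Shortcuts Events scripting interface that shipped with Shortcuts on Monterey. This is a minimal sketch, and the shortcut name “Saved Timers” is a placeholder for whatever your own timer-picking shortcut is called:

```applescript
-- Minimal sketch: running a Shortcuts shortcut from AppleScript on Monterey.
-- "Saved Timers" is a placeholder name; substitute your own shortcut.
tell application "Shortcuts Events"
	run the shortcut named "Saved Timers"
end tell
```

A launcher can invoke this snippet (or run the shortcut directly, if it has native Shortcuts support) so the saved-timer menu is one keystroke away anywhere in macOS.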
How I access my saved timers from the Home Screen.
The approach worked well for MacStories and the Club, but podcast timers turned out to be a different beast. You see, when I sit down to record a show like Connected or AppStories, I need to take care of key tasks such as making sure my audio inputs are correct, checking out notes for the show’s outline and intro, and keeping an eye on the Connected audience in Relay’s Discord server. These tasks distract me from time tracking and, as a result, I often forget to start a timer for when I begin recording and, conversely, stop the timer when I’m done. I could automatically start a timer when a calendar event for a show is due in my calendar, but that also doesn’t work for me since it doesn’t account for the time before we actually record the show when I may be chatting privately with Myke and Stephen. Wouldn’t it be great if there were One True Way to automatically start tracking my real recording time when I start talking into the microphone for a show?
As it turns out, thanks to the latest update to Audio Hijack – the new version 4.0 that recently launched on macOS – there is. So for this week’s Automation April story, I’m sharing the custom system I created to trigger a single shortcut that starts time tracking in Timery based on the show I’m recording in that specific moment. Let’s take a look.
Last week, I shared an initial batch of 10 shortcuts I prepared for Automation April here on MacStories. I’m back this week with another set of 10 shortcuts that encompass a variety of platforms, app integrations, and functionalities. In this week’s collection, you’ll find even more shortcuts to speed up macOS multitasking; a shortcut that makes it easy to create a calendar event starting from a date; and a couple of shortcuts for Markdown and Obsidian users too.
I’m having a lot of fun sharing these sets of shortcuts for Automation April. So once again, let’s dive in.
ComicTrack is a new app from Joe Kimberlin for tracking and organizing the comic books you’ve read, are currently reading, and want to read. The idea is similar to his excellent videogame tracking app, GameTrack, but applied to comics. I’ve been using ComicTrack for a couple of weeks now, and it’s excellent. Of course, you’ll also need a separate app to read your comics, but having a dedicated app to organize what I’m reading and plan to read has single-handedly gotten me back into reading comics more regularly.
ComicTrack on the iPhone.
For context, I’ve always been a very casual comic book fan. I have Comixology and Shonen Jump installed on my iPad mini, but I often go weeks or months without touching either. It’s not a lack of interest so much as a combination of not knowing where to start and having limited free time. ComicTrack hasn’t magically granted me more free time, but it has made getting started much easier by reducing the friction of picking something to read.
ComicTrack on the 12.9” iPad Pro in dark mode.
I’m going to focus on the iPad version of ComicTrack because that’s where I read most of my comics, but the app also works and looks great on the iPhone and Mac, as you’ll see in the screenshots below. Everything syncs over iCloud too. Also, although I spend most of my time using the iPad version of ComicTrack, I find the iPhone and Mac apps indispensable for collecting new ideas of things to read. I run across comic books all the time via RSS, Twitter, podcasts, and friends’ recommendations. That doesn’t just happen when I’m using my iPad, but with ComicTrack on all the platforms I use, it’s easy to add something that catches my eye.