If your summer’s going to be anything like mine, some of these things should sound familiar: you’re going to play around with the iOS and iPadOS 17 betas and get on some TestFlights for third-party apps; you’re going to spend some time at the beach or perhaps even travel abroad; and you’ll still be playing through The Legend of Zelda: Tears of the Kingdom, which is a ridiculously massive game well worth waiting six years for. If that last scenario applies to your life right now and in the near future, you’ll want to install TotK Travel Guide, which came out earlier this week for iPhone and iPad.
Fernando Silva of 9to5Mac came up with a clever workaround to have ‘clamshell mode’ in iPadOS 17 when an iPad is connected to an external display. The catch: it doesn’t really turn off the iPad’s built-in display.
Now before readers start spamming the comments, this is not true clamshell mode. True clamshell mode kills the screen of the host computer and moves everything from that display to the external monitor. This will not do that. But this workaround will allow you to close your iPad Pro, connect a Bluetooth keyboard and mouse, and still be able to use Stage Manager on an external display.
Essentially, the method involves disabling the ‘Lock / Unlock’ toggle in Settings ⇾ Display & Brightness that controls whether the iPad’s screen should lock when a cover is closed on top of it. This has been the iPad’s default behavior since the iPad 2 and the debut of the Smart Cover, and it still applies to the latest iPad Pro and Magic Keyboard: when the cover is closed, the iPad gets automatically locked. However, this setting can be disabled, and if you do, then sure: you could close an iPad Pro and continue using iPadOS on the external display without seeing the iPad’s built-in display. Except the iPad’s display is always on behind the scenes, which is not ideal.1
Still: if we’re supposed to accept this workaround as the only way to fake ‘clamshell mode’ in iPadOS 17, I would suggest some additions to improve the experience.
Excellent developer-focused take on the Vision Pro by David Smith, who also tested one last week at Apple Park. I particularly liked his reasoning for why it’s important to begin understanding a new Apple platform sooner rather than later:
Another reason I want to develop for visionOS from the start is that it is the only way I know for developing what I’ll call “Platform Intuition”.
This year watchOS 10 introduced a variety of structural and design changes. What was fascinating (and quite satisfying) to see was how many of these changes were things that I was already doing in Pedometer++ (and had discussed their rationale in my Design Diary). This “simultaneous invention” was not really all that surprising, as it is the natural result of my spending years and years becoming intimately familiar with watchOS and thus having an intuition about what would work best for it.
That intuition is developed by following a platform’s development from its early stages. You have to have seen and experienced all the attempts and missteps along the way to know where the next logical step is. Waiting until a platform is mature and then starting to work on it will let you skip all the messy parts in the middle, but also leave you with only answers to the “what” questions, not so much the “why” questions.
I want that “Platform Intuition” for visionOS and the only way I know how to attain it is to begin my journey with it from the start.
As Underscore concludes, Widgetsmith will be on visionOS from day one in 2024.
Chance Miller, writing last week at 9to5Mac, notes how Apple’s most expensive AirPods model is going to miss out on two key features announced at WWDC: Adaptive Audio (which blends Active Noise Cancellation and Transparency mode) and the new ‘Siri’ command that does not require saying ‘Hey’.
As my colleague Filipe Espósito also pointed out yesterday, the new “Siri” command is also exclusive to second-generation AirPods Pro. The same also applies to the new Faster Automatic Switching upgrade.
For context, AirPods Max are powered by two H1 chips, one in each ear cup. AirPods Pro 2 feature a next-generation H2 chip inside. Unsurprisingly, H1 + H1 does not equal H2.
I like my AirPods Max, but they’re over two years old at this point, and the gap between them and the second-generation AirPods Pro continues to grow.
The performance of noise cancellation is vastly superior on the AirPods Pro. I just had to travel 14+ hours back and forth between Italy and California for WWDC, so I was able to test AirPods Max on a plane for the first time since I bought them. They were fine, but I ultimately preferred using AirPods Pro because they removed more noise.
I hope Apple is working on an AirPods Max revision with support for H2, a foldable design, a new case, and support for the latest software features they just announced.
Earlier today in my Vision Pro story, I wondered about how Apple will showcase and set up the headset for customers in retail stores in the future.
For some excellent analysis on this topic, look no further than Michael Steeber’s latest issue of the Tabletops newsletter. Michael (who’s the leading expert on Apple retail stores) put together some fascinating thoughts on how Vision Pro could be marketed and demoed inside the stores, as well as how the product compares to AirPods Pro and Apple Watch from a retail perspective.
Ultimately, the onus of ushering in the era of spatial computing will be on the Specialists and Creatives. The Vision Pro retail experience must be guided from end to end. Apple Stores started as a place to educate, and as technology faded to the background, customers began to intuitively understand their tools and seek out the Apple Store as a product destination. But visionOS is a fundamentally new paradigm that thrusts the role of education front and center once again.
These are just some of the many new challenges and opportunities Vision Pro will bring to Apple Stores. The dawn of spatial computing transforms far more than just the way we interact with software. This new category of device will impel Apple to reshape the retail experience around a more immersive, personalized environment. It’s an incredibly exciting moment.
Check out the concepts and details Michael posted here.
I’m going to be direct with this story. My 30-minute demo with Vision Pro last week was the most mind-blowing moment of my 14-year career covering Apple and technology. I left the demo speechless, and it took me a few days to articulate how it felt. How I felt.
It’s not just that I was impressed by it, because obviously I was. It’s that, quite simply, I was part of the future for 30 minutes – I was in it – and then I had to take it off. And once you get a taste of the future, going back to the present feels…incomplete.
I spent 30 minutes on the verge of the future. I have a few moments I want to relive.
I’m in Cupertino for WWDC this week, and after yesterday’s whirlwind of announcements and surprises, I had some time to sit down with my 12.9” iPad Pro, install iPadOS 17 beta 1 on it, and try the improved version of Stage Manager. As you know, I have a…complicated history with the iPad’s latest multitasking system. Before coming here, I was worried Stage Manager would be left untouched without any updates for at least another year.
I’ll cut to the chase: Apple listened to feedback about Stage Manager and – at least so far – implemented the key improvements I wanted to see. I’ve been using Stage Manager on my iPad Pro since yesterday afternoon, and I even tested it on a portable external display that I brought with me for this trip. If this early, limited experience is any indication, I think I’m going to be happy with Apple’s revised version of Stage Manager for iPad by the end of the summer. But then again, caution is necessary given how last year’s beta evolved over time.
A few weeks ago on Mastodon, I shared a simple feature request: a split-screen mode for iPhone RSS readers that would allow me to scroll headlines in one half of the screen and preview actual articles in the other.
If this sounds familiar, you’re not alone: back in 2007, Steve Jobs demoed pretty much the same thing for the first version of the Mail app for iPhone OS 1.0. That layout mode never shipped, and probably rightfully so at the time given the limited screen real estate of the first iPhone.
I just published version 1.0.2 of S-GPT, the shortcut I released last week to have conversations with OpenAI’s ChatGPT and integrate it directly with native features of Apple’s OSes. You can find the updated download link at the end of this post, in the original article, and in the MacStories Shortcuts Archive; before you replace version 1.0.1 of S-GPT, save your existing OpenAI API key somewhere as you’ll have to paste it again in the shortcut later.
I’m going to include the full changelog for S-GPT 1.0.2 below, but long story short: S-GPT is now aware of the current date and time, and I’ve heard all the requests about improving interactions with the HomePod and Siri, so I made that part much better. S-GPT can now perform a variety of date/time calculations with natural language, and you can end a conversation by saying “no” or “stop”.
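S-GPT itself is built in Apple’s Shortcuts app, but the two changes described above boil down to simple ideas: prepend a system message carrying the current date and time so the model can handle date questions, and treat a bare “no” or “stop” from the user as the end of a voice conversation. Here is a minimal Python sketch of those two ideas; the function names and message layout are my own illustration, not code from the shortcut (the message format follows OpenAI’s standard chat `messages` array):

```python
from datetime import datetime

def build_messages(user_prompt, history=None):
    """Prepend a system message with the current date and time so the
    model can answer date/time questions correctly. (Illustrative; the
    real S-GPT shortcut does this inside Apple Shortcuts, not Python.)"""
    now = datetime.now().strftime("%A, %B %d, %Y at %H:%M")
    system = {"role": "system",
              "content": f"The current date and time is {now}."}
    return [system] + (history or []) + [{"role": "user", "content": user_prompt}]

# Words that end a voice conversation, per the 1.0.2 changelog.
STOP_WORDS = {"no", "stop"}

def is_end_of_conversation(reply):
    """Return True if the user's reply is a bare 'no' or 'stop',
    ignoring case, surrounding whitespace, and trailing punctuation."""
    return reply.strip().lower().rstrip(".!") in STOP_WORDS
```

The payload returned by `build_messages` would then be sent to the Chat Completions endpoint along with your API key; checking `is_end_of_conversation` on each spoken reply is what lets a Siri/HomePod session wind down naturally instead of looping forever.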