In my ongoing quest to try all the visionOS apps developers have been building over the past few months and turn the Vision Pro into my main computer (I’m making great progress on this front), I was pleased to see that 1Blocker is available as a native app and Safari extension on visionOS.
Just yesterday I was looking for ways to VNC into my gaming PC because I wanted to stream Like a Dragon: Infinite Wealth (an amazing game that you should play) on my Vision Pro using my new NDI encoder (I understand this is a very Ticci phrase; more on this topic soon).
Anyway, I went looking for Screens, my favorite VNC client, on the visionOS App Store, and it wasn’t available. I was surprised by its absence, but I just assumed the folks at Edovia were working on some fixes for the app running on the Vision Pro. I downloaded Jump Desktop, which worked pretty well, but I’ve never been a fan of the Jump Desktop UI, and I’d rather use Screens everywhere.
Right on schedule, Screens for iPad has now been made available on visionOS in compatibility mode. This is excellent news since I can now use Screens on all my Apple devices to quickly connect to my PC and Mac mini server; you can also check out how Screens’ trackpad mode works with visionOS’ gesture system in this blog post.
I received my Apple Vision Pro yesterday (for the full story of how it eventually found its way to Italy, you don’t want to miss the next episode of AppStories), and, as you can imagine, I’ve been busy downloading all the apps, learning my way around visionOS and – just today – using the extended Mac display mode. The first 24 hours with a Vision Pro are a whirlwind of curiosity and genuine nerd excitement, but despite my attention being pulled to a hundred different places, I’ve found the time to test one app in particular: Shortcut Buttons by Finn Voorhees.
I thoroughly enjoyed this story by Anil Dash on how the openness of podcasting (a topic I’ve covered before) has proven so resilient over the years and is, in fact, more relevant than ever in the era of federated social networks:
But here’s the thing: being able to say, “wherever you get your podcasts” is a radical statement. Because what it represents is the triumph of exactly the kind of technology that’s supposed to be impossible: open, empowering tech that’s not owned by any one company, that can’t be controlled by any one company, and that allows people to have ownership over their work and their relationship with their audience.
Side note: this is the first post I’m writing and publishing on MacStories directly from the Vision Pro, which I received this morning. A lot more to follow soon – including wherever you get our podcasts.
There’s no tech commentator better equipped to talk about the history of spatial interfaces in Apple operating systems than John Siracusa, and I enjoyed his latest, thought-provoking column on where visionOS and the Vision Pro’s gesture system fit in the spatial computing world:
Where Vision Pro may stumble is in its interface to the deep, spatial world it provides. We all know how to reach out and “directly manipulate” objects in the real world, but that’s not what Vision Pro asks us to do. Instead, Vision Pro requires us to first look at the thing we want to manipulate, and then perform an “indirect” gesture with our hands to operate on it.
Is this look-then-gesture interaction any different than using a mouse to “indirectly” manipulate a pointer? Does it leverage our innate spatial abilities to the same extent? Time will tell. But I feel comfortable saying that, in some ways, this kind of Vision Pro interaction is less “direct” than the iPhone’s touch interface, where we see a thing on a screen and then literally place our fingers on it. Will there be any interaction on the Vision Pro that’s as intuitive, efficient, and satisfying as flick-scrolling on an iPhone screen? It’s a high bar to clear, that’s for sure.
In yesterday’s review on The Verge, Nilay Patel shared a similar idea: it’s a strange feeling to use a computer that requires you to look at what you want to control at all times. I don’t know what to think about this yet since I don’t have a Vision Pro, but I’m curious to learn how this interaction method will scale over time as we start using this new platform on a daily basis. It’s quite fitting, however, that visionOS is based on the one Apple platform that supports both kinds of manipulation: pointer and touch.
Every once in a while, I come across a new app whose design, feature set, or combination of the two redefines my expectations for a particular category of software. The new Arc Search app for iPhone, which is launching today on the App Store as a separate app from The Browser Company’s previous Arc Companion utility, is one of those experiences.
From the first moment I tried Arc Search for iOS, I knew I wanted to use it as my default iPhone browser. And the reason isn’t that Arc Search does more than Safari (there’s actually a long list of missing features that I’ll cover below), but that, despite offering less functionality, the essence of how Arc Search rethinks one-handed web browsing on a phone is so refined and thoughtful that going back to another browser feels like a downgrade.
Arc Search has a long way to go to become a full-featured, mature browser for iOS, and it doesn’t even come with an iPad counterpart yet. But, at the same time, it’s the best take on mobile web browsing I’ve seen in years.
Yesterday, Apple began adding transcripts to Apple Podcasts, detailing the change on the Apple Podcasts for Creators site and making them available in iOS 17.4 developer beta 1.
This change is a huge win for accessibility, will surely improve searching in the Podcasts app, and makes quoting your favorite podcast an easy task by letting you copy and paste the text out – something I’ll definitely have to turn into a shortcut soon.
All these benefits are great in their own way and will make podcasts more shareable as a whole, allowing us to unlock so many people’s great ideas that are currently stored within hours of audio files and obscured behind URLs that point only to the show or episode as a whole.
However, I think Apple needs to go one step further and add timestamps to Apple Podcasts, a long-overdue feature that’d enable users to share links to individual moments within a podcast, directly to a specific point in the transcript.
I couldn’t agree more. From sharing to personal note-taking and research, there are several use cases I can think of to take advantage of timestamp links for podcast episodes – especially now that they have transcripts. (Pocket Casts, my favorite third-party podcast player, goes even further: it lets you share timestamp links and save private, time-synced bookmarks for specific parts of any episode.)
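For the curious, the mechanics of a timestamp link are simple: the player appends the playback position to the episode’s share URL. Here’s a minimal sketch in the style of Pocket Casts’ shared links; the base URL and the `t` query parameter are assumptions for illustration, not a documented format from Apple or Pocket Casts.

```python
# Hypothetical sketch of a timestamped share link, in the style of players
# like Pocket Casts that append the playback position to an episode URL.
# The "t" parameter name and the example URL are assumptions, not a spec.
from urllib.parse import urlencode


def timestamped_share_link(episode_url: str, position_seconds: int) -> str:
    """Append a playback position (in seconds) to an episode share URL."""
    # Use "&" if the URL already carries query parameters, "?" otherwise.
    separator = "&" if "?" in episode_url else "?"
    return f"{episode_url}{separator}{urlencode({'t': position_seconds})}"


# Example: link to the 12:34 mark of an episode (12 * 60 + 34 = 754 seconds).
link = timestamped_share_link("https://pca.st/episode/abc123", 12 * 60 + 34)
print(link)  # https://pca.st/episode/abc123?t=754
```

A receiving app (or web player) would then read that parameter and seek to the given position before playback – which is presumably what a transcript-aware version of Apple Podcasts would do when you tap a shared quote.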
I like Matthew’s suggestions for how Apple could implement this feature, and I’ll add: Apple has already built this system for the Music app. When the company added shareable lyrics to the Music app in iOS 14.5, they did so with the ability to share selected lyrics as a special “snippet” on iMessage that is actually an interactive, timestamped song preview based on a special URL. Here’s what I wrote:
Besides Apple’s custom implementation of lyrics selection in the share sheet, what’s also interesting about this is the method the company is using to share Apple Music lyrics URLs. Unlike regular music.apple.com links that reopen a particular song or album in the Music app or play a generic preview snippet in iMessage, lyrics URLs are timestamped: in iMessage, the lyrics card has a play button that will preview the lyrics you shared inline within a conversation; if you tap the link in iMessage and the same song is already paused in the Music app, the Now Playing screen will automatically advance to the section highlighted in shared lyrics.
I’m assuming that Apple is aware of this missing feature from the Podcasts app in iOS 17.4 beta 1; I have to believe their future implementation will be very similar to what already exists in Music.
Silver, writing on the Obsidian blog:
It has been nearly four years since the first line of code of Obsidian was written on January 31st, 2020. Today we’re thrilled to announce the winners of our fourth annual Gems of the Year awards!
This year the Obsidian community nominated 287 projects, including plugins, themes, tools, content, and templates. After our panel narrowed down the selection and the community voted on the entries, we’re now excited to announce the winners.
Solid list of plugins and themes for the best note-taking app out there, many of which I wasn’t familiar with or hadn’t tested yet. The Border theme looks stunning and I’m going to give it a try as my primary theme in the app; the Home Tab plugin does exactly what I want from a new empty tab in Obsidian (a search box + recently opened files); Omnivore, which I’m testing as my read-later app after they added better Shortcuts integration, does a fantastic job syncing highlights to Obsidian with its plugin. Go check out this list of gems if you haven’t yet.
Interesting perspective by David Pierce, writing for The Verge, on how, for the time being, Vision Pro users may have to use Safari to access popular services more than they anticipated:
But what if you don’t need the App Store to reach Apple users anymore? All this corporate infighting has the potential to completely change the way we use our devices, starting with the Vision Pro. It’s not like you can’t use Spotify on the headset; it’s just that instead of tapping a Spotify app icon, you’ll have to go to Spotify.com. Same for YouTube, Netflix, and every other web app that opts not to build something native for the Vision Pro. And for gamers, whether you want to use Xbox Game Pass or just play Fortnite, you’ll also need a browser. Over the last decade or so, we’ve all stopped opening websites and started tapping app icons, but the age of the URL might be coming back.
If you believe the open web is a good thing, and that developers should spend more time on their web apps and less on their native ones, this is a big win for the future of the internet. (Disclosure: I believe all these things.) The problem is, it’s happening after nearly two decades of mobile platforms systematically downgrading and ignoring their browsing experience. You can create homescreen bookmarks, which are just shortcuts to web apps, but those web apps don’t have the same access to offline modes, cross-app collaboration, or some of your phone’s other built-in features. After all this time, you still can’t easily run browser extensions on mobile Safari or mobile Chrome. Apple also makes it maddeningly complicated just to stay logged in to the services you use on the web across different apps. Mobile platforms treat browsers like webpage viewers, not app platforms, and it shows.
As we saw when we surveyed the state of apps already submitted to the visionOS App Store, more companies than we expected have – for now – decided not to offer their apps on the Vision Pro, either in the form of native visionOS apps or iPad apps running in compatibility mode.
I think that “for now” is key here: if visionOS proves to be a successful platform in the long term (and early sales numbers for the Vision Pro seem encouraging), most companies won’t be able to afford to ignore it. And why would they? If the users are there, why shouldn’t they provide those users with a better app experience?
This idea is predicated upon the assumption that native apps still offer a superior experience compared to their web counterparts. But the tide has been turning over the past few years. Workflows that would have been unthinkable in a web browser until a few years ago (such as design and collaboration) can now live in a browser; the most popular AI service in the world is literally a website; and the resurgence of browsers (with Arc arguably leading the space) proves that a new generation of users (who likely grew up with Chromebooks in school) doesn’t mind working inside a browser.
With this context in mind, I think Apple should continue improving Safari and extend its capabilities on visionOS. My understanding is that, in visionOS 1.0, Safari cannot save PWAs to the user’s Home Screen; I wouldn’t be surprised if that feature gets added before visionOS 2.0.