This Week's Sponsor:

Notify

Monitor Websites & RSS Feeds. Private. Simple. No Servers Required.


Posts in Linked

Wallpaper Interviews Apple’s Alan Dye and Richard Howarth

Today, Wallpaper published an interview with Alan Dye, Apple’s Vice President of Human Interface Design, and Richard Howarth, Vice President of Industrial Design. It’s a fantastic read with some great images, including an exploded view of the Vision Pro’s components.

Something I noticed as soon as I unboxed the Apple Vision Pro was how approachable it was. The setup process was easy, well-paced, and felt natural. That carries through to the hardware itself, too, which Dye explained to Wallpaper:

We wanted people around you to also feel comfortable with you wearing it, and for you to feel comfortable wearing it around other people. That’s why we spent years designing a set of very natural, comfortable gestures that you can use without waving your hands in the air. That’s also why we developed EyeSight, because we knew more than anything, if we were going to cover your eyes, that takes away much of what is possible when you connect with people. Getting that right was at the core of the concept of the product because we wanted people to retain those connections in their actual world.

My very early impression is that Apple’s design team accomplished its goal. Howarth puts a slightly different spin on the same message:

There’s a hardness and precision to the front of the product that is completely technical and feels like it’s been sent from the future, but then everything else that connects the product to you is soft and really approachable, so you feel cushioned and there’s not a barrier to putting it on or taking it off. And in fact, it should be a pleasure.

Nobody is going to mistake the Vision Pro for something it’s not. Still, the care that has gone into its design goes a long way toward making a device that is completely foreign to many people feel far less intimidating. That’s something uniquely Apple, and it’s why I’m optimistic about Vision Pro’s long-term prospects.

Permalink

On Vision Pro’s Spatial Computing

There’s no tech commentator better equipped to talk about the history of spatial interfaces in Apple operating systems than John Siracusa, and I enjoyed his latest, thought-provoking column on where visionOS and the Vision Pro’s gesture system fit in the spatial computing world:

Where Vision Pro may stumble is in its interface to the deep, spatial world it provides. We all know how to reach out and “directly manipulate” objects in the real world, but that’s not what Vision Pro asks us to do. Instead, Vision Pro requires us to first look at the thing we want to manipulate, and then perform an “indirect” gesture with our hands to operate on it.

Is this look-then-gesture interaction any different than using a mouse to “indirectly” manipulate a pointer? Does it leverage our innate spatial abilities to the same extent? Time will tell. But I feel comfortable saying that, in some ways, this kind of Vision Pro interaction is less “direct” than the iPhone’s touch interface, where we see a thing on a screen and then literally place our fingers on it. Will there be any interaction on the Vision Pro that’s as intuitive, efficient, and satisfying as flick-scrolling on an iPhone screen? It’s a high bar to clear, that’s for sure.

In yesterday’s review on The Verge, Nilay Patel shared a similar idea: it’s a strange feeling to use a computer that requires you to look at what you want to control at all times. I don’t know what to think about this yet since I don’t have a Vision Pro, but I’m curious to learn how this interaction method will scale over time as we start using this new platform on a daily basis. It’s quite fitting, however, that visionOS is based on the one Apple platform that supports both kinds of manipulation: pointer and touch.

Permalink

AppStories, Episode 368 – Workflow Experiments

This week on AppStories, we explore workflows we’ve been trying in a variety of apps.

Sponsored by:

  • Memberful - Help your clients monetize their passion. Get started for free today!

On AppStories+, Federico and John follow up on last week’s Vision Pro episode, revisiting their plans and considering what the device will be good for besides sitting by yourself watching movies.

We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.

Read more


QuickTime VR and Spatial Computing

Source: Tabletops.

Soon, the world will get a glimpse of the Apple Vision Pro outside of the tightly controlled demos provided to a select number of people. As we wait for that moment, it’s worth taking a look back at QuickTime VR with Michael Steeber through his excellent newsletter, Tabletops.

QuickTime VR was a 3D image format that explored some of the spatial video concepts coming to the Vision Pro, albeit using CD-ROMs in 90s-era Macs. To show off the technology, Apple created a demo CD that included a virtual tour of the recently closed Company Store. Steeber got the tour up and running and shared some wonderful images and videos of Apple’s vision for VR 30 years ago.

The story is full of interesting details about Apple retail when the Company Store was all there was to Apple retail:

At the entrance to the store is a physical map of the space, like the kind you’d find at a trailhead or in a museum lobby. In the Performa department, a cutout of a child hanging upside down looms from the ceiling. Along the wall is a disheveled pile of AppleDesign Speaker boxes. In the Newton department, an entire wall is wrapped with a print of someone’s backside, toting a Newton in their jeans pocket.

One section of the store is filled with more than 700 software titles. In early promotional materials, Apple called this aisle “Technology Way,” which is so similar to the “Software Alley” in early Apple Stores that I can’t help but wonder if it was carried over.

Be sure to check out the latest issue of Tabletops to see QuickTime VR in all its glory and sign up for the newsletter while you’re there. It’s always a good read.

Permalink

Add Timestamp Links to Apple Podcasts Next

Matthew Cassinelli:

Yesterday, Apple began adding transcripts to Apple Podcasts, detailing the change on the Apple Podcasts for Creators site and making them available in the iOS 17.4 developer beta 1.

This change is a huge win for accessibility, will surely improve searching in the Podcasts app, and makes quoting your favorite podcast an easy task by letting you copy and paste the text out – something I’ll definitely have to turn into a shortcut soon.

All these benefits are great in their own way and will make podcasts more shareable as a whole, allowing us to unlock so many people’s great ideas that are currently stored within hours of audio files and obscured behind URLs that point only to the show or episode as a whole.

However, I think Apple needs to go one step further in their next step and add timestamps to Apple Podcasts, a long-overdue feature that’d enable users to share links to individual moments within a podcast, directly to a specific point in the transcript.

I couldn’t agree more. From sharing to personal note-taking and research purposes, there are several use cases I can think of that would take advantage of timestamp links for podcast episodes – especially now that episodes have transcripts. (Pocket Casts, my favorite third-party podcast player, goes even further: it lets you share timestamp links and save private, time-synced bookmarks for specific parts of any episode.)

I like Matthew’s suggestions for how Apple could implement this feature, and I’ll add: Apple has already built this system for the Music app. When the company added shareable lyrics to the Music app in iOS 14.5, they did so with the ability to share selected lyrics as a special “snippet” on iMessage that is actually an interactive, timestamped song preview based on a special URL. Here’s what I wrote:

Besides Apple’s custom implementation of lyrics selection in the share sheet, what’s also interesting about this is the method the company is using to share Apple Music lyrics URLs. Unlike regular music.apple.com links that reopen a particular song or album in the Music app or play a generic preview snippet in iMessage, lyrics URLs are timestamped: in iMessage, the lyrics card has a play button that will preview the lyrics you shared inline within a conversation; if you tap the link in iMessage and the same song is already paused in the Music app, the Now Playing screen will automatically advance to the section highlighted in shared lyrics.

I’m assuming that Apple is aware of this missing feature from the Podcasts app in iOS 17.4 beta 1; I have to believe their future implementation will be very similar to what already exists in Music.

Permalink

Obsidian’s ‘2023 Gems of the Year’

Silver, writing on the Obsidian blog:

It has been nearly four years since the first line of code of Obsidian was written on January 31st, 2020. Today we’re thrilled to announce the winners of our fourth annual Gems of the Year awards!

This year the Obsidian community nominated 287 projects, including plugins, themes, tools, content, and templates. After our panel narrowed down the selection and the community voted on the entries, we’re now excited to announce the winners.

Solid list of plugins and themes for the best note-taking app out there, many of which I wasn’t familiar with or hadn’t tested yet. The Border theme looks stunning and I’m going to give it a try as my primary theme in the app; the Home Tab plugin does exactly what I want from a new empty tab in Obsidian (a search box + recently opened files); Omnivore, which I’m testing as my read-later app after they added better Shortcuts integration, does a fantastic job syncing highlights to Obsidian with its plugin. Go check out this list of gems if you haven’t yet.

Side note: I’m really curious to see how the Obsidian team prioritizes updates to its iPhone and iPad apps (by far, the weakest spot of the app) in 2024.

Permalink

Apple Shares the Secret of Why the 40-Year-Old Mac Still Rules

Steven Levy, writing for Wired, interviewed Apple executives about the secret to the Mac’s 40-year run:

“With the transition to Apple silicon that we started in 2020, the experience of using a Mac was unlike anything before that,” says John Ternus, Apple’s senior vice president of hardware engineering.

Ternus’ comment opens up an unexpected theme to our conversation: how the connections between the Mac and Apple’s other breakout products have continually revitalized the company’s PC workhorse. As a result, the Mac has stayed relevant and influential way past the normal lifespan of a computer product.

In the past few years, Mac innovations sprang from the transition to custom Apple silicon chips first pioneered to power iPhones. “I joke that we had to create the iPhone to create the scale to build the Mac we wanted to build,” says Craig Federighi, Apple’s senior vice president of software engineering. Ternus also notes that the iPhone’s contribution to Apple’s bottom line has been very good to the Mac. “As the business has been successful, it’s enabled us to invest and do the things we always wanted to do,” he says.

One example of that, I mention, must have been the recent boost to battery life in Mac notebooks. “When we broke physics?” jokes Joswiak. Indeed, the almost daylong span, 22 hours of battery life in some MacBook Pros, can feel life-changing. Again, this was a collateral effect of efforts to extend battery life in the iPhone.

“When we first started working with Apple silicon, it honestly did feel for us like the laws of physics had changed,” says Ternus. “All of a sudden, we could build a MacBook Air with no fan with 18 hours of battery life,” he says. “The best arrow in our quiver is efficiency. Because if you can improve efficiency, everything gets better.”

Levy has been covering the Mac from the beginning. His article is a fascinating look back at important moments in the computer’s history and at where it stands today.

Apple silicon is just the latest inflection point for a computer that has seen more than its fair share of changes over four decades. For a while, it looked like the Mac would be relegated to history’s dustbin – left behind by the iPhone. But it’s the very success of the iPhone that formed the foundation of some of today’s Mac’s greatest strengths. It’s an age-old story of success reclaimed through the reinvention needed to avert irrelevance.

Permalink

40 Years of Macintosh

This morning, the Steve Jobs Archive marked the 40th anniversary of the Mac with a message to its email subscribers that tells the story of when Rolling Stone photographer Norman Seeff and reporter Steven Levy visited Apple. It’s a great anecdote that captures the spirit of the team that created the Mac in the time leading up to its public unveiling.

The post also explains Jobs’ approach to building the Mac:

Steve knew that the very best work conveys the ideas and intentions of the people who created it. And he believed deeply that this team of engineers, designers, and programmers, who were also sculptors, photographers, and musicians—a team that integrated technology and the liberal arts—could create a machine for everyday people, “a computer for the rest of us.” 

At a time when computers were complex and difficult to use, it was a radical objective. To get there, Steve encouraged the team and protected them; he pushed them hard and shared his critiques. He asked them to sign their work like artists, even while reminding them that they were building a tool for others to use. “We’re going to walk into a classroom or an office or a home five years from now,” he promised, “and somebody’s going to be using a Macintosh for something we never dreamed possible.”

The Steve Jobs Archive has published a handful of stories to its email list since its inception, and today’s is one of my favorites. I do wish, though, that the Archive maintained a blog on its site for this sort of material. Locking these stories up in a third-party service like Mailchimp is a shame for a bunch of reasons.

Permalink

Author of ICONIC Marks the Mac’s 40th Anniversary with Over 1,000 Photos

Jonathan Zufi, the creator behind the coffee table book ICONIC: A Photographic Tribute to Apple Innovation, has dug into his archive of Mac photography to mark the 40th anniversary of the Mac with over 1,000 photos and videos that he’s taken and collected over the years, all of which are on display at mac40th.com. Here’s Zufi on the Mac’s milestone:

Over the past 40 years Apple developed and launched hundreds of products in and supporting the Macintosh line - culminating in 2024 with the latest range of M3 powered desktops and laptops which are technological marvels of speed, power management and design.

To celebrate this milestone, mac40th.com showcases every Macintosh desktop and portable Apple has ever made with hundreds of the photos taken as part of the work creating the coffee table book ICONIC: A Photographic Tribute to Apple Innovation (3rd edition now available up to date as of the end of 2023). The site also includes photos taken by Kevin Taylor, Forest McMullin and others (including video) that I’ve collected over the past 14 years.

Zufi’s website is wonderful. There’s so much to browse here. You can easily spend hours discovering old favorites alongside obscure curiosities. It’s the perfect way to spend some downtime and mark today’s anniversary.

Permalink