Posts in Linked

1Password Announces Plans to Adopt Passkeys

Today, 1Password announced that it’s moving to a passkey-based system for unlocking its password manager app. Using a password manager like 1Password already means not having to remember passwords for every site and service you use because it locks your passwords behind a single, hard-to-guess password. With passkeys, that single password approach will become a thing of the past, allowing users to access their passwords through biometric-based passkeys generated locally on their devices.

1Password’s new passkey feature is coming this summer. The company explains how passkeys differ from the way the app works today:

Now, unlocking 1Password without a password is nothing new. It’s something we do every day using biometrics. 1Password was the first third-party iOS app to offer Touch ID, all the way back in 2014, and since then we’ve added support for Face ID, Windows Hello, Android Fingerprint, and more.

But as convenient as biometrics are today, they don’t actually replace the password; they only mask it. That’s why 1Password asks you to type in your password periodically in order to ensure that you have it memorized.

Passkeys also use biometrics, but they allow us to go farther and eliminate the underlying password entirely.

By replacing passwords with passkeys, 1Password will be able to preserve the benefits of biometrics while eliminating the need to ever use a password to access the app’s data, no matter what platform you use.

Passkeys are a big deal for security. The apps, sites, and services you use may not adopt passkeys for a while, but with 1Password doing so, the passwords you still need to use will be protected better than before. I know I’ll be switching to this system as soon as it’s available.
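Under the hood, passkeys are built on the WebAuthn standard: your device generates a public/private key pair, the private key stays on-device behind biometrics, and only the public key goes to the server. As a rough, purely illustrative sketch (every value below is a placeholder, and none of it reflects 1Password’s actual implementation), this is the shape of the options a web app hands to the browser to create a passkey:

```javascript
// Illustrative sketch of WebAuthn passkey registration options.
// All names and values here are placeholders, not 1Password's configuration.
const publicKeyOptions = {
  challenge: new Uint8Array(32),              // random bytes supplied by the server
  rp: { id: "example.com", name: "Example" }, // the "relying party" (the service)
  user: {
    id: new Uint8Array(16),                   // opaque user handle
    name: "user@example.com",
    displayName: "Example User",
  },
  pubKeyCredParams: [{ type: "public-key", alg: -7 }], // -7 = ES256 in COSE terms
  authenticatorSelection: {
    userVerification: "required",  // prompts Face ID / Touch ID / Windows Hello
    residentKey: "required",       // a discoverable credential, i.e. a passkey
  },
};
// In a browser, registration would then be:
// await navigator.credentials.create({ publicKey: publicKeyOptions });
```

The private key never leaves the device, which is why there’s no master password left to phish or forget.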

Permalink

Stephen Robles on the Value of the New HomePods Compared to Sonos Soundbar Setups

Last week, I wrote about the Sonos Move in MacStories Weekly. I love the Move’s portability and rich, warm sound, which make it perfect for use in multiple places around my house and outside. In fact, I’ve enjoyed the Move so much that I’d begun looking at Sonos soundbar and subwoofer options, anticipating that the original HomePods I use with my living room media setup would eventually need to be retired.

Then, Apple released the HomePod (2nd Generation), which iterates on the original version. I had hoped that Apple would make a soundbar of its own, so when all we got was a HomePod, I was disappointed. That pushed me further into the Sonos camp, but with my original HomePods going strong, my window shopping has been just that: window shopping.

However, after watching Stephen Robles’ latest video comparing the new HomePod to its predecessor and the Sonos Beam and Arc soundbars paired with subwoofers, my interest in soundbars has waned. Robles evaluates the HomePods from a bunch of different angles, from music and movies to smart home integration, making a compelling case for a pair of the new HomePods as the best value for someone who wants a multipurpose device.

When I think about it, that’s exactly my use case. My pair of original HomePods are the only speakers on the main floor of our house. I AirPlay podcasts and music to them, use them to control HomeKit devices, and rely on them for watching TV and playing games on my PS5 and Xbox.

I’m still disappointed Apple didn’t announce more than a new HomePod last month. I’d like to see the company explore new home-centric devices that address use cases beyond speakers. Still, for audio, it’s hard to argue against the HomePod.

Permalink

macOS Isn’t as Small as You Think

Interesting comparison of macOS and iPadOS interface elements by Matt Birchler:

I will say that there are certainly some macOS UI elements that could be tricky to use with touch, but I think they’re the exception, not the rule. Still, Apple will certainly make some UI changes to accommodate touch as an officially-supported input method on the platform.

And:

There’s a narrative out there that touch is just so incompatible with macOS and that in order to make it work, the macOS UI would have to get blown up to comical proportions, but I don’t think that’s the case. Changes will be made, but I think macOS is more touch-friendly today than many people give it credit for.

I don’t disagree, and count me among those who think Apple should consider bringing touch support to the Mac.

I’ve seen this concern about “blowing up” the macOS UI raised in recent years too, and I think it’s shortsighted. Look no further than the iPad Pro: in a single device, Apple was able to let touch, pointer, and now even hover interactions coexist. Even without display scaling, I don’t think iPadOS has a comically large interface, as some believe.

There is a lot of work to be done to achieve a similar kind of input balance on macOS (think of all the elements that haven’t been redesigned in recent years, like drag controls for windows; the list is long), but it is possible, and I hope Apple gets there in the near future.

Permalink

The Mac’s 30th Anniversary Icon Font Shared As SVG Images

In 2014, for the 30th anniversary of the Mac, Apple celebrated with a mini site featuring the stories of the people behind the computer and its users. As part of that event, Apple created a special font of line-drawn versions of every Mac from its introduction on January 24, 1984 through 2014.

Robb Knight, my co-host on the Ruminate podcast, has had that font sitting on his Mac for years. Yesterday, with the help of friends Keir Ansell and Josh Calvetti, he released it as a series of downloadable SVG images.

The Mac’s 30th anniversary website.

I love this sort of project. The line drawings of these Macs look great and, as SVGs, are suitable for a wide range of projects. Robb has a long list of other interesting projects worth checking out on his website, including Alfred workflows, a Mastodon bookmarklet, a Mac utility to eliminate trackers from URLs, and a set of tools for Micro.blog, to name a few.

Permalink

Unread 3.3

Saving an article from Unread to Readwise Reader.

Unread, the elegant RSS reader by Golden Hill Software that we’ve covered before on MacStories, received its 3.3 update today, and it’s an interesting one I’ve been playing around with for the past week. There are two features I want to mention.

The first one is the ability to set up an article action to instantly send a headline from the article list in the app to Readwise Reader. As I explained on AppStories, I decided to go all-in with Reader as my read-later app (at least for now), and this Unread integration makes it incredibly easy to save articles for later. Sure, the Readwise Reader extension in the share sheet is one of the best ones I’ve seen for a read-later app (you can triage and tag articles directly from the share sheet), but if you’re in a hurry and checking out headlines on your phone, the one-tap custom action in Unread is phenomenal. To start using it, you need to be an Unread subscriber and paste in your Readwise API token.
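For context on what that token enables: Readwise Reader exposes a public API for saving documents, so an integration like Unread’s presumably boils down to a single authenticated POST. A hedged sketch, with the token and article URL as placeholders:

```javascript
// Sketch of a save request to Readwise Reader's public "save" endpoint.
// The token and article URL below are placeholders.
const token = "READWISE_API_TOKEN"; // your Readwise API token
const saveRequest = {
  method: "POST",
  url: "https://readwise.io/api/v3/save/",
  headers: {
    Authorization: `Token ${token}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ url: "https://example.com/some-article" }),
};
// To actually send it: await fetch(saveRequest.url, saveRequest);
```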

The second feature is the ability to save any webpage from Safari as an article in Unread, even if you’re not subscribed to that website’s RSS feed. Essentially, this is a way to turn Unread into a quasi-read-later tool: the app’s parser will extract text and images from the webpage, which will then be saved as a ‘Saved Article’ in Unread Cloud, Local feeds, or NewsBlur, or as a ‘Page’ in Feedbin.

If you’re a new Readwise Reader user, I recommend checking out Unread 3.3, which is available on the App Store for iPhone and iPad.

Permalink

Micro.blog, Mastodon, and Ivory

Manton Reece has a fantastic explanation of the underpinnings of Micro.blog and Mastodon and how they work with third-party clients like Ivory, which Federico reviewed yesterday.

Manton’s post is in response to questions about why Micro.blog doesn’t work with Tapbots’ Ivory since both Micro.blog and Mastodon implement the ActivityPub standard. The answer is that ActivityPub is primarily a server-to-server API that allows Micro.blog and Mastodon servers to interact with each other. However, clients like Ivory use a different, client-facing Mastodon API for reading and writing posts, and that API doesn’t match up feature-for-feature with Micro.blog. Manton explains the problems that causes:

Could Micro.blog implement the Mastodon API, thereby allowing Ivory to connect to Micro.blog as if it was a Mastodon server? Technically yes, but doing so would introduce a couple problems. By design, Micro.blog does not have exactly the same features as Mastodon. We left out boosts, trends, and follower counts, and added other things that are outside the scope of Mastodon.

If Micro.blog worked with Ivory, what would the UI look like when the features didn’t exactly match up? It would be confusing. Ivory would appear broken and it would disrupt the experience we’re going for with Micro.blog’s community.

That isn’t to say that signing into Micro.blog from Ivory to read and post to Micro.blog in the future is impossible. However, as Manton points out, it will require further experimentation and, ultimately, coordination with third-party apps while keeping an eye on preserving Micro.blog’s identity. Because, after all, Micro.blog and Mastodon are two distinct services that approach social media with different philosophies that are reflected in their designs. Interoperability is appealing on the surface, but not if it comes at the expense of the unique features that users of Micro.blog or any other service have come to expect and rely on.
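The distinction Manton is drawing becomes clearer when you put the two kinds of requests side by side: servers discover and federate with each other via WebFinger and ActivityPub, while a client like Ivory speaks Mastodon’s OAuth-protected REST API. A rough sketch, with the account, instance, and token all made up:

```javascript
// Server-to-server side: one server discovers another server's user via a
// WebFinger lookup before exchanging posts as ActivityPub objects.
// The account name is hypothetical.
const webfingerLookup = {
  url: "https://micro.blog/.well-known/webfinger?resource=acct:alice@micro.blog",
};

// Client side: an app like Ivory reads a user's timeline through Mastodon's
// client API with an OAuth bearer token. Instance and token are placeholders.
const mastodonClientRequest = {
  method: "GET",
  url: "https://mastodon.social/api/v1/timelines/home",
  headers: { Authorization: "Bearer <access-token>" },
};
```

Micro.blog implements the first, federation-level plumbing, which is why posts flow between the services; it’s the second, client-facing API that it deliberately doesn’t mirror.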

Permalink

AppStories, Episode 312 – Read-Later Apps

This week on AppStories, we look at the history of read-later apps, the many similarities and differences between them, and today’s read-later apps, like Matter and Reader, as well as link organizers, such as GoodLinks and Anybox.

Sponsored by:

  • Fitbod – Get stronger, faster with a fitness plan that fits you. Get 25% off your membership.
  • Squarespace – Make your next move. Enter offer code APPSTORIES at checkout to get 10% off your first purchase.
  • Pillow – Sleeping better, made simple.

On AppStories+, we share our themes for 2023.

We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.

Permalink

The Trouble with Mixing Realities

Mark Gurman recently reported that Apple’s much-rumored headset will combine AR and VR technologies, which Brendon Bigley argues could be the wrong approach:

… I don’t think the road to mass adoption of virtual reality actually starts with virtual reality, it starts instead with augmented reality — a technology that can quickly prove its function if presented in a frictionless way. While even the best VR headsets demand isolation and escapism, a hypothetical product focused first and foremost on augmented reality would be all about enhancing the world around you rather than hiding from it.

Brendon’s story nails something that has been nagging me about recent headset rumors. The iPhone was a hit because it took things we already did at a desk with a computer and put them on a device we could take with us everywhere we go, expanding the contexts where those activities could be done. As Brendon observes, the Apple Watch did something similar with notifications. AR feels like something that fits in the same category – an enhancement of things we already do – while VR is inherently limiting, shutting you off from the physical world.

Like Brendon, I’m excited about the prospect of an Apple headset and the long-term potential of virtual reality as a technology, but given where the technology is today, it does seem as though jumping into VR alongside AR could muddy the waters for both technologies. Of course, we’re all still working off of speculation and rumors. I have so many questions still and can’t wait to see what Apple has in store for us, hopefully later this year.

Permalink

MKBHD on Apple’s Processing Techniques for iPhone Photos

In his latest video, MKBHD eloquently summarized and explained something that I’ve personally felt for the past few years: pictures taken on modern iPhones often look sort of washed out and samey, like much of the contrast and highlights from real life were lost somewhere along the way during HDR processing, Deep Fusion, or whatever Apple is calling its photography engine these days. From the video (which I’m embedding below), in the part where Marques notes how the iPhone completely ignored a light source that was pointing at one side of his face:

Look at how they completely removed the shadow from half of my face. I am clearly being lit from a source that’s to the side of me, and that’s part of reality. But in the iPhone’s reality you cannot tell, at least from my face, where the light is coming from. Every once in a while you get weird stuff like this, and it all comes back to the fact that it’s software making choices.

That’s precisely the issue here. The iPhone’s camera hardware is outstanding, but how iOS interprets and remixes the data it gets fed from the camera often leads to results that I find…boring and uninspired unless I manually touch them up with edits and effects. I like how Brendon Bigley put it:

Over time though, it’s become more and more evident that the software side of iOS has been mangling what should be great images taken with a great sensor and superbly crafted lenses. To be clear: The RAW files produced by this system in apps like Halide are stunning. But there’s something lost in translation when it comes to the stock Camera app and the ways in which it handles images from every day use.

Don’t miss the comparison shots between the Pixel 7 Pro and iPhone 14 Pro in MKBHD’s video. As an experiment for the next few weeks, I’m going to try what Brendon suggested and use the Rich Contrast photographic style on my iPhone 14 Pro Max.

Permalink