Federico Viticci

10779 posts on MacStories since April 2009

Federico is the founder and Editor-in-Chief of MacStories, where he writes about Apple with a focus on apps, developers, iPad, and iOS productivity. He founded MacStories in April 2009 and has been writing about Apple ever since. Federico is also the co-host of AppStories, a weekly podcast exploring the world of apps; Unwind, a fun exploration of media and more; and NPC: Next Portable Console, a show about portable gaming and the handheld revolution.

Connected, Episode 110: Swimsuits and Sunglasses for Dogs

This week, Federico struggles to use the new W1-enabled Beats with his new iPhone 7, Myke shares some thoughts on Snapchat Spectacles, and, with Stephen away, the Europeans join Snapchat on-air.

On this week’s Connected, I share some initial impressions on the iPhone 7 and explain why I want to understand Snapchat. You can listen here.


How Apple Music’s Algorithmically Personalized Playlists Work

Reggie Ugwu, in a BuzzFeed feature story on the second act of Apple Music, got some details on the new ‘My Favorites Mix’ and ‘My New Music Mix’ playlists introduced with iOS 10:

Revealing how the mixes operate for the first time to BuzzFeed News, Apple claimed a potential advantage over similar algorithmically personalized playlists, including Spotify’s Discover Weekly and Pandora’s Thumbprint Radio: deep historical knowledge of individual users’ tastes and habits, based on years of data carried over from iTunes.

If you gave high ratings to a song or album in your old iTunes library, or just played it a lot more than others, you’ll find that behavior reflected in your My Favorites Mix. Meanwhile, the My New Music Mix algorithm serves recently released songs — as well as songs that Apple Music knows you haven’t played before — that the service’s music experts have flagged as similar to others in your taste profile. Apple Music executives suggested even more personalized playlists will follow in the series; but only after prototypes have been vetted, with all possible outcomes — intentional and otherwise — given careful consideration.

This explains why the ‘My Favorites Mix’ playlists have (anecdotally) been spot-on for people I talked to. As I previously noted, back-catalogue songs from artists you’re not familiar with (essentially, Discover Weekly) would be a good addition to Apple Music’s mixes.


Sonos Speakers Are Coming to the Apple Store

Sonos PR:

We’re very excited to announce today that we’re making it easier than ever by expanding our collaboration with Apple and bringing Sonos into Apple’s physical and online retail stores. Two of our most popular speakers — the PLAY:1 and PLAY:5 — will be available later today at Apple.com in the USA, at 468 Apple Stores around the world starting October 5, and Apple.com in the coming weeks in most markets outside of the USA.

Better still, those who buy a Sonos system at Apple retail stores or at Apple.com between today and December 31 will receive a free three-month Apple Music Gift Card.

And:

This is a big deal — both for our companies, and for music lovers around the world who can now experience Sonos and Apple Music like never before. We know the best way to discover Sonos is by experiencing it first hand, which shoppers in more than 140 Apple Stores will be able to do with a hands-on Sonos and Apple Music demo featuring a guided Sonos app experience on the iPad Pro.

This makes sense given the popularity of Sonos, the problem it solves (it is, effectively, an ecosystem for audio), and its integration with Apple Music. I’m a Sonos customer, and their products fill a specific niche: good-looking speakers for the home with great audio quality and none of Bluetooth’s annoyances. Does that seem familiar?

On the other hand, a closer partnership also gives Apple a way to study and learn how and why people buy Sonos devices. If I were a company playing around with the idea of releasing a smart speaker, studying Sonos’ line is something I’d consider. I’m curious to try their hands-on demo with the iPad Pro.


Shazam’s Interactive Notifications on iOS 10

A rich Shazam notification in iOS 10. Also, a great song.

I was recently watching a movie with my girlfriend, and it had a great soundtrack. After scrubbing the video back to open Shazam on my iPhone for the third time, I remembered that Shazam offered an automatic tagging feature to let the app continuously listen in the background to recognize songs. Shazam’s auto-tagging isn’t meant to be active all the time, but we were home, with my iPhone charging next to me, and it seemed like a perfect time to try it.

To my surprise, Shazam started pushing tagged songs using iOS 10’s new notification framework. Their implementation is a great example of what developers can achieve with rich notifications: a notification can be expanded and you’ll be presented with a custom view showing the song’s title, artist, album artwork, and global number of Shazams by users. But that’s not all – you can also tap on the artwork to listen to a song’s preview inside the notification without opening the Shazam app. If you want to act on the notification, there are three quick actions (another change made possible by iOS 10) to buy the song, add it to a playlist on Apple Music, or share it.

Once I realized I could catch up on tagged songs from Notification Center, I left Shazam running and enjoyed the rest of the movie. At the end, I went through my notifications, listened to each audio snippet, and saved a few songs in my Apple Music playlists.

The final result would have been the same in iOS 9, but the experience wouldn’t have been as nice (or as fast) without rich notifications. I’m looking forward to more apps adopting similar notification features in the next few months.


Opening Any Apple Maps Place or Address on the Web

Yesterday, Benjamin Mayo reported that Apple published public webpages for “some landmarks and points of interest” listed in Maps:

Apple is now publishing public web pages for landmarks and POI in its Apple Maps database: here’s one such place. The website link shows details about the place such as location, name, telephone number and reviews. The design is identical to the cards in the iOS 10 Maps app.

When on an Apple platform, these URLs appear to act as deep links into the native Maps app. If viewing in Chrome or on a non-Apple device like Android, the fallback webpage is loaded. The purpose for these URLs is unclear, but it may possibly foreshadow a larger move by Apple to offer its own online mapping service to compete with Google Maps on the web.

I did a bit of digging, and I discovered that you can open any Apple Maps place or address in another web browser if you reformat the URL properly.
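The specific reformatting trick is behind the link below, but as a hedged sketch of the general idea, Apple's documented "Map Links" scheme lets you assemble a maps.apple.com URL for any place or address from its public `q`, `address`, and `ll` query parameters (the function name here is my own, not from the article):

```python
# Hedged sketch: assemble a maps.apple.com link from a place name,
# street address, or coordinates using Apple's documented Map Links
# query parameters (q, address, ll). On Apple platforms such a link
# deep-links into the native Maps app; elsewhere a webpage loads.
from urllib.parse import urlencode

def apple_maps_url(query=None, address=None, lat=None, lon=None):
    """Return a maps.apple.com URL for the given place or address."""
    params = {}
    if query:
        params["q"] = query          # search term or place name
    if address:
        params["address"] = address  # street address to resolve
    if lat is not None and lon is not None:
        params["ll"] = f"{lat},{lon}"  # explicit coordinates
    if not params:
        raise ValueError("need at least one of query, address, or coordinates")
    return "https://maps.apple.com/?" + urlencode(params)

print(apple_maps_url(query="Apple Park", address="One Apple Park Way, Cupertino"))
```

Opening a URL built this way in Safari on a Mac or iOS device hands it off to Maps, while other browsers fall back to a webpage, which matches the deep-link behavior Mayo describes.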




Matthew Panzarino’s Hands-On with the iPhone 7 Plus’ Portrait Mode in iOS 10.1 Beta

Matthew Panzarino, writing for TechCrunch, was able to test the iPhone 7 Plus' upcoming Portrait mode, released to developers with the first beta of iOS 10.1 earlier today.

If you’ve skipped here to see how the heck it works, I don’t blame you. The short answer: incredibly, miraculously well in many instances. And pretty rough in others. Apple says this is still in beta and it is. It has trouble with leaves, with chain link fences and patterns and with motion. But it also handles things so well that I never thought possible like fine children’s hair and dog fur, shooting pictures with people facing away and objects that are not people at all.

What does it have major trouble with? Fine lines, wires, chain link, glass, leaves. Anything that merges with the edges of your subject a bunch of times could confuse it. The closer to the subject the harder it is for it to distinguish. Motion, too, is a no. If the subject moves a bit, ok. If it moves too much you get ghosting, as you do in HDR mode — because there is compositing involved.

Let’s look at some examples and I’ll dissect what works, what doesn’t and how the mode is applying the effect in each image. In each case, I’ll include both the standard and Depth Effect image for comparison.

Panzarino reports that Portrait works on non-human subjects as well (which Apple didn’t originally mention) and that it uses new body detection systems and a “sliding scale” mechanism to apply blurs for the background. Fascinating explanation – with some good points on how Apple could improve Portrait mode in the future.



Tweetbot Updated with Support for Longer Tweets

When Twitter rolled out support for longer tweets yesterday, we mentioned that Tweetbot – the popular third-party client – would soon support the new format natively. Tapbots has released updates to the iOS and macOS apps today that let you view and create longer tweets (where media, polls, and quotes don’t count against 140 characters) without having to rely on Twitter’s official app. You can get the iOS update here.
