I was recently watching a movie with my girlfriend, and it had a great soundtrack. After scrubbing the video back to open Shazam on my iPhone for the third time, I remembered that Shazam offered an automatic tagging feature to let the app continuously listen in the background to recognize songs. Shazam's auto-tagging isn't meant to be active all the time, but we were home, with my iPhone charging next to me, and it seemed like a perfect time to try it.
To my surprise, Shazam started pushing tagged songs using iOS 10's new notification framework. Their implementation is a great example of what developers can achieve with rich notifications: a notification can be expanded and you'll be presented with a custom view showing the song's title, artist, album artwork, and global number of Shazams by users. But that's not all – you can also tap on the artwork to listen to a song's preview inside the notification without opening the Shazam app. If you want to act on the notification, there are three quick actions (another change made possible by iOS 10) to buy the song, add it to a playlist on Apple Music, or share it.
Once I realized I could catch up on tagged songs from Notification Center, I left Shazam running and enjoyed the rest of the movie. At the end, I went through my notifications, listened to each audio snippet, and saved a few songs in my Apple Music playlists.
The final result would have been the same in iOS 9, but the experience wouldn't have been as nice (or as fast) without rich notifications. I'm looking forward to more apps adopting similar notification features in the next few months.
Yesterday, Benjamin Mayo reported that Apple published public webpages for "some landmarks and points of interest" listed in Maps:
Apple is now publishing public web pages for landmarks and POI in its Apple Maps database: here’s one such place. The website link shows details about the place such as location, name, telephone number and reviews. The design is identical to the cards in the iOS 10 Maps app.
When viewed on an Apple platform, these URLs appear to act as deep links into the native Maps app. In Chrome, or on a non-Apple device such as an Android phone, the fallback webpage is loaded instead. The purpose of these URLs is unclear, but they may foreshadow a larger move by Apple to offer its own online mapping service to compete with Google Maps on the web.
I did a bit of digging, and I discovered that you can open any Apple Maps place or address in another web browser if you reformat the URL properly.
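The specific reformatting trick isn't reproduced here, but Apple does document a public Map Links URL scheme on `maps.apple.com` with parameters like `q` (search query) and `ll` (latitude,longitude). As a minimal sketch, assuming those documented parameters, a link that deep-links into Maps on Apple devices and falls back to the webpage elsewhere can be built like this:

```python
# Sketch: build a maps.apple.com link from a query or coordinates,
# using parameters from Apple's documented Map Links scheme (q, ll, z).
# On Apple platforms the link opens the native Maps app; elsewhere,
# the web fallback described above is loaded.
from urllib.parse import urlencode

def apple_maps_url(query=None, lat=None, lon=None, zoom=None):
    params = {}
    if query:
        params["q"] = query                  # free-form search query
    if lat is not None and lon is not None:
        params["ll"] = f"{lat},{lon}"        # latitude,longitude pair
    if zoom is not None:
        params["z"] = zoom                   # zoom level
    return "https://maps.apple.com/?" + urlencode(params)

print(apple_maps_url(query="Apple Park, Cupertino"))
```

This is only an illustration of the documented scheme, not the exact URL format of the new place pages.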
Matthew Panzarino, writing for TechCrunch, was able to test the iPhone 7 Plus' upcoming Portrait mode, released to developers in the first beta of iOS 10.1 earlier today.
If you’ve skipped here to see how the heck it works, I don’t blame you. The short answer: incredibly, miraculously well in many instances. And pretty rough in others. Apple says this is still in beta and it is. It has trouble with leaves, with chain link fences and patterns and with motion. But it also handles things so well that I never thought possible like fine children’s hair and dog fur, shooting pictures with people facing away and objects that are not people at all.
What does it have major trouble with? Fine lines, wires, chain link, glass, leaves. Anything that merges with the edges of your subject a bunch of times could confuse it. The closer to the subject the harder it is for it to distinguish. Motion, too, is a no. If the subject moves a bit, ok. If it moves too much you get ghosting, as you do in HDR mode — because there is compositing involved.
Let’s look at some examples and I’ll dissect what works, what doesn’t and how the mode is applying the effect in each image. In each case, I’ll include both the standard and Depth Effect image for comparison.
Panzarino reports that Portrait works on non-human subjects as well (which Apple didn't originally mention) and that it uses new body detection systems and a "sliding scale" mechanism to apply blurs for the background. Fascinating explanation – with some good points on how Apple could improve Portrait mode in the future.
It’s a busy week at Connected HQ: cyclists are drowning, iPhones are hissing and macOS Sierra is shipping.
In the latest episode of Connected, Myke, Stephen, and I discuss the latest macOS release and talk about strange noises coming out of iPhones. You can listen here.
When Twitter rolled out support for longer tweets yesterday, we mentioned that Tweetbot – the popular third-party client – would soon support the new format natively. Tapbots has released updates to the iOS and macOS apps today that let you view and create longer tweets (where media, polls, and quotes don't count against 140 characters) without having to rely on Twitter's official app. You can get the iOS update here.
Daylite is a business productivity app for Mac, iPhone, and iPad.
Organize your and your team’s contacts, calendars, projects, tasks, emails, notes, and new business opportunities all in one app.
Track sales and set reminders for follow-ups. See a full history of all emails, calls, and notes for each customer. Customize your own pipelines to track sales and projects. View your whole team’s calendar to make scheduling meetings simple. Daylite even integrates with Apple Mail so you can update customer info, set tasks and reminders, and add appointments to your calendar – all without leaving Mail.
Automate lead generation from online web forms with the Daylite & Zapier integration. When someone fills out a form on your website through Google Forms or Wufoo, a new contact and business opportunity are created in Daylite. You can then segment leads for specific email campaigns and track all of your communication with them in Daylite.
Always have your business info no matter where you go. Daylite is a native app so you can access your information on your Mac, iPhone, or iPad even when you don’t have an Internet connection.
Read how businesses all over the world are becoming more efficient with Daylite.
Our thanks to Daylite for sponsoring MacStories this week.
In addition to improvements for sharing between users, Google has announced a new feature for Google Photos dubbed 'movie concepts'. Automatically generated like the service's previous slideshows and Assistant creations, movie concepts are based on "creative concepts" – themes found in your photos that go beyond recent uploads.
As Google writes:
We’re also upping our game when it comes to automatic creations. Google Photos has always made movies for you using your recently uploaded photos. Now we’re going further, with new movies that are based on creative concepts — the kinds of movies you might make yourself, if you just had the time. And they’re not only limited to your most recent uploads.
Look out for a concept to commemorate the good times from this summer, and another one for formal events like weddings. And you don’t need to do a thing — these movies get made automatically for you.
Here's an example of a concept created by Google Photos:
Casey Newton, writing for The Verge, has more details:
Tim Novikoff, who joined Google last year when it acquired his video-editing company, Fly Labs, said the feature takes advantage of Google’s advancements in deep learning and computer vision. The idea, he said, was "let’s leverage this to make movies that are emotionally powerful — that make you really smile, or even make you cry and reminisce and show your family."
More concept movies are planned. "You can imagine where this goes," Novikoff said. "Christmas, Halloween, Thanksgiving, Little League highlights, dance recitals. All the things that people do, we can make special movies around them."
The new feature comes less than a week after the launch of iOS 10, which includes Memories – a feature of Apple's Photos app that creates personalized movies based on location, dates, and people recognized in your photo library. From Google's description and Novikoff's comments, it sounds like movie concepts will be more advanced than iOS' automated creations, but we'll have to test them in practice and see if the promise holds up. I'm curious to compare Apple's Memories to Google Photos' concepts.
Last week, I mentioned how Airmail – my favorite email client for iPhone and iPad – would soon receive Siri integration on iOS 10. Today, Airmail 1.3 has hit the App Store with a variety of iOS 10 features in addition to SiriKit, including support for rich notifications and iMessage.
Tapbots released Tweetbot 4.5 today with a few iOS 10 additions.
In the latest version, notifications are slightly richer: you won't be able to preview entire conversation or DM threads in a notification, but at least the sender's username and notification title will have a bold font for better visual separation. I would have liked to see even richer notifications with custom interfaces, and I also wonder if Tweetbot could use SiriKit's messaging intents to send DMs. Perhaps Tapbots will consider deeper iOS 10 enhancements in the future.
Also new in this version, you can now add notes to user profiles. According to Tapbots, the feature is intended to let you jot down a brief note to remember why you followed someone; personally, I think it's just as useful for remembering why you don't want to follow someone without blocking them. User notes are private, they sync with iCloud, and they can be accessed from the gear menu on a user's profile.
Finally, Tweetbot 4.5 supports smoother scrolling thanks to iOS 10's performance improvements in this area. It's not always noticeable, but I'm glad Tapbots implemented this feature for iOS 10 devices.
Tweetbot 4.5 is available on the App Store.