On Friday, TechCrunch reported that Apple had agreed to acquire music discovery service and app-maker Shazam. Today, Apple made it official, confirming the deal to BuzzFeed News. Shazam, which makes iOS, watchOS, and macOS apps that can detect songs, TV shows, and advertisements from their sound signatures, has been on Apple’s platforms since the early days of iOS and is the engine behind Siri’s ability to recognize songs.
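Those "sound signatures" have a publicly documented basis: Shazam co-founder Avery Wang described the matching technique in a 2003 paper, where spectrogram peaks are paired into compact hashes and a query clip is identified by voting on which stored track lines up at a consistent time offset. Here is a toy sketch of that idea in Python — every parameter value and name below is illustrative, not Shazam's actual implementation:

```python
import numpy as np

def spectrogram(signal, win=256, hop=128):
    """Magnitude spectrogram: one FFT per hop-spaced Hann window."""
    frames = [np.abs(np.fft.rfft(signal[i:i + win] * np.hanning(win)))
              for i in range(0, len(signal) - win + 1, hop)]
    return np.array(frames)  # shape: (time, freq_bins)

def constellation(spec):
    """Toy peak picking: the single loudest frequency bin per frame."""
    return [(t, int(np.argmax(frame))) for t, frame in enumerate(spec)]

def hashes(peaks, fan_out=5):
    """Pair each peak with the next few peaks: (f1, f2, dt) plus anchor time."""
    out = []
    for i, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[i + 1:i + 1 + fan_out]:
            out.append(((f1, f2, t2 - t1), t1))
    return out

class FingerprintIndex:
    def __init__(self):
        self.db = {}  # hash -> list of (track_id, anchor_time)

    def add(self, track_id, signal):
        for h, t in hashes(constellation(spectrogram(signal))):
            self.db.setdefault(h, []).append((track_id, t))

    def query(self, snippet):
        """Vote on (track, time_offset) pairs; a true match piles votes on one offset."""
        votes = {}
        for h, t in hashes(constellation(spectrogram(snippet))):
            for track_id, t_db in self.db.get(h, []):
                key = (track_id, t_db - t)
                votes[key] = votes.get(key, 0) + 1
        if not votes:
            return None
        (track_id, _offset), _count = max(votes.items(), key=lambda kv: kv[1])
        return track_id
```

Matching on offset consistency rather than raw hash counts is what makes the scheme robust: background noise produces scattered votes, while the real track accumulates them at a single offset.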
The financial terms of the deal were not disclosed in the announcement, but according to TechCrunch, Shazam cost Apple somewhere in the neighborhood of $400 million. According to a report from The Wall Street Journal last year, Shazam accounts for about 1 million clicks per day and 10% of digital download sales. However, as streaming services have gained popularity over paid music downloads, Shazam’s affiliate link revenue from music sales has shrunk. To compensate, Shazam has turned increasingly to advertising. With today’s acquisition, Shazam should continue to drive traffic to Apple Music without the need to sustain itself as a standalone business.
In addition to Apple’s music services, Shazam sends significant traffic to Spotify. Shazam also has an Android app. It remains to be seen what will happen to the Spotify relationship or Android app now that Shazam is part of Apple, or whether Apple plans to maintain Shazam as a separate iOS app. Deeper integration with Siri is one direction Apple may take Shazam’s technology, implementing something like the Google Pixel 2’s automatic song identification feature, ‘Now Playing.’
Past MacStories coverage of Shazam is available here.
For the better part of this year, I’ve been using both Spotify and Apple Music. In my opinion, each service does a few things exceptionally well, but, unfortunately, I can’t have all of them in a single music app.
Spotify’s discovery tools for both old and new songs are simply unparalleled in the industry: Discover Weekly continues to surprise me on a weekly basis just like mixtapes used to do. Spotify is everywhere (including my Amazon Echo); I like how it organizes releases on artist pages; and it’s got a richer selection of user-generated playlists. Apple Music, on the other hand, looks much better than Spotify (I love Apple’s focus on album artwork and large photography), features built-in lyrics, is deeply integrated with the Apple ecosystem, and I’m a fan of the social feed launched with iOS 11. In short: Spotify is superior when it comes to discovery for music aficionados and integration with third-party hardware, but Apple Music is nicer and easier to use for iOS users. I can’t choose because I happen to have a foot in both camps.
Today Apple released the latest software update for the Apple Watch: watchOS 4.1. The update brings the previously announced Apple Music streaming, including over cellular, plus a brand new Radio app.
In past versions of watchOS, independent music playback was limited to the small assortment of songs and playlists that could be stored locally on Watch hardware. Due to how slow music syncing was, I always considered it too much of a hassle to keep my favorite music accessible on the Watch. Today's update lifts those restrictions in a big way. Now Watch owners who are also Apple Music subscribers can stream music not just from their own library, but also from the service's full catalog of over 40 million songs. This works over cellular on the latest Series 3 models, but it can also be done over Wi-Fi on non-cellular models. Even if you don't plan to stream music sans iPhone, watchOS 4.1 still includes the added benefit of restoring access to your full iCloud Music Library on Series 3 models: past versions of watchOS allowed this, but watchOS 4 limited your selection to synced music only.
Independent music streaming from the Apple Watch is made even better by the addition of a new built-in watchOS app: Radio. The Radio app enables streaming content from Beats 1 or select other radio stations, such as ESPN, NPR, and genre-based stations. While I would have been more excited by a Podcasts app, it's great to see more of Apple Music's features make their way to the Watch.
For owners of the Series 3 Watch with cellular, there's one last notable update: a new toggle in Control Center for activating and deactivating Wi-Fi on the Watch. So if your Watch gets stuck on a slow or unreliable Wi-Fi connection, you can easily disconnect and fall back to LTE.
For most users, watchOS 4.1 is a fairly minor update with little to get excited about – but for those sporting a Series 3 Watch with cellular, it's simply transformative. Paired with AirPods, the Apple Watch can now serve as a powerful and worthy successor to the iPod line. There's something downright liberating about going iPhone-free while staying connected and available, and bringing 40 million songs along with you.
Sonos made several major announcements today, launching the company’s products further into their roles as platform-agnostic options in a market increasingly segmented by ecosystem. The last few years have seen an avalanche of new interest in the home speaker space from major players like Amazon, Google, and Apple, and without the accompanying ecosystems those players possess, Sonos stood at a crucial juncture: it could seek to further establish its own platform and compete directly with the tech titans, or it could aim to stand outside that main battleground and make products that tie together the benefits of all ecosystems. It has clearly chosen the latter, and today’s announcements demonstrate the fruits of that decision.
In a press release today, Amazon announced the newest feature addition to its Echo devices:
Amazon today announced an all-new Alexa feature that lets you control and synchronize music across multiple Amazon Echo devices in your home. Starting today, you can target music to a specific Echo device or a group of devices—just ask. Soon, this ability will be extended to control multi-room music on other connected speakers using simply your voice.
The feature is currently only available on Echo devices, but Amazon has also announced a couple of new tools to help expand Alexa-powered audio to other speakers. There’s a new Alexa Voice Service SDK that device makers can adopt to enable their speakers to play music in sync with Echo devices. That SDK will be made available early next year. And there is also a new set of Connected Speaker APIs, available today, which allow third-party speakers to be controlled via an Alexa-enabled device.
It should be noted that multi-room audio is only available through a handful of music services. Amazon Music, Pandora, iHeartRadio, and TuneIn are available today, while Spotify and SiriusXM support is coming soon.
I don’t play guitar as often as I used to, but I’ve always enjoyed trying to pick out the chords of my favorite songs. The trouble is, that can be hard to do unless you have a well-trained ear, which I don’t. That is exactly where Capo touch can help. The app can analyze a song, pick out the chords and help you practice it a little at a time at a comfortable pace until you figure out how to play it. This week, Capo touch got a big update that streamlines the learning process and brings powerful features over from Capo for macOS.
The Apple world may be talking about the HomePod a lot in the wake of WWDC, but it's not the first time Apple has tried to reinvent home audio.
In 2006 — just a year before the iPhone appeared — a slightly-under-the-weather Steve Jobs introduced the $349 iPod Hi-Fi.
Shazam for iOS has introduced an update that makes app navigation simpler and more streamlined. Gone are the traditional navigation tabs at the bottom of the screen; they have been replaced by a paginated layout where a swipe left or right is used to switch screens.
Launching Shazam lands you on the Home page, which is devoted almost entirely to the Shazam button. Tapping it will cause the app to start listening to what's playing; one change to the Home page is that you now activate Auto Shazam with a long-press on the Shazam button. Auto Shazam allows the app to continue listening to what's playing even after the app closes.
The top of the Home page indicates that there are three pages in total to navigate through. To the left of Home is My Shazam, to the right is Discover, and swiping back and forth is the primary way to get where you want to go. This type of layout resembles that of apps like Snapchat, only Shazam pulls it off in a less confusing manner. Not only do you always see three navigation dots at the top of the screen to indicate your current place within the app, but the Home page also contains icons that show which pages are placed on the left and right – Snapchat could benefit from similar aids, for new users at least.
My Shazam hasn't changed much from before, but in an effort to consolidate the total number of pages in the app, Discover now includes the contents formerly found in Trending as well. A Chart Update card is included with your daily mix, plus you'll find a button at the top and bottom of your 10 daily updates that takes you straight to Trending.
I appreciate what Shazam has done to simplify its app, both in reducing the number of pages to navigate through and in adopting a swipe gesture to handle that navigation with ease. Not every app would benefit from such a streamlined interface, but it works well here.
David Pierce has a fascinating piece for WIRED on a hip-hop producer and artist, Steve Lacy, who makes music start to finish on his iPhone.
Lacy’s smartphone has been his personal studio since he first started making music. Even now, with all the equipment and access he could want, he still feels indelibly connected to something about making songs piece by piece on his phone. He’s also working this way to prove a point: that tools don’t really matter... If you want to make something, Lacy tells me, grab whatever you have and just make it.
Pierce describes a recording session he observed where Lacy used GarageBand, an iRig, and the iPhone's built-in microphone to create music.
He paged through the drum presets in GarageBand for a while before picking a messy-sounding kit. With two thumbs, he tapped out a simple beat, maybe 30 seconds long. Then he went back to the Rickenbacker. He played a riff he’d stumbled on while tuning, recording it on a separate GarageBand track over top of the drums. Without even playing it back, Lacy then reached down and deleted it. It took three taps: stop, delete, back to the beginning. He played the riff again, subtly differently. Deleted it again. For the next half hour, that’s all Lacy did: play, tap-tap-tap, play again. He experimented wildly for a while, then settled on a loose structure and began subtly tweaking it. Eventually satisfied with that bit, he plugged in his Fender bass and started improvising a bassline. A few hours later, he began laying vocals, a breathy, wordless melody he sang directly into the iPhone’s microphone. He didn’t know quite what he was making, but he was feeling it.
Lacy's recording method is clearly an atypical one in the music industry, but it serves as a great testament to the power of iOS and the iPhone.