On this week’s episode of AppStories, we discuss app pre-orders, a new feature introduced by Apple that has interesting implications for developers and their customers.
Email robs you of precious time. Spark recovers those lost moments by knowing what’s important and organizing it for you neatly and automatically.
Spark’s smarts start with its inbox. Messages that arrive in Spark’s Smart Inbox are automatically categorized as Personal, Notifications, and Newsletters, which makes it easy to focus on what’s important and return to what’s not later.
Intelligent search makes it simple to find messages no matter where they are. Spark’s natural language algorithm thinks like you do. Just ask for messages the way you would if you were asking a friend.
Email notifications are a problem that can spin out of control quickly. Spark uses Smart Notifications to filter out the junk and only notify you of what you need to know now.
In addition, Spark features beautifully designed card-style calendar invitations that can be accepted with just one tap, the ability to send later and set up reminders for messages that don’t receive a reply, message snoozing, and Quick Replies that let you acknowledge a message with a single tap. Spark also has customizable gesture actions and works with Dropbox, Box, iCloud Drive, and more. Spark is beautifully designed too, with a threaded message view that makes following a conversation simple.
As if that weren’t enough, in just a few short months, Spark 2.0 will introduce Spark for Teams, which will change the way teams collaborate, giving them the ability to do things like comment on and discuss messages and write messages together.
Simply raise your eyebrows to move the emoji up, frown to move the emoji down, or make a neutral expression and the emoji stays still. Note that if you raise your eyebrows and keep them raised, the emoji will continue moving upward, and vice versa while you maintain a frowning expression.
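Rainbrow’s actual source isn’t shown here, but the control scheme described above can be sketched with face-tracking coefficients in the 0.0–1.0 range, the way ARKit exposes blend shapes. The coefficient names, threshold, and velocity values below are illustrative assumptions, not the game’s real implementation:

```python
def emoji_velocity(brow_raise: float, brow_furrow: float, threshold: float = 0.5) -> int:
    """Map face-tracking coefficients (0.0-1.0) to vertical emoji movement.

    brow_raise and brow_furrow are hypothetical per-frame expression
    strengths; the 0.5 threshold is an assumed cutoff for "expression held".
    """
    if brow_raise >= threshold:
        return 1    # brows raised and held: keep moving up
    if brow_furrow >= threshold:
        return -1   # frowning: keep moving down
    return 0        # neutral expression: emoji stays still
```

Called once per camera frame, this would reproduce the behavior described: the emoji keeps drifting as long as the expression is held, and stops on a neutral face.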
While there are no levels, the game gets increasingly difficult as more obstacles appear. The goal is simply to get the highest score possible, but players can only compete against themselves right now. Gitter told us that he plans to integrate Apple’s Game Center for multiplayer competition in a future update.
Here’s a video of the game in action:
I played this for 20 minutes last night. It’s genius. Using the TrueDepth camera on the iPhone X, Rainbrow can detect the movements of muscles around your eyes and thus ask you to raise or lower your eyebrows to move an emoji up and down to collect points. What makes this game feel like magic – as if the iPhone is reading your mind – is that there’s no camera preview on screen and no buttons to press: you don’t see your face in a corner; the game simply reacts to your expressions in real-time without an interface separating you from the actual gameplay. It’s fun, and it’s a good demonstration of the accuracy of the TrueDepth system.
I’ve been asking myself which parts of iOS and the iPhone experience could be influenced by attention awareness and redesigned to intelligently fit our context and needs. I don’t think this idea will be limited to Face ID, timers, and auto-lock in the future. What happens, for example, if we take attention awareness farther and imagine how an iPhone X could capture user emotions and reactions? TrueDepth could turn into an attention and context layer that might be able to suggest certain emoji if we’re smiling or shaking our heads, or perhaps automatically zoom into parts of a game if we’re squinting and getting closer to the screen. A future, more sophisticated TrueDepth camera system might even be able to guess which region of the display we’re focusing on, and display contextual controls around it. Siri might decide in a fraction of a second to talk more or less if we’re looking at the screen or not. Lyrics might automatically appear in the Music app if we keep staring at the Now Playing view while listening to a song.
It might be a silly game, but Rainbrow is the kind of different application of TrueDepth I had in mind. The same goes for Nose Zone, a game that uses ARKit’s TrueDepth-based face tracking to turn your nose into a cannon to shoot squares (I’m serious). While these first TrueDepth games are fun gimmicks, I believe we’re going to see invisible, persistent attention awareness and expression tracking become embedded into more types of apps over the next year.
Then Killian, who wears his Apple Watch to bed three to four nights a week for sleep tracking, says his Apple Watch woke him up around 1 am with an alert from a third-party app called HeartWatch saying his resting heart rate was elevated while sleeping (Apple recently introduced a built-in feature that can do this with Apple Watch Series 1 and later). Killian experienced mild indigestion, which can be a sign of a heart attack, but says he generally didn’t feel sick.
His Apple Watch charted his heart rate at around 121 beats per minute in the middle of the night while data previously captured showed his average resting heart rate at around 49 beats per minute. The data also showed that this was the first time his resting heart rate had reached this level since he began wearing Apple Watch, so he decided to go to the emergency room as a precaution.
I’ve been using HeartWatch since it came out in late 2015 – if you care about heart rate stats monitored by the Apple Watch, I can’t recommend it enough.
Since the release of iOS 11, I haven’t seen many apps use the system’s new document browser – the built-in Files view that can be embedded in document-based apps as a root view. I covered this feature extensively in my iOS 11 review. Apple’s iWork suite of apps has switched to Files’ document browser, as did MindNode 5, but I’ve struggled to find other examples of high-profile third-party apps abandoning their custom file managers in favor of Apple’s solution. As I argued in September, my guess is that a combination of bugs and lack of deeper customization for the document browser is preventing more apps from embracing it. Which is too bad, as the Files integration brings substantial benefits in the form of drag and drop, iCloud Drive and third-party locations, recents and tags, and more.
PDF Viewer, the powerful PDF editor for iOS we originally reviewed here and here, is adding support for iOS 11’s document browser with version 2.1, launching today on the App Store. After testing a beta of this update for the past week, I’m impressed by how the folks at PSPDFKit have been able to fine-tune the document browser to the app’s needs.
With the recent release of Animal Crossing: Pocket Camp, Nintendo has now delivered all of its previously announced smartphone games. The shocking change in company policy that led to the development of these titles was one of the final projects that Nintendo’s former president, Satoru Iwata, managed before his untimely passing.
These games have been a mixed bag of success for Nintendo, both in terms of quality and profits, and since all of the known games are now out in the wild, it’s a good time to check in and see where each Nintendo mobile game stands at the end of 2017.
My goal here is to examine what the games set out to achieve, how successful they were with those goals at launch, where they are today, and where they can go from here.
Finch is spot-on about Super Mario Run and what went wrong with the game, and I agree with his assessment of Animal Crossing’s future potential. I wonder what Nintendo could do with a future mobile Zelda game.
During a session about podcasts at WWDC in June, Apple announced that it would introduce podcast analytics later in the year. Today, the feature was rolled out as a beta service as part of iTunes Connect, Apple’s content creator portal. At release, the data available to podcast producers includes unique device downloads as well as playback metrics.
Historically, podcast analytics have been rudimentary. Producers could track downloads, but there was no way to tell how many users were behind those downloads or how long they listened. Those sorts of features are something that some podcast producers, especially those coming from the radio industry, have wanted Apple to add for a while.
With the beta introduced today, producers can track the number of unique device downloads and view graphs of how long listeners lasted before giving up on a show. The data is aggregated to protect user privacy, but it’s nonetheless substantially more information than podcasters have had in the past.
The data is limited to listeners using the Apple Podcasts app on iOS 11 and later or listening via iTunes 12.7 or later on macOS, which limits its utility for some show producers. For example, Apple Podcasts listeners account for barely over 5% of listeners of our podcast, AppStories, a number which counts users of the app on all versions of iOS.
Some of the biggest proponents of this sort of analytics have been big-brand advertisers who want to measure the performance of podcast advertising more closely. As podcasts have boomed in recent years and producers have looked to bring larger advertisers along for the ride, pressure has mounted for the kind of analytics that are employed on the web. It remains to be seen whether podcast analytics will do to podcast advertising rates what click-through and other metrics have done to other online media outlets’ advertising revenue.
When thinking about the earliest days of Apple, it’s easy to recall the Apple I, the Apple II line, and the Macintosh. However, there’s one more computer that defined Apple’s early years. This computer was groundbreaking but incredibly expensive, and it exposed much that was wrong within Apple itself.
The Lisa launched 35 years ago next month. Today, it is mostly considered a precursor to the Mac. While that is true, it doesn’t come close to doing this computer justice.