Posts in Linked


Rainbrow, a Game Controlled by the iPhone X’s TrueDepth Camera

Joe Rossignol, writing for MacRumors about Rainbrow, a free arcade game for the iPhone X that requires you to raise your eyebrows or frown to move the character on screen:

Simply raise your eyebrows to move the emoji up, frown to move the emoji down, or make a neutral expression and the emoji stays still. Note that if you raise your eyebrows, and keep them raised, the emoji will continue to move in an upwards direction, and vice versa when maintaining a frowning expression.

While there are no levels, the game gets increasingly difficult as more obstacles appear. The goal is simply to get the highest score possible, but players can only compete against themselves right now. Gitter told us that he plans to integrate Apple’s Game Center for multiplayer competition in a future update.

I played this for 20 minutes last night. It’s genius. Using the TrueDepth camera on the iPhone X, Rainbrow detects the movement of the muscles around your eyes, letting you raise or lower your eyebrows to move an emoji up and down and collect points. What makes this game feel like magic – as if the iPhone is reading your mind – is that there’s no camera preview on screen and no buttons to press: you don’t see your face in a corner; the game simply reacts to your expressions in real time, without an interface separating you from the actual gameplay. It’s fun, and it’s a good demonstration of the accuracy of the TrueDepth system.
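
For a sense of how this kind of expression-driven input can be built, here’s a minimal Swift sketch that reads ARKit’s TrueDepth blend shapes to detect raised or lowered eyebrows – an assumption about the general approach, not Rainbrow’s actual code; the thresholds and movement callback are hypothetical:

```swift
import ARKit

// Minimal sketch of eyebrow-driven input via ARKit face tracking.
// Not Rainbrow's actual code; thresholds and onMove are hypothetical.
final class ExpressionTracker: NSObject, ARSessionDelegate {
    enum Direction { case up, down }

    let session = ARSession()
    var onMove: ((Direction) -> Void)?

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        let browUp = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        let browDown = ((face.blendShapes[.browDownLeft]?.floatValue ?? 0)
                      + (face.blendShapes[.browDownRight]?.floatValue ?? 0)) / 2

        // A neutral expression stays below both thresholds, so nothing moves.
        if browUp > 0.5 {
            onMove?(.up)
        } else if browDown > 0.4 {
            onMove?(.down)
        }
    }
}
```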

Here’s what I wrote two weeks ago in the TrueDepth section of my iPhone X story:

I’ve been asking myself which parts of iOS and the iPhone experience could be influenced by attention awareness and redesigned to intelligently fit our context and needs. I don’t think this idea will be limited to Face ID, timers, and auto-lock in the future. What happens, for example, if we take attention awareness farther and imagine how an iPhone X could capture user emotions and reactions? TrueDepth could turn into an attention and context layer that might be able to suggest certain emoji if we’re smiling or shaking our heads, or perhaps automatically zoom into parts of a game if we’re squinting and getting closer to the screen. A future, more sophisticated TrueDepth camera system might even be able to guess which region of the display we’re focusing on, and display contextual controls around it. Siri might decide in a fraction of a second to talk more or less if we’re looking at the screen or not. Lyrics might automatically appear in the Music app if we keep staring at the Now Playing view while listening to a song.

It might be a silly game, but Rainbrow is the kind of different application of TrueDepth I had in mind. The same goes for Nose Zone, a game that uses ARKit’s TrueDepth-based face tracking to turn your nose into a cannon to shoot squares (I’m serious). While these first TrueDepth games are fun gimmicks, I believe we’re going to see invisible, persistent attention awareness and expression tracking become embedded into more types of apps over the next year.

Permalink

How Apple Watch Saved One Man’s Life

Amazing story shared by Scott Killian with 9to5Mac:

Then Killian, who wears his Apple Watch to bed three to four nights a week for sleep tracking, says his Apple Watch woke him up around 1 am with an alert from a third-party app called HeartWatch saying his resting heart rate was elevated while sleeping (Apple recently introduced a built-in feature that can do this with Apple Watch Series 1 and later). Killian experienced mild indigestion, which can be a sign of a heart attack, but says he generally didn’t feel sick.

His Apple Watch charted his heart rate at around 121 beats per minute in the middle of the night while data previously captured showed his average resting heart rate at around 49 beats per minute. The data also showed that this was the first time his resting heart rate had reached this level since he began wearing Apple Watch, so he decided to go to the emergency room as a precaution.

I’ve been using HeartWatch since it came out in late 2015 – if you care about heart rate stats monitored by the Apple Watch, I can’t recommend it enough.
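
For context on how such an alert can work, here’s a rough Swift sketch that compares the latest heart rate sample from HealthKit against a resting baseline – an assumption about the general technique, not HeartWatch’s actual implementation; the fixed 49 bpm baseline and 1.5× threshold are placeholders:

```swift
import HealthKit

// Rough sketch: flag a heart rate reading well above the user's resting baseline.
// Not HeartWatch's actual logic; the baseline and threshold are placeholders.
let store = HKHealthStore()
let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!
let bpm = HKUnit.count().unitDivided(by: .minute())

store.requestAuthorization(toShare: nil, read: [heartRateType]) { granted, _ in
    guard granted else { return }

    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
    let query = HKSampleQuery(sampleType: heartRateType,
                              predicate: nil,
                              limit: 1,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        guard let latest = samples?.first as? HKQuantitySample else { return }
        let current = latest.quantity.doubleValue(for: bpm)
        let restingBaseline = 49.0 // e.g. a long-term resting average, in bpm

        // Alert if the latest reading is well above the resting baseline.
        if current > restingBaseline * 1.5 {
            print("Elevated heart rate: \(Int(current)) bpm vs. resting \(Int(restingBaseline)) bpm")
        }
    }
    store.execute(query)
}
```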

Permalink

The State of Nintendo’s Smartphone Games

Great overview by Bryan Finch, writing for Nintendo Wire, on the state of Nintendo’s high-profile mobile titles:

With the recent release of Animal Crossing: Pocket Camp, Nintendo has now delivered all of its previously announced smartphone games. The shocking change in company policy that led to the development of these titles was one of the final projects that Nintendo’s former president, Satoru Iwata, managed before his untimely passing.

These games have been a mixed bag of success for Nintendo, both in terms of quality and profits, and since all of the known games are now out in the wild, it’s a good time to check in and see where each Nintendo mobile game stands at the end of 2017.

My goal here is to examine what the games set out to achieve, how successful they were with those goals on launch, where they are today and where they can go from here.

Finch is spot-on about Super Mario Run and what went wrong with the game, and I agree with his assessment of Animal Crossing’s future potential. I wonder what Nintendo could do with a future mobile Zelda game.

See also: rumors of Nintendo looking for another mobile development partner, and Pocket Camp’s performance thus far.

Permalink

Connected, Episode 172: Shazamalo

Myke was surprised by Apple’s Shazam acquisition, Ticci is living that 4K life and Stephen is thinking about an iMac Pro.

A fun episode of Connected this week with a good variety of topics. You can listen here.

Sponsored by:

  • Balance Open: Find out more about Balance Open and get $2 in Ethereum.
  • Squarespace: Make your next move. Enter offer code WORLD at checkout to get 10% off your first purchase.
  • Incapsula: Secure and accelerate your website. Connected listeners get one month free.

Permalink

AppStories, Episode 35 – An Interview with Christian Selig, Creator of Reddit Client Apollo

On this week’s episode of AppStories, we interview Christian Selig, the creator of the popular iOS Reddit client Apollo, about the development and design of the app, incorporating Redditors’ feedback, the complexities inherent in building a Reddit app, and working in Swift.

Sponsored by:

  • Jamf Now – Manage and protect your Apple devices in minutes. Create your free account today.
  • Dejal Time Out – You deserve a break. Try Time Out for Mac for free.

Permalink

How to Design for iPhone X (Without an iPhone X)

Great analysis by Sebastiaan de With on how the Halide team redesigned the app for the iPhone X (Halide has indeed turned out to be one of the best iPhone X app updates we’ve seen so far):

Design for ergonomics. On regular iPhones, you have to do much less as a designer to optimize ergonomics. The iPhone X requires you to think about comfortable button placement and usability. Ergonomics is more than just tapping, but also swiping and other gestures. Lay out your UI so all actions are accessible and as comfortably usable as possible.

It’s a whole new device: Design for it. Everyone can stretch an app out to a larger screen, but just like the iPad, a fresh approach is not only welcomed but helps you stand out in the App Store. This is a great time to validate your current design. Are your approaches still valid? Is there a better solution possible? You might come to some valuable insights that you can apply to all your designs, not just the ones for the shiny new device.

If you’re a developer working on iPhone X UI updates, don’t miss Sebastiaan’s map visualization of the device’s display.
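
On the implementation side, here’s a minimal UIKit sketch of the baseline step: pinning primary controls to the safe area so they clear the sensor housing and home indicator. It’s a general iOS 11 pattern rather than code from the article, and the view controller and button names are hypothetical:

```swift
import UIKit

// Minimal sketch: keep primary controls inside the safe area so they clear the
// sensor housing and home indicator on iPhone X. A general UIKit pattern, not
// code from the Halide article; the view controller and button are hypothetical.
final class CameraViewController: UIViewController {
    private let shutterButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()

        shutterButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(shutterButton)

        let safe = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            shutterButton.centerXAnchor.constraint(equalTo: safe.centerXAnchor),
            // Keep the primary action near the bottom, but above the home indicator.
            shutterButton.bottomAnchor.constraint(equalTo: safe.bottomAnchor, constant: -16),
            shutterButton.widthAnchor.constraint(equalToConstant: 72),
            shutterButton.heightAnchor.constraint(equalToConstant: 72)
        ])
    }
}
```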

Permalink

Twitter Rolling Out Official Support for Threads

Twitter’s latest feature – which is rolling out “in the coming weeks” – is another that’s been inspired by something users have been doing for a few years now: threads.

From the Twitter blog:

At Twitter, we have a history of studying how people use our service and then creating features to make what they’re doing easier. The Retweet, ‘@reply’, and hashtag are examples of this. A few years ago we noticed people creatively stitching Tweets together to share more information or tell a longer story – like this. We saw this approach (which we call “threading”) as an innovative way to present a train of thought, made up of connected but individual elements.

Now, hundreds of thousands of threads are Tweeted every day! But this method of Tweeting, while effective and popular, can be tricky for some to create and it’s often tough to read or discover all the Tweets in a thread. That’s why we’re thrilled to share that we’re making it simpler to thread Tweets together, and to find threads, so it’s easier to express yourself on Twitter and stay informed.

We’ve made it easy to create a thread by adding a plus button in the composer, so you can connect your thoughts and publish your threaded Tweets all at the same time. You can continue adding more Tweets to your published thread at any time with the new “Add another Tweet” button. Additionally, it’s now simpler to spot a thread – we’ve added an obvious “Show this thread” label.

As far as I can tell, this is a prettier interface for the original method of creating a thread by replying to yourself. Twitter has integrated a multi-post feature into the app’s compose box, and there doesn’t seem to be a new API endpoint for threading. It seems like a nice workflow with a ‘Tweet All’ button at the end. In theory, popular third-party clients could replicate the same behavior (and design) in their own compose UIs – just like various tweetstorm utilities create “threads” by posting multiple replies in a row.
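
To illustrate, here’s a tiny Swift sketch of that chaining approach – each Tweet is posted as a reply to the previous one’s ID. The postTweet closure stands in for a call to Twitter’s statuses/update endpoint (which accepts an in_reply_to_status_id parameter); it’s a hypothetical helper, not Twitter’s or any client’s actual code:

```swift
// Sketch of threading by chained replies, as tweetstorm utilities do it.
// `postTweet` is a hypothetical wrapper around Twitter's statuses/update
// endpoint and returns the new Tweet's ID.
func postThread(_ tweets: [String],
                postTweet: (_ text: String, _ inReplyToID: String?) -> String) {
    var previousID: String?
    for text in tweets {
        // Each Tweet replies to the one before it, so clients display a thread.
        previousID = postTweet(text, previousID)
    }
}
```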

Permalink

T3 Interviews Phil Schiller About Face ID, AirPods, HomePod, and More

UK lifestyle site T3 has an in-depth interview with Phil Schiller, Apple’s Senior Vice President of Worldwide Marketing. The interview covers a wide range of topics including the iPhone X, Face ID, AirPods, ARKit, HomeKit, the Apple Pencil, the iMac Pro, and the HomePod.

Schiller credits Apple’s tight integration of software and hardware and cross-team collaboration with the success of Face ID:

Other companies certainly have had the vision of ‘can you unlock something with someone’s face?’ but no one [has] actually delivered technology as advanced and capable and ubiquitous and consumer friendly as Face ID. And that is the direct result of this collaboration, and how these teams work for years together on a simple powerful idea with all that technology.

He also points to AirPods as an example of the extensive engineering that goes into making a seemingly simple product:

So frequently, I talk to customers who say, ‘My favourite product Apple has ever made is AirPods.’ And that’s just a really nice thing to hear. I love when customers respond that one of their favourite products is something this simple, and yet so much work went into it.

At the surface level, it’s an incredibly simple product. But the reality is it’s actually an incredibly complex product to make. Each AirPod really is its own computer, running software and hardware. And those two computers need to deliver this very clear experience that you want, and they have to work together, because we’re very attuned to synchronisation in audio as a species. And so it has to work the way you want.

One of our favourite features is just the idea that you take it out and the music stops – you put it back in and it keeps going again. Again, that’s a simple idea, but took a lot of engineering to make it work quickly, reliably, for all of us in all different ear sizes and different situations. And they have to work with this iPhone that may be in your pocket or your bag, across your body. And as you know, our bodies are big bags of water, which are really bad for radio signals to get through.

Phil Schiller has an impressive knack for explaining Apple’s vision for its products, which makes this interview worth reading in its entirety.

Permalink