Posts tagged with "iPhone X"

iOS 11 X and iOS 11 Classic

In his review of the iPhone X, John Gruber astutely points out that the device effectively runs a fork of iOS 11:

There were always two things and only two things on the front face of an iOS device — the touchscreen display and the home button. In fact, the iPhone X changes iOS in more fundamental ways than even the iPad did. In terms of the relationship between the display and the home button, the iPad really was — and remains today — “just a big iPhone”.

The iPhone X, however, creates a schism, akin to a reboot of the franchise.

Apple hasn’t called attention to this, but effectively there are two versions of iOS 11 — I’ll call them “iOS 11 X”, which runs only on iPhone X, and “iOS 11 Classic”, which runs on everything else.

The fundamental premise of iOS Classic is that a running app gets the entire display, and the home button is how you interact with the system to get out of the current app and into another. Before Touch ID, the home button was even labeled with a generic empty “app” icon, an iconographic touch of brilliance.

This is a great way to think about the evolution of iOS going forward. As I noted last month, the iPhone X will reshape the entire iOS ecosystem over the next few years. Consequently, it’ll also make it more challenging to review a new version of iOS, as we’ll have to account for deeply different variations of the same features.

Permalink

Rainbrow, a Game Controlled by the iPhone X’s TrueDepth Camera

Joe Rossignol, writing for MacRumors about Rainbrow, a free arcade game for the iPhone X that requires you to raise your eyebrows or frown to move the character on screen:

Simply raise your eyebrows to move the emoji up, frown to move the emoji down, or make a neutral expression and the emoji stays still. Note that if you raise your eyebrows, and keep them raised, the emoji will continue to move in an upwards direction, and vice versa when maintaining a frowning expression.

While there are no levels, the game gets increasingly difficult as more obstacles appear. The goal is simply to get the highest score possible, but players can only compete against themselves right now. Gitter [the game’s developer] told us that he plans to integrate Apple’s Game Center for multiplayer competition in a future update.

Here’s a video of the game in action:

I played this for 20 minutes last night. It’s genius. Using the TrueDepth camera on the iPhone X, Rainbrow detects the movements of the muscles around your eyes, so raising or lowering your eyebrows moves an emoji up and down to collect points. What makes this game feel like magic – as if the iPhone is reading your mind – is that there’s no camera preview on screen and no buttons to press: you don’t see your face in a corner; the game simply reacts to your expressions in real time, without an interface separating you from the actual gameplay. It’s fun, and it’s a good demonstration of the accuracy of the TrueDepth system.
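For developers curious how this kind of input works under the hood, ARKit’s face tracking on the TrueDepth camera exposes per-expression “blend shape” coefficients that an app can read on every frame. Here’s a minimal sketch of the idea (not Rainbrow’s actual code; the GameState type and the 0.5 thresholds are placeholder assumptions):

```swift
import ARKit

// Illustrative only: map eyebrow blend shapes from ARKit face tracking
// to a simple up/down/idle input state, Rainbrow-style.
enum GameState {
    case movingUp, movingDown, idle
}

final class FaceInputController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X at the time).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Blend shape coefficients range from 0.0 (neutral) to 1.0 (fully expressed).
        let browsUp = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        let browsDown = ((face.blendShapes[.browDownLeft]?.floatValue ?? 0) +
                         (face.blendShapes[.browDownRight]?.floatValue ?? 0)) / 2

        let state: GameState
        if browsUp > 0.5 {          // eyebrows raised: move up
            state = .movingUp
        } else if browsDown > 0.5 { // frowning: move down
            state = .movingDown
        } else {
            state = .idle
        }
        // Feed `state` into the game loop here.
        _ = state
    }
}
```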

Here’s what I wrote two weeks ago in the TrueDepth section of my iPhone X story:

I’ve been asking myself which parts of iOS and the iPhone experience could be influenced by attention awareness and redesigned to intelligently fit our context and needs. I don’t think this idea will be limited to Face ID, timers, and auto-lock in the future. What happens, for example, if we take attention awareness farther and imagine how an iPhone X could capture user emotions and reactions? TrueDepth could turn into an attention and context layer that might be able to suggest certain emoji if we’re smiling or shaking our heads, or perhaps automatically zoom into parts of a game if we’re squinting and getting closer to the screen. A future, more sophisticated TrueDepth camera system might even be able to guess which region of the display we’re focusing on, and display contextual controls around it. Siri might decide in a fraction of a second to talk more or less if we’re looking at the screen or not. Lyrics might automatically appear in the Music app if we keep staring at the Now Playing view while listening to a song.

It might be a silly game, but Rainbrow is the kind of different application of TrueDepth I had in mind. The same goes for Nose Zone, a game that uses ARKit’s TrueDepth-based face tracking to turn your nose into a cannon to shoot squares (I’m serious). While these first TrueDepth games are fun gimmicks, I believe we’re going to see invisible, persistent attention awareness and expression tracking become embedded into more types of apps over the next year.

Permalink

Bear 1.4 Brings Tag and Note Autocomplete, True Black Theme for iPhone X, and More

As I noted last month in my iPad Diaries column, I’ve started using Bear in addition to Apple Notes to research articles in Markdown and later convert them to drafts in Ulysses. I was impressed with Shiny Frog’s work on iOS 11 and how they brought advanced drag and drop to Bear, but I’m even more positively surprised by the improvements to tagging they released today as part of Bear’s 1.4 update.

Read more


How to Design for iPhone X (Without an iPhone X)

Great analysis by Sebastiaan de With on how they redesigned Halide for the iPhone X (the app indeed turned out to be one of the best iPhone X app updates we’ve seen so far):

Design for ergonomics. On regular iPhones, you have to do much less as a designer to optimize ergonomics. The iPhone X requires you to think about comfortable button placement and usability. Ergonomics is more than just tapping, but also swiping and other gestures. Lay out your UI so all actions are accessible and as comfortably usable as possible.

It’s a whole new device: Design for it. Everyone can stretch an app out to a larger screen, but just like the iPad, a fresh approach is not only welcomed but helps you stand out in the App Store. This is a great time to validate your current design. Are your approaches still valid? Is there a better solution possible? You might come to some valuable insights that you can apply to all your designs, not just the ones for the shiny new device.

If you’re a developer working on iPhone X UI updates, don’t miss Sebastiaan’s map visualization of the device’s display.
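As a rough illustration of the layout work Sebastiaan is describing, the usual starting point on iPhone X is pinning primary controls to the safe area so they clear the rounded corners, the sensor housing, and the home indicator. This is a generic sketch under assumed names (the shutterButton and the 20-point padding are mine), not Halide’s actual implementation:

```swift
import UIKit

// Illustrative only: keep the main control centered and comfortably above
// the home indicator by constraining it to the safe area (iOS 11+).
final class CameraViewController: UIViewController {
    private let shutterButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        shutterButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(shutterButton)

        let safeArea = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            shutterButton.centerXAnchor.constraint(equalTo: safeArea.centerXAnchor),
            // Anchoring to the safe area's bottom keeps the button above the
            // home indicator on iPhone X and flush with the bottom elsewhere.
            shutterButton.bottomAnchor.constraint(equalTo: safeArea.bottomAnchor, constant: -20)
        ])
    }
}
```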

Permalink

iPhone X: A New Frontier

For a radically redesigned iPhone launching on the tenth anniversary of the first model, it seems reasonable to lead a review of the device with a retrospective on the original iPhone and how, over the course of a decade, its vision matured into the iPhone X. But today, I want to begin with the iPhone 4.

Read more



Google Maps Updated with iPhone X Support

Google continues to chip away at iPhone X and iOS 11 support for its iOS apps. Today, Google Maps was updated to take advantage of the iPhone X’s expansive display. Maps extend in every direction to the edge of the screen, which looks much better than the previously letterboxed version of the app.

The design could use a few tweaks though. As Adam Swinden points out on Twitter, the corners of the ‘Explore’ button at the bottom of the screen are clipped and it is too close to the top edge of the Home indicator.

Earlier this week Google updated Docs, Slides, and Sheets for the iPhone X and implemented basic drag and drop support.


Khoi Vinh on iPhone X’s Portrait Mode

Fair and balanced take on the iPhone’s Portrait mode by Khoi Vinh, who generally prefers a DSLR to capture special moments:

More to the point, quibbling over the finer points of photographic effects is somewhat (though not entirely) pointless. What really matters here is that there will be tens if not hundreds of millions of these cameras in the hands of countless people everywhere before too long, and those people will take billions of pictures with them. Only a vanishingly small number of these people will ever object to the details I’ve listed here; most will be incredibly pleased with how portrait mode performs and will share the fruits of their labors avidly.

Just on the merits of sheer volume alone, portrait mode will become a part of our collective visual vocabulary.

Permalink

Apple Posts iPhone X Videos Promoting Face ID and Animoji

Apple has posted four videos highlighting exclusive iPhone X features. Three of the videos focus on Face ID and Apple Pay, while the fourth spotlights Animoji.

One Face ID video is a broad introduction to the iPhone X, Face ID, and using Apple Pay with Face ID. The other two Face ID videos have a narrower focus. One demonstrates that Face ID works even if you change your look. The spot features a woman with different hairstyles, jewelry, glasses, makeup, and clothing unlocking her iPhone X with each new look. The other Face ID video shows that the feature works in complete darkness.

The final spot embraces the Animoji karaoke phenomenon. As a woman sings “All Night” by Big Boi, a series of Animoji sing along with her, just like the many Animoji karaoke videos that have been posted to Twitter and elsewhere.

The four short videos, which you can view after the break below, are available on YouTube and will likely begin showing up on television soon.

Read more


Dual Lens Switching on iPhone X

Dan Provost of Studio Neat (makers of the excellent Glif) ran some tests to analyze the low-light performance of the iPhone X’s telephoto lens:

Last year, when the iPhone 7 Plus was released, Glenn Fleishman wrote a terrific piece for Macworld about how the dual lens camera system works. In short, when you zoom in to 2X, the camera does not always switch to the telephoto lens. In some cases (typically in low light scenarios), you will be presented with a cropped image from the wide angle lens instead. This was sacrilege to camera nerds, but Apple would argue that if the cropped image looks better in those low light situations, then that is the correct approach.

The results are impressive:

As you can see, the iPhone X required very little light before it decided to use the telephoto lens. The iPhone 7 Plus required quite a bit more. I used the app Light Meter to measure the light at each interval, which I denote in the video. The app measures the lux, which is a measure of illuminance equal to one lumen per square meter. (I measured from both devices and averaged the results, as the readings were slightly different. I wouldn’t expect an app to function as well as a true light meter, but this probably gets us in the ballpark.)

Make sure to check out the video to see the lens switching in action. The difference between the iPhone 7 Plus and the X is substantial when it comes to the amount of light required for the system to pick the telephoto lens.
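If you want to verify the switching behavior programmatically rather than by eye, one rough approach (not the method Dan used) is to read the EXIF focal length off the captured photo, since the wide-angle and telephoto modules report different physical focal lengths. The 5.0mm cutoff below is an assumption for illustration:

```swift
import AVFoundation
import ImageIO

// Illustrative only: inspect a captured photo's EXIF focal length to infer
// whether the telephoto lens actually fired or the system delivered a
// cropped wide-angle shot instead.
final class LensChecker: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let exif = photo.metadata[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let focalLength = (exif[kCGImagePropertyExifFocalLength as String] as? NSNumber)?.doubleValue
        else { return }

        // A shorter physical focal length means a cropped wide-angle capture;
        // a longer one means the telephoto lens was used.
        let usedTelephoto = focalLength > 5.0
        print("Focal length: \(focalLength)mm, telephoto used: \(usedTelephoto)")
    }
}
```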

Permalink