Simply raise your eyebrows to move the emoji up, frown to move it down, or hold a neutral expression to keep it still. Note that if you raise your eyebrows and keep them raised, the emoji will continue moving upward, and vice versa while you maintain a frowning expression.
While there are no levels, the game gets increasingly difficult as more obstacles appear. The goal is simply to get the highest score possible, but players can only compete against themselves right now. Gitter told us that he plans to integrate Apple's Game Center for multiplayer competition in a future update.
Here's a video of the game in action:
I played this for 20 minutes last night. It's genius. Using the TrueDepth camera on the iPhone X, Rainbrow detects the movements of the muscles around your eyes, asking you to raise or lower your eyebrows to move an emoji up and down and collect points. What makes the game feel like magic – as if the iPhone is reading your mind – is that there's no camera preview on screen and no buttons to press: you don't see your face in a corner; the game simply reacts to your expressions in real time, with no interface separating you from the actual gameplay. It's fun, and it's a good demonstration of the accuracy of the TrueDepth system.
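For developers curious how a game like this works under the hood, here's a minimal sketch of reading eyebrow movement through ARKit's TrueDepth face tracking. Rainbrow's actual implementation isn't shown here; the class name and the 0.5 thresholds are my own illustrative assumptions, but the `ARFaceTrackingConfiguration` and `blendShapes` APIs are the standard way to do this:

```swift
import ARKit

// Hypothetical sketch of TrueDepth eyebrow tracking, not Rainbrow's source.
final class BrowTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Blend shape coefficients are normalized values from 0.0 to 1.0.
        let browUp = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        let browDown = face.blendShapes[.browDownLeft]?.floatValue ?? 0
        if browUp > 0.5 {
            // eyebrows raised: keep moving the emoji up
        } else if browDown > 0.5 {
            // frowning: keep moving the emoji down
        }
        // otherwise neutral: the emoji stays still
    }
}
```

Because the delegate fires on every frame, a sustained expression keeps producing movement, which matches the behavior described above.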
Here's what I wrote two weeks ago in the TrueDepth section of my iPhone X story:
I've been asking myself which parts of iOS and the iPhone experience could be influenced by attention awareness and redesigned to intelligently fit our context and needs. I don't think this idea will be limited to Face ID, timers, and auto-lock in the future. What happens, for example, if we take attention awareness further and imagine how an iPhone X could capture user emotions and reactions? TrueDepth could turn into an attention and context layer that might be able to suggest certain emoji if we're smiling or shaking our heads, or perhaps automatically zoom into parts of a game if we're squinting and getting closer to the screen. A future, more sophisticated TrueDepth camera system might even be able to guess which region of the display we're focusing on, and display contextual controls around it. Siri might decide in a fraction of a second to talk more or less if we're looking at the screen or not. Lyrics might automatically appear in the Music app if we keep staring at the Now Playing view while listening to a song.
It might be a silly game, but Rainbrow is exactly the kind of alternative TrueDepth application I had in mind. The same goes for Nose Zone, a game that uses ARKit's TrueDepth-based face tracking to turn your nose into a cannon that shoots squares (I'm serious). While these first TrueDepth games are fun gimmicks, I believe we're going to see invisible, persistent attention awareness and expression tracking become embedded into more types of apps over the next year.