Edgar Alvarez reports for Engadget on an upcoming AR feature I can’t wait to try:
Nike has been experimenting heavily with augmented reality for a few years now, and the company is continuing to work on new experiences powered by the technology. The sportswear giant is now introducing Nike Fit, a feature that uses a combination of computer vision, scientific data, artificial intelligence and recommendation algorithms to scan your feet and find the right shoe fit for you. And you can do it all in augmented reality, using the Nike app on your smartphone. Nike says that, according to industry research, over 60 percent of people wear the wrong size shoes. With Nike Fit, the company is hoping to solve that problem.
The AR experience itself is fairly simple: You open up the Nike app, go to a product page and, next to where there's usually a menu that lets you pick the size of your shoes, you'll see a new option to measure your feet. From there, the camera will pop up and you'll be asked to stand next to a wall and point your smartphone at your feet, which will prompt a view that uses two AR circles to level your phone. Once the feature recognizes your feet and your physical environment, it starts scanning your feet and then tells you your ideal shoe size for Nike footwear. The entire process takes less than a minute.
It’s been two years since Apple introduced ARKit, yet there are still very few AR apps I’ve found that add meaningful value to my life. Outside of measuring apps, the only ARKit experiences I’ve enjoyed have involved shopping for furniture and other home accessories. It looks like shoe shopping will be the next major area where AR becomes more than a fun demo. According to Alvarez, the Nike Fit feature will be added to the Nike app this July.
Later this year the team at Niantic will launch a new game based on a very popular IP: Harry Potter. The game's full title is Harry Potter: Wizards Unite, and it will come to both iOS and Android as an AR-centered gaming experience in the vein of Pokémon GO, Niantic's biggest hit.
Today Nick Statt of The Verge published his impressions of Wizards Unite following a preview Niantic hosted recently. Statt writes:
The core activity in Wizards Unite involves collecting a mix of artifacts and sometimes saving notable characters from the series who are stuck in a bad situation, like the titular Harry being pinned down by a spooky dementor. From there, you play a little tracing mini-game, to evoke the casting of a spell and then collect the item or free your friend. Afterward, you’re able to add that item or character to your collections book, earn experience points, and level up.
Beyond traversing the map, collecting those artifacts, and visiting inns to eat food, players of Wizards Unite will have a few more advanced activities to keep them busy. Those include leveling up your character, picking a subclass (called a “profession”) to learn new abilities, and then teaming up with up to five other players to compete in a fortress (what Niantic has designed to be this game’s version of gym battles from Pokémon Go). These team challenges feel like a cross between a traditional strategy game and something similar to Nintendo’s Fire Emblem, where one-on-one battles take place simultaneously after players select an enemy from a top-down map.
Based on Statt's article, Harry Potter: Wizards Unite is going to have a lot more depth and complexity to it than Pokémon GO, which I'm excited about. Pokémon GO has gradually evolved into more of a gamer's game, but with Wizards Unite it sounds like that depth will be there from the start for those who want it.
There's still no firm release date available for Wizards Unite, but a 2019 launch is confirmed.
I recently came across a demo of AR Quick Look, an iOS 12 feature that allows apps to present 3D and AR previews of objects built using the new USDZ file format. Shopify, the popular e-commerce platform, is going to take advantage of AR Quick Look to let customers preview items in their surroundings directly from Safari, right in the context of the shopping experience.
Here's Daniel Beauchamp, writing on the Shopify AR/VR blog:
For the past three years, Shopify has been exploring how AR / VR will change the way consumers shop. Last year, we showed how Apple’s ARKit could be used to provide compelling AR commerce experiences. The main complexity was that ARKit needed to be run in an app. This meant that Shopify merchants looking to offer these experiences had to have their own unique mobile apps that customers would need to download.
With iOS 12’s AR Quick Look, 3D models of products in the usdz file format can be uploaded directly to online Shopify stores and viewed in AR right within Safari, without needing to download a separate app.
His video gives you an even better idea of the integration possible between Safari, ARKit, and Apple Pay in iOS 12:
Beauchamp argues that "the web is how AR becomes mainstream" – looking at these demos, it's hard to disagree. Not having to install a dedicated ARKit app for every single online store we use and actually having the ability to share and preview models from Safari or Messages is going to remove a ton of friction from the current ARKit experience (as far as shopping is concerned). I can imagine that producing 3D objects at scale will be merchants' biggest hurdle in the short term, though.
I wasn't aware of this until I did some research, but Apple also launched an interactive AR Quick Look Gallery as part of their ARKit 2 mini-site. You can also test Shopify's improved shopping flow featuring ARKit and Apple Pay here.
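For merchants curious about the mechanics, embedding a usdz model for AR Quick Look on the web takes only a small amount of markup: Safari presents the AR viewer when a link with the `rel="ar"` attribute wraps a single preview image and points at a usdz file. A minimal sketch (the file names here are hypothetical):

```html
<!-- Tapping the preview image opens Safari's AR Quick Look viewer,
     letting the customer place the model in their surroundings. -->
<a rel="ar" href="/models/armchair.usdz">
  <img src="/images/armchair-preview.jpg" alt="Armchair – tap to view in AR">
</a>
```

No JavaScript is required, which is part of why this approach removes so much friction compared to a dedicated app.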
At last September’s iPhone event at Apple Park, Craig Federighi, Senior Vice President of Software Engineering, used Snapchat to show how the iPhone X’s TrueDepth camera and ARKit could create realistic camera filters. Today, those filters finally launched with an update to Snapchat’s iOS app, which is featured in the Today section of the App Store.
The update includes the two filters demoed by Federighi during the keynote, as well as a masquerade ball Lens. The advantage of using the TrueDepth camera and ARKit is that the filters can track your face better than other Snapchat Lenses and account for the lighting in the room, providing realistic highlights and shadows.
The new filters are available now, even without updating Snapchat’s app. If you long-press the screen with the selfie camera enabled, the new Lenses are currently the first three listed.
Today Apple released the latest update to iOS, version 11.3, which adds the previously promised iPhone battery health settings, along with four new Animoji, ARKit 1.5, a new Music Videos section in Apple Music, and more.
Andrew Webster, writing for The Verge:
There’s been a wave of location-based mobile games announced recently, based on everything from The Walking Dead to Jurassic World. It turns out these games have more in common than just timing: they’re all powered by Google Maps. Today Google is announcing that it’s opening up its ubiquitous mapping platform to allow game developers to more easily create real-world games. The next Pokémon Go might finally be on the way.
Gaining access to a real-time mapping source like Google Maps is huge for developers, but the additional tools that go along with Google's newly announced game platform take that a step further. Google is also launching a Unity SDK to tie into its mapping data, and enabling gameplay experiences to be built around specific locations or location types.
Developers can do things like choose particular kinds of buildings or locations — say, all stores or restaurants — and transform each one. A fantasy realm could turn all hotels into restorative inns, for instance, or anything else.
This sounds like it could be a huge boon to the mobile game market, particularly when combined with tools like Apple's ARKit. Yes, it means we'll probably get tired of all the games trying to replicate Pokémon GO's success, but these new developer tools will also likely enable some truly immersive, exciting gaming experiences.
Google created a video that shows off just a glimpse of what's possible with its new Maps APIs.
Last week, The New York Times announced that it had added an augmented reality feature to its iOS app. The first article with embedded AR content, published the same day, was a preview that explained to readers how the feature worked. At the bottom of the article was a newspaper box that could be dropped into your surroundings. I showed it off to some friends over the weekend, and everyone was impressed by how realistic it looked as they walked around the box in a neighbor’s kitchen.
This week, the Times rolled the feature out as part of its Winter Olympics coverage. In Four of the World’s Best Olympians, as You’ve Never Seen Them Before, the publication spotlights figure skater Nathan Chen, speed skater J.R. Celski, hockey player Alex Rigsby, and snowboarder Anna Gasser. The results are impressive. I placed each athlete in my living room, then walked around them. From each angle, snippets of text about what I was seeing were overlaid on the image providing additional details and context. The app also makes use of haptic feedback on the iPhone to alert users to new information as they examine a scene.
Last week Apple announced that the forthcoming iOS 11.3, currently in beta, would introduce ARKit 1.5 – the next evolution of AR tools the company launched last summer.
ARKit 1.5 brings a number of enhancements, including improved surface mapping, 50% higher resolution for the camera view, and more. The most exciting improvement, however, is that ARKit can now detect and interact with vertical surfaces, rather than only horizontal ones.
Developers have begun experimenting with the new features included in ARKit 1.5, and several have shared their early tests on Twitter.
One of childhood's simple joys for many of us was getting our creative juices flowing by playing with building blocks. It's one of those tactile, imaginative outlets that adulthood features far less of. Blocks also brought the added benefit of getting to destroy the work you'd built – a task similarly delightful to the actual building.
Recently my wife and I were babysitting twin 1-year-old boys, owners of a big bucket full of colorful, cardboard bricks. All throughout the night I enjoyed building small towers with the bricks, and the boys would have a blast knocking those towers down. Even when they were on the other side of the room distracted by something else, if they saw me stack three or more bricks together, they'd quickly come running to play demolition crew.
Playground AR is a new app from developer Marc Sureda that uses ARKit to bring the joys of childhood play to all ages – and with no mess to clean up either. The app provides a variety of toys that let you both build and destroy, with a physics system backing it all up to make the experience a delight.
There are three main modes in Playground AR: one is for placing objects in your playground, another lets you better survey and capture photos of what you've built, and the last is for picking up and moving existing objects. Objects you can place of course include blocks of varying shapes and sizes, but there are also lots of other fun, interesting toys to experiment with – trucks, helicopters, dice, spinning widgets, and more.
The physics engine is what makes Playground truly shine. Stacking blocks too high, for example, will cause your creation to topple over if the stack isn't well-balanced. Dominos can be strung together in an elaborate setup then knocked down by a rolling ball. Magnetized blocks will stick together even if gravity or another object forces them to fall. Balloons can be attached to objects, and depending on an object's weight and the number of balloons, the object will eventually be sent flying into the stratosphere. But I would be remiss if I didn't mention my favorite physics demonstration: placing bombs and TNT containers in your playground to blow everything up. It's brilliant.
If you want to spend some time goofing around in an AR sandbox, building and destroying in all kinds of creative ways, you can pick up Playground AR on the App Store for $1.99.