I recently came across a demo of AR Quick Look, an iOS 12 feature that lets apps present 3D and AR previews of objects built with the new USDZ file format. Shopify, the popular e-commerce platform, is going to take advantage of AR Quick Look to let customers preview items in their surroundings directly from Safari, right in the context of the shopping experience.
Here's Daniel Beauchamp, writing on the Shopify AR/VR blog:
For the past three years, Shopify has been exploring how AR / VR will change the way consumers shop. Last year, we showed how Apple’s ARKit could be used to provide compelling AR commerce experiences. The main complexity was that ARKit needed to be run in an app. This meant that Shopify merchants looking to offer these experiences had to have their own unique mobile apps that customers would need to download.
With iOS 12’s AR Quick Look, 3D models of products in the usdz file format can be uploaded directly to online Shopify stores and viewed in AR right within Safari, without needing to download a separate app.
His video gives you an even better idea of the integration possible between Safari, ARKit, and Apple Pay in iOS 12:
Beauchamp argues that "the web is how AR becomes mainstream" – looking at these demos, it's hard to disagree. Not having to install a dedicated ARKit app for every single online store we use, and being able to share and preview models from Safari or Messages, will remove a ton of friction from the current ARKit experience (as far as shopping is concerned). I can imagine, though, that producing 3D objects at scale will be merchants' biggest hurdle in the short term.
I wasn't aware of this until I did some research, but Apple also launched an interactive AR Quick Look Gallery as part of its ARKit 2 mini-site. You can test Shopify's improved shopping flow featuring ARKit and Apple Pay here.
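Beyond Safari, AR Quick Look also works inside native apps, as mentioned above. A minimal sketch of what that looks like with the QuickLook framework on iOS 12 is below; the bundled file name `toy_robot.usdz` is a hypothetical placeholder, not something from the article.

```swift
import UIKit
import QuickLook

// Minimal sketch: presenting a USDZ model with AR Quick Look from a native app.
// "toy_robot.usdz" is a hypothetical file assumed to be bundled with the app.
final class ProductViewController: UIViewController, QLPreviewControllerDataSource {

    func showARPreview() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
    }

    // MARK: QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so a file URL is all Quick Look needs.
        let url = Bundle.main.url(forResource: "toy_robot", withExtension: "usdz")!
        return url as NSURL
    }
}
```

Quick Look handles the rest: the system sheet offers both an object mode and an AR mode that places the model on a detected surface.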
At last September’s iPhone event at Apple Park, Craig Federighi, senior vice president of software engineering, used Snapchat to show how the iPhone X’s TrueDepth camera and ARKit could create realistic camera filters. Today, those filters finally launched with an update to Snapchat’s iOS app, which is featured in the Today section of the App Store.
The update includes the two filters demoed by Federighi during the keynote, as well as a masquerade ball Lens. The advantage of using the TrueDepth camera and ARKit is that the filters can track your face better than other Snapchat Lenses and account for the lighting in the room, providing realistic highlights and shadows.
The new filters are available now without updating Snapchat’s app. If you long-press the screen with the selfie camera enabled, the new Lenses are currently the first three listed.
Today Apple released the latest update to iOS, version 11.3, which adds the previously promised iPhone battery health settings, along with four new Animoji, ARKit 1.5, a new Music Videos section in Apple Music, and more.
Andrew Webster, writing for The Verge:
There’s been a wave of location-based mobile games announced recently, based on everything from The Walking Dead to Jurassic World. It turns out these games have more in common than just timing: they’re all powered by Google Maps. Today Google is announcing that it’s opening up its ubiquitous mapping platform to allow game developers to more easily create real-world games. The next Pokémon Go might finally be on the way.
Gaining access to a real-time mapping source like Google Maps is huge for developers, but the additional tools that go along with Google's newly announced game platform take that a step further. Google is also launching a Unity SDK to tie into its mapping data, and enabling gameplay experiences to be built around specific locations or location types.
Developers can do things like choose particular kinds of buildings or locations — say, all stores or restaurants — and transform each one. A fantasy realm could turn all hotels into restorative inns, for instance, or anything else.
This sounds like it could be a huge boon to the mobile game market, particularly when combined with tools like Apple's ARKit. Yes, it means we'll probably get tired of all the games trying to replicate Pokémon GO's success, but these new developer tools will also likely enable some truly immersive, exciting gaming experiences.
Google created a video that shows off just a glimpse of what's possible with its new Maps APIs.
Last week, The New York Times announced that it had added an augmented reality feature to its iOS app. The first article with embedded AR content was a preview of the feature published last week that explained to readers how it worked. At the bottom of the article was a newspaper box that could be dropped into your surroundings. I showed it off to some friends over the weekend, and everyone was impressed by how realistic it looked as they walked around the box in a neighbor’s kitchen.
This week, the Times rolled the feature out as part of its Winter Olympics coverage. In Four of the World’s Best Olympians, as You’ve Never Seen Them Before, the publication spotlights figure skater Nathan Chen, speed skater J.R. Celski, hockey player Alex Rigsby, and snowboarder Anna Gasser. The results are impressive. I placed each athlete in my living room, then walked around them. From each angle, snippets of text about what I was seeing were overlaid on the image, providing additional details and context. The app also makes use of haptic feedback on the iPhone to alert users to new information as they examine a scene.
Last week Apple announced that the forthcoming iOS 11.3, currently in beta, would introduce ARKit 1.5 – the next evolution of AR tools the company launched last summer.
ARKit 1.5 brings a number of enhancements, including improved surface mapping, a 50% higher-resolution camera view, and more. The most exciting improvement, however, is that ARKit can now interact with vertical surfaces, rather than only horizontal ones.
Developers have begun experimenting with the new features included in ARKit 1.5, and several have shared their early tests on Twitter.
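For developers, opting into the new capability is a one-line change: plane detection on `ARWorldTrackingConfiguration` becomes an option set that can include vertical planes. A minimal sketch, assuming an ARKit scene view is already set up:

```swift
import ARKit

// Minimal sketch: enabling ARKit 1.5's vertical plane detection (iOS 11.3+).
// Before 11.3, only .horizontal was available.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]

let sceneView = ARSCNView(frame: .zero)
sceneView.session.run(configuration)
```

Detected walls then arrive through the same `ARPlaneAnchor` callbacks as floors and tabletops, which is what makes the wall-mounted demos circulating on Twitter possible.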
One of childhood's simple joys for many of us was getting our creative juices flowing by playing with building blocks. It's one of those tactile, imaginative outlets that adulthood features far less of. Blocks also brought the added benefit of getting to destroy the work you'd built – a task similarly delightful to the actual building.
Recently my wife and I were babysitting twin 1-year-old boys, owners of a big bucket full of colorful cardboard bricks. All throughout the night I enjoyed building small towers with the bricks, and the boys would have a blast knocking those towers down. Even when they were on the other side of the room distracted by something else, if they saw me stack three or more bricks together, they'd quickly come running to play demolition crew.
Playground AR is a new app from developer Marc Sureda that uses ARKit to bring the joys of childhood play to all ages – and with no mess to clean up either. The app provides a variety of toys that let you both build and destroy, with a physics system backing it all up to make the experience a delight.
There are three main modes in Playground AR: one is for placing objects in your playground, another lets you better survey and capture photos of what you've built, and the last is for picking up and moving existing objects. Objects you can place of course include blocks of varying shapes and sizes, but there are also lots of other fun, interesting toys to experiment with – trucks, helicopters, dice, spinning widgets, and more.
The physics engine is what makes Playground truly shine. Stacking blocks too high, for example, will cause your creation to topple over if the stack isn't well-balanced. Dominos can be strung together in an elaborate setup then knocked down by a rolling ball. Magnetized blocks will stick together even if gravity or another object forces them to fall. Balloons can be attached to objects, and depending on an object's weight and the number of balloons, the object will eventually be sent flying into the stratosphere. But I would be remiss if I didn't mention my favorite physics demonstration: placing bombs and TNT containers in your playground to blow everything up. It's brilliant.
If you want to spend some time goofing around in an AR sandbox, building and destroying in all kinds of creative ways, you can pick up Playground AR on the App Store for $1.99.
When was the last time you wanted to measure something, and the tool you needed was already within arm’s reach? I can’t think of a single time that’s happened to me. I don’t work in construction, so I don’t carry measuring tape or any similar tools around with me. But you know what I do always have with me? My iPhone. With AR MeasureKit, developer Rinat Khanov has created a toolkit that can be carried around with no extra baggage – all you need is your phone.
In late June, when ARKit developer demos began showing up online, one of the first prominent videos featured digital measuring tape. Many people were amazed at ARKit’s ability to measure distance with such accuracy, while others simply wrote the idea off as a silly use of AR. While I think a simple AR ruler as seen in that first demo would still have been useful, MeasureKit offers much more than that. The app contains a comprehensive set of tools that turns your iPhone or iPad into a sort of Swiss Army Knife of measuring.
Apple’s developer site now contains Human Interface Guidelines for augmented reality apps. The guidelines are not hard-and-fast rules for developers working with ARKit, but rather “best practices” Apple suggests for an ideal user experience. Guidelines that stand out include:
Use audio and haptic feedback to enhance the immersive experience. A sound effect or bump sensation is a great way to provide confirmation that a virtual object has come into contact with a physical surface or other virtual object.
To the extent possible, provide hints in context. Placing a three-dimensional rotation indicator around an object, for example, is more intuitive than presenting text-based instructions in an overlay.
Favor direct manipulation over separate onscreen controls. It's more immersive and intuitive when a user can touch an object onscreen and interact with it directly, rather than interact with separate controls on a different part of the screen.
Suggest possible fixes if problems occur. Analysis of the user's environment and surface detection can fail for a variety of reasons—there's not enough light, a surface is too reflective, a surface doesn't have enough detail, or there's too much camera motion. If your app is notified of insufficient detail or too much motion, or if surface detection takes too long, offer suggestions for resolving the problem.
ARKit is a brand new technology that opens up a world of possibilities to app developers. But alongside its potential for magical, immersive experiences is the potential for user frustration as developers learn the hard way what works best. Apple’s guidelines – though released later than I’m sure many developers would like – should help minimize those frustrations.