We’re huge fans of Vito Technologies’ Star Walk products here at MacStories: winner of an Apple Design Award in June 2010 and released on the Mac under the “Solar Walk” brand a few months ago, Star Walk is an incredible product that enables you to explore the universe (stars, constellations, planets, satellites) with swipes and taps thanks to iOS multitouch integration.
If you own a copy of Star Walk for the iPad and you recently bought an iPad 2, a brand new update is waiting for you in the App Store. Released two days ago, Star Walk 5.2 (also available on the iPhone; the app is not universal) celebrates the 50th anniversary of the first human spaceflight (completed by Yuri Gagarin in his Vostok spacecraft on April 12, 1961) by adding several features, including full Augmented Reality support. Much like the SkyView app we reviewed in March, which allowed you to point your device at the sky and see the position of stars and constellations in real time, Star Walk now lets you do the same, and it works really well on the iPad 2. Among various bug fixes and enhancements to the control buttons, Star Walk 5.2 also includes a calendar of celestial events to plan your stargazing sessions within the app, as well as the possibility to search through all categories at once. We feel the introduction of Augmented Reality is, however, the biggest and most important addition in this update, and you should consider giving the app a try if you've heard of Star Walk in the past and just got an iPad 2. Exploring the universe is way more comfortable and elegant on the tablet's large screen.
Thanks to the camera built into every iPhone (and now iPad), we have seen clever implementations of augmented reality applications in the past. Augmented Reality software, also known as "AR apps", is meant to literally "augment" the real-life objects you see on the device's screen through the camera by adding digital information or other items. A navigation app like Where To?, for example, overlays virtual directions on the real-world view captured by the iPhone's camera. These experiments are very cool and show off the capabilities of iOS devices when combined with software that takes advantage of modern technologies.
SkyView, an iPhone app by Terminal Eleven, lets you explore the universe from your iPhone by simply pointing the camera at the sky. Using GPS coordinates, gyroscope rotation and 3D graphics, SkyView augments what you see by adding stars, planets, satellites, space stations and constellations on the screen.
Once you point the iPhone's camera at the sky and the app correctly fetches your location via GPS, you'll see a variety of sky objects popping up on screen: stars, the sun, satellites and so forth, depending on where you are in the world. If you tap on an item, a bar appears at the bottom of the screen revealing its name; tap again, and the bar expands to show the object's RA and Dec coordinates. Tap the arrow, and SkyView will let you flick through a detailed description of the object you've discovered. If you select the Sun, the Moon or another planet, you can also see the path it will follow over a 24-hour period. In the Settings, you can choose which 3D layers to display or hide (including dwarf planets) and adjust time and date based on your location.
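For the curious, the core calculation behind an app like SkyView is the classic spherical-astronomy conversion from an object's catalog coordinates (RA/Dec) to the altitude and azimuth an observer actually sees, which depends only on the observer's latitude and the local sidereal time. Here's a rough sketch of that math; this is not SkyView's actual code, and the function name and simplified inputs are our own.

```python
import math

def altaz(ra_hours, dec_deg, lat_deg, lst_hours):
    """Convert equatorial coordinates (RA/Dec) to horizontal
    altitude/azimuth for an observer at a given latitude and
    local sidereal time (a simplified, hypothetical sketch)."""
    # Hour angle: how far west of the meridian the object is (1 hour = 15 deg)
    ha = math.radians((lst_hours - ra_hours) * 15.0)
    dec = math.radians(dec_deg)
    lat = math.radians(lat_deg)

    # Altitude from the standard spherical triangle relation
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    sin_alt = max(-1.0, min(1.0, sin_alt))  # clamp against float round-off
    alt = math.asin(sin_alt)

    # Azimuth measured from north, clockwise
    cos_az = ((math.sin(dec) - sin_alt * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if math.sin(ha) > 0:  # object is west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)
```

An object on the celestial equator seen from the equator six sidereal hours after it crossed the meridian, for instance, comes out on the horizon due west (altitude 0°, azimuth 270°). A real app would layer this on top of the gyroscope and compass readings to decide where on screen each object belongs.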
At $0.99 in the App Store, SkyView is a neat experiment that makes great use of the iPhone's camera. Check it out here; a free version is available as well.
With two cameras on board, it's no surprise that developers are considering the possibility of photography and video apps for the tablet, despite a form factor that's not exactly meant for quick picture taking or video shooting. Still, as Apple proved at its March 2 event, there's plenty of room for iPad apps that use the camera without forcing you to hold the device upright, such as FaceTime or Photo Booth, two pieces of software that use the iPad's cameras to create unique (and funny) experiences. So while it's likely that we won't see a Camera+ for iPad in the near future, I believe several developers are playing around with the idea of alternative applications for the iPad that revolve around the use of the camera.
Self-portrait photography and video-mirroring apps sound pretty obvious right now; augmented-reality software for the iPad looks more promising in my opinion. As reported by All Things Digital, developer Total Immersion has announced AR Magic Mirror, an augmented-reality app that uses the camera to let users virtually try out different "hairstyles, outfits and accessories". It sounds very similar to the French iPhone app that lets you try on virtual glasses using face-tracking technology, and indeed Total Immersion will rely on a face-recognition technique to make this work.
With the portability, processing power and broad consumer adoption of the iPad 2, a simple face tracking app today could soon become an essential makeover tool with beauty products at retail. Likewise a traditional game of online chess could become a massive, multiplayer game in the real world playing alongside vivid 3D characters and environments.
AR Magic Mirror will be a free download in the App Store, and it will be among the first augmented reality apps for the iPad, a trend we expect to evolve and proliferate in the coming months. Check out the video below.
Localscope is well known for its location-based discovery prowess, helping you gather information about your current location through a combination of augmented reality, geo-meta-search and social-driven capabilities. Comparable to Google Places, Localscope enables users to find the nearest hotspots for coffee, movies and more. As of Localscope 1.1, you can not only find the hip new club in your city, but navigate to it as well thanks to Navigon MobileNavigator integration. Swipe across your destination, tap the map, and bring up MobileNavigator so you can quickly reach it. Localscope with MobileNavigator brings the functionality Android has delivered with Places and Navigation Beta to the iPhone: if you own both applications, the combo is a terrific way to traverse new landscapes without getting lost.
Where To? by Future Tap is one of the most popular apps ever released for the iPhone. In fact, I remember buying the app years ago on my original iPhone, and I still have it. Where To?, like many other alternatives that eventually came out in the App Store, gives you detailed information on nearby points of interest using the iPhone's built-in GPS and Google Maps. It features a popular and unique "click-wheel" interface, which Apple even used in patents to describe functionalities of iOS apps, and, overall, it's powered by a great design that's really a pleasure to look at.
A few minutes ago, Where To? received a major update that introduces a neat 3D Augmented Reality feature I'm playing around with as we speak. The Future Tap developers added this new option as a $0.99 in-app purchase, a no-brainer considering how well realized and perfectly consistent with the rest of the app it is. Once the app fetches your location and provides a list of POIs, you can switch to the usual map view, where you'll notice a new "eye" icon in the bottom toolbar that activates Augmented Reality. Unlike other AR software, Where To?'s 3D implementation stands out thanks to its use of OpenGL, which brings smooth animations to the reality layer captured by the iPhone's camera. When you point the iPhone around, the app creates a virtual grid that displays POIs as boxes, sized according to the distance between the user and the target. The developers explain:
Our solution: First, results are displayed in a real 3D world and their size and stacking order depends on the distance. Then, you can zoom in and out using the gestures already known from the map: Double-tap to zoom in, two-finger-tap to zoom out, pinch for fine grained zooming. Zooming of course affects the camera picture as well.
Where To? 3.5 also uses the gyroscope for much smoother AR animations and location info. In my initial tests, the system works exactly as the developers described it in their introductory blog post: you can zoom in and out with two fingers, and tap on a POI's box to get to the usual portrait view. The UI is gorgeous.
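To make the distance-based sizing idea concrete, here's a minimal sketch (our own illustration, not Future Tap's code) of the geometry an AR POI view has to solve: the great-circle distance and bearing from the user to a POI, the POI's horizontal position on screen given the device's compass heading and camera field of view, and a box size that shrinks with distance. The function name, field-of-view value and scaling constants are hypothetical.

```python
import math

def poi_screen_position(user_lat, user_lon, poi_lat, poi_lon,
                        heading_deg, fov_deg=60.0, screen_width=640):
    """Place a POI on an AR overlay: haversine distance, initial
    bearing, horizontal pixel offset relative to the device heading,
    and a distance-scaled box size (hypothetical sketch)."""
    R = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(user_lat), math.radians(poi_lat)
    dphi = math.radians(poi_lat - user_lat)
    dlon = math.radians(poi_lon - user_lon)

    # Haversine great-circle distance
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    distance = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing from user to POI (0 deg = north, clockwise)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    bearing = math.degrees(math.atan2(y, x)) % 360

    # Angle between camera direction and POI, mapped into pixels
    rel = (bearing - heading_deg + 180) % 360 - 180  # range -180..180
    visible = abs(rel) <= fov_deg / 2
    x_px = screen_width / 2 + rel / fov_deg * screen_width
    # Nearer POIs get bigger boxes; the scale factor is arbitrary here
    box_size = max(8.0, 4000.0 / max(distance, 1.0))
    return distance, bearing, visible, x_px, box_size
```

A POI roughly 111 meters due east of a user facing east lands in the center of the screen with a prominent box; the same POI a kilometer away would render ten times smaller, which is essentially the stacking-by-distance effect the developers describe.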
WordLens Founder On The Limitations Of AR Apps for iPhone
Interesting interview by Robert Scoble. Turns out on-the-fly video translations are a tricky thing to accomplish, the technology underlying the app is entirely custom and, hopefully, more European languages will be released in early 2011.
I've tested the app and it's not perfect, but it sure looks impressive. I can't wait for the app to gain more languages and, as the founder says in the interview, the ability to give "context" to the words recognized by the camera.