Posts tagged with "Vision Pro"

The Best Way to Take Screenshots on Apple Vision Pro

Taking good-looking screenshots on the Apple Vision Pro isn’t easy, but it’s not impossible either. I’ve already spent many hours taking screenshots on the device, and I thought I’d share my experience and some practical tips for getting the best screenshots possible.

Although I’ve only had the Apple Vision Pro for a week, I’ve already spent a lot of time thinking about and refining my screenshot workflow out of necessity. That’s because after I spent around three hours writing my first visionOS app review of CARROT Weather and Mercury Weather, I spent at least as much time trying to get the screenshots I wanted. If that had been a review of the iOS versions of those apps, the same number of screenshots would have taken less than a half hour. That’s a problem because I simply don’t have that much time to devote to screenshots.

Taking screenshots with the Apple Vision Pro is difficult because of the way the device works. Like other headsets, the Apple Vision Pro uses something called foveated rendering, a technique that reduces the computing power needed to display the headset’s images. In practical terms, the only part of the device’s view that is in focus is where you’re looking. The focal point changes as your eyes move, so you don’t notice that the rest of the view is blurry. In fact, this is how the human eye works, so as long as the eye tracking is good, which it is on the Apple Vision Pro, the experience is good too.
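To make the idea concrete, here’s a minimal, hypothetical sketch in Swift of how a foveated renderer might budget detail across the view. The function name, angular bands, and scale factors are all invented for illustration; Apple’s actual pipeline is far more sophisticated.

```swift
import Foundation
import simd

/// A toy model of foveated rendering: render quality falls off with
/// angular distance ("eccentricity") from the user's gaze. The bands
/// and scale factors below are illustrative, not Apple's.
func renderScale(pixelDirection: SIMD3<Float>, gazeDirection: SIMD3<Float>) -> Float {
    // Angle between the gaze ray and the ray through this region of the view.
    let cosAngle = dot(normalize(pixelDirection), normalize(gazeDirection))
    let eccentricity = acos(max(-1, min(1, cosAngle))) * 180 / Float.pi // degrees

    switch eccentricity {
    case ..<5:  return 1.0  // foveal region: full resolution
    case ..<20: return 0.5  // near periphery: half resolution
    default:    return 0.25 // far periphery: quarter resolution
    }
}

// A region roughly 30 degrees from the gaze point renders at quarter
// resolution, which is why a button-combo screenshot looks sharp only
// where you happened to be looking when you took it.
let gaze = SIMD3<Float>(0, 0, -1)
let periphery = SIMD3<Float>(0.6, 0, -1)
print(renderScale(pixelDirection: periphery, gazeDirection: gaze)) // 0.25
```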

However, as well as foveated rendering works for using the Apple Vision Pro, it’s terrible for screenshots. You can take a quick screenshot by pressing the top button and Digital Crown, but you’ll immediately see that everything except where you were looking when you took the screen-grab is out of focus. That’s fine for sharing a quick image with a friend, but if you want something suitable for publishing, it’s not a good option.

Fortunately, Apple thought of this, and there’s a solution, but it involves using Xcode and another developer tool. Of course, using Xcode to take screenshots is a little like using Logic Pro to record voice memos, except there are plenty of simple apps for recording voice memos, whereas Xcode is currently your only choice for taking crisp screenshots on the Vision Pro. So until there’s another option, it pays to learn your way around these developer tools to get the highest quality screenshots as efficiently as possible.

Read more


The Apple Vision Pro Developer Strap

Jeff Benjamin, writing for 9to5Mac, has a comprehensive breakdown of what the Apple Vision Pro Developer Strap can and can’t do. One of the strap’s primary benefits for developers is capturing video. As Benjamin writes:

The Developer Strap also lets developers capture a direct video feed from Apple Vision Pro via a wired USB-C connection using Reality Composer Pro. File transfers of the captured feed occur via the direct USB-C connection. Users without the strap can still capture these feeds but via Wi-Fi only.

Benjamin also explains how to use the strap to access Recovery Mode:

You can also restore visionOS using Recovery Mode via the wired connection made possible by the Developer Strap. This includes downgrading from visionOS beta releases.

My experience is in line with Benjamin’s. The Developer Strap may make capturing short videos and screenshots easier, but it can’t do much else.

I will add, however, that a MacStories reader tipped me off to one other thing the Developer Strap can do, which is to act as a video source for QuickTime. This works a lot like capturing screenshots and video from an Apple TV via QuickTime, and the advantage is that you can record beyond the 60-second cap imposed by Reality Composer Pro. That’s great, except that the capture is foveated, meaning the video will be blurry everywhere except where you’re looking.

Permalink

Vision Pro App Spotlight: Juno

Leading up to the release of Apple Vision Pro, there was as much, or perhaps more, talk about which apps wouldn’t be on the platform on day one as there was about which would. To be sure, there are some very notable holes in the Vision Pro’s catalog, and one of the biggest is YouTube. However, as we’ll see over and over with the Vision Pro apps we’ll be covering at MacStories, the gap isn’t nearly as bad as you’d think, thanks to developers like Christian Selig, the former maker of the Reddit client Apollo, and his brand-new app, Juno.

Read more


Vision Pro App Spotlight: CARROT Weather and Mercury Weather

We’re going to be covering a lot of visionOS apps over the coming weeks, so I thought a fitting place to start would be with two of our favorite weather apps from other Apple platforms: CARROT Weather and Mercury Weather. Both apps are past MacStories Selects award winners: CARROT Weather won the Best Watch App award in 2020 and the Readers’ Choice award in 2022, and we named Mercury Weather the Best Design winner of 2023. So, I expect a lot of readers are already familiar with both apps. However, if you’re not, be sure to check out these past stories for more on what makes them two of our favorite weather apps on the iPhone, iPad, Mac, and Apple Watch:

So today, my focus is solely on the visionOS versions of both apps, which fill the gap left by Apple’s curious omission of its own Weather app from Vision Pro.

Read more


Apple Publishes Vision Pro User Guide and Support Documentation

Late last week, Apple published a comprehensive Vision Pro user guide and accompanying support documentation. I’ve spent some time browsing the user guide, and it’s full of excellent tips to help people get started with the new device. Each section of the guide links to related support documents, which go into more depth about the topics covered. I’ve pinned the page in Safari as I continue to explore everything the Apple Vision Pro can do.

Another page worth bookmarking is a story by Joe Rossignol and Aaron Perris of MacRumors, who compiled a long list of what they describe as ‘nearly all’ of the support documents listed in the Vision Pro user guide. You’ll come across links to these documents in the user guide itself, but if you want to go deeper on a topic and bypass the user guide, MacRumors’ story is a great place to start.


iFixit Disassembles Apple Vision Pro

Source: [iFixit](https://www.ifixit.com/News/90137/vision-pro-teardown-why-those-fake-eyes-look-so-weird).


As with myriad other Apple gadgets, the folks at iFixit have pulled apart a Vision Pro to see what makes it work. There’s a lot of tech crammed into a relatively small space, which made the Vision Pro difficult to take apart. But after heating and prying parts apart and removing brackets, screws, and cables, iFixit reached the inner layers to show off all of the Vision Pro’s components, concluding that:

The Vision Pro is insanely ambitious. Yes, it’s heavy, and the glass is fragile, and that tethered battery might get annoying. But Apple has managed to pack the power of a Mac, plus the performance of a new dedicated AR chip, into a computer that you can wear on your face.

Repairability-wise, it’s not great, but on the plus side, some of the connections are quite delightful. You should have seen our teardown team jump up when they realized that the side arms could be popped out using the SIM-removal tool, for example, and the magnetic cushions are yet more user-friendly.

To see the Vision Pro pulled apart step by step, don’t miss the companion video on YouTube:

Permalink

Wallpaper Interviews Apple’s Alan Dye and Richard Howarth

Today, Wallpaper published an interview with Alan Dye, Apple’s Vice President of Human Interface Design, and Richard Howarth, Vice President of Industrial Design. It’s a fantastic read with some great images, including an exploded view of the Vision Pro’s components.

Something I noticed as soon as I unboxed the Apple Vision Pro was how approachable it was. The setup process was easy, well-paced, and felt natural. That carries through to the hardware itself, too, which Dye explained to Wallpaper:

We wanted people around you to also feel comfortable with you wearing it, and for you to feel comfortable wearing it around other people. That’s why we spent years designing a set of very natural, comfortable gestures that you can use without waving your hands in the air. That’s also why we developed EyeSight, because we knew more than anything, if we were going to cover your eyes, that takes away much of what is possible when you connect with people. Getting that right was at the core of the concept of the product because we wanted people to retain those connections in their actual world.

My very early impression is that Apple’s design team accomplished its goal. Howarth puts a slightly different spin on the same message:

There’s a hardness and precision to the front of the product that is completely technical and feels like it’s been sent from the future, but then everything else that connects the product to you is soft and really approachable, so you feel cushioned and there’s not a barrier to putting it on or taking it off. And in fact, it should be a pleasure.

Nobody is going to mistake the Vision Pro for something it’s not. Still, the care that has been taken in its design goes a long way toward making a device that is completely foreign to many people into one that isn’t intimidating. That’s uniquely Apple, and it’s why I’m optimistic about the Vision Pro’s long-term prospects.

Permalink

Apple Offers USB-C Enabled Vision Pro Strap to Registered Developers

Apple is offering a new Vision Pro accessory to registered developers: a head strap with a USB-C connector for $299. There aren’t many details about the strap, which is designed to connect to a Mac to accelerate Vision Pro development and testing, beyond this description, which sits behind a developer account login:

Overview

The Developer Strap is an optional accessory that provides a USB-C connection between Apple Vision Pro and Mac and is helpful for accelerating the development of graphics-intensive apps and games. The Developer Strap provides the same audio experience as the in-box Right Audio Strap, so developers can keep the Developer Strap attached for both development and testing.

Tech specs

  • USB-C data connection
  • Individually amplified dual driver audio pods
  • Compatible with Mac

Although we haven’t been able to confirm the capabilities of the Developer Strap, USB-C may allow developers to connect the Vision Pro to their network over Ethernet or access external storage, for example.

Why is a USB-C dongle $299? It’s expensive, but as the description makes clear, it incorporates the speaker found in Vision Pro’s right strap, which it replaces, explaining at least part of the cost.


On Vision Pro’s Spatial Computing

There’s no tech commentator better equipped to talk about the history of spatial interfaces in Apple operating systems than John Siracusa, and I enjoyed his latest, thought-provoking column on where visionOS and the Vision Pro’s gesture system fit in the spatial computing world:

Where Vision Pro may stumble is in its interface to the deep, spatial world it provides. We all know how to reach out and “directly manipulate” objects in the real world, but that’s not what Vision Pro asks us to do. Instead, Vision Pro requires us to first look at the thing we want to manipulate, and then perform an “indirect” gesture with our hands to operate on it.

Is this look-then-gesture interaction any different than using a mouse to “indirectly” manipulate a pointer? Does it leverage our innate spatial abilities to the same extent? Time will tell. But I feel comfortable saying that, in some ways, this kind of Vision Pro interaction is less “direct” than the iPhone’s touch interface, where we see a thing on a screen and then literally place our fingers on it. Will there be any interaction on the Vision Pro that’s as intuitive, efficient, and satisfying as flick-scrolling on an iPhone screen? It’s a high bar to clear, that’s for sure.

In yesterday’s review on The Verge, Nilay Patel shared a similar idea: it’s a strange feeling to use a computer that requires you to look at what you want to control at all times. I don’t know what to think about this yet since I don’t have a Vision Pro, but I’m curious to see how this interaction method scales as we start using the new platform daily. It’s quite fitting, however, that visionOS is based on the one Apple platform that supports both kinds of manipulation: pointer and touch.

Permalink