Accessories
visionOS 26 adds support for two new spatial accessories that give the system added capabilities and open up new categories of apps on the platform.
The first is support for spatial controllers, starting with the PlayStation VR2 Sense controllers. These offer six degrees of freedom, haptic feedback, and more precise position and rotation input than hand tracking for VR gaming. Apple has touted Resolution Games’ upcoming pickleball training app Pickle Pro as an example of what’s possible with spatial controllers in visionOS, and I imagine more spatial and VR games will become available on the platform due to this integration alone. Games can even require spatial controllers, meaning developers don’t have to treat hand tracking as a mandatory fallback if they don’t want to. You can also navigate the entire visionOS system with spatial controllers in place of hand tracking if you like.
The other new accessory supported in visionOS 26 is the Logitech Muse, a spatial stylus that can be used for drawing, annotating, sculpting, and more. It features tracking in six degrees of freedom and button input. Apple Pencil fans have asked for stylus support in visionOS since its launch, and it’s now part of the system.
I have not been able to try either of these accessories myself yet, but I can’t wait to get my hands on them when they’re available for purchase.7 In the meantime, the APIs are available for developers to include in their visionOS 26 apps in anticipation of a wider release. I can’t wait to see what new apps become possible on visionOS because of these integrations.
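For developers who want to start experimenting before the hardware ships, accessory discovery runs through the familiar GameController framework. The sketch below is a minimal, hedged example: the connect/disconnect notifications, `GCController.controllers()`, `vendorName`, and `productCategory` are long-standing GameController APIs, while anything about how a specific spatial controller identifies itself is an assumption on my part. I’ve deliberately left out the new visionOS 26 ARKit tracking calls, since that part of the API surface is brand new and I haven’t built against it.

```swift
import GameController

/// Minimal sketch of discovering accessories as they connect.
/// The notifications and GCController members used here are
/// long-standing GameController framework APIs; how a spatial
/// controller reports its category is an assumption, not a
/// documented value.
final class AccessoryWatcher {
    private var observers: [NSObjectProtocol] = []

    func start() {
        let center = NotificationCenter.default

        // Fires when any controller (which would include a spatial
        // controller such as the PS VR2 Sense controller) connects.
        observers.append(center.addObserver(
            forName: .GCControllerDidConnect,
            object: nil,
            queue: .main
        ) { note in
            guard let controller = note.object as? GCController else { return }
            print("Connected: \(controller.vendorName ?? "Unknown") " +
                  "(category: \(controller.productCategory))")
        })

        observers.append(center.addObserver(
            forName: .GCControllerDidDisconnect,
            object: nil,
            queue: .main
        ) { note in
            guard let controller = note.object as? GCController else { return }
            print("Disconnected: \(controller.vendorName ?? "Unknown")")
        })

        // Controllers that were already paired before start() was called.
        for controller in GCController.controllers() {
            print("Already connected: \(controller.vendorName ?? "Unknown")")
        }
    }

    func stop() {
        observers.forEach(NotificationCenter.default.removeObserver(_:))
        observers.removeAll()
    }
}
```

From there, an app would hand the accessory over to ARKit for six-degrees-of-freedom pose tracking; I’ve kept this sketch to the discovery step since the tracking APIs are too new for me to vouch for their exact shape.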
Another Vision Pro accessory that continues to gain more and more capabilities is the Developer Strap. The $300 contraption adds a USB port to the device to aid developers in connecting to their Macs, testing code, and capturing high-quality screen recordings. In visionOS 26, the Developer Strap is learning a couple of new tricks. First up is the ability to connect to wired networks over Ethernet, offering a more stable alternative to Wi-Fi that can come in handy for gaming, video production, and more.
The second capability isn’t a new one per se, but it will be new for most users. The Developer Strap has been capable of receiving video input over UVC (USB Video Class) for some time, but the feature was locked behind an enterprise entitlement most developers couldn’t access. Now, that limitation has been removed, paving the way for apps like UVSee and, soon, Portal to accept and display UVC inputs.
I tested this capability by wiring my iPad Pro to my Vision Pro using a USB-C to HDMI cable and a capture card, and it worked seamlessly. While I could always mirror my iPad wirelessly via AirPlay, this enabled me to use my Vision Pro as a secondary display alongside my iPad’s built-in screen. It’s important to note that the implementation only supports video, not audio, but this opens up the Vision Pro to be used to monitor cameras, play games from external devices, and more.
My hope for the Developer Strap’s next upgrade is the ability to transfer data to and from USB drives. It’s not something I do often, but it is a capability I need from time to time, and I think it’s something a spatial computer should be able to do. In the long run, I would love to see all of these features incorporated into the USB port on the battery pack of a future Vision Pro so all users can take advantage of them, not just developers. Regardless, it’s great to see the Developer Strap continue to add new capabilities, proving that I/O on the Vision Pro can be quite useful when enabled.
iPhone Integration
99% of the time when I’m wearing my Vision Pro, I also have my phone close by. There are certain apps on my phone that aren’t available on Vision Pro, and there are some tasks that I just prefer to do on my phone. Apple is making the two devices work more seamlessly together, and it makes for a much smoother experience when I do need to pick up my phone while wearing the Vision Pro.
Unlocking my iPhone while wearing Vision Pro is so much easier now.
Face ID doesn’t work when you’re wearing a headset that blocks part of your face, so you have to unlock your phone another way. In the past, I’ve either unlocked the phone with my Apple Watch or entered my passcode manually, but neither of those solutions is ideal. With visionOS 26 and iOS 26, you can now enable the Vision Pro as a trusted device that can unlock your iPhone for you when it’s nearby and unlocked with Optic ID. This option is available in the Face ID & Passcode settings on the iPhone, and since I enabled it, I’ve been able to simply pick up my phone and unlock it without any extra effort while wearing my Vision Pro.
Viewing the iPhone screen while wearing the Vision Pro can be tricky, too. Until now, visionOS windows and environments would block your iPhone’s screen from view. Now, visionOS will automatically recognize your phone’s screen and allow it to break through the UI, similar to the way your hands and physical keyboards do. (This also works with supported game controllers in visionOS 26.) This breakthrough is instant and feels natural, making it much easier to see your phone’s screen without needing to hold it in a particular way or close your visionOS windows.
Although the Vision Pro does not have a Phone app, it can now display and accept forwarded calls from your iPhone when nearby. This feature has been available on the iPad, Mac, and Apple Watch for some time now, and it is both convenient and dependable. Apple has spoiled me on the ability to answer (or reject) a phone call from any of my devices, so when my phone rings, I naturally try to address the call on whichever device I’m using. The Vision Pro can handle that task now, too. You can also now choose to record audio calls.
Finally, the iPhone and Vision Pro are working together to make switching between devices even better this year. You can now save your eye and hand data, vision prescription, and accessibility settings from your Vision Pro in iCloud, allowing you to later recall that data from your iPhone when setting up another Vision Pro. Having been through the eye- and hand-tracking setup process many times, I appreciate any opportunity to make sharing or setting up a new Vision Pro speedier.
Taken together, all of these seemingly minor connections between the iPhone and Vision Pro make using both devices in conjunction easier and smoother. They also firmly embed the Vision Pro within the Apple ecosystem, which is a win for those who are invested in and comfortable with the features Apple offers across its platforms. These enhancements have removed a bunch of small annoyances in my daily use of the Vision Pro, something I appreciate greatly as I use my device every day, and I imagine other Vision Pro users who also have iPhones will feel the same way when they see these integrations in action.
7. I need them for my work!