Posts tagged with "developer tools"

The Best Way to Take Screenshots on Apple Vision Pro

Taking good-looking screenshots on the Apple Vision Pro isn’t easy, but it’s not impossible either. I’ve already spent many hours taking screenshots on the device, and I thought I’d share my experience and some practical tips for getting the best screenshots possible.

Although I’ve only had the Apple Vision Pro for a week, I’ve already spent a lot of time thinking about and refining my screenshot workflow out of necessity. That’s because after I spent around three hours writing my first visionOS app review of CARROT Weather and Mercury Weather, I spent at least as much time trying to get the screenshots I wanted. If that had been a review of the iOS versions of those apps, the same number of screenshots would have taken less than a half hour. That’s a problem because I simply don’t have that much time to devote to screenshots.

Taking screenshots with the Apple Vision Pro is difficult because of the way the device works. Like other headsets, the Apple Vision Pro uses a technique called foveated rendering to reduce the computing power needed to display the headset’s images. In practical terms, this means that the only part of the device’s view that is in focus is where you’re looking. The focal point changes as your eyes move, so you don’t notice that part of the view is blurry. In fact, this is how the human eye works too, so as long as the eye tracking is good, which it is on the Apple Vision Pro, the experience is good as well.
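The core idea is simple to sketch: render detail is a function of distance from the gaze point. Here’s a toy illustration in Python (the falloff curve and the numbers are invented for illustration only, not Apple’s actual implementation):

```python
import math

def shading_density(px, py, gaze_x, gaze_y, falloff=0.004):
    """Toy foveation curve: full detail at the gaze point,
    smoothly less detail as distance (eccentricity) grows."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    return 1.0 / (1.0 + falloff * dist)

# Detail is highest exactly where the user is looking...
center = shading_density(960, 540, gaze_x=960, gaze_y=540)
# ...and drops off toward the edge of the frame.
edge = shading_density(0, 0, gaze_x=960, gaze_y=540)
print(center, edge)
```

A screenshot captures the frame exactly as it was rendered, which is why everything away from the gaze point comes out blurry.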

However, as well as foveated rendering works for using the Apple Vision Pro, it’s terrible for screenshots. You can take a quick screenshot by pressing the top button and Digital Crown, but you’ll immediately see that everything except where you were looking when you took the screen-grab is out of focus. That’s fine for sharing a quick image with a friend, but if you want something suitable for publishing, it’s not a good option.

Fortunately, Apple thought of this, and there’s a solution, but it involves using Xcode and another developer tool. Of course, using Xcode to take screenshots is a little like using Logic Pro to record voice memos, except there are plenty of simple apps for recording voice memos, whereas Xcode is currently your only choice for taking crisp screenshots on the Vision Pro. So until there’s another option, it pays to learn your way around these developer tools to get the highest quality screenshots as efficiently as possible.


The Apple Vision Pro Developer Strap

Jeff Benjamin writing for 9to5Mac has a comprehensive breakdown on what the Apple Vision Pro Developer Strap can and can’t do. One of the primary benefits for developers is capturing video. As Benjamin writes:

The Developer Strap also lets developers capture a direct video feed from Apple Vision Pro via a wired USB-C connection using Reality Composer Pro. File transfers of the captured feed occur via the direct USB-C connection. Users without the strap can still capture these feeds but via Wi-Fi only.

Benjamin also explains how to use the strap to access Recovery Mode:

You can also restore visionOS using Recovery Mode via the wired connection made possible by the Developer Strap. This includes downgrading from visionOS beta releases.

My experience is in line with Benjamin’s. The Developer Strap may make capturing short videos and screenshots easier, but it can’t do much else.

I will add, however, that I was contacted by a MacStories reader who tipped me off to one other thing the Developer Strap can do, which is act as a video source for QuickTime. This works a lot like capturing screenshots and video from an Apple TV via QuickTime, and the advantage is that you can capture more than the 60-second cap imposed by Reality Composer Pro. That’s great, except that the capture is foveated, meaning that the video recorded will be blurry everywhere except where you’re looking.


Apple Offers USB-C Enabled Vision Pro Strap to Registered Developers

Apple is offering a new Vision Pro accessory to registered developers: a head strap with a USB-C connector for $299. There aren’t a lot of details about the strap, which is designed to be connected to a Mac to accelerate development and testing for the Vision Pro, beyond this description, which sits behind a developer account login:


The Developer Strap is an optional accessory that provides a USB-C connection between Apple Vision Pro and Mac and is helpful for accelerating the development of graphics-intensive apps and games. The Developer Strap provides the same audio experience as the in-box Right Audio Strap, so developers can keep the Developer Strap attached for both development and testing.

Tech specs

  • USB-C data connection
  • Individually amplified dual driver audio pods
  • Compatible with Mac

Although we haven’t been able to confirm the capabilities of the Developer Strap, USB-C may allow developers to connect the Vision Pro to their network over Ethernet or access external storage, for example.

Why is a USB-C dongle $299? It’s expensive, but as the description makes clear, it incorporates the speaker found in Vision Pro’s right strap, which it replaces, explaining at least part of the cost.

Apple Publishes Updated Human Interface Guidelines for visionOS

In addition to releasing the visionOS SDK and developer tools today, Apple has updated its Human Interface Guidelines and published additional visionOS documentation for developers. The updated HIG begins with an overview of designing for the Apple Vision Pro, covering topics like Passthrough, Spatial Audio, Focus and Gestures, Ergonomics, and Accessibility, advising developers to:

Embrace the unique features of Apple Vision Pro. Take advantage of space, Spatial Audio, and immersion to bring life to your experiences, while integrating passthrough, focus, and gestures in ways that feel at home on the device.

If you’re interested in Apple’s design philosophy for the Vision Pro, the HIG is an excellent plain-English read. For developers who want to dive deeper into the details of building apps, Apple has also published a lot of additional documentation covering the nuts and bolts of building visionOS apps.


Apple Releases visionOS SDK and Developer Tools

Source: Apple.

Today, Apple announced the visionOS software development kit that will allow developers to start creating apps for the Apple Vision Pro. In addition to the SDK, an update to Xcode introduces Reality Composer Pro, which lets developers preview 3D models, animations, images, and sounds. There’s also a new visionOS simulator that can be used to test different room configurations and lighting for visionOS apps.

The developer labs that Apple announced at WWDC will open soon too:

Next month, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo to provide developers with hands-on experience to test their apps on Apple Vision Pro hardware and get support from Apple engineers.

Developers can also apply for an Apple Vision Pro developer kit, so they can test apps on the device itself. Anyone who has used Unity’s tools to build 3D apps and games will be able to port them to visionOS next month too.

Source: Apple.

Among the developers who have tried the visionOS SDK is Algoriddim, whose CEO, Karim Morsey, said:

The djay app on Apple Vision Pro puts a fully featured DJ system right at a user’s fingertips. With a reimagined spatial interface, anyone can mix their favorite music and apply real-time effects using just their eyes and hands. Whether for a beginner or a seasoned professional, djay on Vision Pro transforms the user’s surroundings with stunning environments that automatically react to their mix, enabling them to experience and interact with music in ways never before possible.

It’s great to see Apple getting these tools into the hands of developers so soon after WWDC. Building apps for Apple Vision Pro uses many of the same technologies and tools developers are already familiar with, like Xcode, SwiftUI, RealityKit, ARKit, and TestFlight. However, with excitement for Apple Vision Pro still high, now is the perfect time to get the new visionOS SDK and tools in developers’ hands as they plan for the device’s release next year.

Apple Releases Developer Tools to Facilitate Porting Videogames to the Mac

Source: Apple.

During the WWDC keynote, Apple showed off Game Mode for the Mac, which gives a game priority over a Mac’s CPU and GPU resources. Apple has also reduced the latency of AirPods used while gaming and doubled the sampling rate for connected Bluetooth controllers.

Game Mode promises to improve the overall experience of gaming on the Mac, but it’s not all that was announced at WWDC. Apple has also announced a series of developer tools designed to make it easier to port games to the Mac from other platforms.

Among those tools is a Game Porting Toolkit, which Tom Warren of The Verge says is:

 similar to the work Valve has done with Proton and the Steam Deck. It’s powered by source code from CrossOver, a Wine-based solution for running Windows games on macOS. Apple’s tool will instantly translate Windows games to run on macOS, allowing developers to launch an unmodified version of a Windows game on a Mac and see how well it runs before fully porting a game.

The Game Porting Toolkit is meant as a way for developers to quickly see how much work needs to be done to port their games to the Mac, but according to Warren, that hasn’t stopped gamers with developer accounts from downloading the tool and taking everything from Cyberpunk 2077 to Diablo IV for a spin on the Mac.

Along with a tool to convert shaders and graphics code to Apple’s Metal framework, the Game Porting Toolkit and other announcements at WWDC mark a concerted effort by Apple to expand the catalog of games available to Mac users. Whether game developers will take advantage of these tools and bring their games to the Mac remains to be seen, but recent announcements that Stray and Hideo Kojima’s Death Stranding Director’s Cut are coming to the Mac are both good signs.


TestFlight’s Inability to Handle Large Beta Collections Needs to Be Fixed

I’ve been thinking about app scalability a lot lately – most recently in the context of TestFlight, which I find incredibly frustrating to use at best and, on the Mac, often unusable. This isn’t a new problem for me, but I haven’t mentioned it much in the past because I’ve suspected that my experience is colored by the fact that I’m an outlier. But, outlier or not, the app deserves more attention than it’s been given.


WorldWideWeb: A Simple Web Server Utility for Mac, iPad, and iPhone

Early today, The Iconfactory released their latest app, a simple web server utility called WorldWideWeb. Solidly developer-focused in scope, the app serves files from a local directory to an automatically generated URL, making those files available to any device on your local network. While there are sure to be more inventive use cases for such a utility, its general purpose is testing simple websites built on the Web’s greatest primitive: HTML.

WorldWideWeb’s killer feature is simplicity. The app’s entire main interface consists of two tiny sections: in the first you select a folder, and in the second you start or stop the web server. When the server is activated, a URL is generated. The app uses Bonjour to make the address available to any device on the same Wi-Fi network as the host. Just copy and paste the URL or press the ‘Open in Browser’ button to view the website natively in a web browser.
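The basic mechanic – pick a folder, get a URL – is something you can approximate in a few lines with Python’s standard library. This is a rough sketch of the concept, not The Iconfactory’s implementation, and it omits the Bonjour advertising that makes WorldWideWeb’s URLs discoverable by other devices:

```python
import functools
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# A stand-in for the folder you'd pick in WorldWideWeb.
root = pathlib.Path(tempfile.mkdtemp())
(root / "index.html").write_text("<h1>Hello from a local server</h1>")

# Serve that folder over HTTP; port 0 asks the OS for any free port.
handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=str(root))
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The generated URL, analogous to the one WorldWideWeb hands you.
url = f"http://127.0.0.1:{server.server_address[1]}/index.html"
page = urllib.request.urlopen(url).read().decode()
server.shutdown()
print(page)
```

What the app adds over a sketch like this is exactly the polish the next paragraph describes: a two-control interface and network-wide discovery of the address.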


Logger Is the Missing Console for Shortcuts Power Users

Logger for Shortcuts.

Indie developer Alex Hay has long pushed the boundaries of what third-party developers can build with the SiriKit framework and Shortcuts integrations on Apple platforms.

In late 2019, his Toolbox Pro app redefined what it means to complement Apple’s Shortcuts app with additional actions, creating an entirely new sub-genre of headless utilities designed to provide additional actions with configurable parameters. Recently, Hay introduced Nautomate, another utility that provides users with Shortcuts actions to integrate with the Notion API without having to write a single line of code. And today, Hay is launching Logger, another Shortcuts-compatible app that is similar to his previous ones, but with a twist: rather than adding actions for external services or apps such as Apple Music and Notion, Logger offers actions to create the troubleshooting console that has always been missing from Shortcuts.
