Posts tagged with "developer tools"

NVIDIA Introduces Remote Scene Rendering for Vision Pro Development

NVIDIA is in the midst of its 2024 GTC AI conference, and among the many posts published by the company yesterday was a bit of news about the Apple Vision Pro:

Announced today at NVIDIA GTC, a new software framework built on Omniverse Cloud APIs, or application programming interfaces, lets developers easily send their Universal Scene Description (OpenUSD) industrial scenes from their content creation applications to the NVIDIA Graphics Delivery Network (GDN), a global network of graphics-ready data centers that can stream advanced 3D experiences to Apple Vision Pro.

That’s a bit of an NVIDIA word salad, but what they’re saying is that developers will be able to take immersive scenes built using OpenUSD, an open standard for creating 3D scenes, render them remotely, and deliver them to the Apple Vision Pro over Wi-Fi.

What caught my eye about this announcement is the remote rendering and Wi-Fi delivery part. NVIDIA has been using its data centers to deliver high-resolution gaming via its GeForce NOW streaming service. I’ve tried it with the Vision Pro, and it works really well.

NVIDIA says:

The workflow also introduces hybrid rendering, a groundbreaking technique that combines local and remote rendering on the device. Users can render fully interactive experiences in a single application from Apple’s native SwiftUI and Reality Kit with the Omniverse RTX Renderer streaming from GDN.

That means visionOS developers will be able to offload the rendering of an immersive environment to NVIDIA’s servers while adding to the scene locally with Apple’s SwiftUI and RealityKit frameworks, which Apple and NVIDIA expect will create new opportunities for customers:

“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from NVIDIA accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, vice president of the Vision Products Group at Apple. “Spatial computing will redefine how designers and developers build captivating digital content, driving a new era of creativity and engagement.”

“Apple Vision Pro is the first untethered device which allows for enterprise customers to realize their work without compromise,” said Rev Lebaredian, vice president of simulation at NVIDIA. “We look forward to our customers having access to these amazing tools.”

The press release frames this as a technology for enterprise users, but given NVIDIA’s importance to the gaming industry, I wouldn’t be surprised to see the new frameworks employed there too. Also notable is the quote from Apple’s Mike Rockwell, given the two companies’ historically chilly relationship.
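
If you’re wondering what that hybrid approach might look like from a developer’s perspective, here’s a minimal sketch of a visionOS view that mixes locally rendered RealityKit content with a remotely rendered scene. The RemoteScene type and its endpoint are hypothetical stand-ins for whatever interface NVIDIA’s framework ends up exposing; only the SwiftUI and RealityKit calls are Apple’s real APIs.

```swift
import SwiftUI
import RealityKit

// Hypothetical stand-in for NVIDIA's streaming interface; not a real API.
// A real implementation would negotiate an Omniverse/GDN stream.
enum RemoteScene {
    static func connect(to endpoint: URL) async throws -> Entity {
        Entity() // placeholder so the sketch compiles
    }
}

struct HybridSceneView: View {
    var body: some View {
        RealityView { content in
            // Locally rendered content via RealityKit.
            let marker = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
            )
            marker.position = [0, 1.2, -0.5]
            content.add(marker)

            // Remotely rendered scene streamed from NVIDIA's data centers
            // (hypothetical endpoint; the real call would come from NVIDIA's SDK).
            let endpoint = URL(string: "https://gdn.example.com/industrial-scene")!
            if let remoteRoot = try? await RemoteScene.connect(to: endpoint) {
                content.add(remoteRoot)
            }
        }
        .overlay(alignment: .bottom) {
            // Native SwiftUI layered over the mixed scene.
            Text("Industrial scene streamed from GDN")
                .padding()
                .glassBackgroundEffect()
        }
    }
}
```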

Permalink

FinanceKit Opens Real-Time Apple Card, Apple Cash, and Apple Savings Transaction Data to Third-Party Apps

Ivan Mehta, writing for TechCrunch:

Apple’s iOS 17.4 update is primarily about adapting iOS to EU’s Digital Market Act Regulation. But the company has also released a new API called FinanceKit that lets developers fetch transactions and balance information from Apple Card, Apple Cash, and Savings with Apple.

If you use an Apple Card and a budgeting and financial tracking app, you’ll know why this is a big deal. I’ve been tracking my expenses with Copilot for over a year now, and I was pleased to see in Mehta’s story that Copilot, YNAB, and Monarch have teamed up with Apple to be the first third-party apps to use FinanceKit.

Before FinanceKit, I could only track my Apple Card expenses by importing a CSV file of my transactions once a month, when a new statement appeared in the Wallet app. Not only was that laborious, but it defeated the purpose of an app like Copilot, which otherwise lets you see where you stand with your budget in real time. The process was such a bad experience that I used my Apple Card a lot less than I would have otherwise. Now, those Apple Card transactions will be recorded in Copilot, YNAB, and Monarch as they’re made, just like any other credit card.
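
To give a sense of what this unlocks for developers, here’s a rough sketch of fetching recent transactions with FinanceKit. I haven’t verified every signature against the shipping framework, so treat the names below (FinanceStore.shared, requestAuthorization, TransactionQuery) as my understanding of the iOS 17.4 API rather than a definitive reference.

```swift
import FinanceKit

// A rough sketch of pulling recent Apple Card/Apple Cash/Savings transactions.
// Names and signatures reflect my understanding of FinanceKit and may differ
// slightly from the shipping API.
func fetchRecentTransactions() async throws -> [FinanceKit.Transaction] {
    let store = FinanceStore.shared

    // The user must explicitly grant access before any financial data is returned.
    let status = try await store.requestAuthorization()
    guard status == .authorized else { return [] }

    // Ask for the 50 most recent transactions, newest first.
    let query = TransactionQuery(
        sortDescriptors: [SortDescriptor(\.transactionDate, order: .reverse)],
        limit: 50
    )
    return try await store.transactions(query: query)
}
```

The framework also exposes a longer-running history query, which is presumably how apps like Copilot can record transactions as they’re made rather than polling for them.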

Permalink

The Best Way to Take Screenshots on Apple Vision Pro

Taking good-looking screenshots on the Apple Vision Pro isn’t easy, but it’s not impossible either. I’ve already spent many hours taking screenshots on the device, and I thought I’d share my experience and some practical tips for getting the best screenshots possible.

Although I’ve only had the Apple Vision Pro for a week, I’ve already spent a lot of time thinking about and refining my screenshot workflow out of necessity. That’s because after I spent around three hours writing my first visionOS app review of CARROT Weather and Mercury Weather, I spent at least as much time trying to get the screenshots I wanted. If that had been a review of the iOS versions of those apps, the same number of screenshots would have taken less than a half hour. That’s a problem because I simply don’t have that much time to devote to screenshots.

Taking screenshots with the Apple Vision Pro is difficult because of the way the device works. Like other headsets, the Apple Vision Pro uses something called foveated rendering, a technique that reduces the computing power needed to display the headset’s images. In practical terms, it means that the only part of the device’s view that is in focus is where you’re looking. The focal point changes as your eyes move, so you don’t notice that the rest of the view is blurry. That’s also how the human eye works, so as long as the eye tracking is good, which it is on the Apple Vision Pro, the experience is good too.
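
As a toy illustration of the concept (this is not Apple’s actual pipeline, and the numbers are invented), you can think of foveated rendering as a function that picks a shading resolution based on how far a region of the display is from the gaze point:

```swift
// A toy model of foveated rendering: regions far from the gaze point are
// shaded at a lower resolution. The angle thresholds and scale factors here
// are invented for illustration; Apple's actual falloff is not public.
func renderScale(forDegreesFromGaze angle: Float) -> Float {
    switch angle {
    case ..<5.0:  return 1.0    // foveal region: full resolution
    case ..<15.0: return 0.5    // near periphery: half resolution
    case ..<30.0: return 0.25   // mid periphery: quarter resolution
    default:      return 0.125  // far periphery: coarsest shading
    }
}

// A tile 20 degrees from where the user is looking would be shaded at a
// quarter of full resolution, which is why a raw capture looks sharp only
// where your eyes happened to be pointed.
let peripheralScale = renderScale(forDegreesFromGaze: 20)  // 0.25
```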

However, as well as foveated rendering works when you’re using the Apple Vision Pro, it’s terrible for screenshots. You can take a quick screenshot by pressing the top button and the Digital Crown, but you’ll immediately see that everything except where you were looking when you took the screen-grab is out of focus. That’s fine for sharing a quick image with a friend, but if you want something suitable for publishing, it’s not a good option.

Fortunately, Apple thought of this, and there’s a solution, but it involves using Xcode and another developer tool. Of course, using Xcode to take screenshots is a little like using Logic Pro to record voice memos, except there are plenty of simple apps for recording voice memos, whereas Xcode is currently your only choice for taking crisp screenshots on the Vision Pro. So until there’s another option, it pays to learn your way around these developer tools to get the highest quality screenshots as efficiently as possible.

Read more


The Apple Vision Pro Developer Strap

Jeff Benjamin, writing for 9to5Mac, has a comprehensive breakdown of what the Apple Vision Pro Developer Strap can and can’t do. One of the primary benefits for developers is capturing video. As Benjamin writes:

The Developer Strap also lets developers capture a direct video feed from Apple Vision Pro via a wired USB-C connection using Reality Composer Pro. File transfers of the captured feed occur via the direct USB-C connection. Users without the strap can still capture these feeds but via Wi-Fi only.

Benjamin also explains how to use the strap to access Recovery Mode:

You can also restore visionOS using Recovery Mode via the wired connection made possible by the Developer Strap. This includes downgrading from visionOS beta releases.

My experience is in line with Benjamin’s. The Developer Strap may make capturing short videos and screenshots easier, but it can’t do much else.

I will add, however, that I was contacted by a MacStories reader who tipped me off to one other thing the Developer Strap can do, which is act as a video source for QuickTime. This works a lot like capturing screenshots and video from an Apple TV via QuickTime, and the advantage is that you can capture more than the 60-second cap imposed by Reality Composer Pro. That’s great, except that the capture is foveated, meaning that the video recorded will be blurry everywhere except where you’re looking.

Permalink

Apple Offers USB-C Enabled Vision Pro Strap to Registered Developers

Apple is offering a new Vision Pro accessory to registered developers: a head strap with a USB-C connector for $299. There aren’t a lot of details about the strap, which is designed to connect to a Mac to accelerate Vision Pro development and testing, other than this description, which sits behind a developer account login:

Overview

The Developer Strap is an optional accessory that provides a USB-C connection between Apple Vision Pro and Mac and is helpful for accelerating the development of graphics-intensive apps and games. The Developer Strap provides the same audio experience as the in-box Right Audio Strap, so developers can keep the Developer Strap attached for both development and testing.

Tech specs

  • USB-C data connection
  • Individually amplified dual driver audio pods
  • Compatible with Mac

Although we haven’t been able to confirm the capabilities of the Developer Strap, USB-C may allow developers to connect the Vision Pro to their network over Ethernet or access external storage, for example.

Why is a USB-C dongle $299? It’s expensive, but as the description makes clear, it incorporates the speaker found in Vision Pro’s right strap, which it replaces, explaining at least part of the cost.


Apple Publishes Updated Human Interface Guidelines for visionOS

In addition to releasing the visionOS SDK and developer tools today, Apple has updated its Human Interface Guidelines and published additional visionOS documentation for developers. The updated HIG begins with an overview of designing for the Apple Vision Pro, covering topics like Passthrough, Spatial Audio, Focus and Gestures, Ergonomics, and Accessibility, advising developers to:

Embrace the unique features of Apple Vision Pro. Take advantage of space, Spatial Audio, and immersion to bring life to your experiences, while integrating passthrough, focus, and gestures in ways that feel at home on the device.

If you’re interested in Apple’s design philosophy for the Vision Pro, the HIG is an excellent plain-English read. For developers who want to dive deeper into the details of building apps, Apple has also published a lot of additional documentation covering the nuts and bolts of building visionOS apps.

Permalink

Apple Releases visionOS SDK and Developer Tools

Source: Apple.

Today, Apple announced the visionOS software development kit that will allow developers to start creating apps for the Apple Vision Pro. In addition to the SDK, an update to Xcode introduces Reality Composer Pro, which lets developers preview 3D models, animations, images, and sounds. There’s also a new visionOS simulator that can be used to test different room configurations and lighting for visionOS apps.

The developer labs that Apple announced at WWDC will open soon too:

Next month, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo to provide developers with hands-on experience to test their apps on Apple Vision Pro hardware and get support from Apple engineers.

Developers can also apply for an Apple Vision Pro developer kit, so they can test apps on the device itself. Anyone who has used Unity’s tools to build 3D apps and games will be able to port them to visionOS next month too.

Source: Apple.

Among the developers who have tried the visionOS SDK is Algoriddim, whose CEO, Karim Morsy, said:

The djay app on Apple Vision Pro puts a fully featured DJ system right at a user’s fingertips. With a reimagined spatial interface, anyone can mix their favorite music and apply real-time effects using just their eyes and hands. Whether for a beginner or a seasoned professional, djay on Vision Pro transforms the user’s surroundings with stunning environments that automatically react to their mix, enabling them to experience and interact with music in ways never before possible.

It’s great to see Apple getting these tools into the hands of developers so soon after WWDC. Building apps for Apple Vision Pro uses many of the same technologies and tools developers are already familiar with, like Xcode, SwiftUI, RealityKit, ARKit, and TestFlight. With excitement for Apple Vision Pro still high, now is the perfect time for developers to dig into the visionOS SDK as they plan for the device’s release next year.
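
To illustrate how familiar those building blocks are, here’s a minimal sketch of what a visionOS app can look like: a standard SwiftUI window plus an immersive space rendered with RealityKit. The scene identifier and view names are mine, not from Apple’s sample code.

```swift
import SwiftUI
import RealityKit

@main
struct DemoVisionApp: App {
    var body: some Scene {
        // A conventional 2D window, built with the same SwiftUI used on iOS and macOS.
        WindowGroup {
            ContentView()
        }

        // An immersive space the user can step into, rendered with RealityKit.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                let box = ModelEntity(mesh: .generateBox(size: 0.2))
                box.position = [0, 1.5, -1]
                content.add(box)
            }
        }
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Immersive Space") {
            Task { _ = await openImmersiveSpace(id: "immersive") }
        }
    }
}
```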


Apple Releases Developer Tools to Facilitate Porting Videogames to the Mac

Source: Apple.

During the WWDC keynote, Apple showed off Game Mode for the Mac, which gives a game priority over a Mac’s CPU and GPU resources. Apple has also reduced the latency of AirPods used while gaming and doubled the sampling rate for connected Bluetooth controllers.

Game Mode promises to improve the overall experience of gaming on the Mac, but it wasn’t the only gaming news at WWDC. Apple also introduced a series of developer tools designed to make it easier to port games to the Mac from other platforms.

Among those tools is a Game Porting Toolkit, which Tom Warren of The Verge says is:

 similar to the work Valve has done with Proton and the Steam Deck. It’s powered by source code from CrossOver, a Wine-based solution for running Windows games on macOS. Apple’s tool will instantly translate Windows games to run on macOS, allowing developers to launch an unmodified version of a Windows game on a Mac and see how well it runs before fully porting a game.

The Game Porting Toolkit is meant as a way for developers to quickly see how much work needs to be done to port their games to the Mac, but that hasn’t stopped gamers with developer accounts from downloading the tool and taking everything from Cyberpunk 2077 to Diablo IV for a spin on the Mac, according to Warren.

Along with a tool to convert shaders and graphics code to Apple’s Metal framework, the Game Porting Toolkit and other announcements at WWDC mark a concerted effort by Apple to expand the catalog of games available to Mac users. Whether game developers will take advantage of these tools and bring their games to the Mac remains to be seen, but recent announcements that Stray and Hideo Kojima’s Death Stranding Director’s Cut are coming to the Mac are both good signs.

Permalink

TestFlight’s Inability to Handle Large Beta Collections Needs to Be Fixed

I’ve been thinking about app scalability a lot lately – most recently in the context of TestFlight, which I find incredibly frustrating to use at best and, on the Mac, often unusable. This isn’t a new problem for me, but I haven’t mentioned it much in the past because I’ve suspected that my experience is colored by the fact that I’m an outlier. But, outlier or not, the app deserves more attention than it’s been given.

Read more