
Posts tagged with "visionOS"

Details About the Apple Vision Pro App Store Announced

A new App Store will launch alongside the Apple Vision Pro, offering apps built specifically for the headset as well as compatible iPhone and iPad apps.

According to an announcement on Apple’s developer website, the new store will debut in the visionOS betas this fall, allowing developers to check how their apps run on the Vision Pro via the visionOS simulator. According to the announcement:

By default, your iPad and/or iPhone apps will be published automatically on the App Store on Apple Vision Pro. Most frameworks available in iPadOS and iOS are also included in visionOS, which means nearly all iPad and iPhone apps can run on visionOS, unmodified. Customers will be able to use your apps on visionOS early next year when Apple Vision Pro becomes available.
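To make the "unmodified" point concrete, here's a minimal sketch (the app and view names are hypothetical) of the kind of plain SwiftUI app that should run as-is on visionOS, with an optional platform check for any Vision Pro-specific tweaks:

```swift
import SwiftUI

// A stock SwiftUI app like this needs no changes to run on visionOS;
// iPad and iPhone apps built this way are published automatically.
@main
struct ForecastApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Today's Forecast")
                .font(.title)
            // Optional: adapt only where the headset differs.
            #if os(visionOS)
            Text("Running in a window on Apple Vision Pro")
            #else
            Text("Running on iPhone or iPad")
            #endif
        }
        .padding()
    }
}
```

The `os(visionOS)` compilation condition, added alongside the SDK in Xcode 15, is what lets a shared codebase stay unmodified everywhere except the few spots that need headset-specific behavior.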

However, if updates are needed, developers will be alerted via App Store Connect. Apple also offers developers multiple ways to check how their apps work on the Apple Vision Pro:

To see your app in action, use the visionOS simulator in Xcode 15 beta. The simulator lets you interact with and easily test most of your app’s core functionality. To run and test your app on an Apple Vision Pro device, you can submit your app for a compatibility evaluation or sign up for a developer lab.

It will be interesting to see how the Apple Vision Pro App Store experience differs from other platforms.


Digital Trends Interviews Apple Execs and Developers about Apple Vision Pro

Digital Trends’ Alex Blake interviewed Susan Prescott, Apple’s vice president of worldwide developer relations, and Steve Sinclair, senior director of product marketing for Apple Vision Pro, along with several developers, about the ways the company is encouraging development for its upcoming headset.

According to Sinclair:

One of the things that we’ve observed is that when people first put on Vision Pro, they’re so blown away by the new spatial experiences that they see that they oftentimes forget that they’re actually wearing something.

“When we’re working with developers,” he continues, “we really try to stress the importance of creating new experiences that take advantage of all those capabilities.” That means building apps that “flex from windows to apps to being able to create fully immersive applications that transport you somewhere else. Because those are the things that customers and users are going to be excited about.”

Developer Ryan McLeod, the creator of the iOS and iPadOS game Blackbox, believes hands-on time with the Vision Pro hardware is key for developer adoption:

“It’s hard for me to imagine being inspired enough to build Blackbox for Vision Pro without having had ample hands-on time,” he notes. “I think it’s going to be critically important that as many developers as possible — especially smaller indie teams — get that opportunity and support for the platform.”

McLeod suggests that to get the Vision Pro in as many developers’ hands as possible:

Apple could help by “continuing to push beyond the traditional yearly WWDC cycle to continuously release more example apps, more API documentation, more sessions, and more opportunities to talk directly with engineers at Apple.”

Mark Gurman of Bloomberg posted on Twitter in early August that he’d heard the Vision Pro labs were “under-filled with a small number of developers.” As valuable as the labs seem to have been to those who have attended, so far they’ve only been held in Cupertino and a handful of large cities in a limited number of countries, and on relatively short notice. Hopefully, as the weeks pass, Apple can schedule labs further in advance, expand the number of locations, and offer more developer kits. It’s that sort of hands-on experience that will get developers excited, drive the adoption of visionOS, and ensure there are apps for customers when the Vision Pro ships next year.


Apple Publishes Reactions by Attendees of Its Worldwide Vision Pro Labs

Today, Apple published the reactions from some of the people who have attended the Vision Pro labs being held in cities around the world. Those quoted include David Smith, the developer of Widgetsmith; Ben Guerrette, the chief experience officer of Pixite, the maker of the video creation and editing app Spool; Chris Delbuck, the principal design technologist at Slack; and Michael Simmons, the CEO of Flexibits, the company whose developers make Fantastical. All four came away from the experience impressed with visionOS and the Vision Pro hardware.

According to Smith:

I’d been staring at this thing in the simulator for weeks and getting a general sense of how it works, but that was in a box. The first time you see your own app running for real, that’s when you get the audible gasp.

Guerrette struck a similar chord:

At first, we didn’t know if it would work in our app. But now we understand where to go. That kind of learning experience is incredibly valuable: It gives us the chance to say, ‘OK, now we understand what we’re working with, what the interaction is, and how we can make a stronger connection.’

Any time new hardware with its own operating system is announced, the challenge is to get it in front of as many developers as possible to ensure there are apps for the launch. Apple has tackled the problem with a combination of an app evaluation process, labs in big cities around the world, and developer kits shipped to developers who had to apply for access. From the reactions quoted in the story published on Apple’s Developer website, hands-on time with the Vision Pro sounds as though it was well worth the effort to travel to one of the labs.


Apple and Others Announce Alliance to Develop Open Source 3D Technology

Today, Apple, along with Pixar, Adobe, Autodesk, and NVIDIA, announced the formation of the Alliance for OpenUSD (AOUSD) to “promote the standardization, development, evolution, and growth of Pixar’s Universal Scene Description technology.”

OpenUSD is a 3D scene description technology invented at Pixar and open-sourced in 2016. The alliance, which will be housed in the Joint Development Foundation, an affiliate of the Linux Foundation, has invited other companies and organizations to join the effort to develop OpenUSD, “a high-performance 3D scene description technology that offers robust interoperability across tools, data, and workflows.”

Mike Rockwell, Apple’s vice president of the Vision Products Group, was quoted in the press release as saying:

OpenUSD will help accelerate the next generation of AR experiences, from artistic creation to content delivery, and produce an ever-widening array of spatial computing applications. Apple has been an active contributor to the development of USD, and it is an essential technology for the groundbreaking visionOS platform, as well as the new Reality Composer Pro developer tool. We look forward to fostering its growth into a broadly adopted standard.

With big companies coalescing around a single standard, one more point of potential friction should be removed as creators begin building 3D experiences for the Apple Vision Pro and other devices.


Apple Announces visionOS App Evaluation Process, Labs, and Developer Kits

As previously announced at WWDC, Apple today rolled out more details on three programs it has created to help developers build apps for the Vision Pro.

Developers will be able to submit their visionOS, iPadOS, and iOS apps for Vision Pro compatibility evaluations. Apple has included a checklist of steps developers should follow before submitting their apps. Once an app is ready, it can be submitted and will be tested on Vision Pro hardware, after which Apple says:

We’ll send you the evaluation results, along with any relevant screen captures or crash logs.

The second option is to apply to attend a lab in Cupertino, London, Munich, Shanghai, Singapore, or Tokyo. The labs are self-directed testing sites with Apple employees on hand to help with setup and troubleshooting.

Finally, developers can apply to borrow a Vision Pro developer kit from Apple. The hardware loans also come with the following:

  • Help setting up the device and onboarding.
  • Check-ins with Apple experts for UI design and development guidance, and help refining your app.
  • Two additional code-level support requests, so you can troubleshoot any issues with your code.

All three programs are open now, so if you’re working on an app for Apple Vision Pro, you can start submitting compatibility evaluation requests and apply for a lab or developer kit today.


Unity Launches PolySpatial Beta Program for visionOS Developers

What the Golf? running on visionOS. Source: Unity.

Today, as promised at WWDC, Unity launched a beta technology for visionOS developers called PolySpatial, designed to help developers bring their apps and games to the Vision Pro, which Apple has said will debut in early 2024.

Mike Rockwell, Apple’s vice president of the Vision Products Group, was quoted in Unity’s press release as saying:

We know there is a huge community of developers who have been building incredible 3D experiences using Unity’s robust authoring tools, and we’re so excited for them to build apps for Apple Vision Pro. Unity-based apps and games run natively on Apple Vision Pro, so they have access to groundbreaking visionOS features including low latency pass-through and high-resolution rendering. This enables Unity developers to take full advantage of the powerful and unique capabilities of Apple Vision Pro. We can’t wait to see what incredible experiences are created.

Unity’s press release also revealed that Triband is bringing the studio’s game WHAT THE GOLF? to visionOS.

Unity kicked off the application process for developers who want to try the beta version of PolySpatial, saying:

Unity is excited to collaborate with Apple to bring familiar and powerful authoring tools for creating immersive games and apps for this new spatial computing platform, Apple Vision Pro. Your apps will get access to benefits such as pass-through and Dynamic Foveated Rendering, in addition to popular Unity features like AR Foundation and XR Interaction Toolkit.

With a deep integration between Unity’s new PolySpatial technology and visionOS, your apps can sit alongside other apps in the Shared Space. By combining Unity’s authoring and simulation capabilities with RealityKit’s managed app rendering, content created with Unity will look and feel at home.

Developers who are interested in Unity PolySpatial can learn more in the company’s blog post, which links to additional Unity and Apple development resources, and can sign up for the PolySpatial beta on Unity’s website.


David Smith on Adapting Widgetsmith’s Weather Features to visionOS

David Smith has been poking around the visionOS SDK and chronicling his efforts on his website. His latest experiment was to take the weather forecast part of Widgetsmith and adapt it to visionOS. We’ve seen a lot of developers post screenshots of what their app looks like when it’s first built in visionOS, and it’s impressive how little work it seems to take to get an app up and running on the new OS.

However, visionOS apps are fundamentally different from others and require further attention to adapt to the new UI language. That’s what I love about David’s post: he shares a screenshot of the weather view as it first appeared on visionOS, then walks through all the changes, big and small, that brought it to its final design.

The evolution of this view and others is a fascinating lesson in the UI differences between iOS and visionOS apps. Be sure to check out David’s post for more screenshots and details on the process of adapting Widgetsmith’s weather features to visionOS.


Apple Publishes Updated Human Interface Guidelines for visionOS

In addition to releasing the visionOS SDK and developer tools today, Apple has updated its Human Interface Guidelines and published additional visionOS documentation for developers. The updated HIG begins with an overview of designing for the Apple Vision Pro, covering topics like Passthrough, Spatial Audio, Focus and Gestures, Ergonomics, and Accessibility, advising developers to:

Embrace the unique features of Apple Vision Pro. Take advantage of space, Spatial Audio, and immersion to bring life to your experiences, while integrating passthrough, focus, and gestures in ways that feel at home on the device.
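As a rough illustration of that advice to flex between windows and immersion (a sketch only; the app name, scene identifier, and view are hypothetical), a visionOS app can pair an ordinary window with an opt-in fully immersive scene using the SwiftUI APIs Apple documents for the platform:

```swift
import SwiftUI
import RealityKit

// Sketch: a windowed scene plus an opt-in immersive scene, per the HIG's
// guidance to take advantage of space and immersion.
@main
struct SpatialSketchApp: App {
    var body: some Scene {
        // A familiar window for launch and everyday use.
        WindowGroup {
            LaunchView()
        }

        // A fully immersive scene the user can choose to enter.
        ImmersiveSpace(id: "stargazing") {
            RealityView { content in
                // Build or load RealityKit content here,
                // e.g. from a Reality Composer Pro package.
            }
        }
    }
}

struct LaunchView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Immersive View") {
            Task { await openImmersiveSpace(id: "stargazing") }
        }
    }
}
```

Keeping immersion behind an explicit user action like this matches the HIG's emphasis on integrating passthrough and gestures "in ways that feel at home on the device" rather than forcing users into a fully enclosed experience.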

If you’re interested in Apple’s design philosophy for the Vision Pro, the HIG is an excellent plain-English read. For developers who want to dive deeper into the details of building apps, Apple has also published a lot of additional documentation covering the nuts and bolts of building visionOS apps.


Apple Releases visionOS SDK and Developer Tools


Today, Apple announced the visionOS software development kit that will allow developers to start creating apps for the Apple Vision Pro. In addition to the SDK, an update to Xcode introduces Reality Composer Pro, which lets developers preview 3D models, animations, images, and sounds. There’s also a new visionOS simulator that can be used to test different room configurations and lighting for visionOS apps.

The developer labs that Apple announced at WWDC will open soon too:

Next month, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo to provide developers with hands-on experience to test their apps on Apple Vision Pro hardware and get support from Apple engineers.

Developers can also apply for an Apple Vision Pro developer kit, so they can test apps on the device itself. Anyone who has used Unity’s tools to build 3D apps and games will be able to port them to visionOS next month too.


Among the developers who have tried the visionOS SDK is Algoriddim, whose CEO, Karim Morsy, said:

The djay app on Apple Vision Pro puts a fully featured DJ system right at a user’s fingertips. With a reimagined spatial interface, anyone can mix their favorite music and apply real-time effects using just their eyes and hands. Whether for a beginner or a seasoned professional, djay on Vision Pro transforms the user’s surroundings with stunning environments that automatically react to their mix, enabling them to experience and interact with music in ways never before possible.

It’s great to see Apple getting these tools into the hands of developers so soon after WWDC. Building apps for Apple Vision Pro uses many of the same technologies and tools developers are already familiar with, like Xcode, SwiftUI, RealityKit, ARKit, and TestFlight. With excitement for Apple Vision Pro still high, now is the perfect time to put the new visionOS SDK and tools in developers’ hands as they plan for the device’s release next year.