Looking beneath the surface of iOS 17 and its collection of system tweaks and new app features, there is one emerging theme worth paying attention to: Apple’s bet on widgets everywhere and the progressive modularization of apps.
The company may not have come out and said it in the same terms as I will, but it’s clear from iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10 that widgets – tiny pieces of apps delivered to the user in different contexts – have become something more than Home Screen ornaments. With interactivity now finally supported everywhere and a new area of iOS 17 dedicated entirely to widgets, Apple evidently believes in widgets as an interaction paradigm that is here to stay for the next several years.
Of all the widget-based features of iOS 17, StandBy feels like an obvious precursor to a standalone HomePod with a screen in the future. While I was testing StandBy this summer, I found myself wondering every day whether I’d be willing to give up space on my desk or in my kitchen to make room for a HomePod that combined Siri conversations with widget interactions. The answer was always an unequivocal Yes.
With widgets now supporting interactions in every area of iOS, iPadOS, watchOS, and macOS where they can be placed, and developers flocking to the App Store with updated widgets, Apple has an established ecosystem of widget apps that could power the kind of smart speaker no other company can offer.
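To make the idea of interactive widgets concrete: in iOS 17, a widget’s buttons and toggles can run an App Intent directly, without launching the app. Here’s a minimal sketch of what that looks like for a developer; the app, the intent name, and the logging behavior are all hypothetical, but `Button(intent:)` is the real SwiftUI initializer that powers widget interactivity.

```swift
import SwiftUI
import AppIntents

// Hypothetical intent for an imaginary habit-tracking app.
// "LogWaterIntent" is illustrative, not a real API.
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Water"

    func perform() async throws -> some IntentResult {
        // A real app would write to its shared data store here,
        // then the system reloads the widget's timeline.
        return .result()
    }
}

// A widget view whose button executes the intent in place –
// the tap is handled without opening the app.
struct WaterWidgetView: View {
    var body: some View {
        Button(intent: LogWaterIntent()) {
            Label("Log 8 oz", systemImage: "drop.fill")
        }
    }
}
```

Because the same intent type can also be exposed to Shortcuts, the Action button, and Siri, one small piece of app logic ends up reachable from every surface the article describes.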
As apps keep getting sliced into different components and accessed via self-contained interactions, I could see StandBy becoming the foundation for a smart, personalized product I’d want to place in my home. I have a feeling we’ve only seen the proverbial tip of the iceberg with StandBy, and I hope my theory’s right.
Substantial iPadOS updates were evidently put on hold this year, with Apple doing just what was necessary to fix the flavor of Stage Manager they shipped last year.
After my six months in the computing wilderness, I’m happy to call iPadOS my home again. I can get work done on my iPad Pro without the bugs and confusing interactions of iPadOS 16; working with external displays is considerably better than before thanks to support for webcams and more flexible windowing layouts; I can even use my iPad Pro as a portable display for my ROG Ally and Nintendo Switch thanks to integration with capture cards. Now more than ever, I love the iPad’s modularity and can’t wait to see what Apple is preparing next.
There are still parts of my job, however, that I can’t do on an iPad Pro. iPadOS’ integration with external audio devices continues to be non-existent, which means I cannot record podcasts on my iPad Pro with the same setup I use on my MacBook Air. That’s a shame, and given how Apple is focused on finishing visionOS for the Vision Pro launch next year, I have to wonder how long we’ll have to wait for audio improvements to finally arrive on iPadOS.
iOS and iPadOS 17 point to a key technological investment that is shaping Apple’s software platforms: App Intents.
The framework that makes apps’ content and functionalities available to system services outside of the apps themselves is, undeniably, the main protagonist of this year’s offerings on iOS and iPadOS. Intents are being integrated into everything: they power widgets on the Home Screen, Lock Screen, and StandBy; they’re behind Live Activities; they serve as the foundation for Focus Filters and the Action button on Apple Watch Ultra and iPhone 15 Pro; they’re at the core of the Siri experience and, of course, the Shortcuts app.
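The reason one framework can power all of those surfaces is that an App Intent is just a small, self-describing type: the developer declares a title, parameters, and a `perform()` method, and the system decides where to surface it. A minimal sketch, assuming a hypothetical podcast app (the type and parameter names are invented for illustration; the `AppIntent` protocol, `@Parameter`, and `perform()` are the framework’s actual shape):

```swift
import AppIntents

// Hypothetical intent: names are illustrative, not from a real app.
struct PlayEpisodeIntent: AppIntent {
    static var title: LocalizedStringResource = "Play Latest Episode"

    // Parameters are declared once; Shortcuts, Siri, and Focus Filters
    // can all prompt for or supply them.
    @Parameter(title: "Show Name")
    var showName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would look up the show and start playback here.
        return .result(dialog: "Playing the latest episode of \(showName).")
    }
}
```

Once a type like this exists, the same declaration backs a Shortcuts action, a Siri phrase, a widget button, and an Action button assignment – which is exactly the “glue” role the rest of this section describes.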
In the years I’ve been covering Apple’s operating systems, I can’t remember any other framework that started with relatively humble beginnings and gradually became the glue for so many different types of experiences. That’s my takeaway for widgets, StandBy, and the overarching App Intents story in iOS 17: this technology that Apple has spent years refining has now turned into a modular system that can turn apps into discrete components that adapt to what you’re doing on an iPhone or iPad.
iOS and iPadOS 17 show that we now have to contend with different permutations of the concept of “app”.
Let me put it another way. A modern app for the Apple ecosystem is more than an icon on the Home Screen: it’s a container for a diverse set of experiences on a range of different hardware products. And what is the end goal? What happens when the progressive modularization of apps meets a large language model? What will our interactions with an iPhone be like when Siri becomes capable of mixing and matching different app features with simple sentences?
When an app is not just its Home Screen icon anymore but can be an interactive tile on multiple Home Screen pages, an accessory on the Lock Screen, a widget available when your iPhone is charging, or perhaps a command spoken to an intelligent assistant, I’m left wondering:
What does it even mean to be an app anymore?