In many ways, the iPhone 11 Pro’s camera system feels like the culmination of over a decade’s worth of judicious, relentless improvements. Not only is the device’s camera the best and smartest Apple has ever shipped, but it also affords the most photographic freedom, allowing non-professional photographers like me to produce amazing shots with minimal effort.
Ironically, Apple chose to name this year’s update to macOS after an island. Since the iPhone and iOS took off, macOS has sometimes felt like an island isolated from the rest of the company’s OSes, but the goal articulated by the company at WWDC this year was quite the opposite. Apple clearly telegraphed that change is coming to the Mac and it’s designed to bridge the user experiences between each of its platforms.
To developers, that message came in the form of Catalyst and SwiftUI. Catalyst, which was previewed as an unnamed ‘Sneak Peek’ in 2018, is meant to make it easier for iPadOS developers to bring their apps to the Mac. SwiftUI has a similar longer-term goal of unifying and streamlining how developers build the interfaces for their apps across a range of devices, for everything from the Apple Watch to the Mac.
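With SwiftUI, a single declarative description of an interface can be compiled for each of Apple's platforms. A minimal sketch of the idea (the view and its contents are illustrative, not from any shipping app):

```swift
import SwiftUI

// A single declarative view description that can back interfaces
// on watchOS, iOS, iPadOS, and macOS alike.
struct GreetingView: View {
    var name: String

    var body: some View {
        VStack {
            Text("Hello, \(name)")
                .font(.headline)
            Button("Say hi") {
                // Behavior is shared; each platform renders the
                // control in its own native style.
                print("Hi, \(name)!")
            }
        }
    }
}
```

The framework, not the developer, decides how the `VStack`, `Text`, and `Button` are rendered on each device, which is what makes the cross-platform story possible.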
The efforts to draw macOS closer to Apple’s other operating systems run deeper than just developer tools, though. macOS may have been the foundation on which iOS was built, but in the years that followed iOS’s introduction, the two OSes grew apart. Identically-named apps were developed on different schedules, which meant they rarely included the same features. Also, system-level functionality like System Preferences, which serves the same purpose as iOS’s Settings app, was unfamiliar, making Mac adoption unnecessarily hard for newcomers. Catalina is an attempt to address those kinds of inconsistent user experiences.
With Catalina, Apple has taken clear, though not always successful, steps to bridge the divide between the Mac and iOS. App functionality has been realigned, System Preferences has been rearranged, and new features have been added to make it easier to move from one platform to the other.
As with other transitional periods in the Mac’s history, this one isn’t going to be easy. However, because it’s driven by a fundamental shift in computing, it’s also necessary. We live in a new climate where computing is dominated by mobile devices. For many people, a smartphone is all the computing power they need day-to-day. That doesn’t mean there’s no longer a place for the Mac, but that shift is clearly what’s driving the changes in Catalina.
Apple could have chosen to ignore the shift of the ground beneath its feet and merely maintained macOS, making the kind of small incremental changes we’ve become accustomed to in recent years. However, not adapting is as deliberate a choice as change is, and it carries just as much or more uncertainty for the Mac as a platform because it risks irrelevance.
The Mac isn’t in crisis, but it isn’t healthy either. Waiting until the Mac is on life support isn’t viable. Instead, Apple has opted to reimagine the Mac in the context of today’s computing landscape before its survival is threatened. The solution is to tie macOS more closely to iOS and iPadOS, making it an integrated point on the continuum of Apple’s devices that respects the hardware differences of the platform but isn’t different simply for the sake of difference.
Transitions are inherently messy, and so is Catalina in places. It’s a work in progress that represents the first steps down a new path, not the destination itself. The destination isn’t clear yet, but Catalina’s purpose is: it’s a bridge, not an island.
The general expectation leading up to this year’s iPhone and Apple Watch debuts was that “boring” updates were in store. The iPhone, it was reported, would have an unattractive triple-camera system and little else in the way of improvements; some thought the Apple Watch might not get an update at all.
When Apple officially introduced its new devices to the world, my own reactions were largely positive, though a little mixed. On paper, the latest iPhone and Apple Watch models offer fewer year-over-year improvements than Apple usually treats us to. But the advancements that are here – cameras and battery for the iPhone, an always-on display for the Watch – are qualitatively huge.
Apple is really good at making two key things: revolutionary products and iterative ones. Every now and then the company creates something that’s truly transformative, a product with undeniable cultural impact. The iPod, iPhone, and iPad are classic picks, but more recently AirPods and the iPhone X deserve similar recognition. However, in between these giants sit a lot of iterative updates, where existing products get a little bit better. Stacked against the culture-shakers these iterative updates are comparatively less exciting, but they’re almost always objectively better products than their predecessors.
The iPhone 11 and Apple Watch Series 5 lines aren’t revolutionary, but they may well be remembered as some of the best iterative products Apple has ever shipped.
My early impression, after just a few days with the iPhone 11 Pro and Watch Series 5, is that this year’s updates have the potential to stand out over time for one main reason: they give users what we’ve all been asking for.
With last year’s release of the Apple Watch Series 4, it felt as though Apple had finally reached a point of equilibrium on the hardware side of the device. The Series 4 brought the first physical redesign, thinning the Watch out and stretching its slightly larger screen to the corners. It packed a processor that finally felt overpowered rather than underpowered, and it kept the Apple Watch’s all-day battery life going strong. The update rounded out with added health sensors for ECGs, background heart monitoring, Bluetooth 5 support, and a new speaker system. Those advancements joined the cellular capabilities from the Series 3, and have now been joined by the always-on display of the Series 5. I’m running out of feature requests for the Apple Watch.
The hardware may now be in place, but as we all know hardware is only part of the story. On the software side, the Apple Watch found its footing two years ago, but had a lot of catching up to do to reach the level of maturity of its hardware. Iteration is Apple’s specialty, and its increasingly strong understanding of the Apple Watch’s purpose has made the software path clear. Last year’s watchOS 5 brought significant fitness and audio improvements, the addition of web content, and more interactive notifications. This year’s update brings us even more.
watchOS 6 flew under the radar at the packed and exciting WWDC keynote this June. It isn’t the flashiest update, but the Apple Watch had enough flashy updates in its early years to last a while longer. This is a year for iteration, and Apple has been iterating on all cylinders. watchOS 6 is a quiet giant, adding or redesigning more first-party apps at once than we’ve seen in years, dropping the largest batch of new watch faces since watchOS 1, providing a new way to track fitness over time, and kicking off a nascent foray into Apple Watch independence. Let’s see how Apple did.
Today alongside the launch of the iPhone 11 line and Apple Watch Series 5, Apple reopened its long-under-construction Apple Fifth Avenue retail location. First closed in January 2017 and relocated to a temporary space nearby, the new Fifth Avenue opens as the largest Apple Store in the world; it also remains the only location open 24 hours per day, 365 days per year.
Following years of a judicious union between platforms, it’s time for iPad to embark on its own journey.
In looking back at major iOS releases from the recent past, it’s easy to see how building and positioning these annual updates has become a careful balancing act for Apple.
In last year’s iOS 12, we saw the company focus on improving performance, providing users with tools to understand their device usage habits, and adapting Workflow to the modern era of Siri and proactive suggestions. The strategy was largely successful: iOS 12 was regarded as Apple’s most reliable iOS release of late – a reputation that has resulted in a 90% adoption rate a year later; and the Shortcuts app – the highlight of last year from a user feature perspective – is becoming a built-in (and thus more powerful) app in iOS 13.
For all that Apple accomplished in iOS 12, however, some areas of the experience inevitably had to be put on the back-burner. iOS 12 lacked not only improvements to Reminders and Files, but also a long-awaited dark mode (which was rolled out on macOS instead), as well as more substantial tweaks to the ever-evolving iOS 7 design language; chief among iOS 12’s absentee list, of course, was iPad. Even though Apple had trained users to expect major additions to the tablet platform on a biennial schedule (see iOS 9 and iOS 11), the lack of meaningful iPad features in iOS 12 spurred a contentious discussion: new iPad Pro hardware was so far ahead of its software that it legitimized asking whether investing in that hardware was even worth it.
The annual debate that surrounds which features make it into each major iOS release is symptomatic of a complicated truth: iOS isn’t just the operating system that runs on iPhones anymore, and these annual releases are more than a mere collection of updated apps. iOS is the platform for an ecosystem of devices – from our wrists and speakers to cars and TV sets – and its changes have repercussions that ripple far beyond an updated Reminders app or a new icon set.
This, of course, has been the case for a few years at this point, but the nature of iOS as an all-encompassing platform has never been as evident as it is today in iOS 13. For the first time since I started reviewing Apple’s annual iOS updates, it feels like the company is now keenly aware that a new iOS version has to cover an array of themes that can’t be pushed back for scheduling reasons. A single area of attention isn’t enough anymore – not for the Apple of 2019 as an economic, political, and social force, and not for iOS, the engine powering devices that aren’t just screens for apps, but bona fide lifestyle computers.
As a result, there’s something for everyone in iOS 13 and all the recurring themes of Tim Cook’s Apple are touched upon this time around. iOS 13 improves Face ID recognition and promises improvements to app download sizes and performance. Apple is sending strong signals on its commitment to privacy as a feature with a new sign-in framework for apps and enhancements to location tracking controls and HomeKit cameras. iOS’ design language is getting its biggest update in years with dark mode, new tools for developers to express colors and embed glyphs in their user interfaces, updated context menus, and redesigns aimed at facilitating one-handed interactions. We have notable improvements to built-in apps, including the rebuilt Reminders and Health, an overhauled Files app, and hundreds of quality-of-life tweaks that, in big and small ways, make iOS more capable and efficient.
No stone is left unturned in iOS 13 – and that includes iPad too.
The iPad experience has always been largely consistent with the iPhone – particularly since Apple unified core iOS interactions around a screen without a Home button – but also distinct from it. iOS 13 makes this distinction official by splitting off a second branch called iPadOS, which uses iOS as its foundation but is specifically optimized and designed for the iPad.
It was clear when the new iPad Pro launched in late 2018 that it told only one part of a bigger story about the role of the tablet in Apple’s modern ecosystem. With iPadOS, Apple is ready to tell that full story: while the iPad has always been an extension of iOS, sharing key similarities with the iPhone hardware and software, it’s been evolving – arguably, a bit too slowly – into a different breed of computer that is fundamentally distinct from a phone.
We’ve been able to observe this divergence starting in iOS 9 with Split View multitasking and Apple Pencil, and the transition continued with iOS 11 and its drag and drop-infused environment. It was only natural (and well-deserved) for the iPad to begin advancing in a parallel direction to iOS – informed and inspired by it, but also capable of growing on its own and tackling problems that an iPhone doesn’t have to solve.
From this standpoint, there are two sides to iOS 13: on one hand, an underlying tide that raises all platforms, featuring a distillation of themes Apple comes back to on an annual basis; on the other, a fork in the road, opening a new path for the iPad’s next decade. And against this backdrop, a single question looms large:
Can Apple balance both?
Another year, another batch of Siri improvements aimed at enhancing what’s already there, not radically transforming it. Siri in iOS 13 comes with a handful of changes, all of which are in line with the types of iteration we’re used to seeing for Apple’s intelligent assistant. Siri now offers suggested actions in more places and ways than before, its voice continues to sound more human, and perhaps this year’s biggest change is a new SiriKit domain for media, which should enable – after the necessary work by third-party developers – audio apps like Spotify, Overcast, and Audible to be controlled by voice the way Apple’s native Music, Podcasts, and Books apps can be.
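In practice, adopting the new media domain means implementing SiriKit’s `INPlayMediaIntentHandling` protocol. A rough sketch of a handler (the class name and the in-app playback details are assumptions, not from any shipping app):

```swift
import Intents

// Hypothetical intent handler for the new SiriKit media domain.
// Responding with .handleInApp asks the system to launch the app
// in the background so it can begin playback itself.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Inspect intent.mediaItems here to find the requested
        // show, book, or playlist, then defer playback to the app.
        completion(INPlayMediaIntentResponse(code: .handleInApp,
                                             userActivity: nil))
    }
}
```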
It’s hard to believe it’s been nearly six years since Apple added game controller support to iOS. The big news at WWDC in 2013 was the iOS 7 redesign, but for game developers, it was rivaled by the announcement that third-party Made For iPhone (MFi) controllers were coming.
The game press and developers understood the potential of controller support immediately. Even though it wasn’t announced there, Chris Plante of Polygon declared controller support the biggest story of E3, the game industry trade show that was happening at the same time as WWDC. Plante imagined that:
If Apple finds a way to standardize traditional controls, every iOS device will become a transportable console. In a year, both iPhones and iPads will approach the processing power of the current-generation devices. Companies will have the ability to port controller-based games for the mobile devices in millions of pockets — an install-base far greater than they’ve ever had before.
Game industry veteran Gabe Newell, the co-founder of Valve, saw Apple’s entry as a big risk to companies making PC and console games:
The threat right now is that Apple has gained a huge amount of market share, and has a relatively obvious pathway towards entering the living room with their platform…I think Apple rolls the console guys really easily.
I was right there with them. iOS devices couldn’t match the power of a traditional console in 2013, but you could see that they were on a trajectory to get there. With the addition of controller support, Apple felt poised to make a meaningful run at incumbents like Sony and Microsoft.
It didn’t work out that way though. iOS’ controller support was rushed to market. Early controllers were priced at around $100, in part because of the requirements of the MFi certification, and they couldn’t match the quality of controllers from Sony and Microsoft.
As anticipated, controller support was extended to the Apple TV when its App Store launched in 2015. Initially, it looked as though Apple would allow game developers to require a controller. In the end, though, the company went an entirely different direction by requiring that games support the Apple TV Remote, a decision that complicated development and dumbed down controller integration to match the remote’s limited input methods. Apple changed course eventually, and now lets developers require controllers, but by the time of that change the damage had been done. Many developers had already lost interest in controller support. It didn’t help either that for a very long time, the App Store didn’t indicate which games were compatible with MFi controllers, leaving the void to be filled by third-party sites.
Last year, when I looked back at the history of games on the App Store for its tenth anniversary, I came away pessimistic about the future of games on Apple’s platforms. After a decade, I felt like we were still asking the same question that Federico posed in 2013:
Will Apple ever develop a culture and appreciation for gaming as a medium, not just an App Store category?
Sadly, Federico’s question remains as relevant today as it was six years ago. Still, I’m cautiously optimistic based on what’s happened in the past year. Part of that is the App Store editorial team’s excellent track record of championing high-quality games in the stories published on the App Store. Another factor is Apple Arcade, the game subscription service we still don’t know a lot about, but which appears designed to showcase high-quality, artistically important games.
The latest cause for optimism is Apple’s announcement at WWDC this past June that iOS, iPadOS, tvOS, and macOS would all support the Sony DualShock 4 and Bluetooth-based Xbox controllers when Apple’s OSes are updated this fall. The reaction from developers and other observers was a combination of surprise and excitement that was uncannily similar to the MFi announcement in 2013. Yet the news raises an obvious question: ‘How is this time any different?’ The answer lies in how the new controllers work and the role they will play in Arcade.
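Part of the appeal for developers is that the new controllers surface through the same GameController framework that MFi hardware has always used, so existing code picks them up with little or no change. A minimal sketch of listening for any connected controller (the button handling is illustrative):

```swift
import GameController

// DualShock 4 and Xbox controllers appear through the same
// GCController API that MFi controllers have always used.
NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect,
    object: nil,
    queue: .main
) { notification in
    guard let controller = notification.object as? GCController,
          let gamepad = controller.extendedGamepad else { return }

    // React to the A (or cross) button regardless of which
    // brand of controller the player connected.
    gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed {
            // Jump, confirm, etc.
        }
    }
}
```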
This year’s Worldwide Developers Conference was big. From dark mode in iOS 13 to the newly-rechristened iPadOS to the unveiling of the born-again Mac Pro and more, Apple’s annual week-long bonanza of all things software was arguably one of the most anticipated and exciting events in recent Apple history.
Accessibility certainly contributed to the bigness as well. Every year Apple moves mountains to ensure accessibility’s presence is felt not only in the software it previews, but also in the sessions, labs, and other social gatherings in and around the San Jose Convention Center.
“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,” Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, said to me following the keynote. “There’s something in each operating system and things for a lot of different types of use cases.”
One announcement that unquestionably garnered some of the biggest buzz during the conference was Voice Control. Available on macOS Catalina and iOS 13, Voice Control is a method of interacting with a Mac or iOS device using only one’s voice. A collaborative effort between Apple’s Accessibility Engineering and Siri groups, Voice Control aims to revolutionize the way users with certain physical motor conditions access their devices. At a high level, it’s very much a realization of the kind of ambient, voice-first computing dreamed up by sci-fi television stalwarts like The Jetsons and Star Trek decades ago. You talk, it responds.
And Apple could not be more excited about it.