This week on AppStories, we explore Apple’s 2020 gaming updates, including deeper controller support and a refreshed Game Center, which are coming this fall.
Today Samuel Axon at Ars Technica published a new interview with two Apple executives: SVP of Machine Learning and AI Strategy John Giannandrea and VP of Product Marketing Bob Borchers. The interview is lengthy but well worth reading, especially since it’s the most we’ve heard from Apple’s head of ML and AI since he left Google to join the company in 2018.
Based on some of the things Giannandrea says in the interview, it sounds like he’s had a very busy two years. For example, when asked how Apple has used machine learning in its recent software and products, Giannandrea rattles off a variety of examples before ultimately suggesting that it’s harder to name things that don’t use machine learning than things that do.
There’s a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we’ve released in the past around heart health and things like this. I think there are increasingly fewer and fewer places in iOS where we’re not using machine learning. It’s hard to find a part of the experience where you’re not doing some predictive [work].
One interesting tidbit mentioned by both Giannandrea and Borchers is that Apple’s increased dependence on machine learning hasn’t led to the company talking about ML non-stop. I’ve noticed this too – whereas a few years ago the company might have thrown out ‘machine learning’ countless times during a keynote presentation, these days it’s intentionally more careful and calculated about when it invokes the term, and I think for good reason. As Giannandrea puts it, “I think that this is the future of the computing devices that we have, is that they be smart, and that, that smart sort of disappear.” Borchers expounds on that idea:
This is clearly our approach, with everything that we do, which is, ‘Let’s focus on what the benefit is, not how you got there.’ And in the best cases, it becomes automagic. It disappears… and you just focus on what happened, as opposed to how it happened.
The full interview covers subjects like Apple’s Neural Engine, Apple Silicon for Macs, the benefits of handling ML tasks on-device, and much more, including a fun story from Giannandrea’s early days at Apple. You can read it here.
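As a small illustration of the on-device theme the interview keeps returning to, here’s a minimal sketch of how an app can run an image classifier locally with Apple’s Core ML and Vision frameworks. `MyClassifier` is a hypothetical stand-in for any model bundled with an app, not anything drawn from Apple’s own software:

```swift
import CoreML
import Vision

// "MyClassifier" is a placeholder for any Core ML model added to an app;
// Xcode generates a Swift class like this for every .mlmodel in a project.
// The point of the sketch: the image never leaves the device.
func classifyOnDevice(_ image: CGImage) throws {
    let coreMLModel = try MyClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let observations = request.results as? [VNClassificationObservation],
              let top = observations.first else { return }
        print("\(top.identifier) (\(top.confidence))")
    }

    // Vision runs the model locally, using the Neural Engine or GPU when available.
    try VNImageRequestHandler(cgImage: image).perform([request])
}
```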
This week on AppStories, we bring back an AppStories classic: Pick 2, an in-depth look at two apps. In this installment, Federico explains why he has been revisiting Castro for listening to podcasts, and John covers the evolution of Grammarly into a terrific grammar and spell-checking app for writers.
Every year Apple releases a report detailing its progress on environmental efforts, and alongside the release of this year’s report, the company has announced a new commitment for the decade ahead:
Apple today unveiled its plan to become carbon neutral across its entire business, manufacturing supply chain, and product life cycle by 2030. The company is already carbon neutral today for its global corporate operations, and this new commitment means that by 2030, every Apple device sold will have net zero climate impact.
“Businesses have a profound opportunity to help build a more sustainable future, one born of our common concern for the planet we share,” said Tim Cook, Apple’s CEO. “The innovations powering our environmental journey are not only good for the planet — they’ve helped us make our products more energy efficient and bring new sources of clean energy online around the world. Climate action can be the foundation for a new era of innovative potential, job creation, and durable economic growth. With our commitment to carbon neutrality, we hope to be a ripple in the pond that creates a much larger change.”
Achieving carbon neutrality for its corporate operations was a nice milestone for the company, but this new commitment appears far more challenging. Apple works with third-party suppliers and manufacturers all around the world to build its devices, so fulfilling the new goal depends heavily on those third parties. It will be interesting to watch the different steps Apple takes over the next decade to reach carbon neutrality, but the reported plan to include fewer accessories in the box with new iPhones certainly seems like it would help.
Today Apple announced an expansion of its initiative to partner with Historically Black Colleges and Universities (HBCUs) and create hubs for training the next generation of coders. Ten new HBCU coding centers are being added throughout the US, and nearly 500 teachers and community leaders from those centers will soon participate in “a virtual Community Education Initiative Coding Academy that Apple is hosting for all initiative partners.” During this training:
Educators will learn the building blocks of coding with Swift, Apple’s easy-to-learn coding language. Participants will work in teams to design app prototypes to address real community challenges. After completing the coding academy, educators will begin to integrate the coding and creativity curricula into their communities by launching coding clubs and courses at their schools, hosting community coding events, and creating workforce development opportunities for adult learners.
This announcement comes just a week after Apple shared updates to its lineup of coding resources for students, educators, and families, demonstrating the company’s investment in coding initiatives across all age groups. The move also follows Tim Cook’s open letter in June addressing racism in America and the company’s subsequent creation of a $100 million Racial Equity and Justice Initiative. Lisa Jackson, the executive leading that initiative, commented on today’s HBCU news:
“Apple is committed to working alongside communities of color to advance educational equity,” said Lisa Jackson, Apple’s vice president of Environment, Policy and Social Initiatives. “We see this expansion of our Community Education Initiative and partnership with HBCUs as another step toward helping Black students realize their dreams and solve the problems of tomorrow.”
The last couple of months have seen many companies express a desire to work toward racial equality and justice, but true change takes more than words, so I’m glad to start seeing the early fruits of Apple’s new commitments.
In April 2019, Apple published a video called The Underdogs that followed the story of a team of co-workers designing a round pizza box. Today, the quartet is back in a sequel of sorts called The Whole Work-From-Home Thing.
The new video picks up with the same group of colleagues, who this time attempt to design an all-new box while working from home, racing a tight deadline around the clock while juggling personal obligations and coping with the realities of remote work.
The pace is frenetic. Over the course of the multi-day ordeal, the group turns to their Macs, iPads, and iPhones to come up with ideas and design the box. They also rely on a wide array of apps, including third-party apps like MindNode and Adobe InDesign.
Like its predecessor, ‘The Whole Work-From-Home Thing’ is funny but succeeds at demonstrating ways that Apple hardware and apps can solve some of the problems facing many people these days. This video may hit a little too close to home and stress some people out a bit, but I enjoyed the lighthearted fun poked at working from home and think it’s well worth watching.
Epic Games has released a new iPhone app for video game developers that captures facial expressions and pipes them into the company’s Unreal Engine in real time. As explained on the Unreal Engine blog:
Live Link Face streams high-quality facial animation in real-time from your iPhone directly onto characters in Unreal Engine. The app’s tracking leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting this data directly to Unreal Engine via Live Link over a network.
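For developers curious what the ARKit side of a tool like this involves, here’s a minimal sketch – not Epic’s actual implementation, and the `FaceCaptureSession` class is hypothetical – showing how an iPhone app can read the per-frame blend shape coefficients that ARKit’s face tracking produces, which a tool could then serialize and stream to an engine:

```swift
import ARKit

// A hypothetical capture class: it runs ARKit face tracking on the TrueDepth
// camera and reads the blend shape coefficients for each frame. The networking
// that would stream these values to an engine is omitted.
final class FaceCaptureSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a device with a TrueDepth camera")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever tracked anchors update, typically every frame.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes maps expression names (e.g. .jawOpen, .eyeBlinkLeft)
            // to coefficients from 0 to 1 describing how strongly each is expressed.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            // A real tool would package every coefficient and send it over the
            // network to the engine here.
            print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blinkLeft)")
        }
    }
}
```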
What I find most interesting about Live Link Face is that Epic says it scales from solo developers working at home to sophisticated stage productions involving actors in motion capture suits and multiple iPhones. If so, that will make the app a terrific example of the sort of democratization of complex tools that technologies like ARKit and hardware like the iPhone’s TrueDepth camera make possible when integrated into existing workflows.