I’m in the process of taking notes and doing research for my annual iOS and iPadOS review, and, like every year, this means I’m reassessing the state of all native Apple apps on my devices. One of these apps is Mail, which, unfortunately, won’t be receiving AI-powered enhancements for smart message categorization as soon as iOS...
The Pizza Is the Model
AppStories Episode 391 - The Pizza Is the Model
30:47
This week, Federico and John recap WWDC week with more on their early testing of the iOS and iPadOS 18 betas and an in-depth conversation about why they are disappointed with Apple’s decision to train its large language models on the Open Web.
The Winners and Losers of iOS and iPadOS 18
WWDC week is always a busy one at MacStories and this year, thanks to our extended team, we’ve been able to provide overviews for all the operating system updates announced by Apple. That has given me the ability to start testing both iOS and iPadOS 18 (which I will review later this year) right away...
The Issues of iPadOS 18’s New Tab Bars→
Earlier today on Mastodon, I shared some concerns regarding the Books app in iPadOS 18 and how Apple implemented the new tab bar design in the app. Effectively, by eschewing a sidebar, the app has returned to feeling like a blown-up iPhone version – something I hoped we had left behind when Apple announced they wanted to make iPad apps more desktop-class two years ago.
Unfortunately, it gets worse than Books. As documented by Nico Reese, the developer of Gamery, the new tab bars seem to fall short of the previous design in both visual affordances and flexibility for developers. For starters, the new tabs are just text labels, which may work well in English, but not necessarily in other languages:
Since the inception of the iPhone, tabs in a tab bar have always included a glyph and a label. With the new tab style, the glyphs are gone. Glyphs play a crucial role in UX design, allowing users to quickly recognize parts of the app for fast interaction. Now, users need to read multiple text labels to find the content they want, which is slower to perceive and can cause issues in languages that generally use longer words, such as German. Additionally, because tab bars are now customizable, they can even scroll if too many tabs are added!
You’ll want to check out Nico’s examples here, but this point is spot-on: since tab bars now sit alongside toolbar items, the entire UI can get very condensed, with buttons often ending up hidden away in an overflow menu:
Although Apple’s goal was to save space on the iPad screen, in reality, it makes things even more condensed. Apps need to compress actions because they take up too much horizontal space in the navigation bar. This constant adjustment of button placement in the navigation bar as windows are resized prevents users from building muscle memory. The smaller the window gets, the more items collapse.
If the goal was to simplify the iPad’s UI, well, now iPad users will end up with three ways to navigate apps instead of two, with the default method (the top bar) now generally displaying fewer items than before, without glyphs to make them stand out:
For users, it can be confusing why the entire navigation scheme changes with window resizing, and now they must adjust to three different variations. Navigation controls can be located at the top, the bottom, or the left side (with the option to hide the sidebar!), which may not be very intuitive for users accustomed to consistent navigation patterns.
The best way I can describe this UI change is that it feels like something conceived by the same people who thought the compact tab bar in Safari for iPad was a good idea, down to how tabs hide other UI elements and make them less discoverable.
Nico’s post has more examples you should check out. I think Marcos Tanaka (who knows a thing or two about iPad apps) put it well:
It makes me quite sad that one of the three iPad-specific features we got this year seems to be missing the mark so far. I hope we’ll see some improvements and updates on this front over the next three months before this feature ships to iPad users.
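For context on the developer side, here’s a minimal SwiftUI sketch of the iOS 18 TabView API behind this new design (the tab names and views are hypothetical, not taken from any of the apps mentioned above). Notably, even though each Tab accepts a symbol, the new top tab bar renders only the text labels Nico describes; the glyphs show up in the sidebar and in the compact, iPhone-style layout:

```swift
import SwiftUI

// A minimal sketch of the new tab bar API in iOS/iPadOS 18.
// Tab names, symbols, and views are illustrative placeholders.
struct LibraryRootView: View {
    var body: some View {
        TabView {
            Tab("Home", systemImage: "house") {
                Text("Home")
            }
            Tab("Library", systemImage: "books.vertical") {
                Text("Library")
            }
            Tab("Search", systemImage: "magnifyingglass") {
                Text("Search")
            }
        }
        // Lets the top tab bar adapt into a sidebar on iPad,
        // which is the third navigation variation discussed above.
        .tabViewStyle(.sidebarAdaptable)
    }
}
```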
WWDC 2024: The AppStories Interviews with ADA and Swift Student Challenge Distinguished Winners
AppStories Episode 390 - WWDC 2024: The AppStories Interviews with ADA and Swift Student Challenge Distinguished Winners
40:53
Today, for a special episode of AppStories recorded at Apple Park, Federico and John interview ADA winners Devin Davies, the maker of Crouton, Katarina Lotrič and Jasna Krmelj from Gentler Streak, and James Cuda and Michael Shaw of Procreate, plus Adelaide Humez and Dezmond Blair, Distinguished Winners of Apple’s Swift Student Challenge.
WWDC 2024: Intelligence Briefings
AppStories Episode 389 - WWDC 2024: Intelligence Briefings
28:52
For the latest WWDC episode of AppStories, Federico is joined by Myke Hurley to talk about the Vision Pro and Apple Intelligence before John pops up with some AI tidbits and a WWDC vibe check from in and around Apple Park.
Designing Dark Mode App Icons→
Apple’s announcement of “dark mode” icons has me thinking about how I would approach adapting “light mode” icons for dark mode. I grabbed 12 icons we made at Parakeet for our clients to illustrate some ways of going about it.
Before that though, let’s take some inventory. Of the 28 icons in Apple’s preview image of this feature, only nine have white backgrounds in light mode. However, all icons in dark mode have black backgrounds.
Actually, it’s worth noting that five “light mode” icons have black backgrounds, which Apple slightly adjusted to have a consistent subtle black gradient found on all of their new dark mode icons. Four of these—Stocks, Wallet, TV, and Watch—all seem to be the same in both modes. However, no other (visible) icons are.
Fantastic showcase by Louie Mantia of how designers should approach the creation of dark mode Home Screen icons in iOS 18. In all the examples, I prefer Mantia’s take to the standard black background version.
See also: Gavin Nelson’s suggestion, Apple’s Human Interface Guidelines on dark mode icons, and the updated Apple Design Resources for iOS 18.
Apple Details Its AI Foundation Models and Applebot Web Scraping→
From Apple’s Machine Learning Research1 blog:
Our foundation models are trained on Apple’s AXLearn framework, an open-source project we released in 2023. It builds on top of JAX and XLA, and allows us to train the models with high efficiency and scalability on various training hardware and cloud platforms, including TPUs and both cloud and on-premise GPUs. We used a combination of data parallelism, tensor parallelism, sequence parallelism, and Fully Sharded Data Parallel (FSDP) to scale training along multiple dimensions such as data, model, and sequence length.
We train our foundation models on licensed data, including data selected to enhance specific features, as well as publicly available data collected by our web-crawler, AppleBot. Web publishers have the option to opt out of the use of their web content for Apple Intelligence training with a data usage control.
We never use our users’ private personal data or user interactions when training our foundation models, and we apply filters to remove personally identifiable information like social security and credit card numbers that are publicly available on the Internet. We also filter profanity and other low-quality content to prevent its inclusion in the training corpus. In addition to filtering, we perform data extraction, deduplication, and the application of a model-based classifier to identify high quality documents.
It’s a very technical read, but it shows how Apple approached building AI features in their products and how their on-device and server models compare to others in the industry (on servers, Apple claims their model is essentially neck and neck with GPT-4-Turbo, OpenAI’s older model).
This blog post, however, pretty much parallels my reaction to the WWDC keynote. Everything was fun and cool until they showed generative image creation that spits out slop “resembling” (strong word) other people; and in this post, everything was cool until they mentioned how – surprise! – Applebot had already indexed web content to train their model without publishers’ consent, and publishers can only opt out now. (This was also confirmed by Apple executives elsewhere.)
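For publishers wondering what that opt-out looks like in practice, my understanding is that the “data usage control” Apple mentions is the robots.txt-based Applebot-Extended rule the company documented alongside these announcements; a minimal sketch of what a site owner would add (comments are mine):

```
# Keep Applebot crawling for Siri and Spotlight results,
# but opt this site out of Apple's foundation model training.
User-agent: Applebot-Extended
Disallow: /
```

Of course, as with every crawler-level control, this only applies going forward; it does nothing about content that was already scraped.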
As a creator and website owner, I guess that these things will never sit right with me. Why should we accept that certain data sets require a licensing fee but anything that is found “on the open web” can be mindlessly scraped, parsed, and regurgitated by an AI? Web publishers (and especially indie web publishers these days, who cannot afford lawsuits or hiring law firms to strike expensive deals) deserve better.
It’s disappointing to see Apple muddy an otherwise compelling set of features (some of which I really want to try) with practices that are no better than the rest of the industry.
- How long until this becomes the ‘Apple Intelligence Research’ website? ↩
WWDC 2024: Keynote First Impressions
AppStories Episode 388 - WWDC 2024: Keynote First Impressions
01:03:19
For this special episode of AppStories, Federico is joined by Jonathan and Niléane live in the Club MacStories+ Discord community to share their first impressions of the WWDC 2024 Keynote.