Posts in Linked

Lux’s Sebastiaan de With on the iPhone 16e’s Essential Camera Experience

As I read Sebastiaan de With’s review of the iPhone 16e’s camera, I found myself chuckling when I got to this part:

You can speculate what the ‘e’ in ‘16e’ stands for, but in my head it stands for ‘essential’. Some things that I consider particularly essential to the iPhone are all there: fantastic build quality, an OLED screen, iOS and all its apps, and Face ID. It even has satellite connectivity. Some other things I also consider essential are not here: MagSafe is very missed, for instance, but also multiple cameras. It be [sic] reasonable to look at Apple’s Camera app, then, and see what comprises the ‘essential’ iPhone camera experience according to Apple.

What amused me was that I initially planned to call my iPhone 16e review the ‘e’ Is for Essential, but I settled on ‘elemental’ instead. Whether the ‘e’ in iPhone 16e stands for either of our guesses or neither really doesn’t matter. Like Sebastiaan, I find what Apple chose to include and exclude from the 16e fascinating.

When it comes to the iPhone 16e’s camera, there are differences compared to the iPhone 16 Pro, which is the focus of Sebastiaan’s review. The 16e supports fewer features than the Pro and the photos it takes don’t reproduce quite as much detail, especially in low-light conditions. There are other differences, too, so it’s worth comparing the review’s side-by-side comparison shots of the 16e to the 16 Pro.

Overall, though, I think it’s fair to say Sebastiaan came away impressed with the 16e’s camera, which has been my experience, too. So far, I’ve only used it to shoot video for our podcasts, and with good lighting, the results are excellent. Despite some differences, the iPhone 16e, combined with the wealth of photo and video apps like Lux’s Halide and Kino, is a great way to enjoy the essential iPhone photography experience.

Permalink

Where’s Swift Assist?

Last June at WWDC, Apple announced Swift Assist, a way to generate Swift code using natural language prompts. However, as Tim Hardwick writes for MacRumors, Swift Assist hasn’t been heard from since then:

Unlike Apple Intelligence, Swift Assist never appeared in beta. Apple hasn’t announced that it’s been delayed or cancelled. The company has since released Xcode 16.3 beta 2, and as Michael Tsai points out, it’s not even mentioned in the release notes.

Meanwhile, developers have moved on, adopting services like Cursor, which does much of what was promised with Swift Assist, if not more. A similar tool built specifically for Swift projects and Apple’s APIs would be a great addition to Xcode, but it’s been nine months, and developers haven’t heard anything more about Swift Assist. Apple owes them an update.

Permalink

The M3 Ultra Mac Studio for Local LLMs

Speaking of the new Mac Studio and Apple making the best computers for AI: this is a terrific overview by Max Weinbach about the new M3 Ultra chip and its real-world performance with various on-device LLMs:

The Mac I’ve been using for the past few days is the Mac Studio with M3 Ultra SoC, 32-core CPU, 80-core GPU, 256GB Unified Memory (192GB usable for VRAM), and 4TB SSD. It’s the fastest computer I have. It is faster in my workflows for even AI than my gaming PC (which will be used for comparisons below; it has an Intel i9 13900K, RTX 5090, 64GB of DDR5, and a 2TB NVMe SSD).

It’s a very technical read, but the comparison between the M3 Ultra and a vanilla (non-optimized) RTX 5090 is mind-boggling to me. According to Weinbach, it all comes down to Apple’s MLX framework:

I’ll keep it brief; the LLM performance is essentially as good as you’ll get for the majority of models. You’ll be able to run better models faster with larger context windows on a Mac Studio or any Mac with Unified Memory than essentially any PC on the market. This is simply the inherent benefit of not only Apple Silicon but Apple’s MLX framework (the reason we can efficiently run the models without preloading KV Cache into memory, as well as generate tokens faster as context windows grow).

In case you’re not familiar, MLX is Apple’s open-source framework that – I’m simplifying – optimizes training and serving models on Apple Silicon’s unified memory architecture. It is a wonderful project with over 1,600 community models available for download.

As Weinbach concludes:

I see one of the best combos any developer can do as: M3 Ultra Mac Studio with an Nvidia 8xH100 rented rack. Hopper and Blackwell are outstanding for servers, M3 Ultra is outstanding for your desk. Different machines for a different use, while it’s fun to compare these for sport, that’s not the reality.

There really is no competition for an AI workstation today. The reality is, the only option is a Mac Studio.

Don’t miss the benchmarks in the story.

Permalink

Is Apple Shipping the Best AI Computers?

For all the criticism (mine included) surrounding Apple’s delay of various Apple Intelligence features, I found this different perspective by Ben Thompson fascinating and worth considering:

What that means in practical terms is that Apple just shipped the best consumer-grade AI computer ever. A Mac Studio with an M3 Ultra chip and 512GB RAM can run a 4-bit quantized version of DeepSeek R1 — a state-of-the-art open-source reasoning model — right on your desktop. It’s not perfect — quantization reduces precision, and the memory bandwidth is a bottleneck that limits performance — but this is something you simply can’t do with a standalone Nvidia chip, pro or consumer. The former can, of course, be interconnected, giving you superior performance, but that costs hundreds of thousands of dollars all-in; the only real alternative for home use would be a server CPU and gobs of RAM, but that’s even slower, and you have to put it together yourself. Apple didn’t, of course, explicitly design the M3 Ultra for R1; the architectural decisions undergirding this chip were surely made years ago. In fact, if you want to include the critical decision to pursue a unified memory architecture, then your timeline has to extend back to the late 2000s, whenever the key architectural decisions were made for Apple’s first A4 chip, which debuted in the original iPad in 2010. Regardless, the fact of the matter is that you can make a strong case that Apple is the best consumer hardware company in AI, and this week affirmed that reality.
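Thompson’s point about the 512GB configuration can be sanity-checked with simple memory arithmetic. Here’s a minimal sketch, assuming DeepSeek R1’s roughly 671 billion parameters and ignoring the extra memory a real deployment needs for the KV cache and activations:

```python
# Back-of-the-envelope memory math for running an LLM locally.
# Assumes ~671B parameters for DeepSeek R1; real deployments also
# need headroom for the KV cache and activations, ignored here.

def weights_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate weight memory in decimal GB at a given precision."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

fp16 = weights_gb(671, 16)  # ~1342 GB: out of reach for any single desktop
q4 = weights_gb(671, 4)     # ~336 GB: fits in 512 GB of unified memory

print(f"fp16: {fp16:.0f} GB, 4-bit: {q4:.0f} GB")
```

The same arithmetic explains the trade-off in the quote: 4-bit quantization shrinks the weights to a quarter of their fp16 size, at the cost of precision, and that’s what brings the model under the Mac Studio’s unified memory ceiling.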

Anecdotally speaking, based on the people who cover AI that I follow these days, it seems there are largely two buckets of folks who are into local, on-device models: those who have set up pricey NVIDIA rigs at home for their CUDA cores (the vast minority); and – the undeniable majority – those who run a spectrum of local models on their Macs of different shapes and configurations (usually, MacBook Pros). If you have to run high-end, performance-intensive local models for academic or scientific workflows on a desktop, the M3 Ultra Mac Studio sounds like an absolute winner.

However, I’d point out that – again, as far as local, on-device models are concerned – Apple is not shipping the best possible hardware on smartphones.

While the entire iPhone 16 lineup is stuck on 8 GB of RAM (and we know how memory-hungry these models can be), Android phones with at least 12 GB or 16 GB of RAM are becoming pretty much the norm now, especially in flagship territory. Even better in Android land, phones advertised as “gaming phones” with a whopping 24 GB of RAM (such as the ASUS ROG Phone 9 Pro or the RedMagic 10 Pro) may actually make for compelling pocket computers to run smaller, distilled versions of DeepSeek, Llama, or Mistral with better performance than current iPhones.

Interestingly, I keep going back to this quote from Mark Gurman’s latest report on Apple’s AI challenges:

There are also concerns internally that fixing Siri will require having more powerful AI models run on Apple’s devices. That could strain the hardware, meaning Apple either has to reduce its set of features or make the models run more slowly on current or older devices. It would also require upping the hardware capabilities of future products to make the features run at full strength.

Given Apple’s struggles, their preference for a hybrid on-device/server-based AI system, and the market’s evolution on Android, I don’t think Apple can afford to ship 8 GB on iPhones for much longer if they’re serious about AI and positioning their hardware as the best consumer-grade AI computers.

Permalink

Apple Delays Siri Personalization

Apple released a statement to John Gruber of Daring Fireball today announcing that it is delaying a “more personalized Siri.” According to Apple’s Jacqueline Roy:

Siri helps our users find what they need and get things done quickly, and in just the past six months, we’ve made Siri more conversational, introduced new features like type to Siri and product knowledge, and added an integration with ChatGPT. We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.

This isn’t surprising given where things stand with Siri and Apple Intelligence more generally, but it is still disappointing. Of all the features shown off at WWDC last year, the ability to have Siri take actions in multiple apps on your behalf through natural language requests was one of the most eagerly anticipated. But, I’d prefer to get a feature that works than one that is half-baked.

Still, you have to wonder where the rest of the AI market will be by the time a “more personalized Siri” is released and whether it will look as much like yesterday’s tech as some of today’s Apple Intelligence features do.

Permalink

YouTube Launches Premium Lite in the U.S.: A Limited But More Affordable Option

Today, YouTube introduced a new version of its premium service called Premium Lite. YouTube Premium has been around for a long time, offering ad-free video viewing, downloads, and the ability to listen in the background for $13.99/month in the U.S.

Source: YouTube.


With Lite, YouTube is offering a more affordable version of Premium for $7.99/month. The new tier will still include ads for music and music videos, and it won’t let you download videos or play them in the background; however, other video categories will be ad-free.

It’s good to see YouTube offer something at a lower price point. YouTube Premium’s price has crept up in recent years, and the features that Lite leaves behind seem like natural break points in the service. For some users, YouTube is music, in which case, they’ll want the full Premium plan to avoid ads. For others like me, who use YouTube occasionally for music, $7.99 is a significantly better deal, though I’d prefer if background play were part of Lite, too.

Permalink

Apple Challenges UK’s Demand for iCloud ‘Back Door’

Yesterday, the Financial Times reported that Apple has filed a complaint against the UK government seeking to overturn a secret order demanding that it create “back door” access to iCloud. Apple has not commented on whether it received an order because doing so would violate UK law. Instead, the company announced that it would remove Advanced Data Protection, the feature that enables end-to-end iCloud encryption, from the devices of UK customers. However, that move did not end the dispute because the UK order reportedly applies not just to the iCloud accounts of UK citizens but also anyone outside the UK that British security services have a judicial warrant to investigate.

The Financial Times’ sources say that Apple has appealed the British government’s order to the Investigatory Powers Tribunal, a judicial body that handles disputes with UK security services. If accurate, the challenge is believed to be the first of its kind. The Financial Times further reports that a hearing on Apple’s challenge to the order may take place as early as this month, although it is unclear to what degree the hearing will be made public.

Permalink

How I’m Learning Japanese 14 Days In

Brendon Bigley, my co-host on NPC: Next Portable Console, is getting married soon and planning a honeymoon to Japan. Before leaving, he and his partner decided to learn Japanese together. In doing so, Brendon has done a ton of research, quizzed friends who have learned the language, and tried a long list of resources, which he’s published on Wavelengths.

I love the idea that one of the biggest steps in learning Japanese is to learn how to learn Japanese:

So there are about one million different ways to go from here, and because every person is different it means you’ll need to try a few different things to get going. Many people say the first step of learning Japanese is to learn how to learn Japanese, and I’d say that’s almost correct. The first step, once again, is to learn hiragana and katakana… second is learning how to learn Japanese.

The reason things get so wild here is that people will say that you need to start learning kanji, vocab, and grammar simultaneously and it’s not not true. Learning even the most basic grammar helps make sense of sentence structure, which enables you to discern kanji and vocab words in the context of real written language. Conversely, the more kanji and vocab you learn the easier it will become to intuitively parse new grammatical rules as they’re introduced.

Learning how to learn something is a crucial step to any new and complicated undertaking but often gets overlooked. That’s because, as Brendon points out, everyone’s path to expertise in anything is different. It pays to listen to the advice of people you trust, as he did, but it’s just as important to listen to yourself and understand how you learn.

Brendon’s story has great advice for learning anything, but in particular, it’s packed with resources for learning Japanese. There are Mac and iOS apps, web apps, Android apps, textbooks, and more. As someone who has a kid traveling around Tokyo and Kyoto right now, I immediately sent him the link. It’s a great one to file away if you’ve ever thought to yourself, “Maybe I’ll try to learn Japanese some day.”

Permalink

“Everyone Is Caught Up, Except for Apple”

Good post by Parker Ortolani (who’s blogging more frequently now; I recommend subscribing to his blog) on the new (and surprisingly good-looking?) Alexa+ and where Apple stands with Siri:

So here we are. Everyone is caught up, except for Apple. Siri may have a pretty glowing animation but it is not even remotely the same kind of personal assistant that these others are. Even the version of Siri shown at WWDC last year doesn’t appear to be quite as powerful as Alexa+. Who knows how good the app intents powered Siri will even be at the end of the day when it ships, after all according to reports it has been pushed back and looks like an increasingly difficult endeavor. I obviously want Siri to be great. It desperately needs improvement, not just to compete but to make using an iPhone an even better experience.

I continue to think that Apple has immense potential for Apple Intelligence and Siri if they get both to work right with their ecosystem. But at this point, I have to wonder if we’ll see GTA 6 before Siri gets any good.

Permalink