Proposed Epic–Google Deal Would Expand Access to the Play Store

Late yesterday, Epic Games and Google announced a settlement of their Google Play Store litigation that, subject to court approval, would open Google’s storefront more widely than ever before.

Like Apple’s revisions to its store in response to the European Commission, Google’s settlement is complex, but here are some of the highlights, as reported by Sean Hollister for The Verge:

  • Whereas the U.S. District Court’s injunction only applied to the U.S. Play Store, the settlement is global.
  • The settlement also runs through 2032, which extends beyond the three years ordered by the court.
  • Google has agreed to reduce its standard fee to 20% and, in some cases, to 9%, depending on the type of transaction.
  • Google will create a registration system to allow third-party storefronts.
  • Developer fees for using Play Billing, Google’s payment system, will be separated from the transaction fees.

As I said, though, there’s a lot more to the proposed settlement, which you can read in full in the PDF linked from The Verge’s story. It’s subject to court approval, but it does seem to reflect significant concessions by Google.

What does this mean for Apple and its App Store skirmishes with regulators around the world? Technically, nothing; in context, though, if the settlement is implemented, it should add to the pressure on Apple to open the App Store more widely in the U.S. and elsewhere.


Finding the Best Sleep Tracker

Earlier this year, Andrej Karpathy wrote an in-depth analysis of four sleep tracking methods that Federico recently recommended I read. I’m glad he did. Karpathy, an AI researcher who has worked at OpenAI and Tesla, took the kind of nerdy, data-driven approach that I love.

Over the course of two months, Karpathy plotted sleep tracking results from all four methods.

Karpathy got the best results from the Whoop band and Oura ring, but just as interesting was how the data correlated with how he felt after a good night’s sleep:

…my sleep scores correlate strongly with the quality of work I am able to do that day. When my score is low, I lack agency, I lack courage, I lack creativity, I’m simply tired. When my sleep score is high, I can power through anything. On my best days, I can sit down and work through 14 hours and barely notice the passage of time. It’s not subtle.

I recommend reading the entire post for all the details of how each tracking method compared on a variety of metrics. I’ve long been intrigued by the Whoop band and Oura ring as companions to the Apple Watch. There’s overlap between the devices, but Karpathy has planted a seed in my brain that may lead to my own multi-device experiments.


Max Weinbach on the M5’s Neural Accelerators

In addition to the M5 iPad Pro, which I reviewed earlier today, I also received an M5 MacBook Pro review unit from Apple last week. I really wanted to write a companion piece to my iPad Pro story about MLX and the M5’s Neural Accelerators; sadly, I couldn’t get the latest MLX branch to work on the MacBook Pro either.

However, Max Weinbach at Creative Strategies did, and shared some impressive results with the M5 and its GPU’s Neural Accelerators:

These dedicated neural accelerators in each core lead to that 4x speedup of compute! In compute heavy parts of LLMs, like the pre-fill stage (the processing that happens during the time to first token) this should lead to massive speed-ups in performance! The decode, generating each token, should be accelerated by the memory bandwidth improvements of the SoC.

Now, I would have loved to show this off! Unfortunately, full support for the Neural Accelerators isn’t in MLX yet. There is preliminary support, though! There will be an update later this year with full support, but that doesn’t mean we can’t test now! Unfortunately, I don’t have an M4 Mac on me (traveling at the moment) but what I was able to do was compare M5 performance before and after tensor core optimization! We’re seeing between a 3x and 4x speedup in prefill performance!

Looking at Max’s benchmarks with Qwen3 8B and a ~20,000-token prompt, there is indeed a 3.65x speedup in tokens/sec in the prefill stage – jumping from 158.2 tok/s to a remarkable 578.7 tok/s. This is why I’m very excited about the future of MLX for local inference on M5, and why I’m also looking forward to M5 Pro/M5 Max chipsets in future Mac models.
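Those figures are easy to sanity check: dividing the two quoted throughputs gives roughly a 3.66x speedup (the post truncates to 3.65x), which for a ~20,000-token prompt means the prefill stage drops from about two minutes to about 35 seconds before the first token appears. A quick back-of-the-envelope sketch, using only the numbers quoted above (the latency math is my own):

```python
# Back-of-the-envelope check of the prefill numbers quoted from Max
# Weinbach's M5 benchmarks (Qwen3 8B, ~20,000-token prompt). The two
# throughput figures come from the post; the latency math is my own.

PROMPT_TOKENS = 20_000
BEFORE_TOK_S = 158.2  # prefill throughput before Neural Accelerator support
AFTER_TOK_S = 578.7   # prefill throughput with preliminary support

speedup = AFTER_TOK_S / BEFORE_TOK_S
before_s = PROMPT_TOKENS / BEFORE_TOK_S  # seconds spent in prefill
after_s = PROMPT_TOKENS / AFTER_TOK_S

print(f"speedup: {speedup:.2f}x")                                 # 3.66x
print(f"time to first token: {before_s:.0f}s -> {after_s:.0f}s")  # 126s -> 35s
```

As Weinbach notes, prefill is the compute-heavy stage, which is why the GPU’s Neural Accelerators help there, while decode throughput scales mostly with the SoC’s memory bandwidth.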


Synology Drops Branded Drive Requirement

Brad Linder, writing for Liliputing:

Earlier this year Synology announced that you’d need to use Synology-branded hard drives in its 2025 line of “Plus” branded network-attached storage devices if you wanted full functionality. While you could theoretically use a non-Synology drive with the Synology DiskStation DS225+, DS425+, DS925+ and other models, you’d be unable to create data storage pools, or use volume deduplication.

As Linder reports, six months later, Synology has reversed course on a widely unpopular decision that many Mac and PC users viewed as an unnecessary way to lock them into overpriced drives. The change of direction was revealed in a Synology press release announcing DiskStation Manager 7.3, the OS that runs the company’s Plus line of NAS hardware.

This is great news for Mac users who felt betrayed by Synology’s previous announcement. However, as Linder also points out, it does not change the fact that the same “Plus” series of 2025 NAS hardware does not include hardware-accelerated transcoding of H.264 and HEVC video, which previous models supported.


Reports of Slide Over’s Death Were Greatly Exaggerated

Well, that didn’t take long.

In yesterday’s second developer beta of iPadOS 26.1, Apple restored the Slide Over functionality that was removed with the debut of the new windowing system in iPadOS 26.0 last month. Well…they sort of restored Slide Over, at least.

In my review of iPadOS 26, I wrote:

So in iPadOS 26, Apple decided to scrap Split View and Slide Over altogether, leaving users the choice between full-screen apps, a revamped Stage Manager, and the brand new windowed mode. At some level, I get it. Apple probably thinks that the functionality of Split View can be replicated with new windowing controls (as we’ll see, there are actual tiling options to split the screen into halves) and that most people who were using these two modes would be better served by the new multitasking system the company designed for iPadOS 26.

At the same time, though, I can’t help but feel that the removal of Slide Over is a misstep on Apple’s part. There’s really no great way to replicate the versatility of Slide Over with the iPad’s new windowing. Making a bunch of windows extra small and stacked on the side of the screen would require a lot of manual resizing and repositioning; at that point, you’re just using a worse version of classic windowing. I don’t know what Apple’s solution could have been here – particularly because, like I said above, the iPad did end up with too many multitasking systems to pick from. But the Mac also has several multitasking features, and people love the Mac, so maybe that’s fine, too?

Slide Over will be missed, but perhaps there’ll be a way for Apple to make it come back.

The unceremonious removal of Slide Over from iPadOS 26 was the most common comment I received from MacStories readers over the past month. I also saw a lot of posts on different subreddits from people who claimed they weren’t updating to iPadOS 26 so they wouldn’t lose Slide Over functionality. Perhaps Apple underestimated how much people loved and used Slide Over, or maybe – like I argued – they thought that multitasking and window resizing could replace it. In any case, Slide Over is back, but it’s slightly different from what it used to be.

The bad news first: the new Slide Over doesn’t support multiple apps in the Slide Over stack with their own dedicated app switcher. (This option was introduced in iPadOS 13.) So far, the new Slide Over is single-window only, and it works alongside iPadOS windowing to put one specific window in Slide Over mode. Any window can be moved into Slide Over, but only one Slide Over entity can exist at a time. From this perspective, Slide Over is different from full-screen: that mode also works alongside windowing, but multiple windows can be in their full-screen “spaces” at the same time.

On one hand, I hope that Apple can find a way to restore Slide Over’s former support for multiple apps. On the other, I feel like the “good news” part is the reason that will prevent the company from doing so. What I like about the new Slide Over implementation is that the window can be resized: you’re no longer constrained to using Slide Over in a “tall iPhone” layout, which is great. I like having the option to stretch out Music (which I’ve always used in Slide Over on iPad), and I also appreciate the glassy border that is displayed around the Slide Over window to easily differentiate it from regular windows. I feel, however, that since you can now resize the Slide Over window, also enabling support for multiple apps in Slide Over may get too confusing or complex to manage. Personally, now that I’ve tested it, I’d take a resizable single Slide Over window over multiple non-resizable apps in Slide Over.

Between improvements to local capture and even more keyboard shortcuts, it’s great (and reassuring) to see Apple iterate on iPadOS so quickly after last month’s major update. Remember when we used to wait two years for minor changes?


Apps in ChatGPT

OpenAI announced a lot of developer-related features at yesterday’s DevDay event, and as you can imagine, the most interesting one for me is the introduction of apps in ChatGPT. From the OpenAI blog:

Today we’re introducing a new generation of apps you can chat with, right inside ChatGPT. Developers can start building them today with the new Apps SDK, available in preview.

Apps in ChatGPT fit naturally into conversation. You can discover them when ChatGPT suggests one at the right time, or by calling them by name. Apps respond to natural language and include interactive interfaces you can use right in the chat.

And:

Developers can start building and testing apps today with the new Apps SDK preview, which we’re releasing as an open standard built on the Model Context Protocol⁠ (MCP). To start building, visit our documentation for guidelines and example apps, and then test your apps using Developer Mode in ChatGPT.

Also:

Later this year, we’ll launch apps to ChatGPT Business, Enterprise and Edu. We’ll also open submissions so developers can publish their apps in ChatGPT, and launch a dedicated directory where users can browse and search for them. Apps that meet the standards provided in our developer guidelines will be eligible to be listed, and those that meet higher design and functionality standards may be featured more prominently—both in the directory and in conversations.

Looks like we got the timing right with this week’s episode of AppStories about demystifying MCP and what it means to connect apps to LLMs. In the episode, I expressed my optimism for the potential of MCP and the idea of augmenting your favorite apps with the capabilities of LLMs. However, I also lamented how fragmented the MCP ecosystem is and how confusing it can be for users to wrap their heads around MCP “servers” and other obscure, developer-adjacent terminology.
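For readers wondering what an MCP “server” actually traffics in: MCP is layered on JSON-RPC 2.0, and a tool invocation is just a small JSON message. The sketch below shows the general shape of a `tools/call` request as I understand the spec; the `create_playlist` tool and its arguments are hypothetical, so treat this as an illustration rather than a reference.

```python
import json

# Illustrative shape of an MCP tool-call request, per my reading of the
# Model Context Protocol spec (MCP is built on JSON-RPC 2.0). The tool
# name "create_playlist" and its arguments are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_playlist",  # a tool the server advertised to the client
        "arguments": {"vibe": "post-hardcore", "track_count": 20},
    },
}

wire = json.dumps(request)   # this is what crosses the client/server transport
decoded = json.loads(wire)
print(decoded["method"])     # tools/call
```

OpenAI’s pitch is precisely that users never see this plumbing, only the app responding in the chat.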

In classic OpenAI fashion, their announcement of apps in ChatGPT aims to (almost) completely abstract the complexity of MCP from users. In one announcement, OpenAI addressed my two top complaints about MCP that I shared on AppStories: they revealed their own upcoming ecosystem of apps, and they’re going to make it simple to use.

Does that ring a bell? It’s impossible to tell right now if OpenAI’s bet to become a platform will be successful, but early signs are encouraging, and the company has the leverage of 800 million active users to convince third-party developers to jump on board. Just this morning, I asked ChatGPT to put together a custom Spotify playlist with bands that had a similar vibe to Moving Mountains in their Pneuma era, and after thinking for a few minutes, it worked. I did it from the ChatGPT web app and didn’t have to involve the App Store at all.

If I were Apple, I’d start growing increasingly concerned at the prospect of another company controlling the interactions between users and their favorite apps. As I argued on AppStories, my hope is that the rumored MCP framework allegedly being worked on by Apple is exactly that – a bridge (powered by App Intents) between App Store apps and LLMs that can serve as a stopgap until Apple gets their LLM act together. But that’s a story for another time.


Digital Foundry Documents the iPhone 17 Pro’s Major Gaming Gains

Yesterday, Digital Foundry published their review of the iPhone 17 Pro, which, as you’d expect, focuses on the Pro model’s gaming capabilities. Testing against the iPhone 13 Pro and iPhone 15 Pro with a series of benchmarks as well as real-world gameplay, Digital Foundry found significant improvements in several areas, including:

  • GPU performance;
  • Ray tracing capabilities;
  • Second-generation dynamic caching; and
  • Improved thermals.

Benchmarks are one thing, but Digital Foundry’s tests also revealed real-world benefits to the iPhone 17 Pro, such as:

  • a 60% performance increase over the iPhone 15 Pro in Resident Evil Village;
  • the elimination of freezing and big frame rate drops in Assassin’s Creed Mirage; and
  • more reliable 30 fps gameplay in Death Stranding.

Those are significant improvements in just two generations of the A-series chips.

What really caught my eye, though, was the impact of cooling. The iPhone 17 Pro uses a vapor chamber and aluminum chassis to help cool the device and keep its chips running at full speed longer. When Digital Foundry placed the 17 Pro on a desk fan, it became clear just how important that cooling is.

Running a benchmark stress test with air blowing across the back of the iPhone 17 Pro reduced the performance drop by roughly 50%. With those sorts of results, I expect external cooling solutions like this X5s telescopic controller and FX5 Cooler combo from GameSir to become popular accessories in the coming year, so of course, I’ve ordered one and will report back soon.


Apple Removes ICEBlock and Similar Apps From the App Store Under Pressure From the DOJ

Late yesterday, Apple pulled ICEBlock and similar apps from the App Store under pressure from the U.S. Department of Justice, according to Fox Business News. ICEBlock is a crowd-sourced app used to track the location of Immigration and Customs Enforcement agents.

We reached out to Apple today demanding they remove the ICEBlock app from their App Store — and Apple did so, [Attorney General Pam] Bondi said in a statement to Fox News Digital.

In a statement to Fox, Apple acknowledged pulling ICEBlock and similar apps, saying:

We created the App Store to be a safe and trusted place to discover apps. Based on information we’ve received from law enforcement about the safety risks associated with ICEBlock, we have removed it and similar apps from the App Store.

Joshua Aaron, the developer of ICEBlock, which briefly held the top spot in the App Store’s social networking category, told Fox Business News that the app has more than 1.1 million users and that he intends to fight its removal.


Halide and Kino Developers Review the iPhone 17 Pro’s Cameras

Source: Lux.


Earlier this week, I shared my early impressions of the iPhone 17 Pro Max and included a few galleries of photos I’d taken at each of the model’s standard zoom levels. I was impressed by the results, which made me all the more excited to learn how Apple pulled this off.

One of my favorite annual iPhone camera reviews is from the team at Lux, the makers of Halide and Kino. Their experience with the iPhone 17 Pro’s cameras was similar to mine, but with a lot of nerdy camera detail that I love. The overall conclusion of their testing in New York, Iceland, and London is that:

This is, without a doubt, a great back camera system. With all cameras at 48MP, your creative choices are tremendous. I find Apple’s quip of it being ‘like having eight lenses in your pocket’ a bit much, but it does genuinely feel like having at least 5 or 6: Macro, 0.5×, 1×, 2×, 4× and 8×.

The story covers every camera and each zoom level. Of the 2× zoom, Lux found that:

Shooting at 2× on iPhone 17 Pro did produce noticeably better shots; I believe this can be chalked up to significantly better processing for these ‘crop shots’. Many people think Apple is dishonest in calling this an ‘optical quality’ zoom, but it’s certainly not a regular digital zoom either. I am very content with it, and I was a serious doubter when it was introduced.

Lux’s highest praise was probably for the 8x zoom:

The overall experience of shooting a lens this long should not be this good. I’ve not seen it mentioned in reviews, but the matter of keeping a 200mm lens somehow steady and not an exercise in tremendous frustration is astonishing. Apple is using both its very best hardware stabilization on this camera and software stabilization, as seen in features like Action Mode.

There are loads of beautiful photos in the post and a lot more detail than I’ve quoted here. Be sure to read through the entire post because what Apple is doing with camera hardware and software is really quite remarkable.
