Posts in Linked

The MacBook Neo Takes on Retro Gaming

I love when my interests collide, and today, thanks to Russ Crandall, that’s exactly what happened. You see, Crandall runs Retro Game Corps, a YouTube channel covering the world of videogame emulation, handheld consoles, mini PCs, and more. It’s an excellent channel that we’ve covered multiple times on NPC: Next Portable Console, and yesterday, Crandall made a video exploring the MacBook Neo’s emulation capabilities.

It turns out that the Neo pulls its weight with more than productivity apps. It also does quite well with game emulation, some Steam titles, and streaming, with a couple of caveats.

Seeing is believing when it comes to emulation, so it’s worth checking how your favorite systems fare before diving in on the Neo yourself, but I was surprised by how well the Neo did even on systems as recent as the Nintendo Switch 1. Beyond the GameCube, it’s hit or miss which games will run well, but titles from older systems like the NES, Game Boy, GBA, SNES, PS1, PSP, 3DS, PS2, Dreamcast, and Saturn all ran well, in most cases at upscaled resolutions and with shaders applied.

Probably the biggest limitation Crandall ran into was when he tried running games from more recent systems on external storage over the Neo’s USB-C 2 port, while using the USB-C 3 port for a capture card. Games from more recent systems are larger, so for anyone who wants to stream their gameplay, the Neo’s 256GB or optional 512GB internal storage could be a limitation.

That said, I was pleased to see how well the MacBook Neo handled emulation. Paired with lighter-weight Steam games, streaming on services like GeForce NOW (thanks to the Neo’s Wi-Fi 6E), and the App Store’s own catalog of native games, the Neo offers a lot of options for your downtime too.

Apple Discontinues the Mac Pro

In a move that should surprise no one, Apple discontinued the Mac Pro today.

From Chance Miller at 9to5 Mac:

It’s the end of an era: Apple has confirmed to 9to5Mac that the Mac Pro is being discontinued. It has been removed from Apple’s website as of Thursday afternoon. The “buy” page on Apple’s website for the Mac Pro now redirects to the Mac’s homepage, where all references have been removed.

Apple has also confirmed to 9to5Mac that it has no plans to offer future Mac Pro hardware.

This change has felt inevitable ever since the introduction of the Mac Studio. Although the Mac Studio lacks the Mac Pro’s expansion slots, it packs a lot of performance into a much smaller package, meeting a wide variety of professional needs, which undoubtedly cut into the Mac Pro’s sales.

ETA Prime Answers the Question: What if the MacBook Neo Had Thermoelectric Cooling?

By all accounts, the MacBook Neo is a great budget laptop. However, because it has no active cooling system, the A18 Pro thermally throttles pretty quickly, degrading performance of games and other applications.

Thanks to how easy it is to take the Neo apart, you’ve probably already seen videos of users adding a thermal pad to transfer some of the heat from the Neo’s chip to its aluminum chassis. The result is a meaningful boost in performance.

ETA Prime decided to take things further with an external liquid-cooled solution that attaches magnetically to the Neo to see if even more performance could be squeezed out of the computer. The first step was to add a custom copper plate and thermal pad layer that transfers heat from the A18 Pro to the Neo’s case, which, like other thermal pad solutions, made a meaningful difference in game performance. Next, ETA Prime attached a thermoelectric cooling device typically used for gaming on mobile phones to the bottom of the Neo to keep the A18 Pro’s operating temperature consistently below the throttling threshold.

The results were substantial. It’s worth watching the full video, but a game like No Man’s Sky, which throttled quickly and ran at around 30fps on the Neo, ran at around 60fps with ETA Prime’s mod. While a thermoelectric cooling device goes beyond what a typical user would be willing to do, it does make you wonder what the Neo would be capable of with a more robust built-in cooling system.

A Developer’s Month with OpenAI’s Codex

An eye-opening story from Steve Troughton-Smith, who tested Codex for a month and ended up rewriting a bunch of his apps and shipping versions for Windows and Android:

I spent one month battle-testing Codex 5.3, the latest model from OpenAI, with task after task, since I was already paying for the $20 ChatGPT Plus plan and had access to it at no additional cost. It didn’t just blow away my expectations, it showed me the world has changed: we’ve just undergone a permanent, irreversible abstraction level shift. I think it will be nigh-impossible to convince somebody who grows up with this stuff that they should ever drop down and write code the old way, like we do, akin to trying to convince the average Swift developer to use assembly language.

From his conclusion:

This story is unfinished; this feels like a first foray into what software development will look like for the rest of my life. Transitioning from the instrument player to the conductor of the orchestra. I can acknowledge that this is both incredibly exciting, and deeply terrifying.

I have perused the source code of some of these projects, especially during the first few days. But very quickly I learned there’s simply nothing gained from that. Code is trivial, implementations are ephemeral, and something like Codex can chew through and rewrite a thousand lines of code in a second. Eventually, I just trusted it. Granted, I almost always had a handwritten source of truth, as detailed a spec as any, so it had patterns and structure to follow.

The models are good now. A year ago, none of them could do any of this, certainly not to this quality level. But they don’t do it alone. A ton of work went into everything here, just a different kind of work to before. Above all, what mattered most in all of the above examples was taste. My taste, the human touch. I fear for the companies, oblivious to this, that trade their priceless human resources for OpenClaw nodes in a box.

The entire story is well-documented, rich in screenshots, and full of practical details for developers who may want to attempt a similar experiment.

It’s undeniable that programming is undergoing a massive shift that has possibly already changed the profession forever. Knowing what code is and does is still essential; writing it by hand does not seem to be anymore. And it sounds like the developers who are embracing this shift are happier than ever.

I’ve been thinking about this a lot: why are some of us okay with the concept of AI displacing humans in writing code, but not so much when it comes to, say, writing prose or music? I certainly wouldn’t want AI to replace me writing this, and I absolutely cannot stand the whole concept of “AI music” (here’s a great Rick Beato video on the matter). I don’t think I have a good answer to this, but the closest I can get is: code was always a means to an end – an abstraction layer to get to the actual user experience of a digital artifact. It just so happened that humans created it and had to learn it first. With text and storytelling, the raw material is the art form itself: what you read is the experience itself. But even then, what happens when the human-sourced art form gets augmented by AI in ways that increasingly blur the lines between what is real and artificial? What happens when a videogame gets enhanced by DLSS 5 or an article is a hybrid mesh of human- and AI-generated text? I don’t have answers to these questions.

I find what’s happening to software development so scary and fascinating at the same time: developers are reinventing themselves as “orchestrators” of tools and following new agentic engineering patterns. The results, like with Steve’s story, are out there and speak for themselves. I wish more people in our community were willing to have nuanced and pragmatic conversations about it rather than blindly taking sides.

MotionVFX Team Joins Apple

Source: MotionVFX.

Earlier today, MacRumors reported that MotionVFX was acquired by Apple. Based in Poland, MotionVFX has been a go-to resource for YouTubers and other creators for years with its highly regarded plugins, templates, and tools for Final Cut Pro, DaVinci Resolve, and other apps.

According to MotionVFX’s note:

For over 15 years, we’ve been on a mission to create world-class, visually inspiring content and effects for video editors. From the very beginning, we’ve been all about quality, ease of use, and great design. These are also the values that we admire most in Apple’s products, and we’re thrilled to be able to embrace them together.

This is exciting news for anyone who uses Final Cut Pro. My hope is that the acquisition will result in MotionVFX’s plugins making their way into Creator Studio and being extended to the iPad. For the Mac, that would add a lot of value to Creator Studio. For the iPad, it would add plugin support for the first time, a feature I expected Apple to have shipped by now.

“This Is Not The Computer For You”

I loved this essay by Sam Henri Gold on the MacBook Neo but, really, about where the “wrong” computer in your life can take you:

There is a certain kind of computer review that is really a permission slip. It tells you what you’re allowed to want. It locates you in a taxonomy — student, creative, professional, power user — and assigns you a product. It is helpful. It is responsible. It has very little interest in what you might become.

The MacBook Neo has attracted a lot of these reviews.

The consensus is reasonable: $599, A18 Pro, 8GB RAM, stripped-down I/O. A Chromebook killer, a first laptop, a sensible machine for sensible tasks. “If you are thinking about Xcode or Final Cut, this is not the computer for you.” The people saying this are not wrong. It is also not the point.

Nobody starts in the right place. You don’t begin with the correct tool and work sensibly within its constraints until you organically graduate to a more capable one. That is not how obsession works. Obsession works by taking whatever is available and pressing on it until it either breaks or reveals something. The machine’s limits become a map of the territory. You learn what computing actually costs by paying too much of it on hardware that can barely afford it.

(The MacBook Neo is a lovely computer that feels futuristic despite its specs. I was about to return mine, then decided to keep it because there’s something special about it. You can listen to the latest episode of Connected to hear my take on it.)

Sam’s story resonated with me because I’ve been there, not as a kid, but as a 24-year-old who needed to get work done from a hospital bed and chose to do so with an iPad. I stuck with it after that, despite a lot of people telling me it was the wrong computer for me.

Sometimes the “wrong” computer is the right obsession for you. You never know where that can take you. Go read Sam’s full story if you need a reminder of why specs don’t ultimately dictate someone’s creativity.

Apple Is Working on an AI Music Tagging System

Music Business Worldwide (via MacRumors) is reporting that Apple is rolling out Transparency Tags, a voluntary metadata system for identifying AI-generated content on Apple Music. Introduced by Apple in a newsletter sent to music industry partners, Transparency Tags is:

a system of disclosure labels that record labels and music distributors can begin applying to content delivered to Apple Music immediately, and will be required to use when delivering new content in [the] future.

According to Music Business Worldwide, the tagging system covers artwork, tracks, composition elements such as lyrics, and music videos. The publication quotes Apple’s newsletter as explaining that it views Transparency Tags as part of an initial effort toward giving the music industry what it needs to develop AI policies.

Although there are currently no consequences for failing to properly tag AI-generated music, Transparency Tags are a step in the right direction. The music industry and other creative industries are all grappling with how to deal with a flood of AI-generated content in a rapidly evolving environment. I don’t expect to see one approach sweep across industries any time soon, but it’s encouraging to see Apple taking a lead in pushing the conversation forward.

David Pogue Tells the Origin Story of the iPhone in Wired’s Excerpt of His Upcoming History of Apple

Source: Apple.

David Pogue’s Apple: The First 50 Years is set to be published in a week, but today, Wired has an excerpt about the origins of the iPhone. It’s a fascinating story that I’m sure most readers know, but Pogue’s excerpt contains a wealth of details and quotes from those who were there, and it does an excellent job of putting the events in context.

One of my favorite parts covers the prototyping of multi-touch years before the iPhone was released:

In late 2003, Apple commissioned FingerWorks to build a bigger version of their multi-touch pad: 12 x 9.5 inches, a better approximation of a computer screen’s size. Kerr’s team set up a test rig in the design studio of Infinite Loop 2. They mounted an LCD projector on a tripod, shining directly down onto the trackpad. They taped a sheet of white paper over it so that the projector’s image—generated by a nearby Power Mac—would be bright and clear. Then the fun began: developing ways to interact with the on-screen elements. You could slide a finger to move an icon in the projected image. You could spread two fingers apart to enlarge a map or a photo. Using both hands, you could tap, move, and stretch objects. It was magical.

If Wired’s excerpt today is any indication of the rest of the 600-page book, it should be a great read.

David Pogue’s Apple: The First 50 Years is set to be released on March 10, but it can be preordered now in the Apple Books app, from Amazon, and from other retailers.
