Posts in stories
HomePod 11.4 Hands-On: AirPlay 2, Stereo Pairing, and Calendar Support
Apple’s HomePod received its first substantial update since launch yesterday, at last bringing the delayed AirPlay 2 and stereo pairing features to the public. It also came with a nice surprise in the form of Calendar support, the only new Siri domain added to the device thus far.
Consistent Accessibility: How Apple Can Make iOS and the Mac More Accessible
Like all Apple products, macOS is an accessible platform. Blind and low vision users can navigate their Mac using VoiceOver, while someone who has physical motor delays can use Switch Control to edit videos in Final Cut. And under the Accessibility pane in System Preferences, there is a multitude of other features one can use, ranging from Zoom to Invert Colors to closed-captioning and more. Whatever your need, the breadth and depth of Apple’s accessibility software spans many domains. This is why Apple is lauded as the industry leader in accessibility: the tools run deep and they’re well-designed.
Still, accessibility on macOS doesn’t quite reach feature parity with iOS. Amidst rumors that Apple is working on a cross-platform set of APIs to bridge the company’s two primary operating systems, now is an opportune time to consider what each platform does and what they offer one another.
In the context of accessibility, Apple brings consistency between iOS and macOS by sharing features and technologies between the two. As such, there are some iOS-first features macOS sorely needs, while the Mac offers things iOS would benefit from as well. Such enhancements would not only improve the user experience across devices, but would also make iOS and Mac software richer, fuller products overall. And most importantly, more accessible.
Second Life: Rethinking Myself Through Exercise, Mindfulness, and Gratitude
“There’s something in your latest scan that we need to double check.”
Here’s what I’ve learned about cancer as a survivor: even once you’re past it, and despite doctors’ reassurances that you should go back to your normal life, it never truly leaves you. It clings to the back of your mind and sits there, quietly. If you’re lucky, it doesn’t consume you, but it makes you more aware of your existence. The thought of it is like a fresh scar – a constant reminder of what happened. And even a simple sentence spoken with purposeful vagueness such as “We need to double check something” can cause that dreadful background presence to put your life on hold again.
The Power Mac G4 Line
The tower form factor may be a thing of the past, at least until the new Mac Pro shows up next year, but for years, if you needed the most powerful and flexible machine money could buy, the Power Mac was the only way to go.
For almost five years, the heart of the Power Mac was the PowerPC G4 chip. When it debuted in 1999 it was clocked at just 350 MHz, but by the time the Power Mac G4 line was retired, a tower with dual 1.42 GHz CPUs could be ordered. In that time frame, things like Gigabit Ethernet, SuperDrives, and Wi-Fi became mainstream.
The Power Mac G4 came in three distinct cases over the years it was available. Each style of machine saw several revisions while in service, bringing the total number of models to 10. That’s a lot of computers to cover, so let’s get started.
Apple’s Chicago Event Will Mark a Milestone in the Evolution of Its Education Strategy
Next Tuesday, Apple will take the stage at Lane Tech College Prep High School in Chicago to announce ‘creative new ideas for teachers and students.’ As any Apple event approaches, it’s natural to speculate about what products might be announced. After all, that’s what usually happens at an Apple event.
However, there’s a forest getting lost for the trees in all the talk about new hardware and apps. Sure, those will be part of the reveal, but Apple has already signaled that this event is different by telling the world it’s about education and holding it in Chicago. It’s part of a broader narrative that’s seen a shift in Apple’s education strategy that can be traced back to WWDC 2016. Consequently, to understand where Apple may be headed in the education market, it’s necessary to look to the past.
Erasing Complexity: The Comfort of Apple’s Ecosystem
Every year soon after WWDC, I install the beta of the upcoming version of iOS on my devices and embark on an experiment: I try to use Apple’s stock apps and services as much as possible for three months, then evaluate which ones have to be replaced with third-party alternatives after September. My reasoning for going through these repetitive stages on an annual basis is simple: to me, it’s the only way to build the first-hand knowledge necessary for my iOS reviews.
I’ve also spent the past couple of years testing and switching back and forth between non-Apple hardware and services. I think every Apple-focused writer should try to expose themselves to different tech products to avoid the perilous traps of preconceptions. Plus, besides the research-driven nature of my experiments, I often preferred third-party offerings to Apple’s because I felt they provided something Apple was not delivering.
Since the end of last year, however, I’ve witnessed a gradual shift that made me realize my relationship with Apple’s hardware and software has changed. I’ve progressively gone deeper into the Apple ecosystem, and I no longer feel underserved by aspects of it.
Probably for the first time since I started MacStories nine years ago, I feel comfortable using Apple’s services and hardware extensively not because I’ve given up on searching for third-party products, but because I’ve tried them all. And ultimately, none of them made me happier with my tech habits. It took me years of experiments (and a lot of money spent on gadgets and subscriptions) to notice how, for a variety of reasons, I found a healthy tech balance by consciously deciding to embrace the Apple ecosystem.
The Initial iPhone SDK
As noted by Craig Hockenberry, it has been a full decade since Apple shipped the first version of the iPhone SDK to developers.
It’s hard to remember today that, in the beginning, the iPhone didn’t have third-party apps. It came with a handful of built-in apps written by Apple for things like checking stocks and the weather, jotting down quick notes, making calendar events and reviewing contact information.
These apps were, for the most part, self-contained. The rich environment we enjoy on iOS today where apps can share lots of data with each other just wasn’t present in 2007.
The outlier in this paradigm was Safari, which put the Internet — or at least the parts that didn’t require Flash — in the palm of our hands.
What HomePod Should Become: A Hub for All Apple-Centric Needs
Today the HomePod is all about music, but it could be so much more.
From its debut last June at WWDC to launch day this February, HomePod’s primary purpose has been clear: it’s an Apple Music accessory. Music has been the sole focus of Apple’s marketing, including the recent Spike Jonze short film – yet it’s an angle many have trouble accepting.
In a pre-Amazon Echo world, HomePod being a great Apple Music speaker would have been enough. But in 2018 we expect more from smart speakers, and we expect more from Apple.
HomePod succeeds as a music speaker, but it’s not the device we expected – at least not yet. Because it arrived more than three years after the birth of Alexa, we expected a smarter, more capable product. We expected the kind of product the HomePod should be: a smart speaker that’s heavy on the smarts. Apple nailed certain aspects with its 1.0: the design, sound quality, and setup are all excellent. But that’s not enough.
HomePod isn’t a bad product today, but it could become a great one.
How?
By becoming a true hub for all our Apple-centric needs.
It’s Time for a Complete Home App Makeover
I’ve spent a lot of time experimenting with home automation gear since late last year: lightbulbs, light strips, electrical plugs, and door sensors, among other things. Most of the devices I’ve tested support Apple’s HomeKit APIs; those that don’t, I plan to integrate with Homebridge. That means I’ve also spent a lot of time in Apple’s Home app, and despite poking around in every corner of the app, I still find it frustrating and hard to use.
Last week, I tried to configure something new in my studio that illustrates several of the Home app’s problems. I have several Hue lights: there’s one over the steps leading to my workspace, four recessed ceiling lights, a floor lamp, and a desk lamp. It’s a lot of lights, but my studio is in the basement, and the lights help keep it from feeling like a cave.
I set the lights up as a group in Home and trigger them with a motion sensor. As soon as I open the door to the basement, the lights come on. I added a Hue switch on the wall as a shortcut for turning the lights on and off too.
I recently got an Elgato Eve door sensor for my back door. I figured it would be nice to know when my wife and son got home in the afternoon if I’m working away with the door closed, listening to music, and don’t hear the back door open. I can get alerts from the sensor on my Apple Watch and iPhone, but I thought it might be fun to also turn the Hue light on my desk red when the door opens as a visual cue. That turned out to be harder than I anticipated.
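For devices that lack native HomeKit support, Homebridge exposes them to the Home app through plugins declared in a JSON configuration file. As a rough sketch of what that setup involves, a minimal config file might look like the following — the bridge name, username, PIN, and the `ExamplePlatform` plugin are all placeholders, and the exact fields depend on which plugins you install:

```json
{
  "bridge": {
    "name": "Homebridge",
    "username": "0E:AA:4D:12:34:56",
    "port": 51826,
    "pin": "031-45-154"
  },
  "platforms": [
    {
      "platform": "ExamplePlatform",
      "name": "Basement Devices"
    }
  ]
}
```

Once Homebridge is running with a config like this, it appears in the Home app as a single bridge accessory, and the devices its plugins expose can then be grouped, automated, and triggered alongside native HomeKit gear.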