Steffen Reich ran some tests to determine range differences between AirPods, W1-equipped Beats headphones, and older Beats models:
Much has been said about the virtues of the W1 chip Apple started baking into its latest wireless Beats line-up and, of course, the AirPods. By now we know for sure that W1 facilitates a much faster pairing process, and we know that the chip significantly improves both battery life and power efficiency. What’s less prominently talked about – at least officially – is the operating range of these wireless headphones and the presumed effect the W1 chip has had on that benchmark.
Obviously, walking a straight line in a park is no replacement for the kind of wireless interference you'd have on a train, in a crowded street, or in an office with walls and other Bluetooth devices nearby. Also, the AirPods are a new category altogether – I'm not sure how relevant a comparison to Bluetooth buds that aren't fully wireless can be.
However, these base results are in line with the excellent range I also experienced with the Beats Solo3, which makes me wonder how impressive (range-wise) future Studio Wireless headphones will be.
I keep wishing Apple would license the W1 chip to third parties – especially on large headphones, it makes pairing and range performance so much better than regular Bluetooth.
Ben Bajarin makes a strong point on using Siri with the AirPods:
There is, however, an important distinction to be made where I believe the Amazon Echo shows us a bit more of the voice-only interface and where I’d like to see Apple take Siri when it is embedded in devices without a screen, like the AirPods. You very quickly realize, the more you use Siri with the AirPods, how much the experience today assumes you have a screen in front of you. For example, if I use the AirPods to activate Siri and say, “What’s the latest news?” Siri will fetch the news then say, “Here is some news — take a look.” The experience assumes I want to use my screen (or it at least assumes I have a screen near me to look at) to read the news. Whereas, the Amazon Echo and Google Home just start reading the latest news headlines and tidbits. Similarly, when I activate Siri on the AirPods and say, “Play Christmas music”, the query processes and then plays. With the Echo, the same request prompts Alexa to say, “OK, playing Christmas music from top 50 Christmas songs.” When you aren’t looking at a screen, the feedback is important. If I make that same request while looking at my iPhone, Siri says “OK” on the screen as it processes the request, but not in my ear. In voice-only interfaces, we need and want feedback that the request is happening or has been acknowledged.
Siri already adapts to the way it's activated – it talks more when invoked via "Hey Siri", since it assumes you're not looking at the screen, and it relies on UI elements when triggered from the Home button.
Currently, activating Siri from the AirPods yields the same feedback as the "Hey Siri" method. I wonder if a future version of Siri will talk even more when it detects AirPods in your ears, since that means only you can hear its responses.
Apple revealed the AirPods wireless earbuds at its iPhone event in September. Initially, they were slated for release in October, but Apple subsequently announced that the product would be delayed.
This morning Apple updated its online store, opening orders for AirPods. I ordered a pair, and at least in the Chicago area they are set to arrive by December 21st.
In a statement to TechCrunch, Apple said that it is delaying the introduction of AirPods:
The early response to AirPods has been incredible. We don’t believe in shipping a product before it’s ready, and we need a little more time before AirPods are ready for our customers.
Beyond the statement to TechCrunch, Apple has given no indication of the reason for the delay. Perhaps we will hear more during the Apple event tomorrow.