iFixit ran more tests and took a closer look at the keyboard membrane added to the 2018 MacBook Pro released last week. It turns out the membrane is a single sheet of die-cut silicone with tiny cutouts that allow the keycaps to connect to the butterfly switches beneath.
To test how well the new design holds up against dust, iFixit sprayed a new MacBook Pro with a dusting of paint additive. They then tore apart the notebook and found:
Lo and behold, the dust is safely sequestered at the edges of the membrane, leaving the mechanism fairly sheltered. The holes in the membrane allow the keycap clips to pass through, but are covered by the cap itself, blocking dust ingress. The previous-gen butterfly keys are far less protected, and are almost immediately flooded with our glowing granules. On the 2018 keyboard, with the addition of more particulate and some aggressive typing, the dust eventually penetrates under the sheltered clips, and gets on top of the switch—so the ingress-proofing isn’t foolproof just yet. Time will tell how long the barrier will hold up.
iFixit followed up by testing with grittier sand, which managed to cause keys to stick.
It’s good to see that the 2018 MacBook Pro’s keyboard withstands fine particles better than earlier models in these tests. The real test, of course, is long-term use by actual humans. Only time will tell whether the 2018 MacBook Pros can hold up against a crumbly muffin in your local café or the pollen blowing across your keyboard as you browse the web in your backyard.
Despite Apple's message that the iPad Pro can be a viable PC replacement because, among other features, it natively supports a dedicated external keyboard, its software still isn't fully optimized for keyboard control. This isn't surprising at all: iOS was designed with multitouch in mind; as long as the iPad shares a common foundation with the iPhone, it'll always be first and foremost a touch computer. The iPad Pro line, however, is nearing its third anniversary, and its external keyboard integration still feels like an afterthought that's hard to reconcile with the company's marketing.
Take multitasking for example: after three years, Split View, one of the iPad's marquee exclusive features, still can't be controlled from an external keyboard. If you buy an iPad Pro with a Smart Keyboard and assume that you're going to be able to assign an app to a side of the Split View, or maybe resize it, or perhaps change the keyboard's focus from one side to another...well, do not assume. As much as Apple argued against vertical touch screen surfaces in laptops years ago, the iPad Pro ended up in this very situation: if you want to take advantage of all the great features iOS 11 offers to pro users, you will have to take your hands off the Smart Keyboard and touch the screen. There are dozens of similar instances elsewhere in iOS. For the most part, the iPad treats external keyboards as inferior, bolt-on input devices.
It's with this context in mind that I want to cover Things 3.6, a major update to the task manager's iPad version that gives us a glimpse into what Apple could do with external keyboard control on iPad if only they understood its potential.
I've been able to play around with Things 3.6 on my iPad Pro for the past couple of weeks. This isn't another "keyboard-centric" update that only adds a handful of shortcuts to trigger specific commands. Instead, the developers at Cultured Code have focused on an all-encompassing keyboard control framework for the whole app, from task lists to popovers and multiple selections. With version 3.6, Things has the best implementation of external keyboard support I've ever seen in an iPad app.
I've always liked Gboard, Google's alternative keyboard for iOS. Gboard combines Google's intelligence (with accurate and personalized autocorrect, emoji and GIF suggestions) with handy features such as glide typing. However, as I mentioned before, I couldn't use Gboard as my primary keyboard on the iPhone for a variety of reasons: it lacked iOS' native dictation mode, couldn't automatically switch light and dark themes depending on the context of the current app, and, worst of all, it didn't support multi-language typing without manually picking a different keyboard layout.
Siri dictation and simultaneous multi-language support are still missing (the former will likely never be added), but today's update is an important step towards making Gboard feel more integrated with iOS. Gboard can now switch its default theme between light and dark based on the app you're using. I'm not sure how Google pulled this off, but I tested Gboard with the dark modes in Apollo, Tweetbot, Copied, and Bear, and the keyboard always used the dark theme instead of the light version. Conversely, in Safari, App Store, and other light-themed apps, Gboard used the light mode instead.
This was one of my complaints about the old Gboard: its default light theme looked garish in dark apps; on the other hand, if you permanently enabled Gboard's dark theme, it would look out of place in apps like Messages or Mail. With automatic theme switching, Gboard's default appearance is no longer a concern because it adapts to the app you're using.
I've found a couple of apps where Gboard doesn't correctly apply the dark theme (Overcast is one of them), but I'm impressed overall; Gboard even switches to the dark theme when you swipe down on the Home screen to open Spotlight. I'd be curious to know which iOS API Google is using to implement this option, and if third-party developers can optimize for Gboard in any way.
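My guess is that Google is reading the host text field's `keyboardAppearance` trait (part of `UITextInputTraits`), which a keyboard extension can inspect through its `textDocumentProxy`. A minimal sketch of the switching logic, with the appearance modeled as a standalone enum so it runs outside UIKit (all names here are illustrative, not Gboard's actual code):

```swift
// Sketch of automatic keyboard theme switching, assuming the extension
// reads the host app's keyboardAppearance hint. Apps that declare dark
// text fields get the dark theme; apps with no hint get the user's pick.
enum KeyboardAppearance {
    case `default`, light, dark
}

func themeName(for appearance: KeyboardAppearance, userPreference: String) -> String {
    switch appearance {
    case .dark:
        return "dark"             // app declared dark text fields
    case .light:
        return "light"            // app explicitly requested light
    case .default:
        return userPreference     // no hint: fall back to the user's theme
    }
}

print(themeName(for: .dark, userPreference: "light"))    // dark
print(themeName(for: .default, userPreference: "light")) // light
```

If this is the mechanism, apps that don't set the trait on their dark-themed text fields (as Overcast seemingly doesn't) would explain the gaps I ran into.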
There’s been quite a stir over the last week regarding an area of the Apple cloud ecosystem where syncing has remained unreliable for years. Brian Stucki wrote a detailed post on the matter, which John Gruber linked to with echoing sentiments. Today, however, Gruber followed up on his post with an exciting update:
Good news related to yesterday’s item regarding the fact that text replacement shortcuts have never synced reliably between Macs or iOS devices: an Apple spokesperson emailed me to say they checked with the team, and an update that moves text replacement syncing to CloudKit should be rolling out to iOS 11 and MacOS 10.13 High Sierra users in the “next month or so”.
I use text replacement every day across my iOS devices, and haven’t dealt with syncing issues myself, but it’s great to hear that this long-standing problem should soon be put to rest for everyone else. Recent history shows that once a cloud product moves to CloudKit, syncing issues disappear almost entirely – hopefully that will be the case here as well.
One of the best third-party keyboard options on iOS just got better. The latest update for Gboard adds special integrations with two Google services: YouTube and Google Maps.
When using Gboard, tapping the G button will now present YouTube and Maps tabs alongside the standard Search option. Both new options present an assortment of suggestions when you first open them, along with the expected search function. YouTube’s suggestions appear to be a selection of top trending videos; in my testing Gboard wouldn’t show any videos personalized to me or my watch history, but that may change over time with more use. Maps shows an assortment of nearby locations, as well as your current location if you grant that permission. Making a selection inserts a link to the content into the text field, along with a brief description of what you’re linking to.
The market for third-party keyboards on iOS has largely grown stagnant, but Google continues to show its commitment to Gboard. While adding new features to a keyboard could clutter its interface before long, YouTube and Maps are natural fits for Gboard, integrating well both from a functionality standpoint and in their placement in the interface.
I've been carrying Studio Neat's new Canopy, a combination keyboard case and iPad stand, for about a week. It's the first time Apple's Smart Keyboard has been off my iPad Pro since I got it, but I haven't missed it at all. There are still certain situations where I prefer the Smart Keyboard, but I love having the option to work on my iPad with Apple's Magic Keyboard when it suits my needs. So, while I won't be switching to a Magic Keyboard/Canopy combination full-time, it's a choice I'm glad to have and one I will use frequently.
Since getting a 12.9-inch iPad Pro last year, I've had a fairly troubled relationship with external iPad keyboards. I didn't like the original Logitech CREATE keyboard case, and, surprised by the lack of notable Smart Connector-based accessories, I ended up using an Apple Magic Keyboard for the majority of 2016.
Good overview of the Touch Bar and its developer API by Benjamin Mayo:
Developers can display pretty much whatever they want whilst their app is in the foreground; this includes swapping out views and buttons depending on the current window of their app (a compose window necessitates different Touch Bar accessory views than the inbox window). However, the Touch Bar does not allow persistent widgets, status items or similar features like always-visible news tickers. These constraints are unlikely to be lifted either; Apple is imposing the restriction so that the UI under the user’s finger isn’t constantly changing due to spurious notifications or text messages.
Apple wants the bar to display peaceful relatively-static UI based on the current task. Major changes to the Bar should only happen when the application state drastically changes, such as opening a new tab or beginning a new modal activity. To repeat: once an app’s window is not active, it loses its control to influence what is shown on the Bar. The system Control Strip sits to the right in a collapsed state by default, but can be disabled entirely in System Preferences if desired.
This makes sense to me: the Touch Bar is intended to be an extension of the keyboard that deals with input – it's not a smaller Dashboard or a widget container. This means that apps like PCalc won't be able to persistently display their controls in the Touch Bar unless they're the frontmost (active) app.
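The per-window pattern Mayo describes maps onto AppKit's NSTouchBar API: a responder in the active window's chain vends a bar, and once that window resigns, the system stops consulting it. A minimal sketch under that assumption (the identifier, class name, and button are illustrative):

```swift
import AppKit

// A compose window vends its own Touch Bar; when this window is no
// longer active, AppKit stops consulting its responder chain and the
// bar reverts to whatever the newly active window (or system) provides.
class ComposeWindowController: NSWindowController, NSTouchBarDelegate {
    private let sendItem = NSTouchBarItem.Identifier("com.example.compose.send")

    override func makeTouchBar() -> NSTouchBar? {
        let bar = NSTouchBar()
        bar.delegate = self
        bar.defaultItemIdentifiers = [sendItem]
        return bar
    }

    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == sendItem else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSButton(title: "Send", target: self, action: #selector(send))
        return item
    }

    @objc private func send() { /* perform the window's primary action */ }
}
```

Because the bar is rebuilt from the active responder chain rather than registered globally, there's simply no hook for a background app like PCalc to keep its controls on screen.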
The more I think about it, the more the Touch Bar feels like the natural evolution of QuickType and the Shortcut Bar from iOS – to the point where I wonder if we'll ever get this kind of evolution on the iPad Pro as well (where the current app is always the frontmost one and system controls could use a faster way to be engaged than Control Center). Perhaps with a new external keyboard with its own embedded Touch Bar and T1 chip?
Google released a nice update to their iOS keyboard, Gboard, earlier today.
Cursor control can now be activated with 3D Touch, which is consistent with the behavior of Apple's keyboard. Gboard can't move the cursor freely on the screen like the system keyboard, though, which makes it more limited when it comes to swiping across multiple lines of text. Also, Google didn't implement haptic feedback when switching between contextual keyboard menus (such as holding down on the dash key), which is another detail that I appreciate in Apple's keyboard on the iPhone 7.
Similarly, Gboard now features Contacts integration to look up a person's contact card directly from the keyboard. It's not as tightly integrated as QuickType suggestions in iOS 10, but I prefer Gboard's presentation of contact cards, and I think Google's is a sweet solution as well.
Gboard is shaping up nicely, but I continue to wish Google paid more attention to the iPad layout and built true multilingual support for international users.