For the past couple of years, the tech industry has grappled with the consequences of people carrying a tiny computer with them all day long. When it comes to Apple, iOS devices have long had a Do Not Disturb setting and notifications can be adjusted, but over time, it became apparent that the existing tools were not enough.
Screen Time is Apple’s solution to the information gap about how we use our iOS devices. The new feature, which is found in the Settings app in iOS 12, provides a wide array of metrics that give an unforgiving and eye-opening look into exactly how you use your devices each day.
Screen Time is also the means for acting on that information. Users can impose restrictions on when and how they use their devices.
The same tools are available to parents through Apple’s Family Sharing feature. Although Screen Time for kids is complex in some respects and lacking in others, it’s an excellent first step. The feature may require a time investment to master, but it succeeds on two levels. First, because it works the same way for children as it does for individuals (a topic Federico will cover in his iOS 12 review), managing the feature for a child will be familiar to anyone who uses Screen Time themselves. Second, although I’d like to see Apple implement some age-appropriate default settings in places, on balance, I’m satisfied that the complexity of Screen Time is a fair trade-off for the customization that it enables.
Many app developers already provide privacy policies and are transparent about the information they collect from users and how they use it. Still, I’m glad to see privacy policies become a requirement, because for some apps it isn’t easy to track down how they use your data, and there have been too many instances in the recent past where an app was discovered using data in ways that users might not expect.
As first reported by The Wall Street Journal, Facebook has removed its Onavo Protect VPN app from the App Store after Apple said the app violated rules against data gathering. The app was acquired by Facebook in 2013 as part of its purchase of an Israeli company.
Onavo collected user data by monitoring network traffic, providing market intelligence to Facebook about the popularity and use of apps outside its own. TechCrunch reported on Onavo's data collection practices back in February. In June during WWDC, Apple introduced new App Review Guidelines addressing data harvesting, which struck some as a direct response to Onavo.
In a statement to The Verge, Facebook said:
“We’ve always been clear when people download Onavo about the information that is collected and how it is used,” said a Facebook spokesperson in a statement given to The Verge. “As a developer on Apple’s platform, we follow the rules they’ve put in place.”
It’s good to see Apple enforce App Review guidelines against companies of all sizes, though a little disappointing that it has taken so long.
Today, Apple unveiled a new data and privacy website to comply with the European Union’s GDPR legislation that goes into effect on May 25th. The site allows users to download a copy of, and correct, personally identifiable information associated with their Apple IDs, as well as deactivate or delete their accounts. Although the new download and deactivation options are only available in the EU, they will be rolling out to the rest of the world throughout the remainder of the year.
Mark Gurman and Stephanie Bodoni report for Bloomberg on an upcoming change that will make it easier for some users to access the personal data Apple stores on them:
The iPhone maker said it will update its web page for managing Apple IDs in coming months to let users download a copy of all their data stored with the company. The site will also let customers correct personal information, temporarily deactivate their account, and completely delete it. The new privacy controls are designed to comply with the European Union’s General Data Protection Regulation, which kicks in May 25, the Cupertino, California-based company said.
Apple’s new web-based privacy options will be released for users in Europe in early May, while the features will come to other countries later.
The report also notes that you could previously receive copies of your data, delete your account, and more by contacting Apple directly. So these options are not entirely new; they'll just be available on the web for the first time.
It's unclear which countries outside Europe will receive these features. Hopefully in light of the recent Facebook data debacle, Apple will find it important to make these options available to all its users, regardless of where those users reside.
DuckDuckGo, the popular search engine for privacy-conscious users, today launched major updates to its browser extension and mobile apps in an effort to grant users data protection no matter where they are on the web.
The browser extension – available for Safari, Chrome, and Firefox – joins the revamped DuckDuckGo app on iOS and Android in providing a set of privacy features that affect your full browsing experience. In addition to the existing private search feature DuckDuckGo is known for, the extension and app now offer built-in tracker network blocking, smarter encryption, and a Privacy Grade rating for sites you visit.
DuckDuckGo's privacy features work seamlessly in the background for those using the extension or mobile app. Any hidden trackers detected by DuckDuckGo will be blocked, and users will be able to see a full list of exactly what has been blocked. If a site offers an encrypted version but doesn't automatically send all users to it, DuckDuckGo will perform that routing itself.
The Privacy Grade ratings are an interesting feature designed to give users a quick, easy understanding of each site's privacy practices. Each site receives its grade based on several factors: whether it offers an encrypted connection; what tracker networks, if any, are detected, with extra weight for major tracker networks; and whether the site has published privacy practices that DuckDuckGo has vetted. Based on all of this information, each site receives a privacy grade ranging from A to F. Where applicable, a site also receives an 'enhanced grade,' meaning the grade for the site after DuckDuckGo has deployed its blocking technology. Sites can only receive a perfect 'A' grade if no trackers were detected and the site's privacy policies have been reviewed by DuckDuckGo.
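To make the grading factors concrete, here's a minimal, hypothetical sketch in Python of how such signals might combine into a letter grade. The function name, weights, and thresholds are all invented for illustration; DuckDuckGo's actual scoring model is its own and more nuanced.

```python
def privacy_grade(https: bool, major_trackers: int, minor_trackers: int,
                  vetted_policy: bool) -> str:
    """Hypothetical letter grade from the factors described above.

    Weights and thresholds are illustrative only; this is not
    DuckDuckGo's real algorithm.
    """
    score = 0
    score += 2 if https else 0          # encrypted connection available
    score -= 2 * major_trackers         # major tracker networks weigh more
    score -= 1 * minor_trackers         # other detected trackers
    score += 1 if vetted_policy else 0  # privacy policy vetted by DuckDuckGo

    # A perfect 'A' requires no trackers at all plus a vetted policy.
    if https and major_trackers == 0 and minor_trackers == 0 and vetted_policy:
        return 'A'
    if score >= 2:
        return 'B'
    if score >= 0:
        return 'C'
    if score >= -3:
        return 'D'
    return 'F'
```

For example, an encrypted site with no trackers and a vetted policy would earn an 'A', while an unencrypted site riddled with major tracker networks falls to an 'F'. The "enhanced grade" would simply be the same function re-run after subtracting the trackers DuckDuckGo blocks.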
I've been using DuckDuckGo as my primary search engine for nearly a year, and have had a great experience with it. It will be interesting to see what difference, if any, DuckDuckGo's vetting and grading of sites will make in shaping future privacy practices.
Alex Hern, reporting for The Guardian on the results of Safari's new Intelligent Tracking Prevention (ITP), launched last year with iOS 11 and macOS High Sierra (via John Gruber):
Internet advertising firms are losing hundreds of millions of dollars following the introduction of a new privacy feature from Apple that prevents users from being tracked around the web.
Advertising technology firm Criteo, one of the largest in the industry, says that the Intelligent Tracking Prevention (ITP) feature for Safari, which holds 15% of the global browser market, is likely to cut its 2018 revenue by more than a fifth compared to projections made before ITP was announced.
Here's how Apple officially describes ITP in Safari 11's documentation:
Added Intelligent Tracking Prevention, which updates the default cookie and website data policy to isolate and remove cookies and website data for sites with the ability to track users cross-site.
This isn't the first time ad companies have complained about Apple's protection of user privacy in Safari and stance against invasive cross-site tracking. In September, six trade groups claimed Apple was "sabotaging" the industry with a "unilateral and heavy-handed approach", to which Apple responded:
“Ad tracking technology has become so pervasive that it is possible for ad tracking companies to recreate the majority of a person’s web browsing history,” according to an Apple spokesperson. “This information is collected without permission and is used for ad re-targeting, which is how ads follow people around the Internet.”
"Unilateral" is exactly right: Apple should only care for the interests of users buying their devices, not those of third-party ad companies creepily tracking them around the web.
Cross-site tracking and ad targeting have gotten so out of hand over the past couple of years that they've become a regular topic among friends who don't follow tech news. "Why am I seeing an ad for something I was checking out two days ago on another site?" is a question I hear frequently, despite the existence of third-party ad blockers and Apple's own ITP in Safari. Personally, I think the more Apple can advance ITP, the better it is for the privacy of all iOS users.
In the most recent issue of Apple's Machine Learning Journal, titled "Learning with Privacy at Scale," the team working on differential privacy shares details on exactly how its systems work. While much of the article is highly technical in nature, it concludes by sharing results from several real-life applications. Regarding emoji:
The data shows many differences across keyboard locales. In Figure 6, we observe snapshots from two locales: English and French. Using this data, we can improve our predictive emoji QuickType across locales.
The referenced chart is featured above, showing the popularity of certain emoji in different parts of the world.
The results regarding QuickType words aren't presented in a chart, but the article does mention words in several specific categories that Apple has been able to learn about thanks to differential privacy.
The learned words for the English keyboard, for example, can be divided into multiple categories: abbreviations like wyd, wbu, idc; popular expressions like bruh, hun, bae, and tryna, seasonal or trending words like Mayweather, McGregor, Despacito, Moana, and Leia; and foreign words like dia, queso, aqui, and jai. Using the data, we are constantly updating our on-device lexicons to improve the keyboard experience.
Another category of words discovered are known words without the trailing e (lov or th) or w (kno). If users accidentally press the left-most prediction cell above the keyboard, which contains the literal string typed thus far, a space will be added to their current word instead of the character they intended to type. This is a key insight that we were able to learn due to our local differentially private algorithm.
Though the article doesn't mention it, presumably the latter example of accidentally tapped QuickType suggestions might lead Apple to adjust the sensitivity of touch targets around the 'e' key and the left-most prediction cell. It's interesting to consider what other unexpected lessons may be learned from differential privacy data.
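Apple's production system is considerably more sophisticated than anything that fits in a snippet, but the underlying principle of a local differentially private algorithm can be illustrated with classic randomized response: each device randomizes its own report before it ever leaves the phone, so the server can estimate aggregate frequencies (say, of an emoji) without learning any individual's true answer. The sketch below is purely illustrative and is not Apple's mechanism.

```python
import math
import random

def randomized_response(truth: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1);
    otherwise flip it, giving every individual report plausible
    deniability."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return truth if random.random() < p_truth else not truth

def estimate_frequency(reports, epsilon):
    """Invert the known noise rate to recover an unbiased estimate
    of the true fraction of True answers across all reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    # observed = p*f + (1-p)*(1-f)  =>  f = (observed + p - 1) / (2p - 1)
    return (observed + p - 1) / (2 * p - 1)

# Simulate 100,000 devices, 30% of which truly typed a given emoji.
random.seed(42)
reports = [randomized_response(random.random() < 0.30, epsilon=1.0)
           for _ in range(100_000)]
estimate = estimate_frequency(reports, epsilon=1.0)
```

The server's estimate lands close to the true 30% rate even though any single report is wrong roughly a quarter of the time at this privacy budget, which is the trade-off that lets Apple learn popular emoji and new words without collecting anyone's actual typing.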
The most timely update to the site is the inclusion of a Face ID security white paper. This 6-page PDF goes into impressive detail on how Face ID works, with an emphasis on privacy and several interesting details to be gleaned. One of the more notable tidbits is a comprehensive listing of when your passcode will be needed in place of Face ID. Passcode input will be required under the following circumstances:
• The device has just been turned on or restarted.
• The device hasn’t been unlocked for more than 48 hours.
• The passcode hasn’t been used to unlock the device in the last 156 hours (six and a half days) and Face ID has not unlocked the device in the last 4 hours.
• The device has received a remote lock command.
• After five unsuccessful attempts to match a face.
• After initiating power off/Emergency SOS by pressing and holding either volume button and the side button simultaneously for 2 seconds.
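The conditions above can be summarized as straightforward decision logic. Here's a hypothetical Python sketch of the rules as the white paper states them; the function and parameter names are invented for illustration and this is of course not Apple's implementation.

```python
def passcode_required(hours_since_unlock: float,
                      hours_since_passcode: float,
                      hours_since_faceid: float,
                      just_restarted: bool,
                      remote_lock: bool,
                      failed_face_matches: int,
                      power_off_initiated: bool) -> bool:
    """Return True when Face ID is unavailable and the passcode is
    required, per the conditions listed in Apple's Face ID white
    paper. (Illustrative only; parameter names are invented.)"""
    if just_restarted or remote_lock or power_off_initiated:
        return True
    if hours_since_unlock > 48:
        return True
    # Passcode unused for 156 hours AND Face ID unused for 4 hours.
    if hours_since_passcode > 156 and hours_since_faceid > 4:
        return True
    if failed_face_matches >= 5:
        return True
    return False
```

Note that the 156-hour rule only kicks in when both clocks have run out: a passcode entry resets one timer, while any successful Face ID unlock resets the other.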
Other sections of the document confirm that any third-party app currently supporting Touch ID will automatically support Face ID without any changes from users or developers. Face ID can also be used everywhere Touch ID can, such as authorizing purchases from the App Store and iTunes Store.