Today, Apple unveiled a new data and privacy website to comply with the European Union’s GDPR legislation that goes into effect on May 25th. The site allows users to obtain a copy of the personal data associated with their Apple IDs, correct that information, and deactivate or delete their accounts. Although the new data copy and deactivation options are initially only available in the EU, they will roll out to the rest of the world throughout the remainder of the year.
Mark Gurman and Stephanie Bodoni report for Bloomberg on an upcoming change that will make it easier for some users to access the personal data Apple stores on them:
The iPhone maker said it will update its web page for managing Apple IDs in coming months to let users download a copy of all their data stored with the company. The site will also let customers correct personal information, temporarily deactivate their account, and completely delete it. The new privacy controls are designed to comply with the European Union’s General Data Protection Regulation, which kicks in May 25, the Cupertino, California-based company said.
Apple’s new web-based privacy options will be released for users in Europe in early May, while the features will come to other countries later.
The report also notes that you could previously receive copies of your data, delete your account, and more by contacting Apple directly. So these options will not be entirely new; they'll just be available on the web for the first time.
It's unclear which countries outside Europe will receive these features. Hopefully, in light of the recent Facebook data debacle, Apple will find it important to make these options available to all its users, regardless of where those users reside.
DuckDuckGo, the popular search engine for privacy-conscious users, today launched major updates to its browser extension and mobile apps in an effort to grant users data protection no matter where they are on the web.
The browser extension – available for Safari, Chrome, and Firefox – joins the revamped DuckDuckGo app on iOS and Android in providing a set of privacy features that affect your full browsing experience. In addition to the existing private search feature DuckDuckGo is known for, the extension and app now offer built-in tracker network blocking, smarter encryption, and a Privacy Grade rating for sites you visit.
DuckDuckGo's privacy features work seamlessly in the background for those using the extension or mobile app. Any hidden trackers detected by DuckDuckGo will be blocked, and users will be able to see a full list of exactly what has been blocked. If a site offers an encrypted version but doesn't automatically send all users to it, DuckDuckGo will perform that routing itself.
The Privacy Grade ratings are an interesting feature designed to give users a quick, easy understanding of each site's privacy practices. Each site receives its grade based on several factors: whether it offers an encrypted connection; what tracker networks, if any, are detected, including major tracker networks; and whether the site has published privacy practices that DuckDuckGo has vetted. Based on all of this information, each site receives a privacy grade ranging from A to F. Where applicable, the site also receives an 'enhanced grade' – the grade for the site after DuckDuckGo has deployed its blocking technology. Sites can only receive a perfect 'A' grade if no trackers were detected and the site's privacy policies have been reviewed by DuckDuckGo.
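To make the factors concrete, here's a rough sketch of how such a grading scheme could be modeled. This is purely illustrative: the `SiteAudit` fields, the point weights, and the letter-grade cutoffs are my own assumptions, not DuckDuckGo's actual (unpublished) algorithm.

```python
from dataclasses import dataclass

# Hypothetical audit inputs matching the factors described above;
# field names and weights are illustrative, not DuckDuckGo's.
@dataclass
class SiteAudit:
    has_encryption: bool          # does the site offer an encrypted connection?
    tracker_count: int            # number of tracker networks detected
    major_tracker_found: bool     # was a major tracker network among them?
    privacy_policy_vetted: bool   # privacy practices reviewed and vetted?

def privacy_grade(audit: SiteAudit, blocking_enabled: bool = False) -> str:
    """Map audit results to a letter grade from A to F.

    With blocking_enabled=True, detected trackers are treated as blocked,
    modeling the 'enhanced grade' after blocking technology is applied.
    """
    trackers = 0 if blocking_enabled else audit.tracker_count
    major = False if blocking_enabled else audit.major_tracker_found

    score = 0
    score += 2 if audit.has_encryption else 0
    score += 2 if trackers == 0 else (1 if not major else 0)
    score += 1 if audit.privacy_policy_vetted else 0

    # A perfect 'A' requires a full score: encryption, zero trackers,
    # and a vetted privacy policy.
    if score == 5:
        return "A"
    return {4: "B", 3: "C", 2: "D", 1: "F", 0: "F"}[score]
```

For example, a site served over HTTPS with three trackers (one of them major) and no vetted policy would grade poorly on its own, but its enhanced grade improves once blocking is applied: `privacy_grade(site)` versus `privacy_grade(site, blocking_enabled=True)`.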
I've been using DuckDuckGo as my primary search engine for nearly a year, and have had a great experience with it. It will be interesting to see what difference, if any, DuckDuckGo's vetting and grading of sites will make in shaping future privacy practices.
Internet advertising firms are losing hundreds of millions of dollars following the introduction of a new privacy feature from Apple that prevents users from being tracked around the web.
Advertising technology firm Criteo, one of the largest in the industry, says that the Intelligent Tracking Prevention (ITP) feature for Safari, which holds 15% of the global browser market, is likely to cut its 2018 revenue by more than a fifth compared to projections made before ITP was announced.
Here's how Apple officially describes ITP in Safari 11's documentation:
Added Intelligent Tracking Prevention, which updates the default cookie and website data policy to isolate and remove cookies and website data for sites with the ability to track users cross-site.
This isn't the first time ad companies have complained about Apple's protection of user privacy in Safari and stance against invasive cross-site tracking. In September, six trade groups claimed Apple was "sabotaging" the industry with a "unilateral and heavy-handed approach", to which Apple responded:
“Ad tracking technology has become so pervasive that it is possible for ad tracking companies to recreate the majority of a person’s web browsing history,” according to an Apple spokesperson. “This information is collected without permission and is used for ad re-targeting, which is how ads follow people around the Internet.”
"Unilateral" is exactly right: Apple should only care for the interests of users buying their devices, not those of third-party ad companies creepily tracking them around the web.
Cross-site tracking and ad targeting have gotten so out of hand over the past couple of years that they've become a regular topic among friends who don't follow tech news – "Why am I seeing an ad for something I was checking out two days ago on another site?" is a question I hear frequently despite the existence of third-party ad blockers and Apple's own ITP in Safari. Personally, I think the more Apple can advance ITP, the better it is for the privacy of all iOS users.
In the most recent issue of Apple's Machine Learning Journal, titled "Learning with Privacy at Scale," the team working on differential privacy shares details on exactly how its systems work. While much of the article is highly technical in nature, it concludes by sharing results from several real-life applications. Regarding emoji:
The data shows many differences across keyboard locales. In Figure 6, we observe snapshots from two locales: English and French. Using this data, we can improve our predictive emoji QuickType across locales.
The referenced chart is featured above, showing the popularity of certain emoji in different parts of the world.
The results regarding QuickType words aren't presented in a chart, but the article does mention words in several specific categories that Apple has been able to learn about thanks to differential privacy.
The learned words for the English keyboard, for example, can be divided into multiple categories: abbreviations like wyd, wbu, idc; popular expressions like bruh, hun, bae, and tryna, seasonal or trending words like Mayweather, McGregor, Despacito, Moana, and Leia; and foreign words like dia, queso, aqui, and jai. Using the data, we are constantly updating our on-device lexicons to improve the keyboard experience.
Another category of words discovered are known words without the trailing e (lov or th) or w (kno). If users accidentally press the left-most prediction cell above the keyboard, which contains the literal string typed thus far, a space will be added to their current word instead of the character they intended to type. This is a key insight that we were able to learn due to our local differentially private algorithm.
Though the article doesn't mention it, presumably the latter example of accidentally-tapped QuickType suggestions might lead to Apple adjusting sensitivity for its touch targets related to the 'e' button and the left-most prediction cell. It's interesting to consider what other unexpected lessons may be learned from differential privacy data.
The most timely update to the site is the inclusion of a Face ID security white paper. This 6-page PDF goes into impressive detail on how Face ID works, with an emphasis on privacy and several interesting details to be gleaned. One of the notable tidbits is a comprehensive listing of when your passcode will be needed in place of Face ID. Passcode input will be required under the following circumstances:
• The device has just been turned on or restarted.
• The device hasn’t been unlocked for more than 48 hours.
• The passcode hasn’t been used to unlock the device in the last 156 hours (six and a half days) and Face ID has not unlocked the device in the last 4 hours.
• The device has received a remote lock command.
• After five unsuccessful attempts to match a face.
• After initiating power off/Emergency SOS by pressing and holding either volume button and the side button simultaneously for 2 seconds.
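The white paper's list is essentially a set of conditions, any one of which forces passcode entry. As a rough sketch, it could be modeled like this; the `DeviceState` fields are hypothetical names I've chosen for illustration, not Apple's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

HOURS = 3600  # seconds per hour

# Hypothetical snapshot of device state; field names are illustrative.
@dataclass
class DeviceState:
    seconds_since_boot_unlock: Optional[float]  # None if not unlocked since boot/restart
    seconds_since_any_unlock: float
    seconds_since_passcode_unlock: float
    seconds_since_faceid_unlock: float
    remote_lock_received: bool
    failed_face_matches: int
    power_off_sos_initiated: bool

def passcode_required(s: DeviceState) -> bool:
    """True if any of the white paper's conditions forces passcode entry."""
    return (
        s.seconds_since_boot_unlock is None                 # just turned on or restarted
        or s.seconds_since_any_unlock > 48 * HOURS          # not unlocked in 48 hours
        or (s.seconds_since_passcode_unlock > 156 * HOURS   # no passcode unlock in 6.5 days...
            and s.seconds_since_faceid_unlock > 4 * HOURS)  # ...and no Face ID unlock in 4 hours
        or s.remote_lock_received                           # remote lock command received
        or s.failed_face_matches >= 5                       # five failed face matches
        or s.power_off_sos_initiated                        # power off / Emergency SOS initiated
    )
```

Note how the third condition pairs two timers: going a long stretch without the passcode only forces passcode entry if Face ID has also been idle for four hours, which keeps regular Face ID users from being interrupted.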
Other sections of the document confirm that any third-party app currently supporting Touch ID will automatically support Face ID without any user or developer changes. Also, Face ID can be used for all the same things as Touch ID, such as purchases from the App Store, iTunes Store, and more.
Ina Fried, writing for Recode, got more details from Apple on how the company will be collecting new data from iOS 10 devices using differential privacy.
First, it sounds like differential privacy will be applied to a few specific domains of data collection that are new in iOS 10:
As for what data is being collected, Apple says that differential privacy will initially be limited to four specific use cases: New words that users add to their local dictionaries, emojis typed by the user (so that Apple can suggest emoji replacements), deep links used inside apps (provided they are marked for public indexing) and lookup hints within notes.
As I tweeted earlier this week, crowdsourced deep link indexing was supposed to launch last year with iOS 9; Apple's documentation mysteriously changed before the September release, and it's clear now that the company decided to rewrite the feature with differential privacy behind the scenes. (I had a story about public indexing of deep links here.)
I'm also curious to know what Apple means by "emoji typed by the user": in the current beta of iOS 10, emoji are automatically suggested if the system finds a match, either in the QuickType bar or with the full-text replacement in Messages. There's no way to manually train emoji by "typing them". I wonder how Apple will tackle this – perhaps they'll look at which emoji are not suggested and need to be inserted manually by the user?
I wonder if the decision to make more data collection opt-in will make it less effective. If the whole idea of differential privacy is to glean insight without being able to trace data back to individuals, does it really have to be off by default? If differential privacy works as advertised, part of me thinks Apple should enable it without asking first for the benefit of their services; on the other hand, I'm not surprised Apple doesn't want to do it even if differential privacy makes it technically impossible to link any piece of data to an individual iOS user. In Apple's eyes, that would be morally wrong. This very contrast is what makes Apple's approach to services and data collection trickier (and, depending on your stance, more honest) than other companies'.
Also from the Recode article, this bit about object and scene recognition in the new Photos app:
Apple says it is not using iOS users’ cloud-stored photos to power the image recognition features in iOS 10, instead relying on other data sets to train its algorithms. (Apple hasn’t said what data it is using for that, other than to make clear it is not using its users’ photos.)
I've been thinking about this since the keynote: if Apple isn't looking at user photos, where do the original concepts of "mountains" and "beach" come from? How do they develop an understanding of new objects that are created in human history (say, a new model of a car, a new videogame console, a new kind of train)?
Apple said at the keynote that "it's easy to find photos on the Internet" (I'm paraphrasing). Occam's razor suggests they struck deals with various image search databases or stock footage companies to train their algorithms for iOS 10.
Andy Greenberg, writing for Wired, has a good explanation of differential privacy:
Differential privacy, translated from Apple-speak, is the statistical science of trying to learn as much as possible about a group while learning as little as possible about any individual in it. With differential privacy, Apple can collect and store its users’ data in a format that lets it glean useful notions about what people do, say, like and want. But it can’t extract anything about a single, specific one of those people that might represent a privacy violation. And neither, in theory, could hackers or intelligence agencies.
Differential privacy, Roth explains, seeks to mathematically prove that a certain form of data analysis can’t reveal anything about an individual—that the output of an algorithm remains identical with and without the input containing any given person’s private data. “You might do something more clever than the people before to anonymize your data set, but someone more clever than you might come around tomorrow and de-anonymize it,” says Roth. “Differential privacy, because it has a provable guarantee, breaks that loop. It’s future proof.”
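The "provable guarantee" Roth describes can be illustrated with the classic randomized-response mechanism, the simplest form of local differential privacy. To be clear, this is not Apple's actual system (which uses more elaborate sketching techniques); it's a minimal sketch of the underlying idea, with the `epsilon` privacy parameter controlling the noise:

```python
import math
import random

def randomized_response(truth: bool, epsilon: float = 1.0) -> bool:
    """Report the true bit with probability e^eps / (1 + e^eps),
    otherwise flip it. Each individual report is plausibly deniable,
    yet aggregate frequencies remain estimable."""
    p_truth = math.exp(epsilon) / (1 + math.exp(epsilon))
    return truth if random.random() < p_truth else not truth

def estimate_true_rate(reports, epsilon: float = 1.0) -> float:
    """Invert the known noise distribution to recover the population rate."""
    p = math.exp(epsilon) / (1 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    # observed = (1 - p) + true_rate * (2p - 1), so solve for true_rate:
    return (observed - (1 - p)) / (2 * p - 1)
```

Because every device randomizes its own answer before anything leaves the phone, even Apple (or a hacker with Apple's dataset) cannot say whether any single report was true. Yet across millions of reports the noise averages out, so the company can still learn which new words or emoji are popular.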
TIME’s Nancy Gibbs and Lev Grossman have published the full transcript of a Tim Cook interview that will be the subject of the magazine's March 28 cover story.
It's a lengthy interview, with Cook discussing a variety of issues related to the FBI's requests in the San Bernardino case. Cook comments on his views on encryption in the modern technological landscape, how the US Congress should approach this debate, and why Apple views the FBI's demands as a threat to civil liberties. It's a great read with some fantastic passages.
The thing that is different to me about Messages versus your banking institution is, the part of you doing business with the bank, they need to record what you deposited, what your withdrawals are, what your checks that have cleared. So they need all of this information. That content they need to possess, because they report it back to you.
That’s the business they’re in. Take the message. My business is not reading your messages. I don’t have a business doing that. And it’s against my values to do that. I don’t want to read your private stuff. So I’m just the guy toting your mail over. That’s what I’m doing. So if I’m expected to keep your messages, and everybody else’s, then there should be a law that says, you need to keep all of these.
Now I think that would be really bad. I think it would be really bad because in order for me to keep them, I have to have a way to see them. If I have to have a way to see them and a place to copy them, you can imagine—if you knew where the treasure was buried at, and everybody else did, then it puts a bull’s eye on that target. And in the world of cyber security, the last thing you want is to have a target painted on you.