Finding a way to convey the benefits of privacy isn’t easy, which is why I like Apple’s ‘Privacy on iPhone – Private Side’ video so much.
The video, which runs under a minute, opens with images of several ‘No Trespassing,’ ‘Keep Out,’ ‘Beware of Dog,’ and other signs. In a series of quick cuts, the video shows two people who pause an intense conversation when interrupted by a waiter as well as people locking file cabinets, closing blinds, locking doors, shredding documents, and more. Near the end, a woman rolls up the window of a car when she sees someone nearby watching her put on makeup.
As Apple’s description of the YouTube video says:
Your privacy matters. From encrypting your iMessage conversations, or not keeping a history of your routes in Maps, to limiting tracking across sites with Safari. iPhone is designed to protect your information.
Every clip of the video, which adds a bit of levity to an otherwise serious topic, reinforces the closing message that ‘If privacy matters in your life, it should matter to the phone your life is on.’
The video is an effective rebuttal of the ‘I have nothing to hide’ argument against privacy. Even the mundane aspects of day-to-day life aren’t necessarily something you want to broadcast to the world, a point the video conveys well.
Facebook is in the news again, and unsurprisingly it's not the good kind of publicity.
Yesterday Josh Constine of TechCrunch exposed a "Facebook Research" VPN that Facebook has been using to harvest extensive phone data from users ages 13 to 35 in exchange for payments of up to $20/month. The practice was made possible by Facebook's enterprise developer certificate from Apple, but after the story came to light, Apple swiftly revoked that certificate and publicly condemned Facebook's misuse of its Enterprise Developer Program. That action immediately ended the Facebook Research initiative on Apple platforms, but it has also reportedly had widespread consequences throughout Facebook's company operations. Tom Warren and Jacob Kastrenakes, reporting for The Verge:
Apple has shut down Facebook’s ability to distribute internal iOS apps, from early releases of the Facebook app to basic tools like a lunch menu. A person familiar with the situation tells The Verge that early versions of Facebook, Instagram, Messenger, and other pre-release “dogfood” (beta) apps have stopped working, as have other employee apps, like one for transportation. Facebook is treating this as a critical problem internally, we’re told, as the affected apps simply don’t launch on employees’ phones anymore.
Revoking a certificate not only stops apps from being distributed on iOS, it also stops them from working. And because multiple internal apps from the same organization or developer may be signed with a single certificate, revocation can lead to immense headaches like the one Facebook now faces, with a multitude of its internal apps shut down at once.
This is more than a slap on the wrist, but it seems like a fitting response to Facebook's blatant abuse of the Apple enterprise agreement. My main hope is that it causes Facebook to think twice before implementing any similarly shady initiatives in the future.
User privacy is one of the social drums Tim Cook has been consistently beating for years now, and today that's continuing in an even stronger way with a new op-ed by Apple’s CEO published by Time. Cook writes:
I and others are calling on the U.S. Congress to pass comprehensive federal privacy legislation—a landmark package of reforms that protect and empower the consumer. Last year, before a global body of privacy regulators, I laid out four principles that I believe should guide legislation:
First, the right to have personal data minimized. Companies should challenge themselves to strip identifying information from customer data or avoid collecting it in the first place. Second, the right to knowledge—to know what data is being collected and why. Third, the right to access. Companies should make it easy for you to access, correct and delete your personal data. And fourth, the right to data security, without which trust is impossible.
In addition to outlining these four principles, Cook gets more specific in calling for a particular organization to be formed that counteracts a “shadow economy that’s largely unchecked” whereby people’s data is sold by retailers and other companies without express knowledge or consent. He writes:
Meaningful, comprehensive federal privacy legislation should not only aim to put consumers in control of their data, it should also shine a light on actors trafficking in your data behind the scenes. Some state laws are looking to accomplish just that, but right now there is no federal standard protecting Americans from these practices. That’s why we believe the Federal Trade Commission should establish a data-broker clearinghouse, requiring all data brokers to register, enabling consumers to track the transactions that have bundled and sold their data from place to place, and giving users the power to delete their data on demand, freely, easily and online, once and for all.
Apple has established a consistent practice of standing for user privacy, partly owing to its highly publicized standoff with the FBI in 2016, but it seems that in 2019 the company wants that value to be even more pronounced. First there was the unavoidable banner at CES touting the iPhone's privacy advantage, and now today's op-ed. It will be interesting to see if any of the ideas Cook pushes bring about productive discussion on this issue, leading to practical change in US policy.
On Wednesday, Apple CEO Tim Cook spoke at the International Conference of Data Protection and Privacy Commissioners in Brussels. Cook, who has stated many times that Apple believes privacy is a fundamental human right, called for federal privacy legislation. As transcribed in Ars Technica’s post on the speech, Tim Cook said:
We at Apple are in full support of a comprehensive federal privacy law in the United States. There, and everywhere, it should be rooted in four essential rights: First, the right to have personal data minimized. Companies should challenge themselves to de-identify customer data—or not to collect it in the first place.
Second, the right to knowledge. Users should always know what data is being collected and what it is being collected for. This is the only way to empower users to decide what collection is legitimate and what isn't. Anything less is a sham.
Third, the right to access. Companies should recognize that data belongs to users, and we should all make it easy for users to get a copy of, correct, and delete their personal data. And fourth, the right to security. Security is foundational to trust and all other privacy rights.
Cook also commended the European Union on its General Data Protection Regulation, which went into effect earlier this year.
The importance of privacy to Apple is reflected on its website, during public events, and elsewhere. As the amount of data collected about everyone increases and the methods for creating sophisticated profiles of people with that data become more advanced, providing consumers with the tools to make informed decisions about what they share, and to control that data, has become increasingly important. These aren’t issues that Apple can solve on its own, but as one of the largest global technology companies, it’s heartening to see the company taking a proactive stance on privacy.
Earlier today, Bloomberg published a story claiming that Apple and Amazon discovered tiny, malicious chips on Elemental network servers built by Super Micro. According to the story, the chips were the work of Chinese spies and designed to infiltrate the tech companies’ networks. Shortly after publication, Apple responded in an email statement strongly refuting Bloomberg’s account.
Amazon’s chief information security officer similarly discredited the claims, saying in part:
There are so many inaccuracies in this article as it relates to Amazon that they’re hard to count.
A short time ago, Apple elaborated on its initial statement to Bloomberg on its Newsroom website:
In response to Bloomberg’s latest version of the narrative, we present the following facts: Siri and Topsy never shared servers; Siri has never been deployed on servers sold to us by Super Micro; and Topsy data was limited to approximately 2,000 Super Micro servers, not 7,000. None of those servers have ever been found to hold malicious chips.
Topsy is a startup that Apple acquired in 2013.
For over 12 months, Apple says it repeatedly told Bloomberg reporters and editors that they and their sources were incorrect.
We are deeply disappointed that in their dealings with us, Bloomberg’s reporters have not been open to the possibility that they or their sources might be wrong or misinformed. Our best guess is that they are confusing their story with a previously-reported 2016 incident in which we discovered an infected driver on a single Super Micro server in one of our labs. That one-time event was determined to be accidental and not a targeted attack against Apple.
Security and privacy are cornerstones of Apple’s business that it uses to differentiate the company’s products from competitors’, so the fact that the company takes this sort of claim seriously isn’t unusual. This also isn’t the first time Apple has taken Bloomberg to task on the veracity of its reporting. However, the forcefulness of the responses from Apple and Amazon, followed by Apple’s press release on its Newsroom site, is unprecedented. It will be interesting to see whether Bloomberg responds.
For the past couple of years, the tech industry has grappled with the consequences of people carrying a tiny computer with them all day long. When it comes to Apple, iOS devices have long had a Do Not Disturb setting and notifications can be adjusted, but over time, it became apparent that the existing tools were not enough.
Screen Time is Apple’s solution to the information gap about how we use our iOS devices. The new feature, which is found in the Settings app in iOS 12, provides a wide array of metrics that give an unforgiving and eye-opening look into exactly how you use your devices each day.
Screen Time is also the means for acting on that information. Users can impose restrictions on when and how they use their devices.
The same tools are available to parents through Apple’s Family Sharing feature. Although Screen Time for kids is complex in some respects and lacking in others, it’s an excellent first step. The feature may require a time investment to master, but it succeeds on two levels. First, by working the same way it does for individuals, which Federico will cover in his iOS 12 review, managing the feature for a child will be familiar to anyone who uses Screen Time themselves. Second, although I’d like to see Apple implement some age-appropriate default settings in places, on balance, I’m satisfied that the complexity of Screen Time is a fair trade-off for the customization that it enables.
Many app developers already provide privacy policies and are transparent about the information they collect from users and how they use it. Still, I’m glad to see privacy policies become a requirement, because for some apps it’s not easy to track down how they use your data, and there have been too many instances in the recent past where an app was discovered using data in ways that users might not expect.
As first reported by The Wall Street Journal, Facebook has removed its Onavo Protect VPN app from the App Store after Apple said the app violated rules against data gathering. The app was acquired by Facebook in 2013 as part of its purchase of an Israeli company.
Onavo collected user data from network traffic to provide Facebook with market intelligence about the popularity and use of apps outside its own. TechCrunch reported on Onavo's data collection practices back in February. In June during WWDC, Apple introduced new App Review Guidelines addressing data harvesting, which struck some as a direct response to Onavo.
In a statement to The Verge, Facebook said:
“We’ve always been clear when people download Onavo about the information that is collected and how it is used,” said a Facebook spokesperson in a statement given to The Verge. “As a developer on Apple’s platform, we follow the rules they’ve put in place.”
It’s good to see Apple enforce App Review guidelines against companies of all sizes, though a little disappointing that it has taken so long.
Today, Apple unveiled a new data and privacy website to comply with the European Union’s GDPR legislation, which goes into effect on May 25th. The site allows users to copy and correct personally identifiable information associated with their Apple IDs and to deactivate or delete their accounts. Although the new copy and deactivation options are only available in the EU, they will roll out to the rest of the world over the remainder of the year.