Posts tagged with "Privacy"

All the Little Details of How ‘Sign In with Apple’ Works

Sarah Perez of TechCrunch has assembled an excellent, in-depth walkthrough answering key questions about how Apple’s upcoming authentication service, Sign In with Apple, will work:

From a security perspective, Apple offers a better option for both users and developers alike compared with other social login systems which, in the past, have been afflicted by massive security and privacy breaches.

Apple’s system also ships with features that benefit iOS app developers — like built-in two-factor authentication support, anti-fraud detection and the ability to offer a one-touch, frictionless means of entry into their app, among other things.
[…]
Despite the advantages to the system, the news left many wondering how the new Sign In with Apple button would work, in practice, at a more detailed level. We’ve tried to answer some of the more burning and common questions.

Perez addresses questions regarding what information a developer receives when a user chooses Sign In with Apple, whether it’s possible to use the authentication service on Android devices, when an app will and won’t be required to use Sign In with Apple, and more.
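
To make that first question concrete, here is a minimal sketch of what the client side of the flow could look like using Apple’s AuthenticationServices framework; the class name and the handling of the optional fields are illustrative, not a definitive implementation. The key detail is that the user’s name and email arrive only on the first authorization, and the email may be a private relay address rather than the user’s real one:

```swift
import AuthenticationServices
import UIKit

// Illustrative view controller; assumes it is presented in a window.
final class SignInViewController: UIViewController, ASAuthorizationControllerDelegate,
                                  ASAuthorizationControllerPresentationContextProviding {

    // Start the Sign In with Apple flow, requesting the optional name and email scopes.
    func startSignInWithApple() {
        let request = ASAuthorizationAppleIDProvider().createRequest()
        request.requestedScopes = [.fullName, .email]

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.presentationContextProvider = self
        controller.performRequests()
    }

    // This is what a developer actually receives back from Apple.
    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        guard let credential = authorization.credential as? ASAuthorizationAppleIDCredential else { return }

        let userID = credential.user                  // stable, app-specific user identifier
        let name = credential.fullName                // only on first authorization, if the user shared it
        let email = credential.email                  // real or relay address; nil on later sign-ins
        let identityToken = credential.identityToken  // JWT to verify on your own server

        print("Signed in:", userID,
              name?.givenName ?? "(no name shared)",
              email ?? "(no email shared)",
              identityToken?.count ?? 0, "token bytes")
    }

    func authorizationController(controller: ASAuthorizationController, didCompleteWithError error: Error) {
        print("Sign In with Apple failed:", error.localizedDescription)
    }

    func presentationAnchor(for controller: ASAuthorizationController) -> ASPresentationAnchor {
        return view.window!
    }
}
```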

Despite some controversy over how strongly Apple is pushing this new secure login option, if it works as advertised, Sign In with Apple could be one of the upcoming OS features with the biggest long-term societal impact.


Privacy No Longer a Marketing Angle for Apple, It’s a Service

Darrell Etherington, writing for TechCrunch:

Apple’s truly transforming into a privacy-as-a-service company, which shows in the way that it’s implementing both the new single sign-on account service, as well as its camera and location services updates in iOS 13. The SSO play is especially clever, because it includes a mechanism that will allow developers to still have the relevant info they need to maintain a direct relationship with their users – provided users willingly sign up to have that relationship, by opting in to either or both name and email sharing.

For years, a major point of debate in tech circles has been the friction between privacy and convenience, particularly as it relates to web services offered by companies like Apple and Google. Apple’s privacy-sensitive approach has, in some people’s view, hamstrung the company, preventing it from matching the convenience of competing services from Google, Amazon, and others that rely on sending your data to the cloud for analysis.

This year at WWDC, Apple’s new privacy-focused initiatives seem to be striking more of a balance between convenience and security. The company’s new Sign in with Apple feature is a great example: it provides developers a way to contact their users directly, while still protecting those users’ actual email addresses so they can’t be sold to third parties. In my view that’s a brilliant win-win, and the type of innovation I hope we see more of in future products.
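
As a small, hypothetical illustration of that relay mechanism: when a user chooses to hide their email, the address handed to the developer comes from Apple’s relay domain rather than the user’s own provider, so an app could (if it wanted) note which of its users opted for the relay. The helper below is my own sketch; only the relay domain itself comes from Apple:

```swift
// Hypothetical helper: flags addresses that come from Apple's private email relay,
// i.e. the user chose to hide their real address when signing in.
func isPrivateRelayAddress(_ email: String) -> Bool {
    return email.lowercased().hasSuffix("@privaterelay.appleid.com")
}

// Mail sent to a relay address still reaches the user, but the address is
// app-specific, can be disabled by the user, and is worthless to sell on.
print(isPrivateRelayAddress("abc123xyz@privaterelay.appleid.com")) // true
print(isPrivateRelayAddress("jane@example.com"))                   // false
```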


You can also follow all of our WWDC coverage through our WWDC 2019 hub, or subscribe to the dedicated WWDC 2019 RSS feed.


Apple Illustrates iPhone Privacy with Real-World Analogs

Finding a way to convey the benefits of privacy isn’t easy, which is why I like Apple’s ‘Privacy on iPhone – Private Side’ video so much.

The video, which runs under a minute, opens with shots of ‘No Trespassing,’ ‘Keep Out,’ ‘Beware of Dog,’ and other warning signs. In a series of quick cuts, it shows two people pausing an intense conversation when interrupted by a waiter, as well as people locking file cabinets, closing blinds, locking doors, shredding documents, and more. Near the end, a woman rolls up her car window when she notices someone nearby watching her put on makeup.

As Apple’s description of the YouTube video says:

Your privacy matters. From encrypting your iMessage conversations, or not keeping a history of your routes in Maps, to limiting tracking across sites with Safari. iPhone is designed to protect your information.

Every clip of the video, which adds a bit of levity to an otherwise serious topic, reinforces the closing message that ‘If privacy matters in your life, it should matter to the phone your life is on.’

The video is an effective rebuttal of the ‘I have nothing to hide’ argument against privacy: even the mundane aspects of day-to-day life aren’t necessarily something you want to broadcast to the world, and the video conveys that point well.


Facebook Receives Retribution from Apple for Violation of Enterprise Program Guidelines

Facebook is in the news again, and unsurprisingly it’s not the good kind of publicity.

Yesterday Josh Constine of TechCrunch exposed a “Facebook Research” VPN that Facebook has been using to harvest extensive phone data from users aged 13 to 35 in exchange for payments of up to $20/month. The practice was made possible by Facebook’s enterprise developer certificate from Apple, but after the story came to light, Apple swiftly responded by revoking that certificate and publicly condemning Facebook’s misuse of the Enterprise Developer Program. That action immediately ended the Facebook Research initiative on Apple platforms, but it has also reportedly had widespread consequences across Facebook’s internal operations. Tom Warren and Jacob Kastrenakes, reporting for The Verge:

Apple has shut down Facebook’s ability to distribute internal iOS apps, from early releases of the Facebook app to basic tools like a lunch menu. A person familiar with the situation tells The Verge that early versions of Facebook, Instagram, Messenger, and other pre-release “dogfood” (beta) apps have stopped working, as have other employee apps, like one for transportation. Facebook is treating this as a critical problem internally, we’re told, as the affected apps simply don’t launch on employees’ phones anymore.
[…]
Revoking a certificate not only stops apps from being distributed on iOS, but it also stops apps from working. And because internal apps by the same organization or developer may be connected to a single certificate, it can lead to immense headaches like the one Facebook now finds itself in where a multitude of internal apps have been shut down.

This is more than a slap on the wrist, but it seems like a fitting response to Facebook’s blatant abuse of the Apple enterprise agreement. My main hope is that it causes Facebook to think twice before implementing any similarly shady initiatives in the future.


Tim Cook Calls for US Privacy Regulations in Time Op-Ed

User privacy is a drum Tim Cook has been beating consistently for years, and today he strikes it even harder in a new op-ed by Apple’s CEO published by Time. Cook writes:

I and others are calling on the U.S. Congress to pass comprehensive federal privacy legislation—a landmark package of reforms that protect and empower the consumer. Last year, before a global body of privacy regulators, I laid out four principles that I believe should guide legislation:

First, the right to have personal data minimized. Companies should challenge themselves to strip identifying information from customer data or avoid collecting it in the first place. Second, the right to knowledge—to know what data is being collected and why. Third, the right to access. Companies should make it easy for you to access, correct and delete your personal data. And fourth, the right to data security, without which trust is impossible.

In addition to outlining these four principles, Cook gets more specific, calling for the creation of a mechanism to counteract a “shadow economy that’s largely unchecked” in which people’s data is sold by retailers and other companies without their express knowledge or consent. He writes:

Meaningful, comprehensive federal privacy legislation should not only aim to put consumers in control of their data, it should also shine a light on actors trafficking in your data behind the scenes. Some state laws are looking to accomplish just that, but right now there is no federal standard protecting Americans from these practices. That’s why we believe the Federal Trade Commission should establish a data-broker clearinghouse, requiring all data brokers to register, enabling consumers to track the transactions that have bundled and sold their data from place to place, and giving users the power to delete their data on demand, freely, easily and online, once and for all.

Apple has established a consistent practice of standing for user privacy, partly owing to its highly publicized standoff with the FBI in 2016, but it seems that in 2019 the company wants that value to be even more pronounced. First there was the unavoidable banner at CES touting the iPhone’s privacy advantage, and now today’s op-ed. It will be interesting to see if any of the ideas Cook pushes bring about productive discussion on this issue, leading to practical change in US policy.


Tim Cook Calls for Strong Privacy Protections

On Wednesday, Apple CEO Tim Cook spoke at the International Conference of Data Protection and Privacy Commissioners in Brussels. Cook, who has stated many times that Apple believes privacy is a fundamental human right, called for federal privacy legislation. As transcribed in Ars Technica’s post on the speech, Tim Cook said:

We at Apple are in full support of a comprehensive federal privacy law in the United States. There, and everywhere, it should be rooted in four essential rights: First, the right to have personal data minimized. Companies should challenge themselves to de-identify customer data—or not to collect it in the first place.

Second, the right to knowledge. Users should always know what data is being collected and what it is being collected for. This is the only way to empower users to decide what collection is legitimate and what isn’t. Anything less is a sham.

Third, the right to access. Companies should recognize that data belongs to users, and we should all make it easy for users to get a copy of, correct, and delete their personal data. And fourth, the right to security. Security is foundational to trust and all other privacy rights.

Cook also commended the European Union on its General Data Protection Regulation, which went into effect earlier this year.

The importance of privacy to Apple is reflected on its website, during public events, and elsewhere. As the amount of data collected about everyone increases and the methods for building sophisticated profiles of people from that data become more advanced, giving consumers the tools to make informed decisions about what they share, and to control that data, has become increasingly important. These aren’t issues that Apple can solve on its own, but it is one of the largest global technology companies, and it’s heartening to see it take a proactive stance on privacy.


Apple Strongly Refutes Bloomberg Report That Its Servers Were Compromised by Malicious Chips

Earlier today, Bloomberg published a story claiming that Apple and Amazon discovered tiny, malicious chips on Elemental network servers built by Super Micro. According to the story, the chips were the work of Chinese spies and designed to infiltrate the tech companies’ networks. Shortly after publication, Apple responded in an email statement strongly refuting Bloomberg’s account.

Amazon’s chief information security officer similarly disputed the claims, saying in part:

There are so many inaccuracies in this article as it relates to Amazon that they’re hard to count.

A short time ago, Apple elaborated on its initial statement to Bloomberg on its Newsroom website:

In response to Bloomberg’s latest version of the narrative, we present the following facts: Siri and Topsy never shared servers; Siri has never been deployed on servers sold to us by Super Micro; and Topsy data was limited to approximately 2,000 Super Micro servers, not 7,000. None of those servers have ever been found to hold malicious chips.

Topsy is a startup that Apple acquired in 2013.

For over 12 months, Apple says it repeatedly told Bloomberg reporters and editors that they and their sources were incorrect.

We are deeply disappointed that in their dealings with us, Bloomberg’s reporters have not been open to the possibility that they or their sources might be wrong or misinformed. Our best guess is that they are confusing their story with a previously-reported 2016 incident in which we discovered an infected driver on a single Super Micro server in one of our labs. That one-time event was determined to be accidental and not a targeted attack against Apple.

Security and privacy are cornerstones of Apple’s business and key ways the company differentiates its products from competitors’, so it’s no surprise that Apple takes this sort of claim seriously. This also isn’t the first time Apple has taken Bloomberg to task over the veracity of its reporting. However, the forcefulness of the responses from Apple and Amazon, followed by Apple’s press release on its Newsroom site, is unprecedented. It will be interesting to see whether Bloomberg responds.


Screen Time for Kids Puts Powerful iOS Monitoring Tools and Use Restrictions in Parents’ Hands

For the past couple of years, the tech industry has grappled with the consequences of people carrying a tiny computer with them all day long. When it comes to Apple, iOS devices have long had a Do Not Disturb setting and notifications can be adjusted, but over time, it became apparent that the existing tools were not enough.

Screen Time is Apple’s solution to the information gap about how we use our iOS devices. The new feature, which is found in the Settings app in iOS 12, provides a wide array of metrics that give an unforgiving and eye-opening look into exactly how you use your devices each day.

Screen Time is also the means for acting on that information. Users can impose restrictions on when and how they use their devices.

The same tools are available to parents through Apple’s Family Sharing feature. Although Screen Time for kids is complex in some respects and lacking in others, it’s an excellent first step. The feature may require a time investment to master, but it succeeds on two levels. First, because Screen Time for kids works the same way it does for individuals (which Federico will cover in his iOS 12 review), managing the feature for a child will be familiar to anyone who uses Screen Time themselves. Second, although I’d like to see Apple implement some age-appropriate default settings in places, on balance I’m satisfied that the complexity of Screen Time is a fair trade-off for the customization it enables.

Read more


Privacy Policies Will Be Required for All Apps and Updates Beginning October 3rd

If you go to an app’s product page on any of Apple’s app stores, sometimes you’ll see a link to a privacy policy and sometimes you won’t. That’s because when iOS, macOS, or tvOS apps are submitted for beta testing or sale, developers have the option, but aren’t required, to link to a webpage with a privacy policy or, in the case of tvOS, to include the policy directly on the app’s product page, since the Apple TV doesn’t support web browsing. That will change soon.

As of October 3, 2018, any new app or update uploaded for beta testing or sale must include a privacy policy. Apple announced the change on its App Store Connect developer portal, which requires a developer account to access. Apple also noted in its announcement that a developer’s privacy policy link or text will only be editable when a developer submits a new version of their app for review.

Many app developers already provide privacy policies and are transparent about the information they collect from users and how they use it. Still, I’m glad to see privacy policies become a requirement, because for some apps it’s not easy to track down how they use your data, and there have been too many recent instances where an app was discovered using data in ways that users might not expect.