Posts tagged with "Privacy"

Apple Publishes ‘A Day in the Life of Your Data’ Case Study and Reveals That App Tracking Transparency Is Coming in the Spring

Today is Data Privacy Day, and to mark the occasion, Apple has published a case study titled ‘A Day in the Life of Your Data.’ In the accompanying press release, Craig Federighi, Apple’s senior vice president of Software Engineering, explains the company’s approach to privacy:

Privacy means peace of mind, it means security, and it means you are in the driver’s seat when it comes to your own data. Our goal is to create technology that keeps people’s information safe and protected. We believe privacy is a fundamental human right, and our teams work every day to embed it in everything we make.

Apple’s efforts to put its customers in control of their data are not new, but as those efforts have evolved and expanded, so have tensions with other tech industry titans like Facebook. The latest tension stems in part from the fact that, as part of the next iOS and iPadOS betas, Apple will begin testing a system that alerts users when an app wants to share the data it collects with other apps, websites, and companies. The most common way apps do this is with the Identifier for Advertisers, or IDFA, a unique code that identifies your device.

Users can already turn off IDFA-based tracking under ‘Tracking’ in the Privacy section of the Settings app, but that requires people to know the setting exists and find it. Apple’s new system works like other privacy flows throughout iOS: when an app that wants to track you is launched, an alert appears asking the user to grant or deny permission.
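
For developers, the prompt is surfaced through the AppTrackingTransparency framework Apple introduced alongside iOS 14. Here’s a minimal sketch of how an app might request permission before touching the IDFA; it assumes the app declares an NSUserTrackingUsageDescription string in its Info.plist, and the exact timing of the request is up to each developer:

    import AppTrackingTransparency
    import AdSupport

    // Minimal sketch: ask for tracking permission before reading the IDFA.
    // The system shows the alert once and remembers the user's choice.
    func requestTrackingPermission() {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // Permission granted: the IDFA is available.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("IDFA: \(idfa)")
            default:
                // Denied, restricted, or not determined: the IDFA
                // reads as all zeros.
                print("Tracking not authorized")
            }
        }
    }

Notably, if the user declines, the IDFA isn’t merely hidden; it’s zeroed out, so apps can’t quietly fall back to it.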

Facebook and others, whose advertising relies on aggregating data about users from multiple apps and websites and then tying it back to a specific individual, see this as a threat to their business models. Attempting to reframe the issue as one of economics, Facebook argues that the change will hurt small businesses that purchase targeted ads because those ads will no longer be as effective.

The new privacy feature, which Apple calls App Tracking Transparency, comes on the heels of the standardized privacy disclosures the company began requiring from developers with its Fall 2020 OS updates. For apps like Facebook, the disclosures are extensive, but to its credit, Facebook published its disclosures late last year, while Google still hasn’t.

Apple’s App Store privacy labels make it clear to users that third-party Twitter clients collect far less data than the official app, for example.

Apple’s case study is a day-in-the-life hypothetical that follows a father and daughter throughout their day together. The document is peppered with facts about tracking and data brokers, including a citation to a study that found the average app includes six trackers. Most useful, though, are the case study’s plain-English, practical examples of the kinds of tracking that can occur as you go about a typical day’s activities using apps.

It’s impossible to create a case study that users who aren’t security experts will understand without glossing over details and nuances inherent to privacy and tracking. However, for those who want to learn more, the study is extensively footnoted with citations that back up its statements, which I think strikes the right balance. I’ve had tracking turned off on my devices since the option first became available, and personally, I’m glad to see the feature will be surfaced for others who may not be aware it exists.


Apple and Privacy in 2020: Wide-Reaching Updates with Minimal User Intrusion

Privacy has increasingly become a competitive advantage for Apple. The bulk of the company’s revenue comes from hardware sales, in stark contrast to competitors like Google, which depend heavily on ad revenue and thus benefit tremendously from collecting user data. Apple calls privacy one of its core values, and the structure of its business makes it easier to hold true to that value. But that doesn’t mean its privacy work is easy or without cost – behind this year’s huge number of privacy enhancements surely lie significant effort and resources that could have been diverted elsewhere. The company’s privacy discourse isn’t empty marketing speak; it’s product-shaping. Not only that, but thanks to Apple’s enormous influence in tech, it can be industry-shaping too, forcing companies that otherwise may not prioritize user privacy to do business differently.

This year in its WWDC keynote, Apple dedicated an entire section of the presentation to privacy, detailing its latest efforts within the framework of what it calls its four privacy pillars:

  • On-device processing
  • Data minimization
  • Security protections
  • Transparency and control

Evidence of each of these pillars can be seen throughout much of what Apple announced during the rest of the keynote. On-device processing, for example, powers the new Translate app in iOS 14, HomeKit Secure Video’s face recognition feature, and more. New security protections have been implemented to warn you if a Keychain password’s been compromised, and to enable Sign In with Apple for existing in-app accounts, both of which make your accounts more secure. But the majority of this year’s most prominent privacy updates fell under the remaining two core pillars: data minimization and transparency and control.

Here are the privacy-focused changes you’ll see this fall across iOS and iPadOS 14 and macOS Big Sur.

Read more


John Giannandrea on the Broad Reach of Machine Learning in Apple’s Products

Today Samuel Axon at Ars Technica published a new interview with two Apple executives: SVP of Machine Learning and AI Strategy John Giannandrea and VP of Product Marketing Bob Borchers. The interview is lengthy yet well worth reading, especially since it’s the most we’ve heard from Apple’s head of ML and AI since he departed Google to join the company in 2018.

Based on some of the things Giannandrea says in the interview, it sounds like he’s had a very busy two years. For example, when asked to list ways Apple has used machine learning in its recent software and products, Giannandrea lists a variety of things before ultimately indicating that it’s harder to name things that don’t use machine learning than ones that do.

There’s a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we’ve released in the past around heart health and things like this. I think there are increasingly fewer and fewer places in iOS where we’re not using machine learning. It’s hard to find a part of the experience where you’re not doing some predictive [work].
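
The common thread in those features is inference that runs locally rather than on a server. As a purely illustrative sketch (the model and its input feature are hypothetical names, not Apple’s actual models), on-device prediction with Core ML looks roughly like this:

    import CoreML

    // Purely illustrative: load a compiled Core ML model bundled with
    // the app and run a prediction entirely on-device, no network needed.
    // "SleepClassifier" and its "heartRate" input are hypothetical names.
    func classifySleep(heartRate: Double) throws -> MLFeatureProvider {
        let url = Bundle.main.url(forResource: "SleepClassifier",
                                  withExtension: "mlmodelc")!
        let model = try MLModel(contentsOf: url)
        let input = try MLDictionaryFeatureProvider(
            dictionary: ["heartRate": heartRate])
        return try model.prediction(from: input)
    }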

One interesting tidbit mentioned by both Giannandrea and Borchers is that Apple’s increased dependence on machine learning hasn’t led to the company talking about ML non-stop. I’ve noticed this too – whereas a few years ago the company might have thrown out ‘machine learning’ countless times during a keynote presentation, these days it’s intentionally more careful and calculated about using the term, and I think for good reason. As Giannandrea puts it, “I think that this is the future of the computing devices that we have, is that they be smart, and that, that smart sort of disappear.” Borchers expounds on that idea:

This is clearly our approach, with everything that we do, which is, ‘Let’s focus on what the benefit is, not how you got there.’ And in the best cases, it becomes automagic. It disappears… and you just focus on what happened, as opposed to how it happened.

The full interview covers subjects like Apple’s Neural Engine, Apple Silicon for Macs, the benefits of handling ML tasks on-device, and much more, including a fun story from Giannandrea’s early days at Apple. You can read it here.

Permalink

Craig Federighi on Apple’s New Privacy Initiatives

Michael Grothaus at Fast Company interviewed Craig Federighi on a subject that was prominent in yesterday’s WWDC keynote: privacy. The interview begins with Federighi sharing how Apple considers the extended long-term impact of its decisions in this area:

“We hope to build a lot of great products that bring customers a lot of joy every year,” he says. “But in the fullness of time, in the scope of hundreds of years from now, I think the place where I hope people can look back and talk about the places where Apple made a huge contribution to humanity is in helping people see the way of taking advantage of this great technology without the false tradeoff of giving up their privacy to do it.”

Grothaus highlights several new privacy features throughout his piece, all of which will arrive this fall in Apple’s new batch of software releases. One especially interesting feature is called Approximate Location:

With this option, an app will never know the precise spot you’re at. Instead, it will learn the general area, which is often enough to provide the same level of service without intruding on your privacy to the same degree. To achieve the “approximate location” feature, Apple divided the entire planet into regions roughly 10 square miles in size. Each region has its own name and boundaries, and the area of the region is not based on a radius from the user–it’s fixed. That means that an app can’t extrapolate your precise location from approximate location data, because you aren’t necessarily at the center point of that approximate location boundary.
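
On the developer side, this surfaces through additions to Core Location in iOS 14. Here’s a rough sketch of how an app might detect that it has only approximate location and, if a feature genuinely needs precision, request it temporarily; the “NavigationPurpose” key is a hypothetical name that would have to appear in the app’s NSLocationTemporaryUsageDescriptionDictionary:

    import CoreLocation

    // Sketch: check whether the user granted only approximate location.
    // "NavigationPurpose" is a hypothetical Info.plist purpose key.
    func handleAccuracy(for manager: CLLocationManager) {
        switch manager.accuracyAuthorization {
        case .fullAccuracy:
            print("Precise location available")
        case .reducedAccuracy:
            // Ask for full accuracy for this session only; the user
            // can still decline.
            manager.requestTemporaryFullAccuracyAuthorization(
                withPurposeKey: "NavigationPurpose")
        @unknown default:
            break
        }
    }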

In a packed, fast-moving keynote, it was noteworthy that Apple dedicated an entire segment of the presentation to privacy. The company ships new privacy features annually, and considering its stated focus on future centuries, it shows no signs of letting up any time soon.


You can also follow all of our WWDC coverage through our WWDC 2020 hub, or subscribe to the dedicated WWDC 2020 RSS feed.

Permalink

Apple Reveals Major Update to Its Privacy Webpage

Privacy and everything it entails is not easy to explain. Under the hood, it’s driven by complex mathematics and code. In practice, though, app privacy starts with how apps are designed: some are designed to collect information about you, and others aren’t. With today’s update to its privacy page, Apple has created a site that explains how privacy drives the design of its apps in clear, concise language. And for anyone who wants to understand the nitty-gritty details, Apple has also published white papers and linked to other materials that take a closer look at the issues the main page addresses.

Apple’s Privacy webpage starts with a declaration of the company’s position on privacy:

Privacy is a fundamental human right. At Apple, it’s also one of our core values. Your devices are important to so many parts of your life. What you share from those experiences, and who you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information. It’s not always easy. But that’s the kind of innovation we believe in.

What follows is an app-by-app explanation of how each is designed to give users control over what they share and limit what Apple collects. Safari, Maps, Photos, Messages, Siri, News, Wallet, Health, Sign In with Apple, and the App Store are all covered with playful animations and a short explanation of what each does to protect your privacy.

Additional information about each app and its underlying technologies, plus iCloud, CarPlay, Home, education and children’s privacy, and other features follows under the Features tab. Included among the more detailed materials are white papers that go even deeper on Safari, Location Services, differential privacy, iOS security, and Face ID security. There is also a tech brief on Photos and links to additional materials about Ask Siri, Siri Suggestions, Apple Pay, ResearchKit and CareKit, Apple News, Apple Music, the Apple TV app, Apple Arcade, iCloud, Screen Time, Family Sharing, security in education, Apple’s student privacy pledge, and your data and privacy page. That’s a lot of information, but it’s presented in a thoughtful, compelling way that lets you go deeper if you want without being confusing or difficult to navigate.

The final new tab is called Control. Some of these tips and guides were available before, but the page has been updated with new practical suggestions on how you can make your Apple devices more secure. The page covers passcodes, Touch ID and Face ID, two-factor authentication, Find My, the alerts that explain the information third-party apps request, using your data and privacy page, advertising, analytics, and more. It’s an excellent place for users looking for ways to take command of the security of their devices.

I know that a lot of MacStories readers care deeply about their privacy and the security of their devices, and many are aware of at least some of what’s covered on Apple’s privacy page. It’s still worth a visit, though, because privacy and security touch so much of what Apple does now that I expect the page holds at least a few tidbits that will be new to everyone. It’s also a great page to share with family members and friends who may not be as aware of the privacy issues related to their devices.


Apple Announces Changes to Siri Grading Program

Earlier this month, Apple suspended its Siri grading program, in which third-party contractors listened to small snippets of audio to evaluate Siri’s effectiveness. Today in a press release, Apple explained its Siri grading program and changes the company is making:

We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.

Apologizing for not living up to the privacy standards customers expect from it, Apple outlined three changes that will be implemented this fall when operating system updates are released:

First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

This is a sensible plan. It’s clear, concise, and has the benefit of being verifiable once implemented. It’s unfortunate that Siri recordings were being handled this way in the first place, but I appreciate the plain-English response and unambiguous plan for the future.


Sign In with Apple: Goodbye Account Management

I love trying new apps and services. It may be part of my job at MacStories, but even if it weren’t, I would still constantly be on the lookout for interesting, creative products that can benefit either my work or leisure. In recent years it seems like there’s always a fresh stream of apps and services to check out. Often when I try something new, however, I’m immediately confronted with the obstacle of a login screen. At which point there’s a choice to make: do I go through the hassle of creating an account for this service, or – if the option is available – do I simply authenticate via a third party like Google or Facebook? Sadly, neither option is ideal.

Creating a new account for every service you try is a major pain. It’s made easier with the aid of iCloud Keychain and 1Password, but while those tools eliminate lots of friction, they can be a little clunky, and in the end you’re still trusting your data to the (usually unknown) privacy policies of the service you sign up for.

Third-party login buttons mostly solve the convenience problem. They may require entering your credentials for that third-party service, but at least you don’t have to create and remember new credentials for multiple services. The data privacy issue can be a question mark with these buttons, though: when you authenticate through, say, Facebook, do you really know exactly what data you’re sharing with the new service? Or how the service will use that data? As consumers continue losing trust in Facebook itself to secure their data, why would they trust a service that taps into their Facebook data?

Sign In with Apple is a modern alternative to the current mess of login methods, offering Apple users a solution that addresses the current options’ shortfalls. It makes account creation and sign-in trivially simple – even more so than buttons from Google or Facebook – while also keeping your data in the hands of a company with a decent privacy track record.

When apps update to adopt Sign In with Apple, I suspect many users’ initial thoughts will be some variation of what immediately popped into my mind after trying it for the first time: “Where has this been all my life?”
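
For a sense of how little code the flow involves, here’s a minimal sketch using Apple’s AuthenticationServices framework (error handling is pared down, and server-side validation of the returned identity token is omitted):

    import AuthenticationServices
    import UIKit

    // Minimal sketch of launching a Sign In with Apple flow.
    class SignInViewController: UIViewController,
        ASAuthorizationControllerDelegate,
        ASAuthorizationControllerPresentationContextProviding {

        func startSignInWithApple() {
            let request = ASAuthorizationAppleIDProvider().createRequest()
            request.requestedScopes = [.fullName, .email]
            let controller = ASAuthorizationController(
                authorizationRequests: [request])
            controller.delegate = self
            controller.presentationContextProvider = self
            controller.performRequests()
        }

        func authorizationController(
            controller: ASAuthorizationController,
            didCompleteWithAuthorization authorization: ASAuthorization) {
            guard let credential = authorization.credential
                as? ASAuthorizationAppleIDCredential else { return }
            // A stable, per-app user identifier; the email may be a
            // private relay address if the user chose to hide theirs.
            print("User: \(credential.user)")
        }

        func authorizationController(
            controller: ASAuthorizationController,
            didCompleteWithError error: Error) {
            print("Sign in failed: \(error.localizedDescription)")
        }

        func presentationAnchor(
            for controller: ASAuthorizationController) -> ASPresentationAnchor {
            view.window!
        }
    }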

Read more


Apple Suspends Program In Which Contractors Listened to Recorded Siri Snippets

Last week, The Guardian reported on Apple’s Siri grading program in which contractors listen to snippets of audio to evaluate the effectiveness of Siri’s response to its trigger phrase. That article quoted extensively from an anonymous contractor who said they and other contractors regularly heard private user information as part of the program.

In response, Apple has announced that it is suspending the Siri grading program worldwide. While the program is suspended, Apple says it will re-evaluate its practices and issue a software update that will let users choose whether to allow their audio to be used as part of the program.

In a statement to Matthew Panzarino, the editor-in-chief of TechCrunch, Apple said:

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement to TechCrunch. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

In an earlier response to The Guardian, Apple had said that less than 1% of daily Siri requests are sent to humans as part of the grading program. However, that’s not very comforting to users who are left wondering whether snippets of their daily life are part of the audio shared with contractors. Consequently, I’m glad to see that Apple is re-examining its Siri quality-control efforts and has promised to give users a choice of whether they participate.


Apple Disables Walkie-Talkie App Due to Security Vulnerability

Matthew Panzarino, writing for TechCrunch:

Apple has disabled the Apple Watch Walkie Talkie app due to an unspecified vulnerability that could allow a person to listen to another customer’s iPhone without consent, the company told TechCrunch this evening.

Apple shared an official statement with TechCrunch:

We were just made aware of a vulnerability related to the Walkie-Talkie app on the Apple Watch and have disabled the function as we quickly fix the issue. We apologize to our customers for the inconvenience and will restore the functionality as soon as possible. Although we are not aware of any use of the vulnerability against a customer and specific conditions and sequences of events are required to exploit it, we take the security and privacy of our customers extremely seriously. We concluded that disabling the app was the right course of action as this bug could allow someone to listen through another customer’s iPhone without consent. We apologize again for this issue and the inconvenience.

Panzarino rightly points out the parallels between this issue and the highly publicized FaceTime bug from earlier this year. The key difference: when a user reported the FaceTime bug, Apple didn’t respond or take action until the problem received widespread media attention, whereas with this Walkie-Talkie bug, Apple followed up on a customer’s report and addressed the issue seemingly before anyone else knew about it. Hopefully this is a sign of improved processes inside the company for handling serious bugs and vulnerabilities.

Permalink