

Posts tagged with "Privacy"

iMessage Is Preparing for a Post-Quantum Computing World

Yesterday, Apple’s Security Research website published a report on a cryptographic security upgrade coming to iMessage with the release of iOS 17.4, iPadOS 17.4, macOS 14.4, and watchOS 10.4 called PQ3. It’s a forward-looking, preemptive upgrade that anticipates a future where quantum computers will be able to defeat today’s cryptographic security with ease. That day isn’t here yet, but PQ3 is rolling out with the next series of Apple’s OS updates to protect against a scenario known as Harvest Now, Decrypt Later, in which bad actors collect vast amounts of encrypted data today, anticipating a future in which it can be decrypted by quantum computers.

Source: Apple.


If you’ve heard the term quantum computing thrown around in the past and don’t know what it is, I highly recommend a couple of explainer articles by the MIT Technology Review that cover both quantum computers and post-quantum cryptography.1 But if the details don’t interest you, the bottom line is that PQ3 is being added to iMessage today in anticipation of a day in the future where today’s end-to-end encryption techniques don’t work anymore. Here’s how Apple’s paper explains it:

Historically, messaging platforms have used classical public key cryptography, such as RSA, Elliptic Curve signatures, and Diffie-Hellman key exchange, to establish secure end-to-end encrypted connections between devices. All these algorithms are based on difficult mathematical problems that have long been considered too computationally intensive for computers to solve, even when accounting for Moore’s law. However, the rise of quantum computing threatens to change the equation. A sufficiently powerful quantum computer could solve these classical mathematical problems in fundamentally different ways, and therefore — in theory — do so fast enough to threaten the security of end-to-end encrypted communications.

Although quantum computers with this capability don’t exist yet, extremely well-resourced attackers can already prepare for their possible arrival by taking advantage of the steep decrease in modern data storage costs. The premise is simple: such attackers can collect large amounts of today’s encrypted data and file it all away for future reference. Even though they can’t decrypt any of this data today, they can retain it until they acquire a quantum computer that can decrypt it in the future, an attack scenario known as Harvest Now, Decrypt Later.

PQ3 protects against a post-quantum world by setting up an iMessage conversation with a new post-quantum public key system and then periodically updating the keys so that if the keys are compromised, it won’t compromise the entire conversation. The system also uses existing cryptographic algorithms for portions of the encryption process that aren’t vulnerable to a Harvest Now, Decrypt Later scenario.
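Apple hasn’t published PQ3’s implementation beyond the report, but the two ideas above, combining a classical secret with a post-quantum one so an attacker must break both, and periodically rekeying so one compromised key doesn’t expose the whole conversation, can be illustrated with a toy sketch. Everything here (the salt, function names, and the HKDF-style extract-then-expand construction) is illustrative, not Apple’s actual design:

```python
import hashlib
import hmac
import os

def combine_secrets(classical_secret: bytes, pq_secret: bytes, context: bytes) -> bytes:
    """Derive a session key from both a classical and a post-quantum shared secret.

    An attacker would have to break BOTH the classical exchange (e.g. ECDH)
    and the post-quantum KEM to recover the session key.
    """
    # Extract: mix both secrets into one fixed-length pseudorandom key.
    prk = hmac.new(b"toy-salt", classical_secret + pq_secret, hashlib.sha256).digest()
    # Expand: bind the derived key to its context (conversation, epoch, etc.).
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

def ratchet(old_key: bytes, fresh_entropy: bytes) -> bytes:
    """Periodic rekeying: later keys mix in fresh entropy, so compromising
    one key doesn't compromise the entire conversation."""
    return hmac.new(old_key, fresh_entropy, hashlib.sha256).digest()

# Toy usage: os.urandom stands in for real ECDH / post-quantum KEM outputs.
session_key = combine_secrets(os.urandom(32), os.urandom(32), b"conversation-1")
next_key = ratchet(session_key, os.urandom(32))
```

The point of the sketch is the structure, not the primitives: hybrid derivation hedges against either algorithm failing, and ratcheting limits the blast radius of any single key compromise.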

There is a lot of additional detail in Apple’s report, as you can imagine, including information about the review process the new system has undergone and how it is applied to iMessage in particular, with the design considerations necessary to deploy these cryptographic techniques at iMessage’s scale without compromising the user experience.

There’s more to be done to ramp up iMessage’s security even further as we approach a world where quantum computers are a threat to traditional cryptography. However, as Apple’s report concludes, with the imminent OS updates, iMessage will be “the global state of the art for protecting messages against Harvest Now, Decrypt Later attacks and future quantum computers.”

I’ve heard iMessage security get thrown under the bus a lot lately as an excuse Apple uses to protect its market dominance. There’s no reason that protecting customer communications and protecting market share can’t both be true. However, I think you’d be hard-pressed to read a report like this one and not come away believing that customer privacy and security are also sincere goals at Apple.


  1. Yes, these are the sorts of articles I save in my read-later app. It’s a fascinating topic that also helps me fall asleep at night, so it’s a win all around. ↩︎

Apple and Google Submit Spec to Industry Group Addressing Unwanted Use of Item Trackers

As useful as they are for finding misplaced belongings, AirTags and other item location trackers are also misused to track people. Today, Apple and Google announced a joint effort aimed at creating an industry standard to combat unwanted tracking. According to a press release from Apple:

Today Apple and Google jointly submitted a proposed industry specification to help combat the misuse of Bluetooth location-tracking devices for unwanted tracking. The first-of-its-kind specification will allow Bluetooth location-tracking devices to be compatible with unauthorized tracking detection and alerts across iOS and Android platforms. Samsung, Tile, Chipolo, eufy Security, and Pebblebee have expressed support for the draft specification, which offers best practices and instructions for manufacturers, should they choose to build these capabilities into their products.

Apple says that the spec, which has been submitted to the Internet Engineering Task Force (IETF), incorporates input from device manufacturers as well as safety and advocacy groups.

Erica Olsen, the National Network to End Domestic Violence’s senior director of its Safety Net Project, said of the companies’ efforts:

This collaboration and the resulting standards are a significant step forward. NNEDV is encouraged by this progress. These new standards will minimize opportunities for abuse of this technology and decrease the burden on survivors in detecting unwanted trackers. We are grateful for these efforts and look forward to continuing to work together to address unwanted tracking and misuse.

The full specification is available on the IETF’s Datatracker website.


Apple Has Stopped Development of System to Identify Child Sexual-Abuse Material

Joanna Stern of The Wall Street Journal, who interviewed Craig Federighi, Apple’s Senior Vice President of Software Engineering, in connection with the new security features coming to its platforms, reports that Apple has abandoned its efforts to identify child sexual-abuse materials on its devices. According to Stern:

Last year, Apple proposed software for the iPhone that would identify child sexual-abuse material on the iPhone. Apple now says it has stopped development of the system, following criticism from privacy and security researchers who worried that the software could be misused by governments or hackers to gain access to sensitive information on the phone.

Federighi told Stern:

Child sexual abuse can be headed off before it occurs. That’s where we’re putting our energy going forward.

Apple also told The Wall Street Journal that Advanced Data Protection, which allows users to opt into end-to-end encryption of new categories of personal data stored in iCloud, will launch in the US this year and globally in 2023.

For an explanation of the new security protections announced today, be sure to catch Joanna Stern’s full interview with Craig Federighi.


Apple Announces a Trio of Security Features Coming to Its Platforms

Today, Apple announced three new security features.

First, iMessage Contact Key Verification allows users to verify that they are communicating with the person with whom they think they’re communicating. The feature will alert users who use it if someone has infiltrated cloud services to gain access to the user’s iMessage conversations. For even greater security, users can compare a Contact Verification Code in person, on FaceTime, or through another secure channel.
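Apple hasn’t published how Contact Verification Codes are derived, but the general technique, hashing both parties’ public identity keys in a canonical order and truncating the digest to a short, human-comparable code, is a common one. The sketch below is purely illustrative of that idea, not Apple’s actual derivation:

```python
import hashlib

def verification_code(key_a: bytes, key_b: bytes, digits: int = 8) -> str:
    """Toy short-code derivation (illustrative only, not Apple's scheme).

    Sorting the keys makes the input canonical, so both parties compute
    the same code regardless of which key they pass first.
    """
    material = b"".join(sorted([key_a, key_b]))
    digest = hashlib.sha256(material).digest()
    # Truncate the digest to a fixed-width decimal code people can read aloud.
    number = int.from_bytes(digest[:8], "big") % (10 ** digits)
    return f"{number:0{digits}d}"

# Both users compute the same code, so they can compare it in person,
# on FaceTime, or through another secure channel.
alice_code = verification_code(b"alice-public-key", b"bob-public-key")
bob_code = verification_code(b"bob-public-key", b"alice-public-key")
```

If the two codes don’t match, the parties aren’t hashing the same keys, which is exactly the signal that someone may have inserted their own key into the conversation.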

Second, Security Keys lets users adopt hardware security keys when logging into their iCloud accounts. The new system is an enhancement over two-factor authentication because it prevents someone from obtaining your second factor through a phishing scam.

Third, Advanced Data Protection for iCloud adds encryption on the iPhone, iPad, and Mac for a long list of data categories. According to Apple’s press release:

iCloud already protects 14 sensitive data categories using end-to-end encryption by default, including passwords in iCloud Keychain and Health data. For users who enable Advanced Data Protection, the total number of data categories protected using end-to-end encryption rises to 23, including iCloud Backup, Notes, and Photos. The only major iCloud data categories that are not covered are iCloud Mail, Contacts, and Calendar because of the need to interoperate with the global email, contacts, and calendar systems.

Apple says that iMessage Contact Key Verification will be available globally in 2023, and Security Keys is coming in early 2023. Advanced Data Protection for iCloud is available in the US today for participants in Apple’s beta OS program and will presumably roll out with the next point release of Apple’s OSes.


Hands On: iCloud Shared Photo Library and Family Checklist

iCloud Shared Photo Library

Over the years, I’ve shared family photos with my wife Jennifer in three ways: iMessage, AirDrop, and Shared Albums. However, of those, iMessage won hands down, not because it’s the best way to share photos, but because Messages is an app we already use every day to communicate. Plus, sharing photos with Messages is easy whether you’re already in the app and using the Photos iMessage app or in the Photos app itself and using the share sheet. From conversations with friends and family, I know I’m not alone in my scattershot approach to sharing photos with my family.

It’s into that chaotic, ad hoc mess and all of its variations that users have improvised over the years that Apple is stepping in with iCloud Shared Photo Library, its marquee new Photos feature for iOS and iPadOS 16 and macOS Ventura. And you know what? It just works.

The feature lets anyone with an iCloud photo library share part or all of their photo library with up to five other people. Once activated, a new library is created that sits alongside your existing one and counts against the iCloud storage of the person who created it.

One critical limitation of iCloud Shared Photo Library is that you can only be a member of one shared library, a restriction that is designed to limit the library to your immediate household. That means I could share photos with my wife and kids because there are fewer than six of us, but I couldn’t set up another library with my siblings or parents for our extended families. Nor could I invite one of my extended family members to use the extra slot I’ve got in my family library unless they were willing to forgo being part of any shared library their own family created.

Unwinding a shared library.


So, what do you do if you’re in a shared library and want to join a different one? There’s a button in the Photos section of Settings to leave a library, so you can do so with one tap, saving all of the photos in the shared library to your personal library or keeping just those you originally contributed to the shared pool. Deleting libraries is possible too, but only by the person who created them, who is given the choice of keeping all images or just the ones they contributed when they do so.



Safety Check Lets Victims of Domestic or Intimate Partner Violence Reset Privacy Settings

Safety Check is a new iOS privacy feature that allows users whose safety may be at risk due to domestic or intimate partner violence to review and revoke access previously granted to their partner. Apple developed the feature in consultation with groups that support the victims of domestic and intimate partner violence, including the National Network to End Domestic Violence, the National Center for Victims of Crime, and Wesnet, The Women’s Services Network.

Speed is important when someone’s safety may be at risk. Instead of hunting app-by-app to undo privileges granted to a partner, Safety Check includes an emergency reset that signs a user out of their iCloud account on other devices that might be in a partner’s possession, stops sharing their location, resets privacy permissions across all apps, and limits messaging to the device they have with them.

Safety Check also includes tools to manage sharing with people and apps. The feature can show you what you’re sharing and with whom and then change the access settings if you’d like. For apps, each type of private information that’s shared is listed and can be disallowed. It promises to be a great way to perform a periodic personal privacy audit.

With the number of apps we use and people we share with growing all the time, a tool like Safety Check is critical. It eliminates the need to go hunting for multiple privacy settings by consolidating the information in one place, which is important if you’re someone whose safety is at risk.


You can follow all of our WWDC coverage through our WWDC 2022 hub or subscribe to the dedicated WWDC 2022 RSS feed.


Tim Cook Delivers Keynote Speech to Privacy Group

Today at the International Association of Privacy Professionals’ Global Privacy Summit, Apple CEO Tim Cook delivered a keynote speech on privacy. The IAPP’s mission is to act as a resource for privacy professionals to help them in their efforts to manage information privacy risks and protect data for their organizations.

Cook’s speech, which was livestreamed on the IAPP’s YouTube channel, began with a recap of Apple’s efforts to protect user privacy, including App Tracking Transparency, alluding to the “A Day in the Life of Your Data” white paper the company published early last year.

Cook told the assembled crowd that Apple supports privacy regulation, including GDPR in the EU and privacy laws in the US. However, Cook also expressed concern about unintended consequences that laws being considered in the US and elsewhere might cause, calling out sideloading proposals in particular. Cook said that although Apple supports competition, alternate app stores and sideloading are not the solution because they would open devices up to apps that sidestep the company’s tracking protections and could expose users to malware.

Concluding his remarks, Cook called on tech companies and governments to work together to fashion policies that don’t undermine user privacy. Ending on an optimistic note, Cook told the gathering that although the world is at a pivotal moment for privacy, the end of privacy as we know it is not inevitable.


Apple Publishes AirTag Additions to Its Personal Safety Guide

As first spotted by 9to5Mac, Apple has expanded its Personal Safety Guide, which serves as a hub for information about device and data access when your safety is at risk. New details in the guide cover AirTags, which have received a lot of attention as stories have surfaced of their use to stalk people, but the guide is broader than that, covering a wide range of topics. As Apple explains at the beginning of the guide:

This user guide is a personal safety resource for anyone who is concerned about or experiencing technology-enabled abuse, stalking, or harassment. It can help you sever digital ties with those you no longer want to be connected to and outlines the personal safety features that are built into Apple devices.

In addition to accessing the Personal Safety Guide on Apple’s website, it’s available as a downloadable PDF.

Regarding AirTags and other Find My accessories, Apple’s guide explains what each of the device’s alerts means, providing users with the context necessary to know how to respond. The guide also suggests what to do if an unknown device appears to be traveling with them.

It’s good to see Apple’s Personal Safety Guide actively maintained. Apple has built-in safety measures for devices like AirTags, but it’s equally important that users know how to take advantage of those safety features, which the Personal Safety Guide should help with.


Are AirTags Causing Stalking or Making Us More Aware of It?

The problem of AirTags being used to stalk people has been in the news ever since they were released last spring, but a recent story in The New York Times has brought the issue to the forefront again. AirTags are fantastic when used as intended to keep track of your keys, luggage, and other personal items, but stalking is a serious problem that Apple should do everything it can to prevent.

Apple is also in a unique position given the vast size of its Find My network. That scale puts the company in a different league than competitors like Tile, and it carries greater responsibility with it.

In a story on Peer Reviewed, Matt VanOrmer puts a finger on something I’ve been wondering for a while: Are AirTags contributing to the problem of stalking or merely making us more aware of it because of the unique stalking countermeasures built into the device? It’s a classic causation/correlation question that is worth reflecting on. As VanOrmer explains:

I think the increase in news stories about AirTag stalking situations are less indicative of AirTags causing more stalking, and more indicative of how frequently stalkings already occur — with AirTags’ anti-stalking features simply bringing more of these horrible situations to light. These stories may be a classic example of the Baader-Meinhof phenomenon (AKA the “Frequency Illusion”) — in which increased awareness of creeps using AirTags to stalk women creates the illusion that it is happening more often, or even that AirTags are responsible for this illusory increase in incidence.

As VanOrmer rightly points out, Apple should do everything it can to prevent AirTags from being used to track people, which includes improving the tools available to Android users for whom Apple has made an app that is generally viewed as insufficient. This is also a topic where some added transparency about what Apple is doing to address concerns about stalking would help observers decide whether it’s enough instead of having only anecdotal news reports to go on. However, given the wide-reaching impact of the Find My network, which affects people who aren’t even Apple customers, I think a third-party audit of how Apple is handling the security and privacy implications of AirTags is warranted.
