
Security pros grade Apple Intelligence data privacy measures

Apple has built a Private Cloud Compute server to process data sent from Apple Intelligence running on an iPhone, iPad or Mac, then delete it. The company says it won't store any of the data.

Despite worries on social media over Apple's integration of OpenAI's ChatGPT, security experts applauded the data privacy system behind Apple Intelligence, the AI services coming to the latest iPhones, iPads and Macs starting in September.

Initial reaction to Apple's handling of data is positive, primarily because of an architecture that avoids storing information off the device. Instead, the company promises to delete the data people enter into Apple Intelligence immediately after it delivers a response from models running on Apple's specially designed data center servers.

"I've never actually seen this kind of thing," said Alex Stamos, chief trust officer at security firm SentinelOne, of Apple's technology. "People have talked about it in papers, but I've never seen it done in production."

Apple has built a Private Cloud Compute (PCC) server to process AI requests from the company's devices. After completing a request, the PCC server deletes all data and retains nothing, according to the company.
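
In code terms, the pattern Apple describes resembles a stateless handler: process the request, return the response and scrub the input. The following Swift sketch is purely illustrative; the function and model names are invented, and Apple has not published PCC's actual code.

import Foundation

// Hypothetical sketch of the delete-after-response pattern described above.
// `runModel` stands in for whatever inference the server performs.
func handleRequest(_ input: inout Data, runModel: (Data) -> Data) -> Data {
    let response = runModel(input)
    // Scrub the request data as soon as the response exists; nothing persists.
    input.resetBytes(in: 0..<input.count)
    return response
}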

Apple encrypts data in transit from the device to a validated PCC node. The Secure Enclave, a subsystem on Apple silicon that handles cryptographic operations, performs the encryption and decryption.
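
To make the transport piece concrete, here is a minimal Swift sketch of sealing a request for one specific node, using Apple's CryptoKit framework. The key-agreement flow and all names are assumptions for illustration; Apple's actual PCC protocol is more elaborate and not public in code form.

import CryptoKit
import Foundation

// Hypothetical: derive a one-time key via ECDH against the node's public key,
// then seal the payload with AES-GCM so only that node can read it.
func encryptRequest(_ payload: Data,
                    for nodeKey: P256.KeyAgreement.PublicKey) throws -> Data {
    // On Secure Enclave hardware, SecureEnclave.P256.KeyAgreement.PrivateKey()
    // would keep this private key inside the enclave instead.
    let ephemeral = P256.KeyAgreement.PrivateKey()
    let secret = try ephemeral.sharedSecretFromKeyAgreement(with: nodeKey)
    let key = secret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                             salt: Data(),
                                             sharedInfo: Data(),
                                             outputByteCount: 32)
    let sealed = try AES.GCM.seal(payload, using: key)
    // Send our ephemeral public key alongside so the node can derive the key;
    // combined is always non-nil with the default 12-byte nonce.
    return ephemeral.publicKey.x963Representation + sealed.combined!
}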

"It's effectively about the best you can do for data sent up to the cloud for these kinds of queries while keeping Apple locked out of the data," Stamos said.

Apple Intelligence can rewrite, proofread and summarize text on Apple devices.

Apple PCC security at the factory

Apple's security for the PCC hardware starts at manufacturing, according to the company. Apple takes high-resolution images of PCC components before sealing each server and activating its tamper switch for shipping. When the server arrives at the data center, Apple revalidates it.

At the end of the process, Apple issues a certificate for the keys stored in each server's Secure Enclave. An iPhone, iPad or Mac won't send data to a PCC node unless it validates that certificate.
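
Conceptually, that client-side gate works like a signature check: the device refuses a node unless a trusted issuing key signed that node's Secure Enclave key. A hedged Swift sketch follows, with all type and parameter names invented, since Apple's real attestation format is not public.

import CryptoKit
import Foundation

// Hypothetical attestation: an issuer's signature over a node's public key.
struct NodeCertificate {
    let nodeKey: P256.Signing.PublicKey
    let issuerSignature: P256.Signing.ECDSASignature
}

// The device proceeds only if the trusted issuer signed exactly this node key.
func deviceTrusts(_ cert: NodeCertificate,
                  issuer: P256.Signing.PublicKey) -> Bool {
    issuer.isValidSignature(cert.issuerSignature,
                            for: cert.nodeKey.rawRepresentation)
}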

After launching PCC, Apple plans to let security researchers inspect software images of each production server. Apple will reward researchers who find security problems through its bounty program.

"The Apple Security Bounty will reward research findings in the entire Private Cloud Compute software stack -- with especially significant payouts for any issues that undermine our privacy claims," the company said on its website.

Enterprises that let employees bring their own devices to work will likely be comfortable with Apple's data privacy system, security experts said. But organizations that must follow strict regulatory data-handling rules, including financial services firms, government agencies and healthcare institutions, will likely opt out of Apple Intelligence.

"Some customers, particularly those in highly regulated industries, will still want to control the flow of corporate data from managed devices to any outside service, even Apple's highly secure implementation," said Weldon Dodd, senior vice president of community at mobile device management (MDM) company Kandji.

Typically, organizations under regulatory control can only use IT hardware and software that meet strict standards, such as the Federal Risk and Authorization Management Program (FedRAMP) and the Health Insurance Portability and Accountability Act (HIPAA).

Some less regulated organizations could also be wary of Apple's claims, said Matt Vlasach, vice president of product and sales engineering at MDM company Jamf.

"Apple says they won't retain data, but there is always enterprise skepticism about such claims," he said.

Apple promises to provide a way for organizations to audit its AI system, which should alleviate concerns, Vlasach said. Nevertheless, enterprises will likely turn off Apple Intelligence while assessing the service.

"Initially, no matter what Apple claims, there is always significant hesitation to leverage cloud services and even more so emerging AI technologies, until legal and compliance teams can really dig into the details," Vlasach said.

ChatGPT integration on Apple devices


This week, Apple also addressed privacy concerns related to its partnership with ChatGPT-maker OpenAI. Under the deal, Apple will give device users the option of using ChatGPT for services unavailable from Apple, such as image and document understanding. They can also tap ChatGPT within Apple's writing tools to create content.

Apple obscures users' IP addresses from OpenAI, and ChatGPT won't store requests, according to the company.

Security experts said some enterprises might balk at using a device that doesn't prevent employees from feeding corporate data into a public version of ChatGPT. Apple said organizations will be able to turn off ChatGPT on devices running the iOS 18, iPadOS 18 and macOS 15 operating systems, all scheduled for release this year with Apple Intelligence.

"This is exactly the sort of assurance that corporate customers require in order to carefully and thoughtfully consider how to adapt their [governance, risk management and compliance] policies to the new capabilities of iPad, iPhone, and Mac," Dodd said.

Apple did not provide comment by press time.

Antone Gonsalves is an editor at large for TechTarget Editorial, reporting on industry trends critical to enterprise tech buyers. He has worked in tech journalism for 25 years and is based in San Francisco.

