How does Apple Intelligence protect users’ privacy?
Apple Intelligence — the personal intelligence system for iPhone, iPad, and Mac — combines the power of generative models with personal context to deliver intelligence that’s useful and relevant to the user.
Apple Intelligence is deeply integrated into iOS 18, iPadOS 18, and macOS Sequoia, harnessing the power of Apple silicon to understand and create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks — all while protecting users’ privacy and security. Many of the models that power Apple Intelligence run entirely on device, and Private Cloud Compute offers the ability to flex and scale computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers. The first set of Apple Intelligence features will be available next month, delivering experiences that are delightful, intuitive, easy to use, and specially designed to help users do the things that matter most to them.
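To make the split between local and server-side processing concrete, here is a minimal, purely illustrative Swift sketch of how a request might be routed to on-device models or escalated to Private Cloud Compute based on its complexity. The type names, the complexity measure, and the threshold are assumptions for illustration only, not Apple’s actual Apple Intelligence APIs.

```swift
import Foundation

// Hypothetical sketch: route a request to on-device or server-side processing
// based on estimated complexity. All names here are illustrative, not Apple's APIs.

enum ProcessingTarget {
    case onDevice                 // handled entirely by local models
    case privateCloudCompute      // escalated to Apple silicon servers
}

struct IntelligenceRequest {
    let prompt: String
    let estimatedComplexity: Int  // e.g., a token count or task class (assumed metric)
}

func route(_ request: IntelligenceRequest, onDeviceLimit: Int = 512) -> ProcessingTarget {
    // Prefer on-device processing; escalate only when the request exceeds
    // what the local models can handle.
    request.estimatedComplexity <= onDeviceLimit ? .onDevice : .privateCloudCompute
}

let summary = IntelligenceRequest(prompt: "Summarize this note", estimatedComplexity: 120)
let longDoc = IntelligenceRequest(prompt: "Rewrite this 40-page report", estimatedComplexity: 9_000)
print(route(summary))  // onDevice
print(route(longDoc))  // privateCloudCompute
```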
Bailey Schulz for USA Today:
Tech privacy experts and advocates told USA TODAY that these ideas look innovative, but they’re waiting to see exactly how they play out.
“While there’s a lot of innovation and thinking about privacy with the new system, there are still open questions on how effective these interventions will ultimately be, especially if we see more examples from other companies trying to do similar things,” said Miranda Bogen, director of the Center for Democracy and Technology’s AI Governance Lab…
Apple also said it plans to make available software images of every production build of Private Cloud Compute so security researchers can verify its functionality and identify any issues.
The transparency “is a good thing,” but users shouldn’t expect immediate results from this sort of digging, according to Thorin Klosowski, a security and privacy activist at the Electronic Frontier Foundation, a nonprofit digital rights group.
“It will just take some time before we have a really good idea of what they’re doing, how they’re doing it and if it’s working,” he said, cautioning users to avoid offering “too deep” of personal and private information.
He added: “I think conceptually, it looks good.”
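The software-image transparency Schulz describes lends itself to a simple kind of check: hashing a published build image and comparing the digest against an independently published measurement. The Swift sketch below, using CryptoKit, shows that idea in its most basic form; the file path, expected digest, and function name are placeholders, and this is not Apple’s actual verification tooling.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the kind of check researchers could run once build
// images are published: hash a downloaded image and compare it against a
// separately published measurement. Path and digest below are placeholders.

func verifyImage(at url: URL, expectedSHA256 hexDigest: String) throws -> Bool {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    let computed = digest.map { String(format: "%02x", $0) }.joined()
    return computed == hexDigest.lowercased()
}

// Usage (placeholder values):
// let ok = try verifyImage(
//     at: URL(fileURLWithPath: "/tmp/pcc-build.img"),
//     expectedSHA256: "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
// )
```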
MacDailyNews Note: Apple says:
Apple Intelligence is designed to protect users’ privacy at every step. A cornerstone of Apple Intelligence is on-device processing, and many of the models that power it run entirely on device. To run more complex requests that require more processing power, Private Cloud Compute extends the privacy and security of Apple devices into the cloud to unlock even more intelligence. Private Cloud Compute marks a fundamental breakthrough in server-based intelligence. When using Private Cloud Compute, users’ data is never stored or shared with Apple; it is used only to fulfill their request. Independent experts can inspect the code that runs on Apple silicon servers to continuously verify this privacy promise and are already doing so.
In addition, for users who choose to access ChatGPT through Siri or Writing Tools, privacy protections are built in — their IP addresses are obscured, and OpenAI won’t store requests. Users can access ChatGPT for free without creating an account, and ChatGPT’s data-use policies apply for those who choose to connect their account.
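As a rough illustration of the opt-in flow described above, the hedged Swift sketch below gates any forwarding to ChatGPT behind explicit user consent and keeps the request only for the lifetime of the call. Every name in it is hypothetical; Apple has not documented this machinery at the API level.

```swift
import Foundation

// Hypothetical sketch of an opt-in gate: a prompt is only forwarded to ChatGPT
// after explicit user approval, and nothing is retained once the response is
// returned. Names and closures are stand-ins, not Apple's or OpenAI's APIs.

struct ExternalModelGate {
    var userConsented: () -> Bool             // e.g., a per-request confirmation prompt
    var forwardToChatGPT: (String) -> String  // stand-in for the actual network call

    func handle(_ prompt: String) -> String? {
        guard userConsented() else { return nil }   // never sent without approval
        let response = forwardToChatGPT(prompt)
        // The prompt is not persisted; it exists only for the lifetime of this call.
        return response
    }
}

// Usage with stubbed dependencies:
let gate = ExternalModelGate(
    userConsented: { true },
    forwardToChatGPT: { _ in "stubbed ChatGPT response" }
)
print(gate.handle("Draft a birthday message") ?? "request not sent")
```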