Apple offers up to $1 million to anyone who can hack its AI servers
Apple is offering up to US$1 million to anyone who can hack its artificial intelligence (AI) servers, reports CNET.
Apple continues to roll out AI features slowly as part of what Craig Federighi, Apple’s senior vice president of Software Engineering, describes as the company’s strategy to “get each piece right and release it when it’s ready.” One thing Apple feels very confident about, though, is how it’s handling privacy and security on the Private Cloud Compute, or PCC, servers that power some Apple Intelligence features.
CNET says this is why Apple is inviting hackers, as well as privacy and security professionals and researchers, to verify the security claims it has made about PCC, offering bounties from $50,000 up to $1 million to anyone who finds a bug or other major issue. The PCC bounty categories are:
• Accidental data disclosure: vulnerabilities leading to unintended data exposure due to configuration flaws or system design issues.
• External compromise from user requests: vulnerabilities enabling external actors to exploit user requests to gain unauthorized access to PCC.
• Physical or internal access: vulnerabilities where access to internal interfaces enables a compromise of the system.
“Because PCC extends the industry-leading security and privacy of Apple devices into the cloud, the rewards we offer are comparable to those for iOS,” Apple says. “We award maximum amounts for vulnerabilities that compromise user data and inference request data outside the PCC trust boundary.”