Apple will pay you up to $1 million if you can hack into Apple Intelligence servers
Think you can hack your way into an Apple server? If so, you could score as much as $1 million courtesy of a new bug bounty. On Thursday, Apple revealed a challenge to test the security of the servers that will play a major role in its Apple Intelligence service.
As Apple preps for the official launch of its AI-powered service next week, the company is naturally focused on security. Though much of the processing for Apple Intelligence requests will occur on your device, certain requests will have to be handled by Apple's servers. Known collectively as Private Cloud Compute (PCC), these servers need to be hardened against any type of cyberattack to guard against data theft and compromise.
Apple has already been proactive about protecting PCC. After initially announcing Apple Intelligence, the company invited security and privacy researchers to inspect and verify the end-to-end security and privacy of the servers. Apple even gave certain researchers and auditors access to a Virtual Research Environment and other resources to help them test the security of PCC. Now the company is opening the door to anyone who wants to try to hack into those servers.
To give people a head start, Apple has published a Private Cloud Compute Security Guide. This guide explains how PCC works, with a particular focus on how requests are authenticated, how to inspect the software running in Apple's data centers, and how PCC's privacy and security are designed to withstand different types of cyberattacks.
The Virtual Research Environment (VRE) is also open to anyone vying for the bug bounty. Running on a Mac, the VRE lets you inspect PCC software releases, download the files for each release, boot a release in a virtual environment, and debug the PCC software to investigate it further. Apple has even published the source code for certain key components of PCC, which is accessible on GitHub.
Now how about that bug bounty? The program is designed to uncover vulnerabilities across three major areas:
- Accidental data disclosure -- Vulnerabilities that expose data due to PCC configuration flaws or system design issues.
- External compromise from user requests -- Vulnerabilities that allow attackers to exploit user requests to gain unauthorized access to PCC.
- Physical or internal access -- Vulnerabilities in which access to internal interfaces of PCC lets someone compromise the system.
Breaking it down further, here are the amounts Apple will pay out for different kinds of hacks and discoveries:
- Accidental or unexpected disclosure of data due to a deployment or configuration issue -- $50,000.
- Ability to execute code that has not been attested -- $100,000.
- Access to a user's request data or other sensitive user details outside the trust boundary -- the boundary where the level of trust changes because of the sensitivity of the data being handled -- $150,000.
- Access to a user's request data or sensitive information about the user's requests outside the trust boundary -- $250,000.
- Arbitrary code execution, with arbitrary entitlements, without the user's permission or knowledge -- $1,000,000.
However, Apple promises to consider awarding money for any security issue that significantly impacts PCC, even if it doesn't match a published category. In that case, the company will evaluate your report based on the quality of your presentation, proof of what can be exploited, and the impact on users. To learn more about Apple's bug bounty program and how to submit your own research, browse the Apple Security Bounty page.
"We hope that you'll dive deeper into PCC's design with our Security Guide, explore the code yourself with the Virtual Research Environment, and report any issues you find through Apple Security Bounty," Apple said in its post. "We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time."