Apple is offering a reward of up to $1 million to anyone who can hack the servers that power its artificial intelligence (AI) features.
Apple has announced a new bug bounty program, inviting security researchers to look for vulnerabilities in Private Cloud Compute (PCC), the cloud platform that processes data for Apple Intelligence, the company's AI features.
PCC is designed to protect sensitive user information, preventing access by any third party, including Apple itself. To strengthen the system's security, the iPhone maker has decided to open PCC up to the security research community for examination.
Security researchers can use the provided Virtual Research Environment (VRE) and PCC source code to analyze the system and hunt for vulnerabilities. The top reward of $1 million goes to anyone who can remotely attack PCC servers and execute malicious code on them, while $250,000 is offered for attacks that expose users' data.
Initially, Apple invited only a select group of security researchers, but the program is now open to everyone. "To encourage further research into PCC servers, we are expanding Apple's security bounty program," an Apple representative said.
Apple is not alone: other technology giants such as Microsoft and Google regularly pay large rewards to security researchers who uncover serious flaws. Doing so costs far less than having those vulnerabilities exploited by hackers first.