One of Apple’s legal foes has offered to help independent researchers analyze the tech giant’s controversial new scanning software for detecting child sexual abuse material on iPhones.

Corellium’s new “Open Security Initiative” will offer $5,000 grants to security researchers to support “independent public research into the security and privacy of mobile applications,” according to the company’s announcement. That includes verifying the privacy and security claims of Apple’s new photo-scanning initiative.

On Friday, Craig Federighi, Apple’s senior vice president of software engineering, defended Apple’s child sexual abuse material initiative in an interview with The Wall Street Journal. He said independent security researchers could inspect iPhones to ensure the system does not overstep the bounds of its intended use.

Corellium offers a service that can be used by security researchers to probe “virtual” iPhones, inspecting Apple’s mobile operating system in a way that is not possible on off-the-shelf, physical devices. Apple sought to shut that service down with a lawsuit, claiming it violated Apple’s intellectual property. The two companies settled last week.

The uproar over Apple’s decision to police photos stems from the company’s controversial choice to do part of the policing for child pornography locally on consumers’ devices. Security experts immediately raised concerns, ranging from possible abuse by authoritarian governments to exploitation by hackers.

The issues are at the heart of a long, combative and complicated relationship with the security research community, many of whose members say Apple doesn’t allow the same flexibility as some of its rivals. The security of Apple’s locked-down mobile operating system has come under scrutiny in the wake of new revelations about hacking facilitated by the NSO Group, an Israeli company that sells its spy tools to authoritarian governments. Some researchers have called on Apple to remove the long-standing roadblocks that stand in the way of independent security research.


“Apple is being two-faced on this for sure,” said Alex Stamos, former chief security officer at Facebook. “Federighi is correct. The only way you can verify that Apple is telling the truth about any of their privacy protections and security protections is reverse-engineering the iPhone. But for the last several years, Apple has argued that doing privacy and security research on iOS is illegal,” he said. Stamos, who is director of the Stanford Internet Observatory, was scheduled to testify as an expert in support of Corellium at trial before the case settled.

Apple spokesman Todd Wilder did not have immediate comment.

Unlike desktop computers, iPhones are designed to limit the files users and applications can access, a strategy Apple says improves privacy and security. To peer more deeply into an iPhone, researchers must first break down the phone’s defenses using a software trick known as a “jailbreak.” Apple has fought aggressively to stop jailbreaking, arguing unsuccessfully that it violates U.S. law, and has hired some of the most successful jailbreakers to help lock down its operating system.

Now, Apple says, its software that scans phones for evidence of child pornography has created a new need for security research. Other large tech companies, such as Facebook and Microsoft, scan their servers for child pornography using a software product called PhotoDNA, developed by Microsoft and Dartmouth professor Hany Farid. The software relies on a database of known child pornography maintained by the National Center for Missing and Exploited Children. If a photo on a company server matches the database, it is flagged and authorities are notified. Companies have employed that system on their servers, not on devices owned by their customers.
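As a rough illustration of that server-side approach, the sketch below checks an uploaded photo against a database of known hashes. PhotoDNA itself is proprietary and uses a perceptual, similarity-preserving hash; the SHA-256 stand-in, the sample database and the function names here are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a perceptual hash such as PhotoDNA, which is
# proprietary; a real system uses a similarity-preserving hash, not SHA-256.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Illustrative database of known hashes, in practice supplied by NCMEC.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def scan_uploaded_photo(image_bytes: bytes) -> bool:
    """Return True if the uploaded photo matches the known-hash database."""
    if image_hash(image_bytes) in known_hashes:
        # In a production system the match would be flagged for review and,
        # if confirmed, reported to authorities.
        return True
    return False
```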

Last week, Apple announced what is essentially its own version of PhotoDNA. However, one key difference caused an uproar among privacy advocates. Instead of simply scanning photos stored on its iCloud servers, Apple employed a high-tech and byzantine system it calls NeuralHash so that the matching process happens on Apple devices, not in the cloud. By doing so, Apple says, it is not able to see users’ photos until an account reaches a threshold of 30 matching photos.
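The threshold behavior Apple describes can be sketched in simplified form: matches accumulate per account, and nothing becomes visible for review until the 30th. Apple’s design enforces the threshold cryptographically rather than with a plain counter, so the counter below only illustrates the stated policy, not Apple’s implementation; all names are hypothetical.

```python
# Simplified illustration of the 30-match threshold Apple described.
MATCH_THRESHOLD = 30

class AccountMatchState:
    """Tracks on-device matches for one account (illustrative only)."""

    def __init__(self) -> None:
        self.match_count = 0

    def record_match(self) -> bool:
        """Record one matching photo; return True once the threshold is met."""
        self.match_count += 1
        return self.match_count >= MATCH_THRESHOLD

state = AccountMatchState()
for _ in range(MATCH_THRESHOLD):
    threshold_reached = state.record_match()

print(threshold_reached)  # True only after the 30th match is recorded
```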

Security experts voiced those concerns immediately. Matthew Green, a professor of computer science at Johns Hopkins University, began tweeting about the plan the night before Apple’s announcement, calling it a “really bad idea.”

But last week, in the interview with The Wall Street Journal, Federighi attempted to assuage those concerns, in part by highlighting the ability of security researchers to help build trust in Apple’s effort to scan phones for child pornography.


“Because it’s on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software,” he said. “So if any changes were made that were to expand the scope of this in some way — in a way that we had committed to not doing — there’s verifiability, they can spot that that’s happening.”

Corellium posted its announcement two days later.

Security researchers say they would likely need to use Corellium or jailbroken devices to verify that Apple was using the same database for all users around the world, for instance. They could also verify whether Apple is only able to see suspected child pornography after a threshold of 30 photos is met, as Apple has claimed. And researchers could check whether the database on the phone had been tampered with in some way.
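One such check, sketched below with hypothetical file paths, would be to extract the on-device match database from phones configured for different regions and compare cryptographic digests; identical digests would support Apple’s claim of a single worldwide database.

```python
import hashlib
from pathlib import Path

# Hypothetical check a researcher might run on jailbroken or virtual devices:
# hash the match database extracted from phones set to different regions and
# confirm the digests are identical. Paths are placeholders, not Apple's
# actual database location.
def database_digest(path: str) -> str:
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

digests = {
    region: database_digest(f"extracted/{region}/match_database.bin")
    for region in ("us", "eu", "cn")
}

print("identical database everywhere:", len(set(digests.values())) == 1)
```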

But some levels of verification are beyond the reach of security researchers, they say. Though the scenarios are far-fetched, researchers would not be able to tell whether the database of child pornography images had been somehow manipulated to include photos that are not child pornography, perhaps to facilitate government censorship or a secret law enforcement initiative.

“We applaud Apple’s commitment to holding itself accountable by third-party researchers,” Corellium said in its announcement. “We believe our platform is uniquely capable of supporting researchers in that effort. Our ‘jailbroken’ virtual devices do not make use of any exploits, and instead rely on our unique hypervisor technology.”

In December, a federal judge threw out Apple’s legal claim that Corellium violated Apple’s copyrights, ruling that the virtual iPhones it sells are protected by the fair use doctrine of copyright law. The judge ruled that Corellium was protected, in part, because it was designed to help improve the security for all iPhone users.

According to a person familiar with the matter, who spoke on the condition of anonymity because they were not authorized to speak publicly, Apple is considering appealing the judge’s ruling, which would revive its legal battle against the company.