

Cracked Apple: iOS security researchers intimidated into silence

Richi Jennings Your humble blogwatcher, dba RJA

Apple v. Corellium is causing trouble in the world of independent security research. The lawsuit over tools that emulate iPhones could end up outlawing them, even for legitimate testing.

Not only that, but several researchers feel like they can’t speak out on the issue, for fear of retribution from Cupertino. The phrase “chilling effect” is being bandied about.

So much for Tim Cook’s promise to do a better job of helping indie white hats. In this week’s Security Blogwatch, we get to the core.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Cloud Eggs.

iPhone VM illegal?

What’s the craic? Lorenzo Franceschi-Bicchierai reports—Apple’s Copyright Lawsuit Has Created a ‘Chilling Effect’ on Security Research:

Security researchers are scared to buy, use, or even talk about the controversial iPhone emulation software Corellium, whose makers are in a legal battle with Apple. … Critics have called ... Apple's lawsuit against the company ... “dangerous” as it may shape how security researchers and software makers can tinker with Apple’s products and code.

During the lawsuit's proceedings, Apple has sought information from companies that have used the tool, which emulates iOS on a computer, allowing researchers to probe potential iPhone vulnerabilities in [an] easy-to-use environment. “Apple has created a chilling effect,” [said] a security researcher familiar with Corellium's product, who asked to remain anonymous because he wasn’t allowed to talk to the press. … “I definitely believe retribution is possible.”

Several other cybersecurity researchers expressed fear of retribution from Apple for using Corellium. … Three other researchers who specialize in hacking Apple software declined to comment, citing the risk of some sort of retaliation from Apple.

Corellium has argued that its products help researchers find vulnerabilities and ultimately help Apple make safer devices. … “This litigation presents an existential threat to an open and healthy security research community not only for Apple products but for consumer devices in general,” the company said in a statement sent by its lawyer.

Apple declined to comment.

And Mike Peterson adds—Apple lawsuit scares security researchers away from Corellium emulator:

In August 2019, Apple levied a copyright lawsuit against security specialist Corellium, saying the company's iOS emulation software "copied everything" about the tech giant's mobile operating system. … The escalating legal dustup has created a "chilling effect" in the iPhone-focused security industry.

Apple maintains that the purpose of its lawsuit is "not to encumber good-faith security research," but to simply stop Corellium from commercializing its copyrighted works. [But] some … claim that Apple's copyright lawsuit is … more about retaining control over iOS security research and snarling the development of third-party iPhone hacking tools.

Of course, Marcus Hutchins—@MalwareTechBlog—knows a little something about the US justice system:

Apple v Corellium isn't about jailbreaking, it's about Apple wanting control over Apple research and the bugs that come with it. Their intention is probably to prevent researchers selling bugs to brokers, but you'd be insane to think they'd sanction jailbreaking either.

Fighting the strawman is a lot easier than fighting actual gatekeeping, I guess.

I first tried to transition to iOS research after over a decade of Windows research. Asked an expert how to debug and they lost me at kernel exploits and black-market dev-fused iPhones. The bar to entry is already insanely high; why allow Apple even more control?

Way I see it is [Apple] are very clearly trying to monopolize … research.

But what did they do wrong? Anubis IV has background:

Apple is suggesting these guys extracted copyrighted files from iPhones (and is seeking discovery of documents that would indicate if they did so from illicitly procured prototype or dev devices), packaged them up as a simulator, and then began selling it as a commercial product to get around needing an iPhone.

Interestingly, it wasn’t until their second round of remarks to the court that they started saying this is a security tool with legitimate uses that Apple is attempting to quash. Sounds like they made up that justification after the fact.

Okay, but what’s Apple’s real beef? emmayche explainifies:

Corellium is creating something on which iOS runs which isn’t manufactured by Apple, and that’s a violation of the licensing terms of iOS. Remember, you don’t own the copy of iOS that’s on your iPhone; it’s licensed for your use under very specific terms, and one of those terms is that you can’t use it anywhere but actually on that iPhone.

To build a business around theft (which is what using software that you have no right to use is) is something that the courts frown on. Chances are that Apple will win this one.

And Aeiou321 agrees:

Exactly! This case has nothing to do with the security researchers and everything to do with people stealing Apple’s OS.

The fact is that iOS is closed source, and as such it is illegal for Corellium to use it outside of an iPhone, let alone sell a product with it. Kinda a cut-and-dried case.

Apple was impressed enough by Corellium’s work that it offered to buy the company, but the offer was declined. Only then did the lawsuits come. Seems like they should have taken the money when they could have.

So Rasmus Sten—@pajp—guns the point home:

can someone explain to me, in simple words, how Corellium ever could believe that their product could be legal, given the current EULA-based legal framework for software distribution? … You can’t get a license to run iOS on non-Apple hardware.

Whether it should be legal is a different issue altogether – but in general it seems like a bad idea to build a product that clearly violates the law.

But postbigbang offers a more nuanced view:

It's a great test of fair use. … This also speaks to Right To Repair and other initiatives to permit users to hack their own stuff. Apple wants you to buy a lot of phones if you're going to brick them on the way.

I have little doubt that iOS emulators are available elsewhere, from non-legitimate sources. Do we have to send John Deere tractors to Canada to get them fixed, absent domestic cooperation from John Deere? Same question for Tesla … Samsung … etc.

Apple's billions in cash will deliver serious legal firepower, and that may eventually cave the problem for them. If so, it sends an onerous message to their customers and their competition.

Meanwhile, Beats cuts to the chase:

Steve would have killed these companies.

The moral of the story?

Watch this legal battle carefully: precedent could be set here.

And finally

Even you can’t screw up these eggs

Previously in “And finally”

You have been reading Security Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or sbw@richi.uk. Ask your doctor before reading. Your mileage may vary. E&OE.

Image sauce: Paulo Henrique (cc:by)
