iOS security exploit exposed, already released in an Apple-approved app [video]

An iOS security exploit, unveiled by security researcher Charlie Miller, allows an app to download and execute unsigned code from a remote, unknown server. More astonishing still, to prove the hack works in the real world, Miller developed an app containing the exploit and submitted it to Apple. The app was approved and made available in the App Store. (It has since been removed, and Miller has now been removed from the iOS developer program as well.)

Miller became suspicious of a possible flaw in the code signing of Apple’s mobile devices with the release of iOS 4.3 earlier this year. To increase the speed of the phone’s browser, Miller noticed, Apple allowed JavaScript code from the Web to run at a much deeper level in the device’s memory than it had in previous versions of the operating system. In fact, he realized, the browser’s speed increase had forced Apple to create an exception that let the browser run unapproved code in a region of the device’s memory, something that until then had been impossible. (Apple uses other security restrictions to prevent untrusted websites from using that exception to take control of the phone.) The researcher soon dug up a bug that allowed him to extend that code-running exception to any application he’d like. “Apple runs all these checks to make sure only the browser can use the exception,” he says. “But in this one weird little corner case, it’s possible. And then you don’t have to worry about code-signing any more at all.”
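
To make that mechanism a little more concrete, here is a minimal, hypothetical C sketch (deliberately not Miller's exploit, whose exact trigger isn't published in this article): it simply asks the kernel for a page of memory that is both writable and executable, the request that iOS code signing normally refuses for third-party apps and, since iOS 4.3, grants only to the browser's Nitro JIT.

```c
/*
 * Minimal sketch, not exploit code: on iOS, code signing normally forbids a
 * third-party app from creating memory that is both writable and executable.
 * This just shows what happens when an ordinary app asks for such a region --
 * the request is refused, which is what keeps downloaded, unsigned code from
 * ever running. The browser's JIT is the one sanctioned exception.
 */
#include <sys/mman.h>
#include <errno.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    size_t len = 4096;

    /* Ask for an anonymous page that is readable, writable AND executable. */
    void *region = mmap(NULL, len, PROT_READ | PROT_WRITE | PROT_EXEC,
                        MAP_ANON | MAP_PRIVATE, -1, 0);

    if (region == MAP_FAILED) {
        /* Expected result for a normal, sandboxed App Store app. */
        printf("RWX mapping refused: %s\n", strerror(errno));
    } else {
        /* Only a process with the JIT exception should ever get here. */
        printf("RWX mapping granted at %p\n", region);
        munmap(region, len);
    }
    return 0;
}
```

Miller's bug, as he describes it above, effectively lets an ordinary app end up with that kind of writable-and-executable region anyway, at which point downloaded, unsigned code can simply be copied in and run.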

Miller plans on demonstrating the exploit at the SyScan conference in Taiwan next week. In the meantime, take a look at the video below, which shows the exploit in action. Using the app, he can take a copy of a user’s address book, direct them to a YouTube video, or steal photos from the device running the app.

We are sure Apple will be releasing a fix very soon to plug this exploit, now that it is out in the open!

Source: Forbes via Daring Fireball

chrisoldroyd

UK editor at iMore, mobile technology lover and air conditioning design engineer.



There are 19 comments.

CP says:

There goes a possible jailbreak exploit

MojoVersion8 says:

Could be used that way, I guess, but it's far too potentially damaging not to fix.

Nathan Nye says:

The problem with trying to use this as a jailbreak exploit (and don't get me wrong, I thought about the same thing at first) is that it requires an Apple-signed app from the App Store.

Waynethebest23 says:

Was that an iPhone 4S? When he was downloading the apps they installed pretty quickly.

Nathan Nye says:

So, he finds a security flaw and tests it. Makes a video and lets Apple know about it. Then they remove the app and remove his developer license?

Robert White says:

That is gratitude for ya. Apple style.

MojoVersion8 says:

Well, he did actually implement the exploit in an app; I would be surprised if that doesn't break the developer agreement.
Also, posting it on YouTube probably doesn't work in his favor.

cardfan says:

I read it. Still don't care.

Anton Frost says:

Wouldn't it have been better to contact Apple directly about it first, and if they ignored him, then take this route, which would naturally result in his developer license being pulled? Can he prove that he contacted Apple first, was ignored, and then felt no other recourse but to do it this way? Or is he perhaps seeking media attention and not really trying to help Apple secure iOS?

Dev says:

Jeez... HTML tag fail. Sorry about that. Glad Disqus caught it, but I'm closing the tag here just in case.

Noel Hibbard says:

Apple doesn't review your source code, so as long as the malicious code stays dormant during the review process, the app could get approved. But as soon as Apple catches wind of the malicious app, it will get removed and the dev will be banned. I could even see legal action being taken. I just don't see this being a real problem. Who in their right mind is going to distribute something malicious in plain sight with their name plastered all over it? Am I missing something here?

Dev says:

You are missing what the bad guy can accomplish in the lag time between when the exploit is used, and when Apple catches wind of it. Apple only found out about this one because Miller told them -- he is a "white hat" who looks for these things, and notifies the company.
A "black hat" would put this exploit in the app, and quietly harvest whatever information (s)he chooses from the phone. Without people like Miller looking for holes, it could easily be weeks or months until the exploit is found. During that amount of time, the bad guy could collect more than enough contact information, passwords, and credit card numbers to make up losing the $99 developer account fee many times over. And, once banned, the bad guy can simply register again an under a different company or name.

Noel Hibbard says:

Yeah, I guess you have a point. But it still sounds a little risky doing something like this under your own name. And it is kind of hard to say this is a security flaw in the OS. The SDKs allow access to your contacts, for example; you could harvest that data without even exploiting a hole and still be doing shady activities. I don't see how Apple could combat this sort of thing without demanding source code in the future and slowing down the app approval process.
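
To illustrate the point about sanctioned access above, here is a rough, hypothetical sketch using the AddressBook C API of the era (since deprecated); an approved app could make these calls through entirely legitimate channels, and at the time there was no runtime permission prompt. The helper name is illustrative, not from the article.

```c
/* Hypothetical illustration of sanctioned access (not an exploit): the
 * pre-iOS 6 AddressBook C API let any approved app enumerate the user's
 * contacts, with no permission prompt at the time.
 * Requires linking AddressBook.framework. */
#include <AddressBook/AddressBook.h>
#include <stdio.h>

static void dump_contact_names(void) {
    ABAddressBookRef book = ABAddressBookCreate();   /* deprecated in iOS 6 */
    if (book == NULL) return;

    CFArrayRef people = ABAddressBookCopyArrayOfAllPeople(book);
    CFIndex count = people ? CFArrayGetCount(people) : 0;

    for (CFIndex i = 0; i < count; i++) {
        ABRecordRef person = CFArrayGetValueAtIndex(people, i);
        CFStringRef name = ABRecordCopyCompositeName(person);
        if (name != NULL) {
            char buf[256];
            if (CFStringGetCString(name, buf, sizeof(buf), kCFStringEncodingUTF8))
                printf("%s\n", buf);   /* a bad actor would upload this instead */
            CFRelease(name);
        }
    }
    if (people) CFRelease(people);
    CFRelease(book);
}
```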

Dev says:

Agreed -- for the developer, there is some risk. But for a whole class of bad guys, the amount of risk is tiny, and the consequences of getting caught even less.
But this is without question a security flaw in the OS. The difference between this and accessing contacts through proper SDK channels is one of permissions and visibility. Doing it through the SDK means that Apple -- and the end user -- know about the access and have approved it.
Miller's exploit is much, much different. Neither Apple nor the user knows what resources are being accessed, or even what code is run. This flaw allows arbitrary native code to be run remotely. It doesn't get much worse than that.

Not says:

Gotta love the Buy More shirt. lol

Josh Hurd says:

Charlie Miller has long been doing this sort of stuff. Apple knows him well yet still ignores him. Well, publicly they ignore him, but I am sure internally they know to look into the things he brings up. Getting banned from the App Store is just another side effect of being a hacker. I'm sure he doesn't care....