Security researcher Charlie Miller recently discovered a bug in Apple’s iOS operating system that allowed him to execute downloaded code on a device, compromising its security. He was set to present his findings at the SyScan conference in Taiwan next week.
Miller outlined the basics of the hack in an article on Forbes earlier today. The exploit apparently uses a flaw in Apple’s code signing restrictions to execute code at a lower level than would normally be possible under its careful sandboxing:
Using his method (Miller has already planted a sleeper app in Apple’s App Store to demonstrate the trick), an app can phone home to a remote computer that downloads new unapproved commands onto the device and executes them at will, including stealing the user’s photos, reading contacts, making the phone vibrate or play sounds, or otherwise repurposing normal iOS app functions for malicious ends.
Normally, apps on iOS are cordoned off in their own small portion of the device’s memory in a procedure called sandboxing. This ensures security and prevents malicious code from executing elsewhere on the device. Together, sandboxing and code signing (the process of verifying that code is authorized to access certain functions of the device) are largely the reason that the iPhone is more secure than most Android devices.
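To make the idea behind code signing concrete, here is a minimal, hypothetical sketch in Python. It is purely illustrative: the `sign` and `verify_and_run` functions and the HMAC-based scheme are stand-ins of my own; Apple’s real implementation uses certificate chains and enforces the check in the kernel before any page of code can execute.

```python
import hashlib
import hmac

# Hypothetical stand-in for a platform signing key. Real code signing
# uses asymmetric certificates, not a shared secret.
SIGNING_KEY = b"platform-signing-key"

def sign(code: bytes) -> bytes:
    """Produce a signature over a code blob (HMAC as a simple stand-in)."""
    return hmac.new(SIGNING_KEY, code, hashlib.sha256).digest()

def verify_and_run(code: bytes, signature: bytes) -> str:
    """Refuse to execute any code whose signature does not check out."""
    if not hmac.compare_digest(sign(code), signature):
        return "rejected: unsigned code"
    # In a real system, execution would only proceed past this point.
    return "executed"

approved = b"print('hello')"
print(verify_and_run(approved, sign(approved)))
print(verify_and_run(b"downloaded payload", b"\x00" * 32))
```

The point of the sketch is the gate itself: unsigned code never reaches the execution step. Miller’s bug mattered precisely because it let downloaded, unsigned code slip past that gate.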
Miller, a former NSA analyst who works as a researcher with Accuvant, is relatively famous in hacking circles for his exploits of iPhone and Mac vulnerabilities.
He admitted to Forbes that he had submitted a ‘stealth’ app called Instastock to the App Store, which allowed him to exploit this vulnerability to take control of a phone, activating vibrations, among other functions.
“Android has been like the Wild West,” Miller told Forbes’ Andy Greenberg. “And this bug basically reduces the security of iOS to that of Android.”
Frankly, I’m not all that surprised that Miller’s apps were removed from the App Store once news of the exploit began to spread around the web. Although specifics have not been divulged, Apple would be remiss if it did not remove the apps immediately.
As for Apple also removing Miller’s developer account, he expresses some dismay, saying that it “feels heavy handed, I miss Steve.”
However, he did submit an application to the App Store with a known potential to harm iOS devices if leveraged correctly, so this is a pretty cut-and-dried example of breaking Apple’s policy.
If he is indeed signed up as a researcher, it would seem more prudent for him to have submitted the vulnerability to Apple privately. Miller doesn’t see it that way.
The code signing vulnerability will no doubt be examined by Apple and hopefully patched quickly. Miller’s work does raise some questions about Apple’s ability to police submitted applications that stealthily exploit issues in the OS.
That being said, it is unlikely that any iPhone user will be affected by this particular vulnerability, as Apple is now obviously aware of it. An update, iOS 5.0.1, is already on the way, and a fix could most likely be added to it, if not to one soon thereafter.
The addition of over-the-air updates in iOS 5 means that Apple can patch issues like this much more quickly than it used to.
That doesn’t mean that we won’t see something similar in the future, however, which is why iOS security research is still important. Miller has long been a proponent and perpetrator of such research, so hopefully he can hash out what happened with Apple and continue his work.