iOS Keychain Security
We want to use certificates on the iPhone to authenticate for MS Exchange sync. We are not sure how the security concept is implemented to protect these certificates.

For example, is it possible to get "full" keychain access on the iPhone if no screen lock is enabled? (Or on a jailbroken iPhone?)

Does anybody have some links about this?

Antofagasta answered 24/8, 2010 at 15:43 Comment(2)
This question would be appropriate over at security.stackexchange.com (Preparatory)
Still, very relevant to all of us iOS devs who frequent Stack Overflow. Perhaps we all should visit security.stackexchange.com more frequently? :) (Margo)
Fraunhofer's study on iOS keychain security:

From what I can tell, there are two levels of encryption that the iOS keychain uses. The first level uses the lock screen passcode as the encryption key. The second level uses a key generated by and stored on the device.

Fraunhofer's researchers have figured out how to get around the second level. This is the "easier" level to get around, since the encryption key is stored on the device. So on iOS 4, their method only works on keychain entries which do NOT use kSecAttrAccessibleWhenUnlocked or kSecAttrAccessibleWhenUnlockedThisDeviceOnly; entries outside those classes are kept with the first level already decrypted, even when the phone is locked.

  • Starting from iOS 4, keys with kSecAttrAccessibleWhenUnlocked and kSecAttrAccessibleWhenUnlockedThisDeviceOnly are protected by an extra level of encryption
  • On iOS 3.x and earlier, all keys can be decrypted using Fraunhofer's method, regardless of accessibility attribute used
  • Devices with no passcodes at all will still be vulnerable
  • Devices with weak passcodes (less than six digits) will still be somewhat vulnerable
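The two-level scheme described above can be illustrated with a toy model. This is purely conceptual: the XOR "cipher", key values, and wrapping order here are illustrative stand-ins, not the real iOS data-protection design (which uses hardware AES keys and a different wrapping scheme).

```python
import hashlib

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR 'cipher' keyed by SHA-256; a stand-in for AES. Not real crypto."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % 32] for i, b in enumerate(data))

device_key = b"burned-into-this-device"          # level 2: stored on the device
passcode_key = hashlib.sha256(b"1234").digest()  # level 1: derived from the passcode

secret = b"exchange-password"

# A kSecAttrAccessibleAlways-style entry is protected by the device key only.
always_entry = toy_encrypt(device_key, secret)

# A kSecAttrAccessibleWhenUnlocked-style entry is wrapped by BOTH keys.
when_unlocked_entry = toy_encrypt(device_key, toy_encrypt(passcode_key, secret))

# A jailbroken device can ask the OS to apply the device key, which fully
# recovers the first entry but leaves the second still passcode-encrypted.
assert toy_encrypt(device_key, always_entry) == secret
assert toy_encrypt(device_key, when_unlocked_entry) != secret
```

This is why Fraunhofer's method recovers entries outside the WhenUnlocked classes without ever knowing the passcode: applying the on-device key is all it takes.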

≈50 ms per password try → ≈20 tries per second → ≈1.7 years for a 50% chance of guessing the correct passcode for a 6-character alphanumeric code in base 36. The standard simple code of 4 numeric digits would be brute-forced in less than 9 minutes. This is based on the assumption that the counter for wrong tries in iOS can be bypassed, as it is not hardware-based.
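The arithmetic behind these estimates can be checked with a short script (assuming, as the slide does, roughly 50 ms per guess and a 50% success point at half the keyspace):

```python
# Estimate brute-force times from the slide's figure of ~50 ms per guess.
GUESS_RATE = 20.0  # guesses per second (1 / 0.05 s)

def seconds_to_half_search(keyspace: int, rate: float = GUESS_RATE) -> float:
    """Expected time for a 50% chance of success: half the keyspace."""
    return (keyspace / 2) / rate

# Six-character alphanumeric passcode, base 36 (a-z plus 0-9):
six_char = seconds_to_half_search(36 ** 6)
print(six_char / (365.25 * 24 * 3600))  # ≈ 1.72 years

# Standard four-digit numeric PIN:
four_digit = seconds_to_half_search(10 ** 4)
print(four_digit / 60)  # ≈ 4.2 minutes; even the full keyspace takes under 9
```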

Apple Inc. WWDC 2010, Core OS, Session 209 "Securing Application Data", Slide 24

Bottom line: if you must store sensitive data, you are better off using your own encryption. And don't store the key on the device.
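One way to follow that advice is to derive the encryption key from a user-supplied passphrase each time it is needed, so that only a salt (not the key) ever lives on the device. A minimal sketch using Python's standard library; the helper name and parameters are illustrative, not from the study:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 256-bit key from a passphrase via PBKDF2-HMAC-SHA256.
    Only the salt and iteration count need to be stored on the device;
    the key itself is recomputed from the passphrase on demand."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, iterations)

salt = os.urandom(16)  # stored alongside the ciphertext; not secret
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32  # feed this to an authenticated cipher, then discard it
```

The derived key would then drive an authenticated cipher such as AES-GCM (via a crypto library) before the ciphertext is written to the keychain or disk; the security then rests on the passphrase's strength rather than on the device's keychain protection.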

Edit: There are numerous news articles which cite the Fraunhofer study and reassure their readers not to worry unless their devices are stolen, because this attack requires physical access to the device.

I'm somewhat doubtful. The fact that the researchers did their tests with physical access to the phone seems to have been a way to simplify the problem, rather than a limitation of the attack. This is their description of what they did to decrypt the keychain entries:

After using a jailbreaking tool to get access to a command shell, we run a small script to access and decrypt the passwords found in the keychain. The decryption is done with the help of functions provided by the operating system itself.

As anyone who has used jailbreak.me knows, jailbreaking does not require physical access to the device. Theoretically it should be trivial to modify the jailbreak.me code and have it automate the following:

  1. Perform the jailbreak as normal (all this requires is for the user to open a maliciously crafted PDF)
  2. Run Fraunhofer's scripts after the jailbreak is complete
  3. Send the passwords over the network to a location the attacker can read them from

So once again, be cautious about what you put in the keychain.

Uncertainty answered 15/7, 2011 at 20:12 Comment(9)
Physical access to the device is required, because there is a key stored somewhere on the motherboard which cannot be accessed or read by any means at all. This key is unique to each iOS device manufactured, and it means that only that specific device is capable of decrypting the device's data. So physical access is required to decrypt, because you have to actually instruct the device to decrypt itself. Decrypting the device any other way is virtually impossible (as in, a brute-force attack taking billions of years). This doesn't apply to backups, which are encrypted without the on-device key. (Mahan)
@AbhiBeckert: I think you misunderstood the meaning of physical access. The news article linked says "The attack, which requires possession of the phone...". But in fact there's no reason why a remote exploit that runs on the device cannot do the same thing. (Uncertainty)
A remote code exploit (unlikely on a fully patched phone) still runs with the same permissions as the exploited app, and all apps run in a sandbox, without read access to files outside a single directory the operating system creates specifically for them (empty by default). For a remote code exploit to gain arbitrary filesystem access would require a user who has rooted their phone (the whole point of rooting) or a privilege escalation exploit. Once again, if you apply patches you're pretty safe. Two zero-day exploits is a stretch. Without jailbreaking, only USB allows full filesystem access. (Mahan)
You are right about jailbreaking. If you jailbreak your phone, almost all of the kernel's security features are flat-out disabled. Then you are left with the same security as Mac OS X, Windows, and (AFAIK) Linux: any app on your system can try to brute-force your keychain, and with typical passwords it will not last long. Especially on a phone, where 4-digit passcodes are common. (Mahan)
@AbhiBeckert: It's actually not a stretch at all; that's exactly how jailbreak.me worked. All the user had to do was visit a website to start the jailbreaking process. The user never had to connect their device to their computer. If I recall correctly, it actually did use multiple exploits to completely root the phone. My point was that if visiting a website can jailbreak your phone, then a malicious website can pretty much do anything it wants. (Uncertainty)
Sure, but jailbreak.me doesn't work on the latest version of iOS. Whenever they find a new security hole, Apple closes it. It looks like jailbreak.me hasn't worked on any version of iOS released in the last eleven months. (Mahan)
jailbreak.me for iOS 4 proves the concept of this type of attack. All it takes is a new set of exploits for it to happen. The fact that Apple patches them after the fact is not really relevant. (Uncertainty)
What about adding a layer of encryption, to be 'a bit safer'? Let's say you encrypt your data yourself before storing it on the keychain, and the 'key' used in your ad hoc algorithm can't easily be found in the binary... (Margo)
Any new information regarding those findings and future releases of iOS (7, 8)? (Crista)
Normally, the keychain would be the recommended way to store such a certificate. However, it has been discovered that jailbreaking can be used to bypass the security of the keychain (article).

Centime answered 2/3, 2011 at 10:20 Comment(2)
My understanding is that only keychain items with specific protection classes can be accessed with the technique described. These classes are kSecAttrAccessibleAlways and kSecAttrAccessibleAlwaysThisDeviceOnly. See forum.agile.ws/index.php?/topic/… for more details. (Upbow)
Yes, that article just confirms that you should not store sensitive items with the attribute kSecAttrAccessibleAlways; see developer.apple.com/library/ios/#DOCUMENTATION/Security/… (Inappropriate)
Fraunhofer did a study on the safety of the iPhone keychain:

http://www.sit.fraunhofer.de/Images/sc_iPhone%20Passwords_tcm501-80443.pdf

Galumph answered 19/3, 2011 at 22:48 Comment(0)
I can answer part of your question, but since the other part is still unknown, I'm voting the question up as I'm also eager to know the answer.

The part that I can answer is: can an app get full keychain access if no screen lock is enabled? No: every app has its own keychain area on the iPhone, which means an app can only get access to its own secrets. These secrets are not locked from the app itself, so there's no way to hide the keychain entries from the app that owns them. To summarize: an app can read its own entries, and no other entries.

What I'm interested to know though is what happens on jailbroken devices. Are the keychains of all apps exposed once a device has a jailbreak?

Vollmer answered 11/10, 2010 at 15:54 Comment(0)