From what I can tell, there are two levels of encryption that the iOS keychain uses. The first level uses the lock screen passcode as the encryption key. The second level uses a key generated by and stored on the device.

Fraunhofer's researchers have figured out how to get around the second level. This is the "easier" level to bypass, since the encryption key is stored on the device. On iOS 4, their method therefore only works with keychain entries that do NOT use kSecAttrAccessibleWhenUnlocked or kSecAttrAccessibleWhenUnlockedThisDeviceOnly, because entries without those attributes are protected by the device key alone and remain decryptable even when the phone is locked.
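The dependency between the two levels can be pictured with a toy model. To be clear, this is pure illustration under my reading of the scheme: real iOS uses AES with a hardware-fused UID key, not the XOR stand-in below, and the key names are made up.

```python
import hashlib

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' keyed via SHA-256 -- illustration only, NOT secure."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % 32] for i, b in enumerate(data))

DEVICE_KEY = b"unique-per-device-key"            # available to any code running on the device
passcode_key = hashlib.sha256(b"1234").digest()  # derived from the lock-screen passcode

secret = b"imap password"
# Level 2 only (e.g. kSecAttrAccessibleAlways): device key alone
always_item = toy_encrypt(secret, DEVICE_KEY)
# Levels 1 + 2 (e.g. kSecAttrAccessibleWhenUnlocked): passcode key wraps the result
when_unlocked_item = toy_encrypt(toy_encrypt(secret, DEVICE_KEY), passcode_key)

# An attacker executing code on the device effectively has DEVICE_KEY, so:
assert toy_encrypt(always_item, DEVICE_KEY) == secret         # falls immediately
assert toy_encrypt(when_unlocked_item, DEVICE_KEY) != secret  # still needs the passcode
```

This is why the attack recovers only the entries whose protection stops at the device key.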

Starting with iOS 4, keys with kSecAttrAccessibleWhenUnlocked and kSecAttrAccessibleWhenUnlockedThisDeviceOnly are protected by an extra level of encryption.

On iOS 3.x and earlier, all keys can be decrypted using Fraunhofer's method, regardless of the accessibility attribute used.

Devices with no passcode at all will still be vulnerable.

Devices with weak passcodes (fewer than six digits) will still be somewhat vulnerable.

≈50 ms per passcode try → ≈20 tries per second → ≈1.7 years for a 50%
chance of guessing the correct passcode for a 6-digit alphanumeric
code with base 36. The standard simple code of 4 numeric digits would
be brute-forced in less than 9 minutes. This is based on the assumption
that the counter for wrong tries in iOS can be bypassed, as it is not
hardware-based.
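The arithmetic behind those estimates is easy to check, assuming the quoted rate of 20 tries per second and that the retry counter really can be bypassed:

```python
# Brute-force time estimates at the quoted ~20 passcode tries per second.
TRIES_PER_SECOND = 20

def seconds_for_half_keyspace(alphabet_size: int, length: int) -> float:
    """Time (seconds) until half the keyspace is searched, i.e. a 50% chance of a hit."""
    keyspace = alphabet_size ** length
    return (keyspace / 2) / TRIES_PER_SECOND

# 6-character base-36 (a-z, 0-9) passcode:
years = seconds_for_half_keyspace(36, 6) / (365 * 24 * 3600)
print(f"{years:.1f} years")    # prints "1.7 years"

# Standard 4-digit numeric PIN, exhausting the whole keyspace:
minutes = (10 ** 4) / TRIES_PER_SECOND / 60
print(f"{minutes:.1f} minutes")  # prints "8.3 minutes", i.e. under 9 minutes
```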

Bottom line:
If you must store sensitive data, you are better off using your own encryption. And don't store the key on the device.
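One way to follow that advice is to derive the key from a user-supplied secret each time it is needed, so no key ever sits on the device. A minimal sketch using Python's standard library; the iteration count and salt handling here are illustrative assumptions, not a vetted design:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 256-bit key from a user-supplied passphrase via PBKDF2-HMAC-SHA256.
    Only the salt and iteration count are stored; the key itself never is."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)  # random per-user salt, stored alongside the ciphertext
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32  # 256-bit key, recomputed on demand, never persisted
```

An attacker who images the device gets the salt and ciphertext but still has to brute-force the passphrase, which is exactly the work factor the on-device key was letting them skip.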

Edit:
There are numerous news articles which cite the Fraunhofer study and reassure their readers not to worry unless their devices are stolen, because this attack can only be done with physical access to the device.

I'm somewhat doubtful. The fact that the researchers did their tests with physical access to the phone seems to have just been a way to simplify the problem, as opposed to being a limitation. This is their description of what they did to decrypt the keychain entries:

After using a jailbreaking tool to get access to a command shell, we
run a small script to access and decrypt the passwords found in the
keychain. The decryption is done with the help of functions provided
by the operating system itself.

As anyone who has used jailbreak.me knows, jailbreaking does not require physical access to the device. Theoretically it should be trivial to modify the jailbreak.me code and have it automate the following:

Perform the jailbreak as normal (all this requires is for the user to open a maliciously crafted PDF)

Run Fraunhofer's scripts after the jailbreak is complete

Send the passwords over the network to a location the attacker can read them from

Physical access to the device is required because there is a key stored somewhere on the motherboard which cannot be accessed or read by any means. This key is unique to each iOS device manufactured, which means only that specific device is capable of decrypting its own data. So physical access is required to decrypt: you have to actually instruct the device to decrypt itself. Decrypting the data any other way is virtually impossible (as in, a brute-force attack taking billions of years). This doesn't apply to backups, which are encrypted without the on-device key.
– Abhi Beckert, Apr 8 '12 at 7:05

@AbhiBeckert: I think you misunderstood the meaning of physical access. The news article linked says "The attack, which requires possession of the phone...". But in fact there's no reason why a remote exploit that runs on the device cannot do the same thing.
– pepsi, Apr 8 '12 at 14:38

A remote code exploit (unlikely on a fully patched phone) still runs with the same permissions as the exploited app, and all apps run in a sandbox, without read access to files outside a single directory the operating system creates specifically for each app (empty by default). For a remote code exploit to gain arbitrary filesystem access would require a user who has rooted their phone (the whole point of rooting) or a privilege-escalation exploit. Once again, if you apply patches you're pretty safe. Chaining two zero-day exploits is a stretch. Without jailbreaking, only USB allows full filesystem access.
– Abhi Beckert, Apr 8 '12 at 21:46

You are right about jailbreaking. If you jailbreak your phone, almost all of the kernel's security features are flat out disabled. Then you are left with the same security as Mac OS X, Windows, and (AFAIK) Linux: any app on your system can try to brute force your keychain, and with typical passwords it will not last long. Especially on a phone, where 4-digit passcodes are common.
– Abhi Beckert, Apr 8 '12 at 21:47

@AbhiBeckert - It's actually not a stretch at all--that's exactly how jailbreak.me worked. All the user had to do was visit a website to start the jailbreaking process. The user never had to connect their device to their computer. If I recall correctly, it actually did use multiple exploits to completely root the phone. My point was that if visiting a website can jailbreak your phone, then a malicious website can pretty much do anything it wants.
– pepsi, Apr 8 '12 at 23:17

My understanding is that only keychain items with specific protection classes can be accessed with the technique described. These classes are kSecAttrAccessibleAlways and kSecAttrAccessibleAlwaysThisDeviceOnly. See forum.agile.ws/index.php?/topic/… for more details.
– Jean Regisser, Mar 8 '11 at 9:58

I can answer part of your question, but since the other part is still unknown, I'm voting the question up as I'm also eager to know the answer.

The part that I can answer is: 'can an app get full keychain access if no screen lock is enabled?' No: every app has its own keychain area on the iPhone, which means an app can only access its own secrets. These secrets are not locked away from the app itself, so there's no way to hide keychain entries from the app that owns them. To summarize: an app can read its own entries, and no other entries.

What I'm interested to know though is what happens on jailbroken devices. Are the keychains of all apps exposed once a device has a jailbreak?