I have a texture-heavy OpenGL game that I'd like to tune based on how much RAM the device has. The highest-resolution textures I have work fine on an iPhone 4 or iPad 2, but earlier devices crash in the middle of loading them. I have low-res versions of these textures, but I need to know when to use them.
My current tactic is to detect specific older devices (the 3GS has a low-res screen; the original iPad has no camera), load the hi-res textures only on the iPad 2 and later and the iPhone 4 and later, and presumably do something similar for the iPod touch. But I'd much rather use feature detection than hard-code device models, since model detection is fragile against future changes to APIs and hardware.
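For concreteness, this is roughly the shape of my current check, done in plain C via sysctl. The model strings are illustrative; the hard-coded allowlist is exactly the part I want to get rid of:

```c
#include <sys/sysctl.h>
#include <string.h>
#include <stdbool.h>

// Fetch the model identifier (e.g. "iPhone3,1" for iPhone 4,
// "iPad2,1" for iPad 2) and compare it against a hard-coded list.
// This breaks as soon as new hardware ships -- hence the question.
static bool device_supports_hires_textures(void) {
    char machine[64];
    size_t size = sizeof(machine);
    if (sysctlbyname("hw.machine", machine, &size, NULL, 0) != 0)
        return false; // conservatively fall back to lo-res

    return strncmp(machine, "iPhone3", 7) == 0   // iPhone 4
        || strncmp(machine, "iPad2", 5) == 0;    // iPad 2
}
```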
Another possibility I'm considering is to load the hi-res textures first, then drop and replace them with the lo-res versions the moment I get a low-memory warning. However, I'm not sure I'll get the chance to respond; I've noticed that the app often dies before any notification appears on the debug console.
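The GL side of the swap looks manageable; it's the timing that worries me. A minimal sketch, assuming a handler wired up to the app's memory warning (e.g. `-didReceiveMemoryWarning`) and some hypothetical per-texture bookkeeping:

```c
#include <OpenGLES/ES2/gl.h>

// Hypothetical bookkeeping for one texture slot.
typedef struct {
    GLuint name;          // current GL texture object
    const void *lores;    // lo-res pixel data, kept resident
    GLsizei lo_w, lo_h;
} TextureSlot;

// Called from the app's memory-warning handler: free the hi-res
// texture and re-upload the lo-res version in its place.
static void swap_to_lores(TextureSlot *slot) {
    glDeleteTextures(1, &slot->name);

    glGenTextures(1, &slot->name);
    glBindTexture(GL_TEXTURE_2D, slot->name);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, slot->lo_w, slot->lo_h,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, slot->lores);
}
```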
How do I detect whether the device I'm running on has insufficient RAM to load hi-res versions of my textures?
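The closest thing I've found is reading total physical RAM via sysctl, something like the sketch below. The 512 MB cutoff is my own guess based on which devices crash (the 3GS and original iPad have 256 MB; the iPhone 4 and iPad 2 have 512 MB), and I don't know how future-proof it is:

```c
#include <sys/sysctl.h>
#include <stdbool.h>
#include <stdint.h>

// Read total physical RAM. "hw.memsize" is a 64-bit value;
// the older HW_PHYSMEM / "hw.physmem" variant is only 32-bit.
static bool has_enough_ram_for_hires(void) {
    uint64_t memsize = 0;
    size_t size = sizeof(memsize);
    if (sysctlbyname("hw.memsize", &memsize, &size, NULL, 0) != 0)
        return false; // fall back to lo-res if the query fails

    // Guessed cutoff: the devices that crash all have 256 MB.
    return memsize >= 512ull * 1024 * 1024;
}
```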
Taking a step back, is there some other adaptive technique I can use that's specific to OpenGL texture memory?
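For instance, I've wondered about trial allocation: upload one hi-res texture, check glGetError() for GL_OUT_OF_MEMORY, and fall back to the lo-res set if it fires. I doubt this is dependable on iOS, where the OS seems to kill the process under memory pressure rather than fail the allocation (which would match the sudden deaths I'm seeing), but in sketch form:

```c
#include <OpenGLES/ES2/gl.h>
#include <stdbool.h>

// Trial allocation: try uploading one hi-res texture and see
// whether the driver reports GL_OUT_OF_MEMORY.
static bool try_hires_upload(const void *pixels, GLsizei w, GLsizei h) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    while (glGetError() != GL_NO_ERROR) {} // clear any stale errors

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    if (glGetError() == GL_OUT_OF_MEMORY) {
        glDeleteTextures(1, &tex);
        return false; // fall back to the lo-res set
    }
    return true;
}
```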
Notes:
I've searched both on and off SO for answers about detecting available RAM, but they all basically advise profiling memory usage and eliminating waste (minimising the lifetime of temporaries, and all that guff). I've done as much of that as I can, and there is no way I'm going to squeeze the hi-res textures into the older devices.
PVRTC isn't an option. The textures contain data to be used by fragment shaders and must be stored in a lossless format.