My app has a potentially very large Core Data store underneath it (it could easily grow beyond 30MB). I've started noticing memory issues when using automatic migration (addPersistentStoreWithType:configuration:URL:options:error:), so I started looking into ways of migrating smaller parts of the store to avoid the build-up of Core Data objects that happens when you migrate everything at once.
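For reference, this is roughly the call in question, wrapped in a small helper; the coordinator and storeURL parameters stand in for my actual stack:

```
#import <CoreData/CoreData.h>

// Roughly how I'm adding the store with automatic migration enabled.
// Returns nil (with *error set) when migration fails. ARC assumed.
NSPersistentStore *AddStoreWithAutomaticMigration(NSPersistentStoreCoordinator *coordinator,
                                                  NSURL *storeURL,
                                                  NSError **error)
{
    NSDictionary *options =
        [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
            // Drop this second option if you ship an explicit mapping model:
            [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption,
            nil];

    return [coordinator addPersistentStoreWithType:NSSQLiteStoreType
                                     configuration:nil
                                               URL:storeURL
                                           options:options
                                             error:error];
}
```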
This is discussed in the official documentation in the "Multiple Passes" section; however, the suggested approach there is to divide the migration up by entity type, i.e. to create multiple mapping models, each of which migrates a subset of the entity types from the complete data model.
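As I understand it, that approach amounts to running one NSMigrationManager pass per mapping model, along these lines (just a sketch; the parameter names and the idea of pre-chunked mapping models are my assumptions about a typical setup):

```
#import <CoreData/CoreData.h>

// One migration pass per mapping model, all writing into the same
// destination store. The models, URLs, and chunked mapping-model array
// are placeholders for your own project. ARC assumed.
BOOL MigrateStoreInPasses(NSURL *sourceURL,
                          NSURL *destinationURL,
                          NSManagedObjectModel *sourceModel,
                          NSManagedObjectModel *destinationModel,
                          NSArray *mappingModels,
                          NSError **error)
{
    for (NSMappingModel *mappingModel in mappingModels) {
        // A fresh manager per pass, so the objects accumulated by one
        // pass can be released before the next pass starts.
        NSMigrationManager *manager =
            [[NSMigrationManager alloc] initWithSourceModel:sourceModel
                                           destinationModel:destinationModel];
        if (![manager migrateStoreFromURL:sourceURL
                                     type:NSSQLiteStoreType
                                  options:nil
                         withMappingModel:mappingModel
                         toDestinationURL:destinationURL
                          destinationType:NSSQLiteStoreType
                       destinationOptions:nil
                                    error:error]) {
            return NO;
        }
    }
    return YES;
}
```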
The only problem is: what if one entity type accounts for the majority of your datastore? Under the Apple-recommended approach, that whole entity type would still be migrated in a single pass, so the memory issues would presumably persist.
Are there any techniques for migrating a subset of the entities of a specific type, so that you're guaranteed not to run out of memory when migrating them all?
Thanks in advance for any help.
EDIT: After doing more digging, I have discovered that the Apple-recommended split of the store by entity type only works for entities that have no relationships to one another (as discussed here), so it's even less likely to solve the problems of a real-world database than I thought when I originally wrote this post.
I'm starting to think that Core Data migrations run through NSMigrationManager simply don't scale, and that you basically can't have a store bigger than about 20-30MB if you want to be able to migrate it on current-generation iOS devices. The only viable approach seems to be to short-circuit the NSMigrationManager / NSMappingModel machinery entirely and write the migration as completely custom code. If that really is the case, it seems like a huge oversight on Apple's part.
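In case anyone goes down the same path, here is the shape of what I mean by a fully custom migration: a minimal sketch assuming two independent stacks (a source context on the old model/store and a destination context on the new one), a hypothetical batch size, an attributes-only copy, and ARC. Relationships would need a second pass keyed off the source object IDs.

```
#import <CoreData/CoreData.h>

static const NSUInteger kBatchSize = 500; // hypothetical; tune for your data

BOOL MigrateEntityInBatches(NSManagedObjectContext *sourceContext,
                            NSManagedObjectContext *destinationContext,
                            NSString *entityName,
                            NSError **error)
{
    NSUInteger offset = 0;
    while (YES) {
        @autoreleasepool {
            // Pull the next slice of source objects; fetchLimit/fetchOffset
            // keep the working set bounded regardless of store size.
            NSFetchRequest *request = [[NSFetchRequest alloc] init];
            [request setEntity:[NSEntityDescription entityForName:entityName
                                           inManagedObjectContext:sourceContext]];
            [request setFetchLimit:kBatchSize];
            [request setFetchOffset:offset];

            NSArray *batch = [sourceContext executeFetchRequest:request
                                                           error:error];
            if (batch == nil) return NO;   // fetch failed
            if ([batch count] == 0) break; // no more source objects

            for (NSManagedObject *src in batch) {
                NSManagedObject *dst =
                    [NSEntityDescription insertNewObjectForEntityForName:entityName
                                                  inManagedObjectContext:destinationContext];
                // Attribute-only copy; relationships need a second pass.
                for (NSString *key in [[src entity] attributesByName]) {
                    [dst setValue:[src valueForKey:key] forKey:key];
                }
            }
            if (![destinationContext save:error]) return NO;

            // Drop everything both contexts have accumulated so memory
            // stays flat across the whole migration.
            [sourceContext reset];
            [destinationContext reset];
            offset += [batch count];
        }
    }
    return YES;
}
```

The key point is that saving and resetting both contexts between batches keeps the peak memory footprint proportional to the batch size rather than the store size, which is exactly what NSMigrationManager doesn't seem to do.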