This is more of a conceptual question, not necessarily bound to any specific technologies. Let's say you have a database on a server, a REST/JSON API to access the content in that database, and a mobile client displaying data retrieved through the API.
It would be nice to have some caching mechanism on the client and also to enable offline read access to the data as long as the client is only reading (in my case it's fine to deny write access to offline clients, to avoid having to manage all the nasty conflicts that might otherwise happen).
It appears that a nice way to solve this would be to keep a subset of the server's database model on the client and to synchronize data from the server to the client. Access to the local database could then immediately return results but also trigger update requests to the server. If the server returns modified data, the client model synchronizes its local database and notifies the display of the data changes.
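To make this more concrete, here is a rough sketch of the read path I'm imagining (Kotlin, but the language doesn't matter); all names (`Article`, `LocalStore`, `RemoteApi`, `ArticleRepository`) are just placeholders, not any existing library:

```kotlin
// Hypothetical sketch of a "cache first, then refresh" read path.
data class Article(val id: Long, val title: String, val updatedAt: Long)

interface LocalStore {                       // wraps the client-side database
    fun loadAll(): List<Article>
    fun upsert(articles: List<Article>)
}

interface RemoteApi {                        // wraps the REST/JSON calls
    /** Returns null when the device is offline or the request fails. */
    fun fetchChangedSince(timestamp: Long): List<Article>?
}

class ArticleRepository(
    private val local: LocalStore,
    private val remote: RemoteApi,
    private val onDataChanged: (List<Article>) -> Unit   // notifies the display
) {
    fun articles(): List<Article> {
        val cached = local.loadAll()                      // answer immediately from the cache
        val lastSync = cached.maxOfOrNull { it.updatedAt } ?: 0L
        remote.fetchChangedSince(lastSync)?.let { fresh -> // best-effort refresh from the server
            if (fresh.isNotEmpty()) {
                local.upsert(fresh)                       // synchronize the local database
                onDataChanged(local.loadAll())            // push the update to the display
            }
        }
        return cached
    }
}
```

(In a real client the remote call would of course run on a background thread or coroutine; this is just to show the flow.)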
The goal, in the end, is of course that the user can browse the information regardless of the stability of his internet connection and is not annoyed by connection dialogs or the like as long as he doesn't modify any data.
Now from an implementation perspective... on the one hand it seems like a bad idea to couple the server database directly to the client database, as they may be from different vendors. I guess at the very least there would need to be a vendor-independent model sitting above both database implementations. On the other hand, transforming the data from the server database into some transport format and then putting it back into the client database seems like a lot of overhead.
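For that vendor-independent layer, what I currently picture is something like the following (again a purely hypothetical sketch; `ArticleDto`, `Article`, and `ArticleDao` are made-up names): the wire format is mapped into a neutral domain model, and only a small DAO interface knows which database the client actually uses.

```kotlin
// Hypothetical sketch of the mapping layer between transport format and client database.
data class ArticleDto(val id: Long, val title: String, val updated_at: Long)  // shape of the JSON from the API
data class Article(val id: Long, val title: String, val updatedAt: Long)      // vendor-neutral domain model

fun ArticleDto.toDomain() = Article(id = id, title = title, updatedAt = updated_at)

interface ArticleDao {                 // the only place that knows the client DB vendor
    fun upsert(articles: List<Article>)
    fun loadAll(): List<Article>
}

fun store(dtos: List<ArticleDto>, dao: ArticleDao) {
    dao.upsert(dtos.map { it.toDomain() })  // JSON -> domain model -> client database
}
```

This keeps both databases decoupled, but it is exactly the JSON-to-model-to-database round trip that feels like overhead to me.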
Any suggestions on how to solve this in an elegant and maintainable way?