Sync Framework with SQL DBs: Getting Started

I have an app that uses SQL Enterprise to store all of its data in 4 different DBs. I needed to build in the ability for my users to work "offline", and I accomplished this through merge replication to local SQL Express installs for everyone. This "works", but it feels like the sledgehammer approach.

For instance, I am replicating all 14,000 people to every DB when any individual user may only EVER interact with 100 or so. That's not even counting the fact that they would NEVER interact with more than 5 or so between connections to the central DBs.

What I am looking for is tips, pointers, and maybe a good tutorial on Sync Framework 2 (with databases). First-hand accounts of what worked for you and why would also be most welcome. I have yet to come across a clear, concise (not to mention current) tutorial for working with Sync Framework.

My specifics are MS SQL Server 2005 or 2008, any edition, and any .NET version (3.5 or 4). The current data layer is all LINQ to SQL. There are no stored procedures currently in use.

My thoughts, so far, are to sync only each worker's assigned caseload and associated data. Ideally, we would go straight to a "check in/check out" format where they select the members they plan to go visit and it then syncs the necessary data.

As a bonus, could someone tell me what this is referred to as? I come across "Occasionally Connected" all the time but that seems inaccurate. It would be more accurate to call them "Occasionally DIS-Connected", thoughts?

Czarevna answered 9/8, 2010 at 20:12 Comment(3)
Please let me know if this needs more detail, re-wording, or better focus. This is very important to me.Czarevna
I am very grateful to all those who have taken the time to respond. I am disappointed in myself for not wording my question better as the answers, so far, do not really address my concern. I will Edit one more time to try and help this.Czarevna
I suppose you probably have solved this problem by now, but if not I can suggest a couple of ways of using sync framework 2.0 or 2.1 together with sql 2005 or higher to address this. I've implemented something similar for an occasionally connected .NET 4 app for 400 users.Progeny

We've been using the Sync Framework in a couple of our projects (one with SQL Server, one with PostgreSQL), so I can say that it works fairly well.

Check out this walkthrough application to get an idea of what you can do.

http://code.msdn.microsoft.com/sync/Release/ProjectReleases.aspx?ReleaseId=4835

This will show you how to synchronize data between multiple SQL Server databases. You can also customize the 'increments' stored procedures to take custom parameters and filter data on them (for instance, the people the client user is planning to visit).
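If you go that route, here is a minimal sketch of how a filtered "incremental inserts" command might be wired into a SyncAdapter (the offline-provider model that walkthrough uses). The stored procedure name, the People table, and the @WorkerId parameter are illustrative assumptions, not part of the sample:

    // Sketch only: a custom, worker-filtered "incremental inserts" command for one table.
    // usp_People_GetIncrementalInserts and @WorkerId are made-up names for illustration.
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.Synchronization.Data;          // SyncSession, SyncParameter
    using Microsoft.Synchronization.Data.Server;   // SyncAdapter, DbServerSyncProvider

    static class CaseloadSyncSetup
    {
        public static SyncAdapter BuildPeopleAdapter()
        {
            var adapter = new SyncAdapter("People");

            var selectInserts = new SqlCommand
            {
                CommandType = CommandType.StoredProcedure,
                CommandText = "usp_People_GetIncrementalInserts"
            };
            // Anchor parameters are filled in by the sync runtime on every run.
            selectInserts.Parameters.Add("@" + SyncSession.SyncLastReceivedAnchor, SqlDbType.Timestamp);
            selectInserts.Parameters.Add("@" + SyncSession.SyncNewReceivedAnchor, SqlDbType.Timestamp);
            // The custom filter: only rows belonging to this worker's caseload.
            selectInserts.Parameters.Add("@WorkerId", SqlDbType.Int);

            adapter.SelectIncrementalInsertsCommand = selectInserts;
            // SelectIncrementalUpdatesCommand / SelectIncrementalDeletesCommand are wired up the same way.
            return adapter;
        }
    }

    // At sync time the agent supplies the value to every command that declares @WorkerId:
    // agent.Configuration.SyncParameters.Add(new SyncParameter("@WorkerId", currentWorkerId));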

I would also suggest that you use a decompiler on the Sync Framework assemblies in case you see weird errors - sometimes it is not possible to figure out where the error is until you see the exception being caught in the framework code. Red Gate's Reflector works perfectly for me!

Let me know if you need more help!

Kletter answered 19/8, 2010 at 11:49 Comment(2)
BTW - the example given is fairly detailed and shows how the sync works with only two tables in three different databases (one server and two clients). You will have to handle scenarios of client inserts creating conflicts on primary keys - we did this by using widely separated key ranges on each of the clients.Kletter
Note: you may not be able to pass individual people directly, but you will be able to specify filtering criteria (location? account?) in the increments SP.Kletter

I have a couple of suggestions and hints that may or may not be obvious / helpful.

To me this sounds like a problem that can be split into 3 separate aspects:

Synchronisation process

  • You should definitely ensure that all rows have some sort of "last_update" column so that your synchronisation process can efficiently and reliably determine which data is already up to date - that way you can be a lot more aggressive about the number of records you synchronise (there's a minimal sketch of this after the list).
  • I would avoid a complex synchronisation process and simply use the brute-force approach wherever possible. 14,000 records doesn't sound like that many to me - with optimisations you may find that it's possible to synchronise all changes made between connections in a reasonable time. If not, I'd still probably be quite liberal about what you synchronise, to avoid users working with out-of-date data without realising it.
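A minimal sketch of the "last_update" idea, with made-up table and column names; the key point is that the client only pulls rows stamped after its last successful sync:

    // Pull only rows changed since the last successful sync (illustrative names).
    using System;
    using System.Data.SqlClient;

    class IncrementalPull
    {
        public static void PullChangedPeople(string connectionString, DateTime lastSyncUtc)
        {
            const string sql =
                @"SELECT PersonId, FirstName, LastName, LastUpdatedUtc
                  FROM dbo.People
                  WHERE LastUpdatedUtc > @lastSync";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@lastSync", lastSyncUtc);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Upsert each row into the local SQL Express / Compact store here.
                    }
                }
            }
            // Persist the new high-water mark (e.g. the MAX(LastUpdatedUtc) just read)
            // only after the local writes have committed.
        }
    }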

Pushing changes made in offline mode

If at all possible I would simply disallow changes in offline mode - if that's not possible, then you should treat pushing changes as a separate (and probably fairly complex) process in its own right.

Without knowing more about the application it's difficult to make good suggestions, as the synchronisation process is very business-dependent; however, some things to consider are:

  • Should users be able to lock (or check out) an item to prevent other people from changing it while they are working offline? (A minimal sketch of such a lock follows this list.)
  • Should the lock be overridable?
  • If a lock is overridden, what should happen when someone attempts to save changes? (A merge process seems like a sensible choice.)
  • Should someone be able to edit an item that they don't have checked out?
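As a rough illustration of the check-out questions above (the table, columns and override rule are all assumptions, not a recommendation), the lock itself can be as simple as a guarded UPDATE:

    // Claim (or force) a check-out with a single guarded UPDATE; 0 rows affected
    // means another worker already holds it. Table/column names are illustrative.
    using System.Data.SqlClient;

    class CaseCheckOut
    {
        public static bool TryCheckOut(string connectionString, int caseId, int workerId, bool overrideExisting)
        {
            string sql = overrideExisting
                ? @"UPDATE dbo.Cases
                       SET CheckedOutBy = @worker, CheckedOutAtUtc = GETUTCDATE()
                     WHERE CaseId = @case"                               // forced override
                : @"UPDATE dbo.Cases
                       SET CheckedOutBy = @worker, CheckedOutAtUtc = GETUTCDATE()
                     WHERE CaseId = @case AND CheckedOutBy IS NULL";     // only if currently free

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@case", caseId);
                cmd.Parameters.AddWithValue("@worker", workerId);
                conn.Open();
                return cmd.ExecuteNonQuery() == 1;
            }
        }
    }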

If you do decide to implement a complex offline changes process then you might want to take a look at the workflow used in common distributed VCS for inspiration.

Changing the offline data store

You might find using SQL Compact or SQLite as your local data store a more elegant solution (it would certainly make the installation process easier). If you are using LINQ then I'd probably swing in favour of SQL Compact, as it definitely has LINQ to SQL support.
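For what it's worth, LINQ to SQL will talk to a local Compact (.sdf) file through a SqlCeConnection. A self-contained sketch, where the entity, file name and schema are made up for illustration:

    // Querying a local SQL Server Compact file with LINQ to SQL.
    using System;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Data.SqlServerCe;
    using System.Linq;

    // Illustrative entity; in practice the mapping would come from the existing LINQ to SQL model.
    [Table(Name = "Cases")]
    class Case
    {
        [Column(IsPrimaryKey = true)] public int CaseId;
        [Column] public int AssignedWorkerId;
        [Column] public string Title;
    }

    class LocalStoreExample
    {
        public static void ListLocalCases(int workerId)
        {
            // |DataDirectory| resolves to the application's data folder; the .sdf name is an assumption.
            using (var conn = new SqlCeConnection(@"Data Source=|DataDirectory|\CaseData.sdf"))
            using (var db = new DataContext(conn))
            {
                var myCases = from c in db.GetTable<Case>()
                              where c.AssignedWorkerId == workerId
                              select c;

                foreach (var c in myCases)
                {
                    Console.WriteLine("{0}: {1}", c.CaseId, c.Title);
                }
            }
        }
    }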

I would focus on the above two aspects first: this last change delivers the least in terms of end-user improvements and is probably the most work, and the first two are completely achievable while still using SQL Server Express as the local data store.

Ozzy answered 16/8, 2010 at 6:28 Comment(0)

http://code.msdn.microsoft.com/sync/Release/ProjectReleases.aspx?ReleaseId=3422 This is a nice walk-through using SQL Server CE. It should work great for what you are trying to do and has a much smaller footprint. It seems to me that the functionality you lose will not hamper what you are trying to do in the least.

Curative answered 20/8, 2010 at 12:36 Comment(0)

This is essentially called Distributed Computing.

In any such system that I've worked on, I've typically used an N-tier architecture that allows the client program to connect occasionally to the database server on the back end for CRUD operations. Any work that the client does is considered connectionless, as they do not interact with the server until they save their changes.

Using this kind of approach, you should be able to build an application that allows them to connect for a moment to get some data, perform some work on the data, and then propagate any changes (CRUD) back to the server (database).
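As a very rough sketch of that flow (every type here is a hypothetical stand-in for whatever middle tier you put between the client and the database):

    // Work offline against local copies, queue edits, and push them once a connection exists.
    using System.Collections.Generic;

    enum ChangeKind { Insert, Update, Delete }

    class PendingChange
    {
        public ChangeKind Kind;
        public int CaseId;
        public string SerializedRow;   // the edited row, captured while offline
    }

    class CaseDto
    {
        public int CaseId;
        public string SerializedRow;
    }

    interface ICaseService
    {
        IList<CaseDto> PullCaseload(int workerId);        // brief connection: fetch assigned cases
        void PushChanges(IList<PendingChange> changes);   // brief connection: persist offline work
    }

    class OfflineSession
    {
        private readonly List<PendingChange> _pending = new List<PendingChange>();

        // Called by the UI for every edit made while disconnected.
        public void RecordLocalEdit(PendingChange change)
        {
            _pending.Add(change);
        }

        // Called once the client can reach the server again.
        public void Reconnect(ICaseService service)
        {
            service.PushChanges(_pending);
            _pending.Clear();
        }
    }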

I would have to say that using merge replication with local SQL Express installs is for sure a sledgehammer approach. Don't feel too bad about it; we all do it from time to time :P.

EDIT

I have not used Sync Framework myself, but it seems nice. Check out the Sync Framework Developer Center for some more information on it.

Cystectomy answered 11/8, 2010 at 15:57 Comment(0)

It sounds to me like Microsoft Sync may be a good option for you. There is an overview of synchronizing databases here: http://msdn.microsoft.com/en-us/sync/bb887608.aspx. Take a look at it and see if it meets your needs; it sounds like the perfect solution to your problem.

Sethsethi answered 11/8, 2010 at 16:3 Comment(1)
Thanks, I have seen that. I was just hoping for something more hands on. Thanks for your time.Czarevna

The question is: does your app need to run on the device with locally stored, disconnected data, or can you create a mobile web front end on the assumption that users will always be connected to a server when working with the data?

The projects I worked on (a long time ago) used the Compact Framework and ActiveSync with either an Access or SQL Server back end, because it was too expensive to have each device connected all the time like a phone. Workers took the device into the field, accessed the data from the disconnected DB, made their changes, and synced when they got back to the office.

If that's what you're doing, you can use their deviceID to identify which cases/rows each device needs, but if you're thinking of developing a custom sync solution, I think you're fighting the wrong battle. That's been done already. Use a proven sync solution, limitations and all, and focus on criteria you can use to limit the synced data.

Naked answered 12/8, 2010 at 17:38 Comment(2)
Thank you for the response. My users will be connected, in the office, most of the time and will then take their laptops out into the field. They will, usually, know ahead of time what data they will need. It needs to be persistent data, so a local DB. They will not have a connection while in the field.Czarevna
I'm surprised they don't want tablets. Most people taking laptops into the field find them cumbersome. Again, I'd use some freely-available sync solution, even though it will probably mean you're syncing more data than they actually need in the field. Most sync solutions are table and field level, not row level.Naked

It seems that you need more control over the data-merging options. With this requirement I can see there are 2 types of data: data that belongs to a member (a case belongs to a member), and shared data (reference/master data).

I would use a mixture of techniques in this case to keep things reasonably fast.

For instance, master and reference data changes are not that frequent. Therefore we need not sync very often, but we do need control over those changes, and they should/must be made in a central location. In this case, I would make the changes on the central DB and use the sledgehammer approach.

For transactional data, assuming it is one-directional (in this case a member can only change cases/rows that belong to him/her), there are many ways to implement the sync.

As mentioned above, you can use the Sync Framework (see the Sync Framework Developer Center) or ActiveSync.

Since the constraints are not as strict as on PDAs or palmtops (these are laptops), we can also try a manual approach: implement a routine that connects and pushes data that has been created, updated, or deleted (which means maintaining a flag at the DB level) using bulk copy operations - a common approach in this scenario. You would need to provide a button for members to click to start the push, or a service that polls for connectivity and kicks off the sync routine automatically.
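A rough sketch of that push routine, assuming an IsDirty flag on the local rows and a staging table on the central server (all names invented):

    // Rows flagged as dirty in the local DB are bulk-copied to a staging table on the
    // central server, then the flags are cleared. Table and column names are illustrative.
    using System.Data;
    using System.Data.SqlClient;

    class ManualPush
    {
        public static void PushDirtyCases(string localConnStr, string centralConnStr)
        {
            using (var local = new SqlConnection(localConnStr))
            using (var central = new SqlConnection(centralConnStr))
            {
                local.Open();
                central.Open();

                // 1. Read everything the worker touched since the last push.
                var dirty = new DataTable();
                using (var da = new SqlDataAdapter(
                    "SELECT * FROM dbo.Cases WHERE IsDirty = 1", local))
                {
                    da.Fill(dirty);
                }

                // 2. Bulk-copy into a staging table; a server-side job or SP then merges
                //    it into the real tables and resolves any conflicts.
                using (var bulk = new SqlBulkCopy(central))
                {
                    bulk.DestinationTableName = "dbo.Cases_Staging";
                    bulk.WriteToServer(dirty);
                }

                // 3. Only clear the flags once the upload has succeeded.
                using (var clear = new SqlCommand(
                    "UPDATE dbo.Cases SET IsDirty = 0 WHERE IsDirty = 1", local))
                {
                    clear.ExecuteNonQuery();
                }
            }
        }
    }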

/KP

Frontlet answered 15/8, 2010 at 16:0 Comment(0)

After many days of struggling with the Sync Framework I'm considering ditching it and just writing some simple WCF services and code. My requirements are fairly simple: one-way synchronisation from SQL 2008 to SQL CE on mobile devices. In my experience it's very difficult to customise (to sync only some fields, etc.) and extremely slow and inefficient. I think Microsoft needs to do some more work to make it a bit easier to use.
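For anyone weighing the same trade-off, the hand-rolled alternative can be as small as a single download contract. This is only a sketch, and the contract and ChangedRow type are made-up names, not an existing library:

    // The device asks for everything changed since its last pull and writes it into SQL CE itself.
    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    [DataContract]
    public class ChangedRow
    {
        [DataMember] public string TableName;
        [DataMember] public string Operation;     // "I", "U" or "D"
        [DataMember] public string SerializedRow; // e.g. XML of the row
    }

    [ServiceContract]
    public interface IOneWaySyncService
    {
        // One-way download: server -> device. The device remembers the returned watermark.
        [OperationContract]
        List<ChangedRow> GetChangesSince(DateTime lastPullUtc, int workerId);
    }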

Cheers

Mark

Unitary answered 19/8, 2010 at 0:8 Comment(0)

Have you attempted to filter your merge replication articles at the row level yet? That would reduce your 14,000-person data set to something manageable.
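For reference, a parameterized row filter on a merge article can look something like the sketch below, assuming the publication is set up to allow parameterized filters; the publication, article and column names here are made up. SUSER_SNAME() is evaluated per subscriber, so each laptop only receives its own rows:

    // Run once against the publication database to add a row-filtered merge article.
    using System.Data.SqlClient;

    class AddMergeFilter
    {
        public static void FilterPeopleByLogin(string publicationDbConnStr)
        {
            const string sql = @"
                EXEC sp_addmergearticle
                     @publication         = N'CaseloadPub',
                     @article             = N'People',
                     @source_object       = N'People',
                     @type                = N'table',
                     -- Each subscriber only gets rows whose AssignedLogin matches its login.
                     @subset_filterclause = N'[AssignedLogin] = SUSER_SNAME()';";

            using (var conn = new SqlConnection(publicationDbConnStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }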

Shabuoth answered 19/8, 2010 at 14:54 Comment(0)
