How to Improve WCF Data Services Performance

I'm new to WCF Data Services, so I've been playing around. After some initial tests, I'm disappointed by the performance of my test data service.

I realize that because a WCF DS is HTTP-based there is overhead inherent in the protocol, but my tests are still way slower than I would expect:

Environment:

  • All on one box: Quad core 64-bit laptop with 4GB RAM running W7. Decent machine.
  • Small SQL database (SQLExpress 2008 R2) with 16 tables... the table under test has 243 rows.
  • Hosted my test service in IIS with all defaults.

Code:

  • I've created an Entity Framework model (DataContext) for this database (stock codegen by VS2010).
  • I've created a data service based on this model.
  • I've created a client which has a direct service reference (ObjectContext) for this service (stock codegen by VS2010).
  • In the client I am also able to call the EF model directly and also use Native SQL (ADO.NET SqlConnection)

Test Plan:

  • Each iteration connects to the database (there is an option to reuse connections), queries for all rows in the target table ("EVENTS"), and then counts them (thus forcing any deferred fetches to be performed).
  • Run for 25 iterations each for Native SQL (SqlConnection/SqlCommand), Entity Framework (DataContext) and WCF Data Services (ObjectContext).

Results:

  • 25 iterations of Native SQL: 436ms
  • 25 iterations of Entity Framework: 656ms
  • 25 iterations of WCF Data Services: 12110ms

Ouch. That's about 20x slower than EF.

Since WCF Data Services runs over HTTP, the client has to issue a new HTTP request to the web server for each iteration, so there's some connection overhead. But surely there's more going on here than that.

EF itself is fairly fast, and the same EF code/model is reused for both the service and the direct-to-EF client tests. There's going to be some overhead for XML serialization and deserialization in the data service, but that much!?! I've had good performance with XML serialization in the past.

I'm going to run some tests with JSON and Protocol Buffers encodings to see if I can get better performance, but I'm curious whether the community has any advice for speeding this up.

I'm not strong with IIS, so perhaps there are some IIS tweaks (caches, connection pools, etc.) that can be set to improve this?
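For reference, the timing loop described in the test plan can be sketched roughly like this; EventsEntities, the EVENTS entity set, and the service URL are placeholder names standing in for the VS2010-generated client types, not the actual ones:

```csharp
// Hypothetical benchmark harness matching the test plan above.
// EventsEntities / EVENTS / the service URL are placeholders for
// the generated ObjectContext, entity set, and endpoint.
using System;
using System.Diagnostics;
using System.Linq;

class Benchmark
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 25; i++)
        {
            // New context per iteration: forces a fresh connection each time.
            var ctx = new EventsEntities(new Uri("http://localhost/EventsService.svc"));
            // Enumerating the query forces the deferred fetch to actually run.
            int count = ctx.EVENTS.ToList().Count;
        }
        sw.Stop();
        Console.WriteLine("25 iterations: {0}ms", sw.ElapsedMilliseconds);
    }
}
```

The same loop, with the body swapped for a SqlCommand or a direct EF query, gives the other two measurements.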

Harwilll answered 12/10, 2010 at 16:38 Comment(9)
Interesting... a number of views, a few up-votes, and a couple of favorite adds, but no answers. I'm opening a bounty on this to breathe some more life into this question. Hopefully someone has an answer. Harwilll
I wouldn't use WCF Data Services unless I was planning to expose my data to other applications. If everything is running on the same box why not just use EF directly?Numbers
It's not running on the same box. But the data sources are all on a (very large and international) corporate network. I'm trying to put a service layer in front of a bunch of different data sources (SQL, XML, flat files, etc) that isolates the actual storage semantics from the ability to discover and query the data.Harwilll
How are you testing this? I once tested a web service with IE, and the browser overhead for showing the data was 80% or so. Kahn
How much data are you using for this test? It could be throughput. Bettinabettine
@Barfieldmv - I'm not using IE... just a standard console app client where I connect and run the query within a StopWatch block. Harwilll
@Bettinabettine - Not much data, just a few hundred KB (less than a MB). And this is all local, so the network is not as much of a factor. Harwilll
A few 100 KB XMLs add up to a large amount of data. Bratton
Can you give some feedback on these? Did you finally drop WCF DS for a conventional WCF service with more tunable settings? Or maybe you succeeded in optimizing things enough to keep it? I'm really interested in your experience with this. Thanks! Patten

Consider deploying as a Windows service instead. IIS may have ISAPI filters, rewrite rules, etc. that it runs requests through; even if none of these are active, the IIS pipeline is long enough that something may slow you down marginally.

A self-hosted service should give you a good baseline of how long it takes a request to run, be packed, etc., without the IIS slowdowns.
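A minimal self-hosting sketch, using a console app as a stand-in for the Windows service wrapper; EventsDataService is a placeholder for the DataService&lt;T&gt;-derived class:

```csharp
// Self-hosting a WCF Data Service outside IIS. EventsDataService is a
// placeholder name for your DataService<T>-derived service class.
using System;
using System.Data.Services;

class Program
{
    static void Main()
    {
        var baseAddress = new Uri("http://localhost:8080/EventsService");
        using (var host = new DataServiceHost(typeof(EventsDataService),
                                              new[] { baseAddress }))
        {
            host.Open();
            Console.WriteLine("Listening at {0}; press Enter to stop.", baseAddress);
            Console.ReadLine();
        }
    }
}
```

In a real Windows service, Open() would go in OnStart and Close() in OnStop.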

Glauce answered 8/11, 2010 at 23:21 Comment(1)
That's a good idea. I'll give this a shot also when I get back to this part of the project. Harwilll

The link below has a video with some interesting WCF benchmarks and comparisons between WCF Data Services and Entity Framework.

http://www.relationalis.com/articles/2011/4/10/wcf-data-services-overhead-performance.html

Exo answered 11/4, 2011 at 15:20 Comment(0)

I increased the performance of our WCF Data Services API by 41% simply by enabling compression. It was really easy to do. Follow this link that explains what to do on your IIS server: Enabling dynamic compression (gzip, deflate) for WCF Data Feeds, OData and other custom services in IIS7

Don't forget to run iisreset after your change!

On the client-side:

// This is your context basically, you should have this code throughout your app.
var context = new YourEntities("YourServiceURL");
context.SendingRequest2 += SendingRequest2;

// Add the following method somewhere in a static utility library
public static void SendingRequest2(object sender, SendingRequest2EventArgs e)
{
    var request = ((HttpWebRequestMessage)e.RequestMessage).HttpWebRequest;
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
}
Number answered 24/7, 2015 at 0:25 Comment(0)

Try setting security to "none" in the binding section of the configuration. This should make a big improvement.
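A sketch of what that might look like in web.config, assuming a webHttpBinding endpoint (the binding name here is made up):

```xml
<!-- Hypothetical binding config; "NoSecurity" is a placeholder name. -->
<system.serviceModel>
  <bindings>
    <webHttpBinding>
      <binding name="NoSecurity">
        <security mode="None" />
      </binding>
    </webHttpBinding>
  </bindings>
</system.serviceModel>
```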

Pointtopoint answered 31/10, 2010 at 8:4 Comment(1)
Good idea... I'll give it a try. Of course, I still need security in my final app. But I'm curious what the overhead is. Harwilll

In order to eliminate most of the connection overhead, you can try to batch all operations to the WCF DS to see if that makes a significant difference.

NorthwindEntities context = new NorthwindEntities(svcUri);
var batchRequests = 
     new DataServiceRequest[]{someCustomerQuery, someProductsQuery};

var batchResponse = context.ExecuteBatch(batchRequests);

For more info see here.

Thyrsus answered 26/10, 2010 at 1:3 Comment(1)
Batching is only possible when you know ahead of time what operations you are performing. In my case I need to react to the data, so unfortunately it's a cycle of query->process->display->query->process->display. Harwilll

How do you run those 25 iterations against WCF?

var WCFobj = new ...Service();
foreach (var call in CallList)
{
    WCFobj.Call(...);
}

If you call it like that, you are calling the WCF service 25 separate times, which consumes too many resources.

What I have done instead is pack everything into a DataTable, use the table name to identify the stored procedure to call, and put the parameters in the DataRows. When calling, just pass the DataTable in encoded form:

var table = new DataTable("PROC_CALLING");
// ... add a DataRow of parameters for each call ...
var sb = new StringBuilder();
using (var xml = System.Xml.XmlWriter.Create(sb))
{
    table.WriteXml(xml);
}
var bytes = System.Text.Encoding.UTF8.GetBytes(sb.ToString());
// optionally GZip-compress the bytes here
WCFobj.Call(bytes);

The point is that you pass all 25 calls at once, which can improve performance significantly. If the return objects have the same structure, return them as a DataTable in byte form as well and convert it back to a DataTable on the client.

I have implemented this approach with GZip for import/export data modules. Passing a large number of bytes makes WCF unhappy, so it depends on which you want to spend: computing resources or network resources.

Lamppost answered 31/10, 2010 at 6:29 Comment(1)
This is specifically for a WCF Data Service... the problem is very solvable for a standard WCF service, where there is better control of the binding and message encoding. Harwilll

WCF Data Services exists to provide your disparate clients with the Open Data (OData) protocol, so that you don't have to write/refactor multiple web service methods for each change request. I never advise using it if the entire system is based on the Microsoft technology stack; it's meant for remote clients.

Pennate answered 15/5, 2013 at 13:33 Comment(0)

Things to try:

1) Results encoding: use binary encoding on your WCF channel if possible (see http://msdn.microsoft.com/en-us/magazine/ee294456.aspx), or alternately use compression: http://programmerpayback.com/2009/02/18/speed-up-your-app-by-compressing-wcf-service-responses/

2) Change your service instance behavior (see http://msdn.microsoft.com/en-us/magazine/cc163590.aspx#S6): try InstanceContextMode = InstanceContextMode.Single with ConcurrencyMode = ConcurrencyMode.Multiple, if you can verify that your service is built in a thread-safe way.
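For a plain WCF service, item 2 would look roughly like this (the service and interface names are placeholders; this is only safe if the implementation is thread-safe):

```csharp
// Single instance serving concurrent calls; any shared state (e.g. a
// cached DB connection) must be guarded, since multiple threads enter
// the same instance at once. EventsService/IEventsService are made up.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class EventsService : IEventsService
{
    // ...
}
```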

Regarding your benchmark, I think you should simulate a more realistic load (including concurrent users) and ignore outliers; the first request to IIS will be really slow (it has to load all the DLLs).

Bratton answered 8/11, 2010 at 20:31 Comment(4)
It sounds like you're giving general WCF guidance rather than WCF Data Services-specific guidance. It's not possible (that I know of) to change the encoding method for WCF Data Services... they must be HTTP endpoints with text payloads... the serialization format can be either XML or JSON, but it must be text. So compression and binary encoding are not possible (I wish they were). Harwilll
I'll look at the instancing and concurrency. Best practice for WCF states that PerCall instancing should be used, but a single instance with multiple concurrency might work as well. I'll give it a shot. Harwilll
As for the benchmark, realism is less of an issue than consistency between operations across the different providers. I'm already discarding the outliers... the initialization run takes about 4-5 seconds all on its own. I can live with that if it's only once. Harwilll
Compression works with text replies; obviously binary encoding is out. All the XML data you are talking about is going to take up lots of time. PerCall means you have to instantiate your DB connections every time; Single means they are cached (again, thread safety will be an issue). Bratton
