Saturday, February 16, 2008

Transferring large DataSets with WCF

I was working on an application which transferred a large amount of data in a DataSet object to a client. Everything worked fine for small amounts of data, but as soon as the DataSet grew beyond about 100 rows, the client no longer received the message sent by the server!

It took some time to figure out what the problem was. If you're transferring a large amount of data and running into the same problem, you may need to do two things:

  • You should set the MaxReceivedMessageSize property on your binding to a larger value. If the message sent by the client/server exceeds this value, the other side will not receive it. Use this value cautiously: setting it to a value larger than the data you can load into RAM could crash the application, and a very large limit might also expose your server to denial-of-service attacks.

  • You might also need to set the MaxArrayLength property of the ReaderQuotas object on your client/server binding, especially if you receive serialization exceptions when the data is being serialized to be sent to the client/server. Remember to add a reference to System.Runtime.Serialization if you need to do this.
Here's how you can do this in code:

// Both the client and the server bindings must use matching limits.
NetTcpBinding binding = new NetTcpBinding(SecurityMode.None, true);
binding.MaxReceivedMessageSize = 10485760;      // 10 MB
binding.ReaderQuotas.MaxArrayLength = 10485760; // 10 MB
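If you define the binding in your app.config instead of in code, the equivalent settings look roughly like this (the binding name here is illustrative, not from the original application):

```xml
<system.serviceModel>
  <bindings>
    <netTcpBinding>
      <!-- 10485760 bytes = 10 MB -->
      <binding name="largeDataBinding"
               maxReceivedMessageSize="10485760">
        <security mode="None" />
        <reliableSession enabled="true" />
        <readerQuotas maxArrayLength="10485760" />
      </binding>
    </netTcpBinding>
  </bindings>
</system.serviceModel>
```

Reference the binding by name from your endpoint's bindingConfiguration attribute on both sides.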

Also, if you're transferring large binary objects such as files, audio, or video data, you may want to consider WCF's streamed transfer mode. In this mode, instead of buffering the entire message in memory, the data is read and sent in small chunks, so only a small portion is held in memory at any time. For more info, see the MSDN documentation on large data and streaming in WCF.
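A minimal sketch of what streamed transfer can look like over a NetTcpBinding (the contract name and operation are hypothetical, not from the original application; requires using System.IO and System.ServiceModel):

```csharp
// Hypothetical contract: the operation returns the payload as a Stream
// rather than a fully buffered object like a DataSet.
[ServiceContract]
public interface IDataStreamService
{
    [OperationContract]
    Stream GetLargeData();
}

// Switch the binding to streamed transfer; WCF then reads and sends
// the body in small chunks instead of buffering the whole message.
NetTcpBinding binding = new NetTcpBinding(SecurityMode.None);
binding.TransferMode = TransferMode.Streamed;

// MaxReceivedMessageSize still limits the total stream size,
// so raise it if you expect very large payloads.
binding.MaxReceivedMessageSize = long.MaxValue;
```

Note that streamed contracts are restricted: the operation must take or return a single Stream (or a Message with a streamed body), so this mode suits raw binary transfer better than typed DataSet exchange.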
