SQLCommand ExecuteNonQuery Maximum CommandText Length?
I've searched around the internet and everything seems to be about individual fields or doing a single insert. I have a migration tool that migrates from an old legacy database (SuperBase) to our SQL Server 2008 database. Currently I'm reading 20,000 records from the old database and generating one big SqlCommand.CommandText string with 20,000 INSERT statements delimited by semicolons. This works fine. But can I do 25k? 30k? I tried not having any limit at all, but when I ran ExecuteNonQuery with a CommandText containing over 4 million INSERT statements, it failed with an error about the command being too long (I don't recall the exact message, sorry). I can't find any documentation on the exact limitations. It's important for me to find out, because the more inserts I can fire at a time, the faster the whole process. Any advice / links / documentation would be greatly appreciated, thanks!
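For reference, the batching logic being described can be sketched independently of the tool itself. Below is a minimal Python sketch (the original tool is .NET/SqlCommand; the `make_batches` helper and the 64 KB budget here are illustrative assumptions, not the real server limit) that groups individual INSERT statements into semicolon-delimited batches under a byte cap:

```python
def make_batches(statements, max_bytes):
    """Group INSERT statements into semicolon-delimited batches
    whose total text length stays under max_bytes.
    Illustrative sketch; real batch limits depend on the server."""
    batch, size = [], 0
    for stmt in statements:
        stmt_len = len(stmt) + 1  # +1 for the ';' separator
        if batch and size + stmt_len > max_bytes:
            yield ";".join(batch)
            batch, size = [], 0
        batch.append(stmt)
        size += stmt_len
    if batch:
        yield ";".join(batch)

# Usage: each yielded string would become one CommandText /
# ExecuteNonQuery call in the migration tool.
inserts = [f"INSERT INTO t (id) VALUES ({i})" for i in range(20000)]
batches = list(make_batches(inserts, max_bytes=64 * 1024))
```

Each batch is one round trip, so raising `max_bytes` reduces the number of ExecuteNonQuery calls, up to whatever limit the server actually enforces.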

Deshawndesi answered 27/1, 2014 at 20:4 Comment(8)
Your limitations are the .NET string size (which is int.MaxValue... you will run out of memory before hitting that) and the maximum allowable query size for SQL... whatever that is. – Winterkill
I figured as much, but finding the actual max query size is where I'm having trouble. – Deshawndesi
I believe it is 64K * packet size. However, instead of 20,000 individual INSERT statements, have you considered (a) breaking it up into smaller batches, (b) using VALUES() to break it up into, say, 20 INSERT statements, or (c) using SSIS or some other tool that is much better at generating bulk operations than 20,000 individual INSERT statements will ever be? – Gallant
Also, why do you need to "recall" the exact error message? Can't you reproduce the problem, then copy and paste the error message, without having to invoke your own memory at all? Not that you should ever be sending 4 million INSERT statements in any scenario, regardless of issues with string lengths in the application language. – Gallant
Breaking it up into smaller batches was the original problem. The more inserts that can be done in one ExecuteNonQuery, the faster the migration. I can reproduce the problem, but it would take 15 hours or so. It's a long story. Getting data out of SuperBase requires using their ODBC driver, which is painfully slow. We're talking like 5 minutes to gather 1,000 records, depending on the query. I don't know how I'd use SSIS or any tool, since this DB is 20 years old and not supported by anything, but I'll look into it further. – Deshawndesi
So SuperBase doesn't have a way to extract this data in any type of bulk fashion? If not, then perhaps you should plan to have a significant window to perform this migration correctly, instead of worrying about doing it as fast as possible... – Gallant
Correct. And I would agree (do it right rather than fast), but that's not an option given my requirements for the project. – Deshawndesi
The old adage: never time to do it right, never time to fix it. (:)->-< – Draughtboard

It depends on the SQL Server version.

http://technet.microsoft.com/en-us/library/ms143432.aspx

For SQL Server 2012, the maximum batch size (the SQL statement string) is 65,536 * the network packet size.

Network Packet Size is the size of the tabular data stream (TDS) packets used to communicate between applications and the relational Database Engine. The default packet size is 4 KB, and is controlled by the network packet size configuration option.

In short, 65,536 * 4,096 bytes = 268,435,456 bytes, i.e. 256 MB. Again, this assumes you are using the default packet size.
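A quick sanity check of that arithmetic (assuming the default 4 KB packet size; substitute your server's configured value if it differs):

```python
PACKET_SIZE = 4096                 # default network packet size, in bytes
MAX_BATCH = 65536 * PACKET_SIZE    # batch-size formula from the docs
print(MAX_BATCH)                   # 268435456 bytes
print(MAX_BATCH / (1024 * 1024))   # 256.0 MB
```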

Draughtboard answered 27/1, 2014 at 20:18 Comment(14)
This is SQL Server 2008. How do I determine whether or not default packet sizes are being used? – Deshawndesi
technet.microsoft.com/en-us/library/ms177437(v=sql.100).aspx Here is how you do it in SQL Server 2008. What size do you have? – Draughtboard
4096. Thank you!! So now I just have to make sure my CommandText string is 256 MB or less. Correct? – Deshawndesi
Exactly. Please give me credit for the answer. Happy programming! – Draughtboard
I will test this and, once successful, gladly give you credit. I appreciate it. – Deshawndesi
PS – Aaron is right about 4 million INSERT statements. I used to use TOAD to dump data to inserts. However, if you have a chance in the future, rewrite this process! – Draughtboard
Not familiar with TOAD (I'm not a DBA – clearly). I'll investigate what TOAD means, but other than that, I don't know how I could rewrite the process other than firing off 256 MB commands at a time. – Deshawndesi
Toad is no better. It will generate INSERTs for recordset data. I have one application written in PowerBuilder and SQL Server 7.0. Cannot get rid of it! I understand the problem, but the business is not willing to spend $. – Draughtboard
I'm getting an out-of-memory exception trying to set the CommandText property with a string that's only 221K in length, if int.MaxValue is over 2 billion. Does it have to do with memory limitations on the hardware? I'm 45 MB under the 256 MB batch size. – Deshawndesi
Okay, I am not familiar with SuperBase. If it is an ODBC driver, SSIS can use it. technet.microsoft.com/en-us/library/hh758693.aspx – Draughtboard
Is there any way you can partition the data? For instance, I found that if I kicked off one data warehouse job, it took 10 hours to run, but running 10 jobs, each taking 1/10 of the data, took only 1 hour. Something like this depends on network speed, CPU power, and disk speed. Is SuperBase a single-threaded application? – Draughtboard
It is ODBC, but SSIS doesn't work because SuperBase doesn't have normal schema data names. You can get the "tables" schema, but there's no "table_name" column, which it's expecting. It's "tablename" without the underscore. And I've tried running more than one job at a time, but the ODBC driver is very unstable; with even two jobs running it can have a memory overload. It's 16-bit, so there's no way to flush the memory that I'm aware of, so it requires a restart. – Deshawndesi
superbase.com/downloads/lib-08-odbc-and-file-engine/ODBC.PDF This is like VB 3.0, back in the early 16-bit days. Wow, the memories! – Draughtboard
Oh yeah. I spent countless hours in VB3. I was making hacker apps for AOL back in the day. Anyway, I found my magic query length somewhere between 44 million and 45 million, so I'm just going to fire off the bulk inserts once the query reaches 44 million. Thanks, StringBuilder. Thanks for your help, CRAFTY DBA. – Deshawndesi
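The partitioning idea suggested in the comments above (10 jobs over 1/10 of the data each) can be sketched generically; `partition` here is a hypothetical helper, not part of the original migration tool, and as the thread notes it only pays off if the source driver tolerates concurrent readers:

```python
def partition(items, n):
    """Split items into n near-equal chunks so each chunk can be
    handled by an independent job. Hypothetical helper."""
    k, r = divmod(len(items), n)
    chunks, start = [], 0
    for i in range(n):
        end = start + k + (1 if i < r else 0)  # first r chunks get one extra
        chunks.append(items[start:end])
        start = end
    return chunks

# Usage: 105 record keys split across 10 jobs.
jobs = partition(list(range(105)), 10)
```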

A brief Bing search showed me this link: http://dbaspot.com/sqlserver-programming/399616-what-max-length-sqlcommand-commandtext.html

It says:

The maximum size for a query batch is 65536 * network packet size. The default packet is 4096 bytes, which gives an upper limit of 268 million bytes.

It's unclear whether that changes for different versions of SQL Server, though.

Christabella answered 27/1, 2014 at 20:11 Comment(4)
I don't think the limitation is on my string length, because the error popped up on the ExecuteNonQuery call, not on setting the CommandText. It seems to be a limitation in SQL Server. I read there's a setting called "maximum insert commit size", but I'm not sure if that has anything to do with it, where to find it, or what its max value can be set to. – Deshawndesi
Right. That's what that link refers to – the SQL Server limit. The string length is mentioned in that thread, but the last post (the one I quoted) is referring to the SQL Server limit. That's about 1/8th of the maximum size of a string, I think. – Christabella
"The maximum size for a query batch is 65536 * network packet size. The default packet is 4096 bytes, which gives an upper limit of 268 million bytes." Any idea how I determine the network packet size? – Deshawndesi
You are both right. 256 MB is about 268 million bytes: 65,536 * 4,096 = 268,435,456. Saying a MB equals a million bytes is like saying a liter equals a quart. – Brecciate

© 2022 - 2024 — McMap. All rights reserved.