SQL Server - Running large script files

I have a database table on a development server that is now fully populated after I set it running with an import routine for a CSV file containing 1.4 million rows.

I ran the Database Publishing Wizard on the table, and now I have a 286MB SQL script on my local machine. The problem is, I can't figure out how to run it. If I load it into SQL Server Management Studio Express I get an alert window that says "The operation could not be completed".

Any ideas on how I can get this SQL script to run?

Cylindroid asked 21/10, 2008 at 16:7 Comment(1)
I asked a similar question on DBA and received some pretty good feedback (dba.stackexchange.com/questions/16763/…) – Spratt
Score: -9

Running something that large inside a single transaction is not a good idea. Therefore, I'd recommend breaking up the file into smaller, more manageable chunks.

Another option is to look at some of the other ways to import CSV data directly.
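If you go the direct-import route, here is a minimal sketch using BULK INSERT; the table name, file path, and CSV layout are assumptions, so adjust them to your data:

BULK INSERT dbo.MyTable
FROM 'C:\data\rows.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, BATCHSIZE = 50000);

FIRSTROW = 2 skips a header row, and BATCHSIZE commits every 50,000 rows, so you never hold all 1.4 million rows in a single transaction.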

Lactescent answered 21/10, 2008 at 16:10 Comment(2)
Gulzar Nazim's answer is the best answer. – Chihuahua
@feryt that may be your opinion, but that is not what the OP decided, and it is also not Michael's fault that you like someone else's answer better (even if the other answer did not solve the OP's problem). – Aalborg
Score: 173

Use the sqlcmd utility to execute the file:

sqlcmd -S myServer\instanceName -i C:\myScript.sql

If you get an unexplained "Scripting error" with a large SQL file (> 100 MB) that contains many INSERT statements, replace each "INSERT INTO" with "GO" on a new line followed by "INSERT INTO". GO ends the current batch, so this splits the script into many small batches and keeps the transaction size down.
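As a rough illustration (the table and values are hypothetical), the edited script would look like this:

INSERT INTO dbo.MyTable VALUES (1, 'a');
INSERT INTO dbo.MyTable VALUES (2, 'b');
GO
INSERT INTO dbo.MyTable VALUES (3, 'c');
INSERT INTO dbo.MyTable VALUES (4, 'd');
GO

Note that GO is a batch separator, not a T-SQL statement, and it must sit alone on its own line for sqlcmd to recognize it.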

Vonnie answered 21/10, 2008 at 16:9 Comment(6)
Using this method I get the error "Sqlcmd: Error: Scripting error." Is this to do with the size of the file? – Cylindroid
First check connectivity; you may need to pass a username and password (the -U and -P options). Can you also try with a smaller file? I have run huge scripts without any issues. – Vonnie
If I take out all but 15 INSERT statements, it works. If I put the other 1,410,472 statements back in, it fails with that error message. – Cylindroid
Are all these statements inside a transaction? Maybe it is timing out. Try setting the timeout to 0 (infinite) before you run this. – Vonnie
There is some activity going on here discussing large-file text editors: #223353 – Vonnie
sqlcmd loads the entire file, unfortunately. In fact it seems to consume a lot more memory than the file size. – Polynesian
Score: 18

This tool (Big SQL Script File Runner) on CodePlex will run a script file of any size, with logging and a GUI.

Arrow answered 1/6, 2014 at 11:59 Comment(0)
Score: 9

Adding to Gulzar Nazim's answer: if you still get a failure, try specifying the code page of your SQL file using the -f option:

sqlcmd -S myServer\instanceName -d databaseName -i C:\myScript.sql -f 65001

I was trying to import a .dump file from SQLite (UTF-8 by default), and sqlcmd kept throwing an error after encountering the first special character. -f 65001 (the UTF-8 code page) fixed it for me.

Lole answered 23/4, 2015 at 13:12 Comment(0)
Score: 1

Why not just use DTS to import the CSV file directly?

Scherle answered 21/10, 2008 at 18:2 Comment(1)
A bunch of processing needs to be done on the data first, which I don't want running on the live server. Therefore, I need to be able to process the data into a simple SQL script to minimize the work required by that box. – Cylindroid
Score: 1

Yes, we can do that. I tried the BCP (Bulk Copy Program) approach in order to avoid the out-of-memory issue.

Note: tested on SQL Server 2014.

With BCP, first export the data from the source database to a .bcp file (in a local folder), and then import that .bcp file into the destination database.


The steps are straightforward:

Note:

a) Make sure an empty table with a matching schema exists in the destination database (a minimal sketch follows these notes)

b) Make sure a Temp folder exists on the C drive
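For note (a), a minimal sketch of creating the empty destination table; the column list here is purely hypothetical and must match the schema of the source table:

CREATE TABLE [Destination_DataBase_Name].[dbo].[TableName] (
    Id INT NOT NULL,
    Payload NVARCHAR(100) NULL
);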

1) Create a bat file named Export_Data.bat with the command below:

bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q 

pause

2) Run that bat file; a .bcp file will be generated in the Temp folder

3) Then create another bat file named Import_Data.bat with the command below:

bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q 

pause

And here we go!!
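As a quick sanity check after the import (same placeholder names as above), compare the row count against the source table:

SELECT COUNT(*) FROM [Destination_DataBase_Name].[dbo].[TableName];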

Baiss answered 16/7, 2019 at 17:38 Comment(0)