How can I make a big insertion with SqlBulkCopy from a List<> of simple objects?
Do I implement my own custom IDataReader?
Simply create a DataTable from your list of objects and call SqlBulkCopy.WriteToServer, passing the data table.
You might find the following useful:
For maximum performance with SqlBulkCopy, you should set an appropriate BatchSize. 10,000 seems to work well - but profile for your data.
You might also observe better results when using SqlBulkCopyOptions.TableLock.
An interesting and informative analysis of SqlBulkCopy performance can be found here.
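For illustration, a minimal sketch combining those two settings (connectionString, dataTable, and the table name SomeTable are placeholders):

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bcp = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null))
    {
        bcp.DestinationTableName = "SomeTable";
        bcp.BatchSize = 10000; // a starting point; profile for your data
        bcp.WriteToServer(dataTable); // the DataTable built from your list
    }
}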
With FastMember, you can do this without ever needing to go via DataTable (which, in my tests, more than doubles the performance):
// requires the FastMember NuGet package
using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}
Note that ObjectReader can also work with non-generic sources, and it is not necessary to specify the member names in advance (although you probably want to use the ColumnMappings aspect of SqlBulkCopy if you don't specify them in the ObjectReader itself).
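As a sketch of that variant (assuming the same data, connection, and SomeTable destination as above), the member names move out of ObjectReader.Create and into explicit mappings:

using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(data)) // no member names: all public members exposed
{
    bcp.DestinationTableName = "SomeTable";
    // map source members to destination columns explicitly
    bcp.ColumnMappings.Add("Id", "Id");
    bcp.ColumnMappings.Add("Name", "Name");
    bcp.ColumnMappings.Add("Description", "Description");
    bcp.WriteToServer(reader);
}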
I didn't want to create a DataTable first, so I wanted to use this method. However, are the strings listed in the params argument the actual names, in order, of the properties on the object that the underlying data structure iterates over? – Avery
Late to the party, but if you add this EntityDataReader class from Microsoft, there's an AsDataReader() extension method that does exactly that: https://github.com/matthewschrager/Repository/blob/master/Repository.EntityFramework/EntityDataReader.cs
(example usage, calling AsDataReader() on a List<Foobar> named Foobars:)
var connStr = ""; // your connection string
using (var connection = new SqlConnection(connStr))
{
    var startTime = DateTime.Now;
    connection.Open();
    var transaction = connection.BeginTransaction();
    try
    {
        using (var sbCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
        {
            sbCopy.BulkCopyTimeout = 0; // no timeout for large uploads
            sbCopy.BatchSize = 10000;
            sbCopy.DestinationTableName = "Foobars";
            var reader = Foobars.AsDataReader(); // Foobars is the List<Foobar> to insert
            sbCopy.WriteToServer(reader);
        }
        transaction.Commit();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        transaction.Rollback();
    }
    finally
    {
        transaction.Dispose();
        connection.Close();
        var endTime = DateTime.Now;
        Console.WriteLine("Upload time elapsed: {0} seconds", (endTime - startTime).TotalSeconds);
    }
}
If the source and destination columns don't line up, add explicit mappings, e.g. sbCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("col1", "col1")); – Subdominant
Depending on what you are trying to accomplish by calling SqlBulkCopy in the first place, it might make more sense to use a Table-Valued Parameter (TVP). Using a TVP would make it trivial to send in a collection of any custom type. The data can be streamed in, so you can avoid the DataTable (much like in @Marc Gravell's answer) and you can avoid SqlBulkCopy as well. TVPs allow complete flexibility in how the data is handled once it gets to SQL Server: you call a stored procedure to pass the TVP data into, and it appears as a table variable that you can do anything with, not just INSERT (which is the case with SqlBulkCopy). You can also get data back via a SqlDataReader, such as newly created IDENTITY values. I added an example and some additional notes on this answer: How can I insert 10 million records in the shortest time possible?. And several years ago I wrote an article on SQL Server Central (free registration required), Streaming Data Into SQL Server 2008 From an Application, which is also noted in that linked answer, providing a working example of passing in a generic list of a custom type, streamed in from a 3-million-row text file.
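As a rough sketch of that TVP pattern (all names here are hypothetical: a dbo.ItemList table type, a dbo.ImportItems procedure, and an Item class; the SqlDataRecord types ship with Microsoft.Data.SqlClient):

// Server-side objects assumed to exist, e.g.:
//   CREATE TYPE dbo.ItemList AS TABLE (Id INT, Name NVARCHAR(100));
//   CREATE PROCEDURE dbo.ImportItems @Items dbo.ItemList READONLY
//   AS INSERT INTO dbo.Items (Id, Name) SELECT Id, Name FROM @Items;
using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.Data.SqlClient;
using Microsoft.Data.SqlClient.Server;

public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class TvpImport
{
    // Stream the list as SqlDataRecords: rows are yielded one at a time,
    // so no DataTable (and no second copy of the data) is ever materialized.
    private static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<Item> items)
    {
        var meta = new[]
        {
            new SqlMetaData("Id", SqlDbType.Int),
            new SqlMetaData("Name", SqlDbType.NVarChar, 100)
        };
        foreach (var item in items)
        {
            var record = new SqlDataRecord(meta);
            record.SetInt32(0, item.Id);
            record.SetString(1, item.Name);
            yield return record;
        }
    }

    public static void Import(string connectionString, IEnumerable<Item> items)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.ImportItems", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            var parameter = command.Parameters.Add("@Items", SqlDbType.Structured);
            parameter.TypeName = "dbo.ItemList"; // the user-defined table type
            parameter.Value = ToRecords(items);  // streamed, not buffered
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

One caveat: a TVP parameter must contain at least one row (an empty enumeration throws), so guard against empty lists before calling.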
Came across a similar situation trying to insert a couple of million rows into the database.
Got this done by converting the List into a DataTable and then inserting the table with SqlBulkCopy.
private static DataTable CreateDataTableItem(List<ListItem> ItemList)
{
    DataTable dt = new DataTable();
    try
    {
        dt.TableName = "PartPrice";

        // One column per property; unwrap Nullable<T> so the column gets the
        // underlying type, and allow DBNull so null values can be stored.
        foreach (PropertyInfo property in typeof(ListItem).GetProperties())
        {
            dt.Columns.Add(new DataColumn() { ColumnName = property.Name, DataType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType, AllowDBNull = true });
        }

        // Copy each list item into a row, substituting DBNull for nulls.
        foreach (var item in ItemList)
        {
            DataRow newRow = dt.NewRow();
            foreach (PropertyInfo property in typeof(ListItem).GetProperties())
            {
                newRow[property.Name] = item.GetType().GetProperty(property.Name)?.GetValue(item, null) ?? DBNull.Value;
            }
            dt.Rows.Add(newRow);
        }

        return dt;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
        return null;
    }
}
public class ListItem
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int? NullableId { get; set; }
}
And then bulk insert using:
private void BulkInsert(DataTable dt)
{
    // _config is an injected IConfiguration instance
    string consString = _config.GetConnectionString("yourConnectionStringkey");
    using SqlConnection connection = new SqlConnection(consString);
    connection.Open();
    using var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction, null);
    sqlBulkCopy.DestinationTableName = "dbo.TargetDb";
    sqlBulkCopy.ColumnMappings.Add("Id", "Id");
    sqlBulkCopy.ColumnMappings.Add("Name", "Name");
    sqlBulkCopy.ColumnMappings.Add("NullableId", "NullableId");
    sqlBulkCopy.WriteToServer(dt);
    connection.Close();
}
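Putting the two together, a hypothetical call site (inside the same class as the two methods above) would look like:

var itemList = new List<ListItem>
{
    new ListItem { Id = 1, Name = "Widget", NullableId = null },
    new ListItem { Id = 2, Name = "Gadget", NullableId = 42 }
};

DataTable dt = CreateDataTableItem(itemList);
if (dt != null) // CreateDataTableItem returns null on failure
{
    BulkInsert(dt);
}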
You don't have to do the column mappings, given the documentation on SqlBulkCopy.ColumnMappings:
// Summary:
// Returns a collection of Microsoft.Data.SqlClient.SqlBulkCopyColumnMapping items.
// Column mappings define the relationships between columns in the data source and
// columns in the destination.
//
// Value:
// A collection of column mappings. By default, it is an empty collection.
//
// Remarks:
// If the data source and the destination table have the same number of columns,
// and the ordinal position of each source column within the data source matches
// the ordinal position of the corresponding destination column, the SqlBulkCopy.ColumnMappings
// collection is unnecessary. However, if the column counts differ, or the ordinal
// positions are not consistent, you must use SqlBulkCopy.ColumnMappings
// to make sure that data is copied into the correct columns. During the execution
// of a bulk copy operation, this collection can be accessed, but it cannot be changed.
// Any attempt to change it will throw an InvalidOperationException.
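So, under that rule, if the DataTable columns (Id, Name, NullableId) happen to line up ordinally with the destination table's columns, a trimmed BulkInsert body could drop the mapping calls entirely (a sketch under that assumption):

using var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null);
sqlBulkCopy.DestinationTableName = "dbo.TargetDb";
sqlBulkCopy.WriteToServer(dt); // columns matched purely by ordinal position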
These two lines let you insert nullable values into the DataTable columns:
{ ColumnName = property.Name, DataType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType, AllowDBNull = true }
newRow[property.Name] = item.GetType().GetProperty(property.Name)?.GetValue(item, null) ?? DBNull.Value;