
My current project consists of three standard layers: data, business, and presentation. I would like to use data entities for all my data access needs. Part of the functionality of the app will be that it needs to copy all the data within a flat file into a database. The file is not very big, so I can use SqlBulkCopy. I have found several articles regarding the usage of the SqlBulkCopy class in .NET. However, all the articles use DataTables to move data back and forth.

Is there a way to use data entities along with SqlBulkCopy or will I have to use DataTables?

John Saunders
Skadoosh

7 Answers


You'll need to convert the entities to an IDataReader or DataTable.

There is a small helper class designed to assist with this: http://archive.msdn.microsoft.com/LinqEntityDataReader/Release/ProjectReleases.aspx?ReleaseId=389

EDIT: The MSDN link is broken; an alternative copy can be found here: https://github.com/matthewschrager/Repository/blob/master/Repository.EntityFramework/EntityDataReader.cs

Then you can use SqlBulkCopy like so:

using (var sbCopy = new SqlBulkCopy(connectionString))
{
    sbCopy.DestinationTableName = "TableName";
    // AsDataReader() comes from the EntityDataReader helper linked above
    sbCopy.WriteToServer(entitiesList.AsDataReader());
}
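If you'd rather take the DataTable route mentioned above, a minimal hand-mapped sketch looks like this (the Id/Name columns here are hypothetical; substitute your entity's properties):

// Map the entities into a DataTable by hand. The columns must line up
// with the destination table, or be mapped via SqlBulkCopy.ColumnMappings.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));

foreach (var e in entitiesList)
    table.Rows.Add(e.Id, e.Name);

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "TableName";
    bulkCopy.WriteToServer(table); // WriteToServer also accepts a DataTable
}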
Dave Hogan
  • @JonBarker - Here's the GitHub link: https://github.com/matthewschrager/Repository/blob/master/Repository.EntityFramework/EntityDataReader.cs – Dave Hogan Jul 09 '14 at 13:12
  • @DaveHogan I like the idea you have here. We are seeding 200K plus items and it's painful to say the least. I tried to pull it into our EF seeding, but when creating an Office item that has a child object such as Address, it would not create the child items or the relationship. Is this sort of thing possible or beyond the scope of the code linked to? Thanks – Jon Sep 15 '14 at 11:16
  • @Jon - I wouldn't think you could easily do child items without explicitly setting up a SqlBulkCopy for each child table (and manually managing the relationship constraints) – Dave Hogan Sep 16 '14 at 07:50
  • I'm the original author of the ObjectDataReader and have added it to a new GitHub home: https://github.com/dbrownems/ObjectDataReader – David Browne - Microsoft Nov 18 '20 at 17:13

We've tried and tested a couple of approaches for bulk inserting with EF and eventually went with table-valued parameters, which gave the best performance across a range of row sizes. I don't have the numbers to hand, but I know this post, Performance of bcp/BULK INSERT vs. Table-Valued Parameters, was a guiding factor.

We originally used SqlBulkCopy coupled with an adapter that took an IEnumerable<T> and created an IDataReader, generating the relevant metadata for SqlBulkCopy along the way. The advantage here was that the import could be done entirely in code. The code that @davehogan posted was used as a basis for this.

Table-valued parameters require a stored procedure and a table type defined in the database. If you're using code-first, you can execute SQL to create these as part of your creation script. Whilst this is more work, we found we got significantly more consistent and faster throughput of rows into the database.
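For illustration, calling such a procedure from ADO.NET might look like the sketch below; the dbo.ImportRowType table type, the dbo.usp_ImportRows procedure, and the Id/Name columns are all hypothetical and would need to match whatever you create in your database:

using System.Data;
using System.Data.SqlClient;

// Stage the rows in a DataTable whose columns match the user-defined table type.
var rows = new DataTable();
rows.Columns.Add("Id", typeof(int));
rows.Columns.Add("Name", typeof(string));
rows.Rows.Add(1, "Example");

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.usp_ImportRows", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // The table-valued parameter: SqlDbType.Structured plus the table type's name.
    var tvp = cmd.Parameters.AddWithValue("@Rows", rows);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.ImportRowType";

    conn.Open();
    cmd.ExecuteNonQuery();
}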

Also, it's worth considering not bulk inserting into your main table. We use a temp heap table and add a clustered index to it once the data is imported. We then perform a MERGE between the temp table and the main table. This has the benefit of not locking the main table's index while inserting and improves concurrency. We tend to get upwards of 2500 rows/sec per CPU inserted using this method.
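A rough sketch of that staging pattern follows; the table, column, and index names are hypothetical, and note that a #temp table must be created, loaded, and merged on the same connection:

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1. Heap staging table: no indexes yet, so the bulk load stays cheap.
    using (var create = new SqlCommand(
        "CREATE TABLE #Staging (Id int, Name nvarchar(100));", conn))
        create.ExecuteNonQuery();

    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "#Staging";
        bulk.WriteToServer(reader); // any IDataReader/DataTable over your rows
    }

    // 2. Add the clustered index after the load, then MERGE into the main table.
    using (var merge = new SqlCommand(@"
        CREATE CLUSTERED INDEX IX_Staging ON #Staging (Id);
        MERGE dbo.MainTable AS t
        USING #Staging AS s ON t.Id = s.Id
        WHEN MATCHED THEN UPDATE SET t.Name = s.Name
        WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (s.Id, s.Name);", conn))
        merge.ExecuteNonQuery();
}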

Let me know if you want more info.

Dean Ward

You may use the Bulk package library. The Bulk Insert 1.0.0 version is used in projects with Entity Framework >= 6.0.0. More detail can be found at the link below: Bulkoperation source code

Nadeem

For EF Core there are the BulkExtensions (Insert, InsertOrUpdate, Update, Delete):
Link: https://github.com/borisdj/EFCore.BulkExtensions
It can also be installed via NuGet.
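For illustration, the bulk operations are exposed as extension methods on the DbContext; a minimal sketch, where MyDbContext and entitiesList are hypothetical stand-ins for your own context and entity list:

using EFCore.BulkExtensions;

using (var context = new MyDbContext())
{
    context.BulkInsert(entitiesList);             // plain bulk insert
    // context.BulkInsertOrUpdate(entitiesList);  // upsert variant
    // context.BulkUpdate(entitiesList);
    // context.BulkDelete(entitiesList);
}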

borisdj

You can consider a DataSet to be a serialisation of the data entity. However, generally speaking, I think SqlBulkCopy is a table-to-table thing, hence the reason for DataTables.

Preet Sangha

SqlBulkCopy uses an IDataReader when calling the WriteToServer method, so you should be able to implement IDataReader on top of a collection that is IEnumerable. This would allow you to take in an entity set and call SqlBulkCopy using your IDataReader implementation.
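A minimal sketch of that idea, with a hypothetical EnumerableDataReader<T> whose column names and value accessors are supplied by the caller. Only the members SqlBulkCopy is known to exercise in common cases get real implementations; the rest of the IDataRecord surface throws:

using System;
using System.Collections.Generic;
using System.Data;

public sealed class EnumerableDataReader<T> : IDataReader
{
    private readonly IEnumerator<T> _rows;
    private readonly string[] _names;
    private readonly Func<T, object>[] _getters;

    public EnumerableDataReader(IEnumerable<T> rows, string[] names, Func<T, object>[] getters)
    {
        _rows = rows.GetEnumerator();
        _names = names;
        _getters = getters;
    }

    // Members used by SqlBulkCopy.WriteToServer:
    public bool Read() => _rows.MoveNext();
    public int FieldCount => _getters.Length;
    public object GetValue(int i) => _getters[i](_rows.Current) ?? DBNull.Value;
    public int GetOrdinal(string name) => Array.IndexOf(_names, name);
    public string GetName(int i) => _names[i];

    public bool IsDBNull(int i) => GetValue(i) is DBNull;
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));
    public int GetValues(object[] values)
    {
        int n = Math.Min(values.Length, FieldCount);
        for (int i = 0; i < n; i++) values[i] = GetValue(i);
        return n;
    }

    public bool NextResult() => false;
    public int Depth => 0;
    public bool IsClosed => false;
    public int RecordsAffected => -1;
    public void Close() => Dispose();
    public void Dispose() => _rows.Dispose();

    // Not needed for bulk copy; left unimplemented in this sketch.
    public DataTable GetSchemaTable() => throw new NotSupportedException();
    public bool GetBoolean(int i) => throw new NotSupportedException();
    public byte GetByte(int i) => throw new NotSupportedException();
    public long GetBytes(int i, long fo, byte[] buf, int off, int len) => throw new NotSupportedException();
    public char GetChar(int i) => throw new NotSupportedException();
    public long GetChars(int i, long fo, char[] buf, int off, int len) => throw new NotSupportedException();
    public IDataReader GetData(int i) => throw new NotSupportedException();
    public string GetDataTypeName(int i) => throw new NotSupportedException();
    public DateTime GetDateTime(int i) => throw new NotSupportedException();
    public decimal GetDecimal(int i) => throw new NotSupportedException();
    public double GetDouble(int i) => throw new NotSupportedException();
    public Type GetFieldType(int i) => throw new NotSupportedException();
    public float GetFloat(int i) => throw new NotSupportedException();
    public Guid GetGuid(int i) => throw new NotSupportedException();
    public short GetInt16(int i) => throw new NotSupportedException();
    public int GetInt32(int i) => throw new NotSupportedException();
    public long GetInt64(int i) => throw new NotSupportedException();
    public string GetString(int i) => throw new NotSupportedException();
}

// Usage with a hypothetical Customer entity:
var reader = new EnumerableDataReader<Customer>(
    customers,
    new[] { "Id", "Name" },
    new Func<Customer, object>[] { c => c.Id, c => c.Name });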

Joseph Pisano

SqlBulkCopy is a direct, almost byte-array-like transfer of row data from client to SQL Server. It is easily the most efficient way to get data into SQL Server.

Its performance lies in truly "bulk" operations, however. Hundreds or thousands of rows aren't necessarily enough to justify its use. Tens of thousands to millions of rows are where SqlBulkCopy's performance will truly shine. And, in the end, all we're really talking about is getting data to the server.

There are other significant challenges in getting a set of rows into a production database's table. Reindexing, reordering (if there is a clustered index), foreign-key validation: all of these add time to your insert and can lock the table and its indexes.

Also, TVP data is written to disk (as temp table data) and is then accessible to put into your tables. SqlBulkCopy is capable of going directly at your table, so performance in that case is significantly faster; however, one must balance speed against concurrency.

I think the overall rule is: if you have a handful of rows to deal with, think TVPs; if you have many thousands of rows, get them to SQL Server as quickly as possible via SqlBulkCopy.

Craig Brunetti