
I have a DataTable that I want to save to a SQLite database table. Here is my dilemma: I don't know which way to go. At most the DataTable would contain 65,000 rows and probably 12 columns.

So, would it be faster to save the DataTable to a CSV file and then bulk insert it into SQLite (which I have no idea how to do), or would it be faster to loop through all the columns to create parameters and then loop through each individual row in the DataTable, pulling out the values to insert into the database table?
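For reference, this is roughly what I mean by the second option. It's just a sketch, assuming the System.Data.SQLite provider and a destination table called MyTable whose columns match the DataTable's column names:

    using System.Data;
    using System.Data.SQLite;
    using System.Linq;

    static void SaveDataTable(DataTable table, string connectionString)
    {
        using (var connection = new SQLiteConnection(connectionString))
        {
            connection.Open();

            // One INSERT statement with a named parameter per column.
            string columns = string.Join(", ",
                table.Columns.Cast<DataColumn>().Select(c => c.ColumnName).ToArray());
            string values = string.Join(", ",
                table.Columns.Cast<DataColumn>().Select(c => "@" + c.ColumnName).ToArray());
            string sql = string.Format("INSERT INTO MyTable ({0}) VALUES ({1})", columns, values);

            using (var command = new SQLiteCommand(sql, connection))
            {
                // Loop through each row, rebuild the parameter list, and insert.
                foreach (DataRow row in table.Rows)
                {
                    command.Parameters.Clear();
                    foreach (DataColumn column in table.Columns)
                    {
                        command.Parameters.AddWithValue("@" + column.ColumnName, row[column]);
                    }
                    command.ExecuteNonQuery();
                }
            }
        }
    }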

Is there an even better way than what I have listed?

Thanks, Nathan


1 Answer


Check this question out.

There is a SqlBulkCopy class in the .NET Framework that provides functionality for bulk inserts. Unfortunately, it only supports SQL Server databases.
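For comparison, this is about all SqlBulkCopy amounts to against SQL Server (the destination table name here is just an assumption), so it's a shame there is no SQLite equivalent:

    using System.Data;
    using System.Data.SqlClient;

    static void BulkCopyToSqlServer(DataTable table, string connectionString)
    {
        // SqlBulkCopy takes a DataTable directly, but only targets SQL Server.
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "MyTable";
            bulkCopy.WriteToServer(table);
        }
    }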

However, tweaking how you do the inserts (most importantly, wrapping them all in a single transaction and reusing one parameterized command) will make the bulk insert a lot quicker. From what people are reporting, there isn't much of a performance hit with single inserts done that way.
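Something along these lines is what I mean. It's a rough sketch, again assuming System.Data.SQLite and a table named MyTable; adjust the names to your schema:

    using System.Data;
    using System.Data.SQLite;
    using System.Linq;

    static void InsertAllRows(DataTable table, string connectionString)
    {
        using (var connection = new SQLiteConnection(connectionString))
        {
            connection.Open();

            using (var transaction = connection.BeginTransaction())
            using (var command = connection.CreateCommand())
            {
                command.Transaction = transaction;

                // One parameterized INSERT with a parameter per DataTable column.
                string columns = string.Join(", ",
                    table.Columns.Cast<DataColumn>().Select(c => c.ColumnName).ToArray());
                string values = string.Join(", ",
                    table.Columns.Cast<DataColumn>().Select(c => "@" + c.ColumnName).ToArray());
                command.CommandText = string.Format(
                    "INSERT INTO MyTable ({0}) VALUES ({1})", columns, values);

                // Add the parameters once, then only update their values per row.
                foreach (DataColumn column in table.Columns)
                {
                    command.Parameters.Add(new SQLiteParameter("@" + column.ColumnName));
                }

                foreach (DataRow row in table.Rows)
                {
                    foreach (DataColumn column in table.Columns)
                    {
                        command.Parameters["@" + column.ColumnName].Value = row[column];
                    }
                    command.ExecuteNonQuery();
                }

                // Committing once at the end is the big win; without the transaction
                // SQLite flushes to disk on every single insert.
                transaction.Commit();
            }
        }
    }

The single transaction is the part that matters most; reusing the parameters is a smaller gain on top of it.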

Mircea Grelus