Questions tagged [bulkinsert]

Act of inserting multiple rows into a database simultaneously.

The purpose is to speed up loading of large amounts of data into a database. Depending on the database or database drivers used, the data is generally transferred and committed to the database in large groups of rows instead of one row at a time.

Also include the appropriate tags for the specific database and for the access method or driver used.

2272 questions
278 votes • 9 answers

What's the fastest way to do a bulk insert into Postgres?

I need to programmatically insert tens of millions of records into a Postgres database. Presently I am executing thousands of INSERT statements in a single "query". Is there a better way to do this, some bulk insert statement I don't know about?
Ash • 21,088 • 34 • 100 • 145
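
A common answer here is PostgreSQL's COPY, which streams rows in bulk instead of parsing millions of individual INSERTs. A minimal sketch using psycopg2's copy_from; the DSN, table, and column names are placeholders:

```python
import io
import psycopg2  # assumes psycopg2 is installed

conn = psycopg2.connect("dbname=test")  # placeholder DSN
# Rows in COPY's default text format: tab-separated, newline-terminated.
buf = io.StringIO("1\tfirst\n2\tsecond\n")
with conn, conn.cursor() as cur:
    # One COPY round trip loads the whole stream server-side.
    cur.copy_from(buf, "records", columns=("id", "label"))
```
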
241 votes • 7 answers

How to speed up insertion performance in PostgreSQL

I am testing Postgres insertion performance. I have a table with a single numeric column, which is also indexed. I filled the table using this query: insert into aNumber (id) values (564),(43536),(34560) ... I…
Luke101 • 56,845 • 75 • 204 • 330
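
Besides switching to COPY, two common levers are multi-row statements and a single commit per batch. A sketch of both using psycopg2's execute_values helper against the question's aNumber table (the DSN is a placeholder):

```python
import psycopg2
from psycopg2.extras import execute_values  # assumes psycopg2 is installed

conn = psycopg2.connect("dbname=test")  # placeholder DSN
rows = [(i,) for i in range(100_000)]
with conn, conn.cursor() as cur:
    # Each statement carries page_size rows, and the whole batch commits
    # once when the `with conn` block exits, not once per row.
    execute_values(cur, "INSERT INTO aNumber (id) VALUES %s", rows, page_size=1000)
```

For this particular schema, dropping the index before the load and recreating it afterwards is often the bigger win.
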
207 votes • 12 answers

Import CSV file into SQL Server

I am looking for help importing a .csv file into SQL Server using BULK INSERT, and I have a few basic questions. Issues: the CSV data may contain commas within a field value (e.g. the description), so how can I make the import handle that data? If the client…
Prabhat • 3,824 • 3 • 15 • 12
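
On SQL Server 2017 and later, BULK INSERT can parse quoted CSV directly via FORMAT = 'CSV' and FIELDQUOTE, which handles embedded commas. A sketch issuing the statement through pyodbc; the DSN, table, and file path (which must be visible to the server) are placeholders:

```python
import pyodbc  # assumes pyodbc and a SQL Server ODBC driver

conn = pyodbc.connect("DSN=mssql;Trusted_Connection=yes")  # placeholder DSN
sql = """
BULK INSERT dbo.Products
FROM 'C:\\data\\products.csv'          -- path as seen by the server
WITH (FORMAT = 'CSV', FIELDQUOTE = '"', FIRSTROW = 2);
"""
with conn:  # pyodbc commits the transaction when this block exits
    conn.execute(sql)
```
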
163 votes • 8 answers

mongodb: insert if not exists

Every day, I receive a stock of documents (an update). What I want to do is insert each item that does not already exist. I also want to keep track of the first time I inserted them, and the last time I saw them in an update. I don't want to have…
LeMiz • 5,068 • 5 • 25 • 23
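
This maps naturally onto upserts: $setOnInsert writes the first-seen timestamp only when the document is created, while $set refreshes last-seen on every run. A sketch with PyMongo; the server, database, collection, and document shape are placeholders:

```python
from datetime import datetime, timezone
from pymongo import MongoClient, UpdateOne  # assumes pymongo is installed

client = MongoClient()            # placeholder: local mongod
coll = client.feed.items          # placeholder database/collection
now = datetime.now(timezone.utc)

batch = [{"_id": "doc-1", "payload": {"price": 10}}]  # illustrative update
ops = [
    UpdateOne(
        {"_id": d["_id"]},
        {
            "$set": {"payload": d["payload"], "last_seen": now},  # refreshed every run
            "$setOnInsert": {"first_seen": now},                  # written once, at insert
        },
        upsert=True,
    )
    for d in batch
]
coll.bulk_write(ops, ordered=False)  # one round trip for the whole batch
```
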
160 votes • 6 answers

How do I temporarily disable triggers in PostgreSQL?

I'm bulk loading data and can re-calculate all trigger modifications much more cheaply after the fact than on a row-by-row basis. How can I temporarily disable all triggers in PostgreSQL?
David Schmitt • 54,766 • 26 • 117 • 159
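
One session-scoped option is session_replication_role, which suppresses ordinary (non-ALWAYS) triggers; it typically requires elevated privileges. A sketch via psycopg2, with a placeholder DSN and table:

```python
import psycopg2  # assumes psycopg2 is installed

conn = psycopg2.connect("dbname=test")  # placeholder DSN
with conn, conn.cursor() as cur:
    # Affects only this session; ENABLE ALWAYS triggers still fire.
    cur.execute("SET session_replication_role = replica")
    cur.execute("INSERT INTO my_table (id) VALUES (1)")  # placeholder for the bulk load
    cur.execute("SET session_replication_role = DEFAULT")
```

A per-table alternative is ALTER TABLE my_table DISABLE TRIGGER USER, which needs table ownership and affects every session.
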
133 votes • 13 answers

How to insert multiple rows from array using CodeIgniter framework?

I'm passing a large dataset into a MySQL table via PHP using insert commands and I'm wondering if it's possible to insert approximately 1000 rows at a time via a query other than appending each value on the end of a mile-long string and then…
toofarsideways • 3,766 • 2 • 27 • 48
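
CodeIgniter's insert_batch() builds exactly that multi-row INSERT ... VALUES (...),(...) statement for you. The same statement shape, shown in Python with pymysql, which rewrites a simple INSERT under executemany into multi-row statements; credentials and table are placeholders:

```python
import pymysql  # assumes pymysql is installed

conn = pymysql.connect(host="localhost", user="app",
                       password="secret", database="test")  # placeholder credentials
rows = [("alice", 30), ("bob", 25)]  # illustrative data
with conn.cursor() as cur:
    # Sent as one INSERT INTO people (name, age) VALUES (...),(...) statement.
    cur.executemany("INSERT INTO people (name, age) VALUES (%s, %s)", rows)
conn.commit()
```
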
84 votes • 4 answers

Performance of bcp/BULK INSERT vs. Table-Valued Parameters

I'm about to have to rewrite some rather old code using SQL Server's BULK INSERT command because the schema has changed, and it occurred to me that maybe I should think about switching to a stored procedure with a TVP instead, but I'm wondering what…
Aaronaught • 115,846 • 24 • 251 • 329
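
For reference, the TVP route needs a table type plus a procedure that consumes it; recent pyodbc releases can bind a Python list of row tuples to the TVP argument. A hedged sketch in which the DSN, type, procedure, and table names are all placeholders:

```python
import pyodbc  # assumes pyodbc >= 4.0.25 and a SQL Server ODBC driver

conn = pyodbc.connect("DSN=mssql;Trusted_Connection=yes")  # placeholder DSN
with conn:
    cur = conn.cursor()
    # One-time setup: a table type and a procedure that bulk-inserts from it.
    cur.execute("CREATE TYPE dbo.EmployeeRows AS TABLE "
                "(Name varchar(50), Address varchar(50))")
    cur.execute("CREATE PROCEDURE dbo.LoadEmployees @rows dbo.EmployeeRows READONLY AS "
                "INSERT INTO dbo.Employee (Name, Address) "
                "SELECT Name, Address FROM @rows")
    # The list of tuples is marshalled as the TVP in a single call.
    rows = [("Alice", "1 Main St"), ("Bob", "2 Oak Ave")]
    cur.execute("{CALL dbo.LoadEmployees (?)}", (rows,))
```
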
81 votes • 9 answers

How can I insert many rows into a MySQL table and return the new IDs?

Normally I can insert a row into a MySQL table and get the last_insert_id back. Now, though, I want to bulk insert many rows into the table and get back an array of IDs. Does anyone know how I can do this? There are some similar questions, but they…
Peacemoon • 2,756 • 4 • 27 • 53
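
MySQL's LAST_INSERT_ID() reports the first id generated by a multi-row INSERT, and with the default innodb_autoinc_lock_mode the batch gets consecutive ids, so the rest can be derived. A sketch with pymysql, assuming the batch fits in one statement; credentials and table are placeholders:

```python
import pymysql  # assumes pymysql is installed

conn = pymysql.connect(host="localhost", user="app",
                       password="secret", database="test")  # placeholder credentials
rows = [("alice",), ("bob",), ("carol",)]
with conn.cursor() as cur:
    cur.executemany("INSERT INTO people (name) VALUES (%s)", rows)
    first_id = cur.lastrowid            # first id of the multi-row statement
    new_ids = list(range(first_id, first_id + cur.rowcount))
conn.commit()
```
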
76 votes • 9 answers

BULK INSERT with identity (auto-increment) column

I am trying to bulk load data into the database from a CSV file. The Employee table has an auto-incremented ID column (PK): CREATE TABLE [dbo].[Employee]( [id] [int] IDENTITY(1,1) NOT NULL, [Name] [varchar](50) NULL, [Address] [varchar](50) NULL ) ON…
Abhi • 1,811 • 5 • 27 • 31
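
When the CSV has no id column, one workaround is to BULK INSERT into a view that hides the IDENTITY column, so SQL Server generates the ids as usual. A sketch via pyodbc, reusing the question's Employee table; the DSN and file path are placeholders:

```python
import pyodbc  # assumes pyodbc and a SQL Server ODBC driver

conn = pyodbc.connect("DSN=mssql;Trusted_Connection=yes")  # placeholder DSN
with conn:
    cur = conn.cursor()
    # The view exposes only the non-identity columns.
    cur.execute("CREATE VIEW dbo.Employee_Load AS "
                "SELECT Name, Address FROM dbo.Employee")
    cur.execute("""
        BULK INSERT dbo.Employee_Load
        FROM 'C:\\data\\employees.csv'   -- path as seen by the server
        WITH (FORMAT = 'CSV', FIRSTROW = 2)
    """)
```
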
61 votes • 8 answers

Bulk Insert Correctly Quoted CSV File in SQL Server

I'm trying to import a correctly quoted CSV file, meaning data is only quoted if it contains a comma, e.g.: 41, Terminator, Black 42, "Monsters, Inc.", Blue I observe that the first row imports correctly, but the second row errors in a manner that…
Eric J. • 139,555 • 58 • 313 • 529
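
Before FORMAT = 'CSV' arrived in SQL Server 2017, a reliable fallback was to parse the quoting client-side, where Python's csv module gets it right, and then batch the inserts. A sketch for the question's sample file; the DSN and table are placeholders:

```python
import csv
import pyodbc  # assumes pyodbc and a SQL Server ODBC driver

conn = pyodbc.connect("DSN=mssql;Trusted_Connection=yes")  # placeholder DSN
with open("movies.csv", newline="") as f, conn:
    # skipinitialspace makes `, "Monsters, Inc.",` parse as one quoted field.
    rows = [(int(r[0]), r[1], r[2])
            for r in csv.reader(f, skipinitialspace=True)]
    cur = conn.cursor()
    cur.fast_executemany = True  # batch the parameterized INSERTs
    cur.executemany("INSERT INTO dbo.Movies (id, title, color) VALUES (?, ?, ?)", rows)
```
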
51 votes • 8 answers

Writing large number of records (bulk insert) to Access in .NET/C#

What is the best way to perform bulk inserts into an MS Access database from .NET? Using ADO.NET, it is taking way over an hour to write out a large dataset. Note that my original post, before I "refactored" it, had both the question and answer in…
Marc Meketon • 2,213 • 1 • 21 • 18
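
The question is .NET-specific, but the biggest lever, one transaction around the whole batch instead of a commit per row, is visible from any ODBC client. A sketch with pyodbc and the Access ODBC driver; the file path and table are placeholders:

```python
import pyodbc  # assumes pyodbc and the Access ODBC driver (Windows)

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\app.accdb"  # placeholder path
)
rows = [(f"item-{i}", i) for i in range(10_000)]  # illustrative data
cur = conn.cursor()
# pyodbc keeps autocommit off, so all rows land in one transaction;
# committing per row is the usual bottleneck against Access.
cur.executemany("INSERT INTO Products (ItemName, Qty) VALUES (?, ?)", rows)
conn.commit()
```
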
49 votes • 7 answers

Accelerate bulk insert using Django's ORM?

I'm planning to upload a billion records taken from ~750 files (each ~250MB) to a db using Django's ORM. Currently each file takes ~20min to process, and I was wondering if there's any way to accelerate this process. I've taken the following…
Jonathan • 84,911 • 94 • 244 • 345
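
The ORM-level answer is bulk_create, which issues one multi-row INSERT per batch instead of one query per save(). A sketch against a hypothetical model:

```python
# Assumes a model such as:
#   class Reading(models.Model):
#       value = models.FloatField()
from myapp.models import Reading  # hypothetical app and model

def load(values, batch=5000):
    # bulk_create groups the objects into multi-row INSERTs of batch_size rows.
    Reading.objects.bulk_create((Reading(value=v) for v in values),
                                batch_size=batch)
```

For volumes like ~750 files of ~250MB each, bypassing the ORM entirely with the database's native bulk loader is usually the next step up.
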
49 votes • 5 answers

Bulk Insertion on Android device

I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use Insert statements, I should wrap them in a transaction. There's also a post…
Ron Romero • 8,494 • 8 • 39 • 62
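
On Android the pattern is beginTransaction()/setTransactionSuccessful()/endTransaction() around the inserts. The engine-level effect, one transaction for the whole batch, can be sketched with Python's sqlite3 module, which drives the same SQLite engine; the file and table are placeholders:

```python
import sqlite3

conn = sqlite3.connect("app.db")  # placeholder database file
conn.execute("CREATE TABLE IF NOT EXISTS contacts (id INTEGER PRIMARY KEY, name TEXT)")
rows = [(i, f"name-{i}") for i in range(700)]
with conn:  # one transaction, committed on success, rolled back on error
    conn.executemany("INSERT INTO contacts (id, name) VALUES (?, ?)", rows)
```
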
36 votes • 5 answers

How to speed up bulk insert to MS SQL Server using pyodbc

Below is my code that I'd like some help with. I have to run it over 1,300,000 rows, and it takes up to 40 minutes to insert ~300,000 rows. I figure bulk insert is the route to go to speed it up? Or is it because I'm iterating over the…
TangoAlee • 955 • 2 • 9 • 26
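
pyodbc's own lever is fast_executemany, which binds the parameters as arrays and sends a batch per round trip instead of one row at a time; it needs a Microsoft SQL Server ODBC driver. A minimal sketch with a placeholder DSN and table:

```python
import pyodbc  # assumes pyodbc >= 4.0.19 and a Microsoft ODBC driver

conn = pyodbc.connect("DSN=mssql;Trusted_Connection=yes")  # placeholder DSN
cur = conn.cursor()
cur.fast_executemany = True  # array-bind parameters: one round trip per batch
cur.executemany(
    "INSERT INTO dbo.Big (a, b) VALUES (?, ?)",
    [(i, i * 2) for i in range(300_000)],  # illustrative data
)
conn.commit()
```
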
35 votes • 1 answer

Use binary COPY table FROM with psycopg2

I have tens of millions of rows to transfer from multidimensional array files into a PostgreSQL database. My tools are Python and psycopg2. The most efficient way to bulk insert data is using copy_from. However, my data are mostly 32-bit floating…
Mike T • 34,456 • 15 • 128 • 169
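
PostgreSQL's binary COPY format is documented and compact: an 11-byte signature, two 32-bit header fields, then per tuple a 16-bit field count followed by length-prefixed big-endian values, closed by a 16-bit -1. A sketch that hand-builds the stream for an (int4, float4) table and feeds it to copy_expert; the DSN and table are placeholders:

```python
import io
import struct
import psycopg2  # assumes psycopg2 is installed

def binary_copy_stream(rows):
    """Serialize (int, float) rows into PostgreSQL's binary COPY format."""
    buf = io.BytesIO()
    buf.write(b"PGCOPY\n\xff\r\n\x00")      # 11-byte signature
    buf.write(struct.pack(">ii", 0, 0))     # flags + header-extension length
    for id_, val in rows:
        buf.write(struct.pack(">h", 2))     # field count for this tuple
        buf.write(struct.pack(">i", 4))     # int4: 4 bytes follow
        buf.write(struct.pack(">i", id_))
        buf.write(struct.pack(">i", 4))     # float4: 4 bytes follow
        buf.write(struct.pack(">f", val))
    buf.write(struct.pack(">h", -1))        # end-of-data trailer
    buf.seek(0)
    return buf

conn = psycopg2.connect("dbname=test")      # placeholder DSN
with conn, conn.cursor() as cur:
    data = binary_copy_stream([(1, 0.5), (2, 1.25)])
    cur.copy_expert(
        "COPY measurements (id, value) FROM STDIN WITH (FORMAT binary)", data
    )
```
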