
I want to insert my data into PostgreSQL 1000 records at a time (bulk insertion) to make it quick and keep the load on the DBMS low. My code for now is:

cursor.execute("INSERT INTO bar(first_name,last_name) VALUES ('David', 'Bar')")
cursor.execute("INSERT INTO bar(first_name,last_name) VALUES ('David2', 'Bar2')")
cursor.execute("INSERT INTO bar(first_name,last_name) VALUES ('David3', 'Bar3')")
.... etc
connection.commit()

As you can see, I commit the changes at the end, and that saves me a lot of time compared to committing after every insert query. The problem is that if any query fails for any reason (e.g. invalid data), all the queries in the transaction fail and I lose the data. Is there any way to keep the insertion fast and avoid the data loss at the same time?
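
For reference, a batched version of what I am doing now would look roughly like this (I use psycopg2; the connection string, the rows list, and the batch size of 1000 are just placeholders):

import psycopg2

connection = psycopg2.connect("dbname=test user=postgres")  # placeholder DSN
cursor = connection.cursor()

rows = [("David", "Bar"), ("David2", "Bar2"), ("David3", "Bar3")]  # ... etc

# Insert 1000 records at a time, committing once per batch.
for i in range(0, len(rows), 1000):
    cursor.executemany(
        "INSERT INTO bar (first_name, last_name) VALUES (%s, %s)",
        rows[i:i + 1000],
    )
    connection.commit()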

Fanooos
Morad Edwar

1 Answer


It depends on your requirements, of course. Depending on your transaction needs, I would recommend one of the following options:

1. Using SAVEPOINTs (subtransactions):

BEGIN;
    SAVEPOINT bulk_savepoint;
    INSERT ...;
    RELEASE SAVEPOINT bulk_savepoint;
    SAVEPOINT bulk_savepoint;
    INSERT ...;
    /* If this INSERT fails, discard only this row
       instead of aborting the whole transaction */
    ROLLBACK TO SAVEPOINT bulk_savepoint;
COMMIT;
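
The same idea from Python, as a rough psycopg2 sketch (the connection string, savepoint name, and error handling are just illustrative):

import psycopg2

connection = psycopg2.connect("dbname=test user=postgres")  # placeholder DSN
cursor = connection.cursor()

rows = [("David", "Bar"), ("David2", "Bar2"), ("David3", "Bar3")]  # ... etc

for first_name, last_name in rows:
    cursor.execute("SAVEPOINT bulk_savepoint")
    try:
        cursor.execute(
            "INSERT INTO bar (first_name, last_name) VALUES (%s, %s)",
            (first_name, last_name),
        )
    except psycopg2.DatabaseError:
        # Drop only the failed row; the rest of the transaction survives.
        cursor.execute("ROLLBACK TO SAVEPOINT bulk_savepoint")
    else:
        cursor.execute("RELEASE SAVEPOINT bulk_savepoint")

# One commit for everything that succeeded.
connection.commit()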

2. Using AUTOCOMMIT

Note that current PostgreSQL versions no longer support autocommit as a server-side SET; it is handled by the client (for example \set AUTOCOMMIT on in psql, or connection.autocommit = True in psycopg2). With autocommit enabled, each INSERT commits on its own, so a failing row only loses that single row:

INSERT ...
INSERT ...
INSERT ...
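
A rough psycopg2 sketch of this option, again with placeholder connection details and rows:

import psycopg2

connection = psycopg2.connect("dbname=test user=postgres")  # placeholder DSN
connection.autocommit = True  # every statement is committed on its own
cursor = connection.cursor()

rows = [("David", "Bar"), ("David2", "Bar2"), ("David3", "Bar3")]  # ... etc

for first_name, last_name in rows:
    try:
        cursor.execute(
            "INSERT INTO bar (first_name, last_name) VALUES (%s, %s)",
            (first_name, last_name),
        )
    except psycopg2.DatabaseError:
        # Only this row is lost; earlier inserts are already committed.
        pass

Keep in mind that with autocommit every row is its own transaction, so this is simpler but slower than the SAVEPOINT approach.
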
Wolph