I want to insert my data into PostgreSQL 1000 records at a time (bulk insertion) to make it quick and keep the load on the DBMS low. My code for now is:
cursor.execute("INSERT INTO bar(first_name,last_name) VALUES ('David', 'Bar')")
cursor.execute("INSERT INTO bar(first_name,last_name) VALUES ('David2', 'Bar2')")
cursor.execute("INSERT INTO bar(first_name,last_name) VALUES ('David3', 'Bar3')")
.... etc
connection.commit()
As you can see, I commit the changes at the end, which saves me a lot of time compared to committing after every insert query. The problem is that if any query fails for any reason (e.g. invalid data), all the queries fail to execute and I lose the data. Is there any way to keep the insertion fast and avoid the data loss at the same time?
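For context, here is a minimal sketch of the batching I have in mind: insert in chunks of 1000, commit once per chunk, and roll back only the chunk that fails. I'm using `sqlite3` here as a stand-in just so the sketch is self-contained; `psycopg2` follows the same DB-API (`executemany`, `commit`, `rollback`), though its placeholder style is `%s` instead of `?`. The table and data are hypothetical.

```python
import sqlite3  # stand-in for psycopg2; both follow the Python DB-API

# hypothetical sample data
rows = [("David%d" % i, "Bar%d" % i) for i in range(2500)]
BATCH_SIZE = 1000

connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("CREATE TABLE bar (first_name TEXT, last_name TEXT)")

inserted = 0
for start in range(0, len(rows), BATCH_SIZE):
    batch = rows[start:start + BATCH_SIZE]
    try:
        # one round of executemany + one commit per batch, so a bad row
        # only costs the current batch instead of everything inserted so far
        cursor.executemany(
            "INSERT INTO bar (first_name, last_name) VALUES (?, ?)", batch)
        connection.commit()
        inserted += len(batch)
    except sqlite3.Error:
        connection.rollback()  # drop only the failed batch and move on
```

With psycopg2 specifically, I understand `psycopg2.extras.execute_values` is usually faster than `executemany` for bulk inserts, and PostgreSQL savepoints (`SAVEPOINT` / `ROLLBACK TO SAVEPOINT`) could make the recovery finer-grained than a whole batch, but I haven't verified which is best here.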