I have a Python script that processes a large file (~1 GB) and then inserts its contents into a PostgreSQL 9.3 database, all over a single connection. This process takes a very long time, so I thought of distributing the inserts across more than one core (I have 8 cores), but from what I've read this seems impossible. Is there a workaround?
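Roughly what I had in mind is the minimal sketch below: one worker process per core, each opening its own connection (a psycopg2 connection can't be shared across processes). The sample rows and the chunking are hypothetical stand-ins for my parsed file:

import multiprocessing

import psycopg2


def insert_chunk(rows):
    # Each worker opens its own connection and cursor.
    cx = psycopg2.connect("dbname='postgis20' user='postgres' password='' host='localhost'")
    cu = cx.cursor()
    for user_id, car_number in rows:
        cu.execute("INSERT INTO taxi (userid, carNum) VALUES (%s, %s)",
                   (user_id, car_number))
    cx.commit()
    cx.close()


if __name__ == '__main__':
    rows = [(i, 'CAR%d' % i) for i in range(1000)]  # hypothetical stand-in for the parsed file
    chunks = [rows[i::8] for i in range(8)]         # one chunk per core
    pool = multiprocessing.Pool(processes=8)
    pool.map(insert_chunk, chunks)
    pool.close()
    pool.join()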
Here is a piece of my current code:
import logging

import psycopg2 as psycopg

try:
    connectStr = "dbname='postgis20' user='postgres' password='' host='localhost'"
    cx = psycopg.connect(connectStr)
    cu = cx.cursor()
    logging.info("connected to DB")
except psycopg.Error:
    logging.error("could not connect to the database")

try:
    # Parameterized insert; one execute() per row over the single connection.
    cu.execute("INSERT INTO taxi (userid, carNum) VALUES (%s, %s)",
               (msg['UserID'], msg['CarNumber']))
    # ... many more cu.execute() calls of the same shape ...
except Exception as err:
    print('ERROR: %s\n' % str(err))

cx.commit()
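One workaround I keep running into is PostgreSQL's COPY, which loads all rows in a single round trip instead of one INSERT per row. Would something like this minimal sketch be the better direction instead of parallelizing? (The staged rows are hypothetical sample data; with unquoted identifiers PostgreSQL folds carNum to carnum.)

import io

import psycopg2

cx = psycopg2.connect("dbname='postgis20' user='postgres' password='' host='localhost'")
cu = cx.cursor()

# Stage the rows as tab-separated text; hypothetical sample data.
buf = io.StringIO(u"1\tCAR1\n2\tCAR2\n")

# COPY the whole buffer into the table in one round trip.
cu.copy_from(buf, 'taxi', sep='\t', columns=('userid', 'carnum'))
cx.commit()
cx.close()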