I'm using pqxx as an API to access PostgreSQL from C++ code. I'm trying to insert a large amount of data, but the performance isn't good enough. I've tried several things and nothing gives the throughput I need.
I've been trying to use pqxx::pipeline to get asynchronous inserts, but it seems I only have two options with it. Either I wait until the end of all the inserts to commit the transaction, in which case I risk losing a very large amount of data if the process crashes before the commit. Or I commit periodically (say every 5 minutes), in which case I have a blocking call every 5 minutes that takes quite a long time.
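Roughly what my current periodic-commit approach looks like (the connection string, table, and column names are placeholders, and it obviously needs a live PostgreSQL server to run):

```cpp
#include <pqxx/pqxx>
#include <chrono>
#include <memory>

int main() {
  pqxx::connection conn{"dbname=mydb"};  // placeholder connection string

  auto tx = std::make_unique<pqxx::work>(conn);
  auto pipe = std::make_unique<pqxx::pipeline>(*tx);
  auto last_commit = std::chrono::steady_clock::now();

  for (;;) {  // loop over rows to insert
    // Queue the insert; the pipeline sends it without waiting for a result.
    pipe->insert("INSERT INTO my_table (a, b) VALUES (1, 2)");  // placeholder row

    auto now = std::chrono::steady_clock::now();
    if (now - last_commit > std::chrono::minutes(5)) {
      pipe->complete();  // drain the pipeline: waits for all queued inserts
      pipe.reset();      // pipeline must be gone before the transaction commits
      tx->commit();      // blocking commit -- this is the pause I want to avoid

      // Start a fresh transaction and pipeline for the next batch.
      tx = std::make_unique<pqxx::work>(conn);
      pipe = std::make_unique<pqxx::pipeline>(*tx);
      last_commit = now;
    }
  }
}
```

The `complete()` + `commit()` pair is where everything stalls, since both block until the server has caught up on the whole batch.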
Is there a way to do this without having a transaction, or to have asynchronous commits for my transaction?