
I have data of approx. 110 million in a text file. What is the fastest way to load this data into an Oracle DB table? I am using SQL*Loader, which took around 34 hours to load it completely into the table.

Abhishek
  • Please don't use local measurements, they will just confuse people. And what is "110 million data"? Bytes? Rows? Something else? – James Z Apr 18 '19 at 11:19
  • Please show the relevant code and state the exact problem or error. Also see [How to create a Minimal, Complete, and Verifiable example](http://stackoverflow.com/help/mcve). Also see questions like [Why are SQL server inserts so slow?](https://stackoverflow.com/q/1743496/608639), [SQL speed up performance of insert?](https://stackoverflow.com/q/7090243/608639), [Fastest way to insert a million rows in Oracle](https://stackoverflow.com/q/18414529/608639), etc. – jww Apr 18 '19 at 11:33

1 Answer


SQL*Loader is the right tool for loading data into an Oracle table. If I remember correctly, and if you can trust the input data, possible optimizations are:

  • tweak the buffer size and the number of rows per commit: the fewer the commits, the faster the load, at the price of more memory (see the sqlldr sketch after this list)
  • drop the indexes before loading the data and re-create them afterwards. Risk: if a row violates a unique or not-null index, the re-creation fails, but the gain can be an order of magnitude (see the SQL sketch after this list)
  • if it makes sense, and again if you can trust the input data, disable any triggers on the table. Risk: same as the previous point
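For the first point, a minimal sketch of what a tuned conventional-path load could look like. Everything here is a placeholder assumption (`scott/tiger@orcl`, `load_data.ctl`, `big_file.txt`, `my_table`, the column names), and the `rows`/`bindsize`/`readsize` values are only starting points to measure and adjust, not recommendations:

```
-- load_data.ctl (hypothetical control file for a comma-separated input)
LOAD DATA
INFILE 'big_file.txt'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(col1, col2, col3)
```

```
# ROWS = rows per commit; BINDSIZE/READSIZE = buffer sizes in bytes.
# Larger values mean fewer commits and more memory used by sqlldr.
sqlldr userid=scott/tiger@orcl control=load_data.ctl log=load_data.log \
       rows=50000 bindsize=20000000 readsize=20000000 errors=1000
```

Note that with the conventional path, the bind array must fit in `bindsize`, so the effective rows-per-commit may be capped below the `rows` value; the log file reports the actual figure used.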
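For the second and third points, a sketch of the surrounding SQL, again with hypothetical table, index, and column names:

```sql
-- Before the load: drop the index so it is not maintained row by row
DROP INDEX my_table_uk;

-- ...and disable the table's triggers so they do not fire per inserted row
ALTER TABLE my_table DISABLE ALL TRIGGERS;

-- (run the sqlldr load here)

-- After the load: re-create the index. This is where the risk lives:
-- a duplicate or NULL key in the loaded data makes this CREATE fail.
CREATE UNIQUE INDEX my_table_uk ON my_table (col1);

-- Re-enable the triggers
ALTER TABLE my_table ENABLE ALL TRIGGERS;
```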
Serge Ballesta