I have the following two requirements:
- To read a CSV file and insert its rows line by line into the database (RDBMS), without any data manipulation.
- To read a CSV file and insert its data into the database (RDBMS), where row Z might depend on row B, so a staging store is needed first (in-memory or a separate staging RDBMS).
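To make the second requirement concrete, here is a minimal sketch of the in-memory staging idea. The row format (`id,parentId,value`) and the way the dependency is "resolved" are purely hypothetical; the point is that all rows are staged first, so row Z can safely reference row B regardless of file order, before anything is written to the real database:

```java
import java.util.*;

public class StagingExample {
    // Hypothetical row format: id,parentId,value. Row Z's parentId may
    // point at row B, so every row is staged in memory first and the
    // dependency is resolved before the final database insert.
    public static Map<String, String> resolve(List<String> csvLines) {
        Map<String, String[]> staged = new LinkedHashMap<String, String[]>();
        for (String line : csvLines) {
            String[] f = line.split(",", -1);   // naive split; no quoted fields
            staged.put(f[0], f);                // stage by id
        }
        Map<String, String> resolved = new LinkedHashMap<String, String>();
        for (String[] f : staged.values()) {
            String[] parent = staged.get(f[1]); // look up the row it depends on
            String parentValue = (parent == null) ? "" : parent[2];
            // In a real job this is where the combined row would be
            // inserted into the target RDBMS.
            resolved.put(f[0], f[2] + "|" + parentValue);
        }
        return resolved;
    }
}
```

The same pattern works with an in-memory RDBMS (e.g. H2) as the staging store instead of a `Map`, which is preferable when the file does not fit in heap.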
I am analyzing multiple ways to accomplish this:
- Using core Java, reading the file in a producer-consumer fashion.
- Using Apache Camel and BeanIO to read the CSV file.
- Using SQL to load the file directly.
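For the first option above, a Java 6-compatible sketch of the producer-consumer approach (no lambdas, no try-with-resources): one thread reads lines into a bounded `BlockingQueue`, and the consumer drains it, which is where a JDBC batch insert would go. The poison-pill sentinel and queue size are assumptions of this sketch:

```java
import java.io.*;
import java.util.concurrent.*;

public class CsvPipeline {
    private static final String POISON = "\u0000EOF"; // sentinel marking end of input

    // Producer thread reads the CSV line by line; the consumer (here the
    // calling thread) drains the queue and would hand each row to a JDBC
    // batch insert. Returns the number of data lines consumed.
    public static int pump(final BufferedReader in, int queueSize)
            throws InterruptedException {
        final BlockingQueue<String> queue = new ArrayBlockingQueue<String>(queueSize);
        Thread producer = new Thread(new Runnable() {
            public void run() {
                try {
                    String line;
                    while ((line = in.readLine()) != null) {
                        queue.put(line);       // blocks when the queue is full
                    }
                    queue.put(POISON);         // tell the consumer we are done
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        });
        producer.start();

        int consumed = 0;
        String row;
        while (!(row = queue.take()).equals(POISON)) {
            // Insert `row` into the database here (e.g. PreparedStatement batch).
            consumed++;
        }
        producer.join();
        return consumed;
    }
}
```

The bounded queue gives backpressure: the reader never gets more than `queueSize` lines ahead of the database writes, which keeps memory flat for large files.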
I would like to know: is there an established, industry-preferred way to do this kind of task?

I found a few links on Stack Overflow, but I am looking for more options:
- How to read a large text file line by line using Java?
- Read a huge file of numbers in Java in a memory-efficient way?
- Read large CSV in java
I am using Java 6 for the implementation.