
I am trying to read a CSV file which has 200,000 records, and then using a loop to insert each record, which is quite slow.

So I want to read the file in chunks of 10k: first read rows 1 to 10k, then rows 10k to 20k, and so on.

I have tried the following:

$row1 = fgetcsv($handle, 0, $delimiter); // delimiter is the third argument, not the second
$start = 0;
$endNumber = 8635;

// skip over the rows before the chunk
for ($i = $start; $i < $endNumber; $i++) {
    fgets($handle);
}

This skips records, but it is not working as I need. I also tried this:

for ($i = 8; $row = fgetcsv($handle, 0, $delimiter); $i++) {
    if ($i === $endNumber) {
        // stop once the target row is reached
        fclose($handle);
        return $row;
    }
}
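
Here is a minimal sketch of the chunked approach, assuming the data lives in data.csv (a hypothetical name) and that no field contains embedded newlines (otherwise fgets() cannot be used to skip rows):

&lt;?php
// Read one chunk of $chunkSize rows starting at row $offset.
function readChunk(string $file, int $offset, int $chunkSize, string $delimiter = ','): array
{
    $handle = fopen($file, 'r');
    if ($handle === false) {
        return [];
    }

    // Skip the rows before the chunk; fgets() only advances the
    // file pointer, it does not parse fields.
    for ($i = 0; $i < $offset; $i++) {
        if (fgets($handle) === false) {
            fclose($handle);
            return []; // offset is past the end of the file
        }
    }

    // Read the chunk itself. Note: the delimiter is the THIRD
    // argument of fgetcsv(); the second is the maximum line
    // length (0 = unlimited).
    $rows = [];
    for ($i = 0; $i < $chunkSize; $i++) {
        $row = fgetcsv($handle, 0, $delimiter);
        if ($row === false) {
            break; // end of file
        }
        $rows[] = $row;
    }

    fclose($handle);
    return $rows;
}

// Usage: walk the 200,000 records in 10k chunks.
for ($offset = 0; $offset < 200000; $offset += 10000) {
    $rows = readChunk('data.csv', $offset, 10000);
    if ($rows === []) {
        break;
    }
    // insert $rows here, ideally as one multi-row INSERT per chunk
}

Note that re-skipping from the start of the file for every chunk is O(n²) over the whole file; keeping a single handle open across chunks (or remembering ftell() positions between calls) avoids that cost.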
    Possible duplicate of [Read large data from csv file in php](https://stackoverflow.com/questions/17520093/read-large-data-from-csv-file-in-php) – B001ᛦ Jan 30 '18 at 09:45
  • @B001 It is not the same; mine is about a CSV file with a large number of records, not about file size. –  Jan 30 '18 at 09:48
  • Go through the answers; there is a reference to another answer. – B001ᛦ Jan 30 '18 at 09:50
  • @B001 I had already checked that before saying so. –  Jan 30 '18 at 09:51
  • Does https://stackoverflow.com/questions/14127529/mysql-import-data-from-csv-using-load-data-infile give an alternative? Load the file into a work table and then process it (see the sketch below). – Nigel Ren Jan 30 '18 at 09:54
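
Following up on Nigel Ren's comment, here is a minimal sketch of the LOAD DATA INFILE route, assuming MySQL, a pre-created work table named import_work (hypothetical), hypothetical credentials, and local-infile enabled on both client and server:

&lt;?php
// Sketch only: import_work, data.csv and the credentials are
// assumptions, not taken from the question.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'password',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // allow client-side file loading
);

// One statement imports the whole file, which is typically far
// faster than 200k individual INSERTs. The server must also
// have local_infile enabled.
$pdo->exec(
    "LOAD DATA LOCAL INFILE 'data.csv'
     INTO TABLE import_work
     FIELDS TERMINATED BY ','
     OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     IGNORE 1 LINES"
);

The rows can then be chunked and processed entirely in SQL from import_work.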

0 Answers