I have a website where users can upload a CSV file. Each file might have a different number of columns and rows. I use the following to handle the upload and then write the data into a new table. This all works, but it can be slow for large files because the data is written row by row. Is there a way to write the whole CSV file ($matrix, see below) into the newly created SQL table in one go?
//Handling the csv file and reading data into $matrix
$file = $_FILES['csv1']['tmp_name'];
$handle = fopen($file, "r");
$matrix = array();
while (($row = fgetcsv($handle, 10000, ",")) !== FALSE) {
    $matrix[] = $row;
}
fclose($handle);
$rows = count($matrix);
$cols = 0;
foreach ($matrix as $r) { // rows may have different lengths, so take the widest
    $cols = max($cols, count($r));
}
I then create a table in the database with the correct number of columns (with field names A0, A1 etc., which are renamed later):
//create importDataTable table for all the data
$sql = "CREATE TABLE $tblImportedData (id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY";
for ($colloop = 0; $colloop < $cols; $colloop++) {
    $sql .= ", A$colloop VARCHAR(80)";
}
$sql .= ");";
mysqli_query($connect, $sql);
The data is then inserted into the table row by row:
for ($rowloop = 0; $rowloop < $rows; $rowloop++) { // loops through each row
    $assignments = array();
    for ($colloop = 0; $colloop < $cols; $colloop++) { // loops through each column
        $dataToWrite = isset($matrix[$rowloop][$colloop]) ? $matrix[$rowloop][$colloop] : '';
        $assignments[] = "A$colloop = '" . mysqli_real_escape_string($connect, $dataToWrite) . "'";
    }
    $sql = "INSERT INTO $tblImportedData SET " . implode(", ", $assignments) . ";";
    mysqli_query($connect, $sql);
}
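One idea I have been experimenting with is batching every row into a single multi-row INSERT ... VALUES (...), (...) statement, so only one query hits the server instead of one per row. Here is a minimal sketch of what I mean; the buildMultiInsert helper and the $escape callback are names I made up for illustration, not from any library:

```php
<?php
// Build one multi-row INSERT for the whole $matrix, padding short rows with ''.
// $escape is a callback so the escaping strategy can be swapped in for testing;
// against a live connection it would wrap mysqli_real_escape_string().
function buildMultiInsert($table, $cols, $matrix, $escape) {
    $colNames = array();
    for ($c = 0; $c < $cols; $c++) {
        $colNames[] = "A$c";
    }
    $valueGroups = array();
    foreach ($matrix as $row) {
        $values = array();
        for ($c = 0; $c < $cols; $c++) {
            $v = isset($row[$c]) ? $row[$c] : '';
            $values[] = "'" . $escape($v) . "'";
        }
        $valueGroups[] = "(" . implode(", ", $values) . ")";
    }
    return "INSERT INTO $table (" . implode(", ", $colNames) . ") VALUES "
         . implode(", ", $valueGroups) . ";";
}

// With the connection from above it would be run like this:
// $escape = function ($v) use ($connect) {
//     return mysqli_real_escape_string($connect, $v);
// };
// mysqli_query($connect, buildMultiInsert($tblImportedData, $cols, $matrix, $escape));
```

I gather that for very large files the statement might exceed MySQL's max_allowed_packet, so the rows would need to be sent in chunks, and that LOAD DATA LOCAL INFILE is another option entirely. Would either of these be the right approach here?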