
I have a website where users can upload a CSV file. Each file may have a different number of columns and rows. I use the following code to handle the upload and then write the data into a new table. It all works, but it can be slow for large files because the data is written row by row. Is there a way to write the whole CSV file ($matrix, see below) into the newly created SQL table in one go?

//Handling the csv file and reading data into $matrix
$file = $_FILES['csv1']['tmp_name'];
$handle = fopen($file, "r");

$matrix = array();
$cols = 0;
while (($row = fgetcsv($handle, 10000, ",")) !== FALSE) {
    $matrix[] = $row;
    $cols = max($cols, count($row)); // track the widest row
}
fclose($handle);
$rows = count($matrix);

I then create a table in the database with the correct number of columns (with field names A0, A1 etc., which are renamed later):

//create importDataTable table for all the data
$sql = "CREATE TABLE $tblImportedData (id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY";
for ($colloop=0; $colloop<$cols; $colloop++) {
    $sql = $sql . ", A$colloop VARCHAR(80)";
}
$sql = $sql . ");";
mysqli_query($connect, $sql);

The data is then inserted into the table row by row:

for ($rowloop = 0; $rowloop < $rows; $rowloop++) { //loops through each row
    $assignments = array();
    for ($colloop = 0; $colloop < $cols; $colloop++) { //loops through each column
        $dataToWrite = isset($matrix[$rowloop][$colloop]) ? $matrix[$rowloop][$colloop] : '';
        $assignments[] = "A$colloop = '" . mysqli_real_escape_string($connect, $dataToWrite) . "'";
    }
    $sql = "INSERT INTO $tblImportedData SET " . implode(", ", $assignments);
    mysqli_query($connect, $sql);
}
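One way to do the insert "in one go" from PHP itself is a single multi-row INSERT with `?` placeholders, binding every value through a prepared statement (which also addresses the injection issue raised in the comments below). This is a sketch, not tested against a live server; `buildBulkInsert` is a helper name I've made up, and `$connect`, `$tblImportedData`, `$matrix` and `$cols` are the variables from the code above.

```php
// Sketch: build one multi-row INSERT covering the whole $matrix.
// Column names A0, A1, ... match the CREATE TABLE loop above.
function buildBulkInsert($table, $matrix, $cols) {
    $colNames = array();
    for ($c = 0; $c < $cols; $c++) {
        $colNames[] = "A$c";
    }
    // One "(?, ?, ...)" group per row of the matrix.
    $rowPlaceholder = '(' . implode(', ', array_fill(0, $cols, '?')) . ')';
    $sql = "INSERT INTO $table (" . implode(', ', $colNames) . ") VALUES "
         . implode(', ', array_fill(0, count($matrix), $rowPlaceholder));
    // Flatten the matrix into one list of values, padding short rows with ''.
    $values = array();
    foreach ($matrix as $row) {
        for ($c = 0; $c < $cols; $c++) {
            $values[] = isset($row[$c]) ? $row[$c] : '';
        }
    }
    return array($sql, $values);
}

// Usage (assumes $connect and $tblImportedData from the code above):
// list($sql, $values) = buildBulkInsert($tblImportedData, $matrix, $cols);
// $stmt = mysqli_prepare($connect, $sql);
// mysqli_stmt_bind_param($stmt, str_repeat('s', count($values)), ...$values);
// mysqli_stmt_execute($stmt);
```

Note that a very large file may exceed MySQL's `max_allowed_packet` or the prepared-statement placeholder limit, so in practice you would chunk `$matrix` into batches of a few hundred rows each.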
  • 1
    **WARNING**: When using `mysqli` you should be using [parameterized queries](http://php.net/manual/en/mysqli.quickstart.prepared-statements.php) and [`bind_param`](http://php.net/manual/en/mysqli-stmt.bind-param.php) to add any data to your query. **DO NOT** use string interpolation or concatenation to accomplish this because you have created a severe [SQL injection bug](http://bobby-tables.com/). **NEVER** put `$_POST`, `$_GET` or data *of any kind* directly into a query, it can be very harmful if someone seeks to exploit your mistake. – tadman Oct 28 '19 at 17:20
  • Note: The [object-oriented interface to `mysqli`](https://www.php.net/manual/en/mysqli.quickstart.connections.php) is significantly less verbose, making code easier to read and audit, and is not easily confused with the obsolete `mysql_query` interface where missing a single `i` can cause trouble. Example: `$db = new mysqli(…)` and `$db->prepare("…")` The procedural interface is largely an artifact from the PHP 4 era when `mysqli` API was introduced and should not be used in new code. – tadman Oct 28 '19 at 17:20
  • If you can, just [load it in directly using `LOAD DATA INFILE`](https://stackoverflow.com/questions/14127529/mysql-import-data-from-csv-using-load-data-infile). – tadman Oct 28 '19 at 17:20
  • @tadman Thanks - I will have a look at this. Would this work for writing the $matrix array to the MySQL table? There is some work done on the data prior to writing it to the table which, for the sake of keeping the question shorter, I have not included here. – RGriffiths Oct 28 '19 at 17:23
  • 1
    It's often easier to do post-processing in the database than it is to do pre-processing and slowly insert the data. Import into a temporary table, rework the data, then merge it in the final location with `INSERT INTO ... SELECT FROM ...` – tadman Oct 28 '19 at 17:24
  • @tadman Yes I realise that can be done but my question is whether or not it is possible to write the $matrix array to the table in one go in the same way as the LOAD DATA INFILE works. – RGriffiths Oct 28 '19 at 17:28
  • Are you querying the matrix data directly or just using that as a dumb data store? if the latter, just save it into a singular JSON column and be done with it. Don't make this many columns, it's pointless. – tadman Oct 28 '19 at 17:30
  • 1
    @tadman Thanks for your assistance. LOAD DATA INFILE working and much quicker. Appreciate your help. – RGriffiths Oct 29 '19 at 23:04

1 Answer


As @tadman suggested in the comments, you should use the `LOAD DATA INFILE` functionality that comes out of the box with MySQL.
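Since the question pre-processes the data in `$matrix` before it reaches the database, one way to combine that with `LOAD DATA` is to write `$matrix` back out to a temporary CSV file and bulk-load that. The sketch below is untested against a live server; `matrixToLoadData` is a helper name I've made up, and `LOAD DATA LOCAL INFILE` must be enabled on both the client and the server for the query to run.

```php
// Sketch: dump the (pre-processed) $matrix to a temp CSV file and build
// a LOAD DATA LOCAL INFILE statement targeting columns A0, A1, ...
function matrixToLoadData($table, $matrix, $cols) {
    $tmp = tempnam(sys_get_temp_dir(), 'csv');
    $out = fopen($tmp, 'w');
    foreach ($matrix as $row) {
        fputcsv($out, array_pad($row, $cols, '')); // pad short rows with ''
    }
    fclose($out);
    // Explicit column list so the auto-increment id column is skipped.
    $colList = implode(', ', array_map(function ($c) { return "A$c"; }, range(0, $cols - 1)));
    $sql = "LOAD DATA LOCAL INFILE '" . addslashes($tmp) . "' INTO TABLE $table "
         . "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\\\"' "
         . "($colList)";
    return array($tmp, $sql);
}

// Usage (assumes $connect and $tblImportedData from the question):
// list($tmp, $sql) = matrixToLoadData($tblImportedData, $matrix, $cols);
// mysqli_query($connect, $sql);
// unlink($tmp);
```

Because the server reads the whole file in one statement, this avoids the per-row round trips that made the original loop slow.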

hd1