I am currently using arrays and fputcsv to export MySQL tables to a CSV file. This works well, but I have one table which is too large (100,000+ rows) for this method, so my site times out.
My current code is:
<?php
//open database connection
require ('../database-config.php');
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported_archive_all.csv');
//SQL Query for Data
$sql = "SELECT * FROM inverter_data_archive;";
//Prepare Query, Bind Parameters, Execute Query
$STH = $dbh->prepare($sql);
$STH->execute();
//Export to .CSV
$fp = fopen('php://output', 'w');
// first set
$first_row = $STH->fetch(PDO::FETCH_ASSOC);
$headers = array_keys($first_row);
fputcsv($fp, $headers); // put the headers
fputcsv($fp, array_values($first_row)); // put the first row
while ($row = $STH->fetch(PDO::FETCH_NUM)) {
    fputcsv($fp, $row); // push the rest
}
fclose($fp);
?>
I have read that a possible approach is to read each row from the query result set and write it directly to php://output before fetching the next row, rather than building a large array or the whole CSV in memory.
I've tried a few things, but I struggle with loops, so any help on how to achieve this would be much appreciated. My best guess at what that row-by-row version might look like is below.
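From what I've read, something like the following is roughly what I'm aiming for. To be clear, the unbuffered query setting (PDO::MYSQL_ATTR_USE_BUFFERED_QUERY) and the periodic flush() call are guesses on my part from reading around, not something I've confirmed actually fixes the timeout:

<?php
//open database connection
require ('../database-config.php');
//guess: turn off PDO's buffered queries so MySQL rows are fetched
//as needed instead of the whole result set being loaded up front
$dbh->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported_archive_all.csv');
$STH = $dbh->prepare("SELECT * FROM inverter_data_archive;");
$STH->execute();
$fp = fopen('php://output', 'w');
//write the header row and the first record, then stream the rest
if ($first_row = $STH->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($fp, array_keys($first_row));
    fputcsv($fp, array_values($first_row));
    $count = 0;
    while ($row = $STH->fetch(PDO::FETCH_NUM)) {
        fputcsv($fp, $row);
        //guess: flush every 1000 rows so output is sent to the browser incrementally
        if (++$count % 1000 === 0) {
            flush();
        }
    }
}
fclose($fp);
?>

Is this along the right lines, or is there a better way to handle a table this size?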