
I am currently using arrays and fputcsv to export MySQL tables to a CSV file. This works great; however, I have one table which is too large (100,000+ rows) for this method, so my site times out.

My current code is:

```php
<?php
// open database connection
require ('../database-config.php');
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported_archive_all.csv');

// SQL query for the data
$sql = "SELECT * FROM inverter_data_archive;";
// Prepare query, bind parameters, execute query
$STH = $dbh->prepare($sql);
$STH->execute();

// Export to .csv
$fp = fopen('php://output', 'w');

// First row: its keys double as the CSV header line
$first_row = $STH->fetch(PDO::FETCH_ASSOC);
$headers = array_keys($first_row);
fputcsv($fp, $headers);                 // write the header row
fputcsv($fp, array_values($first_row)); // write the first data row

while ($row = $STH->fetch(PDO::FETCH_NUM)) {
    fputcsv($fp, $row); // write the remaining rows
}
fclose($fp);
?>
```

I have read that a possible approach is to read each row of the query result set individually and write it directly to php://output before fetching the next row, rather than building a large array or assembling the whole CSV in memory.

I've tried a few things, but I struggle with loops, so any help on how to achieve this would be much appreciated.
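For what it's worth, the row-by-row idea described above could be sketched like this. This is only a sketch: it reuses the `$dbh` connection and table name from the question, and the unbuffered-query attribute (`PDO::MYSQL_ATTR_USE_BUFFERED_QUERY`) is specific to the PDO MySQL driver.

```php
<?php
require ('../database-config.php');

// Lift the time limit for this export script only
set_time_limit(0);

header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported_archive_all.csv');

// Unbuffered mode: MySQL streams rows one at a time instead of PHP
// copying the entire result set into memory before the first fetch
$dbh->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$STH = $dbh->prepare("SELECT * FROM inverter_data_archive");
$STH->execute();

$fp = fopen('php://output', 'w');

// The first row's keys double as the CSV header line
$first_row = $STH->fetch(PDO::FETCH_ASSOC);
if ($first_row !== false) {
    fputcsv($fp, array_keys($first_row));
    fputcsv($fp, array_values($first_row));

    // Each subsequent row is fetched and written immediately, so memory
    // use stays roughly constant regardless of the table size
    while ($row = $STH->fetch(PDO::FETCH_NUM)) {
        fputcsv($fp, $row);
    }
}
fclose($fp);
```

One caveat with unbuffered queries: the connection is tied up until the result set is fully read, so no other query can run on `$dbh` inside the loop.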

EllBrandon
    Why do this with PHP when you can do it directly? http://stackoverflow.com/questions/356578/how-to-output-mysql-query-results-in-csv-format – Jay Blanchard Jul 01 '16 at 18:49
  • What about [this](http://stackoverflow.com/questions/4249432/export-to-csv-via-php). Can it be useful to you? – cachique Jul 01 '16 at 18:51
  • The example on that page still loads the whole table into an array so causes the same memory / timeout issues. – EllBrandon Jul 02 '16 at 22:26

1 Answer


Check your php.ini file. Search for the following settings and increase their values: `max_execution_time`, `max_input_time`, `max_input_vars`, `post_max_size`, `upload_max_filesize`.
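For example, the relevant php.ini lines might look like this (the values are illustrative, not recommendations; tune them to your server):

```ini
; php.ini — raise limits for long-running scripts
max_execution_time  = 300
max_input_time      = 300
max_input_vars      = 5000
post_max_size       = 64M
upload_max_filesize = 64M
```

Note that the last two govern uploads, not downloads; for an export script like this one, `max_execution_time` (and possibly `memory_limit`) are the settings that actually matter.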

  • [It's not hard to earn enough rep to make comments.](http://meta.stackexchange.com/questions/214173/why-do-i-need-50-reputation-to-comment-what-can-i-do-instead) While this may answer the question, providing additional context regarding why and/or how this answers the question improves its long-term value. – Jay Blanchard Jul 01 '16 at 19:22
  • Although not the solution I was looking for, php.ini has allowed me to use my existing code for large tables. I have added these 2 lines to the top: `ini_set('memory_limit', '-1'); ini_set('max_execution_time', 0);` – EllBrandon Jul 02 '16 at 22:28