I'm working on a project where I need to pull data from a database containing almost 10,000 rows and export it to CSV. With the straightforward download approach I always hit a memory limit error, even after raising memory_limit to 256MB.
If any of you have run into the same problem, please share your ideas on the best solution or approach.
I really appreciate your thoughts, guys.
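One approach that usually avoids the memory limit is to stream each row to the output as it is fetched, instead of building the full 10k-row array first. Here is a minimal sketch; the `stream_csv` helper and its iterable argument are illustrative, not from the original code:

```php
<?php
// Sketch: write each row to the CSV stream as it arrives instead of building
// the whole result set in memory first. $rows can be any iterable -- for
// example an unbuffered PDOStatement -- so only one row is held at a time.
function stream_csv(iterable $rows, $out): void
{
    $headerDone = false;
    foreach ($rows as $row) {
        if (!$headerDone) {
            fputcsv($out, array_keys($row)); // header row from column names
            $headerDone = true;
        }
        fputcsv($out, $row); // one row at a time -- roughly constant memory
    }
}
```

With PDO against MySQL you could additionally disable buffered queries (`$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false)`) and pass the `PDOStatement` itself as `$rows`, so the driver also hands over one row at a time instead of loading the whole result set.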
Here is my actual code:
$filename = date('Ymd_His') . '-export.csv';

// Output the headers for the CSV file
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Content-Description: File Transfer');
header('Content-Type: text/csv');
header("Content-Disposition: attachment; filename={$filename}");
header('Expires: 0');
header('Pragma: public');

// Open the output stream
$fh = fopen('php://output', 'w');

$headerDisplayed = false;
foreach ($formatted_arr_data_from_query as $data) {
    // Add a header row if it hasn't been added yet, using the field keys of the first row
    if (!$headerDisplayed) {
        fputcsv($fh, array_keys($data));
        $headerDisplayed = true;
    }
    // Write this row of the multi-dimensional array to the stream
    fputcsv($fh, $data);
}

// Close the stream
fclose($fh);
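One thing to check before streaming: if output buffering is active (many frameworks call `ob_start()` early), everything written to `php://output` accumulates in the buffer and can itself exhaust memory. A small sketch of clearing any existing buffers before sending the CSV, assuming nothing still needed is sitting in them:

```php
<?php
// Drop any active output buffers so CSV rows go straight to the client
// instead of piling up in memory. Only safe when the buffers hold nothing
// you still need to send.
while (ob_get_level() > 0) {
    ob_end_clean();
}
```

For very large exports you can also call `flush()` every few hundred rows inside the write loop to push data out to the client incrementally.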
Comments:
– Regin: 1. … to send data to the browser? 2. Are you asking the browser to download the file through an HTTP header? 3. How are you fetching the data from your DB?
– Deathlike: Try running mysqldump through the system command; this has solved all of my issues in the past: system("mysqldump -h $dbhost -u $dbuser -p $dbpass $dbname > $backupfile");
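If a SQL dump rather than a CSV is acceptable, the mysqldump suggestion above sidesteps PHP's memory limit entirely, since the dump runs in a separate process. A hedged sketch with the values shell-escaped (the credential variables are placeholders; note that mysqldump takes the password glued to `-p` with no space, otherwise it prompts interactively and treats the next argument as a database name):

```php
<?php
// Hypothetical credentials -- replace with your own configuration.
$dbhost = 'localhost';
$dbuser = 'user';
$dbpass = 'secret';
$dbname = 'mydb';
$backupfile = '/tmp/mydb.sql';

// Build the command with every user-supplied value shell-escaped.
$cmd = sprintf(
    'mysqldump -h %s -u %s -p%s %s > %s',
    escapeshellarg($dbhost),
    escapeshellarg($dbuser),
    escapeshellarg($dbpass),
    escapeshellarg($dbname),
    escapeshellarg($backupfile)
);
// system($cmd); // uncomment to run; requires mysqldump on the PATH
```

This produces SQL `INSERT` statements, not CSV, so it fits a backup use case better than a user-facing export.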