So I have this table:
mysql> DESCRIBE table;
+-------+------------------+------+-----+---------+----------------+
| Field | Type             | Null | Key | Default | Extra          |
+-------+------------------+------+-----+---------+----------------+
| id    | int(15) unsigned | NO   | PRI | NULL    | auto_increment |
| unid  | char(9)          | NO   | UNI | NULL    |                |
| rs    | varchar(255)     | NO   |     | NULL    |                |
+-------+------------------+------+-----+---------+----------------+
3 rows in set (0.00 sec)
Which contains millions of rows:
mysql> SELECT COUNT(1) FROM table;
+----------+
| COUNT(1) |
+----------+
|  9435361 |
+----------+
1 row in set (0.00 sec)
I want to export all rows to a .csv file (I'm using Symfony 2.6). This file is meant to be stored on the server (not downloaded) and later read by PHP.
1st attempt
I tried to make one huge request to select everything at once (as per this blog post) but, despite the use of ->iterate(), this led to Allowed memory size of 1073741824 bytes exhausted after running for ~9s.
ini_set('memory_limit', '1024M');
ini_set('max_execution_time', -1);

$results = $em
    ->getRepository('MyBundle:Entity')
    ->createQueryBuilder('e')
    ->getQuery()
    ->iterate();

$handle = fopen('/path/to/csv/file', 'w'); // no trailing slash: fopen() needs a file path, not a directory
while (false !== ($row = $results->next())) {
    fputcsv($handle, $row[0]->toArray());
    $em->detach($row[0]); // detach the hydrated entity so it can be garbage-collected
}
fclose($handle);
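For reference, a batched variant of that same iterate() loop, clearing Doctrine's whole identity map every N rows instead of detaching entities one by one (the batch size is illustrative, the path is a placeholder):

```php
$batchSize = 1000; // illustrative; tune against your memory limit
$i = 0;

$iterable = $em->getRepository('MyBundle:Entity')
    ->createQueryBuilder('e')
    ->getQuery()
    ->iterate();

$handle = fopen('/path/to/csv/file', 'w');
foreach ($iterable as $row) {
    fputcsv($handle, $row[0]->toArray());
    if ((++$i % $batchSize) === 0) {
        $em->clear();        // detach everything hydrated so far
        gc_collect_cycles(); // encourage PHP to reclaim the freed entities
    }
}
fclose($handle);
```

This is only a sketch: it assumes the same repository and toArray() method as above, and clear() detaches every managed entity, so nothing else should be pending in the EntityManager at that point.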
2nd attempt
I retrieved the total number of rows, then looped, issuing one query per row. But after having written ~260K rows into the .csv file, PHP runs out of memory and throws the same error as above: Allowed memory size of 1073741824 bytes exhausted.
ini_set('memory_limit', '1024M');
ini_set('max_execution_time', -1);

$total = (int) $em
    ->getRepository('MyBundle:Entity')
    ->countAll();

$csv    = '/path/to/csv/file';
$handle = fopen($csv, 'w');

// assumes ids are contiguous and start at 1; note "<=" so the last row is not skipped
for ($i = 1; $i <= $total; $i++) {
    $entity = $em->getRepository('MyBundle:Entity')->findOneById($i);
    fputcsv($handle, $entity->toArray());
    $em->detach($entity);
}
fclose($handle);
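A middle ground between one giant query and one query per row would be keyset pagination: fetch a chunk at a time, keyed on the primary key, and clear the EntityManager between chunks. This is a sketch — the chunk size is illustrative, and it assumes the entity exposes a getId() getter:

```php
$chunk  = 5000; // illustrative chunk size
$lastId = 0;

$handle = fopen('/path/to/csv/file', 'w');
do {
    // WHERE id > :lastId ORDER BY id avoids the growing cost of LIMIT/OFFSET
    $entities = $em->getRepository('MyBundle:Entity')
        ->createQueryBuilder('e')
        ->where('e.id > :lastId')->setParameter('lastId', $lastId)
        ->orderBy('e.id', 'ASC')
        ->setMaxResults($chunk)
        ->getQuery()
        ->getResult();

    foreach ($entities as $entity) {
        fputcsv($handle, $entity->toArray());
        $lastId = $entity->getId(); // assumed accessor
    }
    $em->clear(); // free all entities hydrated in this chunk
} while (count($entities) === $chunk);
fclose($handle);
```

Unlike the id-by-id loop, this also works when ids have gaps.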
3rd attempt
I have thought of using the exec() function to run the mysql command-line client to export the table. However, my manager seems not to like this option.
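For completeness, the command-line route would look something like the following (credentials, database name, and paths are placeholders; SELECT ... INTO OUTFILE requires the FILE privilege and writes the file on the database server, not the web server):

```shell
mysql -u user -p mydb -e "SELECT id, unid, rs FROM table
    INTO OUTFILE '/path/to/csv/file'
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'"
```

Since MySQL streams the result straight to disk, PHP's memory limit never comes into play.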
So, am I making a fool of myself thinking that dumping ~9.5M rows into a .csv file using PHP is even possible? Is there any other way I'm not yet aware of?
Thanks for your help on this one.
Comments:

- […] detached from the persistence context. Maybe you can try to explicitly use unset() after detaching the entity? – Enrich
- Loop with while ($row = $stmt->fetch()) {} and save to file via fputs(), fputcsv() or whatever stream function. This should bypass memory exhaustion. – Declaratory
- unlink() deletes files... – Cumulative
- unset(), ofc. – Enrich
- ini_set('memory_limit', '4096M'); since you have set memory_limit to 1024M, you got Allowed memory size of 1073741824 bytes exhausted? – Departmentalism
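The plain-statement loop suggested in the comments could look roughly like this — DSN, credentials, and the output path are placeholders, and the key point is disabling PDO's buffered-query mode so MySQL streams rows instead of loading the whole result set into PHP memory:

```php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
// Stream rows from the server rather than buffering all ~9.5M of them in PHP.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt   = $pdo->query('SELECT id, unid, rs FROM table');
$handle = fopen('/path/to/csv/file', 'w');
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($handle, $row); // one row at a time; memory use stays flat
}
fclose($handle);
```

This bypasses Doctrine hydration entirely, which is where most of the memory in the two attempts above is going.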