Reading large excel file with PHP

I'm trying to read a 17MB Excel (2003) file with PHPExcel 1.7.3c, but it crashes while loading the file, after exceeding the 120-second time limit I have. Is there another library that can do this more efficiently? I have no need for styling, I only need UTF-8 support. Thanks for your help.

Tranquil answered 3/6, 2010 at 13:36 Comment(0)

File size isn't a good measure when using PHPExcel; it's more important to get some idea of the number of cells (rows × columns) in each worksheet.

If you have no need for styling, are you calling:

$objReader->setReadDataOnly(true);

before loading the file?
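
For reference, a minimal sketch of what that can look like for an Excel 2003 (.xls) file; $inputFileName is a placeholder for your own path:

// Excel5 is PHPExcel's reader for .xls (Excel 97-2003) files
$objReader = PHPExcel_IOFactory::createReader('Excel5');
// Skip styles, formatting and comments; load cell values only
$objReader->setReadDataOnly(true);
$objPHPExcel = $objReader->load($inputFileName);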

If you don't need to access all worksheets, or only certain cells within a worksheet, look at using

$objReader->setLoadSheetsOnly(array(1,2))

or

$objReader->setLoadSheetsOnly(1)

or defining a readFilter
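
A read filter is simply a class implementing PHPExcel_Reader_IReadFilter. A rough sketch (the A to J column range is arbitrary; restrict it to whatever cells you actually need):

class ColumnReadFilter implements PHPExcel_Reader_IReadFilter
{
    public function readCell($column, $row, $worksheetName = '')
    {
        // Only cells in columns A to J will be loaded; everything else is skipped
        return in_array($column, range('A', 'J'));
    }
}

$objReader->setReadFilter(new ColumnReadFilter());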

Are you using cell caching? If so, which method? Cell caching slows down the load time.
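
If you're not sure whether caching is on: it is normally enabled explicitly before the workbook is loaded, along these lines (cache_in_memory_gzip is just one of several available methods):

// Trades slower loading for a smaller memory footprint per cell
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);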

Darreldarrell answered 3/6, 2010 at 13:55 Comment(0)

If the only thing you need from your Excel file is the data, here is my way of reading huge Excel files:

I install Gnumeric on my server, e.g. on Debian/Ubuntu: apt-get install gnumeric

Then the PHP calls to read my Excel file and store it in a two-dimensional data array (dimensions are rows and columns) are incredibly simple:

system("ssconvert \"$excel_file_name\" \"temp.csv\"");
$array = array_map("str_getcsv", file("temp.csv"));

Then I can do what I want with my array. This takes less than 10 seconds for a 10MB XLS file, about the same time I would need to open the file in my favorite spreadsheet software!
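
If the spreadsheet's file name isn't fully under your control, it's safer to quote it before handing it to the shell; a small variation on the call above using PHP's escapeshellarg():

// Quote both file names so spaces or shell metacharacters can't break the command
$cmd = "ssconvert " . escapeshellarg($excel_file_name) . " " . escapeshellarg("temp.csv");
system($cmd);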

For very large files, you should use the fopen() and fgetcsv() functions and do what you have to do row by row, without storing the data in a huge array, to avoid holding the whole CSV file in memory with the file() function. This will be slower, but will not eat all your server's memory!
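
A rough sketch of that streaming approach (still reading the temp.csv produced by ssconvert above):

$handle = fopen("temp.csv", "r");
if ($handle !== false) {
    while (($row = fgetcsv($handle)) !== false) {
        // $row is a numerically indexed array of the current line's columns;
        // do your per-row work here (e.g. insert into a database) instead of accumulating it
    }
    fclose($handle);
}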

Invocation answered 17/9, 2013 at 14:4 Comment(0)

17MB is a hefty file.

Time how long a 1MB file takes to parse so you can work out how long a 17MB file would take. Then one option might be simply to increase your 120-second limit.
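
If raising the limit is an option on your host, it is a one-liner at the top of the import script (600 seconds here is just an illustrative value):

set_time_limit(600);    // or: ini_set('max_execution_time', '600');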

Alternatively, you could export to CSV, which will be way more efficient, and import via PHP's fgetcsv.

Vingtetun answered 3/6, 2010 at 13:50 Comment(0)

I've heard that Excel Explorer is better at reading large files.

Thordis answered 3/6, 2010 at 13:57 Comment(2)
At $500 for commercial use, I'd hope it would be better. – Darreldarrell
Price and quality have no correlation. You only need a $99 personal license if you want to use it for your own web application. – Thordis

Maybe you could convert/export to CSV and use the built-in fgetcsv(). It depends on what kind of functionality you need.

Illusive answered 29/6, 2010 at 0:59 Comment(0)

I'm currently using spreadsheet-reader (https://github.com/nuovo/spreadsheet-reader), which is quite fast at reading XLSX, ODS and CSV, and only has the problems mentioned here when reading the XLS format.
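
For what it's worth, usage is essentially just iterating over rows; roughly, based on the project's README (file names here are placeholders):

require('php-excel-reader/excel_reader2.php'); // only needed for XLS support
require('SpreadsheetReader.php');

$reader = new SpreadsheetReader('bigfile.xlsx');
foreach ($reader as $row)
{
    // $row is an array of the current row's cell values
}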

Clemenceau answered 1/4, 2015 at 20:10 Comment(0)
