Laravel lazy collection for huge data
I am querying a large data set from a table and then iterating through a loop to create a JSON file.

$user = App\User::all();

foreach($user as $val){
  // logic goes here for creating the json file
}

Now the problem I am facing is that iterating through the loop consumes a lot of memory and I get the error 'Allowed memory size exhausted'. The CPU usage of the server also becomes very high. My question is how I should use Laravel lazy collections to get rid of this issue. I have gone through the official docs but couldn't find the way.

Fibrillation answered 20/11, 2019 at 7:21 Comment(8)
What are you doing with the data inside your foreach loop? And you can't find the way to do what?Brashear
I am creating a new array based on certain conditions, and later that array is converted to JSON.Fibrillation
As you keep adding to this array it will use more and more memory ... and what can't you find the way to do in the docs?Brashear
I don't understand how to use Laravel's lazy collection and create the array based on the condition.Fibrillation
Create the array based on what condition? ... the 3rd code block at laravel.com/docs/6.x/collections#lazy-collections shows how to get a lazy collection from the query; you don't need the filter call (as that is just an example). "However, the query builder's cursor method returns a LazyCollection instance."Brashear
Condition means if and else, on which some flags are changed ... my query returns the data very fast, but my problem is with the foreach loop and I want to get rid of it.Fibrillation
If you need to iterate the data you need to iterate the data, you don't just get rid of it ... in the docs it gets the lazy collection from the cursor method and then iterates it with a foreach loop.Brashear
Let us continue this discussion in chat.Fibrillation

Just replace the all method with the cursor one.

$user = App\User::cursor();

foreach($user as $val){
  // logic goes here for creating the json file
}

For more information about the methods you can chain, refer to the official documentation.
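If the goal is the original question's JSON file, one option is to stream each row straight to disk so the full result set never sits in a PHP array. A minimal sketch, assuming an App\User model and a hypothetical storage/app/users.json target with illustrative columns:

<?php
// Sketch only: cursor() runs one query but hydrates a single model at a time,
// and each encoded row is written immediately instead of being accumulated.
$handle = fopen(storage_path('app/users.json'), 'w');
fwrite($handle, '[');

$first = true;
foreach (App\User::cursor() as $user) {
    // Columns here are placeholders; build whatever structure you need.
    $row = json_encode(['id' => $user->id, 'name' => $user->name]);
    fwrite($handle, ($first ? '' : ',') . $row);
    $first = false;
}

fwrite($handle, ']');
fclose($handle);

Note that cursor() keeps memory flat on the PHP side, but with MySQL the PDO driver may still buffer the whole result set unless unbuffered queries are enabled.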

Headpin answered 20/11, 2019 at 7:32 Comment(0)

If you need to handle a big/large/huge collection/array of data with Laravel, you have to use the LazyCollection class (documentation) together with the chunk() method (documentation).

Using chunks is necessary because if your script fails in the middle of a large array, the job cancels everything with an error; with chunks, at least part of the work will already have been handled. And LazyCollection keeps memory usage low. So to get the most benefit, the best way is to use the two together.

<?php
use Illuminate\Support\LazyCollection;

// $bigArray - an array with large data

$chunkSize = 100;
$lazy = collect($bigArray)->lazy();
// or MyModel::lazy($chunkSize)->..., NOT MyModel::all()->lazy()
$lazy->chunk($chunkSize)
    ->each(function (LazyCollection $items) {
        $items->each(function ($item) {
            // do your job here with $item
        });
    });

This approach on its own does not guarantee a complete solution to problems like a 504 Gateway Time-out or memory exhaustion. You may also need to increase memory_limit and max_execution_time, and fastcgi_read_timeout if you use the Nginx web server; these can be adjusted via Laravel/PHP and Nginx settings.
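A hedged sketch of the PHP side of those limits (values are illustrative; fastcgi_read_timeout is set in the Nginx configuration itself, not from PHP):

<?php
// Illustrative values only; tune to your workload. For long-running exports,
// a queued job or Artisan command is usually a better fit than a web request.
ini_set('memory_limit', '512M'); // raise the per-request memory cap
set_time_limit(300);             // allow up to five minutes of execution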

Johnette answered 23/6, 2023 at 7:2 Comment(0)
