
In the same system I can make calls to the database with no problem, but in some cases (with the biggest table) I get:

"PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 32 bytes) in /home/forge/sximo.sp-marketing.com/vendor/laravel/framework/src/Illuminate/Database/Connection.php on line 311"

I debugged the code and the problem is a basic query:

"SELECT partidascapturainfo.* FROM partidascapturainfo WHERE partidascapturainfo.partidascapturainfoid IS NOT NULL ORDER BY partidascapturainfoid ASC LIMIT 0, 10"

When I run the query in a MySQL client, it runs in 0.17 s.

I've already set memory_limit to 2048M and restarted nginx, and my query only returns 10 rows...

Here are my 10 rows:

123044,42016,249,3762,2,,0
123045,42016,249,3761,2,,0
123046,42016,249,3764,1,,0
123047,42016,249,3765,,,0
123048,42016,249,3775,,,0
123049,42016,249,3771,3,,0
123050,42016,249,3772,3,,0
123051,42016,250,3844,HAY,,0
123052,42016,255,3852,,,0
123053,42017,249,3761,1,,0

Any idea what's going on?

Kenny Horna
Juliatzin
  • If you show us what kind of data you are retrieving, it could just be that the data in those 10 rows is massive. You might want to profile your code to see where the memory is getting eaten up. – jardis Jan 18 '16 at 22:37
  • how can I profile my code? I will update my question with a row as an example – Juliatzin Jan 18 '16 at 22:41
  • The simplest way is http://stackoverflow.com/a/880483/3358181, but there are more advanced tools. – jardis Jan 18 '16 at 23:10
  • Possible duplicate of [Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php](https://stackoverflow.com/questions/415801/allowed-memory-size-of-33554432-bytes-exhausted-tried-to-allocate-43148176-byte) – Muhammad Mar 06 '18 at 18:50

16 Answers

76

You can try editing /etc/php5/fpm/php.ini:

; Old Limit
; memory_limit = 512M

; New Limit
memory_limit = 2048M

You may need to restart nginx:

sudo systemctl restart nginx
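If you want to script that edit, here is a minimal sketch. It works on a local sample file to stay runnable; in practice you would point sed at /etc/php5/fpm/php.ini (keeping the .bak backup it creates):

```shell
# Create a sample php.ini-style file to edit (stand-in for /etc/php5/fpm/php.ini).
printf 'upload_max_filesize = 2M\nmemory_limit = 512M\n' > sample-php.ini

# Raise memory_limit in place, keeping the original as sample-php.ini.bak.
sed -i.bak 's/^memory_limit = .*/memory_limit = 2048M/' sample-php.ini

# Confirm the new value took effect.
grep '^memory_limit' sample-php.ini   # memory_limit = 2048M
```

Note that this only changes the file on disk; PHP-FPM still has to be restarted to pick the new value up.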

You may also have an infinite loop somewhere. Can you post the code you're calling?

Stefano A.
Deciple
24

This happened to me with Laravel 5.1 on PHP 7 when I was running a bunch of unit tests.

The solution was to change memory_limit in php.ini, but it has to be the correct one: the file used by the SAPI that is actually running your code. In my case it was located at:

/etc/php/7.0/cli/php.ini

Find the line with

 memory_limit

and raise it. After that, restart the PHP service:

sudo service php7.0-fpm restart

To check whether the change was applied, I ran this from the command line:

 php -i

The report contained the following line:

memory_limit => 2048M => 2048M

Now the test cases run fine.
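The per-SAPI point above is worth emphasizing: PHP loads a different php.ini for the CLI than for FPM or Apache, so editing one has no effect on the other. A small sketch of that layout, using simulated local paths modeled on the ones in this answer:

```shell
# Simulate the split config layout: one php.ini per SAPI.
mkdir -p etc/php/7.0/cli etc/php/7.0/fpm
printf 'memory_limit = 2048M\n' > etc/php/7.0/cli/php.ini
printf 'memory_limit = 512M\n'  > etc/php/7.0/fpm/php.ini

# The CLI would report 2048M here while FPM requests still get 512M,
# which is why you must raise the limit in the file for the SAPI you actually use.
grep -H '^memory_limit' etc/php/7.0/*/php.ini
```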

Yevgeniy Afanasyev
3

This problem occurred for me when using a nested try/catch and calling the $ex->getPrevious() function while logging an exception. Maybe your code has an endless loop, so first check the code and increase the memory limit only if necessary:

 try {
     // get the latest product data and stock from the API
     $latestStocksInfo = Product::getLatestProductWithStockFromApi();
 } catch (\Exception $error) {
     try {
         $latestStocksInfo = Product::getLatestProductWithStockFromDb();
     } catch (\Exception $ex) {
         // log the exception
         Log::channel('report')->error(['message' => $ex->getMessage(), 'file' => $ex->getFile(), 'line' => $ex->getLine(), 'Previous' => $ex->getPrevious()]); // <- this call caused the memory error
         Log::channel('report')->error(['message' => $ex->getMessage(), 'file' => $ex->getFile(), 'line' => $ex->getLine()]); // <- this call is fine
     }
     Log::channel('report')->error(['message' => $error->getMessage(), 'file' => $error->getFile(), 'line' => $error->getLine()]);
 }
2

Share the lines of code executed when you make this request. There might be an error in your code.

Also, you can change the memory limit in your php.ini file via the memory_limit setting. Try doubling your memory to 64M. If this doesn't work you can try doubling it again, but I'd bet the problem is in your code.

ini_set('memory_limit', '64M');
Alec Walczak
2

I realize there is an accepted answer, and apparently either the increased memory size or the infinite-loop suggestion solved the issue for the OP.

For me, I had added an array to the config file and made some other changes before running artisan and getting the out-of-memory error, and no amount of increasing memory helped. It turned out to be a missing comma after the array I had added to the config file.

I am adding this answer in the hope that it helps someone else figure out what might be causing an out-of-memory error. I am using Laravel 5.4 under MAMP.

StevenHill
1

I got this error when I restored a database and didn't add the user account and privileges back. Another site gave me an authentication error, so I didn't think to check that, but as soon as I added the user account back, everything worked again!

Dylan Glockler
1

I had this problem when trying to resize a CMYK JPEG using the Intervention/GD library. I had to increase the memory_limit.

Keith Turkowski
1

Sometimes limiting your data also helps; for example, compare the following:

// This caused the error:
print_r($request);

// This resolved the issue:
print_r($request->all());

Naser Nikzad
1

I suggest you check your Apache server's available virtual memory in MB first, and then set the memory limit, because the server can crash if it has 1024 MB of virtual memory and you set the limit to 2048 MB.

Use ini_set in PHP code:

ini_set('memory_limit', '512M');

Also, you can change it in php.ini:

memory_limit = 512M

To find the location of php.ini from the command line on Linux or Windows:

php --ini

  • Thanks for your answer. This solution treats the symptom but not the cause. If a process uses far more memory than configured, there might be something wrong with the process. – shaedrich May 27 '21 at 07:17
0

I had the same problem. No matter how much I increased memory_limit (I even tried 4 GB) I got the same error, until I figured out it was caused by wrong database credentials set up in the .env file.

GarryOne
0

I have also been through that problem. In my case, I was adding data to an array and passing that array back into itself, which ran into the memory limit. Some things you need to consider:

  1. Review your code and look for any loop that runs infinitely.

  2. Drop unwanted columns if you are retrieving data from the database.

  3. If necessary, increase the memory limit in XAMPP or whatever other stack you are running.

Lakmi
0

While using Laravel on an Apache server there is another php.ini:

 /etc/php/7.2/apache2/php.ini

Modify the memory_limit value in this file:

 memory_limit=1024M

and restart the Apache server:

 sudo service apache2 restart
Athul Raj
0

For XAMPP it is at xampp\php\php.ini. The option in mine now looks like this:

; Maximum amount of memory a script may consume
; http://php.net/memory-limit
memory_limit=2048M
;memory_limit=512M
CodeToLife
0

For LiteSpeed servers with the lsphp*.* package.

Use the following command to find the currently set memory limit for PHP applications:

php -r "echo ini_get('memory_limit').PHP_EOL;"

To locate the active php.ini file from the CLI:

php -i | grep php.ini

Example:

/usr/local/lsws/lsphp73/etc/php/7.3/litespeed/php.ini

To change the php.ini default value to a custom one:

php_memory_limit=1024M # or whatever you want it set to
sed -i 's/memory_limit = .*/memory_limit = '${php_memory_limit}'/' /usr/local/lsws/lsphp73/etc/php/7.3/litespeed/php.ini

Don't forget to restart LiteSpeed with: systemctl restart lsws

0

If you are in the same situation as me:

  • Laradock or other Docker containers
  • No errors
  • No logs
  • The Laravel request eats up all your memory in Docker and is killed silently
  • You even tried to debug your code with Xdebug, and nothing happened after the exception

The reason is Xdebug 3.0 with develop mode enabled: it prints detailed info about exceptions, and in Laravel's case this takes a lot of memory; even 3 GB is not enough, and the process dies without printing anything about it.

Just disable Xdebug, or at least develop mode, and it will be fine.
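For Xdebug 3 specifically, you can drop just the develop mode without removing the extension. This sketch edits a simulated copy of the config file; the real one typically lives somewhere like /etc/php/*/mods-available/xdebug.ini, so adjust the path for your installation:

```shell
# Simulated Xdebug 3 config file (stand-in for the real mods-available/xdebug.ini).
printf 'zend_extension=xdebug.so\nxdebug.mode=develop,debug\n' > xdebug.ini

# Drop the memory-hungry "develop" mode, keeping step debugging available.
sed -i.bak 's/^xdebug.mode=.*/xdebug.mode=debug/' xdebug.ini
grep '^xdebug.mode' xdebug.ini   # xdebug.mode=debug
```

Xdebug 3 also honors the XDEBUG_MODE environment variable, so a one-off run such as XDEBUG_MODE=off php artisan test should sidestep the problem without any config change.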

KorbenDallas
-1

You can also get this error if you fail to include an .htaccess file or have a problem in it. The file should look something like this:

<IfModule mod_rewrite.c>
    <IfModule mod_negotiation.c>
        Options -MultiViews -Indexes
    </IfModule>

    RewriteEngine On

    # Handle Authorization Header
    RewriteCond %{HTTP:Authorization} .
    RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]

    # Redirect Trailing Slashes If Not A Folder...
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_URI} (.+)/$
    RewriteRule ^ %1 [L,R=301]

    # Handle Front Controller...
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]
</IfModule>

Mitch M