309

I'm getting this error message; any suggestions?

Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php

Damjan Pavlica
  • 21,431
  • 6
  • 55
  • 65
panidarapu
  • 8,031
  • 6
  • 24
  • 24
  • 1
    What is the script doing when it fails? can you post the code? – Neil Aitken Jan 06 '09 at 09:43
  • we are trying to read a .txt file – panidarapu Jan 07 '09 at 05:06
  • 3
    Looks like a very huge txt file. – macbirdie Jan 22 '09 at 10:07
  • 10
    Conventionally, you read files that are of potentially large or arbitrary size one line at a time, over-writing the previous line memory with each line read. Or you may just want to tail or head the file to get the latest entries. Upping your memory allocation as the file grows is not the answer. – ekerner May 21 '11 at 01:12
  • 1
    Possible duplicate of [Fatal Error: Allowed Memory Size of 134217728 Bytes Exhausted (CodeIgniter + XML-RPC)](http://stackoverflow.com/questions/561066/fatal-error-allowed-memory-size-of-134217728-bytes-exhausted-codeigniter-xml) – kenorb Jan 13 '16 at 21:57
  • 5
    Increase your maximum memory limit to 64MB in your php.ini file. [Google search](http://www.google.com/search?hl=en&rlz=1G1GGLQ_ENUS302&sa=X&oi=spell&resnum=0&ct=result&cd=1&q=increase+memory+limit+php&spell=1) But could I ask why you are trying to allocate that much memory? What line of code does it fail at? –  Jan 06 '09 at 08:21
  • PHP is efficient if you use it right. It is hard though to keep track of all of your objects due to the managed nature of the runtime - not unlike C#. But too many high-level programmers period (including C#) do not have an appreciation of how their code affects the resources it runs on. –  Jan 07 '09 at 17:08
  • It's not necessarily a language problem - it's an algorithm problem, too. Too many PHP programmers do repeated actions on a whole dataset rather than doing all processing on one item at a time. – staticsan Jan 06 '09 at 22:26
  • PHP can be very inefficient with memory usage, I have often seen simple datagrids blow well into 80mb with a mere couple hundred records. This seems to especially happen when you go the OOP route. – TravisO Jan 06 '09 at 17:16

22 Answers

353

At last I found the answer:

Just add this line to your PHP file before the line where you get the error:

ini_set('memory_limit', '-1');

It lifts the memory limit entirely, so the script can use as much of the server's memory as it needs, and it works fine.

Consider a bounded value such as '44M' instead of '-1' for safer memory usage.
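If you do raise the limit, a safer pattern is to scope a bounded increase to the one operation that needs it and then restore the previous setting. A minimal sketch (the 128M value is an assumption; size it to what your operation actually needs):

```php
<?php
// Sketch: raise the limit only around the memory-hungry operation,
// then restore it for the rest of the request. 128M is an example value.
$previous = ini_get('memory_limit');
ini_set('memory_limit', '128M');    // bounded, unlike '-1'

// ... run the memory-hungry operation here ...

ini_set('memory_limit', $previous); // back to the original setting
```

This way a runaway request still gets killed at a sane ceiling instead of consuming everything the server has.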

panidarapu
  • 8,031
  • 6
  • 24
  • 24
  • 134
    You still should check *why* the memory is exhausted. Maybe you don't need to read the whole file, maybe read it sequentially. – macbirdie Jan 22 '09 at 10:12
  • Worked great. I know _why_ memory was being exhausted - I'm using Zend-Feed to consume a rather large-ish Atom feed. I can't control the size of it or do anything other than swallow the whole feed in one pull, so bumping the memory limit for that one operation solved it. Cheers! – Don Jones Oct 13 '09 at 20:53
  • 8
    - @panidarapu and @Don Jones: Depending on the amount of memory, and *how* this script is used, it could be dangerous to allow the change in memory usage in this way. Don, in your case, you can likely break the feed down into smaller chunks and parse what you need. Glad it works, but be careful. – anonymous coward Jun 11 '10 at 20:51
  • 6
    This suggestion worked for me. Dynamically increasing memory limit in the script is done via function ini_set(): ini_set('memory_limit', '128M'); – mente Feb 12 '11 at 04:16
  • ini_set('memory_limit', '128M'); I used the same thing but it shows the same problem – panidarapu Feb 12 '11 at 04:16
  • 1
    @panidarapu: You might not have permissions to increase the memory limit - especially if you're on a shared host. – Kristian J. Feb 12 '11 at 04:16
  • 1
    If this doesn't work, add 'php_value memory_limit 50M' in your .htaccess. Replace 50M with your own value. – mixdev Mar 02 '11 at 21:24
  • 4
    This is a bad idea that potentially leaves your server open to malicious attacks via large data posts. – ekerner May 21 '11 at 01:07
  • If anyone else is running into this problem with Wordpress, try setting PHP's memory limits within wp-config.php. Here's the code: `define('WP_MEMORY_LIMIT', '128M');` – dmanexe Aug 24 '11 at 02:09
  • @panidarapu To use unlimited memory is too much, isn't it? – Ahmad Hajjar Jan 10 '12 at 20:45
  • I had this error in /wp-includes/js/tinymce/langs/wp-langs.php. Changing memory wasn't possible/didn't work. I just deleted most of the file (from $lang= on). This worked (so now the editor is in English, but so what). – Jeroen K May 09 '12 at 14:11
  • 43
    Guys, please don't go for this quick fix. It can hurt you in long run. As a good programmer, you should figure-out the reason behind this memory consumption & increase as required instead of keeping it `UNLIMITED`. – Sumoanand Jun 11 '13 at 21:17
  • 121
    Wow, this massively upvoted answer seems like really bad practice to me. -1! – Ambidex Oct 24 '13 at 07:16
  • I also feel it's dangerous but what's the worst that can happen in the future? Bring down the server? What if it's down currently? Can it make it get any worse? What if you're using a third party app and cannot change the code? – sam yi Feb 20 '14 at 17:02
  • I also just had this error in my script. It was because I had ended up in an endless loop, which resulted in the stack overflow – peter70 Jun 28 '14 at 07:00
  • Better try to use something like `memory_get_usage` to diagnose the cause. In my case it was an infinite loop. And it is not good to have infinite loops in production :) – chestozo Nov 10 '15 at 18:35
  • 1
    This is an AWFUL solution. Only use this to get things working enough so that you can identify the high memory usage and REDUCE it. This will KILL your production server when you get another memory bug. Rather than killing the one bad request, it will knock out quite a bit more. – wedstrom Jun 30 '16 at 21:55
  • Setting unlimited memory is a very bad idea. It hides the memory consumption of the program, which can certainly be lowered. Moreover, if this is running on a server and called several times a second it will hog the server memory or, probably, end dying on some instances. You MUST analyze your program with memory_get_usage() traces at various places, although values given by this function are sometimes disturbing due to the block-allocation scheme of PHP. – Francis Pierot Jul 01 '16 at 08:18
  • It doesn't work on PHP 5.6 on Windows. I get a similar exhausted memory error. – DavidHyogo Jun 23 '17 at 00:08
  • This is a great article on this error and why NOT to use UNLIMITED memory on a server. Use -1 on a dev for testing only but not on production. https://www.airpair.com/php/fatal-error-allowed-memory-size – nwolybug Dec 28 '17 at 17:38
  • Following guidance here: https://meta.stackoverflow.com/questions/255198/how-to-handle-historical-highly-upvoted-but-completely-incorrect-answers I've had to downvote your answer, as it doesn't fix the problem you described, only masks it - and potentially causes further issues later down the line. – Alex Mulchinock Apr 16 '18 at 13:58
  • This is very bad. If you do this, you might consume a considerable amount of server memory and consequences might be critical – Matthew Barbara Dec 16 '18 at 23:38
  • I agree this is a bad solution because of the potential future risk it creates, but I chose it because it was a quick fix. I was trying to update a Joomla! install to the latest version. Once I was able to do that, I removed the ini_set addition immediately. – Daydah Feb 18 '19 at 07:12
  • this is not a recommended method. it's like you are telling to take as much as memory needed. so run this command "php -d memory_limit=4G bin/magento" in Magento root directory – Mohammed Muzammil Jul 23 '19 at 10:29
  • When you don't like your webserver, set memory limit to infinite... LOL – zanderwar Sep 03 '19 at 06:11
  • awesome init_set but I have a question is this safe to use for a server? – Grald Oct 11 '19 at 14:54
  • I would **never, ever** recommend setting the memory limit to -1 (unlimited) in a production environment. That's a recipe for disaster. Don't make that newbie mistake. Instead, give PHP extra memory only when the piece of code that needs it gets called, rather than increasing the memory limit for all PHP processes. And if increasing the memory limit has gotten rid of the error and your code now works, you'll still need to take measures to decrease that memory usage. – Sayem Feb 20 '20 at 12:55
  • In your .htaccess you can add: php_value memory_limit 64M (the same directive works on both PHP 5.x and PHP 7.x) – neo Apr 22 '20 at 14:31
  • `ini_set('memory_limit', '44M');` worked well in my wordpress case. – Elron Sep 07 '20 at 00:04
  • This does not really solve the issue, it just is a workaround for a still existing problem. – Hiran Chaudhuri Mar 19 '21 at 22:45
67

Here are two simple methods to increase the limit on shared hosting:

  1. If you have access to your php.ini file, change the memory_limit line. If it currently shows 32M, try 64M: memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)

  2. If you don't have access to PHP.ini try adding this to an .htaccess file: php_value memory_limit 64M

Brad Mace
  • 26,280
  • 15
  • 94
  • 141
Haider Abbas
  • 673
  • 5
  • 3
51

Your script is using too much memory. This can often happen in PHP if you have a loop that has run out of control and you are creating objects or adding to arrays on each pass of the loop.

Check for infinite loops.

If that isn't the problem, try to help PHP out by destroying objects that you are finished with, by setting them to null, e.g. $OldVar = null;

Check the code where the error actually happens as well. Would you expect that line to be allocating a massive amount of memory? If not, try and figure out what has gone wrong...
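As a rough sketch of that advice (the 5 MB size here is purely illustrative), you can watch memory_get_usage() drop after releasing a large variable:

```php
<?php
// Sketch: freeing a large variable and observing the effect.
// The ~5 MB string is an arbitrary stand-in for "a big object".
$big = str_repeat('x', 5 * 1024 * 1024); // ~5 MB string
$before = memory_get_usage();

$big = null;          // drop the reference (unset($big) works too)
gc_collect_cycles();  // also collect any cyclic garbage

$after = memory_get_usage();
echo 'Freed roughly ', $before - $after, " bytes\n";
```

Plain scalars and arrays are freed as soon as their refcount hits zero; gc_collect_cycles() only matters for objects that reference each other in cycles.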

Rik Heywood
  • 13,368
  • 9
  • 56
  • 77
  • 3
    I had this exact problem - turned out I had inadvertently created a recursive function - and thus it ran out of memory at a random point during code execution. This had the upside of me now having the world's most memory-efficient code, created in the hunt for a memory leak. – Kris Selbekk Oct 15 '13 at 18:13
  • 1
    For the sake of others who will chase a rabbit down a hole: Doctrine in Symfony, I think, has an issue with Monolog, and when there is a PDO exception it will create an infinite loop of exceptions, as it throws an exception for the exception, thus hiding the real issue (a corrupted db file in my case). – Desislav Kamenov Feb 07 '19 at 19:50
45

Doing:

ini_set('memory_limit', '-1');

is never good. If you want to read a very large file, it is best practice to read it bit by bit. The following code illustrates the pattern.

$path = 'path_to_file_.txt';

$file = fopen($path, 'r');
$len = 1024 * 1024; // read in 1 MiB chunks; any size works, but don't make it too big
$output = fread( $file, $len );

while (!feof($file)) {
    $output .= fread( $file, $len );
}

fclose($file);

echo 'Output is: ' . $output;
Delali
  • 748
  • 1
  • 13
  • 20
  • 7
    Can't believe all these people recommending to set memory_limit to -1... Crazy thing to do on a production server. Thanks for a much cleaner solution. – JohnWolf May 16 '15 at 15:50
  • 2
    While at "best practice", it is good to close the file handler after the `while` loop: `fclose($file)` – kodeart Sep 09 '15 at 09:39
  • 3
    @assetCorp How does this help, provided the file has for example 100MiB and PHP memory limit is still set to 32 MiB. You read it by secure chunks of 1MiB, but then append it into a variable that is going to use all the available memory once the loop reaches 31. iteration. How is it any better? Only outputting the chunks similarly not to require storing them all in one variable would help to solve the problem. – helvete Apr 04 '18 at 09:37
19

It is unfortunately easy to program in PHP in a way that consumes memory faster than you realise. Copying strings, arrays and objects instead of using references will do it, though PHP 5 is supposed to do this more automatically than PHP 4 did. Dealing with your data set in its entirety over several steps is also wasteful compared to processing the smallest logical unit at a time.

The classic example is working with large resultsets from a database: most programmers fetch the entire resultset into an array and then loop over it one or more times with foreach(). It is much more memory-efficient to use a while() loop to fetch and process one row at a time. The same thing applies to processing a file.
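A minimal sketch of that row-at-a-time pattern, using an in-memory SQLite database through PDO so the example is self-contained (the table, column, and row count are made up for illustration):

```php
<?php
// Sketch of the row-at-a-time pattern described above, using an in-memory
// SQLite database via PDO. Table name, column, and row count are invented.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE log (id INTEGER PRIMARY KEY, line TEXT)');
for ($i = 0; $i < 1000; $i++) {
    $db->exec("INSERT INTO log (line) VALUES ('entry $i')");
}

// Wasteful: fetchAll() materialises every row in memory at once.
// $rows = $db->query('SELECT line FROM log')->fetchAll();

// Better: fetch and process one row at a time; peak memory stays flat
// no matter how many rows the result set contains.
$stmt = $db->query('SELECT line FROM log');
$count = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $count++; // process $row here; it is overwritten on the next iteration
}
echo "Processed $count rows\n";
```

The same shape works with any PDO driver; only the peak memory differs between the two approaches, not the rows you see.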

staticsan
  • 28,233
  • 4
  • 55
  • 72
17

I faced the same problem in PHP 7.2 with Laravel 5.6. I just increased memory_limit = 128M in php.ini to what my application demands. It might be 256M, 512M, 1024M, and so on. Now it works fine.

Ziaur Rahman
  • 708
  • 7
  • 22
15

If you want to read large files, you should read them bit by bit instead of reading them at once.
It’s simple math: if you read a 1 MB file at once, then at least 1 MB of memory is needed at the same time to hold the data.

So you should read them bit by bit using fopen & fread.

Gumbo
  • 594,236
  • 102
  • 740
  • 814
  • 1
    Solved this by using: `$fh = fopen($folder.'/'.$filename, "rb") or die(); $buffer = 1024*1024; while (!feof($fh)) { print(fread($fh, $buffer)); flush(); } fclose($fh);` – Avatar Apr 28 '14 at 11:09
7

I was also having the same problem and looked for phpinfo.ini, php.ini, or .htaccess files to no avail. Finally I looked at some PHP files, opened them, and checked the code inside for memory settings. The solution I ended up with worked for me, but since I was using WordPress, it may only apply to WordPress memory-limit problems.

My solution: open the default-constants.php file in the /public_html/wp-includes folder with a code editor and find the memory settings under the wp_initial_constants scope (or just Ctrl+F for the word "memory"). There you will come across WP_MEMORY_LIMIT and WP_MAX_MEMORY_LIMIT. Just increase them; it was 64 MB in my case, and I increased it to 128 MB and then to 200 MB.

// Define memory limits.
if ( ! defined( 'WP_MEMORY_LIMIT' ) ) {
    if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
        define( 'WP_MEMORY_LIMIT', $current_limit );
    } elseif ( is_multisite() ) {
        define( 'WP_MEMORY_LIMIT', '200M' );
    } else {
        define( 'WP_MEMORY_LIMIT', '128M' );
    }
}

if ( ! defined( 'WP_MAX_MEMORY_LIMIT' ) ) {
    if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
        define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
    } elseif ( -1 === $current_limit_int || $current_limit_int > 268435456 /* = 256M */ ) {
        define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
    } else {
        define( 'WP_MAX_MEMORY_LIMIT', '256M' );
    }
}

By the way, please don't do the following, because that's bad practice:

ini_set('memory_limit', '-1');
garakchy
  • 496
  • 10
  • 15
  • 1
    Works for wordpress websites, where access is not available for .htaccess and php.ini files. +1 – Mustafa sabir May 04 '17 at 07:23
  • I'd say that changing those limits on a 'core' WordPress file is not really a good idea; you can so very easily add those limits on `wp-config.php` instead, where they will not get overwritten by future WordPress updates. Also, a few security plugins (such as WordFence, for instance) will complain if 'core' WordPress files are changed... – Gwyneth Llewelyn Jan 03 '20 at 19:12
  • 1
    @GwynethLlewelyn I don't know how to do that, can you elaborate it, please? – garakchy Jan 06 '20 at 18:18
  • 1
    Oh... just edit `wp-config.php` and add the two lines there (i .e. `define( 'WP_MEMORY_LIMIT', '200M' );` and `define( 'WP_MAX_MEMORY_LIMIT', '256M' );`. Unlike the files on the 'core' WP (namely, everything under `wp-includes`), which will be overwritten by WP upgrades, `wp-config.php` will not — it's there exactly for the exact purpose of overriding WP constants! – Gwyneth Llewelyn Jan 07 '20 at 20:38
  • 1
    "bad practice" is situational. `-1` is fine for short-running processes. For example a php builder container that's used for running unit tests or composer installs/etc. Just don't run your production site with it set like that. – emmdee Jan 28 '20 at 20:32
6

You can increase the memory allowed to a PHP script by placing the following line above all other code in the script:

ini_set('memory_limit', '-1'); // enables use of all available memory

Also deallocate unwanted variables in the script.

Check this PHP library: Freeing memory with PHP

Sanjay Kumar N S
  • 4,165
  • 3
  • 18
  • 34
6
ini_set('memory_limit', '-1');
Abdul Rasheed
  • 5,615
  • 3
  • 30
  • 45
4

I notice many answers just try to increase the amount of memory given to a script, which has its place, but more often than not it means that something is being too liberal with memory due to an unforeseen amount of volume or size. Obviously, if you're not the author of a script you're at the mercy of the author, unless you're feeling ambitious :) The PHP docs even say memory issues are due to "poorly written scripts".

It should be mentioned that ini_set('memory_limit', '-1'); (no limit) can cause server instability as 0 bytes free = bad things. Instead, find a reasonable balance by what your script is trying to do and the amount of available memory on a machine.

A better approach: If you are the author of the script (or ambitious) you can debug such memory issues with xdebug. The latest version (2.6.0 - released 2018-01-29) brought back memory profiling that shows you what function calls are consuming large amounts of memory. It exposes issues in the script that are otherwise hard to find. Usually, the inefficiencies are in a loop that isn't expecting the volume it's receiving, but each case will be left as an exercise to the reader :)

The xdebug documentation is helpful, but it boils down to 3 steps:

  1. Install It - Available through apt-get and yum etc
  2. Configure it - xdebug.ini: xdebug.profiler_enable = 1, xdebug.profiler_output_dir = /where/ever/
  3. View the profiles in a tool like QCacheGrind, KCacheGrind
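For reference, step 2's settings in xdebug.ini (xdebug 2.x profiler syntax; the trigger line and output directory are illustrative additions) look like:

```ini
; xdebug.ini - xdebug 2.x profiler settings
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /where/ever/
; optional: profile only when an XDEBUG_PROFILE trigger is present,
; instead of on every request
; xdebug.profiler_enable_trigger = 1
```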
SeanDowney
  • 16,172
  • 17
  • 77
  • 88
2

If you are trying to read a file, that will take up memory in PHP. For instance, if you are trying to open up and read an MP3 file (like, say, $data = file("http://mydomain.com/path/sample.mp3")), it is going to pull it all into memory.

As Nelson suggests, you can work to increase your maximum memory limit if you actually need to be using this much memory.

Beau Simensen
  • 4,458
  • 2
  • 34
  • 55
2

We had a similar situation and tried the ini_set('memory_limit', '-1'); suggestion given at the top of the answers. Everything worked fine, and image files greater than 1 MB were compressed down to KBs.

1

Write

ini_set('memory_limit', '-1');

at the top of your index.php, just after the opening PHP tag

C Travel
  • 4,625
  • 7
  • 29
  • 45
Prafull
  • 573
  • 7
  • 15
1

I was receiving the same error message after switching to a new theme in WordPress. PHP was running version 5.3, so I switched to 7.2. That fixed the issue.

DavGarcia
  • 17,702
  • 14
  • 52
  • 94
1

If you are using shared hosting, you may not be able to increase the PHP memory limit yourself.

Just go to your cPanel, upgrade your PHP version to 7.1 or above, and you should be good to go.

successtar
  • 81
  • 3
0

WordPress users, add this line:

@ini_set('memory_limit', '-1');

in wp-settings.php, which you can find in the WordPress installation's root folder

C Travel
  • 4,625
  • 7
  • 29
  • 45
Praveesh P
  • 1,133
  • 1
  • 20
  • 34
0

I hadn't renewed my hosting and the database was read-only. Joomla needed to write the session and couldn't do it.

Alberto M
  • 1,115
  • 1
  • 12
  • 33
0

I had the same issue when running PHP on the command line. Recently, I had changed the php.ini file and made a mistake while editing it.

This is for PHP 7.0.

Path to the php.ini where I made the mistake: /etc/php/7.0/cli/php.ini

I had set memory_limit = 256 (which means 256 bytes)
instead of memory_limit = 256M (which means 256 megabytes).

; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M

Once I corrected it, my process started running fine.

theBuzzyCoder
  • 2,016
  • 1
  • 23
  • 25
0

Run this command in your Magento root directory: php -d memory_limit=4G bin/magento

0

I want to share my experience on this issue!

Suppose you have a class A and class B.

class A {
    protected $userB;

    public function __construct() {
        $this->userB = new B();
    }
}

class B {
    protected $userA;

    public function __construct() {
        $this->userA = new A();
    }
}

This initiates an endless chain of object creation (A's constructor creates a B, whose constructor creates an A, and so on), which can cause exactly this kind of memory exhaustion!
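One sketch of a fix (not the only option; passing the collaborator in through the constructor works too) is to create it lazily, so that no constructor chain can recurse:

```php
<?php
// Sketch: break the constructor cycle with lazy initialisation.
// Class names mirror the example above; this is one possible fix.
class B {
    // B no longer constructs an A in its constructor.
}

class A {
    private $userB = null;

    public function getUserB(): B {
        if ($this->userB === null) {  // created on first use only
            $this->userB = new B();
        }
        return $this->userB;
    }
}

$a = new A();                    // constructing A is now cheap and safe
echo get_class($a->getUserB());  // prints "B"
```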

devbikash07
  • 51
  • 1
  • 2
-1

If you are using Laravel, then use this:

public function getClientsListApi(Request $request) {
    print_r($request->all()); // for all request data
    print_r($request->name);  // for the name only
}

instead of:

public function getClientsListApi(Request $request) {
    print_r($request); // this shows the error mentioned above
}

Ajay
  • 319
  • 3
  • 6