
Here is my sample snippet for reading from an input stream and writing to an output stream, and I am getting an out of memory exception.

public static void readFileContent(InputStream in, OutputStream out) throws IOException {
    byte[] buf = new byte[500000];
    int nread;
    int navailable;
    int total = 0;
    synchronized (in) {
        try {
            while((nread = in.read(buf, 0, buf.length)) >= 0) {
                out.write(buf, 0, nread);
                total += nread;
            }
        }
        finally {
            if (in != null) {
                try {
                    in.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }

    out.flush();
    buf = null;
}
  1. What are the possible scenarios with the above snippet to get an "out of memory exception"?
  2. Is it necessary to close the output stream here? Is flushing the stream enough, or do we need to close the stream always? If so, why?
  3. How could I avoid an out of memory exception in general?

Please clarify.


4 Answers

  1. What are the possible scenarios with the above snippet to get an "out of memory exception"?

There are various root causes for out of memory exceptions. Refer to the Oracle documentation page for more details.

java.lang.OutOfMemoryError: Java heap space:

Cause: The detail message Java heap space indicates that an object could not be allocated in the Java heap.

java.lang.OutOfMemoryError: GC Overhead limit exceeded:

Cause: The detail message "GC overhead limit exceeded" indicates that the garbage collector is running all the time and Java program is making very slow progress

java.lang.OutOfMemoryError: Requested array size exceeds VM limit:

Cause: The detail message "Requested array size exceeds VM limit" indicates that the application (or APIs used by that application) attempted to allocate an array that is larger than the heap size.

java.lang.OutOfMemoryError: Metaspace:

Cause: Java class metadata (the virtual machine's internal representation of a Java class) is allocated in native memory (referred to here as metaspace).

java.lang.OutOfMemoryError: request size bytes for reason. Out of swap space?:

Cause: The detail message "request size bytes for reason. Out of swap space?" appears to be an OutOfMemoryError exception. However, the Java HotSpot VM code reports this apparent exception when an allocation from the native heap failed and the native heap might be close to exhaustion

  2. Is it necessary to close the output stream here? Is flushing the stream enough, or do we need to close the stream always? If so, why?

Since the method receives raw InputStream and OutputStream references, we don't know which concrete stream types are being passed in, so explicitly closing these streams is a good idea.
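
For example, a sketch of one way to do that, assuming this method should own both streams (the method name and buffer size are illustrative):

    public static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        try {
            int nread;
            while ((nread = in.read(buf)) >= 0) {
                out.write(buf, 0, nread);
            }
            out.flush();
        } finally {
            // close both streams even if reading or writing fails
            try {
                in.close();
            } catch (IOException ignored) {
                // closing the input failed; still attempt to close the output
            }
            out.close();
        }
    }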

  3. How could I avoid an out of memory exception in general?

This is already answered in the response to your first question.

Refer to this SE question on handling large files for IO operations:

Java OutOfMemoryError in reading a large text file

Ravindra babu

I think it's obvious that the problem is that you allocate 500000 bytes at once, and that much heap may not be available at runtime.

Explanation: I would not suggest it, but you could increase the heap size of your program. The default heap size for a Java program is determined at runtime, but it can also be parameterized.
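
If you want to see what your JVM is actually working with, here is a small sketch using the standard Runtime API (the -Xmx/-Xms values mentioned in the comments are examples only):

    public class HeapInfo {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            // maximum heap the JVM will try to use; set with -Xmx (e.g. java -Xmx512m HeapInfo)
            System.out.println("max heap:   " + rt.maxMemory() + " bytes");
            // heap currently reserved by the JVM; the initial size is set with -Xms
            System.out.println("total heap: " + rt.totalMemory() + " bytes");
            // unused portion of the currently reserved heap
            System.out.println("free heap:  " + rt.freeMemory() + " bytes");
        }
    }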

Recommendation: As far as I can see from the provided snippet, it's not absolutely necessary to read 500000 bytes at once. So you can initialize your byte array with a smaller number, which would result in more read iterations, if that is not a problem for your program.

Conclusion: Try setting the initial byte array size to 5000, or even 1000.

EDIT:

An extra point to take into consideration is that in the above code snippet you only flush once, at the end. Depending on the OutputStream implementation, the bytes you write may be kept in memory until they are flushed, and their accumulated size may also cause an OutOfMemoryError.

In order to overcome this, you should flush more often. Flushing too often will affect your performance, but you can always experiment with a condition in your loop, e.g.

...
if (total % 5000 == 0) {
    out.flush();
}
...

EDIT 2:

As the InputStream and OutputStream objects are passed to the given method as parameters, in my opinion this method is not responsible for closing them. The method that initializes the streams is also responsible for closing them gracefully. Flushing is enough for this method. But consider doing it in smaller chunks.

EDIT 3:

To summarize the suggested tweaks:

public static void readFileContent(InputStream in, OutputStream out) throws IOException {
    byte[] buf = new byte[1000];
    // wrap your OutputStream in a BufferedOutputStream
    BufferedOutputStream bos = new BufferedOutputStream(out, 5000);
    int nread;
    int total = 0;
    synchronized (in) {
        try {
            while((nread = in.read(buf, 0, buf.length)) >= 0) {
                // use the BufferedOutputStream to write data
                // you don't need to flush regularly as it is handled automatically every time the buffer is full
                bos.write(buf, 0, nread);
                total += nread;
            }
        }
        finally {
            if (in != null) {
                try {
                    in.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }

    // flush the last contents of the BufferedOutputStream
    bos.flush();
    buf = null;
}

Please note also that BufferedOutputStream will automatically call flush() when you close it gracefully.
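
For instance, a tiny sketch of that behaviour (`out` and `data` stand in for your stream and payload; note that closing the BufferedOutputStream also closes the wrapped stream, so this belongs in the code that owns it):

    try (BufferedOutputStream bos = new BufferedOutputStream(out, 5000)) {
        bos.write(data, 0, data.length);
        // no explicit flush() needed: close() flushes any remaining buffered bytes
    }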

EDIT 4:

Example calling the above method:

public static void main(String[] args) {
    String filename = "test.txt";
    String newFilename = "newtest.txt";

    File file = new File(filename);
    File newFile = new File(newFilename);

    try (InputStream fis = new FileInputStream(file);
            OutputStream fout = new FileOutputStream(newFile)) {
        readFileContent(fis, fout);
    }
    catch(IOException ioe) {
        System.out.println(ioe.getMessage());
    }
}
sanastasiadis
  • Is there any way that I could find my default heap size? – Tom Taylor May 11 '16 at 13:22
  • You can use `Runtime.totalMemory()` and [these commands](http://stackoverflow.com/a/13871564/3652270). – sanastasiadis May 11 '16 at 13:45
  • Though the above snippet invokes `flush` only once, the output stream buffer is of a fixed length; how could this cause an out of memory exception? – Tom Taylor May 13 '16 at 04:25
  • You are writing to an OutputStream that keeps everything in memory, so it keeps growing and may cause memory issues until you flush. – sanastasiadis May 13 '16 at 05:05
  • @sanastasiadis Please make the following clear to me. Consider, for example, RAM size = 2 GB and file size = 3 GB. I am writing this file into another file with the output stream with buffer size 1024 (1 KB), and I am flushing only when I close the stream. Are you sure that in this case I could not write the file (since the file is larger than the RAM size), or would this cause an exception? – Tom Taylor May 13 '16 at 05:23
  • Yes, I guess that in the case you describe you will cause a memory-related exception. When using a raw `OutputStream` you should take care of the flushing. But I would instead suggest wrapping your `OutputStream` in a `BufferedOutputStream` that uses an internal buffer; when this buffer is full, flush is called automatically and the buffer is emptied to be filled again. – sanastasiadis May 13 '16 at 06:40
  • @sanastasiadis Thank you, you're very helpful. Let me try with `BufferedOutputStream` and update you. – Tom Taylor May 13 '16 at 08:19
  • BufferedOutputStream didn't work :( I am still facing the out of memory exception with the above snippet. – Tom Taylor May 17 '16 at 03:14
  • Is this code used to copy a file on the disk to another location? How many parallel threads/clients are you testing with? – sanastasiadis May 17 '16 at 04:26
  • I am getting the exception when I test it even with a single client and single file (whose file size is greater than the buffer size). – Tom Taylor May 17 '16 at 05:07
  • An out of memory exception can be thrown from a different place than the place of the memory leak. Is your program doing anything else that may use a lot of memory? – sanastasiadis May 17 '16 at 06:07
  • My program just copies the content from one location to another; that's where I'm facing this kind of exception. – Tom Taylor May 17 '16 at 06:12
  • I guess you are not calling the method `readFileContent` from within a loop. I added an example of how I tested the method, and it worked with an example file of 4.5 GB. – sanastasiadis May 17 '16 at 07:51
  1. Change buf to new byte[1*1024].
  2. Read using just buf; there is no need to specify offset and length, e.g. nread = in.read(buf).

The rest of the code looks good (see the sketch below for both tweaks applied). There is no need to increase the memory. Also, is there any point in synchronizing on the InputStream?
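
Putting both suggestions together, the loop might look like this (a sketch; the synchronized block and the unused variables are dropped):

    public static void readFileContent(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[1 * 1024];
        int nread;
        // read(buf) fills at most buf.length bytes and returns -1 at end of stream
        while ((nread = in.read(buf)) != -1) {
            out.write(buf, 0, nread);
        }
        out.flush();
    }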

Minh Kieu
  • 1. Is there any reason to read bytes in multiples of 1024? 2. I was about to ask: can you please explain the correct usage of in.read(buf) versus in.read(buf, offset, length)? – Tom Taylor May 11 '16 at 13:52
  • Instead of reading everything into memory, you read 1K at a time, process it, and then read the next 1K. InputStream.read(buf) behaves the same: you give it a buffer, and that is the most it can read at once. You can increase the buffer to 4K (4*1024) etc., which may give you better performance but will use more memory. – Minh Kieu May 11 '16 at 19:02
  • Synchronizing on the InputStream as in the example above will do nothing, as no other code is trying to lock on the InputStream. – Minh Kieu May 11 '16 at 19:03
  • Thanks a lot Minh Kieu – Tom Taylor May 12 '16 at 04:19

In Java there isn't any brute-force way of freeing memory. Even calling the built-in garbage collector (System.gc()) might not solve the problem, as the GC only frees objects that are no longer referenced. You need to take care with the code you write so that it employs its resources in the best way it can. Certainly, there are cases where you are left with no options, especially when you are using big or giant data structures, regardless of any code optimization you can think of (in your case, you are creating an array of half a million bytes).

As a partial solution, you can increase your heap size (e.g. with the -Xmx JVM option) so that Java can allocate more memory.
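
A small sketch of the point about references (System.gc() is only a hint to the JVM, and the printed numbers will vary):

    Runtime rt = Runtime.getRuntime();
    byte[] big = new byte[100 * 1024 * 1024]; // about 100 MB, still referenced
    System.gc(); // a hint only; 'big' cannot be reclaimed while it is referenced
    System.out.println("used: " + (rt.totalMemory() - rt.freeMemory()));
    big = null;  // drop the reference; the array becomes eligible for collection
    System.gc();
    System.out.println("used: " + (rt.totalMemory() - rt.freeMemory()));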

Lefteris008
  • How is this answer related to the questions asked? It neither explains which scenarios could cause the code to throw an OutOfMemoryError, nor how to avoid them in general. – jarnbjo May 11 '16 at 13:00
  • @jarnbjo It actually provides a solution: **increase the Java Heap size**. The only way of avoiding falling into an `OutOfMemoryException` is to optimize your code; I can't possibly teach someone how to code right. Regarding the code that the poster provided, he is looping over an array of half a million bytes which is then output to a file. If his PC is old and packs a small amount of RAM, then it's perfectly normal to throw this exception. – Lefteris008 May 11 '16 at 13:03
  • @Lefteris008 FYI: it is a web application server with a huge amount of RAM where I am facing this kind of exception. – Tom Taylor May 12 '16 at 04:20