
I have to write a function that takes a List&lt;PDXObjectImage&gt; as input, creates a small icon for each of its elements, and stores them in a JTable.

I found a way to create icons from a PDXObjectImage without loading the whole image, so that my program does not throw OutOfMemoryError: Java heap space:

for(int k = 0; k < list.size(); k++)
{
    ByteArrayOutputStream output = new ByteArrayOutputStream();
    list.get(k).write2OutputStream(output);
    ByteArrayInputStream bais = new ByteArrayInputStream(output.toByteArray());
    ImageInputStream iis = ImageIO.createImageInputStream(bais);
    Iterator<ImageReader> iter = ImageIO.getImageReaders(iis);
    if (iter.hasNext()) {
        ImageReader reader = iter.next();
        reader.setInput(iis, true, true);
        ImageReadParam params = reader.getDefaultReadParam();
        params.setSourceSubsampling(2, 2, 0, 0); // takes ints, not doubles
        BufferedImage img = reader.read(0, params);
        reader.dispose(); // release the reader's internal buffers
        ImageIcon imageIcon = new ImageIcon(img);
        model.addRow(new Object[]{imageIcon});
    }
    iis.close();
}

Using readers instead of loading each image as a full BufferedImage, I managed to avoid the OutOfMemoryError: Java heap space for a small number of pictures. Unfortunately, I still get the error when more than 84 elements are stored in the list.
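A self-contained illustration of this reader/subsampling technique, using a synthetic in-memory PNG in place of a PDXObjectImage (the 200x100 source size is arbitrary):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class SubsampleDemo {
    public static void main(String[] args) throws Exception {
        // Build a small PNG in memory to stand in for the PDF image.
        BufferedImage src = new BufferedImage(200, 100, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(src, "png", baos);

        ImageInputStream iis = ImageIO.createImageInputStream(
                new ByteArrayInputStream(baos.toByteArray()));
        Iterator<ImageReader> iter = ImageIO.getImageReaders(iis);
        ImageReader reader = iter.next();
        reader.setInput(iis, true, true);
        ImageReadParam params = reader.getDefaultReadParam();
        params.setSourceSubsampling(2, 2, 0, 0); // keep every 2nd pixel per axis
        BufferedImage img = reader.read(0, params);
        reader.dispose(); // frees the reader's internal buffers
        iis.close();
        System.out.println(img.getWidth() + "x" + img.getHeight()); // 100x50
    }
}
```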

I used jvisualvm to see which objects took up all the heap space and found that it was byte[] objects (around 85%).

The problem is clearly located where I create all the streams to get the ImageIcon. The thing is, I don't know any way of getting an ImageInputStream without creating new streams each time.

I tried to avoid the problem by generating all the streams in a single function:

private ImageInputStream fct(PDXObjectImage img) throws IOException{
    ByteArrayOutputStream output = new ByteArrayOutputStream();
    img.write2OutputStream(output);
    ByteArrayInputStream bais = new ByteArrayInputStream(output.toByteArray());
    return ImageIO.createImageInputStream(bais);
}

thinking that Java would automatically reclaim the objects once they go out of scope.

I tried adding the following, in every possible order, at the end of each loop iteration:

output.reset();
output.flush();
bais.reset();
bais.close();
iis.flush();
output=null;
bais=null;
iis=null;
System.gc();

I also tried to instantiate the streams outside the loop, but there is no way to point a ByteArrayInputStream at a byte[] without using the new keyword, thus creating a new object.
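For reference, a single ByteArrayOutputStream can be reused across iterations via reset(), which keeps its backing array, and wrapping an existing byte[] in a new ByteArrayInputStream is itself cheap, since it wraps the array without copying it (a sketch with dummy data, not the actual PDF images):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;

public class ReuseBuffer {
    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        for (int k = 0; k < 3; k++) {
            output.reset(); // keeps the backing byte[], just rewinds the count
            output.write(new byte[10_000]); // stand-in for write2OutputStream
            // The ByteArrayInputStream object itself is a thin wrapper;
            // note, however, that toByteArray() still copies the bytes,
            // so this alone does not remove the per-iteration allocation.
            ByteArrayInputStream bais = new ByteArrayInputStream(
                    output.toByteArray());
            System.out.println(bais.available());
        }
    }
}
```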

I still get the same error, nothing works.

I read some posts about Statements and ResultSets, but I did not find them relevant (maybe I am wrong).

If someone has any idea of how I could avoid this error, I would be very grateful.

Thank you

EDIT:

I have modified my code so that it now looks like this:

for(int k = 0; k < list.size(); k++)
{
    list.get(k).write2OutputStream(cbb.getOutputStream());
    ImageInputStream iis = ImageIO.createImageInputStream(cbb.getInputStream());
    Iterator<ImageReader> iter = ImageIO.getImageReaders(iis);
    if (iter.hasNext()) {
        ImageReader reader = iter.next();
        reader.setInput(iis, true, true);
        BufferedImage img = reader.read(0, null);
        reader.dispose();
        ImageIcon imageIcon = new ImageIcon(img);
        model.addRow(new Object[]{imageIcon});
    }
}

I have also added a listener to the reader so that it prints out the percentage of the reading done. It always goes up to 84.2% and stops.

Does anyone know how this is possible?

Mtrompe
  • Instead of printing the meaningless percentage, look at the *file* that is processed when it crashes. It's probably a very (too) large image. – Durandal Jun 07 '13 at 13:19
  • The file is indeed quite large, and my program should be able to read it. I instantiate the buffer using the following line: `CircularByteBuffer cbb = new CircularByteBuffer(CircularByteBuffer.INFINITE_SIZE);`, which should be able to hold large data. – Mtrompe Jun 08 '13 at 19:16

3 Answers


Use PipedInputStream/PipedOutputStream or CircularByteBuffer to channel the bytes written directly into the input stream. That way you will not create intermediate byte arrays and waste memory.

Take a look at this post:

http://ostermiller.org/convert_java_outputstream_inputstream.html

Rewriting the fct method using CircularByteBuffer:

private ImageInputStream fct(PDXObjectImage img) throws IOException{
    CircularByteBuffer cbb = new CircularByteBuffer(CircularByteBuffer.INFINITE_SIZE);
    img.write2OutputStream(cbb.getOutputStream());
    return ImageIO.createImageInputStream(cbb.getInputStream());
}

You can also use a multi-threaded approach where you are writing bytes in one thread and reading on another. Thus, writing/reading can occur concurrently optimizing the CPU usage and memory utilization.
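A minimal standalone sketch of that piped, two-thread approach, using only the standard library, with a dummy byte payload standing in for the image data and an arbitrary 8 KB pipe buffer:

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class PipedDemo {
    public static void main(String[] args) throws Exception {
        byte[] payload = new byte[1 << 16]; // stand-in for image bytes
        PipedInputStream in = new PipedInputStream(8192);
        PipedOutputStream out = new PipedOutputStream(in);

        // Writer thread: plays the role of write2OutputStream(...).
        // It blocks whenever the 8 KB pipe buffer is full, so the full
        // payload is never held in an intermediate byte[] at once.
        Thread writer = new Thread(() -> {
            try (PipedOutputStream o = out) {
                o.write(payload);
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        });
        writer.start();

        // Reader side: consumes bytes as they arrive.
        int total = 0, n;
        byte[] buf = new byte[4096];
        while ((n = in.read(buf)) != -1) total += n;
        writer.join();
        System.out.println(total); // 65536
    }
}
```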

Note: com.Ostermiller.util.CircularByteBuffer is not part of the standard Java API, but its source is freely available.

nadirsaghar
  • Thank you, I think that should do it; I will test this as soon as I can. I find it strange, though, that Java does not free memory at the end of each loop iteration. Is there any way to actually force Java to do so? Because even using the garbage collector, the memory is still used. – Mtrompe Jun 06 '13 at 18:35
  • [You cannot force garbage collector](http://stackoverflow.com/questions/66540/system-gc-in-java) to work even using System.gc(). You can try to [fine tune the JVM](http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html) but your best bet is to optimize the code as far as possible. – nadirsaghar Jun 06 '13 at 19:05
  • Ok, so that would mean that even if I manage to get my code working, I would still have a heap problem when dealing with a very large list, because I would still end up with a `CircularByteBuffer` and an `ImageIcon` at each loop iteration. What you're saying is that there is no way for me to avoid this from happening? – Mtrompe Jun 06 '13 at 19:38
  • You will have a CircularByteBuffer in each loop, which is better than creating a byte[]. The memory footprint of CircularByteBuffer is much smaller because, remember, it flushes the bytes as soon as they are read. However, when you load the entire byte[], a huge chunk of memory is allocated, which only goes out of reference once you are out of the loop. Bottom line: keep the bytes read into memory to a minimum. – nadirsaghar Jun 06 '13 at 22:22
  • I have implemented your function with the `CircularByteBuffer`, but now my program just stops when it encounters the `BufferedImage img = reader.read(0, params);` line. – Mtrompe Jun 07 '13 at 09:06

1) There's nothing in your code that suggests the loaded image is reduced to a "small icon" in any way, except around the Iterator iter = ImageIO.getImageReaders(iis); line. Can you confirm the image is indeed reduced? Otherwise, it may simply be a case of not enough RAM allocated to the JVM heap;

2) Even if the image is compressed to a smaller byte volume, you may still have insufficient RAM allocated to the JVM. Or your compressed image may keep an internal reference to the uncompressed one (I'm not familiar with the API used, so I can't tell for sure), resulting in the larger image being retained in the JVM heap. Using a memory profiler, take a look at how much memory is occupied on each iteration of the loop and whether it decreases over time due to GC. Then divide the total heap by this figure, and you will have an idea of how many icons you can load without getting OutOfMemory.

3) Statements and ResultSets have nothing to do with Java image processing; they are for working with relational databases.

Victor Sorokin

You are only halving the image in both dimensions (with the subsampling parameters). Creating an icon should scale the image so that the image's larger dimension just matches the display size of the icon.

You first need to determine the image dimensions, then calculate the proper target size, then read the image with subsampling.

If your icon size is (for example) 100 by 100 pixels, the icon should be exactly 100 pixels in its larger dimension.
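A hypothetical helper for that calculation might look like this (the method name and the 100-pixel target are illustrative; the width and height would come from reader.getWidth(0) and reader.getHeight(0) before the full read):

```java
public class IconScale {
    /** Largest integer subsampling factor that still keeps the
     *  image's bigger dimension at or above the target icon size. */
    static int subsamplingFor(int width, int height, int target) {
        int larger = Math.max(width, height);
        return Math.max(1, larger / target);
    }

    public static void main(String[] args) {
        System.out.println(subsamplingFor(3000, 2000, 100)); // 30
        System.out.println(subsamplingFor(150, 80, 100));    // 1
    }
}
```

The resulting factor would then be passed as params.setSourceSubsampling(f, f, 0, 0) before reading.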

Durandal