
I have an application that takes a number of different images, makes new images from them, and then saves them for use in building a video. The images are all PNGs and the videos are several minutes long, so the program requires a lot of memory (one image for every 33.33 ms of video play time). When I process a single video, it all works fine. I can even process several videos and it all works fine. But eventually I get an OutOfMemoryError if I try to process 1 + n videos.

What is confusing me is how that error can happen. Here is the part of the program where it occurs:

        ComposeVideoController cvc = new ComposeVideoController();          
        boolean made = cvc.setXmlUrl(sourcePath, saveDir, fileId);
        cvc = null;

To be more precise, the error happens in one of the frame-construction classes referenced by ComposeVideoController. ComposeVideoController is scoped to a single void method that runs recursively (if there are more videos to be made). I have gone through all the objects referenced by ComposeVideoController, which is the entry point to the library that builds the videos, and made sure they are all set to null too.

How can I get an OutOfMemoryError in ComposeVideoController when no individual video causes an OutOfMemoryError and ComposeVideoController is out of scope (and set to null) after any given video is made?

The full recursion is shown below. I have one method that checks to see if there are new messages in the queue (messages are sent over a Socket) and, if there are, calls the method that processes the video. If not, the recursion ends:

private void processQueue() {
    if(makingVideo) 
        return;
    MakeVideoObject mvo = queue.remove(0);      
    makingVideo = true;

    String[] convertArr = mvo.getConvertArrayCommand();
    String sourcePath = convertArr[1];
    String fileId = convertArr[2] + ".mp4";
    String saveDir = convertArr[3] + System.getProperty("file.separator");
    try {
        ComposeVideoController cvc = new ComposeVideoController();          
        boolean made = cvc.setXmlUrl(sourcePath, saveDir, fileId);
        cvc = null;
        if(made) {              
            cleanDir(mvo);
        }
    }
    catch(Exception e) {
        e.printStackTrace();
    }
}

/**
 * Moves all the assets off to a storage directory where we can
 * recover the video assets if something goes wrong during
 * video creation.
 * 
 * @param mvo
 */
private void cleanDir(MakeVideoObject mvo) {
    String[] convertArr = mvo.getConvertArrayCommand();
    String sourceDir = convertArr[1];
    String saveDir = convertArr[3] + System.getProperty("file.separator");
    String fileId = convertArr[2];
    sourceDir = sourceDir.substring(0, sourceDir.lastIndexOf(System.getProperty("file.separator")));
    try {
        File f = new File(sourceDir);
        File[] files = f.listFiles();
        for(File file : files) {
            if(file.getName().indexOf(fileId) != -1) {
                file.renameTo(new File(saveDir + file.getName()));
            }
        }
        makingVideo = false;
        mvo = null;
        if(queue.size() > 0) {
            processQueue();
        }           
    }
    catch(Exception e) {
        e.printStackTrace();
    }
}

[Edited to show more of the program]

Howard Roark
  • Try using the `-Xmx` flag. It seems that one of your videos is bigger than the default max heap size. – mostruash Sep 06 '14 at 14:27
  • Cannot do that. I'm at max memory on Win 32 machines. Any more than 1280 and the JVM cannot start. If this were all running on 64 bit machines, presumably, I could add more memory. My machine has 32 GB of RAM. But my customers don't enjoy the same. Windows 32 bit systems limit the JVM to 1280 in every test I've run on them (1281 and the JVM won't start). – Howard Roark Sep 06 '14 at 15:13
  • your system setup is ... somehow broken. It is certainly possible to assign (much) more than 1281M on windows 32 bit - machines. In fact, up to roundabout 3.2 GB on Win32-machines. – specializt Sep 06 '14 at 15:22
  • I've run it on many Win 32 machines with -Xmx above 1280 and get the error, "Could not start the JVM". I don't have control over my client's machines. But, I do have control over the start flags so I have to go with what their machines will accept. The jar is started as a CLI. The machines all have between 4 and 6 GB of RAM. – Howard Roark Sep 06 '14 at 15:30
  • 1
    @HowardRoark If you have no control over your clients and if one video (`frame count x average size of your png images`) is bigger than the heap, you cannot keep the whole video in memory. You have to stream the video into a file while constructing it. – mostruash Sep 06 '14 at 15:34
  • As I stated, one video is not larger than what I have available in memory or else I could not process the video at all. I can process the video. It's when I process several of them (one at a time) that I get a problem. – Howard Roark Sep 06 '14 at 15:48

3 Answers


That's pretty much what happens if you execute nontrivial code recursively (either this or a classic stack overflow, whichever occurs first). Recursion is VERY resource-intensive; one should avoid it at all costs. Simply exchanging your recursion for an iterative algorithm will most likely make your error go away.
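
A minimal sketch of what that iterative version might look like, reusing the names from the question's posted code (queue, MakeVideoObject, ComposeVideoController, cleanDir) and assuming cleanDir no longer calls processQueue(); the loop itself is just an illustration, not the asker's actual code:

private void processQueue() {
    // drain the queue with a loop instead of recursing from cleanDir()
    while (!queue.isEmpty()) {
        MakeVideoObject mvo = queue.remove(0);
        String[] convertArr = mvo.getConvertArrayCommand();
        String sourcePath = convertArr[1];
        String fileId = convertArr[2] + ".mp4";
        String saveDir = convertArr[3] + System.getProperty("file.separator");
        try {
            ComposeVideoController cvc = new ComposeVideoController();
            boolean made = cvc.setXmlUrl(sourcePath, saveDir, fileId);
            if (made) {
                cleanDir(mvo);
            }
        }
        catch (Exception e) {
            e.printStackTrace();
        }
        // cvc and mvo become unreachable at the end of each iteration, and no
        // stack frames pile up while earlier videos are still being processed
    }
}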

specializt
  • I will try to change my design. I'm still baffled at why I'd have this problem though. Recursion or no, the GC documentation says it will run and reclaim all memory of any objects set to null before throwing an OutOfMemoryError. So why would recursion make the difference? Thanks for the feedback. I'll give your suggestion a go. – Howard Roark Sep 06 '14 at 15:15
  • you can't actually predict when GC will happen - it MAY happen on time but it also may happen long after you would've expected it to happen. Never make assumptions about the garbage collection - if you run into memory problems you will have to re-think your strategy altogether. The garbage collector may wait for system resources to be available, for instance - or it may be busy with other tasks, or it simply has an unexpected collection schedule and so forth ... – specializt Sep 06 '14 at 15:19
  • "Recursion..., one should avoid it at all cost." That's an extreme position to defend. There's lots of production code out in the world that successfully uses recursion. If the depth of recursion is bounded, and if you _know_ the bound, and if the stack is configured large enough, then there won't be a problem. If you do it _without_ knowing how much stack the recursive call actually will use, then yes, that's dangerous. – Solomon Slow Sep 09 '14 at 14:25
  • I spend a lot of time working with synchronous languages like Java. I also spend a lot of time working with asynchronous languages like ActionScript. The only way I know to produce working, non-trivial, software in asynchronous languages is to use recursion and events. If you try using iteration in those languages, your software will fail. Still, the point is taken and you are right in that in synchronous languages, recursions should probably be avoided. In my case, the recursion was not causing the problems, but the static class and members were. – Howard Roark Sep 10 '14 at 05:58
  • yet again: that doesn't make much sense. Neither is there a real "asynchronous language" nor a real synchronous one - and if you actually believe that ANY algorithm can ONLY be implemented recursively ... well ... you certainly lack a lot of knowledge. But that's no problem; we all start somewhere, but PLEASE don't state your experiences like they're hard facts. Just stop it, people - this is getting ridiculous. – specializt Sep 10 '14 at 08:24
  • @specializt, Sorry, perhaps "configure" was not the right word. http://docs.oracle.com/javase/7/docs/api/java/lang/Thread.html#Thread%28java.lang.ThreadGroup,%20java.lang.Runnable,%20java.lang.String,%20long%29 – Solomon Slow Sep 11 '14 at 19:53

I'm posting an answer since I've finally figured it out, on the off chance it helps anyone else trying to diagnose a memory leak in the future.

During my profiling, I could see that there were objects held in memory, but it made no sense to me as they had been set to null. After going through one of the objects again, I noticed that I had declared it static. Because it was static, it also had static members, one of which was a ConcurrentHashMap... So, those Maps were having things added to them and since the object was static, the object and its members would never be dereferenced. Another lesson for me as to why I almost never declare objects static.
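
For anyone profiling a similar leak, the shape of the problem was roughly the sketch below. The class and field names are invented for illustration (the real frame-construction class isn't shown here); the point is only the static-versus-instance distinction:

import java.awt.image.BufferedImage;
import java.util.concurrent.ConcurrentHashMap;

// Leaky shape: a static map is reachable from the class itself, so everything
// it holds stays live even after every ComposeVideoController is set to null.
class LeakyFrameBuilder {
    static final ConcurrentHashMap<String, BufferedImage> frameCache =
            new ConcurrentHashMap<>();
}

// Fixed shape: as an instance field, the map becomes unreachable (and
// collectable) along with the builder once the controller that owns it is
// dereferenced. Clearing the map after each video would also work.
class FixedFrameBuilder {
    final ConcurrentHashMap<String, BufferedImage> frameCache =
            new ConcurrentHashMap<>();
}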

Howard Roark
  • "Another lesson for me as to why I almost never declare objects static" That's a wrong conclusion. – Nuri Tasdemir Sep 07 '14 at 16:21
  • Perhaps. I'm not saying I never mark a class static. I use them for objects that don't change state. But I have encountered other troubles when there are static classes around (in testing for example) and so I typically use a design pattern and don't mark objects static. Software should be open to extension and closed to modification... I've also inherited projects where everything was marked static which made the software difficult to extend. So I avoid them but certainly understand they are useful sometimes. – Howard Roark Sep 08 '14 at 02:31

Use `-Xmx` to increase your memory space. But also, I'd advise explicitly calling System.gc() right about where you're getting the OutOfMemoryError... If you are nulling out your other objects, this will help a lot
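
For completeness, a sketch of what that could look like; this helper is not from the original code, and System.gc() is only a hint the JVM may ignore or defer:

// Hypothetical helper that could be called after each video is finished:
// request a GC and log heap usage to see whether memory is actually reclaimed.
private static void hintGcAndLogHeap() {
    System.gc();
    Runtime rt = Runtime.getRuntime();
    System.out.println("heap max=" + rt.maxMemory()
            + ", total=" + rt.totalMemory()
            + ", free=" + rt.freeMemory());
}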

ControlAltDel
  • that doesn't solve anything: http://stackoverflow.com/questions/66540/when-does-system-gc-do-anything – specializt Sep 06 '14 at 15:01
  • I cannot use Xmx to increase memory space. This has to run on all sorts of systems, including Win 32 bit platforms and I'm already at 1280M for my Xmx and Xms flags. Any more and the JVM won't be able to start on 32 bit machines. – Howard Roark Sep 06 '14 at 15:05
  • it wouldn't have helped much anyway. At best, invoking the garbage collector postpones your problem. Oh and NEVER make Xms and Xmx the same value, Xms has to be _MUCH_ smaller than Xmx otherwise you will defeat any clever memory-management of your VM – specializt Sep 06 '14 at 15:07
  • I set the values the same so that GC does not have to run so often and knowing that I have to have at least half a gig at all times anyway. This answer also explains why: http://stackoverflow.com/questions/16087153/what-happens-when-we-set-xmx-and-xms-equal-size – Howard Roark Sep 06 '14 at 15:27