
Sometimes when using IPython you might hit an exception in a function which has opened a file in write mode. This means that the next time you run the function you get a ValueError:

ValueError: The file 'filename' is already opened. Please close it before reopening in write mode.

However, since the function bugged out, the file handle (which was created inside the function) is lost, so it can't be closed. The only way around it seems to be to close the IPython session, at which point you get the message:

Closing remaining open files: filename... done

Is there a way to instruct IPython to close the files without quitting the session?
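
For concreteness, a rough sketch of the scenario in an IPython session (the filename and the use of PyTables' `tables.open_file` are illustrative; the question does not name the library, but the ValueError text above is the one PyTables raises):

In [1]: import tables

In [2]: def process():
   ...:     h5 = tables.open_file("results.h5", mode="w")  # handle is local to this frame
   ...:     raise RuntimeError("something went wrong")      # h5.close() is never reached
   ...:

In [3]: process()   # raises RuntimeError; 'h5' is no longer reachable from the prompt

In [4]: process()   # raises the ValueError quoted above, because the file is still open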

tdc

  • You should try to always use the [`with`](http://docs.python.org/reference/compound_stmts.html#the-with-statement) statement when working with files, for example, use `with open("x.txt") as fh: `. This ensures that if something goes wrong the file is guaranteed to be closed. – Chris Jan 24 '12 at 16:32
  • @Chris you should post it as an answer because it is the only valid and simple solution – JBernardo Jan 24 '12 at 16:37
  • @JBernardo Thanks for the suggestion - done. – Chris Jan 24 '12 at 16:43
  • [Related question](http://stackoverflow.com/q/2023608/183066) – jcollado Jan 24 '12 at 16:43

2 Answers


You should try to always use the with statement when working with files. For example, use something like

with open("x.txt") as fh:
    ...  # do something with the file handle fh

This ensures that if something goes wrong during the execution of the with block, and an exception is raised, the file is guaranteed to be closed. See the with documentation for more information on this.

Edit: Following a discussion in the comments, it seems that the OP needs to have a number of files open at the same time and needs to use data from multiple files at once. Clearly having lots of nested with statements, one for each file opened, is not an option and goes against the ideal that "flat is better than nested".

One option would be to wrap the calculation in a try/finally block. For example

file_handles = []
try:
    for fname in file_list:
        file_handles.append(open(fname))

    # Do some calculations with the open files

finally:
    for fh in file_handles:
        fh.close()

The finally block contains code which should be run after any try, except or else block, even if an exception occurred. From the documentation:

If finally is present, it specifies a "cleanup" handler. The try clause is executed, including any except and else clauses. If an exception occurs in any of the clauses and is not handled, the exception is temporarily saved. The finally clause is executed. If there is a saved exception, it is re-raised at the end of the finally clause. If the finally clause raises another exception or executes a return or break statement, the saved exception is lost. The exception information is not available to the program during execution of the finally clause.
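
An alternative that avoids the explicit bookkeeping is contextlib.ExitStack (Python 3.3+), which enters each file's context manager and closes all of them when the block exits, even if an exception is raised. A minimal sketch of the same loop:

from contextlib import ExitStack

with ExitStack() as stack:
    file_handles = [stack.enter_context(open(fname)) for fname in file_list]

    # Do some calculations with the open files

# All files are closed here, whether or not an exception was raised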

Chris
  • Ok but the trouble I'm having with this approach is that I'm opening a few files in a loop, e.g. `for fname in fname_list: file_list.append(open(fname))` because later I have to do operations over the list. If I use the `with ...` approach then all the files are closed in the loop - i.e. I can't have multiple files open. – tdc Jan 25 '12 at 16:36
  • Do you need to have multiple files open simultaneously? If not, then just use `for fname in fname_list: with open(fname) as fh: `. Part of your original problem was that you left files open and then had no way to close them when something went wrong - it is safer just to open one at a time. – Chris Jan 25 '12 at 21:20
  • I need to have multiple files open because I have multiple sets of data in HDF5 and I'm using pytables to access them. To use pytables the files need to stay open – tdc Jan 26 '12 at 08:52
  • I think you are missing my point. Yes the files need to stay open while working them and yes you need to work more than one file, but do you need to work with more than one file at a time? Can't you open a file, extract the data you need (plot it, whatever), close the file and then open the next file? – Chris Jan 26 '12 at 09:24
  • No I'm not missing the point. pytables allows you to access disk files dynamically, so that you don't have to load everything into memory. I'm doing mathematical operations on parts of matrices stored in different files. These are very large matrices that can't be stored in memory. The only way around it that I can see is to store everything in one gigantic file (this could be ~1Tb), but I'm not sure what issues I'll run into if I do that – tdc Jan 26 '12 at 09:56
  • Ok, that was what I was after - you need to work on multiple files at the same time and can't do the work sequentially on each file. I have edited my answer to give a suggestion. I hope it helps. – Chris Jan 26 '12 at 10:48
  • Cool, thanks. I've implemented this approach and it works fine - hopefully useful to others! – tdc Jan 26 '12 at 13:45

A few ideas:

  • always use finally (or a with block) when working with files, so they are properly closed.
  • you can blindly close the non-standard file descriptors using os.close(n), where n is a number greater than 2 (this is Unix-specific, so you might want to peek at /proc/<ipython_pid>/fd/ to see which descriptors the process has opened so far).
  • you can inspect the locals of the captured stack frames to see if you can find the reference to the wayward file and close it... take a look at sys.last_traceback (a rough sketch of the last two ideas follows this list).
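
A rough sketch combining the last two ideas (the /proc listing is Linux-specific; the attribute checks are best-effort and only catch ordinary file objects):

import os
import sys

# Unix/Linux: see which descriptors this process currently has open,
# rather than blindly calling os.close(n) for every n > 2.
print(os.listdir("/proc/%d/fd" % os.getpid()))

# Walk the frames of the last uncaught exception and close any plain
# file objects that are still open in their locals.
tb = getattr(sys, "last_traceback", None)
while tb is not None:
    for obj in tb.tb_frame.f_locals.values():
        # Ordinary file objects expose .closed; other handle types
        # (e.g. PyTables files, which use .isopen) would need their own check.
        if hasattr(obj, "close") and not getattr(obj, "closed", True):
            obj.close()
    tb = tb.tb_next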
fortran