26

I have a very large Python script, 200K, and I would like it to use as little memory as possible. It looks something like:

# a lot of data structures
r = [34, 78, 43, 12, 99]

# a lot of functions that I use all the time
def func1(word):
    return len(word) + 2

# a lot of functions that I rarely use
def func2(word):
    return len(word) + 2


# my main loop
while True:
    # lots of code
    # calls functions

If I put the functions that I rarely use in a module and import it dynamically only when necessary, those functions can't access my data. That's as far as I've gotten.
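
Roughly, the attempt looks like this (the module and function names are just illustrative):

# rare_funcs.py -- the rarely used functions, moved out of the main script
def func2(word):
    # 'r' is a global of the main script, not of this module,
    # so this raises NameError when func2 is called:
    return len(word) + r[0]

# main script
r = [34, 78, 43, 12, 99]

while True:
    # lots of code ...
    import rare_funcs            # deferred import; the module loads on first use
    rare_funcs.func2("hello")    # fails: func2 can't see my data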

I'm new to Python.

Can anyone put me on the right path? How can I break this large script down so that it uses less memory? Is it worth putting rarely used code in modules and only calling them when needed?

cedbeu
Coo Jinx
  • Are you sure it uses *too much* memory? – eumiro Jun 12 '12 at 18:08
  • Remember that "Premature optimization is the root of all evil". – Amr Jun 12 '12 at 18:17
  • in terms of your functions issue, have you checked whether your functions are referring to global variables? If they are (and presumably the data isn't defined *in* that module) you could either: 1. add a parameter to each function to take in whatever global variable or 2. define all the functions within a class and pass the global variables to `__init__` and rewrite the functions to call the globals as `self.` – Jeff Tratner Jun 12 '12 at 19:47
  • If your script file is that large, then it sounds like either you're using extremely long variable names everywhere and have lots of comments in the code, or more likely you're doing something very wrong or at best inefficiently. Unfortunately it's doubtful that anyone will be able to give you much help based just on the vague description you've given of your code. Time to get specific (and accept some answers)! – martineau Jun 12 '12 at 20:13

5 Answers

41

Organizing:

Your Python script does indeed seem huge; you should probably consider reorganizing your code first, splitting it into several modules or packages. That will likely make both profiling and optimization easier.

Optimizing:

There are a lot of things that can be done to optimize your code...

For instance, regarding your data structures: if you make heavy use of lists or list comprehensions, try to figure out where you really need a list and where it could be replaced by an immutable structure like a tuple, or by a "lazy" container like a generator expression, which produces items on demand instead of holding them all in memory.
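
As a rough illustration of the difference (a minimal sketch; exact sizes vary across Python versions and platforms):

import sys

squares_list = [n * n for n in range(10**6)]   # builds and stores every item
squares_gen  = (n * n for n in range(10**6))   # produces items on demand

# The list object alone holds a million pointers (the int objects are extra);
# the generator is a tiny fixed-size object no matter how long the range is.
print(sys.getsizeof(squares_list))   # several megabytes
print(sys.getsizeof(squares_gen))    # on the order of 100 bytes

# If you only need to iterate once, both give the same result:
print(sum(squares_list) == sum(squares_gen))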


Also, study the way you do things and ask yourself whether there is a less greedy way to do it, a way that is better in Python (you will find some tips under the tag pythonic). That matters especially in Python because, unlike many languages that promote the idea that there should be many ways to do anything, Python often has one "obvious" way to do a thing that is better than the others (see The Zen of Python); this is what people call pythonic. And it is not only about the shape of your code, but also, and above all, about performance.

Now, you should also verify whether you are using the best algorithms for the job, because being pythonic won't fix your algorithms for you.

In the end, it all depends on your code, and it is difficult to say more without having seen it.

And, make sure to take into account the comments made by eumiro and Amr.

Krishna Chaurasia
cedbeu
  • Do you know of any good way to determine the amount of memory some snippet of Python code takes? It's easy to use `timeit` for speed comparisons, so I'm looking for something that will allow me to determine/characterize memory consumption. Just curious if there's something as simple. – Levon Jun 12 '12 at 19:33
  • [memory_profiler](http://pypi.python.org/pypi/memory_profiler) is pretty useful, easy to use for quick debugging (see the sketch after these comments). Now you can try [meliae](https://code.launchpad.net/meliae) ([step-by-step how-to](http://jam-bazaar.blogspot.ie/2010/08/step-by-step-meliae.html)), or [heapy](http://guppy-pe.sourceforge.net/#Heapy) for more complete solutions. Good discussion [here](http://stackoverflow.com/questions/110259/python-memory-profiler) and some interesting estimation methods [here](http://stackoverflow.com/questions/563840/how-can-i-check-the-memory-usage-of-objects-in-ipython) – cedbeu Jun 12 '12 at 23:18
  • I think you are more looking for something like the [memory_profiler](http://pypi.python.org/pypi/memory_profiler) module I mentioned, though. – cedbeu Jun 12 '12 at 23:21
  • Thanks for the information, I favored this question so that I can come back to it and follow up on the links you mentioned. Much appreciated. – Levon Jun 13 '12 at 20:45
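
The memory_profiler workflow mentioned above looks roughly like this (a minimal sketch; assumes `pip install memory_profiler`, and the function is just an illustration):

from memory_profiler import profile

@profile            # prints a line-by-line memory report when the function runs
def build_data():
    big = [n * n for n in range(10**6)]     # this allocation shows up in the report
    small = (n * n for n in range(10**6))   # generator: almost no memory
    return sum(small) + len(big)

if __name__ == "__main__":
    build_data()
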
5

This video might give you some good ideas: http://pyvideo.org/video/451/pycon-2011---quot-dude--where--39-s-my-ram--quot-

xsbrz
Bryce
3

The advice on generator expressions and making use of modules is good. Premature optimization causes problems, but you should always spend a few minutes thinking about your design before sitting down to write code, particularly if that code is meant to be reused.

Incidentally, you mention that you have a lot of data structures defined at the top of your script, which implies that they're all loaded into memory at the start. If this is a very large dataset, consider moving specific datasets to separate files and loading them in only as needed (using the csv module, numpy.loadtxt(), etc.).
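
A minimal sketch of that idea, assuming the data lives in a hypothetical data/lookup.csv with numeric values in the first column:

import csv

def iter_rows(path="data/lookup.csv"):
    # Stream rows one at a time instead of loading the whole file up front.
    with open(path, newline="") as f:
        for row in csv.reader(f):
            yield row

# Only the current row is in memory at any moment:
total = sum(int(row[0]) for row in iter_rows())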

Separate from using less memory, also look into ways to use memory more efficiently. For example, for large sets of numeric data, numpy arrays are a way of storing information that will provide better performance in your calculations. There is some slightly dated advice at http://wiki.python.org/moin/PythonSpeed/PerformanceTips
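
For example (a minimal sketch; assumes numpy is installed, and the exact numbers vary by platform):

import sys
import numpy as np

nums = list(range(10**6))
arr = np.arange(10**6, dtype=np.int64)

# sys.getsizeof(nums) counts only the list's pointer array (~8 MB);
# every element is also a separate Python int object on top of that.
print(sys.getsizeof(nums))

# The numpy array stores raw 8-byte integers contiguously:
print(arr.nbytes)   # 8000000 bytes of actual data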

abought
2

Moving functions around won't change your memory usage. As soon as you import the other module, it will define all of the functions in that module anyway, and functions don't take up much memory. If they are extremely repetitive, perhaps you can have less code by refactoring them?

@eumiro's question is right: are you sure your script uses too much memory? How much memory does it use, and why is it too much?

Ned Batchelder
1

If you're taking advantage of OOP and have some objects, say:

class foo:
    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum
    # some happy little methods

You can have the object take up less memory by putting in:

__slots__ = ("lorem", "ipsum")

right before the `__init__` method, as shown:

class foo:
    __slots__ = ("lorem", "ipsum")

    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum
    # some happy little methods

Of course, "premature optimization is the root of all evil". Profile memory usage before and after the addition to see if it actually does anything, and be aware that `__slots__` can (shockingly) break code that relies on adding new attributes at runtime, so this might end up not working for you.
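
One rough way to check, using the standard library's tracemalloc module (the class names here are illustrative, not from the question):

import tracemalloc

class PlainFoo:
    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum

class SlottedFoo:
    __slots__ = ("lorem", "ipsum")

    def __init__(self, lorem, ipsum):
        self.lorem = lorem
        self.ipsum = ipsum

def allocated_by(cls, n=100000):
    # Measure how much memory n live instances of cls keep allocated.
    tracemalloc.start()
    objs = [cls(i, i) for i in range(n)]
    current, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return current

print("without __slots__:", allocated_by(PlainFoo))
print("with __slots__:   ", allocated_by(SlottedFoo))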

CtrlAltF2