
The R package multicore has a function mclapply(), which applies a specified function over a list of things and takes advantage of multiple cores. It's easy to use and results in big speed boosts.

Is there a Python equivalent? Thanks

  • http://docs.python.org/2/library/multiprocessing.html – jamylak May 02 '13 at 13:10
  • I don't think pandas/numpy can make use of multicores... yet. This is [being worked on](https://github.com/ContinuumIO/blaze-core). – Andy Hayden May 02 '13 at 13:15
  • This isn't a single-process solution and will not map to mclapply. But if you run processing functions on a single dataset that doesn't change, this helps: http://ipython.org/ipython-doc/dev/parallel – user1827356 May 02 '13 at 14:25
  • We should have something like this hopefully pretty soon: https://github.com/pydata/pandas/issues/3202 – Jeff May 02 '13 at 14:33

1 Answer


This functionality is available from the map method in multiprocessing.Pool()

See this answer for more details: Is there a simple process-based parallel map for python?
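As a minimal sketch of the mclapply-style pattern (the square function here is just a placeholder for whatever you would pass to mclapply):

```python
from multiprocessing import Pool

def square(x):
    # Placeholder for the function you would apply to each element
    return x * x

if __name__ == '__main__':
    # The __main__ guard is required on platforms that spawn workers (e.g. Windows)
    pool = Pool()  # defaults to one worker process per CPU core
    results = pool.map(square, range(10))  # parallel equivalent of map()/mclapply()
    pool.close()
    pool.join()
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

Note that, as with mclapply, the work is split across separate processes, so the mapped function and its arguments must be picklable.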
