I'm going to discourage using `sum` here, as it's a form of Schlemiel the Painter's algorithm. `sum` actually forbids this outright for `str` (they didn't extend the check to every other sequence type, to avoid slowing `sum` down trying to catch every possible misuse), but it's still a bad idea.
The problem is that it builds progressively larger temporary `list`s each time, throwing away the last temporary after building the next one by copying everything seen so far, plus the new stuff, over and over. If the first `list` has a million items in it, and you have ten more `list`s to concatenate onto it, you're copying at least 10 million elements (even if the other ten `list`s are empty). Your original code was actually better, in that using the `+=` operator performed in-place extension, keeping the worst-case performance in the `O(n)` range (for `n` elements across all `list`s), rather than `O(n*m)` (for `n` elements across `m` `list`s).
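To see the difference concretely, here's a minimal micro-benchmark sketch comparing the two approaches; the helper names and input sizes are made up for illustration:

```python
import timeit

# 500 sublists of 100 items each; sizes chosen arbitrarily for illustration
lists = [[0] * 100 for _ in range(500)]

def with_sum():
    # Each addition copies the entire accumulated result: O(n*m) total
    return sum(lists, [])

def with_extend():
    # += extends in place, so the total work is amortized O(n)
    out = []
    for sub in lists:
        out += sub
    return out

assert with_sum() == with_extend()
print("sum:   ", timeit.timeit(with_sum, number=10))
print("extend:", timeit.timeit(with_extend, number=10))
```

On any nontrivial input, the `sum` version falls further and further behind as the number of sublists grows.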
It also has the problem of only working for one consistent type; if some inputs are `list`s, some `tuple`s, and some generators, `sum` won't work (because `list.__add__` won't accept non-`list` operands on the other side).
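You can see that failure with a quick sketch mixing input types (the values here are made up for illustration):

```python
mixed = [[1, 2], (3, 4)]

# [] + [1, 2] works, but [1, 2] + (3, 4) does not:
# list.__add__ refuses a tuple on the right-hand side
try:
    sum(mixed, [])
except TypeError as exc:
    print("sum failed:", exc)
```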
So don't do that. This is what `itertools.chain` and its alternate constructor, `itertools.chain.from_iterable`, were made for:
```python
from itertools import chain

list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = list(chain.from_iterable(list_of_lists))
```
It's guaranteed `O(n)`, works with any iterables you throw at it, etc.
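For example, mixing `list`s, `tuple`s, generators, and `range`s is fine (a small sketch with made-up values):

```python
from itertools import chain

# chain only needs each argument to be iterable; the types can differ
flat = list(chain([1, 2], (3, 4), (n for n in (5, 6)), range(7, 9)))
print(flat)  # [1, 2, 3, 4, 5, 6, 7, 8]
```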
Yes, obviously if you've just got three `list`s of three elements apiece, it hardly matters. But if the size of the input iterables or the number of iterables is arbitrarily large, or the types aren't consistent, `chain` will work and `sum` won't.