
The acceptance of PEP 448 introduced Additional Unpacking Generalizations in Python 3.5.

For example:

>>> l1 = [1, 2, 3]
>>> l2 = [4, 5, 6]

# unpack both iterables in a list literal
>>> joinedList = [*l1, *l2]
>>> print(joinedList)
[1, 2, 3, 4, 5, 6]

QUESTION: Is there a way to do a similar thing with a list of lists?

This code does not work:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = [*l for l in list_of_lists]

It raises:

SyntaxError: iterable unpacking cannot be used in comprehension

Of course, you could do the following, but it looks less elegant and seems inefficient:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = list()
for l in list_of_lists:
    joined_list += l
ShadowRanger
Jean-Francois T.
  • No. Use either [`itertools.chain.from_iterable`](https://docs.python.org/3/library/itertools.html#itertools.chain.from_iterable) or a nested list comprehension `[x for l in list_of_lists for x in l]` – Patrick Haugh Feb 12 '18 at 03:03
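The two approaches the comment suggests can be sketched like this (a minimal, self-contained example):

```python
from itertools import chain

list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# Nested list comprehension: read the for-clauses left to right,
# exactly as you would write the nested for loops.
joined = [x for l in list_of_lists for x in l]

# itertools.chain.from_iterable builds a lazy iterator over all
# elements; list() realizes it into a single flat list.
joined2 = list(chain.from_iterable(list_of_lists))

print(joined)   # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(joined2)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```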

2 Answers


How about going old school: sum()

Code:

joined_list = sum(list_of_lists, [])

Test Code:

# List of variable size
list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = sum(list_of_lists, [])
print(joined_list)

Results:

[1, 2, 3, 4, 5, 6, 7, 8, 9]
Stephen Rauch

I'm going to discourage using sum here, as it's a form of Schlemiel the Painter's algorithm. sum actually forbids this outright for str; the core developers didn't block every sequence type only because checking for every possible misuse would slow sum down, but it's still a bad idea.

The problem is that sum builds a progressively larger temporary list on each step, throwing away the previous temporary after copying everything seen so far, plus the new elements, into the next one, over and over. If the first list has a million items in it, and you have ten more lists to concatenate onto it, you're copying at least 10 million elements (even if the other ten lists are empty). Your original code was actually better: the += operator performs in-place extension, keeping the worst case performance O(n) (for n elements across all lists), rather than O(n*m) (for n elements across m lists).
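A rough way to see the difference (the helper names flatten_sum and flatten_extend are hypothetical, and absolute timings vary by machine; only the relative gap is the point):

```python
import timeit

def flatten_sum(lists):
    # sum repeatedly builds a new list: O(n*m) copying overall
    return sum(lists, [])

def flatten_extend(lists):
    # += on a list calls list.__iadd__, which extends in place:
    # each element is copied exactly once, O(n) total
    out = []
    for l in lists:
        out += l
    return out

lists = [[0] * 1000 for _ in range(200)]
t_sum = timeit.timeit(lambda: flatten_sum(lists), number=20)
t_ext = timeit.timeit(lambda: flatten_extend(lists), number=20)
print(f"sum: {t_sum:.3f}s  extend: {t_ext:.3f}s")
```

Both produce the same flat list; only the amount of copying differs, and the gap widens as the number of lists grows.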

It also has the problem of only working for one consistent type; if some inputs are lists, some tuples, and some generators, sum won't work (because list.__add__ won't accept non-list operands for the other side).
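A quick sketch of that failure mode, and of chain handling mixed iterable types without complaint:

```python
from itertools import chain

# A mix of a list, a tuple, and a generator
mixed = [[1, 2], (3, 4), (x for x in (5, 6))]

# sum fails as soon as list.__add__ sees a non-list operand
try:
    sum(mixed, [])
except TypeError as e:
    print("sum failed:", e)

# chain.from_iterable only requires that each element be iterable
joined = list(chain.from_iterable(mixed))
print(joined)  # [1, 2, 3, 4, 5, 6]
```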

So don't do that. This is what itertools.chain and its alternate constructor, itertools.chain.from_iterable, were made for:

from itertools import chain

list_of_lists = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
joined_list = list(chain.from_iterable(list_of_lists))

It's guaranteed O(n), works with any iterables you throw at it, etc.

Yes, obviously if you've just got three lists of three elements apiece, it hardly matters. But if the size of the input iterables or the number of iterables is arbitrarily large, or the types aren't consistent, chain will work where sum won't.

ShadowRanger