
I'm trying to figure out the difference in the time complexity between these two methods:

public ArrayList<Integer> populateList(int n) {
    ArrayList<Integer> list = new ArrayList<Integer>();
    for (int i = 0; i < n; i++) {
        list.add(0, i);   // insert at index 0
    }
    return list;
}

public ArrayList<Integer> populateList(int n) {
    ArrayList<Integer> list = new ArrayList<Integer>();
    for (int i = 0; i < n; i++) {
        list.add(i);      // append at the end
    }
    return list;
}

I understand that Big-O is defined in terms of the worst-case scenario, and the worst case for an addition to an ArrayList involves resizing, and thus copying all the elements into a new array. I think method 2 would be O(n^2), because for each element you add there is a possibility that you could have to copy all the existing elements into a larger array.

But I am not sure about method 1, because I am not sure about the order in which things are done. It seems like copying the elements and inserting the new element could be combined, so that you wouldn't have to first copy the old elements into a larger array and then shift them as needed when you add the new element. If that were the case, it would seem like method 1 is O(n^2) instead of O(n^3). But is that how it works?

philosonista
2 Answers


ArrayList is backed by an array. If you "insert" an element at the head of the array, you have to shuffle all the other elements to the right by one to make room. This means that a single call to:

list.add(0, i);

is O(n), because every element (already added) must be moved.
If you do this n times, it's O(n^2).
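To see where that per-call O(n) comes from, here is a minimal sketch of an array-backed insert at index 0 (a simplification for illustration only, not the real java.util.ArrayList source; the class and field names are made up):

class HeadInsertSketch {
    private int[] data = new int[16];
    private int size = 0;

    void addAtHead(int value) {
        if (size == data.length) {
            // grow the backing array when full (growth strategy simplified here)
            data = java.util.Arrays.copyOf(data, data.length * 2);
        }
        // shift every existing element one slot to the right: O(size) work on every call
        System.arraycopy(data, 0, data, 1, size);
        data[0] = value;
        size++;
    }
}

Doing this n times performs 0 + 1 + ... + (n - 1) shifts, which is where the O(n^2) total comes from.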

But adding an element to the end of the (backing) array:

list.add(i);

only requires putting a value into the next unused slot of the array, which is O(1), unless the array is full and a bigger array has to be allocated and copied into, which is O(n) for that particular call. But that only happens with ever-decreasing frequency as the array grows, roughly O(log n) times over n appends.

Those occasional resizes copy a geometrically growing number of elements, and the total copying work over n appends is bounded by about 2n. So n appends cost O(n) overall, i.e. amortized O(1) per append.
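As a companion sketch (again a simplification, not the actual ArrayList implementation, which uses a similar geometric growth factor), an append only touches the end of the array:

class AppendSketch {
    private int[] data = new int[16];
    private int size = 0;

    void addAtEnd(int value) {
        if (size == data.length) {
            // only when full: copy size elements into a bigger array.
            // With doubling, these copies sum to at most about 2n over n appends.
            data = java.util.Arrays.copyOf(data, data.length * 2);
        }
        data[size++] = value;  // the common case: O(1)
    }
}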

Bohemian

Method 1 is O(n^2).

Method 2 is much better than O(n^2): the internal array storing the data is allocated with some spare capacity, and the array grows geometrically, so copying all the elements happens only infrequently. Since you append at the end, there is no other reason to touch every element on each step.
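One rough way to convince yourself is to time both methods from the question as n grows (a sketch only; the numbers depend on your machine and JIT warm-up, so look at the trend rather than the absolute values):

import java.util.ArrayList;

public class PopulateListTiming {
    static ArrayList<Integer> populateHead(int n) {
        ArrayList<Integer> list = new ArrayList<Integer>();
        for (int i = 0; i < n; i++) {
            list.add(0, i);   // method 1: insert at the head
        }
        return list;
    }

    static ArrayList<Integer> populateTail(int n) {
        ArrayList<Integer> list = new ArrayList<Integer>();
        for (int i = 0; i < n; i++) {
            list.add(i);      // method 2: append at the end
        }
        return list;
    }

    public static void main(String[] args) {
        for (int n : new int[] {10_000, 20_000, 40_000}) {
            long t0 = System.nanoTime();
            populateHead(n);
            long t1 = System.nanoTime();
            populateTail(n);
            long t2 = System.nanoTime();
            System.out.printf("n=%d  head: %d ms  tail: %d ms%n",
                    n, (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
        }
    }
}

Doubling n should roughly quadruple the head-insert time and only roughly double the append time, matching the O(n^2) vs. O(n) behaviour described above.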

Stefan Haustein
  • I'm a bit confused about how worst-case scenarios work. Infrequently, but still sometimes, Method 2 would be O(n^2), no? Do those infrequent occurrences not determine the Big-O of the algorithm as a whole? Is method 2 simply O(n)? – philosonista Sep 16 '15 at 00:56
  • Yes, method 2 is O(n) -- a nice explanation is here: http://stackoverflow.com/questions/200384/constant-amortized-time – Stefan Haustein Sep 16 '15 at 11:33