2

I am reading Cracking the Coding Interview and in the Big O chapter, there is an explanation of Amortized Time. The classic example of something such as an ArrayList needing to grow is used here. When an array needs to grow, insertion will take O(N) time, assuming it has to copy N elements to the new array. This is fine.

What I don't understand is this: as the array is doubled in capacity, why would the amortized time for each insertion be O(1)? From everything I understand, any time you insert into an array, it is always an O(N) operation. How is it different for amortized time? I'm sure the text is correct, I'm just not grokking the O(1) amortized time concept.

randombits
  • 41,533
  • 69
  • 218
  • 394
  • 2
    Already has an answer here [Constant Amortized Time](https://stackoverflow.com/questions/200384/constant-amortized-time) – Dúthomhas Aug 31 '17 at 02:09
  • 3
    *Inserts*? Are you sure it didn't say appending? – user2357112 supports Monica Aug 31 '17 at 02:09
  • As the array is doubled in capacity, the probability of the array needing to grow decreases exponentially and approaches 0. – 4castle Aug 31 '17 at 02:14
  • @Dúthomhas my question is more or less how is an array insert ever O(1). I thought an Insert is always O(N) no matter the circumstances. – randombits Aug 31 '17 at 02:25
  • Time considerations only matter as ***n*** → *some really big number*. In terms of the code that actually messes with the array, yes, it is O(n) (or worse). In terms of calling the function that messes with the array over the lifetime of your application, the amount of time it takes approaches a specific time interval, which you can take as 1. – Dúthomhas Aug 31 '17 at 02:30
  • Possible duplicate of [dynamic array's time complexity of putting an element](https://stackoverflow.com/questions/33331314/dynamic-arrays-time-complexity-of-putting-an-element) – Gene Aug 31 '17 at 03:15
  • Btw. doubling is not the only way to get amortized O(1) append time. Any constant *factor* will work, you could for example enlarge by 10 % whenever there is a need. – Henry Aug 31 '17 at 04:02

1 Answer

7

I am answering the question that you seem confused about, rather than what you officially asked.

Your real question is how appending can be an O(1) operation. If space is already allocated for the next element of the array, then appending is just updating the record of how many elements are in it, and copying the entry. This is an O(1) operation.
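To make that concrete, here is a minimal sketch of a dynamic array in Python (a hypothetical `DynamicArray` class, not Java's real `ArrayList`, though the idea is the same). When spare capacity exists, append is just a write plus a size increment:

```python
class DynamicArray:
    def __init__(self):
        self.capacity = 4                       # spare slots allocated up front
        self.size = 0
        self.data = [None] * self.capacity

    def append(self, value):
        if self.size == self.capacity:          # out of room: grow (O(n) copy)
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):          # copy every existing element
                new_data[i] = self.data[i]
            self.data = new_data
        self.data[self.size] = value            # O(1) write into a spare slot
        self.size += 1

arr = DynamicArray()
for i in range(10):
    arr.append(i)
```

Most of the ten appends above hit the O(1) path; only the appends that land exactly on a full capacity (4, then 8) trigger the O(n) copy.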

Appending is only expensive if you overflow the available space. Then you have to allocate a larger region, move the whole array over, and delete the previous one. This is an O(n) operation. But if we only do it a 1/n fraction of the time, then on average it still comes out to O(n * 1/n) = O(1).
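You can check that averaging argument empirically. The sketch below (an illustrative counter, not any library's actual implementation) counts how many element copies a doubling strategy performs over n appends; the total stays below 2n, so the average cost per append is a constant:

```python
def total_copies(n, initial_capacity=1):
    """Count element copies done by a doubling dynamic array over n appends."""
    capacity, size, copies = initial_capacity, 0, 0
    for _ in range(n):
        if size == capacity:      # full: a resize moves every existing element
            copies += size
            capacity *= 2
        size += 1
    return copies

for n in (10, 1000, 1_000_000):
    c = total_copies(n)
    print(n, c, c / n)            # the ratio stays below 2 no matter how big n gets
```

The sum of copies is 1 + 2 + 4 + ... + 2^k < 2n, which is exactly why doubling gives amortized O(1) per append.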

Whether or not the average matters depends on your task. If you're controlling heavy machinery, spending too long on an individual operation can mean you don't get back to the rotating blade soon enough, which can be Officially Bad. If you're generating a web page then all that matters is the total time taken for a sequence of operations, and that will be the number of operations times the average time each.

For most programmers, the average is what matters.

btilly
  • 35,214
  • 3
  • 46
  • 74