
This is not homework; I am studying amortized analysis. Something confuses me: I can't fully understand the difference between amortized and average complexity, and I'm not sure whether my understanding below is right. Here is a question:

--

We know that the runtime complexity of a program depends on its input. Suppose that with probability p (where p << 1) the program runs in O(n), and in the other cases (i.e., with probability 1-p) it runs in O(log n). If we run the program with K different input combinations, where K is a very large number, we can say that the amortized and average runtime complexity of this program is:

--

My first question: I have read the question here: Difference between average case and amortized analysis.

So I think there is no answer for the average runtime complexity, because we have no idea what an "average" input looks like. Yet the expected answer seems to be p*O(n) + (1-p)*O(log n). Which is correct, and why?
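
If one does assume the inputs are drawn with exactly those probabilities, the expected cost works out like this (a sketch of the intended reasoning; the constants $c_1, c_2$ are mine, not from the question):

$$
\mathbb{E}[T(n)] \;=\; p \cdot c_1 n \;+\; (1-p) \cdot c_2 \log n \;=\; \Theta(p\,n + \log n),
$$

which for any fixed $p > 0$ is the same asymptotic class as the quoted answer $O(p\,n)$, because $p\,n$ eventually dominates $\log n$.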

Second, the amortized part. I have read Constant Amortized Time, and we already know that amortized analysis differs from average-case analysis in that probability is not involved; an amortized analysis guarantees the average performance of each operation in the worst case.
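
For reference, the way that guarantee is usually formalized (my paraphrase of the standard textbook definition, not a quote from the link): if each operation $i$ is assigned an amortized cost $\hat{c}_i$, then the actual costs $c_i$ must satisfy

$$
\sum_{i=1}^{m} c_i \;\le\; \sum_{i=1}^{m} \hat{c}_i \qquad \text{for every possible sequence of } m \text{ operations,}
$$

so the bound holds in the worst case over sequences, and no probability distribution appears anywhere.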

Can I just say that the amortized runtime is O(n)? But the given answer is O(p*n). I'm a little confused about why probability is involved at all. Even though O(n) = O(p*n) for constant p, I really can't see why p should appear there. If I change my way of thinking and suppose we run the program many times, so that K becomes very big, then the amortized runtime is (K*p*O(n) + K*(1-p)*O(log n)) / K = O(p*n). But that seems to be the same idea as the average case.
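
To make the difference concrete, here is a small simulation sketch in Python of the "run the program K times" reasoning; the cost model (cost n for the slow case, log n for the fast case) and the particular values of n, p, K are illustrative assumptions, not part of the original question:

    import math
    import random

    def simulated_cost(n: int, p: float) -> float:
        """Cost of one run under the question's model:
        with probability p the run costs ~n, otherwise ~log2(n)."""
        return n if random.random() < p else math.log2(n)

    def average_cost(n: int, p: float, K: int) -> float:
        """Empirical average over K independent runs.
        This estimates the expected cost p*n + (1-p)*log2(n);
        it says nothing about the cost of any single worst-case run."""
        return sum(simulated_cost(n, p) for _ in range(K)) / K

    if __name__ == "__main__":
        n, p, K = 1_000_000, 0.001, 100_000
        print(average_cost(n, p, K))            # close to the analytic expectation
        print(p * n + (1 - p) * math.log2(n))   # p*n + (1-p)*log2(n)

Averaging over random runs like this is exactly an expected-cost (average-case) argument; the worst single run still costs Θ(n), which is why amortization by itself buys nothing here.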

Sorry for the confusion; please help me. Thanks!

zongyuwu
  • "an amortized analysis guarantees the average-case performance of each operation in the worst case." No it doesn't. It guarantees the worst-case performance of a _large number_ of operations. Reread the +10 answer by RossFabricant from your first link. – Mooing Duck Jan 29 '14 at 01:29
  • @MooingDuck: Actually it guarantees the *worst case* complexity of performing a sequence of operations – Niklas B. Jan 29 '14 at 01:31
  • @NiklasB.: Yeah, You're probably right there – Mooing Duck Jan 29 '14 at 01:33
  • Why do you say "But the answer is O(p*n)"? And in your fourth paragraph you list one idea and then ask "which is correct" which makes no sense. – Mooing Duck Jan 29 '14 at 01:35
  • "If we are running the program with K different input combinations, where K is a very large number, we can say that the amortized and average runtime complexity of this program is" No, neither of these methods works empirically by choosing inputs at random and checking the runtime. Amortized analysis actually has nothing to do with probabilities at all. – Niklas B. Jan 29 '14 at 01:38
  • Thanks, I read it from the Introduction to Algorithms amortized analysis chapter, but I didn't include the whole paragraph here – zongyuwu Jan 29 '14 at 01:38
  • And yes, the worst-case runtime of each operation is O(n). Amortization doesn't help you at all here, at least if we don't know more about the sequence of operation types. – Niklas B. Jan 29 '14 at 01:39
  • It seems like this might be a reading comprehension problem before anything else. We can explain the subject, and hopefully that will help with the comprehension -- try re-reading that section one more time! – comingstorm Jan 29 '14 at 02:29
  • @comingstorm: I doubt that will help, it really doesn't seem to be a good resource... – Niklas B. Jan 29 '14 at 02:32

1 Answer


With "average" or "expected" complexity, you are making assumptions about the probability distribution of the problem. If you are unlucky, (or if your problem generator maliciously fails to match your assumption 8^), all your operations will be very expensive, and your program might take a much greater time than you expect.

Amortized complexity is a guarantee on the total cost of any sequence of operations. That means, no matter how malicious your problem generator is, you don't have to worry about a sequence of operations taking a much greater time than you expect.
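
A concrete illustration (my own example, not one given in this answer) is the classic doubling dynamic array: a single append may cost Θ(n) when the storage is reallocated, but any sequence of m appends costs O(m) in total, so the amortized cost per append is O(1) no matter what sequence an adversary chooses:

    class DynamicArray:
        """Minimal doubling array to illustrate amortized O(1) append.
        A single append can be expensive (it copies every element),
        but m consecutive appends perform at most ~3m element writes in total."""

        def __init__(self):
            self._items = [None]   # backing storage, capacity 1
            self._size = 0         # number of used slots

        def append(self, value):
            if self._size == len(self._items):        # storage is full:
                bigger = [None] * (2 * len(self._items))
                for i in range(self._size):            # this copy costs Theta(n) ...
                    bigger[i] = self._items[i]
                self._items = bigger
            self._items[self._size] = value            # ... but it happens rarely
            self._size += 1

    arr = DynamicArray()
    for x in range(1000):
        arr.append(x)   # total work is O(1000), i.e. amortized O(1) per append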

(Depending on the algorithm, it is not hard to accidentally stumble on the worst case. The classic example is naive Quicksort, which does very badly on mostly-sorted input, even though the "average" case is fast.)
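
For instance, a naive Quicksort that always picks the first element as the pivot (my sketch of the classic example; the exact implementation is an assumption, not from the answer) degrades to Θ(n²) on already-sorted input, even though its average cost over uniformly random inputs is Θ(n log n):

    def naive_quicksort(xs):
        """Quicksort with the first element as the pivot.
        On already-sorted input every partition is maximally unbalanced,
        giving ~n levels of recursion and Theta(n^2) total work."""
        if len(xs) <= 1:
            return xs
        pivot, rest = xs[0], xs[1:]
        smaller = [x for x in rest if x < pivot]
        larger = [x for x in rest if x >= pivot]
        return naive_quicksort(smaller) + [pivot] + naive_quicksort(larger)

    # Already-sorted input is the worst case here: recursion depth equals n.
    # (For large n this would also exceed Python's default recursion limit.)
    print(naive_quicksort(list(range(20))))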

comingstorm