
So I have a course on algorithms next semester and I am trying to prepare for it. I am starting out with asymptotic analysis. The book seems to state that as long as I can find a constant C and some number n0 such that f(n) <= C*g(n) whenever n > n0, then I can conclude f(n) = O(g(n)). It would seem inductive reasoning would be required to prove it holds true for all n, which the book doesn't say anything about! I am also seeing some people answer questions using a definition of big O that I am not seeing in the book (it does give one like this for little o, though). Namely, O is:

f(n) = O(g(n)) if lim_{n→∞} f(n)/g(n) exists

Is this considered a more rigorous way of proving asymptotic bounds?

If that is the case I take it Ω is:

f(n) = Ω(g(n)) if lim_{n→∞} f(n)/g(n) > 0

and Θ is: f(n) = O(g(n)) ∧ f(n) = Ω(g(n))

Not having a teacher to ask, I am not sure of the best way to go about these sorts of questions.

An example would be: determine the relationship between f(n) = n log n and g(n) = n·√n.

  • While certainly programming related, this is more of a maths question, and probably best asked over on https://math.stackexchange.com, or at the very least "also" asked there. – Mike 'Pomax' Kamermans Apr 25 '20 at 21:10
  • For big O notation, the limit definition you provided is equivalent to finding a constant `C` such that `|f(x)| <= C*g(x)` for `x >= x_0`. Well, to be more precise, you only need the [limit superior](https://en.m.wikipedia.org/wiki/Limit_superior_and_limit_inferior) of `|f(x)/g(x)|` rather than the full limit. You can read more [here](https://en.m.wikipedia.org/wiki/Big_O_notation#Formal_definition). For strictly increasing positive functions (as commonly used in algorithm analysis), the limit superior of the ratios is equal to the limit of the ratios. – ljeabmreosn Apr 25 '20 at 23:50
  • FWIW I personally would prefer a proof that did NOT rely on limits. Induction may or may not be helpful depending on the case. There's nothing wrong with using limits for this from a theoretical perspective provided you're careful. – Patrick87 May 01 '20 at 13:20

1 Answer


You’ve asked several questions here. Let’s address each in turn:

The book seems to state that as long as I can find a constant C and some number n0 such that f(n) <= C*g(n) whenever n > n0, then I can conclude f(n) = O(g(n)). It would seem inductive reasoning would be required to prove it holds true for all n, which the book doesn't say anything about!

That is indeed the formal definition of big-O notation. However, nothing about this definition necessitates the use of induction. For example, suppose I want to prove that 2n + 137 = O(n). I’ll pick c = 3 and n0 = 137, then show that these constants have the properties we want. Specifically, pick any n ≥ 137. Then

2n + 137

≤ 2n + n (since 137 ≤ n)

= 3n,

and we’ve arrived here without using any induction. Since n was an arbitrary number with n ≥ 137, this single chain of inequalities covers every such n at once.
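
If you want a quick numeric sanity check (not a proof, just a sketch to build confidence; the constants c = 3 and n0 = 137 are the ones from the argument above):

```python
# Sanity check (not a proof!) that 2n + 137 <= 3n for every tested n >= 137,
# using the constants c = 3 and n0 = 137 chosen in the argument above.
def f(n): return 2 * n + 137
def g(n): return n

c, n0 = 3, 137
assert all(f(n) <= c * g(n) for n in range(n0, 1_000_000))
print("f(n) <= 3 * g(n) held for every tested n >= 137")
```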

I am also seeing some people answer questions using a definition of big O that I am not seeing in the book (it does give one like this for little o, though). Namely, O is:

f(n) = O(g(n)) if lim_{n→∞} f(n)/g(n) exists

Is this considered a more rigorous way of proving asymptotic bounds?

This one is interesting. Imagine that f and g are functions where lim_{n→∞} f(n)/g(n) exists. Then indeed you will have f(n) = O(g(n)). One way to see this is to use the formal definition of limits at infinity. Specifically, if L is finite, then by definition

lim_{n→∞} h(n) = L if and only if for any ε > 0, there is an n_ε such that |h(n) - L| ≤ ε for all n ≥ n_ε.

So suppose that lim_{n→∞} f(n)/g(n) = L for some constant L. Now, pick ε = 1, so there’s some n1 where for any n ≥ n1 we have

|f(n) / g(n) - L| ≤ 1.

In particular, that means that

f(n) / g(n) - L ≤ 1,

so, after multiplying both sides by g(n) (which we can take to be positive, as runtime functions are),

f(n) ≤ (1 + L)g(n)

Picking c = 1 + L and n0 = n1 then gives us the constants we need to prove f(n) = O(g(n)).

Stated differently, if that limit exists, then f(n) = O(g(n)).
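
As an illustration (assuming you have SymPy available; this is just a sketch of the "compute L, then take c = 1 + L" recipe, not part of the formal proof):

```python
import sympy as sp

# f(n) = 2n + 137 and g(n) = n from the earlier example:
# the limit of f(n)/g(n) exists, so f(n) = O(g(n)).
n = sp.symbols('n', positive=True)
L = sp.limit((2 * n + 137) / n, n, sp.oo)
print(L)  # 2, so c = 1 + L = 3 works -- matching the direct proof above
```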

However, it’s possible that f(n) = O(g(n)) even if that limit doesn’t exist. For example, pick f(n) = 2 + sin n and g(n) = 1. Then f(n) = O(g(n)), but the limit of f(n)/g(n) doesn’t exist because the values keep oscillating between 1 and 3 as n grows. So the converse isn’t true.
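
Here’s a quick numeric look at that counterexample (again just a sanity check, sampling the ratio at integer points):

```python
import math

# f(n) = 2 + sin(n), g(n) = 1: the ratio f(n)/g(n) stays between 1 and 3,
# so f(n) = O(g(n)) with c = 3, yet it never settles down to a single limit.
ratios = [(2 + math.sin(n)) / 1 for n in range(1, 10_000)]
print(min(ratios), max(ratios))  # roughly 1.0 and 3.0
assert all(r <= 3 for r in ratios)
```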

I think - but am not 100% sure - that if you replace the limit with a limit superior, you get an equivalence, but that’s something I’ll need to think more about.

Is this a more “rigorous” way to do big-O proofs? I wouldn’t say so. It’s just as rigorous as doing it the other way, though it may be more helpful if you’re trying to build intuition.

If that is the case I take it Ω is:

f(n) = Ω(g(n)) if lim_{n→∞} f(n)/g(n) > 0

and Θ is: f(n) = O(g(n)) ∧ f(n) = Ω(g(n))

Very close! As mentioned above, the limit trick isn’t a “definition” of big-O notation because it doesn’t handle all cases, and the same problem carries over here: f(n) = Ω(g(n)) can hold even when that limit doesn’t exist, so you can’t define Ω notation this way either. However, your definition of Θ notation is spot-on.
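
As for the example at the end of your question, here’s the limit trick applied to f(n) = n log n and g(n) = n·√n:

lim_{n→∞} (n log n) / (n·√n) = lim_{n→∞} (log n) / √n = 0

(by L’Hôpital’s rule, for instance: (1/n) / (1/(2√n)) = 2/√n → 0). Since the limit exists, n log n = O(n·√n), and since it’s 0 rather than a positive constant, the bound isn’t tight: n log n is o(n·√n), and therefore not Θ(n·√n).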

Hope this helps, and good luck!

templatetypedef
  • Hmm but if you prove it's true for n0 don't I need to show the same holds true for all integers greater than n0? That is where the induction part comes in. Well, where I think it would. – Megan Vineyard Apr 26 '20 at 02:18
  • You do need to prove it’s true for all those integers, and induction is one way to do this, but it’s not the only way. You can generally prove that all objects of type X have property Y by picking an arbitrary object of type X and then showing it has property Y. – templatetypedef Apr 26 '20 at 04:03
  • @MeganVineyard You might also find that f(n) <= c * g(n) holds exactly when n > n0, just by simple algebraic (or other) manipulations. For instance, for f(n) = 25n + 10 and g(n) = 30n, you might guess c = 1 and find that f(n) <= c * g(n) iff 25n + 10 <= 30n iff 10 <= 5n iff n >= 2. In such cases you can simply choose n0 = 2 and you have your proof that it works for all n >= n0 right there. – Patrick87 Apr 30 '20 at 16:48