
I am to show that log(n!) = Θ(n·log(n)).

A hint was given that I should show the upper bound with n^n and show the lower bound with (n/2)^(n/2). This does not seem all that intuitive to me. Why would that be the case? I can definitely see how to convert n^n to n·log(n) (i.e. log both sides of an equation), but that's kind of working backwards.

What would be the correct approach to tackle this problem? Should I draw the recursion tree? There is nothing recursive about this, so that doesn't seem like a likely approach.

– Mark

9 Answers


Remember that

log(n!) = log(1) + log(2) + ... + log(n-1) + log(n)

You can get the upper bound by

log(1) + log(2) + ... + log(n) <= log(n) + log(n) + ... + log(n)
                                = n*log(n)

And you can get the lower bound by doing a similar thing after throwing away the first half of the sum:

log(1) + ... + log(n/2) + ... + log(n) >= log(n/2) + ... + log(n) 
                                       = log(n/2) + log(n/2+1) + ... + log(n-1) + log(n)
                                       >= log(n/2) + ... + log(n/2)
                                        = n/2 * log(n/2) 
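
A quick numerical sanity check of both bounds (a minimal Python sketch; the helper name `log_factorial` is just illustrative):

    import math

    def log_factorial(n):
        # log(n!) computed directly as the sum log(1) + log(2) + ... + log(n)
        return sum(math.log(k) for k in range(1, n + 1))

    for n in (10, 100, 1000, 10000):
        lower = (n / 2) * math.log(n / 2)   # n/2 * log(n/2)
        upper = n * math.log(n)             # n * log(n)
        assert lower <= log_factorial(n) <= upper
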
– Mick
  • Interesting... Pushing this into a sum, there is an approximation that results in the theta notation itself (source: http://en.wikipedia.org/wiki/Summation); how that approximation is derived is an entirely separate matter. Nice lead, thanks! – Mark Jan 19 '10 at 17:40
  • This is a very nice proof for the upper bound: log(n!) = log(1) + ... + log(n) <= n log(n) => log(n!) = O(n log n). However, for proving the lower bound (and consequently big-theta), you'll probably need Stirling's approximation. – mmx Jan 19 '10 at 20:34
  • You don't need Stirling's approximation for a lower bound. log(n!) = log(1) + ... + log(n) >= log(n/2) + ... + log(n) >= n/2 * log(n/2) = Omega(n log n). – Keith Randall Jan 19 '10 at 22:40
  • @Keith: I don't get it yet. Could you (or someone) expand a few more terms for me in the "..." part of "log(n/2) + ... + log(n)" please? Thanks! – j_random_hacker Jan 21 '10 at 09:31
  • @j_random_hacker: `log(n/2) + log(n/2 + 1) + ... + log(n - 1) + log(n)` (the larger half of the terms of `log(n!)`). Actually, I just read the question and saw that the clue is stated in the question. Basically, `(n/2)^(n/2) <= n! <= n^n` => `log((n/2)^(n/2)) <= log(n!) <= log(n^n)` => `Θ(n/2 * log(n/2)) <= log(n!) <= Θ(n*log(n))`. – mmx Jan 21 '10 at 14:07
  • This explanation is similar to the accepted answer, but has a bit more detail: http://www.mcs.sdsmt.edu/ecorwin/cs372/handouts/theta_n_factorial.htm – gayavat Oct 02 '15 at 08:15
  • For the lower bound, you can even get the constant right by choosing the largest `1 - 1/log n` fraction of the terms instead of only `1/2`; then you get `log(n!) >= n(1 - 1/log n) * log(n/log n) ~ n log n - n log log n`. – Thomas Ahle Jan 16 '18 at 20:21

I realize this is a very old question with an accepted answer, but none of these answers actually use the approach suggested by the hint.

It is a pretty simple argument:

n! (= 1*2*3*...*n) is a product of n numbers each less than or equal to n. Therefore it is less than the product of n numbers all equal to n; i.e., n^n.

Half of the numbers -- i.e. n/2 of them -- in the n! product are greater than or equal to n/2. Therefore their product is greater than the product of n/2 numbers all equal to n/2; i.e. (n/2)^(n/2).

Take logs throughout to establish the result.
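
The same argument in executable form, as a quick sanity check (a minimal Python sketch, illustrative only):

    import math

    # (n/2)^(n/2) <= n! <= n^n, checked exactly with integers for a few n
    for n in (4, 10, 50):
        half = n // 2
        assert half ** half <= math.factorial(n) <= n ** n

    # Taking logs throughout gives (n/2)*log(n/2) <= log(n!) <= n*log(n).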

– Nemo

  • This is actually just the same as the log version in the accepted answer, but taking the logarithm after instead of before. (It more clearly uses the hint, though.) – hugomg Nov 03 '11 at 04:21

[Image: a step-by-step derivation concluding that log(n!) = O(n·log(n)); the individual steps are discussed in the comments below.]

Sorry, I don't know how to use LaTeX syntax on Stack Overflow.

– Samuel
  • This is a great explanation! I could follow this until step 7, but then I cannot decode the mathemagic that happens between step 7 and step 8... :-( – Z3d4s Oct 31 '19 at 16:15
  • @Z3d4s The argument in step 7 is basically that the first term on the right-hand side is the dominant term, and that log(n!) can therefore be approximated by n*log(n), or that it is of order n*log(n), which is expressed by the big-O notation O(n*log(n)). – Samuel Oct 31 '19 at 18:44
  • @Z3d4s What the step 7→8 conversion is saying is that n*log n == log(n^n); for showing the bound here you can say the first term is always greater than the second term (you can check this for larger values), and for expressing big-O complexity we always take the dominating term. So n*log n contributes to the big-O time. – Shiv Prakash Apr 18 '20 at 13:18
  • I am a beginner. Does step 7 mean that log(n!) = n*log(n) + {some positive but non-dominating term} => log(n!) = O(n*log(n))? But shouldn't it be Omega(n*log(n)) instead of O(n*log(n))? Please clear my doubts. – rsonx Oct 07 '20 at 12:28
  • @rsonx It's both - see the accepted answer, for example. Regarding the {some positive but non-dominating term} - actually all the elements of the multiplication are < 1, therefore the product is < 1 as well. `log` brings it to negative values, so the asymptotic result here is actually `O(nlogn) - {something}`. It remains to be proved whether that something is insignificant asymptotically compared to `nlogn` - and this answer doesn't provide that. It doesn't affect the big-O but can affect the Omega. Anyway, this is an overcomplication; see the other answers. – SomeWittyUsername Nov 12 '20 at 00:58

See Stirling's Approximation:

ln(n!) = n*ln(n) - n + O(ln(n))

where the last 2 terms are less significant than the first one.
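
To get a feel for how dominant the first term is, here is a small Python comparison (a sketch using `math.lgamma`, since `lgamma(n + 1)` equals `ln(n!)`):

    import math

    for n in (10, 100, 1000, 10000):
        exact = math.lgamma(n + 1)      # ln(n!)
        approx = n * math.log(n) - n    # the two leading terms above
        print(n, exact, approx, exact - approx)
    # The gap grows only like O(ln(n)), so ln(n!) = Theta(n*ln(n)).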

– dsimcha

For lower bound,

lg(n!) = lg(n)+lg(n-1)+...+lg(n/2)+...+lg2+lg1
       >= lg(n/2)+lg(n/2)+...+lg(n/2)+ ((n-1)/2) lg 2    (drop the last term lg 1 = 0; replace the first n/2 terms with lg(n/2) and the remaining (n-1)/2 terms with lg 2, which makes the cancellation easier later)
       = n/2 lg(n/2) + (n/2) lg 2 - 1/2 lg 2
       = n/2 lg n - (n/2)(lg 2) + n/2 - 1/2
       = n/2 lg n - 1/2

lg(n!) >= (1/2) (n lg n - 1)

Combining both bounds :

1/2 (n lg n - 1) <= lg(n!) <= n lg n

By choosing a lower-bound constant smaller than (1/2) we can compensate for the -1 inside the bracket.

Thus lg(n!) = Theta(n lg n)
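
A minimal Python check of the combined bounds (using base-2 logs to match the lg notation above):

    import math

    def lg_factorial(n):
        # lg(n!) as the sum lg(1) + lg(2) + ... + lg(n)
        return sum(math.log2(k) for k in range(1, n + 1))

    for n in (8, 64, 1024):
        assert 0.5 * (n * math.log2(n) - 1) <= lg_factorial(n) <= n * math.log2(n)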

  • This extended derivation is needed because "something" >= n/2 * lg(n/2) is not by itself equal to omega(n lg n), which was mentioned in one of the previous comments. – Vivek Anand Sampath Feb 21 '15 at 03:13
  • This should read "a constant SMALLER than (1/2)", as we are trying to find a lower bound. Any constant, c, smaller than (1/2) will eventually make c*n*logn <= (1/2)n*logn-(1/2)n, for a large enough n. – Matthew Aug 14 '19 at 04:08

Helping you further, where Mick Sharpe left you:

Its derivation is quite simple: see http://en.wikipedia.org/wiki/Logarithm -> Group Theory

log(n!) = log(n * (n-1) * (n-2) * ... * 2 * 1) = log(n) + log(n-1) + ... + log(2) + log(1)

Think of n as infinitely big. What is infinity minus one? Or minus two? etc.

log(inf) + log(inf) + log(inf) + ... = inf * log(inf)

And then think of inf as n.
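
The hand-waving can be made a little more concrete by watching the ratio log(n!) / (n*log(n)) creep towards 1 as n grows (a small Python sketch):

    import math

    for n in (10, 100, 10000, 1000000):
        # math.lgamma(n + 1) is log(n!)
        print(n, math.lgamma(n + 1) / (n * math.log(n)))
    # The printed ratio slowly climbs towards 1, consistent with log(n!) = Theta(n*log(n)).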

– Pindatjuh

Thanks, I found your answers convincing, but in my case I must use the Θ properties:

log(n!) = Θ(n·log n) =>  log(n!) = O(n log n) and log(n!) = Ω(n log n)

To verify the result I found this page, where the whole process is explained: http://www.mcs.sdsmt.edu/ecorwin/cs372/handouts/theta_n_factorial.htm

– WyrmxD

This might help:

e^ln(x) = x

and

(l^m)^n = l^(m*n)
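
Combining the two: n^n = (e^ln(n))^n = e^(n*ln(n)), so ln(n^n) = n*ln(n), which connects the hint's n^n to n*log(n). A tiny illustrative check in Python:

    import math

    for n in (2, 5, 10):
        assert math.isclose(math.log(n ** n), n * math.log(n))
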
– Anycorn

Stirling's approximation might help you: http://en.wikipedia.org/wiki/Stirling%27s_approximation. It is really helpful in dealing with problems on factorials involving huge numbers, of the order of 10^10 and above.

[Image: Stirling's approximation formula]