
Suppose we have a recursive function that terminates only when a randomly generated value meets some condition:

For example:

{define (some-recursive-function)

    x = (random in range of 1 to 100000);
    if (x == 10)
    {
        return "this function has terminated";
    }
    else
    {
        (some-recursive-function)
    }
}

I understand that for infinite loops, there would not be a defined complexity. But what about a function that definitely terminates, only after an unknown amount of time?

Finding the average time complexity for this would be fine. How would one go about finding the worst-case time complexity, if one exists?

Thank you in advance!

EDIT: As several have pointed out, I've completely missed the fact that there is no input to this function. Suppose instead, we have:

{define (some-recursive-function n)

    x = (random in range of 1 to n);
    if (x == 10)
    {
        return "this function has terminated";
    }
    else
    {
        (some-recursive-function n)
    }
}

Would this change anything?
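For concreteness, the edited pseudocode could be rendered in Python roughly as follows (a hypothetical translation; the function name is mine, and it assumes n >= 10, since otherwise the draw can never equal 10 and the recursion never ends):

```python
import random

def some_recursive_function(n):
    # Direct translation of the pseudocode above; assumes n >= 10.
    x = random.randint(1, n)  # uniform draw from 1..n inclusive
    if x == 10:
        return "this function has terminated"
    return some_recursive_function(n)  # retry with the same n
```

Note that for large n the expected recursion depth is roughly n, so a recursive rendering like this can hit Python's default recursion limit; an iterative loop avoids that.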

Evan Li
  • It would be challenging to find a complexity of the form O(f(n)) given that there's no n here – Leeor Oct 26 '17 at 19:50
  • https://en.wikipedia.org/wiki/Bogosort 's worst case is unbounded. Uses random numbers to sort as well. – Caramiriel Oct 26 '17 at 20:23
  • Isn't worst case infinite? There is no guarantee that a given number in a range will be selected by random. – Moop Oct 26 '17 at 20:24
  • Did you mean to pass `n` in your recursive call, or `x`? – Jim Mischel Oct 26 '17 at 22:22
  • @Moop actually there is, at least theoretically: every number in the range has exactly the same probability, which means that if the RNG generated some number, then after some time it will generate any other number in the range – Iłya Bursov Oct 26 '17 at 23:10
  • @IlyaBursov That still isn't a guarantee, just very improbable. Worst case means worst case; that means poor little 10 never gets called upon. – Moop Oct 26 '17 at 23:26
  • @Moop usually we're speaking about a uniform distribution, which guarantees that even in the worst case we will get all numbers eventually... probably after a long time – Iłya Bursov Oct 27 '17 at 04:48
  • @IlyaBursov Even with a uniform distribution, there is no guarantee any number will be picked, even if extended to the end of time. Highly improbable, but that's not the same as guarantee. Think of even the simple case of flipping a coin. There is nothing to guarantee that you will flip a heads at some point. Of course it is highly unlikely after multiple trials never to get a heads, but it is still possible. – Moop Oct 27 '17 at 05:01
  • @Moop the cumulative probability of getting one particular outcome from a finite set converges to 1, though of course only with an infinite number of throws – Iłya Bursov Oct 27 '17 at 06:03
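The disagreement in the comments above can be quantified: with a uniform draw from 1..n, the probability of never hitting 10 in k independent trials is (1 - 1/n)^k, which is positive for every finite k (so there is no finite-run guarantee) but tends to 0 as k grows. A quick sketch (the choice n = 100 is arbitrary):

```python
n = 100
for k in (100, 1_000, 10_000):
    p_never_hit = (1 - 1 / n) ** k  # chance that 10 was never drawn in k trials
    print(f"k={k}: P(no hit) = {p_never_hit:.3e}")
```

The probability of a miss shrinks geometrically but never reaches zero, which is exactly why the worst case is unbounded while the expected case is finite.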

2 Answers


If there is no function of n that bounds the runtime of the function from above, then there simply isn't an upper bound on the runtime. There could be a lower bound on the runtime, depending on the case. We can also speak about the expected runtime, and even put bounds on the expected runtime, but that is distinct from, on the one hand, bounds on the average case and, on the other hand, bounds on the runtime itself.

As it's currently written, there are no bounds at all when n is under 10: the function just doesn't terminate in any event. For n >= 10, there is still no upper bound on any of the cases - it can take arbitrarily long to finish - but the lower bound in any case is as low as linear (you must at least read the value of n, which consists of N = ceiling(log n) bits; your method of choosing a random number no greater than n may require additional time and/or space). The case behavior here is fairly uninteresting.

If we consider the expected runtime of the function in terms of the value (not length) of the input, we observe that there is a 1/n chance that any particular invocation picks the right random number (again, for n >= 10). The number of attempts needed to get one success is given by a geometric distribution, whose expectation is 1/(1/n) = n. So, the expected recursion depth is a linear function of the value of the input, n, and therefore an exponential function of the input size, N = log n.

We recover an exact expression for the expectation; the upper and lower bounds on it are therefore both linear as well, and this covers all cases (best, worst, average, etc.). I say recursion depth since the runtime will also have an additional factor of N = log n, or more, owing to the observation in the preceding paragraph.
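The geometric-distribution argument above can be checked empirically. A small simulation (the parameter choices here are mine) of the number of calls until the draw hits 10:

```python
import random
import statistics

def calls_until_hit(n):
    # Count invocations until a uniform draw from 1..n equals 10 (n >= 10).
    # Written iteratively so that deep runs don't overflow the call stack.
    count = 1
    while random.randint(1, n) != 10:
        count += 1
    return count

random.seed(42)
n = 50
samples = [calls_until_hit(n) for _ in range(2000)]
print(statistics.mean(samples))  # expectation is exactly n = 50
```

The sample mean lands near n, matching the 1/(1/n) = n expectation; individual runs, however, are unbounded, which is why no worst-case bound exists.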

Patrick87

Note that there are "simple" formulas for calculating the complexity of a recursive algorithm, using, of course, recurrence relations.

In this case we obviously need to know what that recursive algorithm is, because in the best case it is O(1) (temporal complexity), but in the worst case we need to add O(n) (taking into account that numbers may repeat) to the complexity of the algorithm itself.

For reference, here is a related question/answer:

Determining complexity for recursive functions (Big O notation)

M.K