
Here is a recursive function that traverses a map of strings (`multimap<string, string> graph`). For each entry it checks the value `itr->second` (stored in `s_tmp`); if `s_tmp` equals the desired string `Exp`, it takes the key `itr->first` and calls the function again with that key (printing when the original is found).

string findOriginalExp(string Exp){
    cout << "*****findOriginalExp Function*****" << endl;
    string str;
    if (graph.empty()) {
        str = "map is empty";
    } else {
        for (auto itr = graph.begin(); itr != graph.end(); itr++) {
            string s_tmp = itr->second;
            string f_tmp = itr->first;
            string nll = "null";
            //s_tmp.compare(Exp) == 0
            if (s_tmp == Exp) {
                if (f_tmp.compare(nll) == 0) {
                    cout << Exp << " :is original experience.";
                    return Exp;
                } else {
                    return findOriginalExp(itr->first);
                }
            } else {
                str = "No element is equal to Exp.";
            }
        }
    }
    return str;
}

There is no explicit stopping rule, and the recursion seems completely unpredictable. How is the time complexity of this function calculated?

JDługosz
simsim
  • @largest_prime_is_463035818 Is the code correct now? – simsim Mar 12 '21 at 14:52
  • BTW can you post a minimal, reproducible example? – GaryNLOL Mar 12 '21 at 14:57
  • note that I saw your comment only by chance. Ping via @ only works when there is a previous comment of that user. The UB is gone, but you can improve the question by adding the definition of `graph` and maybe some example of its contents. – 463035818_is_not_a_number Mar 12 '21 at 14:58
  • If you don't need an analytical answer, you can run experiments with [google-benchmark](https://github.com/google/benchmark#calculating-asymptotic-complexity-big-o). – Mansoor Mar 12 '21 at 15:02
  • You should probably indent your code properly. I'm not sure if you're missing a brace (on mobile so can't check easily) – Mad Physicist Mar 12 '21 at 15:12
  • I think I have to enter all the code. I just want to know, is it possible to calculate the time complexity when a function (recursive function) ends randomly? Contrary to what has happened here (https://stackoverflow.com/questions/13467674/determining-complexity-for-recursive-functions-big-o-notation). That is, a rule can be found to terminate the recursive function. – simsim Mar 12 '21 at 15:14

3 Answers


I am not going to analyse your function but will instead try to answer in a more general way. It seems like you are looking for a simple expression such as O(n) or O(n^2) for the complexity of your function. However, complexity is not always that simple to estimate.

In your case it depends strongly on the contents of `graph` and on what the caller passes as the parameter.

As an analogy consider this function:

int foo(int x){
    if (x == 0) return x;         // base case
    if (x == 42) return foo(42);  // never returns
    if (x > 0) return foo(x-1);   // x calls to reach 0
    return foo(x/2);              // halves |x| each call
}

In the worst case it never returns to the caller. If we ignore x >= 42, then the worst-case complexity is O(n). That alone isn't very useful information for the user. What I really need to know as a user is:

  • Don't ever call it with x >= 42.
  • O(1) if x==0
  • O(x) if x>0
  • O(log|x|) if x < 0

Now try to make similar considerations for your function. The easy case is when Exp is not in graph; then there is no recursion. I am almost sure that for the "right" input your function can be made to never return. Find out what those cases are and document them. In between you have cases that return after a finite number of steps. If you have no clue at all how to get your hands on them analytically, you can always set up a benchmark and measure. Measuring the runtime for input sizes 10, 50, 100, 1000, ... should be sufficient to distinguish between linear, quadratic and logarithmic dependence.

PS: Just a tip: Don't forget what the code is actually supposed to do and what time complexity is needed to solve that problem (often it is easier to discuss that in an abstract way rather than diving too deep into code). In the silly example above the whole function can be replaced by its equivalent int foo(int){ return 0; } which obviously has constant complexity and does not need to be any more complex than that.

463035818_is_not_a_number
  • thanks a lot. I understand. I will do what you said. – simsim Mar 12 '21 at 15:30
  • nit: asymptotic complexity is a well defined term and in the example is easy to estimate (since it's obvious when n approaches infinity). It's not always useful though. And one could always write some especially contrived functions that are really tricky to analyze. – Dan M. Mar 12 '21 at 15:50
  • @DanM. your nitpick is completely appropriate. I sensed a slight misconception and used an example to illustrate something. At best this is half an answer. I didn't aim for the top answer, and I am sorry if I decreased chances for OP to get a less handwavy answer.... – 463035818_is_not_a_number Mar 12 '21 at 15:53

This function takes a directed graph and a vertex in that graph, and chases edges going into that vertex backwards until it finds a vertex with no incoming edge. The operation of finding the vertex "behind" any given vertex takes O(n) string comparisons, where n is the number of k/v pairs in the graph (this is the for loop). It does this m times, where m is the length of the path it must follow (which it does through the recursion). Therefore, it has time complexity O(m * n) string comparisons, where n is the number of k/v pairs and m is the length of the path.

Note that there's generally no such thing as "the" time complexity of some function you see written in code. You have to define which variables you want to describe the time in terms of, and also the operations with which you want to measure the time. E.g. if we want to express this purely in terms of n, the number of k/v pairs, we run into a problem: if the graph contains a suitably placed cycle, the function doesn't terminate! If we further constrain the graph to be acyclic, then the maximum length of any path is bounded by m < n, and then this function does O(n^2) string comparisons for an acyclic graph with n edges.

HTNW

You should approximate the control flow of the recursive calls by setting up a recurrence relation. It's been about 30 years since I took college classes in Discrete Math, but generally you write something like pseudocode, just enough to see how many calls there are. In some cases just counting how many calls appear in the longest branch of the right-hand side is useful, but you generally need to substitute one expansion back in and from that derive a polynomial or power relationship.

JDługosz
  • I think I have to enter all the code. I just want to know, is it possible to calculate the time complexity when a function (recursive function) ends randomly? Contrary to what has happened here (stackoverflow.com/questions/13467674/…). That is, a rule can be found to terminate the recursive function. – simsim Mar 12 '21 at 15:18
  • Maybe... look at how radioactive decay (random) gives you a predictable half-life. If the recursions are terminated by a probability, you might get a converging series of fractions. – JDługosz Mar 12 '21 at 15:25