I have seen that in most cases time complexity is related to space complexity, and vice versa. For example, in an array traversal:

for i=1 to length(v)
    print(v[i])
endfor

Here it is easy to see that the algorithm's time complexity is O(n), but it looks to me like the space complexity is also n (also represented as O(n)?).

My question: is it possible for an algorithm to have a different time complexity than space complexity?

  • thanks, this helped me understand some basic things in complexity – doniyor Nov 09 '13 at 16:07
  • I somewhere heard the quote: _"In finite time, one can write only to a finite amount of memory, but you need only very limited memory to iterate upon it forever"_ – Codor Feb 21 '17 at 09:31
  • @codor Yes, one frequently hears very silly things. – Andrea Asperti Feb 21 '17 at 10:42
  • @AndreaAsperti Thanks for the comment; but did I miss some irony here? Despite the admittedly nerdish wording, it is true, isn't it? – Codor Feb 21 '17 at 10:44
  • @codor No, as I explain in my answer. Loops of this kind can be detected: you KNOW you are looping, hence you can stop the computation. – Andrea Asperti Feb 21 '17 at 10:46
  • @AndreaAsperti Good point; however, did I understand your argument right: To check for repetition of state, one needs space exponential in the space the actual program iterates upon? – Codor Feb 21 '17 at 10:53
  • @codor time exponential in space. Yes. It is not a very tight bound, but still interesting. If your program works in log space, you know it has polynomial complexity in time. That is why log space is also considered as an alternative to P for "feasible" computations. – Andrea Asperti Feb 21 '17 at 10:57
  • @AndreaAsperti Yes, it is interesting. I just wanted to make sure not to misunderstand your statement as _"The halting problem can be solved"_ – Codor Feb 21 '17 at 11:13

7 Answers

The time and space complexities are not related to each other. Each describes how much time or space your algorithm takes, based on the input.

  • For example, when the algorithm has a space complexity of:

    • O(1) - constant - the algorithm uses a fixed (small) amount of space that doesn't depend on the input. For every input size the algorithm takes the same (constant) amount of space. This is the case in your example: the input itself is not taken into account, and what matters is the time/space of the print command.
    • O(n), O(n^2), O(log(n))... - these indicate that you create additional objects based on the length of your input. For example, creating a copy of each element of v, storing it in an array, and printing it afterwards takes O(n) space, as you create n additional objects.
  • In contrast, the time complexity describes how much time your algorithm consumes based on the length of the input. Again:

    • O(1) - no matter how big the input is, it always takes constant time - for example, just one instruction. Like

      function(list l) {
          print("i got a list");
      }
      
    • O(n), O(n^2), O(log(n)) - again, it's based on the length of the input. For example

      function(list l) {
          for (node in l) {
              print(node);
          }
      }
      

Note that both of the last examples take O(1) space, as you don't create anything. Compare them to

function(list l) {
    list c;
    for (node in l) {
        c.add(node);
    }
}

which takes O(n) space because you create a new list whose size depends linearly on the size of the input.

Your example shows that time and space complexity can be different. It takes v.length * print.time to print all the elements, but the space is always the same - O(1) - because you don't create additional objects. So yes, it is possible for an algorithm to have different time and space complexity, as they are not dependent on each other.
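
To make this concrete, here is a minimal runnable sketch of both cases in Python (my illustration, not part of the original answer; the function names are made up):

def print_all(v):
    # O(n) time, O(1) extra space: one print per element, nothing stored.
    for x in v:
        print(x)

def copy_all(v):
    # O(n) time AND O(n) extra space: the copy grows with the input.
    c = []
    for x in v:
        c.append(x)
    return c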

stan0
  • O(1) doesn't mean fixed or constant, it means bounded. For example, the function that takes a list and bubble sorts the first up-to-1 trillion elements is O(1) in time and space complexity, but the number of comparisons performed varies from 0 to 5e23. Even if you limit the input to large lists, the runtime varies from 1e12 to 5e23. – Paul Hankin Mar 22 '15 at 02:09
  • Instead of calling O(1) a constant, I'd say it represents that the space occupied is independent of the input size. – LoveMeow Apr 28 '16 at 22:29
  • @LoveMeow that's also false. I can remember up to 100 first elements of an array. Then the space occupied is O(1) but not independent of the input size. – John Dvorak May 11 '16 at 19:38
  • Agree with Aviad, this answer is wrong. Time and Space complexity are interrelated. – Andrea Asperti Feb 20 '17 at 22:30
  • @AndreaAsperti explain, please? – stan0 Feb 21 '17 at 08:30
  • added and extended answer – Andrea Asperti Feb 21 '17 at 09:17

Time and space complexity are different aspects of measuring the efficiency of an algorithm.

Time complexity deals with finding out how the computational time of an algorithm changes as the size of the input changes.

On the other hand, space complexity deals with finding out how much (extra) space the algorithm requires as the input size changes.

To calculate the time complexity of an algorithm, the best way is to check whether the number of comparisons (or computational steps) increases as we increase the size of the input; to calculate the space complexity, the best bet is to check whether the additional memory requirement of the algorithm also changes with the size of the input.

A good example is bubble sort.

Let's say you try to sort an array of 5 elements. In the first pass you compare the 1st element with the next 4 elements. In the second pass you compare the 2nd element with the next 3 elements, and you continue this procedure until you fully exhaust the list.

Now what happens if you try to sort 10 elements? In this case you start by comparing the 1st element with the next 9 elements, then the 2nd with the next 8 elements, and so on. In other words, if you have an N-element array you start by comparing the 1st element with N-1 elements, then the 2nd element with N-2 elements, and so on. This results in O(N^2) time complexity.

But what about space? When you sorted the 5-element or 10-element array, did you use any additional buffer or memory? You might say: yes, I used a temporary variable to make the swap. But did the number of variables change when you increased the size of the array from 5 to 10? No. Irrespective of the size of the input, you will always use a single variable to do the swap. This means that the size of the input has nothing to do with the additional space you will require, resulting in O(1), or constant, space complexity.
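
A sketch of that bubble sort in Python (my illustration; the explicit temp variable mirrors the single swap variable described above):

def bubble_sort(a):
    n = len(a)
    for i in range(n - 1):            # pass i compares the remaining pairs
        for j in range(n - 1 - i):    # N-1, then N-2, ... comparisons: O(N^2) time
            if a[j] > a[j + 1]:
                tmp = a[j]            # the single temporary variable;
                a[j] = a[j + 1]       # its count never grows with len(a),
                a[j + 1] = tmp        # so the extra space stays O(1)
    return a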

Now, as an exercise, research the time and space complexity of merge sort.

Prateek

First of all, the space complexity of this loop is O(1) (the input is customarily not included when calculating how much storage an algorithm requires).

So the question that I have is if it's possible that an algorithm has a different time complexity from its space complexity?

Yes, it is. In general, the time and the space complexity of an algorithm are not related to each other.

Sometimes one can be reduced at the expense of increasing the other. This is called a space-time tradeoff.
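
A small illustration of such a tradeoff (my example, not NPE's): precomputing a table of prefix sums spends O(n) extra space so that each range-sum query afterwards takes O(1) time instead of O(n).

def prefix_sums(v):
    # O(n) extra space, built once in O(n) time.
    p = [0]
    for x in v:
        p.append(p[-1] + x)
    return p

def range_sum(p, i, j):
    # Sum of v[i:j] in O(1) time, thanks to the precomputed table.
    return p[j] - p[i]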

NPE

There is a well-known relation between time and space complexity.

First of all, time is an obvious bound on space consumption: in time t you cannot reach more than O(t) memory cells. This is usually expressed by the inclusion

                            DTime(f) ⊆ DSpace(f)

where DTime(f) and DSpace(f) are the sets of languages recognizable by a deterministic Turing machine in time (respectively, space) O(f). That is to say, if a problem can be solved in time O(f), then it can also be solved in space O(f).

Less evident is the fact that space provides a bound on time. Suppose that, on an input of size n, you have at your disposal f(n) memory cells, comprising registers, caches and everything else. After having written these cells in all possible ways, you must eventually stop your computation, since otherwise you would reenter a configuration you already went through and start to loop. Now, on a binary alphabet, f(n) cells can be written in 2^f(n) different ways, which gives our time upper bound: either the computation stops within this bound, or you may force termination, since otherwise it will never stop.

This is usually expressed by the inclusion

                          DSpace(f) ⊆ DTime(2^(cf))

for some constant c. The reason for the constant c is that if L is in DSpace(f), you only know that it will be recognized in space O(f), while in the previous reasoning f was an actual bound.

The above relations are subsumed by stronger versions involving nondeterministic models of computation, which is the way they are frequently stated in textbooks (see e.g. Theorem 7.4 in Computational Complexity by Papadimitriou).
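
A toy Python sketch of the counting argument (entirely my construction, for an idealized deterministic machine whose whole state fits in k bits):

def runs_forever(step, initial_state, k):
    # step: state -> next state, or None when the machine halts.
    # With at most 2**k distinct states, any run longer than 2**k
    # steps must repeat a state and is therefore looping forever.
    state = initial_state
    for _ in range(2 ** k):       # time bound derived from the space bound
        state = step(state)
        if state is None:         # halted within the bound
            return False
    return True                   # must be in a loop: safe to cut off

# Example: a 3-bit counter that wraps around and never halts.
# runs_forever(lambda s: (s + 1) % 8, 0, 3)  ->  True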

Andrea Asperti
  • but how do you force termination at 2^f(n) if you don't actually know how many states you have visited? – Guillermo Mosse Jun 12 '17 at 19:47
  • @SenorBilly In fact, you do not need to count and you do not need to force termination. You _know_ the machine will halt within that time bound, because otherwise it would be in a loop (and it cannot loop, otherwise its space consumption would be undefined, since it is measured at the end of the computation). – Andrea Asperti Jul 27 '17 at 12:57

Yes, this is definitely possible. For example, sorting n real numbers requires O(n) space, but O(n log n) time. It is true that space complexity is always a lower bound on time complexity, as the time to initialize the space is included in the running time.
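
For instance, a minimal merge sort sketch in Python (my illustration): it runs in O(n log n) time but allocates O(n) auxiliary space for the merge buffers.

def merge_sort(a):
    # O(n log n) time: log n levels of recursion, O(n) merging work each.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged = []                   # O(n) auxiliary space for the merge
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged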

Mangara
  • Usually space complexity describes the extra space needed by the algorithm, and sorting can be done with O(1) (additional) space (so it doesn't require O(n) space). And I think you must be using the "extra space" definition of space complexity here, since otherwise you'd say that binary search of a sorted array needs O(n) space and runs in O(log n) time contradicting your claim about time complexity always dominating space complexity. – Paul Hankin Mar 22 '15 at 02:14

Sometimes they are related, and sometimes they are not. In fact, we sometimes use more space to get faster algorithms, as in dynamic programming (https://www.codechef.com/wiki/tutorial-dynamic-programming). Dynamic programming uses memoization or a bottom-up approach. The first technique uses memory to remember repeated solutions, so the algorithm does not need to recompute them and can instead fetch them from a list of solutions; the bottom-up approach starts with the small solutions and builds on them to reach the final one (a memoization sketch follows at the end of this answer). Here are two simple examples, one showing a relation between time and space and the other showing no relation. Suppose we want to find the sum of all integers from 1 to a given integer n. Code 1:

sum=0
for i=1 to n
    sum=sum+i
print sum

This code uses only 6 bytes of memory (i => 2, n => 2 and sum => 2 bytes), so the time complexity is O(n) while the space complexity is O(1). Code 2:

array a[n]
a[1]=1
for i=2 to n
    a[i]=a[i-1]+i
print a[n]

This code uses at least n*2 bytes of memory for the array, so the space complexity is O(n), and the time complexity is also O(n).
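
As a hedged sketch of the memoization idea mentioned above (my example, not from the original answer): a memoized Fibonacci stores O(n) previously computed results to bring the running time down from exponential to O(n).

from functools import lru_cache

@lru_cache(maxsize=None)              # memoization: O(n) cached results...
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)    # ...buy O(n) time instead of exponential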

Ahmad Hassanat

Space complexity describes the way in which the amount of storage space required by an algorithm varies with the size of the problem it is solving. It is normally expressed as an order of magnitude: e.g. O(N^2) means that if the size of the problem (N) doubles, then four times as much working storage will be needed.
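
A hypothetical illustration (mine, not Alam's): an n-by-n adjacency matrix is a typical structure with O(N^2) space complexity; doubling n quadruples the storage.

def adjacency_matrix(n, edges):
    # n * n cells of working storage: O(N^2) space.
    m = [[0] * n for _ in range(n)]
    for u, v in edges:
        m[u][v] = m[v][u] = 1
    return m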

Alam