
Can someone tell me how the average and worst case SPACE complexity of bucket sort is found?

Dreamer23
  • Possible duplicate of [What is the worst case complexity for bucket sort?](https://stackoverflow.com/questions/9792132/what-is-the-worst-case-complexity-for-bucket-sort) – zer00ne Apr 09 '19 at 14:18
  • [first google result of "bucket sort"](https://en.wikipedia.org/wiki/Bucket_sort#Worst-case_analysis) – SaiBot Apr 09 '19 at 14:18
  • They talk about time complexity and not space complexity. – Dreamer23 Apr 09 '19 at 14:46
  • Space and time complexity are calculated the same way; you're merely using a different resource. – Prune Apr 09 '19 at 16:59
  • Possible duplicate of [Big O, how do you calculate/approximate it?](https://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it) – Prune Apr 09 '19 at 17:02

1 Answer


Well, first off, we need to understand how bucket sort works:

  1. Create an array of k empty buckets
  2. Loop through the input array and place each element into its bucket
  3. Sort each of the non-empty buckets (typically with insertion sort)
  4. Visit the buckets in order and copy their elements back into the original array

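The steps above can be sketched in Python. This is a minimal version assuming the classic textbook setting of floats uniformly distributed in [0, 1); the function name and the choice of k are just for illustration:

```python
def bucket_sort(values, k=10):
    """Sort floats in [0, 1) using k buckets."""
    buckets = [[] for _ in range(k)]      # step 1: k empty buckets
    for v in values:                      # step 2: scatter into buckets
        buckets[int(v * k)].append(v)
    for b in buckets:                     # step 3: sort each bucket
        b.sort()
    result = []
    for b in buckets:                     # step 4: gather in bucket order
        result.extend(b)
    return result
```

Note that `int(v * k)` relies on the values being in [0, 1); for a general range you would first rescale each key into that interval.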
This makes bucket sort a good choice when the input is uniformly distributed over a range. The average time complexity is O(n + k), where n is the number of elements and k is the number of buckets. The worst-case time complexity is O(n^2): when many keys are close to each other they all land in the same bucket, and sorting that one oversized bucket dominates the running time. To keep buckets small we would effectively need a separate bucket for each distinct key.

Space complexity is a measure of the amount of working storage an algorithm needs, i.e. how much memory is required at any point during its execution. As with time complexity, we mostly care about how that need grows, in big-Oh terms, as the input size n grows. Bucket sort always stores all n elements plus k bucket headers, so with dynamically growing buckets (e.g. linked lists) the average and worst case space complexity is O(n + k). If instead each of the k buckets is preallocated with capacity for up to n elements, the worst case space complexity is O(nk).
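To make the O(n + k) accounting concrete, here is a small check, assuming buckets are plain Python lists so each element is stored exactly once (the helper name is just for illustration):

```python
def bucket_space(values, k=10):
    """Return (elements stored, bucket headers) for a bucket layout over [0, 1)."""
    buckets = [[] for _ in range(k)]
    for v in values:
        buckets[int(v * k)].append(v)
    stored = sum(len(b) for b in buckets)  # every element appears in exactly one bucket
    return stored, len(buckets)            # n elements + k buckets => O(n + k)
```

Even in the worst case where all values fall into one bucket, the totals stay at n stored elements and k headers; only preallocating each bucket to size n would push the bound to O(nk).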

I hope that helped!