
There is a well-known result stating that any comparison-based sorting algorithm requires Ω(n log n) time in the worst case. Selection sort and merge sort are two common examples, with time complexity O(n^2) and O(n log n) respectively.

Merge sort is also not an in-place sorting algorithm, much like radix sort (which is not comparison-based).
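To make the non-in-place property concrete, here is a minimal Python sketch of textbook merge sort (the function name and structure are my own illustration, not from any particular reference implementation); the merge step builds a new list of size n, which is exactly where the O(n) auxiliary space comes from:

```python
def merge_sort(a):
    """Return a sorted copy of a. Not in place: the merge step
    allocates a fresh list as large as the input."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged = []          # O(n) auxiliary storage for the merge
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```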

Intuitively, it would seem that these non-in-place sorting algorithms have a space complexity lower bound of Ω(n), because by definition they copy at least part of the original data structure.

Indeed, the main non-in-place sorting algorithms I know of (merge sort, bucket sort, radix sort) have space complexity Θ(n) or greater.

However, it is possible that some sorting algorithm is not in place but manages to keep its space complexity below Θ(n), for example reaching O(log n).

Is there a known lower bound on the space complexity of these non-in-place algorithms?

A. Darwin

2 Answers


You might be interested in external sorting, which can be viewed as loosely related to the study of sorting algorithms whose space complexity is much less than $O(n)$. In particular, we assume there is some external storage that does not count against the storage cost, which we can read and write but where each read/write is very slow, and some internal storage that is fast but counts against the space complexity.
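As a rough illustration of that model, here is a Python sketch of external merge sort (function names and the memory parameter are my own; the sorted runs stand in for data written back to the slow external storage, so only about `internal_memory` items would need to be resident at once):

```python
import heapq

def external_sort(input_items, internal_memory):
    """Sketch of external merge sort: split the input into runs that
    fit in internal memory, sort each run, then k-way merge the runs.
    In a real implementation the runs would live in files on external
    storage; here they are kept as lists purely for illustration."""
    runs = []
    buf = []
    for x in input_items:
        buf.append(x)
        if len(buf) == internal_memory:
            runs.append(sorted(buf))   # one run "written to disk"
            buf = []
    if buf:
        runs.append(sorted(buf))
    # k-way merge: heapq.merge streams the runs, keeping only one
    # frontier element per run in fast memory at a time
    return list(heapq.merge(*runs))
```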

D.W.

Assume you implemented a sort that takes f(n) extra space. Now take an array of n items, sort the first half using f(n/2) space, sort the second half the same way, and merge both halves. You start by using f(n/2) extra space; then, as you merge, free space opens up either in the merge area or at the start of the second half of the array.

Now you have a method taking f(n/2) space; use it to create another one with f(n/4) space. Then find out where the limit is, because you will have some overhead.
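One concrete instance of this space-halving idea is the classic half-buffer merge sort: each merge copies only the first half into a scratch buffer of size about n/2 and merges back into the array itself. This is my own Python sketch of that known technique, not code from the answer:

```python
def sort_with_half_buffer(a, lo, hi, buf):
    """Sort a[lo:hi] in place, using buf (size >= (hi - lo + 1) // 2)
    as the only auxiliary storage: n/2 extra space instead of n."""
    if hi - lo <= 1:
        return
    mid = (lo + hi) // 2
    sort_with_half_buffer(a, lo, mid, buf)   # recursive calls reuse buf
    sort_with_half_buffer(a, mid, hi, buf)
    k = mid - lo
    buf[:k] = a[lo:mid]          # only the first half is copied out
    i, j, out = 0, mid, lo
    while i < k and j < hi:
        # out never catches up with j, so unread elements of the
        # second half are never overwritten
        if buf[i] <= a[j]:
            a[out] = buf[i]; i += 1
        else:
            a[out] = a[j]; j += 1
        out += 1
    while i < k:                 # drain leftover buffered elements
        a[out] = buf[i]; i += 1; out += 1
    # any leftover a[j:hi] is already in its final position
```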

gnasher729