There is a well-known result stating that any comparison-based sorting algorithm has a worst-case time complexity of Ω(n log n). Selection sort and merge sort are two common examples, with time complexity O(n^2) and O(n log n) respectively.
Merge sort is also not an in-place sorting algorithm, much like radix sort (which is not comparison-based).
Intuitively, it would seem that these non-in-place sorting algorithms have a space complexity lower bound of Ω(n), because by definition they store a copy of at least part of the original data structure.
Indeed, the main non-in-place sorting algorithms I know (merge sort, bucket sort, radix sort) use Θ(n) or more extra space.
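For concreteness, here is a minimal sketch of a standard top-down merge sort in Python (my own illustration, not taken from any particular reference); the temporary list built in each merge step is what gives the usual version its Θ(n) extra space:

```python
def merge_sort(a):
    """Return a new sorted list; this version is not in place."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])

    # The merged output is a fresh list of length len(a).
    # This auxiliary buffer is the source of the Theta(n) extra space.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```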
However, it is conceivable that some sorting algorithm is not in place yet keeps its extra space below Θ(n), for example reaching O(log n).
Is there a known lower bound on the space complexity of these non-in-place algorithms?