Intro: This is an interview question I was asked and couldn't solve. A code solution (in any language) would be good, but an algorithm or pseudocode is also great.
The problem: Design an algorithm that solves the following:
You are given a function int getRand(int x) that takes an int x and returns a uniformly random int in the range [0, x) (0 inclusive, x exclusive). Each call to getRand() runs in O(1) time.
You are also given an array int[] arr of size k containing integers.
Write a function getRandUnique() that, when called, returns a random member of arr, such that after exactly k calls you will have obtained a full permutation of the members of arr (in other words, getRandUnique() returns a different member of arr on each call).
Each call to getRandUnique() has to be performed in O(1) time.
You are allowed to use/store global variables, etc.
E.g.: Assume arr = [2, 3, 5, 1]. A call to getRandUnique() will return any of 2, 3, 5, or 1, each with probability 1/4. A subsequent call to getRandUnique() will return one of the remaining 3 members, each with probability 1/3, and so on.
Attempt: I actually solved this myself (after much trial and error) and posted the solution Q&A-style. I would love to see other possible ideas/solutions. I will accept any working solution as the correct answer (I don't want to accept my own)!
Question: How can this be achieved under all the above restrictions?
Edit: I am now aware that this problem corresponds to the Fisher–Yates shuffle, although the requirements here are a bit different and stricter: the permutation has to be produced incrementally, one element per O(1) call.
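For reference, here is a minimal sketch (in Java) of that incremental idea: each call performs one Fisher–Yates swap step, so every call is O(1) after a one-time O(k) copy of the array. java.util.Random stands in for the given getRand() here, and the class/method names are just illustrative; this is not the solution I posted.

    import java.util.Random;

    public class RandUnique {
        private final int[] arr;        // private copy we permute in place
        private int remaining;          // count of members not yet returned
        private final Random rng = new Random();

        public RandUnique(int[] source) {
            this.arr = source.clone();  // one-time O(k) setup
            this.remaining = arr.length;
        }

        // Stand-in for the given getRand(x): uniform int in [0, x).
        private int getRand(int x) {
            return rng.nextInt(x);
        }

        // One Fisher–Yates step per call: O(1) time.
        // Assumes it is called at most k times, as the problem specifies.
        public int getRandUnique() {
            int i = getRand(remaining);      // index into the not-yet-returned prefix
            int picked = arr[i];
            arr[i] = arr[remaining - 1];     // move the last active member into the hole
            arr[remaining - 1] = picked;     // park the picked member outside the prefix
            remaining--;
            return picked;
        }
    }

Calling it on the example array from above:

    RandUnique r = new RandUnique(new int[]{2, 3, 5, 1});
    for (int i = 0; i < 4; i++) {
        System.out.println(r.getRandUnique());  // prints the four members in a uniformly random order
    }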