The algorithm is as follows:
a = rand;      % a random number between 0 and 1
b = a;
while b == a
    b = rand;  % redraw until b differs from a
end
Here rand is a function that returns a uniformly distributed random number between 0 and 1; assume it is the MATLAB function rand.
What is the time complexity of this algorithm?
It looks like the best-case and average (expected) complexities are $O(1)$, while the worst-case complexity is unbounded: nothing prevents rand from returning the same value as a on every draw, so there is no deterministic upper bound on the number of loop iterations.
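As a rough sanity check on the $O(1)$ expected cost, here is a short MATLAB sketch (the trial loop, the count variable, and the variable names are mine, not part of the original algorithm) that runs the procedure many times and reports the average number of while-loop iterations:

% Estimate the expected number of while-loop iterations over many trials.
trials = 1e5;
iterations = zeros(trials, 1);
for t = 1:trials
    a = rand;          % a random number between 0 and 1
    b = a;
    count = 0;
    while b == a
        b = rand;      % redraw until b differs from a
        count = count + 1;
    end
    iterations(t) = count;
end
fprintf('Average iterations: %.4f\n', mean(iterations));

With double-precision rand, a repeat of a is astronomically unlikely, so the printed average should be essentially 1, consistent with the $O(1)$ expected running time.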