I have to write a rule as part of a small-step operational semantics (https://en.wikipedia.org/wiki/Operational_semantics).
Whenever this rule is applied, the next bit of an infinite bit string must be consumed. Therefore, in the premise of the rule, I have to express that the next bit of a (previously) fixed bit string is taken.
To represent the set of infinite (bit) strings, the notation $\{0,1\}^{\omega}$ is usually used.
So my first idea is to write the premise of the rule as follows: $b = nextBit(str) \quad str \in \{0,1\}^{\omega}$.
What do you think about that? In my opinion, it doesn't make explicit that the infinite string $str$ is chosen once and then fixed. Is something like
$ b = nextBit(str \rvert_{str \in \{0,1\}^{\omega}})$
reasonable, or is it too fuzzy/non-standard?
How would you express explicitly that $str$ is fixed and does not change during the run?
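For concreteness, here is a sketch of the kind of rule I have in mind. The configurations $\langle c \rangle$ and $\langle c' \rangle$ are hypothetical placeholders for whatever state my semantics actually uses; only the premise is the part in question:

```latex
% Sketch only: c, c' stand in for the actual configurations of my semantics.
\[
\frac{b = nextBit(str) \qquad str \in \{0,1\}^{\omega}}
     {\langle c \rangle \;\longrightarrow\; \langle c' \rangle}
\]
```

As written, nothing in this rule records that the same $str$ is reused across successive applications, which is exactly the problem I am trying to solve.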