Assume that Trent has a binary random vector generator that creates vectors of length $n$, where each element is zero or one with equal probability, independently of the others. Trent creates a set of $M$ such i.i.d. random vectors $A=\{a_1,\dots,a_M\}$ and gives it to Alice. Trent also creates another set of $M$ such i.i.d. random vectors $B=\{b_1,\dots,b_M\}$ and gives it to Bob.
Alice is interested in a basis vector $e_a$ which has exactly one non-zero element. Bob is likewise interested in another basis vector $e_b$ with exactly one non-zero element. If $M$ is not large enough, the probability that $e_a$ lies in the span of the random vectors in $A$ is small, and the same holds for $e_b$ and $B$. In that case Trent creates $l$ more i.i.d. random vectors $V=\{v_1,v_2,\dots,v_l\}$ and broadcasts a single linear combination of them, $\mathbf{W}= \sum_{i=1}^l c_i v_i$ where $c_i \in \{0,1\}$ for $i=1,2,\dots,l$, to both Alice and Bob. When Alice and Bob receive this linear combination of the vectors in $V$, each uses it together with their own set of vectors to try to construct their desired basis vector. Assuming that all vector spaces and computations are over $\mathrm{GF}(2)$, what is the expected value $\mathbb{E}[l]$ of the minimum number of newly created random vectors such that both Alice and Bob can reconstruct their desired vectors, for any choice of $e_a$ and $e_b$? Is it even possible to cover every pair of basis vectors in this way using only one such transmission from Trent?
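For concreteness, whether a target vector lies in a GF(2) span can be tested by Gaussian elimination over GF(2). Here is a small sketch of that check (the representation of vectors as 0/1 lists is my own choice, not part of the problem statement):

```python
def in_span(vectors, target):
    """Test, over GF(2), whether `target` lies in the span of `vectors`.

    Vectors are lists of 0/1 entries; addition is XOR (mod-2).
    """
    basis = []  # pairs (pivot_index, reduced_vector); pivots are distinct
    for v in vectors:
        v = v[:]
        for pivot, b in basis:
            if v[pivot]:  # eliminate the existing pivot coordinate
                v = [x ^ y for x, y in zip(v, b)]
        for i, x in enumerate(v):
            if x:  # v is independent of the basis built so far
                basis.append((i, v))
                break
    # Reduce the target against the basis; it is in the span
    # exactly when it reduces to the zero vector.
    t = target[:]
    for pivot, b in basis:
        if t[pivot]:
            t = [x ^ y for x, y in zip(t, b)]
    return not any(t)
```

Alice's question is then whether `in_span(A + [W], e_a)` and `in_span(B + [W], e_b)` can be made true simultaneously by one choice of $\mathbf{W}$.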
This problem is related to the one in "Expected number of random binary vectors to make matrix of order $n$". So if only Alice were interested in a basis vector, the expected number of such additional vectors would be $n-M+E$, where $E \approx 1.6067$ is the Erdős–Borwein constant. I think this problem might be solvable using Markov chains, but I do not know how to model it. I appreciate any thoughts on it.
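That single-party estimate is easy to check numerically. The sketch below (my own simulation; `E` is hard-coded to the quoted approximation, and the $n-M+E$ formula ignores the small chance that the first $M$ vectors are already dependent) draws $M$ random vectors in $\mathrm{GF}(2)^n$, counts how many further uniform random vectors are needed before the set has full rank, and compares the sample mean with $n-M+E$:

```python
import random

E = 1.6067  # Erdős–Borwein constant, sum_{k>=1} 1/(2^k - 1), approx.

def reduce_gf2(r, basis):
    """Reduce the integer bitmask r against an XOR basis over GF(2)."""
    for b in basis:
        r = min(r, r ^ b)  # clears the top bit of b if it is set in r
    return r

def extra_vectors_until_full_rank(n, M, rng):
    """Draw M uniform random vectors in GF(2)^n, then count how many
    further uniform random vectors are needed to reach full rank n."""
    basis = []
    for _ in range(M):
        r = reduce_gf2(rng.getrandbits(n), basis)
        if r:
            basis.append(r)
    count = 0
    while len(basis) < n:
        count += 1
        r = reduce_gf2(rng.getrandbits(n), basis)
        if r:
            basis.append(r)
    return count

rng = random.Random(1)
n, M, trials = 16, 4, 2000
mean = sum(extra_vectors_until_full_rank(n, M, rng)
           for _ in range(trials)) / trials
print(f"simulated: {mean:.3f}, predicted n - M + E = {n - M + E:.3f}")
```

Vectors are encoded as integer bitmasks, so mod-2 addition is a single XOR; the sample mean lands very close to $n-M+E$ for these parameters.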