We study the problem of generating an approximately i.i.d. string at the
output of a discrete memoryless channel using a limited amount of randomness at
its input in the presence of causal noiseless feedback. Feedback does not decrease
the channel resolution, that is, the minimum entropy rate required to achieve an
accurate approximation of an i.i.d. output string. However, we show that, at
least over a binary symmetric channel, feedback enables a resolvability exponent
(the exponential decay rate of the divergence between the output distribution
and the product measure) that is significantly larger than the best known
achievable resolvability exponent in a system without feedback. We show that
by employing a variable-length resolvability scheme that uses an average of $R$
coin-flips per channel use, the average divergence between the distribution of
the output sequence and the product measure decays exponentially fast in the
average length of the output sequence, with an exponent equal to $[R-I(U;V)]^+$,
where $I(U;V)$ is the mutual information developed across the channel.
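As an illustrative special case (assuming, purely for concreteness, a uniform binary input $U$ to a binary symmetric channel with crossover probability $p$, an assumption not made above), the mutual information across the channel is $I(U;V) = 1 - h_2(p)$, so the exponent reads
\[
[R - I(U;V)]^+ = \max\{\, R - (1 - h_2(p)),\, 0 \,\},
\qquad h_2(p) = -p\log_2 p - (1-p)\log_2(1-p).
\]
Thus any randomness rate $R$ exceeding $1 - h_2(p)$ yields a strictly positive exponent; for instance, $p = 0.11$ gives $1 - h_2(p) \approx 0.5$, so $R = 0.75$ yields an exponent of roughly $0.25$.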