On the Insensitivity of Bit Density to Read Noise in One-bit Quanta Image Sensors

11 Mar 2022  ·  Stanley H. Chan

The one-bit quanta image sensor is a photon-counting device that produces binary measurements, where each bit represents the presence or absence of a photon. In the presence of read noise, the sensor quantizes the analog voltage into binary bits using a threshold value $q$. The average number of ones in the bitstream is known as the bit density and is often a sufficient statistic for signal estimation. An intriguing phenomenon is observed when the quanta exposure is at unity and the threshold is $q = 0.5$: the bit density is completely insensitive to read noise as long as the noise level does not exceed a certain limit. In other words, the bit density stays constant, independent of the amount of read noise. This paper provides a mathematical explanation of the phenomenon by deriving the conditions under which it occurs. It is found that the insensitivity holds when certain symmetries of the underlying Poisson-Gaussian distribution hold.
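
A minimal numerical sketch of the phenomenon follows, assuming the standard quanta-sensor measurement model implied by the abstract: a photon count $K \sim \text{Poisson}(\theta)$ corrupted by additive Gaussian read noise $N \sim \mathcal{N}(0, \sigma^2)$, with the bit $B = \mathbf{1}\{K + N > q\}$. The bit density is then $p(\theta, q, \sigma) = \sum_{k=0}^{\infty} \frac{\theta^k e^{-\theta}}{k!}\,\Phi\!\left(\frac{k - q}{\sigma}\right)$, where $\Phi$ is the standard normal CDF. The function and parameter names below are illustrative, not from the paper.

```python
# Sketch under the assumed Poisson-Gaussian model (not the authors' code).
import numpy as np
from scipy.stats import norm, poisson

def bit_density(theta, q, sigma, kmax=60):
    """P(B = 1) with B = 1{K + N > q}, K ~ Poisson(theta), N ~ N(0, sigma^2)."""
    k = np.arange(kmax + 1)
    # Sum over the (truncated) Poisson support: P(K = k) * P(N > q - k).
    return float(np.sum(poisson.pmf(k, theta) * norm.sf(q - k, scale=sigma)))

# Unity exposure (theta = 1) with threshold q = 0.5.
for sigma in (0.05, 0.1, 0.2, 0.3, 0.5, 1.0, 2.0):
    print(f"sigma = {sigma:.2f} -> bit density = {bit_density(1.0, 0.5, sigma):.6f}")
```

Under this model the density stays pinned at $1 - e^{-1} \approx 0.632121$ for small $\sigma$ and only drifts once $\sigma$ grows large: at unity exposure the Poisson masses at $k = 0$ and $k = 1$ are equal, so the Gaussian probability spilling across $q = 0.5$ from each side cancels exactly, one concrete instance of the symmetry the paper formalizes.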
