Picks \(K \in [1:N]\) such that \(P_{X_K} \approx P_{X \mid Y}\).
Encodes \(K\) using \(\approx \log K\) bits (a sketch of this selection step follows below).
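As an illustration, here is a minimal, hypothetical sketch of this selection step in the spirit of minimal random coding (Havasi et al., 2019). The function names, the importance-weight interface, and the use of a seeded NumPy generator as the common randomness are assumptions for this sketch, not the exact algorithm above.

```python
import numpy as np

def select_index(sample_prior, log_weight, n_samples, seed):
    """Pick K in [1:N] such that X_K is approximately distributed as P_{X|Y}.

    sample_prior: rng -> x, draws a candidate from the prior P_X.
    log_weight:   x -> log (dP_{X|Y} / dP_X)(x), the importance weight of x.
    The seeded generator plays the role of the common randomness shared
    with the decoder.
    """
    rng = np.random.default_rng(seed)
    candidates = [sample_prior(rng) for _ in range(n_samples)]
    log_w = np.array([log_weight(x) for x in candidates])
    probs = np.exp(log_w - log_w.max())
    probs /= probs.sum()                     # self-normalised importance weights
    K = int(rng.choice(n_samples, p=probs))  # index to be entropy-coded
    return K, candidates[K]
```

A decoder holding the same seed regenerates the same candidate list and recovers \(X_K\) from \(K\) alone; with \(N \approx 2^{I[X;Y]}\) candidates, \(P_{X_K}\) is close to \(P_{X \mid Y}\) and encoding \(K\) costs roughly \(I[X;Y]\) bits.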
4.3. Coding Efficiency
When common randomness \(S\) is available, there exists an algorithm such that (Li and El Gamal, 2018):
\[
I[X; Y] \leq \mathbb{H}[X \mid S] \leq I[X; Y] + \log (I[X; Y] + 1) + 4
\]
\(I[X; Y]\) can be finite even when \(\mathbb{H}[X]\) is infinite!
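To get a feel for how mild the overhead term is, the following quick check (a sketch assuming the bound is measured in bits, i.e. base-2 logarithms) tabulates the worst case for a few values of \(I[X;Y]\):

```python
import math

def worst_case_overhead_bits(mi_bits):
    """Overhead of the Li-El Gamal bound: log(I[X;Y] + 1) + 4, in bits."""
    return math.log2(mi_bits + 1) + 4

for mi in [1, 10, 100, 1000]:
    total = mi + worst_case_overhead_bits(mi)
    print(f"I[X;Y] = {mi:4d} bits -> codelength <= {total:.2f} bits")
```

For \(I[X;Y] = 100\) bits, the worst-case codelength is about \(110.7\) bits: the overhead grows only logarithmically in the mutual information.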
GPRS is a rejection sampler built on Poisson processes (see the sketch after this list)
Can be used for relative entropy coding
Has an optimally efficient variant for 1D, unimodal distributions
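For intuition, here is a minimal sketch of the basic Poisson-process rejection idea that GPRS refines. It implements plain thinning with a global density-ratio bound, not GPRS's optimal acceptance rule, and the function names and the bound \(\log M\) are illustrative assumptions.

```python
import numpy as np

def poisson_rejection_sample(log_ratio, sample_q, log_M, rng):
    """Accept the first mark of a thinned unit-rate Poisson process.

    log_ratio: x -> log (dP/dQ)(x);  sample_q: rng -> x ~ Q;
    log_M: a global upper bound on log_ratio.  Returns (K, x), where x has
    law P and the index K is geometric, so it can be entropy-coded --
    which is what makes the scheme usable for relative entropy coding.
    """
    t, k = 0.0, 0
    while True:
        k += 1
        t += rng.exponential()   # arrival time of the k-th process point
        x = sample_q(rng)        # mark each arrival with a proposal sample
        # Thinning: keep the arrival with probability (dP/dQ)(x) / M.
        if np.log(rng.uniform()) <= log_ratio(x) - log_M:
            return k, x
```

GPRS itself replaces the fixed bound \(M\) with an acceptance rule that depends on the arrival times \(t\) (accepting the first point under the graph of a suitably stretched density ratio), which is where its efficiency guarantees come from.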
10. References
10.1. References I
E. Agustsson and L. Theis. "Universally quantized neural compression". In NeurIPS 2020.
C. Blundell, J. Cornebise, K. Kavukcuoglu and D. Wierstra. "Weight uncertainty in neural networks". In ICML 2015.
E. Dupont, A. Golinski, M. Alizadeh, Y. W. Teh and A. Doucet. "COIN: Compression with implicit neural representations". arXiv preprint arXiv:2103.03123, 2021.
10.2. References II
G. F. "Greedy Poisson Rejection Sampling". In NeurIPS 2023, to appear.
G. F.*, S. Markou*, and J. M. Hernández-Lobato. "Fast relative entropy coding with A* coding". In ICML 2022.
D. Goc and G. F. "On Channel Simulation Conjectures". Unpublished.
10.3. References III
Z. Guo*, G. F.*, J. He, Z. Chen and J. M. Hernández-Lobato. "Compression with Bayesian Implicit Neural Representations". In NeurIPS 2023, to appear.
P. Harsha, R. Jain, D. McAllester, and J. Radhakrishnan. "The communication complexity of correlation". IEEE Transactions on Information Theory, vol. 56, no. 1, pp. 438–449, 2010.
M. Havasi, R. Peharz, and J. M. Hernández-Lobato. "Minimal Random Code Learning: Getting Bits Back from Compressed Model Parameters". In ICLR 2019.
10.4. References IV
J. He*, G. F.*, Z. Guo and J. M. Hernández-Lobato. "RECOMBINER: Robust and Enhanced Compression with Bayesian Implicit Neural Representations". Unpublished.
C. T. Li and A. El Gamal. "Strong functional representation lemma and applications to coding theorems". IEEE Transactions on Information Theory, vol. 64, no. 11, pp. 6967–6978, 2018.
10.5. References V
L. Theis and E. Agustsson. "On the advantages of stochastic encoders". arXiv preprint arXiv:2102.09270, 2021.
L. Theis, T. Salimans, M. D. Hoffman and F. Mentzer. "Lossy compression with Gaussian diffusion". arXiv preprint arXiv:2206.08889, 2022.