Computationally Lightweight ML-Based Data Compression
Data Compression with INRs
Image from Dupont et al. [4]
computationally lightweight
short codelength
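To make this concrete, here is a minimal sketch (my own illustration, not code from [4]) of a COIN-style INR: a small sine-activated MLP mapping a pixel coordinate (x, y) to an RGB value, so the image is stored as the network's weights. The layer sizes and the w0 frequency scale are illustrative choices.

```python
import torch
import torch.nn as nn

class SirenLayer(nn.Module):
    # Sine-activated layer in the SIREN style used by COIN [4];
    # w0 scales the input frequencies.
    def __init__(self, in_dim, out_dim, w0=30.0):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.w0 = w0

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

# The "representation": a tiny coordinate network, (x, y) -> (R, G, B).
inr = nn.Sequential(
    SirenLayer(2, 32),
    SirenLayer(32, 32),
    nn.Linear(32, 3),
)
```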
Compress variational INRs!
Image from Blundell et al. [7]
💡 Gradient descent is the transform!
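A hedged sketch of the idea, assuming the `inr` network from the previous sketch: "encoding" an image is nothing but gradient descent on the pixel-wise MSE, so the optimiser plays the role that a fixed analytic transform (e.g. the DCT) plays in classical codecs. `H`, `W`, and `image` below are placeholder stand-ins.

```python
import torch

H, W = 64, 64                # placeholder image size
image = torch.rand(H, W, 3)  # stand-in for the image to be compressed

# Pixel coordinate grid in [-1, 1]^2 and flattened RGB targets.
ys, xs = torch.meshgrid(
    torch.linspace(-1, 1, H), torch.linspace(-1, 1, W), indexing="ij"
)
coords = torch.stack([ys, xs], dim=-1).reshape(-1, 2)
targets = image.reshape(-1, 3)

# "Encoding" = running gradient descent; the trained weights are the code.
opt = torch.optim.Adam(inr.parameters(), lr=2e-4)
for step in range(10_000):
    opt.zero_grad()
    loss = ((inr(coords) - targets) ** 2).mean()
    loss.backward()
    opt.step()
```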
Compress variational INRs!
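A minimal sketch of the variational step, assuming a Bayes-by-Backprop-style [7] mean-field Gaussian posterior over the INR weights: training trades distortion against the KL to a fixed standard Gaussian prior, and that KL is roughly the bit cost a relative entropy coding / channel simulation scheme pays to communicate a posterior sample. `num_weights` is a hypothetical size; `decode` and `distortion` in the final comment are placeholders.

```python
import torch
import torch.nn.functional as F

num_weights = 4_000  # hypothetical INR weight count
mu = torch.zeros(num_weights, requires_grad=True)           # posterior means
rho = torch.full((num_weights,), -3.0, requires_grad=True)  # pre-softplus stds
beta = 1e-4          # rate-distortion trade-off

def sample_weights():
    # Reparameterisation trick: w = mu + sigma * eps, eps ~ N(0, I).
    sigma = F.softplus(rho)
    return mu + sigma * torch.randn(num_weights)

def kl_to_prior():
    # KL( N(mu, sigma^2) || N(0, 1) ), summed over all weights.
    sigma = F.softplus(rho)
    return (-torch.log(sigma) + 0.5 * (sigma**2 + mu**2) - 0.5).sum()

# Per-step objective (decode/distortion are placeholders for the INR
# forward pass and the reconstruction error):
#   loss = distortion(decode(sample_weights())) + beta * kl_to_prior()
```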
Theory: What next?
We might not need a perfect solution: think of error-correcting codes (e.g., LDPC)
Exploit different types of structure
Duality between source and channel coding
Applications: What next?
Realism constraints for INR-based compression
More sophisticated coding distributions
Apply to different types of neural representations
Contributions
First channel simulation algorithm with runtime linear in the mutual information (see the codelength bound below for context)
Established more precise lower bounds on sampling-based channel simulation algorithms
Created a state-of-the-art INR codec
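For context on how the runtime result above sits next to the codelength side: as I recall it, the strong functional representation lemma of Li and El Gamal [2] guarantees common randomness W, independent of X, such that Z is a function of (X, W) and

```latex
\[
  H(Z \mid W) \;\le\; I(X; Z) + \log_2\bigl(I(X; Z) + 1\bigr) + 4,
\]
```

i.e. channel simulation already achieves a codelength within a logarithmic gap of the mutual information; the contribution above is about making the runtime, not just the codelength, scale gracefully with I(X; Z).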
References
[1] M. Careil, M. J. Muckley, J. Verbeek, and S. Lathuilière, "Towards image compression with perfect realism at ultra-low bitrates," in ICLR, 2024.
[2] C. T. Li and A. El Gamal, "Strong functional representation lemma and applications to coding theorems," IEEE Transactions on Information Theory, vol. 64, no. 11, pp. 6967–6978, 2018.
[3] E. Agustsson and L. Theis, "Universally quantized neural compression," in NeurIPS, 2020.
[4] E. Dupont, A. Golinski, M. Alizadeh, Y. W. Teh, and A. Doucet, "COIN: Compression with implicit neural representations," arXiv preprint arXiv:2103.03123, 2021.
[5] G. Flamich and L. Wells, "Some notes on the sample complexity of approximate channel simulation," to appear at the Learning to Compress workshop @ ISIT, 2024.
[6] D. Goc and G. Flamich, "On channel simulation with causal rejection samplers," to appear at ISIT, 2024.
[7] C. Blundell, J. Cornebise, K. Kavukcuoglu, and D. Wierstra, "Weight uncertainty in neural networks," in ICML, 2015.