Gergely Flamich

PhD Student in Machine Learning, CBL, University of Cambridge. Curriculum Vitae


Computational and Biological Learning Lab

Dept. of Engineering

University of Cambridge

Trumpington Street, Cambridge CB2 1PZ, UK

Hello there! I’m Gergely Flamich (pronounced gher-gey flah-mih, but in English I usually go by Greg), and I’m originally from Vác, Hungary. I have been a PhD student in Advanced Machine Learning at the Computational and Biological Learning Lab since October 2020, supervised by José Miguel Hernández-Lobato. I hold an MPhil degree in Machine Learning and Machine Intelligence from the University of Cambridge and a Joint BSc Honours degree in Mathematics and Computer Science from the University of St Andrews.

My research focuses on the theory of relative entropy coding/channel simulation and its application to neural data compression. Relative entropy coding algorithms allow us to efficiently encode a random sample from both discrete and continuous distributions, and they are a natural alternative to quantization and entropy coding in lossy compression codecs. Furthermore, they bring unique advantages to lossy compression once we go beyond the standard rate-distortion framework: they allow us to design optimally efficient artefact-free/perfectly realistic lossy compression codecs using generative models and perform differentially private federated learning with optimal communication cost. Unfortunately, relative entropy coding hasn’t seen widespread adoption, as all current algorithms are either too slow or have limited applicability.

Hence, my current work addresses this issue by developing fast, general-purpose coding algorithms, such as A* coding and greedy Poisson rejection sampling, and by providing mathematical guarantees on their coding efficiency and runtime. In addition to this theoretical work, I am also interested in applying relative entropy coding algorithms to neural compression, utilizing generative models such as variational autoencoders and implicit neural representations.
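To give a flavour of the core idea, here is a minimal sketch of rejection-sampling-based channel simulation in general, not of any of the specific algorithms named above. The encoder and decoder share a pseudo-random proposal stream (via a common seed, a stand-in for shared randomness); the encoder runs rejection sampling against the target distribution and transmits only the index of the accepted proposal, which the decoder uses to replay the stream. All function names and the bounded density-ratio assumption are illustrative.

```python
import numpy as np

def rejection_channel_simulation(q_pdf, p_sample, p_pdf, M, seed=0):
    """Encode a sample from target density q using proposals from p.

    Both sides share the PRNG seed, so transmitting only the index k of
    the accepted proposal suffices. Assumes q(x)/p(x) <= M everywhere.
    (Illustrative sketch; not A* coding or greedy Poisson rejection
    sampling, which achieve much better efficiency guarantees.)
    """
    rng = np.random.default_rng(seed)
    k = 0
    while True:
        x = p_sample(rng)                       # shared proposal stream
        u = rng.uniform()
        if u < q_pdf(x) / (M * p_pdf(x)):       # standard accept test
            return k, x                         # transmit k; x is the sample
        k += 1

def decode(k, p_sample, seed=0):
    """Decoder: replay the shared proposal stream and pick sample k."""
    rng = np.random.default_rng(seed)
    x = None
    for _ in range(k + 1):
        x = p_sample(rng)
        rng.uniform()                           # keep the PRNG in sync
    return x

# Toy usage: target q is Uniform[0, 0.5), proposal p is Uniform[0, 1),
# so the density ratio is bounded by M = 2.
k, x = rejection_channel_simulation(
    q_pdf=lambda x: 2.0 if x < 0.5 else 0.0,
    p_sample=lambda rng: rng.uniform(),
    p_pdf=lambda x: 1.0,
    M=2.0,
)
assert decode(k, lambda rng: rng.uniform()) == x
```

Since only the integer index crosses the channel, it can be entropy coded; the appeal of relative entropy coding is that, with carefully designed samplers, the expected code length stays close to the relative entropy between the target and the proposal.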

From July to December 2022, I worked with Lucas Theis as a Student Researcher at Google Brain, where we developed adaptive greedy rejection sampling and bits-back quantization.

Judit, my wonderful sister, is an amazing painter. Check out her work on Instagram!

selected publications

  1. Getting Free Bits Back from Rotational Symmetries in LLMs
    Jiajun He, Gergely Flamich, and José Miguel Hernández-Lobato
    2024
    https://arxiv.org/abs/2410.01309
    Oral presentation at the Compression Workshop @ NeurIPS 2024
  2. Greedy Poisson Rejection Sampling
    Gergely Flamich
    In Advances in Neural Information Processing Systems, 2023
  3. Compression with Bayesian Implicit Neural Representations
    Zongyu Guo*, Gergely Flamich*, Jiajun He, and 2 more authors
    In Advances in Neural Information Processing Systems, 2023
    Received Spotlight
  4. Adaptive Greedy Rejection Sampling
    Gergely Flamich and Lucas Theis
    In 2023 IEEE International Symposium on Information Theory (ISIT), 2023
  5. Compressing Images by Encoding Their Latent Representations with Relative Entropy Coding
    Gergely Flamich*, Marton Havasi*, and José Miguel Hernández-Lobato
    In Advances in Neural Information Processing Systems, 2020