ESE Ph.D. Thesis Defense: “Neural Compression: Estimating and Achieving the Fundamental Limits”

April 18, 2025 at 12:00 PM - 2:00 PM
Details
  • Organizer
    Electrical and Systems Engineering
    Phone: 215-898-6823
  • Venue
    Amy Gutmann Hall, Room 515, 3317 Chestnut Street
    Philadelphia, PA 19104

Neural compression, which pertains to compression schemes that are learned from data using neural networks, has emerged as a powerful approach for compressing real-world data. Neural compressors often outperform classical schemes, especially in settings where reconstructions that are perceptually similar to the source are desired. Despite this empirical success, the fundamental principles governing how neural compressors operate, how well they perform, and how they trade off performance against complexity are far less well understood than for classical schemes.

We aim to develop some of the fundamental principles of neural compression. We first introduce neural estimation methods that can estimate the theoretical rate-distortion limits of lossy compression for high-dimensional sources using techniques from generative models. These methods illustrate that recent neural compressors are sub-optimal. Next, we build on these insights to discuss neural compressors that approach optimality yet remain low-complexity through the use of lattice coding techniques. These compressors are shown to approach the rate-distortion limits on high-dimensional sources without incurring a significant increase in complexity. Finally, we develop low-complexity compressors for the rate-distortion-perception setting, where an additional perception constraint ensures that the source and reconstruction distributions are close in terms of a statistical divergence. These compressors combine lattice coding with shared randomness via dithering over the lattice cells, and provably achieve the fundamental rate-distortion-perception limits for the Gaussian source.
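The dithering idea mentioned above can be illustrated in one dimension. The sketch below is not the thesis's construction; it uses the scalar lattice Δ·Z in place of high-dimensional lattices, and simply shows the key property of subtractive dithered quantization: when encoder and decoder share a dither drawn uniformly over a lattice cell, the reconstruction error is uniform over that cell and statistically independent of the source (here, Gaussian).

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 0.5  # quantization step; the "lattice" here is delta * Z

def dithered_quantize(x, u, delta):
    """Subtractive dithered quantization.

    The encoder quantizes x + u to the nearest lattice point;
    the decoder subtracts the shared dither u.
    """
    q = delta * np.round((x + u) / delta)
    return q - u

x = rng.normal(size=100_000)                           # Gaussian source
u = rng.uniform(-delta / 2, delta / 2, size=x.shape)   # shared dither, uniform over a cell
x_hat = dithered_quantize(x, u, delta)
err = x_hat - x

# Regardless of the source distribution, the error is uniform on
# [-delta/2, delta/2], so its mean is ~0 and its variance ~delta**2 / 12.
print(err.mean(), err.var(), np.abs(err).max())
```

This source-independence of the error is what makes dithered schemes analytically tractable and is one reason shared randomness is useful under a perception constraint.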