
[Deadline extended to March 28] "Learn to Compress & Compress to Learn" Workshop @ ISIT 2025
This workshop explores the dual themes of learning-based compression and using compression as a tool for learning tasks. We also welcome recent works that have been published or are currently under review elsewhere.
Mar 14, 2025

The submission deadline for the "Learn to Compress & Compress to Learn" Workshop @ ISIT 2025 has been extended to March 28, 2025. For more detailed information, please see the workshop page.

This workshop aims to unite experts from machine learning, computer science, and information theory to delve into the dual themes of learning-based compression and using compression as a tool for learning tasks.

The program will feature invited talks from:

* (University of Pennsylvania)

* (University of Cambridge)

* (Granica)

* (Apple)

* (Texas A&M University)

* (Chan Zuckerberg Initiative)

We invite researchers from machine learning, compression, and related fields to submit their latest work to the workshop. We welcome submissions of recent work that has been presented, published, or is currently under review elsewhere, provided the authors opt out of publishing their paper on IEEE Xplore.

All accepted papers will be presented as posters during the poster session. Some papers will also be selected for spotlight presentations. Topics of interest include but are not limited to:

  • "Learn to Compress” – Advancing Compression with Learning

    • Learning-Based Data Compression:New techniques for compressing data (e.g., images, video, audio), model weights, and emerging modalities (e.g., 3D content and AR/VR applications).

    • Efficiency for Large-Scale Foundation Models:Accelerating training and inference for large-scale foundation models, particularly in distributed and resource-constrained settings

    • Theoretical Foundations of Neural Compression:Fundamental limits (e.g., rate-distortion bounds), distortion/perceptual/realism metrics, distributed compression, compression without quantization (e.g., channel simulation, relative entropy coding), and stochastic/probabilistic coding techniques.

  • "Compress to Learn” – Leveraging Principles of Compression to Improve Learning

    • Compression as a Tool for Learning:Leveraging principles of compression and source coding to understand and improve learning and generalization.

    • Compression as a Proxy for Learning:Understanding the information-theoretic role of compression in tasks like unsupervised learning, representation learning, and semantic understanding.

    • Interplay of Algorithmic Information Theory and Source Coding:Exploring connections between Algorithmic Information Theory concepts (e.g., Kolmogorov complexity, Solomonoff induction) and emerging source coding methods.
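As a small illustration of the rate-distortion bounds in the topics above, the classical result for a Gaussian source under squared-error distortion is R(D) = ½ log₂(σ²/D) bits per sample. A minimal Python sketch (illustrative only, not tied to any submission):

```python
import math

def gaussian_rate_distortion(sigma2: float, d: float) -> float:
    """Rate-distortion function R(D), in bits per sample, of a
    zero-mean Gaussian source with variance sigma2 under
    mean-squared-error distortion D."""
    if d >= sigma2:
        # If the distortion budget exceeds the source variance,
        # describing the source with zero bits (output the mean) suffices.
        return 0.0
    return 0.5 * math.log2(sigma2 / d)

# A unit-variance Gaussian compressed to MSE 0.25 requires 1 bit/sample.
print(gaussian_rate_distortion(1.0, 0.25))  # → 1.0
```

No learned compressor can beat this curve for a Gaussian source, which is why it serves as a common baseline for evaluating neural compression schemes.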


Important Dates

  • Paper submission deadline: [Extended] March 28, 2025 (11:59 PM AoE, Anywhere on Earth).
  • Decision notification: April 18, 2025
  • Camera-ready paper deadline: May 1, 2025
  • Workshop date: June 26, 2025

Organizing Committee

* (NYU)

* (University of Cambridge / Imperial College London)

* (Imperial College London)

* (NYU)