This special issue will focus on the mathematical foundations of deep learning as well as applications across information science. Prospective authors are invited to submit original manuscripts on topics within this broad scope, including but not limited to:
- Information theoretic methods for deep learning
- Robustness for training and inference
- Understanding generalization in over-parametrized models
- Efficient and compressed model representations
- Deep generative models and inverse problems
- Large-scale efficient training of large models
- Non-convex optimization in deep learning
- Deep learning for source and channel coding
Guest Editors
Lead Guest Editor:
Alex Dimakis, UT Austin
Guest Editors:
Richard Baraniuk, Rice University
Sewoong Oh, University of Washington
Negar Kiyavash, EPFL
Rebecca Willett, University of Chicago
Submission Guidelines
Prospective authors must follow the IEEE Journal on Selected Areas in Information Theory guidelines regarding the manuscript and its format. For details and templates, please refer to the IEEE Journal on Selected Areas in Information Theory Author Information webpage. All papers should be submitted through ScholarOne according to the schedule below.
Important Dates
Manuscript Due:
Extended to 15 October 2019. No further extensions will be given.
Acceptance Notification:
15 March 2020
Final to Publisher:
5 April 2020
Expected Publication:
April/May 2020
Submission site: ScholarOne