Tutorial Lecturers
Prof. Dr.-Ing. Stephan ten Brink
Institute of Telecommunications, University of Stuttgart
The course is structured as follows:
- how it all began: the LDPC codes and serially concatenated codes of the 1960s
- re-discovery and innovation: extrinsic information and parallel concatenated codes of the early 1990s
- design of concatenated coding schemes with the EXIT chart
- iterative detection and decoding
- new trends: polar codes and spatially coupled codes
Biography
Stephan ten Brink joined the Institute of Telecommunications in July 2013. Prior to his appointment at the University of Stuttgart, he held various positions in industrial research and development.
Prior assignments include
- one year as a researcher at Bell Laboratories, Lucent Technologies in Swindon, U.K. (mobile wireless communications, GSM, UMTS)
- 5 years as researcher at Bell Laboratories in Holmdel, New Jersey, U.S.A. (channel coding and signal detection for multiple antenna communications)
- 7 years at Realtek Semiconductor Corp. in Irvine, California, U.S.A., as director of wireless ASIC development (WLAN, UWB baseband)
- 3 years at Bell Laboratories, Alcatel-Lucent in Stuttgart, Germany, as department head in wireless physical layer research (signal processing and channel coding for wireless and optical communication systems; LTE, long-haul)
The common theme across his appointments is digital modem design, particularly signal processing and channel coding for communications, to improve data rate, receiver sensitivity, and power efficiency.
He is a member of the VDE/ITG, and a Senior Member of the Â鶹´«Ã½Ó³» Communications and Information Theory Societies.
In October 2013, he was elected to the Board of Governors of the Â鶹´«Ã½Ó³» Information Theory Society.
Prof. Imre Csiszár
Alfréd Rényi Mathematical Institute of the Hungarian Academy of Sciences
Contemporary techniques of data security are primarily based on computational complexity, typically on the infeasibility of inverting certain functions via currently available mathematical tools and computing power. Future progress, for example quantum computers, may render these techniques insecure.
Information theoretic secrecy offers provable security even against adversaries of unlimited computational power. It requires some kind of correlated randomness available to the legal parties, and offers tools to use this resource to achieve perfect secrecy from an adversary who may have some but not complete information about this randomness.
In these lectures, two of the main subjects of information theoretic secrecy will be addressed: Secure communication over an insecure channel, and generating a common secret key for two or more parties relying upon public communication. Attention will be restricted to models requiring secrecy against a passive adversary who listens to the legal parties’ communication but does not interfere with it.
Starting with basic prerequisites, a state-of-the-art view of these subjects will be given. The emphasis will be on fundamental limits, i.e., on the optimal performance theoretically achievable. Some applications will also be mentioned.
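As one concrete illustration of such a fundamental limit (standard notation assumed here, not taken from the lecture material), the secrecy capacity of the discrete memoryless wiretap channel, established by Csiszár and Körner, can be written as

```latex
C_s \;=\; \max_{p(v,x)} \bigl[\, I(V;Y) - I(V;Z) \,\bigr], \qquad V \to X \to (Y,Z),
```

where Y is the legitimate receiver's channel output, Z is the eavesdropper's output, and V is an auxiliary random variable; in Wyner's degraded case the maximum is attained with V = X.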
Biography
Imre Csiszár was born in Miskolc, Hungary, on February 7, 1938. He received a diploma in mathematics from the L. Eötvös University, Budapest, in 1961, and the Doctor of Mathematical Science degree from the Hungarian Academy of Sciences in 1977.
Dr. Csiszár has been with the Mathematical Institute, now called the Alfréd Rényi Institute of Mathematics, of the Hungarian Academy of Sciences since 1961, and was Head of its Information Theory Group from 1968 until his retirement in 2008. He has also been Professor of Mathematics at the L. Eötvös University, Budapest, and at the University of Technology and Economics, Budapest (of which he is now Professor Emeritus), and has held visiting professorships at several universities in Europe and the US. His research interests are centered on information theory and its applications in mathematics, primarily in probability and statistics. He is coauthor of the books Information Theory: Coding Theorems for Discrete Memoryless Systems (New York: Academic Press, 1981; second edition Cambridge: Cambridge University Press, 2012) and Information Theory and Statistics: A Tutorial (Hanover: now Publishers, 2004).
Dr. Csiszár is a Regular Member of the Hungarian Academy of Sciences, Honorary President of the J. Bolyai Mathematical Society (the Hungarian mathematical society), and a Life Fellow of the Â鶹´«Ã½Ó³». He is the recipient of several academic awards, including the 1988 Prize Paper Award of the Â鶹´«Ã½Ó³» Information Theory Society, the Award for Interdisciplinary Research of the Hungarian Academy of Sciences in 1989, the Shannon Award of the Â鶹´«Ã½Ó³» Information Theory Society in 1996, the Bolzano Medal of the Czech Academy of Sciences in 2006, the Széchenyi Prize of the Hungarian Republic in 2007, and the Dobrushin Prize of the International Dobrushin Foundation in 2013.
Dr. Richard Durbin
Wellcome Trust Sanger Institute
DNA sequencing technology is increasing in throughput and decreasing in cost faster than Moore's law. Tens of thousands of whole genome sequences are being obtained now, with large centres producing petabytes of raw data. In the near future we expect millions of whole genome sequences. The data in the genome, while conceptually made of 46 strings of A, C, G, T "letters", has a rich structure originating from our evolution and genetic history that allows for efficient storage and search. I will talk about the basic informational structure of genome sequences, how we assemble them from current sequencing technology (like a big jigsaw puzzle with overlapping pieces), and how we can build methods for rapid search and inference of missing data. Recently there has been extensive use of Burrows-Wheeler transform (BWT) methods, related to enhanced suffix arrays, for self-indexed compressed storage of information, including extensions to the theory coming from the computational genomics field.
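To make the BWT idea concrete, here is a minimal Python sketch (an illustration only, not Dr. Durbin's code) that builds the transform of a toy DNA string by sorting all cyclic rotations; production tools such as bwa instead construct it from suffix arrays and wrap it in an FM-index.

```python
# Minimal Burrows-Wheeler transform (BWT) of a toy DNA string.
# Illustrative only: this sorts all rotations in O(n^2 log n);
# genomic tools build the BWT from suffix arrays instead.

def bwt(text: str, sentinel: str = "$") -> str:
    """Return the BWT of `text`, with `sentinel` marking the end of the string."""
    s = text + sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))   # all cyclic rotations, sorted
    return "".join(rotation[-1] for rotation in rotations)     # last column of the sorted rotations

if __name__ == "__main__":
    print(bwt("ACAACG"))   # same letters as the input, permuted so equal symbols tend to cluster
```

Because the transform tends to group identical symbols into runs, the output compresses well, and with auxiliary rank structures it supports substring search without decompression.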
Biography
Richard Durbin is a Senior Group Leader and Acting Head of Computational Genomics at the Wellcome Trust Sanger Institute in Cambridge, England. Following a first degree in mathematics, he received his PhD at the MRC Laboratory of Molecular Biology in Cambridge, and, after postdocs at Cambridge and Stanford, joined the Sanger Institute when it was founded in 1992.
Dr. Durbin's primary research interests are in human genetic variation and computational genomics. He is the leader or co-leader of the 1000 Genomes Project to characterize global genetic variation, the UK10K consortium to extend sequence-based genetics to samples with clinically relevant phenotypes, and, with Fiona Watt, the HipSci consortium to make a panel of human iPS cell lines and carry out cellular genetics studies on them. His group has also made theoretical and algorithmic contributions to biological sequence and evolutionary analysis, and developed important computational methodology and software for genomic data processing and analysis, leading to him being senior author on the initial bwa software and on the BAM and VCF file formats and associated toolkits. He also led the gorilla genome sequencing project, and led or helped lead the development of a number of bioinformatics database resources, including the Pfam database of protein families, WormBase (the model organism database for C. elegans), the TreeFam database of gene trees, and the Ensembl genome data resource. Dr Durbin is a Fellow of the Royal Society, a Member of EMBO, and Honorary Professor of Computational Genomics at the University of Cambridge.
Associate Prof. Young-Han Kim
University of California, San Diego
Cooperative relaying is an important component in wireless communication. Over the past four decades, several information-theoretic relaying techniques have been proposed. In this lecture, we discuss some of these techniques and their performance bounds.
We begin with partial decode-forward (a digital-to-digital interface) and compress-forward (an analog-to-digital interface), as well as the cutset bound on the capacity of the three-node relay channel. We then discuss how these classical results can be extended to general network and flow models, focusing on two recently proposed relaying techniques, noisy network coding (NNC) for multihop multiple access and distributed decode-forward (DDF) for multihop broadcast, as well as the cutset bound on the capacity region for a general multimessage relay network.
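For reference (standard notation assumed, not taken from the lecture), the cutset bound for the three-node relay channel with sender input X, relay input X1, relay output Y1, and receiver output Y states that the capacity C satisfies

```latex
C \;\le\; \max_{p(x,\,x_1)} \min\Bigl\{\, I(X, X_1;\, Y),\;\; I(X;\, Y, Y_1 \mid X_1) \,\Bigr\}.
```

The first mutual-information term corresponds to the multiple-access cut around the receiver and the second to the broadcast cut around the sender; decode-forward and compress-forward give lower bounds that coincide with this upper bound in special cases.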
Biography
Young-Han Kim is an associate professor in the Department of Electrical and Computer Engineering at the University of California, San Diego. Professor Kim's research primarily focuses on network information theory and the role of feedback in communication networks. More broadly, he is interested in statistical signal processing and information theory, with applications in communication, control, computation, networking, data compression, and learning.
Professor Kim received his B.S. degree with honors in Electrical Engineering from Seoul National University in 1996, where he was a recipient of the General Electric Foundation Scholarship. After a three-and-a-half-year stint as a software architect at Tong Yang Systems, Seoul, Korea, working on several industry projects such as developing the communication infrastructure for the then newly opening Incheon International Airport, he resumed his graduate studies at Stanford University, and received his Ph.D. degree in Electrical Engineering (with M.S. degrees in Statistics and in Electrical Engineering) in 2006. Professor Kim is a recipient of the 2008 NSF Faculty Early Career Development (CAREER) Award, the 2009 US-Israel Binational Science Foundation Bergmann Memorial Award, and the 2012 Â鶹´«Ã½Ó³» Information Theory Paper Award. He is currently on the Editorial Board of the Â鶹´«Ã½Ó³» Transactions on Information Theory, serving as an Associate Editor for Shannon theory. He is also serving as a Distinguished Lecturer for the Â鶹´«Ã½Ó³» Information Theory Society.
Associate Prof. Michael Langberg
State University of New York at Buffalo
The network information theory literature includes beautiful results describing codes and performance limits for many different networks. While common tools and themes are evident in the proofs of these results, the field is still very far from a unifying theory that not only explains how and why the optimal strategies for well-studied networks differ but also predicts solutions for networks to be studied in the future.
In this lecture, I will present one step towards the derivation of such a theory based on the paradigm of reduction. A reduction shows that a problem A can be solved by turning it into a different problem B and then applying a solution for B. While the notion of a reduction is extremely simple, the tool of reduction turns out to be incredibly powerful. The key to its power is the observation that reductive proofs are possible even when solutions to both A and B are unavailable. The paradigm of reduction can be used to show that a communication problem is "easy" if it can be mapped to other problems for which solutions are well understood. It can be used to show that a problem is "hard" if a notorious open problem can be reduced to it. It can be used to show that proof techniques and results from the study of one problem apply to other problems as well. It can be used to derive provable, quantifiable connections between problems that are intuitively very similar or problems that are apparently unrelated.
In this lecture I will give an overview of several reductive connections that have emerged recently in the literature between seemingly disparate information theoretic problems. These include connections between network coding and index coding, connections between secure and standard communication, connections between noisy and noiseless networks, and more.
Biography
Prof. Langberg received his B.Sc. in mathematics and computer science from Tel-Aviv University in 1996, and his M.Sc. and Ph.D. in computer science from the Weizmann Institute of Science in 1998 and 2003 respectively. Between 2003 and 2006, he was a postdoctoral scholar in the Electrical Engineering and Computer Science departments at the California Institute of Technology. Prof. Langberg is currently an associate professor in the Department of Electrical Engineering at the State University of New York at Buffalo.
Prof. Langberg's research addresses the algorithmic and combinatorial aspects of information in communication, management, and storage; focusing on the study of information theory, coding theory, network communication and network coding, big data in the form of succinct data representation, and probabilistic methods in combinatorics. Prof. Langberg is an associate editor for the Â鶹´«Ã½Ó³» Transactions on Information Theory and is the editor of the Â鶹´«Ã½Ó³» Information Theory Society Newsletter.
Associate Prof. Stephanie Wehner
Delft University of Technology
Title: INTRODUCTION TO QUANTUM INFORMATION THEORY
Quantum information differs fundamentally from classical information: quantum bits cannot be copied, and entanglement allows for much stronger than classical correlations. This short tutorial will give an introduction to the field of quantum information theory. In particular, we will learn about quantum bits, quantum channels and basic concepts of the field. We will also examine differences between classical and quantum information with the help of a number of examples.
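As a small concrete example of the objects mentioned above (standard textbook notation, not specific to this tutorial), a quantum bit is a unit vector in a two-dimensional complex Hilbert space, and the Bell state is the canonical example of entanglement:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \quad |\alpha|^2 + |\beta|^2 = 1,
\qquad
|\Phi^+\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr).
```

The no-cloning theorem states that no physical operation maps |ψ⟩|0⟩ to |ψ⟩|ψ⟩ for every state |ψ⟩, which is one precise sense in which quantum information cannot be copied.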
Biography
Stephanie Wehner is an Associate Professor at QuTech, Delft University of Technology. From 2010 to 2014 she was an Assistant and later Dean's Chair Associate Professor at the School of Computing at the National University of Singapore and the Centre for Quantum Technologies. Prior to coming to Singapore, she spent two years as a Postdoctoral Fellow at the California Institute of Technology. Stephanie is one of the founders of QCRYPT, presently the largest conference in quantum cryptography. Before entering academia, she worked in industry as a professional hacker. Her research interests include quantum cryptography, quantum information theory, and the application of information-theoretic techniques to physics.