Information Theory

Type: Lecture + exercise/tutorial
SWS: 4
Credit Points: 5
Homepage: lnt.ei.tum.de

Course Description / Comments

Review of probability theory.
Information theory for discrete and continuous variables: entropy, informational divergence, mutual information, inequalities.
Coding of memoryless sources: rooted trees with probabilities, Kraft inequality, entropy bounds on source coding, Huffman codes, Tunstall codes.
Coding of stationary sources: entropy bounds, Elias code for the positive integers, Elias-Willems universal source coding, hidden finite-memory sources.
Channel coding: memoryless channels, block and bit error probability, random coding, converse, binary symmetric channel, binary erasure channel, symmetric channels, real and complex AWGN channels, parallel and vector AWGN channels, source and channel coding.
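As a small taste of the source-coding topics above, the following sketch (not part of the official course material; a minimal illustration under the usual definitions) computes the entropy of a memoryless source, builds binary Huffman codeword lengths, and checks the Kraft inequality together with the entropy bounds H(X) <= L < H(X) + 1 on the average codeword length:

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability distribution p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p.

    Standard greedy construction: repeatedly merge the two least
    probable groups; every symbol in a merged group gains one bit.
    """
    heap = [(pi, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

# Dyadic example distribution (chosen for illustration), where the
# Huffman code is optimal with average length exactly H(X).
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)
L = huffman_lengths(p)
avg_len = sum(pi * li for pi, li in zip(p, L))
kraft = sum(2 ** -li for li in L)
print(H, avg_len, kraft)
```

For this dyadic distribution the average length meets the entropy lower bound exactly (1.75 bits) and the Kraft sum equals 1, i.e. the code is complete.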