Information & Probability Theory (ILV)
em. o. Univ.-Prof. Dr. Jürgen Pilz
|Semester of degree program|Semester 1|
|Mode of delivery|In-person and distance learning|
|Language of instruction|English|
The students know about the different definitions of probability and are familiar with the concepts of random variables and probability distributions.
They are able to model and compute conditional, total and inverse probabilities.
They know the most important discrete and continuous probability distributions, their applications, and their implementation in the statistical programming system R.
They are able to compute important characteristics of distributions such as means, variances, quantiles and can determine the distribution of transformed random variables.
They are familiar with the most important multivariate distributions and are able to compute conditional distributions, correlations and the distributions of transformed random vectors.
They have a command of the basic concepts of information theory, in particular entropy and divergence, and of the algebraic basics of data compression. They know the most important coding strategies.
They are familiar with the basic information measures, their multivariate extensions and know about their relations with the energy measures used in statistical physics.
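The basic information measures named above can be illustrated with a short sketch. The course itself works in R; the Python snippet below is used purely for illustration, with distributions chosen only as examples:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Binary entropy of a fair coin is exactly 1 bit.
print(entropy([0.5, 0.5]))                      # 1.0

# A biased coin carries less information per toss (about 0.469 bits).
print(entropy([0.9, 0.1]))

# Divergence of the biased coin from the fair coin (about 0.531 bits);
# for a uniform reference, D(p || u) = 1 - H(p).
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

The same quantities are available in R, e.g. via `-sum(p * log2(p))` on a probability vector.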
Basic knowledge of calculus and linear algebra
The module covers the following topics/contents:
- Different definitions of probability: classical approach, axiomatic approach, relative frequency
- Conditional and total probability, independence, inverse probability (Bayes Formula), random variables and probability distributions
- Basic discrete distributions: Binomial distribution, Poisson distribution, geometric distribution
- Basic continuous distributions: uniform distribution, Gaussian distribution, Student's t, exponential, gamma and lognormal distributions
- Characteristic quantities and functions of probability distributions: mean, variance, skewness, excess kurtosis, p-quantile, moment generating function, Jacobian transformation, law of large numbers and central limit theorem
- Multivariate distributions: multinomial distribution, multivariate Gaussian distribution, Dirichlet distribution, correlations and covariances, transformations and the Jacobian matrix
- Introduction to information theory: binary entropy, Hamming code, entropy and (Kullback-Leibler) divergence
- Algebraic concepts of data compression: Coding strategies, source coding, Huffman code, Lempel-Ziv code
- Shannon information measures and channel capacity: Gaussian channel, noisy-channel coding, error-correcting codes, hash codes, minimum description length, redundancy, conditional and mutual (multivariate) information, thermodynamic entropy, Helmholtz and Gibbs energy
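The inverse-probability topic (Bayes' formula) from the list above can be made concrete with a standard diagnostic-test calculation. The prevalence, sensitivity and false-positive rate below are illustrative numbers, not course data, and the snippet is given in Python for illustration only:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(D | +) via Bayes' formula: likelihood * prior / total probability."""
    # Total probability of a positive result over both hypotheses.
    evidence = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / evidence

# Rare condition (1% prevalence), good but imperfect test:
# the posterior is only 1/6, far from certainty.
p = posterior(prior=0.01, sensitivity=0.99, false_positive_rate=0.05)
print(round(p, 4))  # 0.1667
```

This is exactly the "inverse probability" computation students practise in the module: the prior is updated by the likelihood, normalised by the total probability of the evidence.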
Lecture notes as provided in the course (required)
J.K. Blitzstein and J. Hwang: Introduction to Probability. 2nd ed., Chapman and Hall/CRC 2019
D.J.C. MacKay: Information Theory, Inference, and Learning Algorithms. Cambridge University Press 2003
A. Ben-Naim: Information Theory. Part I: An Introduction to the Fundamental Concepts. World Scientific 2017
Lectures with integrated exercises
Continuous assessment (immanent examination character): presentation, assignment reports, written/oral exam