Artificial Neural Networks and Deep Learning (I) (ILV)

Course number M2.08760.11.131
Course code AI1
Semester of degree program Semester 2
Mode of delivery Presence and telecourse
ECTS credits 5.0
Language of instruction English

Students are familiar with the following areas of artificial neural networks and deep learning:

  • Motivation for neural networks
  • Data preprocessing
  • Simple and fully connected deep networks
  • Convolutional neural networks (CNN)
  • Recurrent neural networks (RNN)
  • Application of linear algebra, statistics, and optimization for artificial neural networks
They are able to identify and analyze different network architectures and know the most important development environments.
Students can explain the pros and cons of individual network types, their error behavior, and the most important sources of information in this very dynamically evolving field.
The students can independently develop, test, modify, and evaluate networks using appropriate, up-to-date tools.

Completion of the modules "Statistics", "Introduction to Machine Learning", "Unsupervised Learning", and "Supervised Learning"

ANN and DL form a very dynamic field; this description is therefore exemplary and reflects the state of knowledge at the beginning of 2020:
Motivation for Neural Networks
Data preprocessing, normalization
Simple and fully connected deep networks

  • Feedforward and backpropagation algorithms, hidden layers
  • Gradient-based learning
  • Capacity, under- and overfitting
  • Regularization and other methods against overfitting
  • Hyperparameters and validation
  • Prediction and classifier tasks
  • Principles of building a network algorithm
  • Optimization of the training of deep learning networks
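The feedforward/backpropagation and gradient-based learning topics above can be sketched in a few lines of NumPy. This is an illustrative sketch, not course material: the architecture (one ReLU hidden layer), the toy regression data, and the learning rate are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch: a fully connected network with one hidden layer,
# trained by plain gradient descent on a toy regression task.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))            # 64 samples, 2 features
y = X[:, :1] * X[:, 1:] + 0.1           # toy nonlinear target (assumed)

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)   # hidden -> output
lr, losses = 0.05, []

for step in range(500):
    # forward pass
    h = np.maximum(0.0, X @ W1 + b1)    # ReLU hidden layer
    pred = h @ W2 + b2
    losses.append(np.mean((pred - y) ** 2))

    # backward pass: chain rule applied layer by layer (backpropagation)
    g_pred = 2 * (pred - y) / len(X)
    g_W2, g_b2 = h.T @ g_pred, g_pred.sum(axis=0)
    g_h = g_pred @ W2.T
    g_h[h <= 0] = 0.0                   # derivative of ReLU
    g_W1, g_b1 = X.T @ g_h, g_h.sum(axis=0)

    # gradient descent update
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2
```

The training loss decreases over the iterations; in the course, the same mechanics are handled by automatic differentiation in up-to-date frameworks.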
Convolutional neural networks (CNN)
  • Motivation, convolution, pooling
  • Variants of the convolution function
  • Structured output, data types
  • Architectures, tensors as data types
  • Unsupervised features
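The two core CNN operations listed above, convolution and pooling, can be illustrated directly. A hedged sketch: the helper names are hypothetical, the kernel is an arbitrary example, and (as in deep-learning libraries) "convolution" is implemented as cross-correlation without kernel flipping.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation, the 'convolution' of CNNs."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # weighted sum over a local receptive field; the same kernel is
            # applied at every position (shift-invariance)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2x2(x):
    """Non-overlapping 2x2 max pooling (drops odd trailing rows/columns)."""
    h, w = x.shape[0] // 2 * 2, x.shape[1] // 2 * 2
    return x[:h, :w].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)
edge_kernel = np.array([[1.0, -1.0]])    # horizontal difference filter
feat = conv2d(image, edge_kernel)        # feature map, shape (6, 5)
pooled = max_pool2x2(feat)               # pooled map, shape (3, 2)
```

Production networks compose many such kernel/pooling stages and learn the kernel weights by backpropagation.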
Recurrent neural networks (RNN)
  • Unfolding of the calculation graph
  • Recurrent NN
  • Bidirectional NN
  • Vanishing and exploding gradients and the long-term memory problem
  • LSTM and GRU cells
  • Optimization of RNN
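The vanishing-gradient and long-term-memory topics above can be made concrete by unrolling a simple recurrent cell and tracking, via the chain rule, how the gradient of the hidden state with respect to the first input shrinks over time. This is a sketch under assumptions (small random weights, tanh activation, sequence length 30); it is this decay that motivates LSTM and GRU cells.

```python
import numpy as np

# Unroll a simple tanh RNN cell and measure the Jacobian d h_t / d x_0.
rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(4, 4))   # recurrent weights (assumed small)
U = rng.normal(scale=0.5, size=(4, 4))   # input weights
T = 30                                    # sequence length

h = np.zeros(4)
grad_norms = []                           # norm of d h_t / d x_0 per step
for t in range(T):
    x = rng.normal(size=4)
    h = np.tanh(W @ h + U @ x)
    d_tanh = np.diag(1.0 - h ** 2)        # derivative of tanh at this step
    if t == 0:
        jac = d_tanh @ U                  # first step: d h_0 / d x_0
    else:
        jac = d_tanh @ W @ jac            # chain rule through each time step
    grad_norms.append(np.linalg.norm(jac))
```

Each extra step multiplies the Jacobian by another tanh-derivative and weight factor, so its norm decays roughly geometrically; gated cells (LSTM/GRU) add an additive state path to avoid this.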
Application of linear algebra, statistics, and optimization for artificial neural networks
  • Linearity problem and piecewise linear mappings
  • Networks as compositions of matrices, hyperplanes
  • Graph of the learning function
  • Shift-Invariance and Toeplitz matrices
  • Application of the chain rule of differentiation for backpropagation (Derivatives of tensors)
  • Hyperparameter space
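The link between shift-invariance and Toeplitz matrices listed above can be shown in a few lines: a 1-D convolution is exactly multiplication by a banded Toeplitz matrix whose diagonals repeat the kernel weights. The kernel and input here are illustrative assumptions.

```python
import numpy as np

k = np.array([1.0, -2.0, 1.0])    # example kernel (second-difference filter)
n = 8                              # input length
x = np.arange(n, dtype=float) ** 2

# valid-mode convolution (as cross-correlation): y[i] = sum_j k[j] * x[i + j]
y_conv = np.array([np.dot(k, x[i:i + len(k)]) for i in range(n - len(k) + 1)])

# the equivalent Toeplitz matrix: each row is the kernel shifted right by one,
# which is precisely the shift-invariance of the operation
T = np.zeros((n - len(k) + 1, n))
for i in range(T.shape[0]):
    T[i, i:i + len(k)] = k
y_mat = T @ x                      # identical linear map
```

Because the input is x[i] = i², the second-difference filter returns the constant 2 everywhere, and both computations agree entry by entry.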

Lecture script as provided in the course (required)
Ian Goodfellow et al., Deep Learning, The MIT Press, 2016
Simon Haykin, Neural Networks and Learning Machines, Pearson, 2009
Francois Chollet, Deep Learning with Python, 2nd ed., Manning, 2021
Gilbert Strang, Linear Algebra and Learning from Data, Wellesley-Cambridge Press, 2019

Integrated course - teaching & discussion, demonstration, practical examples, group work in teams, homework

Continuous assessment: presentation, assignment reports, written/oral exam