
(2025/2026) Artificial Neural Networks and Deep Learning

These notes are based on the books required for the course.

You can view/download the PDF here. In the notes folder, you can also see the source code.

To report any issue, please use the repository's Issues section.

Course Syllabus

According to the official course syllabus:

  1. From the Perceptron to Neural Networks and the Feedforward architecture
  2. Backpropagation and neural network training algorithms, e.g., AdaGrad, Adam, etc.
  3. Best practices in neural network training: overfitting and cross-validation, stopping criteria, weight decay, dropout, data resampling and augmentation.
  4. Image Classification problem and Neural Networks
  5. Recurrent Neural Networks and other relevant architectures such as (Sparse) Neural Autoencoders
  6. Theoretical results: Neural Networks as universal approximation tools, vanishing and exploding gradients, etc.
  7. Introduction to the Deep Learning paradigm and its main differences with respect to classical Machine Learning
  8. Convolutional Neural Networks (CNN) architecture
  9. The breakthrough of CNN and their interpretation
  10. CNN training and data-augmentation
  11. Structural learning, Long Short-Term Memories (LSTMs), and their applications to text and speech
  12. Autoencoders and data embedding, word2vec, variational autoencoders
  13. Transfer Learning for pre-trained Deep models
  14. Extended models, including Fully Convolutional CNNs, networks for image segmentation (U-Net), and object detection (e.g., R-CNN, YOLO)
  15. Generative Models (e.g., Generative Adversarial Networks)