
January 14, 2026 – Tamás Takács, Imre Molnár, Viktor Varga – 10 min read

Deep Network Development

2025/26/1

This is the continuation of our master’s course on Deep Network Development (IPM-21FMIDNDEG), taught by me and my PhD colleague Imre Molnár at Eötvös Loránd University. This semester, we were joined by Viktor Varga, one of the department’s most distinguished researchers. As the largest and most popular AI master’s course at the faculty, it required significant adjustments to accommodate the growing number of students.

With a record-breaking 225 students enrolled this semester, the scale was beyond anything we had anticipated. Compulsory or not, we are grateful for every student’s engagement and attention. We worked hard to make the course as accessible and enjoyable as possible. This version builds upon our previous iteration, with refined materials and expanded content.

Please note that the materials may contain small mistakes, typos, or even implementation bugs. If you spot any, please let me know by email.


Lecture and Practice Content

The early lectures have been slowed down to allow more time on deep learning fundamentals and gradient-based optimization, helping students build a stronger intuition for how these methods differ from other learning paradigms.
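To make the gradient-based optimization idea concrete, here is a minimal sketch in plain Python: gradient descent fitting a one-parameter linear model to a hypothetical toy dataset (the data and learning rate are illustrative, not taken from the course materials).

```python
# Fit y = 2x by gradient descent on the mean-squared error.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs on the line y = 2x
w = 0.0    # single weight, initialized at zero
lr = 0.05  # learning rate

for _ in range(200):
    # d/dw of (1/n) * sum((w*x - y)^2) is (2/n) * sum((w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges to 2.0
```

Each step moves the weight against the gradient of the loss; this is the same mechanism, scaled up to millions of parameters, that trains deep networks.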

  • Week 1 – Lecture: Course introduction, course technical and administrative details.
  • Week 2 – Lecture: Linear Regression.
  • Week 2 – Practice: Life Expectancy calculation with regression.
  • Week 3 – Lecture: Logistic Regression.
  • Week 3 – Practice: Image Classification using CNNs.
  • Week 4 – Lecture: Overfitting and Hyperparameters.
  • Week 4 – Practice: Transfer Learning in PyTorch.
  • Week 5 – Lecture: MLPs.
  • Week 5 – Practice: Object Detection with YOLO.
  • Week 7 – Lecture: Image Classification, Convolutional Neural Networks, Transfer Learning.
  • Week 8 – Lecture: Autumn Break.
  • Week 9 – Lecture: Object Detection and Image Segmentation.
  • Week 10 – Lecture: NLP Basics and Recurrent Neural Networks.
  • Week 10 – Practice: The Attention Mechanism and Transformers.
  • Week 11 – Lecture: Transformers, Vision Transformers.
  • Week 11 – Practice: The Vision Transformer and Superpixel Tokenization.
  • Week 12 – Lecture: Deep Learning Tools for Computer Vision.
  • Week 12 – Practice: Depth Estimation and Optical Flow in PyTorch.
  • Week 13 – Lecture: Assignment 2 Defense.
  • Week 13 – Practice: Generative Modelling and Neural Rendering.
  • Week 14 – Lecture: Assignment 2 Defense.


Assignments

The assessment is divided into three components. Students must complete two smaller, individual assignments that contribute to the practice grade and do not require oral defense. A larger, in-depth assignment, also part of the practice grade, must be defended orally, and the defense contributes to the lecture grade as well.

After experimenting with Canvas quizzes in previous semesters, we returned to two written midterms for assessing theoretical knowledge—a format that proved to be a much better fit. The midterms are available under Week 7 and Week 13 practice materials.

  • Assignment 1: Image Colorization.
  • Assignment 2: Simplified Object Detection.
  • Assignment 3: Image Captioning.


Exams

The exam consists of two main parts and has a total duration of 2 hours. The first part is a 30-minute coding pre-exam, which must be completed successfully in order to proceed to the second part. This initial exercise involves building a simple PyTorch architecture and serves as a prerequisite for continuing the exam.
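As an illustration of the kind of task the coding pre-exam involves, here is a minimal sketch of a simple PyTorch architecture; the model, layer sizes, and input dimensions below are hypothetical examples, not the actual exam exercise.

```python
import torch
import torch.nn as nn

class SimpleMLP(nn.Module):
    """A small multilayer perceptron: linear -> ReLU -> linear."""

    def __init__(self, in_features=784, hidden=128, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

model = SimpleMLP()
x = torch.randn(4, 784)  # a batch of 4 flattened 28x28 inputs
logits = model(x)
print(logits.shape)      # torch.Size([4, 10])
```

Being able to assemble such a module, instantiate it, and run it on a dummy batch is roughly the level of fluency the pre-exam checks.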

The second part is a written, paper-based exam that tests the theoretical concepts covered throughout the course. It is also evaluated on a pass/fail basis. Students who pass both parts will receive their final course grade, calculated based on the weighted average of their lecture and practice performance. Here are some examples of past exams for reference:

  • Mock Exam 1 – Sample Exam 1.
  • Mock Exam 2 – Sample Exam 2.
  • Mock Exam 3 – Sample Exam 3.


Course Syllabus

Schedule

Lecture:

  • Schedule: Tuesdays 14:00 – 16:00
  • Location: South Building 0-822 (Hungarian: Déli Tömb 0-822, Mogyoródi József terem)

Practice:

  • Schedule: Tuesdays 16:00 – 18:00
  • Location: South Building 0-822 (Hungarian: Déli Tömb 0-822, Mogyoródi József terem)

Description

This course is designed to provide students with an in-depth exploration of Deep Learning, particularly focusing on Neural Network architectures. Throughout the semester, students will gain a comprehensive understanding of how Deep Neural Networks work, from the fundamental theory behind their design to practical implementation skills. The course primarily covers Supervised Deep Learning techniques and equips students with hands-on experience using PyTorch, a popular Deep Learning framework. By working through practices and assignments in PyTorch, students will learn to build, train, and optimize neural networks effectively.
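The build–train–optimize workflow mentioned above can be sketched in a few lines of PyTorch. This toy example, fitting y = 2x with a single linear layer, is illustrative only and is not taken from the course materials.

```python
import torch
import torch.nn as nn

# Toy dataset: 64 points on the line y = 2x.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2.0 * x

model = nn.Linear(1, 1)                                  # build
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # optimize
loss_fn = nn.MSELoss()

for epoch in range(200):                                 # train
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(round(model.weight.item(), 2))  # the learned weight approaches 2.0
```

The practices expand this skeleton step by step: richer datasets, deeper architectures, and more careful optimization.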

The course also emphasizes ethical considerations in AI development, ensuring that students not only learn the technical aspects of Deep Learning but also understand its broader impact on society.

Grading

Your final grade is determined by:

  • Submission and defense of the assignments and homework
  • Midterm grades
  • Passing the final exam

Assignment Defense

Each assignment defense consists of two parts:

  • Part 1: Code Defense – You will answer questions about the solution you have submitted.
  • Part 2: Theoretical Defense – You will answer questions related to the lecture material.

Grade Calculation

Your grades for the lecture and practice components are calculated as follows:

Lecture Grade:

Lecture = 0.35 × (M1 + M2) + 0.3 × D

Where:

  • M1, M2 – Midterm grades
  • D – Defense of the Assignment

  Midterm Score Range    Grade
  > 42                   5
  38 – 42                4
  33 – 37                3
  20 – 32                2
  < 20                   F

Practice Grade:

Practice = 0.2 × (H1 + H2) + 0.6 × A

Where:

  • H1, H2 – Homework grades
  • A – Assignment solution grade

Exam Eligibility and Final Grade

To be eligible for the final exam, you must achieve at least a grade of 2 in both Lecture and Practice.

If you pass the exam, your final grade is determined as:

Final Grade = (Lecture + Practice) / 2
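The grading formulas above can be combined into a short script. The scores used below are hypothetical examples, and since the syllabus does not specify a rounding policy, none is assumed here.

```python
def lecture_grade(m1, m2, d):
    # Lecture = 0.35 * (M1 + M2) + 0.3 * D
    return 0.35 * (m1 + m2) + 0.3 * d

def practice_grade(h1, h2, a):
    # Practice = 0.2 * (H1 + H2) + 0.6 * A
    return 0.2 * (h1 + h2) + 0.6 * a

def final_grade(lecture, practice):
    # Final Grade = (Lecture + Practice) / 2
    return (lecture + practice) / 2

lec = lecture_grade(m1=4, m2=4, d=5)    # 0.35 * 8 + 0.3 * 5 = 4.3
prac = practice_grade(h1=5, h2=5, a=4)  # 0.2 * 10 + 0.6 * 4 = 4.4
print(round(final_grade(lec, prac), 2)) # (4.3 + 4.4) / 2 = 4.35
```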

Prerequisites

  • Linear Algebra
  • Probability Theory
  • Programming Skills (for practices)

Tools and Frameworks

  • Programming Language: Python
  • Frameworks: PyTorch
  • Libraries: NumPy, Matplotlib, torchvision, torchaudio
  • Additional Tools: Google Colab

Learning Objectives

  • Understand the basics of Deep Learning
  • Understand and implement Neural Network architectures
  • Learn a popular Deep Learning framework (PyTorch)
  • Be able to use open-source Neural Network software

Recommended Reading

  1. Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.
  2. Paszke, A., et al. (2019). PyTorch: An Imperative Style, High-Performance Deep Learning Library.
  3. Hastie, T., Tibshirani, R., & Friedman, J. (2009). The Elements of Statistical Learning. Springer.
  4. Brownlee, J. (2019). Deep Learning for Computer Vision. Machine Learning Mastery.
  5. Montreal Declaration for Responsible AI.

For any questions related to the course material, please message me or my colleagues, Imre and Viktor. For inquiries regarding access to the AI Lab, please contact only Imre, Viktor, or the course owner, Kristian Fenech. Happy learning!



