AI and Data Science:

Understanding Transformers

Explore the power of transformers in AI with this hands-on 2-day course. Understand the theory behind neural networks and generative AI.

This 2-day course on the transformer architecture dives into the theory behind this powerhouse of modern AI as well as its application use cases. At the beginning of the course we build up an intuition for neural networks and generative AI in general before turning to the deeper details of transformers, so that participants gain a gradual and intuitive understanding of the AI landscape overall.

After covering the theoretical aspects, the practical task is to build a mini-ChatGPT system from scratch in a bottom-up manner, implementing all the building blocks in code.
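
For a flavour of what such a building block looks like, the sketch below implements single-head scaled dot-product self-attention in plain NumPy. It is a minimal illustration with made-up array names and shapes, not the actual course code.

    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax along the given axis.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(x, Wq, Wk, Wv):
        # x: (seq_len, d_model) token embeddings; Wq/Wk/Wv: (d_model, d_head) projections.
        q, k, v = x @ Wq, x @ Wk, x @ Wv           # queries, keys, values
        scores = q @ k.T / np.sqrt(k.shape[-1])    # how strongly each token attends to every other token
        weights = softmax(scores, axis=-1)         # attention weights sum to 1 per query
        return weights @ v                         # weighted mix of the values

    # Tiny example with random data: 5 tokens, 16-dim embeddings, 8-dim head.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 16))
    Wq, Wk, Wv = (rng.normal(size=(16, 8)) for _ in range(3))
    print(self_attention(x, Wq, Wk, Wv).shape)     # (5, 8)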

Learning goals

Day 1: Theoretical session

  • Introduction to neural nets in general
  • Introduction to generative AI / genAI
  • Theory and applications of transformers

Day 2: Practical session

  • Build all the components of a transformer that we touched on in Day 1
  • Assemble a complete transformer from these building blocks
  • Set up a mini-ChatGPT system that generates text by predicting the next character (see the sketch below)
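
To make "predicting the next character" concrete, here is a deliberately tiny sketch that generates text one character at a time from bigram counts. The training text is made up for illustration; in the course, this simple lookup table is replaced by the transformer assembled from the building blocks above.

    import random
    from collections import defaultdict, Counter

    text = "hello world, hello transformers"   # toy training text, made up for illustration

    # Count, for each character, which characters tend to follow it.
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1

    def next_char(c):
        # Sample the next character in proportion to how often it followed c in the text.
        options = counts.get(c)
        if not options:
            return random.choice(text)
        chars, freqs = zip(*options.items())
        return random.choices(chars, weights=freqs, k=1)[0]

    # Generate 40 characters, one prediction at a time.
    out = "h"
    for _ in range(40):
        out += next_char(out[-1])
    print(out)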

Course date

Register now: November 17–18, 2025

For more information on how to register, please follow the link on the course date above (the link will be published in summer 2025).

Prerequisites

Target group

The course targets a heterogeneous audience with varying levels of AI knowledge, including researchers who may or may not have had prior exposure to AI but want to learn about the inner workings of transformers in order to apply them in their research.

This course is free of charge.
