AI • Math • Machine learning

Understand the math behind
modern AI, step by step.

I’m a developer documenting my journey to understand the math behind neural networks, transformers, and other machine learning models. Here you’ll find videos, slides, and notes you can reuse in your own studies.

Current topic: Vectors, shapes & matrix multiplication (linear algebra foundations)

Featured video

The AI Lab Journal

Topic 1 – Vectors, shapes & matrix multiplication

A beginner-friendly walkthrough of vectors, matrices, shapes, and how matrix multiplication works – using the same math that shows up inside many modern AI models (including attention).

Episodes

All videos and audio episodes from The AI Lab Journal in one place.

Video

NotebookLM overview – “Attention Is All You Need”

A high-level overview generated with Google NotebookLM, summarizing the original transformer paper to set the stage for the math deep dives in AI and deep learning.

Audio series

The AI Lab Podcast

A playlist with audio-only overviews, ideal for listening while commuting or cooking.

Study session

Topic 1 – Vectors, shapes & matrix multiplication

A more hands-on session where we go through the math slowly, connecting vectors and matrices to what actually happens in modern AI models (including attention).

Study materials

Slides, PDFs, and notes you can download and reuse. Think of this as the “lab notebook” for the series.

Topic 1

Topic 1 – Vectors, shapes & matrix multiplication

PDF slides covering scalars, vectors, matrices, dimensions, shapes, dot products, and the rules of matrix multiplication.

  • Scalar, vector, and matrix definitions
  • Shape notation and intuition
  • Matrix multiplication rules & inner dimensions
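The shape rules the slides cover can be sketched in a few lines of NumPy (an illustrative snippet, not taken from the slides themselves):

```python
import numpy as np

# A scalar, a vector, and a matrix, with their shapes
s = 3.0                      # scalar: shape ()
v = np.array([1.0, 2.0])     # vector: shape (2,)

# Matrix multiplication: the inner dimensions must match.
# (2, 3) @ (3, 4) -> (2, 4); the 3s "cancel" in the middle.
A = np.ones((2, 3))
B = np.ones((3, 4))
C = A @ B
print(C.shape)   # (2, 4)

# Dot product of two vectors: sum of elementwise products
print(np.dot(v, v))   # 1*1 + 2*2 = 5.0
```

Trying `np.ones((2, 3)) @ np.ones((2, 3))` instead raises a shape error, which is exactly the inner-dimension rule at work.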
Soon

Topic 2 – Q, K, V & attention scores

Coming next: a visual and intuitive breakdown of Q @ Kᵀ, scaling by √dₖ, and turning scores into attention weights with softmax, connecting these ideas to the linear algebra foundations from Topic 1.
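As a preview, that pipeline (Q @ Kᵀ, scale by √dₖ, softmax) can be sketched in NumPy. This is only a minimal illustration of the idea, not the Topic 2 material; the function name and shapes are assumptions for the example:

```python
import numpy as np

def attention_weights(Q, K):
    """Turn query/key matrices into attention weights.

    Q: (seq_len, d_k) queries
    K: (seq_len, d_k) keys
    Returns a (seq_len, seq_len) matrix whose rows sum to 1.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # scaled dot products
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    exp = np.exp(scores)
    return exp / exp.sum(axis=-1, keepdims=True)          # softmax per row

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
W = attention_weights(Q, K)
print(W.shape)         # (4, 4)
print(W.sum(axis=-1))  # each row sums to 1
```

Note how the shapes follow the Topic 1 rules: (4, 8) @ (8, 4) gives a (4, 4) score matrix, one attention score per query–key pair.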

About The AI Lab Journal

I’m Ronaldo, a developer who wants to be able to read and truly understand the math in AI research papers, not just run the code.

This project is my way of learning in public: every video and PDF here started as my own study notes, cleaned up so other people can reuse them. I started with the transformer paper “Attention Is All You Need”, but the goal is bigger: to build the math foundations for modern AI in general – linear algebra, probability, optimization, and how they show up in deep learning models.

If you’re also a developer who feels “okay with coding but shaky with math”, this lab is for you. We’ll go slowly, ask “stupid” questions, and build intuition step by step, with a focus on the math that really appears in AI and machine learning.

How to follow the journey

  • Start with Topic 1 – Vectors, shapes & matrix multiplication (linear algebra foundations).
  • Subscribe to the YouTube channel for new episodes.
  • Revisit this page for updated PDFs and future topics.
Subscribe on YouTube