NotebookLM overview – “Attention Is All You Need”
A high-level overview generated with Google NotebookLM, summarizing the original transformer paper and setting the stage for the deep dives into the math behind AI and deep learning.
AI • Math • Machine learning
I’m a developer documenting my journey to understand the math behind neural networks, transformers, and other machine learning models. Here you’ll find videos, slides, and notes you can reuse to study too.
Current topic: Vectors, shapes & matrix multiplication (linear algebra foundations)
A beginner-friendly walkthrough of vectors, matrices, shapes, and how matrix multiplication works – using the same math that shows up inside many modern AI models (including attention).
All videos and audio episodes from The AI Lab Journal in one place.
A playlist with audio-only overviews, ideal for listening while commuting or cooking.
A more hands-on session where we go through the math slowly, connecting vectors and matrices to what actually happens in modern AI models (including attention).
Slides, PDFs, and notes you can download and reuse. Think of this as the “lab notebook” for the series.
PDF slides covering scalars, vectors, matrices, dimensions, shapes, dot products, and the rules of matrix multiplication.
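If you'd rather poke at these ideas in code before (or after) reading the slides, here's a minimal NumPy sketch of the same concepts. The values are arbitrary placeholders, not examples taken from the slides; it just shows shapes, the dot product, and the inner-dimension rule of matrix multiplication.

```python
import numpy as np

# A vector is a 1-D array; its "shape" is just its length.
v = np.array([1.0, 2.0, 3.0])     # shape (3,)
w = np.array([4.0, 5.0, 6.0])     # shape (3,)

# Dot product: multiply element-wise, then sum -> a single scalar.
print(v @ w)                      # 1*4 + 2*5 + 3*6 = 32.0

# A matrix is a 2-D array; its shape is (rows, columns).
A = np.arange(6.0).reshape(2, 3)  # shape (2, 3)
B = np.arange(12.0).reshape(3, 4) # shape (3, 4)

# Matrix multiplication rule: the inner dimensions must match.
# (2, 3) @ (3, 4) -> (2, 4); each entry is a row-times-column dot product.
C = A @ B
print(C.shape)                    # (2, 4)
```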
Coming next: a visual and intuitive breakdown of Q @ Kᵀ, scaling by √dₖ, and turning the scores into attention weights with softmax, connecting these ideas to the linear algebra foundations from Topic 1.
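As a rough preview (this is only a minimal NumPy sketch, not the upcoming video: the shapes and random values are toy placeholders, and the softmax helper is written inline just for illustration), the whole pipeline fits in a few lines:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Toy example: 4 query and 4 key vectors of dimension d_k = 8 (random placeholders).
rng = np.random.default_rng(0)
d_k = 8
Q = rng.normal(size=(4, d_k))
K = rng.normal(size=(4, d_k))

# Q @ Kᵀ gives a 4x4 matrix of raw similarity scores,
# scaled by sqrt(d_k) to keep the values in a reasonable range.
scores = Q @ K.T / np.sqrt(d_k)

# Softmax turns each row of scores into attention weights that sum to 1.
weights = softmax(scores, axis=-1)
print(weights.sum(axis=-1))  # each row sums to ~1.0
```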
I’m Ronaldo, a developer who wants to be able to read and truly understand the math in AI research papers, not just run the code.
This project is my way of learning in public: every video and PDF here started as my own study notes, cleaned up so other people can reuse them. I started with the transformer paper “Attention Is All You Need”, but the goal is bigger: to build the math foundations for modern AI in general – linear algebra, probability, optimization, and how they show up in deep learning models.
If you’re also a developer who feels “okay with coding but shaky with math”, this lab is for you. We’ll go slowly, ask “stupid” questions, and build intuition step by step, with a focus on the math that really appears in AI and machine learning.