Prerequisites
Python programming (loops, functions, classes). Basic linear algebra (vectors, matrices, dot products). No prior ML experience required; Chapter 0 covers foundations.
Focus: foundations, using LLM APIs, and building basic agents. Students finish the course able to build and deploy LLM-powered applications. The pathway spends the first five weeks on foundations because undergraduates typically lack exposure to attention mechanisms and tokenization; skipping this material would leave them unable to debug prompt failures or understand why a model generates unexpected output. The second half moves to the applied chapters (APIs, RAG, agents) because the goal is practical competency.
14-Week Syllabus
| Week | Topics | Lab / Assignment |
|---|---|---|
| 1 | ML and PyTorch Foundations | Build and train an image classifier in PyTorch |
| 2 | NLP and Text Representation | Build a TF-IDF search engine |
| 3 | Tokenization and Subword Models | Train a BPE tokenizer from scratch |
| 4 | Attention and Transformers (Ch 03 through 04) | Implement scaled dot-product attention |
| 5 | Decoding and Text Generation | Compare decoding strategies on GPT-2 |
| 6 | Working with LLM APIs | Build a multi-provider API client |
| 7 | Prompt Engineering | Prompt optimization challenge (few-shot, CoT) |
| 8 | Embeddings and Vector Databases | Build a semantic search system |
| 9 | RAG Fundamentals | Build a document QA system with RAG |
| 10 | Conversational AI | Build a multi-turn chatbot with memory |
| 11 | AI Agents and Tool Use (Ch 22 through 23) | Build an agent with MCP tool integration |
| 12 | Evaluation and Observability (Ch 29 through 30) | Evaluate an LLM system with automated metrics |
| 13 | Production Engineering | Deploy an LLM application with monitoring |
| 14 | Final project presentations; further reading: Emerging Architectures (Ch 34), AI and Society (Ch 35) | End-to-end LLM application (team project) |
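The Week 4 lab asks students to implement scaled dot-product attention. A minimal NumPy sketch of that core computation is below; the function name and toy shapes are illustrative, not taken from the lab materials:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (n_queries, n_keys) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the key dimension
    return weights @ V                             # weighted sum of value vectors

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

The stability trick (subtracting each row's maximum before exponentiating) does not change the softmax result but prevents overflow, a detail students will need when scaling from toy inputs to real sequence lengths.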
- Appendix D: Environment Setup – set up your development environment before Week 1
- Appendix K: HuggingFace: Transformers, Datasets, and Hub – access pretrained models and datasets for labs
- Appendix C: Python for LLM Development – review Python patterns used throughout the course
- Appendix U: Docker and Containers – containerize your final project for deployment
What Comes Next
Return to the Course Syllabi overview to explore other courses and reading tracks, or proceed to FM.4: How to Use This Book for a quick orientation on conventions and callout types.