01  Module 17 - Embeddings Engineering
    A complete guide to embeddings: models, evaluation (MTEB), fine-tuning, Matryoshka embeddings, quantization, multimodal embeddings, and production pipelines.

02  What Are Embeddings and Why They Matter
    The fundamental concept of embeddings: mapping meaning to geometric space, cosine similarity, Word2Vec, the king-queen analogy, and why dense retrieval replaced keyword search.

03  Embedding Models - The Landscape
    A comprehensive survey of the embedding model ecosystem: SBERT, contrastive learning, SimCSE, E5, BGE, GTE, OpenAI, Voyage AI, Cohere, and the MTEB leaderboard.

04  OpenAI Embeddings and API-Based Embedding Services
    text-embedding-3, Matryoshka training, Voyage AI, Cohere Embed, cost analysis, batch processing patterns, and when to choose API vs self-hosted embeddings.

05  Fine-Tuning Embedding Models for Your Domain
    Contrastive fine-tuning with triplet loss, hard negative mining, in-batch negatives, synthetic data generation, TSDAE, GPL, and a full worked example of domain adaptation.

06  Matryoshka Representation Learning (MRL)
    Nested embeddings where any prefix of the dimensions is informative: training MRL, adaptive retrieval, 10x FLOP reduction, and how OpenAI's text-embedding-3 uses MRL internally.

07  Evaluating Embedding Models
    MTEB benchmark deep dive: nDCG@10, Recall@K, MRR, MAP, building domain-specific evaluation sets, running MTEB locally, and avoiding the contamination problem.

08  Embedding Quantization
    Reducing embedding storage and search costs: float32 to float16, int8, and binary quantization, Hamming-distance search, the rescoring trick, and implementation with FAISS and Qdrant.

09  Multimodal Embeddings
    CLIP, SigLIP, ImageBind, ColPali, and CLAP: embedding images, text, audio, and documents in shared vector spaces for cross-modal search and zero-shot classification.

10  Embeddings in Production
    Building, deploying, and operating production-grade embedding pipelines: caching, incremental indexing, staleness management, vector DB selection, and cost optimization at scale.