# Linear Algebra Knowledge Graph - Complete Dependency Map

## 🌳 Knowledge Tree Structure

This document maps all Linear Algebra concepts with their dependencies and optimal learning order.

---

## Level 1: Foundation Concepts (No Prerequisites)

### 1.1 Basic Algebra Review

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Arithmetic Operations        β”‚
β”‚ - Addition, Subtraction      β”‚
β”‚ - Multiplication, Division   β”‚
β”‚ - Order of operations        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Algebraic Expressions
        β”œβ”€> Solving Linear Equations
        β”œβ”€> Polynomials
        └─> Summation Notation (Ξ£)
```

### 1.2 Coordinate Systems

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Cartesian Coordinates        β”‚
β”‚ - 2D plane (x, y)            β”‚
β”‚ - 3D space (x, y, z)         β”‚
β”‚ - Points and plotting        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Distance Formula
        β”œβ”€> Midpoint Formula
        └─> Graphing Functions
```

---

## Level 2: Vectors - Geometric (Requires Level 1)

### 2.1 Vector Basics

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Vector Definition            β”‚
β”‚ - Magnitude and Direction    β”‚
β”‚ - Component Form             β”‚
β”‚ - Position Vectors           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Vector Notation (bold, arrow)
        β”œβ”€> ℝ² and ℝ³ vectors
        β”œβ”€> n-dimensional vectors (ℝⁿ)
        └─> Column vs Row Vectors

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Vector Visualization         β”‚
β”‚ - Geometric arrows           β”‚
β”‚ - Head and tail              β”‚
β”‚ - Parallel vectors           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

### 2.2 Vector Operations (Geometric)

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Vector Addition              β”‚
β”‚ - Parallelogram Rule         β”‚
β”‚ - Tip-to-tail Method         β”‚
β”‚ - Component-wise Addition    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Vector Subtraction
        β”œβ”€> Scalar Multiplication (Scaling)
        β”œβ”€> Linear Combinations
        └─> Zero Vector

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Special Vectors              β”‚
β”‚ - Unit Vectors               β”‚
β”‚ - Standard Basis (i, j, k)   β”‚
β”‚ - Normalization              β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

---

## Level 3: Vector Products (Requires Level 2)

### 3.1 Dot Product (Inner Product)

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Dot Product                  β”‚
β”‚ - a Β· b = Ξ£ aα΅’bα΅’             β”‚
β”‚ - a Β· b = ||a|| ||b|| cos ΞΈ  β”‚
β”‚ - Scalar result              β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Vector Length (Norm): ||v|| = √(v Β· v)
        β”œβ”€> Distance: ||a - b||
        β”œβ”€> Angle Between Vectors
        β”œβ”€> Orthogonality (a Β· b = 0)
        β”œβ”€> Vector Projection
        └─> Cauchy-Schwarz Inequality

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Properties of Dot Product    β”‚
β”‚ - Commutative: a Β· b = b Β· a β”‚
β”‚ - Distributive               β”‚
β”‚ - Linearity                  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```
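As a quick computational check on the formulas above, here is a minimal NumPy sketch (the vectors chosen are arbitrary examples) verifying the algebraic definition, the induced norm, the angle formula, and the orthogonality test:

```python
import numpy as np

a = np.array([3.0, 4.0, 0.0])
b = np.array([4.0, -3.0, 1.0])

# Algebraic definition: a Β· b = Ξ£ aα΅’bα΅’
dot = np.dot(a, b)

# Norm from the dot product: ||v|| = √(v Β· v)
norm_a = np.sqrt(np.dot(a, a))
assert np.isclose(norm_a, np.linalg.norm(a))

# Geometric definition: a Β· b = ||a|| ||b|| cos ΞΈ, so
# ΞΈ = arccos(a Β· b / (||a|| ||b||))
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Orthogonality test: a Β· b = 0 β‡’ ΞΈ = 90Β°
print(dot, np.degrees(theta))   # 0.0, 90.0 β†’ a and b are orthogonal
```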
### 3.2 Cross Product (3D Only)

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Cross Product (a Γ— b)        β”‚
β”‚ - Vector result              β”‚
β”‚ - Perpendicular to both      β”‚
β”‚ - Right-hand rule            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Magnitude: ||a Γ— b|| = ||a|| ||b|| sin ΞΈ
        β”œβ”€> Area of Parallelogram
        β”œβ”€> Determinant Form
        β”œβ”€> Anti-commutative: a Γ— b = -(b Γ— a)
        └─> Triple Scalar Product

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Applications                 β”‚
β”‚ - Normal vectors             β”‚
β”‚ - Torque calculations        β”‚
β”‚ - Area and volume            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

---

## Level 4: Matrices - Basics (Requires Level 2-3)

### 4.1 Matrix Fundamentals

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Matrix Definition            β”‚
β”‚ - m Γ— n array of numbers     β”‚
β”‚ - Rows and columns           β”‚
β”‚ - Matrix indexing Aα΅’β±Ό        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Matrix Addition/Subtraction
        β”œβ”€> Scalar Multiplication
        β”œβ”€> Transpose (Aα΅€)
        β”œβ”€> Special Matrices (I, O, Diagonal)
        └─> Matrix Equality
```

### 4.2 Matrix Multiplication

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Matrix Product               β”‚
β”‚ - (AB)α΅’β±Ό = Ξ£ Aα΅’β‚–Bβ‚–β±Ό          β”‚
β”‚ - Dimension compatibility    β”‚
β”‚ - Non-commutative            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Properties (Associative, Distributive)
        β”œβ”€> Identity: AI = IA = A
        β”œβ”€> Matrix Powers: AΒ², AΒ³, ...
        β”œβ”€> Matrix as Linear Transformation
        └─> Block Matrix Multiplication
```
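A short numerical sketch (again assuming NumPy) makes the non-commutativity and the key identities above concrete:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
I = np.eye(2)

# Matrix multiplication is generally non-commutative: AB β‰  BA
print(np.array_equal(A @ B, B @ A))          # False

# Identity: AI = IA = A
assert np.allclose(A @ I, A) and np.allclose(I @ A, A)

# Transpose reverses the order of a product: (AB)α΅€ = Bα΅€Aα΅€
assert np.allclose((A @ B).T, B.T @ A.T)

# Matrix powers: AΒ² = A @ A
assert np.allclose(np.linalg.matrix_power(A, 2), A @ A)
```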
---

## Level 5: Linear Systems (Requires Level 4)

### 5.1 Systems of Linear Equations

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ System Representation        β”‚
β”‚ - Ax = b                     β”‚
β”‚ - Augmented Matrix [A|b]     β”‚
β”‚ - Coefficient Matrix         β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Gaussian Elimination
        β”œβ”€> Row Operations
        β”œβ”€> Row Echelon Form (REF)
        β”œβ”€> Reduced Row Echelon Form (RREF)
        └─> Back Substitution

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Solution Types               β”‚
β”‚ - Unique Solution            β”‚
β”‚ - Infinite Solutions         β”‚
β”‚ - No Solution                β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Consistency
        β”œβ”€> Homogeneous Systems
        β”œβ”€> Parametric Solutions
        └─> Geometric Interpretation
```

---

## Level 6: Matrix Inverses & Determinants (Requires Level 5)

### 6.1 Matrix Inverse

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Inverse Definition           β”‚
β”‚ - AA⁻¹ = A⁻¹A = I            β”‚
β”‚ - Exists iff det(A) β‰  0      β”‚
β”‚ - Unique if exists           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Computing Inverses (Gauss-Jordan)
        β”œβ”€> Inverse Properties: (AB)⁻¹ = B⁻¹A⁻¹
        β”œβ”€> Inverse and Transpose: (Aα΅€)⁻¹ = (A⁻¹)α΅€
        β”œβ”€> Solving Systems: x = A⁻¹b
        └─> Invertible Matrix Theorem
```

### 6.2 Determinants

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Determinant                  β”‚
β”‚ - det(A) or |A|              β”‚
β”‚ - Scalar value               β”‚
β”‚ - Invertibility test         β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> 2Γ—2: ad - bc
        β”œβ”€> 3Γ—3: Rule of Sarrus or Cofactor
        β”œβ”€> nΓ—n: Cofactor Expansion
        β”œβ”€> Properties: det(AB) = det(A)det(B)
        β”œβ”€> det(Aα΅€) = det(A)
        β”œβ”€> Row Operations Effect
        β”œβ”€> Cramer's Rule
        └─> Geometric Meaning (Area/Volume)
```

---

## Level 7: Vector Spaces (Requires Level 2-6)

### 7.1 Abstract Vector Spaces

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Vector Space Definition      β”‚
β”‚ - 10 Axioms                  β”‚
β”‚ - Closure under + and Β·      β”‚
β”‚ - Examples: ℝⁿ, Polynomials  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Subspaces
        β”œβ”€> Span of Vectors
        β”œβ”€> Linear Independence
        β”œβ”€> Linear Dependence
        β”œβ”€> Basis
        └─> Dimension

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Important Subspaces          β”‚
β”‚ - Null Space (Kernel)        β”‚
β”‚ - Column Space (Range)       β”‚
β”‚ - Row Space                  β”‚
β”‚ - Left Null Space            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Rank of Matrix
        β”œβ”€> Nullity of Matrix
        β”œβ”€> Rank-Nullity Theorem
        └─> Fundamental Theorem of Linear Algebra
```

### 7.2 Basis & Dimension

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Basis                        β”‚
β”‚ - Linearly independent       β”‚
β”‚ - Spans the space            β”‚
β”‚ - Minimum spanning set       β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Standard Basis
        β”œβ”€> Dimension = # basis vectors
        β”œβ”€> Change of Basis
        β”œβ”€> Coordinates Relative to Basis
        └─> Uniqueness of Dimension
```
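The pieces of Levels 5-7 can all be verified numerically. A minimal sketch (NumPy assumed; the matrices are arbitrary examples) solves a system, checks the determinant/inverse facts, and confirms the Rank-Nullity Theorem:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# A unique solution exists because det(A) β‰  0
print(np.linalg.det(A))              # 5.0

# Solve Ax = b directly (an elimination-based routine;
# numerically preferable to forming A⁻¹ explicitly)
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)

# x = A⁻¹b gives the same answer, just less efficiently
assert np.allclose(np.linalg.inv(A) @ b, x)

# Rank-Nullity on a singular 3Γ—3 matrix: rank + nullity = n
S = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],     # row 2 = 2 Γ— row 1 β†’ dependent
              [0.0, 1.0, 1.0]])
rank = np.linalg.matrix_rank(S)    # 2
nullity = S.shape[1] - rank        # 1
print(rank, nullity)
```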
---

## Level 8: Linear Transformations (Requires Level 7)

### 8.1 Linear Transformations

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Transformation T: V β†’ W      β”‚
β”‚ - T(u + v) = T(u) + T(v)     β”‚
β”‚ - T(cv) = cT(v)              β”‚
β”‚ - Matrix representation      β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Kernel (Null Space): ker(T) = {v : T(v) = 0}
        β”œβ”€> Range (Image): range(T) = {T(v) : v ∈ V}
        β”œβ”€> Rank-Nullity Theorem
        β”œβ”€> One-to-one Transformations
        β”œβ”€> Onto Transformations
        └─> Isomorphisms

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Standard Transformations     β”‚
β”‚ - Rotation                   β”‚
β”‚ - Reflection                 β”‚
β”‚ - Projection                 β”‚
β”‚ - Scaling                    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        └─> Composition of Transformations
```

---

## Level 9: Eigenvalues & Eigenvectors (Requires Level 6-8)

### 9.1 Eigen-Theory

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Eigenvalue Problem           β”‚
β”‚ - Av = Ξ»v                    β”‚
β”‚ - Characteristic Polynomial  β”‚
β”‚ - det(A - Ξ»I) = 0            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Computing Eigenvalues
        β”œβ”€> Computing Eigenvectors
        β”œβ”€> Eigenspace
        β”œβ”€> Algebraic Multiplicity
        β”œβ”€> Geometric Multiplicity
        └─> Diagonalization

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Diagonalization              β”‚
β”‚ - A = PDP⁻¹                  β”‚
β”‚ - D diagonal (eigenvalues)   β”‚
β”‚ - P columns (eigenvectors)   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Diagonalizable Matrices
        β”œβ”€> Similar Matrices
        β”œβ”€> Powers: Aⁿ = PDⁿP⁻¹
        └─> Applications: Differential Equations, Markov Chains
```

---

## Level 10: Orthogonality (Requires Level 3, 7)

### 10.1 Orthogonal Sets

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Orthogonality                β”‚
β”‚ - v Β· w = 0                  β”‚
β”‚ - Perpendicular vectors      β”‚
β”‚ - Orthogonal sets            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Orthogonal Basis
        β”œβ”€> Orthonormal Basis
        β”œβ”€> Orthogonal Complement
        └─> Orthogonal Decomposition

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Gram-Schmidt Process         β”‚
β”‚ - Orthogonalization          β”‚
β”‚ - Creates orthonormal basis  β”‚
β”‚ - QR Decomposition           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        └─> Applications: Least Squares, QR Algorithm
```

### 10.2 Orthogonal Matrices

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Orthogonal Matrix Q          β”‚
β”‚ - Qα΅€Q = QQα΅€ = I              β”‚
β”‚ - Columns orthonormal        β”‚
β”‚ - Preserves lengths          β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Rotation Matrices
        β”œβ”€> Reflection Matrices
        β”œβ”€> det(Q) = Β±1
        └─> Q⁻¹ = Qα΅€
```
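As a sanity check on Levels 9-10, this sketch (NumPy assumed; the symmetric matrix is an arbitrary example) diagonalizes a matrix, confirms A = PDP⁻¹ and the power formula, and verifies the defining property of an orthogonal matrix obtained from a QR factorization:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalue problem Av = Ξ»v; columns of P are eigenvectors
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Diagonalization: A = PDP⁻¹
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Powers via the decomposition: Aⁿ = PDⁿP⁻¹
A5 = P @ np.diag(eigvals**5) @ np.linalg.inv(P)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))

# Gram-Schmidt in library form: QR gives orthonormal columns,
# so Qα΅€Q = I and hence Q⁻¹ = Qα΅€
Q, R = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))
assert np.allclose(Q.T @ Q, np.eye(3))
```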
---

## Level 11: Inner Product Spaces (Requires Level 3, 7, 10)

### 11.1 Inner Products

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Inner Product ⟨u, v⟩         β”‚
β”‚ - Generalizes dot product    β”‚
β”‚ - 4 Axioms                   β”‚
β”‚ - Induces norm & metric      β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Cauchy-Schwarz Inequality
        β”œβ”€> Triangle Inequality
        β”œβ”€> Parallelogram Law
        β”œβ”€> Pythagorean Theorem
        └─> Norm: ||v|| = √⟨v, v⟩

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Applications                 β”‚
β”‚ - Function spaces            β”‚
β”‚ - Polynomial inner products  β”‚
β”‚ - Weighted inner products    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

---

## Level 12: Matrix Decompositions (Requires Level 6, 9, 10)

### 12.1 LU Decomposition

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ LU Factorization             β”‚
β”‚ - A = LU                     β”‚
β”‚ - L: Lower triangular        β”‚
β”‚ - U: Upper triangular        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Existence Conditions
        β”œβ”€> Computing LU
        β”œβ”€> Solving Systems with LU
        β”œβ”€> Computational Efficiency
        └─> PLU (with Pivoting)
```

### 12.2 QR Decomposition

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ QR Factorization             β”‚
β”‚ - A = QR                     β”‚
β”‚ - Q: Orthogonal              β”‚
β”‚ - R: Upper triangular        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Gram-Schmidt Method
        β”œβ”€> Householder Reflections
        β”œβ”€> Givens Rotations
        β”œβ”€> Least Squares Solutions
        └─> QR Algorithm for Eigenvalues
```

### 12.3 Eigenvalue Decomposition (Spectral)

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Spectral Decomposition       β”‚
β”‚ - A = QΞ›Qα΅€                   β”‚
β”‚ - Symmetric matrices         β”‚
β”‚ - Real eigenvalues           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Orthogonal Eigenvectors
        β”œβ”€> Spectral Theorem
        β”œβ”€> Applications
        └─> Positive Definite Matrices
```

### 12.4 Singular Value Decomposition (SVD)

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ SVD: A = UΞ£Vα΅€                β”‚
β”‚ - U: Left singular vectors   β”‚
β”‚ - Ξ£: Singular values         β”‚
β”‚ - V: Right singular vectors  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Always Exists (any matrix)
        β”œβ”€> Singular Values
        β”œβ”€> Relationship to Eigenvalues
        β”œβ”€> Pseudoinverse (A⁺)
        β”œβ”€> Low-rank Approximation
        β”œβ”€> Image Compression
        β”œβ”€> Data Analysis (PCA)
        └─> Recommender Systems
```
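To illustrate the decomposition that closes this level, a short sketch (NumPy assumed; the 2Γ—3 matrix is an arbitrary example) computes the SVD, relates singular values to eigenvalues of Aα΅€A, and forms the rank-1 approximation behind the compression and PCA applications listed above:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# SVD: A = UΞ£Vα΅€ exists for any m Γ— n matrix
U, s, Vt = np.linalg.svd(A)
Sigma = np.zeros_like(A)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)

# Singular values are square roots of the eigenvalues of Aα΅€A
eigs = np.linalg.eigvalsh(A.T @ A)
assert np.allclose(np.sort(s**2), np.sort(eigs)[-len(s):])

# Best rank-1 approximation: keep only the largest Οƒ
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])
print(np.linalg.norm(A - A1, 2))   # spectral norm error = dropped Οƒβ‚‚

# Pseudoinverse A⁺ = V Σ⁺ Uᡀ, built from the same factors
assert np.allclose(np.linalg.pinv(A), Vt.T @ np.linalg.pinv(Sigma) @ U.T)
```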
---

## Level 13: Advanced Theory (Requires Level 7-12)

### 13.1 Abstract Algebra Connections

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Algebraic Structures         β”‚
β”‚ - Groups                     β”‚
β”‚ - Rings                      β”‚
β”‚ - Fields                     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Vector Space as Module
        β”œβ”€> Linear Algebra over Fields
        └─> Quotient Spaces
```

### 13.2 Norms & Metrics

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Vector Norms                 β”‚
β”‚ - LΒΉ norm: Ξ£|vα΅’|             β”‚
β”‚ - LΒ² norm (Euclidean)        β”‚
β”‚ - L∞ norm: max|vα΅’|           β”‚
β”‚ - p-norms                    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Matrix Norms
        β”œβ”€> Frobenius Norm
        β”œβ”€> Operator Norm
        β”œβ”€> Condition Number
        └─> Error Analysis

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Metric Spaces                β”‚
β”‚ - Distance Function          β”‚
β”‚ - Metric Properties          β”‚
β”‚ - Induced by Norms           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

---

## Level 14: Applications - Machine Learning (Requires All Previous)

### 14.1 ML Fundamentals

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Linear Regression            β”‚
β”‚ - Normal Equations           β”‚
β”‚ - ΞΈ = (Xα΅€X)⁻¹Xα΅€y             β”‚
β”‚ - Least Squares              β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Gradient Descent
        β”œβ”€> Ridge Regression (L2)
        β”œβ”€> Lasso Regression (L1)
        └─> Regularization

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Dimensionality Reduction     β”‚
β”‚ - PCA (Principal Components) β”‚
β”‚ - SVD for PCA                β”‚
β”‚ - Explained Variance         β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Eigenfaces
        β”œβ”€> Feature Extraction
        β”œβ”€> Data Visualization
        └─> Compression
```

### 14.2 Neural Networks

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Neural Network Math          β”‚
β”‚ - Forward Pass: y = Wx + b   β”‚
β”‚ - Backpropagation            β”‚
β”‚ - Gradient Computation       β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Weight Matrices
        β”œβ”€> Activation Functions
        β”œβ”€> Loss Gradients
        └─> Optimization (SGD, Adam)
```

---

## Level 15: Applications - Graphics (Requires Level 4, 8)

### 15.1 Geometric Transformations

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ 2D Transformations           β”‚
β”‚ - Translation                β”‚
β”‚ - Rotation                   β”‚
β”‚ - Scaling                    β”‚
β”‚ - Shearing                   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Homogeneous Coordinates
        β”œβ”€> Transformation Matrices
        β”œβ”€> Composition
        └─> Inverse Transformations

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ 3D Graphics                  β”‚
β”‚ - 3D Rotations               β”‚
β”‚ - View Transformations       β”‚
β”‚ - Projection (Orthographic)  β”‚
β”‚ - Projection (Perspective)   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚
        β”œβ”€> Camera Matrices
        β”œβ”€> Model-View-Projection
        β”œβ”€> Quaternions
        └─> Euler Angles
```
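The normal equations from Level 14 take one line to verify. This sketch (NumPy assumed, with synthetic data) compares ΞΈ = (Xα΅€X)⁻¹Xα΅€y against the library's least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: y = 1 + 2x₁ - 3xβ‚‚ + noise
X = rng.normal(size=(100, 2))
X = np.hstack([np.ones((100, 1)), X])        # bias column
y = X @ np.array([1.0, 2.0, -3.0]) + 0.1 * rng.normal(size=100)

# Normal equations: ΞΈ = (Xα΅€X)⁻¹Xα΅€y
theta = np.linalg.inv(X.T @ X) @ X.T @ y

# Same answer from the numerically preferred least-squares routine
theta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(theta, theta_lstsq)
print(theta)   # β‰ˆ [1, 2, -3]
```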
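And for the graphics material in Level 15, a minimal sketch of homogeneous coordinates (NumPy assumed; the helper names are illustrative): translation becomes a matrix, so transforms compose by multiplication and invert by matrix inversion:

```python
import numpy as np

def translation(tx, ty):
    """3Γ—3 homogeneous 2D translation matrix."""
    return np.array([[1.0, 0.0, tx],
                     [0.0, 1.0, ty],
                     [0.0, 0.0, 1.0]])

def rotation(theta):
    """3Γ—3 homogeneous 2D rotation about the origin."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Compose: rotate 90Β° about the origin, then translate by (1, 0).
# Composition is matrix multiplication, applied right-to-left.
M = translation(1.0, 0.0) @ rotation(np.pi / 2)

p = np.array([1.0, 0.0, 1.0])   # point (1, 0) in homogeneous form
print(M @ p)                    # β‰ˆ [1, 1, 1] β†’ point (1, 1)

# The inverse transformation undoes it
assert np.allclose(np.linalg.inv(M) @ (M @ p), p)
```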
---

## πŸ”— Dependency Map Summary

### Critical Learning Path

```
Level 1 (Algebra Review)
    ↓
Level 2 (Vectors - Geometric)
    ↓
Level 3 (Vector Products)
    ↓
Level 4 (Matrices - Basics)
    ↓
Level 5 (Linear Systems)
    ↓
Level 6 (Inverses & Determinants)
    ↓
Level 7 (Vector Spaces)  [Theoretical Branch]
    ↓
Level 8 (Linear Transformations)
    ↓
Level 9 (Eigenvalues) ───────────┐
    ↓                            β”‚
Level 10 (Orthogonality) β”€β”€β”€β”€β”€β”€β”€β”€β”€ Can parallelize
    ↓                            β”‚
Level 11 (Inner Products) ───────┘
    ↓
Level 12 (Decompositions)
    ↓
Level 13 (Advanced Theory) ──────┐
    ↓                            β”‚
Level 14 (ML Applications) β”€β”€β”€β”€β”€β”€β”€ Can parallelize
    ↓                            β”‚
Level 15 (Graphics Applications)β”€β”˜
```

### Parallel Learning Opportunities

- Levels 9, 10, 11 can be learned in parallel after Level 8
- Level 13 (theory) can be studied in parallel with Levels 14-15 (applications)
- Applications (14-15) depend on decompositions but can be learned in any order

---

## πŸ“Š Prerequisite Matrix

| Topic | Must Know First | Can Learn In Parallel |
|-------|----------------|----------------------|
| Dot Product | Vector basics | Cross product |
| Matrices | Vectors | - |
| Matrix Mult | Matrix basics | Transpose |
| Linear Systems | Matrix multiplication | - |
| Determinants | Matrix multiplication | Inverses |
| Inverses | Determinants, Systems | - |
| Vector Spaces | Linear systems, Span | - |
| Eigenvalues | Determinants, Vector spaces | - |
| Orthogonality | Dot product, Basis | Inner products |
| SVD | Eigenvalues, Orthogonality | - |
| PCA | SVD, Statistics basics | - |

---

## 🎯 Learning Strategies

### Geometric First, Then Abstract

1. Start with 2D/3D vectors (can visualize)
2. Build geometric intuition
3. Generalize to n dimensions
4. Then study abstract theory
5. Apply to real problems

### Computation Supports Theory

1. Solve many numerical examples
2. Use Python/MATLAB to verify
3. See patterns emerge
4. Then learn why (proofs)
5. Deepen understanding

### Applications Motivate Learning

1. See where linear algebra is used
2. Understand why we need it
3. Learn concepts to solve problems
4. Apply immediately
5. Build projects

---

This knowledge graph ensures you build strong foundations before tackling abstract concepts and applications!