# Linear Algebra Knowledge Graph - Complete Dependency Map

## 🌳 Knowledge Tree Structure

This document maps all Linear Algebra concepts with their dependencies and optimal learning order.

---

## Level 1: Foundation Concepts (No Prerequisites)

### 1.1 Basic Algebra Review
```
┌──────────────────────────────┐
│ Arithmetic Operations        │
│ - Addition, Subtraction      │
│ - Multiplication, Division   │
│ - Order of operations        │
└──────────────────────────────┘
        │
        ├─> Algebraic Expressions
        ├─> Solving Linear Equations
        ├─> Polynomials
        └─> Summation Notation (Σ)
```

### 1.2 Coordinate Systems
```
┌──────────────────────────────┐
│ Cartesian Coordinates        │
│ - 2D plane (x, y)            │
│ - 3D space (x, y, z)         │
│ - Points and plotting        │
└──────────────────────────────┘
        │
        ├─> Distance Formula
        ├─> Midpoint Formula
        └─> Graphing Functions
```
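
As a quick numerical check of the distance and midpoint formulas, here is a small Python sketch (standard library only; the points are arbitrary):

```python
import math

# Two points in the plane
p = (1.0, 2.0)
q = (4.0, 6.0)

# Distance formula: sqrt((x2 - x1)^2 + (y2 - y1)^2)
distance = math.sqrt((q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)

# Midpoint formula: ((x1 + x2)/2, (y1 + y2)/2)
midpoint = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

print(distance)  # 5.0 (a 3-4-5 right triangle)
print(midpoint)  # (2.5, 4.0)
```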

---

## Level 2: Vectors - Geometric (Requires Level 1)

### 2.1 Vector Basics
```
┌──────────────────────────────┐
│ Vector Definition            │
│ - Magnitude and Direction    │
│ - Component Form             │
│ - Position Vectors           │
└──────────────────────────────┘
        │
        ├─> Vector Notation (bold, arrow)
        ├─> ℝ² and ℝ³ vectors
        ├─> n-dimensional vectors (ℝⁿ)
        └─> Column vs Row Vectors

┌──────────────────────────────┐
│ Vector Visualization         │
│ - Geometric arrows           │
│ - Head and tail              │
│ - Parallel vectors           │
└──────────────────────────────┘
```

### 2.2 Vector Operations (Geometric)
```
┌──────────────────────────────┐
│ Vector Addition              │
│ - Parallelogram Rule         │
│ - Tip-to-tail Method         │
│ - Component-wise Addition    │
└──────────────────────────────┘
        │
        ├─> Vector Subtraction
        ├─> Scalar Multiplication (Scaling)
        ├─> Linear Combinations
        └─> Zero Vector

┌──────────────────────────────┐
│ Special Vectors              │
│ - Unit Vectors               │
│ - Standard Basis (i, j, k)   │
│ - Normalization              │
└──────────────────────────────┘
```
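
These operations are all component-wise and easy to verify numerically. A minimal sketch (NumPy assumed available; the vectors are arbitrary examples):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, -2.0])

s = a + b              # component-wise addition  -> [4, 2]
d = a - b              # subtraction              -> [2, 6]
scaled = 2 * a         # scalar multiplication    -> [6, 8]
combo = 2 * a + 3 * b  # linear combination       -> [9, 2]

# Normalization: divide by the length to get a unit vector
unit = a / np.linalg.norm(a)   # [0.6, 0.8], length 1
```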

---

## Level 3: Vector Products (Requires Level 2)

### 3.1 Dot Product (Inner Product)
```
┌──────────────────────────────┐
│ Dot Product                  │
│ - a · b = Σ aᵢbᵢ             │
│ - a · b = ||a|| ||b|| cos θ  │
│ - Scalar result              │
└──────────────────────────────┘
        │
        ├─> Vector Length (Norm): ||v|| = √(v · v)
        ├─> Distance: ||a - b||
        ├─> Angle Between Vectors
        ├─> Orthogonality (a · b = 0)
        ├─> Vector Projection
        └─> Cauchy-Schwarz Inequality

┌──────────────────────────────┐
│ Properties of Dot Product    │
│ - Commutative: a · b = b · a │
│ - Distributive               │
│ - Linearity                  │
└──────────────────────────────┘
```
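
The two dot-product formulas can be cross-checked against each other. A small NumPy sketch (example vectors chosen so the angle comes out to 60°):

```python
import numpy as np

a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 2.0, 2.0])

dot = a @ b                  # sum of a_i * b_i -> 2.0

# Recover the angle from a · b = ||a|| ||b|| cos θ
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.degrees(np.arccos(cos_theta))   # 60 degrees

# Orthogonality: dot product is zero for perpendicular vectors
assert np.array([1.0, 0.0]) @ np.array([0.0, 5.0]) == 0.0

# Projection of a onto b: (a · b / b · b) b
proj = (dot / (b @ b)) * b   # [0, 0.5, 0.5]
```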

### 3.2 Cross Product (3D Only)
```
┌──────────────────────────────┐
│ Cross Product (a × b)        │
│ - Vector result              │
│ - Perpendicular to both      │
│ - Right-hand rule            │
└──────────────────────────────┘
        │
        ├─> Magnitude: ||a × b|| = ||a|| ||b|| sin θ
        ├─> Area of Parallelogram
        ├─> Determinant Form
        ├─> Anti-commutative: a × b = -(b × a)
        └─> Triple Scalar Product

┌──────────────────────────────┐
│ Applications                 │
│ - Normal vectors             │
│ - Torque calculations        │
│ - Area and volume            │
└──────────────────────────────┘
```
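
A short NumPy sketch illustrating the key properties above (perpendicularity, anti-commutativity, and the parallelogram area):

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 2.0, 0.0])

c = np.cross(a, b)          # [0, 0, 2], perpendicular to both inputs

# Perpendicularity and anti-commutativity
assert c @ a == 0 and c @ b == 0
assert np.array_equal(np.cross(b, a), -c)

# ||a × b|| is the area of the parallelogram spanned by a and b
area = np.linalg.norm(c)    # 2.0
```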

---

## Level 4: Matrices - Basics (Requires Level 2-3)

### 4.1 Matrix Fundamentals
```
┌──────────────────────────────┐
│ Matrix Definition            │
│ - m × n array of numbers     │
│ - Rows and columns           │
│ - Matrix indexing Aᵢⱼ        │
└──────────────────────────────┘
        │
        ├─> Matrix Addition/Subtraction
        ├─> Scalar Multiplication
        ├─> Transpose (Aᵀ)
        ├─> Special Matrices (I, O, Diagonal)
        └─> Matrix Equality
```

### 4.2 Matrix Multiplication
```
┌──────────────────────────────┐
│ Matrix Product               │
│ - (AB)ᵢⱼ = Σ AᵢₖBₖⱼ          │
│ - Dimension compatibility    │
│ - Non-commutative            │
└──────────────────────────────┘
        │
        ├─> Properties (Associative, Distributive)
        ├─> Identity: AI = IA = A
        ├─> Matrix Powers: A², A³, ...
        ├─> Matrix as Linear Transformation
        └─> Block Matrix Multiplication
```
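
Non-commutativity and the identity property are easy to see with small matrices. A minimal NumPy example:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

AB = A @ B   # [[2, 1], [4, 3]]
BA = B @ A   # [[3, 4], [1, 2]]

# Matrix multiplication is not commutative in general
assert not np.array_equal(AB, BA)

# The identity matrix is the multiplicative neutral element
I = np.eye(2, dtype=int)
assert np.array_equal(A @ I, A) and np.array_equal(I @ A, A)
```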

---

## Level 5: Linear Systems (Requires Level 4)

### 5.1 Systems of Linear Equations
```
┌──────────────────────────────┐
│ System Representation        │
│ - Ax = b                     │
│ - Augmented Matrix [A|b]     │
│ - Coefficient Matrix         │
└──────────────────────────────┘
        │
        ├─> Gaussian Elimination
        ├─> Row Operations
        ├─> Row Echelon Form (REF)
        ├─> Reduced Row Echelon Form (RREF)
        └─> Back Substitution

┌──────────────────────────────┐
│ Solution Types               │
│ - Unique Solution            │
│ - Infinite Solutions         │
│ - No Solution                │
└──────────────────────────────┘
        │
        ├─> Consistency
        ├─> Homogeneous Systems
        ├─> Parametric Solutions
        └─> Geometric Interpretation
```
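
In practice Ax = b is solved numerically rather than by hand elimination. A small NumPy sketch (the 2×2 system is illustrative and has a unique solution):

```python
import numpy as np

# System:  x + 2y = 5
#         3x + 4y = 11      ->  Ax = b
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

x = np.linalg.solve(A, b)     # unique solution since det(A) != 0

# Verify by substituting back into the system
assert np.allclose(A @ x, b)
```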

---

## Level 6: Matrix Inverses & Determinants (Requires Level 5)

### 6.1 Matrix Inverse
```
┌──────────────────────────────┐
│ Inverse Definition           │
│ - AA⁻¹ = A⁻¹A = I            │
│ - Exists iff det(A) ≠ 0      │
│ - Unique if exists           │
└──────────────────────────────┘
        │
        ├─> Computing Inverses (Gauss-Jordan)
        ├─> Inverse Properties: (AB)⁻¹ = B⁻¹A⁻¹
        ├─> Inverse and Transpose: (Aᵀ)⁻¹ = (A⁻¹)ᵀ
        ├─> Solving Systems: x = A⁻¹b
        └─> Invertible Matrix Theorem
```

### 6.2 Determinants
```
┌──────────────────────────────┐
│ Determinant                  │
│ - det(A) or |A|              │
│ - Scalar value               │
│ - Invertibility test         │
└──────────────────────────────┘
        │
        ├─> 2×2: ad - bc
        ├─> 3×3: Rule of Sarrus or Cofactor
        ├─> n×n: Cofactor Expansion
        ├─> Properties: det(AB) = det(A)det(B)
        ├─> det(Aᵀ) = det(A)
        ├─> Row Operations Effect
        ├─> Cramer's Rule
        └─> Geometric Meaning (Area/Volume)
```
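
A quick NumPy check of the inverse and determinant facts above (the matrices are arbitrary invertible examples):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[0.0, 1.0],
              [2.0, 3.0]])

# 2x2 determinant: ad - bc
assert np.isclose(np.linalg.det(A), 2 * 1 - 1 * 1)   # 1.0

# AA^-1 = I, so A is invertible exactly because det(A) != 0
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))

# Multiplicativity: det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))
```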

---

## Level 7: Vector Spaces (Requires Level 2-6)

### 7.1 Abstract Vector Spaces
```
┌──────────────────────────────┐
│ Vector Space Definition      │
│ - 10 Axioms                  │
│ - Closure under + and ·      │
│ - Examples: ℝⁿ, Polynomials  │
└──────────────────────────────┘
        │
        ├─> Subspaces
        ├─> Span of Vectors
        ├─> Linear Independence
        ├─> Linear Dependence
        ├─> Basis
        └─> Dimension

┌──────────────────────────────┐
│ Important Subspaces          │
│ - Null Space (Kernel)        │
│ - Column Space (Range)       │
│ - Row Space                  │
│ - Left Null Space            │
└──────────────────────────────┘
        │
        ├─> Rank of Matrix
        ├─> Nullity of Matrix
        ├─> Rank-Nullity Theorem
        └─> Fundamental Theorem of Linear Algebra
```
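
The Rank-Nullity Theorem can be verified numerically. A NumPy sketch (the example matrix has one dependent column by construction):

```python
import numpy as np

# Third column = first + second, so the columns are linearly dependent
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)   # dimension of the column space: 2
nullity = A.shape[1] - rank       # dimension of the null space: 1

# Rank-Nullity: rank + nullity = number of columns
assert rank + nullity == A.shape[1]

# A vector in the null space: x = (1, 1, -1) satisfies Ax = 0
x = np.array([1.0, 1.0, -1.0])
assert np.allclose(A @ x, 0)
```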

### 7.2 Basis & Dimension
```
┌──────────────────────────────┐
│ Basis                        │
│ - Linearly independent       │
│ - Spans the space            │
│ - Minimum spanning set       │
└──────────────────────────────┘
        │
        ├─> Standard Basis
        ├─> Dimension = # basis vectors
        ├─> Change of Basis
        ├─> Coordinates Relative to Basis
        └─> Uniqueness of Dimension
```

---

## Level 8: Linear Transformations (Requires Level 7)

### 8.1 Linear Transformations
```
┌──────────────────────────────┐
│ Transformation T: V → W      │
│ - T(u + v) = T(u) + T(v)     │
│ - T(cv) = cT(v)              │
│ - Matrix representation      │
└──────────────────────────────┘
        │
        ├─> Kernel (Null Space): ker(T) = {v : T(v) = 0}
        ├─> Range (Image): range(T) = {T(v) : v ∈ V}
        ├─> Rank-Nullity Theorem
        ├─> One-to-one Transformations
        ├─> Onto Transformations
        └─> Isomorphisms

┌──────────────────────────────┐
│ Standard Transformations     │
│ - Rotation                   │
│ - Reflection                 │
│ - Projection                 │
│ - Scaling                    │
└──────────────────────────────┘
        │
        └─> Composition of Transformations
```

---

## Level 9: Eigenvalues & Eigenvectors (Requires Level 6-8)

### 9.1 Eigen-Theory
```
┌──────────────────────────────┐
│ Eigenvalue Problem           │
│ - Av = λv                    │
│ - Characteristic Polynomial  │
│ - det(A - λI) = 0            │
└──────────────────────────────┘
        │
        ├─> Computing Eigenvalues
        ├─> Computing Eigenvectors
        ├─> Eigenspace
        ├─> Algebraic Multiplicity
        ├─> Geometric Multiplicity
        └─> Diagonalization

┌──────────────────────────────┐
│ Diagonalization              │
│ - A = PDP⁻¹                  │
│ - D diagonal (eigenvalues)   │
│ - P columns (eigenvectors)   │
└──────────────────────────────┘
        │
        ├─> Diagonalizable Matrices
        ├─> Similar Matrices
        ├─> Powers: Aⁿ = PDⁿP⁻¹
        └─> Applications: Differential Equations, Markov Chains
```
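
A NumPy sketch verifying Av = λv and the diagonalization A = PDP⁻¹ for a small symmetric example (eigenvalues 3 and 1):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the v in Av = λv

# Verify the eigenvalue equation for each (λ, v) pair
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

# Diagonalization A = P D P^-1, and powers via A^n = P D^n P^-1
P = eigvecs
D = np.diag(eigvals)
assert np.allclose(P @ D @ np.linalg.inv(P), A)
assert np.allclose(P @ np.diag(eigvals ** 3) @ np.linalg.inv(P),
                   np.linalg.matrix_power(A, 3))
```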

---

## Level 10: Orthogonality (Requires Level 3, 7)

### 10.1 Orthogonal Sets
```
┌──────────────────────────────┐
│ Orthogonality                │
│ - v · w = 0                  │
│ - Perpendicular vectors      │
│ - Orthogonal sets            │
└──────────────────────────────┘
        │
        ├─> Orthogonal Basis
        ├─> Orthonormal Basis
        ├─> Orthogonal Complement
        └─> Orthogonal Decomposition

┌──────────────────────────────┐
│ Gram-Schmidt Process         │
│ - Orthogonalization          │
│ - Creates orthonormal basis  │
│ - QR Decomposition           │
└──────────────────────────────┘
        │
        └─> Applications: Least Squares, QR Algorithm
```
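
The Gram-Schmidt process is short enough to implement directly. This is an illustrative classical version (not the numerically more robust modified variant):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormal basis for span(vectors)."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        # Subtract the projection onto each previously accepted basis vector
        for q in basis:
            w -= (w @ q) * q
        norm = np.linalg.norm(w)
        if norm > 1e-12:          # skip vectors that are linearly dependent
            basis.append(w / norm)
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])

# Rows of Q are orthonormal: Q Q^T = I
assert np.allclose(Q @ Q.T, np.eye(2))
```

For production work, `np.linalg.qr` computes the same orthonormal basis (up to sign) more stably.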

### 10.2 Orthogonal Matrices
```
┌──────────────────────────────┐
│ Orthogonal Matrix Q          │
│ - QᵀQ = QQᵀ = I              │
│ - Columns orthonormal        │
│ - Preserves lengths          │
└──────────────────────────────┘
        │
        ├─> Rotation Matrices
        ├─> Reflection Matrices
        ├─> det(Q) = ±1
        └─> Q⁻¹ = Qᵀ
```

---

## Level 11: Inner Product Spaces (Requires Level 3, 7, 10)

### 11.1 Inner Products
```
┌──────────────────────────────┐
│ Inner Product ⟨u, v⟩         │
│ - Generalizes dot product    │
│ - 4 Axioms                   │
│ - Induces norm & metric      │
└──────────────────────────────┘
        │
        ├─> Cauchy-Schwarz Inequality
        ├─> Triangle Inequality
        ├─> Parallelogram Law
        ├─> Pythagorean Theorem
        └─> Norm: ||v|| = √⟨v, v⟩

┌──────────────────────────────┐
│ Applications                 │
│ - Function spaces            │
│ - Polynomial inner products  │
│ - Weighted inner products    │
└──────────────────────────────┘
```

---

## Level 12: Matrix Decompositions (Requires Level 6, 9, 10)

### 12.1 LU Decomposition
```
┌──────────────────────────────┐
│ LU Factorization             │
│ - A = LU                     │
│ - L: Lower triangular        │
│ - U: Upper triangular        │
└──────────────────────────────┘
        │
        ├─> Existence Conditions
        ├─> Computing LU
        ├─> Solving Systems with LU
        ├─> Computational Efficiency
        └─> PLU (with Pivoting)
```
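
A minimal Doolittle-style LU factorization without pivoting, for illustration only (it assumes nonzero leading pivots; production code should use a pivoting PLU routine such as `scipy.linalg.lu`):

```python
import numpy as np

def lu_decompose(A):
    """Doolittle LU without pivoting (assumes nonzero leading pivots)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]    # multiplier that eliminates U[i, k]
            U[i, :] -= L[i, k] * U[k, :]   # row operation, recorded in L
    return L, U

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
L, U = lu_decompose(A)

assert np.allclose(L @ U, A)
assert np.allclose(L, np.tril(L)) and np.allclose(U, np.triu(U))
```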

### 12.2 QR Decomposition
```
┌──────────────────────────────┐
│ QR Factorization             │
│ - A = QR                     │
│ - Q: Orthogonal              │
│ - R: Upper triangular        │
└──────────────────────────────┘
        │
        ├─> Gram-Schmidt Method
        ├─> Householder Reflections
        ├─> Givens Rotations
        ├─> Least Squares Solutions
        └─> QR Algorithm for Eigenvalues
```
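
NumPy computes a (reduced) QR factorization directly, and the least-squares connection follows by solving Rx = Qᵀb; the data here are an arbitrary overdetermined example:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

Q, R = np.linalg.qr(A)            # A = QR, Q has orthonormal columns
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(2))

# Least squares via QR: solve R x = Q^T b (R is upper triangular)
x = np.linalg.solve(R, Q.T @ b)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```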

### 12.3 Eigenvalue Decomposition (Spectral)
```
┌──────────────────────────────┐
│ Spectral Decomposition       │
│ - A = QΛQᵀ                   │
│ - Symmetric matrices         │
│ - Real eigenvalues           │
└──────────────────────────────┘
        │
        ├─> Orthogonal Eigenvectors
        ├─> Spectral Theorem
        ├─> Applications
        └─> Positive Definite Matrices
```

### 12.4 Singular Value Decomposition (SVD)
```
┌──────────────────────────────┐
│ SVD: A = UΣVᵀ                │
│ - U: Left singular vectors   │
│ - Σ: Singular values         │
│ - V: Right singular vectors  │
└──────────────────────────────┘
        │
        ├─> Always Exists (any matrix)
        ├─> Singular Values
        ├─> Relationship to Eigenvalues
        ├─> Pseudoinverse (A⁺)
        ├─> Low-rank Approximation
        ├─> Image Compression
        ├─> Data Analysis (PCA)
        └─> Recommender Systems
```
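
A short NumPy sketch of the SVD and a rank-1 approximation (the matrix is a toy example with known singular values 3 and 2):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)    # A = U diag(s) V^T, exists for any m x n matrix

# Reconstruct A from the factors (reduced form: first 2 columns of U)
assert np.allclose((U[:, :2] * s) @ Vt, A)

# Singular values come back sorted in decreasing order
assert np.all(s[:-1] >= s[1:])

# Best rank-1 approximation keeps only the largest singular value
A1 = s[0] * np.outer(U[:, 0], Vt[0])
assert np.linalg.matrix_rank(A1) == 1
```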

---

## Level 13: Advanced Theory (Requires Level 7-12)

### 13.1 Abstract Algebra Connections
```
┌──────────────────────────────┐
│ Algebraic Structures         │
│ - Groups                     │
│ - Rings                      │
│ - Fields                     │
└──────────────────────────────┘
        │
        ├─> Vector Space as Module
        ├─> Linear Algebra over Fields
        └─> Quotient Spaces
```

### 13.2 Norms & Metrics
```
┌──────────────────────────────┐
│ Vector Norms                 │
│ - L¹ norm: Σ|vᵢ|             │
│ - L² norm (Euclidean)        │
│ - L∞ norm: max|vᵢ|           │
│ - p-norms                    │
└──────────────────────────────┘
        │
        ├─> Matrix Norms
        ├─> Frobenius Norm
        ├─> Operator Norm
        ├─> Condition Number
        └─> Error Analysis

┌──────────────────────────────┐
│ Metric Spaces                │
│ - Distance Function          │
│ - Metric Properties          │
│ - Induced by Norms           │
└──────────────────────────────┘
```
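
The common vector norms and the condition number are one-liners in NumPy (the diagonal matrix is a deliberately ill-conditioned example):

```python
import numpy as np

v = np.array([3.0, -4.0])

l1 = np.linalg.norm(v, 1)         # sum of |v_i|  -> 7.0
l2 = np.linalg.norm(v)            # Euclidean     -> 5.0
linf = np.linalg.norm(v, np.inf)  # max |v_i|     -> 4.0

A = np.array([[1.0, 0.0],
              [0.0, 1000.0]])
fro = np.linalg.norm(A, 'fro')    # Frobenius norm of a matrix

# Condition number: large values flag ill-conditioned systems
kappa = np.linalg.cond(A)         # 1000.0 here
```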

---

## Level 14: Applications - Machine Learning (Requires All Previous)

### 14.1 ML Fundamentals
```
┌──────────────────────────────┐
│ Linear Regression            │
│ - Normal Equations           │
│ - θ = (XᵀX)⁻¹Xᵀy             │
│ - Least Squares              │
└──────────────────────────────┘
        │
        ├─> Gradient Descent
        ├─> Ridge Regression (L2)
        ├─> Lasso Regression (L1)
        └─> Regularization

┌──────────────────────────────┐
│ Dimensionality Reduction     │
│ - PCA (Principal Components) │
│ - SVD for PCA                │
│ - Explained Variance         │
└──────────────────────────────┘
        │
        ├─> Eigenfaces
        ├─> Feature Extraction
        ├─> Data Visualization
        └─> Compression
```
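
The normal equations can be applied directly in NumPy. An illustrative exact-fit example (the data are constructed to lie on y = 1 + 2x):

```python
import numpy as np

# Fit y ≈ X θ by least squares using the normal equations
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])        # column of ones = intercept term
y = np.array([1.0, 3.0, 5.0])

# θ = (XᵀX)⁻¹ Xᵀ y, computed via solve rather than an explicit inverse
theta = np.linalg.solve(X.T @ X, X.T @ y)

assert np.allclose(theta, [1.0, 2.0])   # recovers intercept 1, slope 2
```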

### 14.2 Neural Networks
```
┌──────────────────────────────┐
│ Neural Network Math          │
│ - Forward Pass: y = Wx + b   │
│ - Backpropagation            │
│ - Gradient Computation       │
└──────────────────────────────┘
        │
        ├─> Weight Matrices
        ├─> Activation Functions
        ├─> Loss Gradients
        └─> Optimization (SGD, Adam)
```

---

## Level 15: Applications - Graphics (Requires Level 4, 8)

### 15.1 Geometric Transformations
```
┌──────────────────────────────┐
│ 2D Transformations           │
│ - Translation                │
│ - Rotation                   │
│ - Scaling                    │
│ - Shearing                   │
└──────────────────────────────┘
        │
        ├─> Homogeneous Coordinates
        ├─> Transformation Matrices
        ├─> Composition
        └─> Inverse Transformations

┌──────────────────────────────┐
│ 3D Graphics                  │
│ - 3D Rotations               │
│ - View Transformations       │
│ - Projection (Orthographic)  │
│ - Projection (Perspective)   │
└──────────────────────────────┘
        │
        ├─> Camera Matrices
        ├─> Model-View-Projection
        ├─> Quaternions
        └─> Euler Angles
```
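
Homogeneous coordinates make translation a matrix multiply, so 2D transforms compose by matrix multiplication. An illustrative NumPy sketch:

```python
import numpy as np

# 2D transforms in homogeneous coordinates: 3x3 matrices act on (x, y, 1)
def translation(tx, ty):
    return np.array([[1.0, 0.0, tx],
                     [0.0, 1.0, ty],
                     [0.0, 0.0, 1.0]])

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  0.0],
                     [s,   c,  0.0],
                     [0.0, 0.0, 1.0]])

# Compose: rotate 90° about the origin, then translate by (1, 0).
# Rightmost matrix applies first.
M = translation(1.0, 0.0) @ rotation(np.pi / 2)

p = np.array([1.0, 0.0, 1.0])   # point (1, 0)
q = M @ p                       # rotates to (0, 1), then shifts to (1, 1)
assert np.allclose(q, [1.0, 1.0, 1.0])

# The inverse transformation undoes the whole composition
assert np.allclose(np.linalg.inv(M) @ q, p)
```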

---

## 🔗 Dependency Map Summary

### Critical Learning Path
```
Level 1 (Algebra Review)
        ↓
Level 2 (Vectors - Geometric)
        ↓
Level 3 (Vector Products)
        ↓
Level 4 (Matrices - Basics)
        ↓
Level 5 (Linear Systems)
        ↓
Level 6 (Inverses & Determinants)
        ↓
Level 7 (Vector Spaces) [Theoretical Branch]
        ↓
Level 8 (Linear Transformations)
        ↓
Level 9 (Eigenvalues) ←─────────┐
        ↓                       │
Level 10 (Orthogonality) ───────┤ Can parallelize
        ↓                       │
Level 11 (Inner Products) ──────┘
        ↓
Level 12 (Decompositions)
        ↓
Level 13 (Advanced Theory) ←────┐
        ↓                       │ Can parallelize
Level 14 (ML Applications) ─────┤
        ↓                       │
Level 15 (Graphics Applications)┘
```

### Parallel Learning Opportunities
- Levels 9, 10, and 11 can be learned in parallel after Level 8
- Level 13 (theory) can be studied in parallel with Levels 14-15 (applications)
- The applications (Levels 14-15) both depend on the decompositions but can be taken in either order

---

## 📊 Prerequisite Matrix

| Topic | Must Know First | Can Learn In Parallel |
|-------|-----------------|-----------------------|
| Dot Product | Vector basics | Cross product |
| Matrices | Vectors | - |
| Matrix Mult | Matrix basics | Transpose |
| Linear Systems | Matrix multiplication | - |
| Determinants | Matrix multiplication | Inverses |
| Inverses | Determinants, Systems | - |
| Vector Spaces | Linear systems, Span | - |
| Eigenvalues | Determinants, Vector spaces | - |
| Orthogonality | Dot product, Basis | Inner products |
| SVD | Eigenvalues, Orthogonality | - |
| PCA | SVD, Statistics basics | - |

---

## 🎯 Learning Strategies

### Geometric First, Then Abstract
1. Start with 2D/3D vectors, which you can visualize
2. Build geometric intuition
3. Generalize to n dimensions
4. Then study the abstract theory
5. Apply it to real problems

### Computation Supports Theory
1. Solve many numerical examples
2. Use Python/MATLAB to verify your work
3. Watch patterns emerge
4. Then learn why they hold (proofs)
5. Deepen your understanding

### Applications Motivate Learning
1. See where linear algebra is used
2. Understand why we need it
3. Learn concepts to solve problems
4. Apply them immediately
5. Build projects

---

This knowledge graph ensures you build strong foundations before tackling abstract concepts and applications!