first working version
798
learning_plans/linear_algebra/00_LINEAR_ALGEBRA_MASTER_PLAN.md
Normal file
@@ -0,0 +1,798 @@
# Linear Algebra - Master Plan

## 🎯 Goal: Linear Algebra Mastery

This comprehensive plan will guide you from basic vector concepts to advanced linear algebra applications in machine learning, computer graphics, data science, and quantum computing.

## 📊 Learning Journey Overview

**Total Duration:** 6-10 months (depending on pace and mathematical background)
**Target Level:** Advanced Linear Algebra with Applications
**Daily Commitment:** 1-2 hours recommended
**Prerequisites:** High school algebra (recommended but not required)

## 🗺️ Learning Path Structure

```
Phase 1: Foundations (1.5-2 months)
  └─> Vectors, Matrices, Basic Operations

Phase 2: Core Theory (2-3 months)
  └─> Systems of Equations, Matrix Decompositions, Eigenvalues

Phase 3: Advanced Topics (1.5-2 months)
  └─> Vector Spaces, Linear Transformations, Orthogonality

Phase 4: Applications (1-2 months)
  └─> Machine Learning, Graphics, Data Science, Optimization

Phase 5: Specialization (Ongoing)
  └─> Choose your domain (ML, Graphics, Quantum, Data Science)
```

## 📚 Learning Modules Breakdown
### Phase 1: Linear Algebra Foundations (Beginner)
**Duration:** 1.5-2 months | **Difficulty:** ⭐⭐☆☆☆

1. **Module 1.1: Vectors Basics** (2 weeks)
   - Vector Definition & Notation
   - Vector Components
   - Geometric Interpretation (2D, 3D)
   - Vector Addition & Subtraction
   - Scalar Multiplication
   - Zero Vector & Unit Vectors
   - Standard Basis Vectors (i, j, k)
   - Linear Combinations

2. **Module 1.2: Dot Product & Vector Operations** (2 weeks)
   - Dot Product (Inner Product)
   - Geometric Interpretation
   - Angle Between Vectors
   - Vector Length (Magnitude/Norm)
   - Distance Between Vectors
   - Orthogonal Vectors
   - Vector Projection
   - Cross Product (3D)

3. **Module 1.3: Matrices Basics** (2 weeks)
   - Matrix Definition & Notation
   - Matrix Dimensions (m × n)
   - Special Matrices (Identity, Zero, Diagonal)
   - Matrix Addition & Subtraction
   - Scalar Multiplication
   - Matrix Multiplication
   - Transpose
   - Matrix Powers

4. **Module 1.4: Matrix Properties** (1 week)
   - Symmetric Matrices
   - Antisymmetric Matrices
   - Triangular Matrices (Upper/Lower)
   - Trace of a Matrix
   - Matrix Norms
   - Matrix Multiplication Properties
     - Non-commutativity
     - Associativity & Distributivity

---

### Phase 2: Core Linear Algebra (Intermediate)
**Duration:** 2-3 months | **Difficulty:** ⭐⭐⭐☆☆

5. **Module 2.1: Systems of Linear Equations** (2 weeks)
   - Linear Equation Systems
   - Matrix Representation (Ax = b)
   - Gaussian Elimination
   - Row Echelon Form (REF)
   - Reduced Row Echelon Form (RREF)
   - Back Substitution
   - Consistency & Inconsistency
   - Homogeneous Systems

6. **Module 2.2: Matrix Inverses** (2 weeks)
   - Inverse Matrix Definition
   - Properties of Inverses
   - Computing Inverses (Gauss-Jordan)
   - Invertible vs Singular Matrices
   - Inverse of Products
   - Inverse and Transpose Relationship
   - Solving Systems with Inverses
   - Computational Considerations

7. **Module 2.3: Determinants** (2 weeks)
   - Determinant Definition
   - 2×2 and 3×3 Determinants
   - Cofactor Expansion
   - Properties of Determinants
   - Determinant and Invertibility
   - Determinant of Products
   - Cramer's Rule
   - Geometric Interpretation (Volume)

8. **Module 2.4: Vector Spaces** (3 weeks)
   - Vector Space Definition
   - Subspaces
   - Span of Vectors
   - Linear Independence
   - Linear Dependence
   - Basis and Dimension
   - Coordinate Systems
   - Change of Basis

9. **Module 2.5: Linear Transformations** (2 weeks)
   - Transformation Definition
   - Matrix Representation
   - Kernel (Null Space)
   - Range (Column Space)
   - Rank-Nullity Theorem
   - One-to-one & Onto
   - Isomorphisms
   - Composition of Transformations

10. **Module 2.6: Eigenvalues & Eigenvectors** (3 weeks)
    - Eigenvalue Definition
    - Eigenvector Definition
    - Characteristic Polynomial
    - Computing Eigenvalues
    - Computing Eigenvectors
    - Eigenspaces
    - Diagonalization
    - Similar Matrices
    - Algebraic vs Geometric Multiplicity

---

### Phase 3: Advanced Linear Algebra
**Duration:** 1.5-2 months | **Difficulty:** ⭐⭐⭐⭐☆

11. **Module 3.1: Orthogonality** (2 weeks)
    - Orthogonal Vectors
    - Orthogonal Complement
    - Orthogonal Basis
    - Orthonormal Basis
    - Gram-Schmidt Process
    - QR Decomposition
    - Orthogonal Matrices
    - Orthogonal Projections

12. **Module 3.2: Inner Product Spaces** (2 weeks)
    - Inner Product Definition
    - Inner Product Properties
    - Cauchy-Schwarz Inequality
    - Triangle Inequality
    - Norm from Inner Product
    - Orthogonality in Inner Product Spaces
    - Best Approximation
    - Least Squares Problems

13. **Module 3.3: Matrix Decompositions** (3 weeks)
    - LU Decomposition
    - QR Decomposition (revisited)
    - Cholesky Decomposition
    - Singular Value Decomposition (SVD)
    - Spectral Theorem
    - Jordan Normal Form
    - Polar Decomposition
    - Applications of Decompositions

14. **Module 3.4: Norms & Conditioning** (1 week)
    - Vector Norms (L1, L2, L∞)
    - Matrix Norms
    - Frobenius Norm
    - Operator Norm
    - Condition Number
    - Numerical Stability
    - Ill-conditioned Problems
    - Error Analysis

---

### Phase 4: Applications (Advanced)
**Duration:** 1-2 months | **Difficulty:** ⭐⭐⭐⭐☆

15. **Module 4.1: Linear Algebra in Machine Learning** (2 weeks)
    - Data Representation (Feature Vectors)
    - Linear Regression
    - Principal Component Analysis (PCA)
    - Dimensionality Reduction
    - Singular Value Decomposition in ML
    - Recommender Systems
    - Neural Network Foundations
    - Backpropagation Math

16. **Module 4.2: Computer Graphics Applications** (2 weeks)
    - Homogeneous Coordinates
    - Transformation Matrices (Translation, Rotation, Scaling)
    - 3D Graphics Pipeline
    - Camera Matrices
    - Projection Matrices
    - Quaternions (Rotation)
    - Lighting & Shading Math
    - Ray Tracing Fundamentals

17. **Module 4.3: Optimization & Numerical Methods** (2 weeks)
    - Gradient Descent
    - Convex Optimization
    - Lagrange Multipliers
    - Newton's Method
    - Iterative Methods (Jacobi, Gauss-Seidel)
    - Conjugate Gradient Method
    - Power Iteration
    - Krylov Subspace Methods

18. **Module 4.4: Data Science Applications** (1 week)
    - Covariance Matrices
    - Correlation Analysis
    - Linear Models
    - Feature Scaling & Normalization
    - Dimensionality Reduction
    - Matrix Factorization
    - Network Analysis (Graphs as Matrices)
    - Markov Chains

---

### Phase 5: Specializations (Choose Your Path)
**Duration:** Ongoing | **Difficulty:** ⭐⭐⭐⭐⭐

19. **Specialization A: Machine Learning Deep Dive**
    - Deep Learning Mathematics
    - Tensor Operations
    - Automatic Differentiation
    - Backpropagation in Detail
    - Convolutional Neural Networks Math
    - Recurrent Networks Math
    - Attention Mechanisms
    - Optimization Algorithms

20. **Specialization B: Computational Linear Algebra**
    - Numerical Linear Algebra
    - Sparse Matrices
    - Iterative Solvers
    - Parallel Matrix Algorithms
    - GPU Computing for Linear Algebra
    - BLAS & LAPACK Libraries
    - Floating Point Arithmetic
    - Stability & Accuracy

21. **Specialization C: Quantum Computing**
    - Quantum States (Vectors in Hilbert Space)
    - Bra-ket Notation
    - Quantum Gates (Unitary Matrices)
    - Tensor Products
    - Entanglement
    - Quantum Circuits
    - Quantum Algorithms (Grover, Shor)
    - Quantum Error Correction

22. **Specialization D: Advanced Applications**
    - Graph Theory & Networks
    - Control Theory
    - Signal Processing
    - Cryptography
    - Robotics & Kinematics
    - Finite Element Methods
    - Image Processing
    - Computational Physics

---

## 📈 Progress Tracking

### Mastery Levels
- **Level 0:** Unfamiliar - Never seen the concept
- **Level 1:** Aware - Basic understanding, can't apply
- **Level 2:** Competent - Can solve with reference
- **Level 3:** Proficient - Can solve without reference
- **Level 4:** Expert - Can teach, prove theorems, apply creatively

### Weekly Goals
- Complete 1 module every 1-2 weeks
- Solve 10-15 practice problems daily
- Prove 2-3 theorems per week
- Apply concepts in code (Python/MATLAB)
- Review the previous week's material

### Monthly Assessments
- Take a comprehensive exam covering the month's topics
- Solve applied problems
- Prove key theorems
- Implement algorithms

---

## 🎓 Learning Resources

### Essential Textbooks
1. **"Introduction to Linear Algebra"** by Gilbert Strang - Best introduction
2. **"Linear Algebra Done Right"** by Sheldon Axler - Theoretical
3. **"Linear Algebra and Its Applications"** by David Lay - Applied
4. **"Matrix Analysis"** by Horn & Johnson - Advanced
5. **"Numerical Linear Algebra"** by Trefethen & Bau - Computational

### Video Lectures
- MIT OCW: Gilbert Strang's Linear Algebra (legendary!)
- 3Blue1Brown: "Essence of Linear Algebra" (visual intuition)
- Khan Academy: Linear Algebra series
- Stanford CS229: Linear Algebra Review

### Online Resources
- WolframAlpha (matrix calculations)
- SageMath / SymPy (symbolic computation)
- MATLAB / Octave (numerical computation)
- NumPy / SciPy (Python implementation)

### Practice Platforms
- Math Stack Exchange
- Brilliant.org (interactive problems)
- Paul's Online Math Notes
- MIT OpenCourseWare problem sets

---

## 🏆 Milestones & Achievements

### Milestone 1: Vector & Matrix Fundamentals (Month 2)
- ✅ Master vector operations
- ✅ Perform matrix arithmetic fluently
- ✅ Understand geometric interpretations
- ✅ Compute dot products, cross products, norms
- 🎯 **Project:** Implement a vector/matrix library in Python

### Milestone 2: Linear Systems & Decompositions (Months 4-5)
- ✅ Solve systems of equations (Gaussian elimination)
- ✅ Compute matrix inverses and determinants
- ✅ Understand eigenvalues and eigenvectors
- ✅ Apply basic decompositions (LU, QR)
- 🎯 **Project:** Linear equation solver

### Milestone 3: Abstract Vector Spaces (Months 6-7)
- ✅ Work with abstract vector spaces
- ✅ Understand linear transformations
- ✅ Apply orthogonalization (Gram-Schmidt)
- ✅ Master SVD and its applications
- 🎯 **Project:** Image compression using SVD

### Milestone 4: Applications Mastery (Months 8-9)
- ✅ Apply to machine learning (PCA, regression)
- ✅ Implement graphics transformations
- ✅ Solve optimization problems
- ✅ Work with real-world data
- 🎯 **Project:** Build an ML model or a graphics engine component

### Milestone 5: Specialization (Month 10+)
- ✅ Deep expertise in chosen domain
- ✅ Advanced applications
- ✅ Research-level understanding
- 🎯 **Project:** Advanced application in your specialization

---

## 📝 Assessment Strategy

### Weekly Problem Sets
- 15-20 computational problems
- 3-5 proof-based problems
- Mix of routine & challenging
- Self-graded with solutions

### Monthly Exams
- 20-30 problems
- Mix: Computation (60%), Theory (20%), Applications (20%)
- "I don't know" option available
- Automatic grading for computational problems

### Project Assessments
- Implement algorithms
- Apply to real problems
- Code review
- Mathematical correctness

---

## 🚀 Getting Started

### Week 1 Action Plan
1. Review basic algebra if needed
2. Set up computation tools (Python + NumPy, or MATLAB)
3. Watch 3Blue1Brown: "Essence of Linear Algebra" (first 3 videos)
4. Start Module 1.1: Vectors Basics
5. Solve the first 10 vector problems
6. Take the first quiz

### Daily Study Routine
- **Theory (30-40 min):** Read the textbook, watch lectures
- **Computation (30-40 min):** Solve numerical problems
- **Proof (20-30 min):** Work on proof-based problems
- **Implementation (optional 30 min):** Code in Python/MATLAB

### Weekend Activities
- Build small projects (vector calculator, matrix solver)
- Watch 3Blue1Brown videos for visual intuition
- Solve challenging problems
- Review the week's material

---

## 💡 Learning Tips

1. **Visualize Everything:** Linear algebra is geometric - draw it!
2. **Prove Key Theorems:** Understanding proofs builds deep intuition
3. **Implement in Code:** Programming solidifies understanding
4. **Use Multiple Resources:** Different explanations help
5. **Work Many Problems:** Fluency requires practice
6. **Connect to Applications:** See why it matters
7. **Start Geometric, Then Abstract:** Intuition first, formalism later
8. **Review Regularly:** Spaced repetition is key

---

## 🔗 Integration with Programming

### Python Implementation
Use NumPy for all concepts:
- `numpy.array()` for vectors/matrices
- `numpy.dot()` for the dot product
- `numpy.linalg` for decompositions
- `numpy.linalg.eig()` for eigenvalues
- Visualize with matplotlib

### MATLAB/Octave
Industry standard for numerical linear algebra:
- Matrix operations built in
- Efficient computation
- Rich visualization
- Used in research and industry

### Applications
- Machine learning libraries (scikit-learn, TensorFlow)
- Graphics libraries (OpenGL matrices)
- Data science (pandas, scipy)
- Optimization (cvxpy, scipy.optimize)

---

## 🎯 Course Outline Detail

### Phase 1: Foundations (1.5-2 months)

#### Module 1.1: Vectors Basics
**Concepts:**
- Vectors in ℝⁿ
- Geometric representation
- Vector arithmetic
- Linear combinations

**Key Skills:**
- Visualize vectors in 2D/3D
- Add/subtract vectors geometrically and algebraically
- Compute scalar multiples
- Express vectors as linear combinations

**Problems:** 30-40 practice problems

---

#### Module 1.2: Dot Product & Vector Operations
**Concepts:**
- Dot product: a · b = Σ aᵢbᵢ
- Angle: cos(θ) = (a · b) / (||a|| ||b||)
- Orthogonality: a ⊥ b ⟺ a · b = 0
- Projection: proj_b(a) = ((a · b) / ||b||²) b

**Key Skills:**
- Compute dot products
- Find angles between vectors
- Test orthogonality
- Project vectors
- Compute cross products (3D)

**Problems:** 40-50 practice problems
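
To make these formulas concrete, here is a minimal NumPy sketch (the vectors `a` and `b` are arbitrary illustrative values):

```python
import numpy as np

a = np.array([3.0, 4.0, 0.0])
b = np.array([1.0, 0.0, 0.0])

# Dot product: a · b = Σ aᵢbᵢ
dot = np.dot(a, b)

# Angle: cos(θ) = (a · b) / (||a|| ||b||)
theta = np.arccos(dot / (np.linalg.norm(a) * np.linalg.norm(b)))

# Projection of a onto b: proj_b(a) = ((a · b) / ||b||²) b
proj = (dot / np.dot(b, b)) * b

print(dot, np.degrees(theta), proj)  # 3.0, ~53.13 degrees, [3. 0. 0.]
```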

---

#### Module 1.3: Matrices Basics
**Concepts:**
- Matrix as an array of numbers
- Matrix multiplication: (AB)ᵢⱼ = Σ AᵢₖBₖⱼ
- Identity matrix: AI = IA = A
- Transpose: (Aᵀ)ᵢⱼ = Aⱼᵢ

**Key Skills:**
- Add/subtract matrices
- Multiply matrices correctly
- Compute transposes
- Work with special matrices

**Problems:** 50-60 practice problems
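
A quick NumPy check of these identities (the matrices are arbitrary 2×2 examples):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])
I = np.eye(2)

# (AB)ᵢⱼ = Σ AᵢₖBₖⱼ  ('@' is NumPy's matrix-multiplication operator)
AB = A @ B

# Identity: AI = IA = A
assert np.allclose(A @ I, A) and np.allclose(I @ A, A)

# Transpose reverses the order of a product: (AB)ᵀ = BᵀAᵀ
assert np.allclose(AB.T, B.T @ A.T)

# Matrix multiplication is generally not commutative
print(np.allclose(A @ B, B @ A))  # False for this A and B
```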

---

### Phase 2: Core Theory (2-3 months)

#### Module 2.1: Systems of Linear Equations
**Concepts:**
- Ax = b representation
- Gaussian elimination algorithm
- Row operations
- Existence and uniqueness of solutions

**Key Skills:**
- Solve systems using Gaussian elimination
- Determine consistency
- Find all solutions (unique, infinite, none)
- Interpret geometrically

**Problems:** 40-50 systems to solve
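
For study purposes, here is a bare-bones sketch of Gaussian elimination with partial pivoting and back substitution. In practice you would call `numpy.linalg.solve`; this toy version assumes the system has a unique solution:

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float)
    b = b.astype(float)
    n = len(b)
    # Forward elimination: reduce A to upper-triangular (row echelon) form
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))  # pick the largest pivot
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0], [5.0, 3.0]])
b = np.array([4.0, 11.0])
print(gaussian_solve(A, b), np.linalg.solve(A, b))  # both [1. 2.]
```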

---

#### Module 2.6: Eigenvalues & Eigenvectors
**Concepts:**
- Av = λv (defining equation)
- det(A - λI) = 0 (characteristic equation)
- Eigenspace for each eigenvalue
- Diagonalization: A = PDP⁻¹

**Key Skills:**
- Find eigenvalues (solve the characteristic polynomial)
- Find eigenvectors (solve (A - λI)v = 0)
- Diagonalize matrices
- Apply to differential equations, Markov chains

**Problems:** 30-40 eigenvalue problems
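
A quick NumPy check of these definitions (the matrix is an arbitrary symmetric example; the order in which `eig` returns eigenvalues may vary):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Columns of P are eigenvectors; lam holds the matching eigenvalues
lam, P = np.linalg.eig(A)
print(lam)  # roots of det(A - λI) = λ² - 6λ + 8, i.e. 4 and 2

# Verify the defining equation Av = λv for each pair
for i in range(len(lam)):
    assert np.allclose(A @ P[:, i], lam[i] * P[:, i])

# Diagonalization: A = PDP⁻¹
D = np.diag(lam)
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```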

---

### Phase 3: Advanced Topics (1.5-2 months)

#### Module 3.1: Orthogonality
**Concepts:**
- Orthogonal sets
- Orthonormal bases
- Gram-Schmidt process
- QR decomposition: A = QR

**Key Skills:**
- Create orthonormal bases
- Apply Gram-Schmidt
- Compute QR decompositions
- Solve least squares: Ax ≈ b

**Problems:** 25-30 orthogonalization problems
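
For illustration, a bare-bones classical Gram-Schmidt sketch that orthonormalizes the columns of a matrix. It assumes the columns are linearly independent and is numerically fragile compared to `numpy.linalg.qr`, but it shows the idea:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]  # subtract earlier projections
        Q[:, j] = v / np.linalg.norm(v)         # normalize
    return Q

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(2))  # the columns are orthonormal
```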

---

#### Module 3.3: Matrix Decompositions
**Concepts:**
- SVD: A = UΣVᵀ
- Applications: image compression, recommender systems
- Spectral decomposition: A = QΛQᵀ
- Low-rank approximations

**Key Skills:**
- Compute the SVD
- Use the SVD for compression
- Apply to data analysis
- Solve ill-conditioned problems

**Problems:** 20-25 decomposition problems
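
A minimal sketch of a rank-k approximation via the SVD, which is the core of SVD-based image compression (the random matrix stands in for an image):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))  # stand-in for an image/data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-k approximation (Eckart-Young): keep the k largest singular values
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The spectral-norm error equals the first singular value that was dropped
err = np.linalg.norm(A - A_k, ord=2)
print(np.isclose(err, s[k]))  # True
```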

---

### Phase 4: Applications (1-2 months)

#### Module 4.1: Machine Learning
**Concepts:**
- Linear regression: θ = (XᵀX)⁻¹Xᵀy
- PCA: eigenvectors of the covariance matrix
- Neural networks: matrix multiplications
- Gradient descent: direction of steepest descent

**Key Skills:**
- Implement linear regression
- Apply PCA for dimensionality reduction
- Understand neural network math
- Optimize using gradients

**Projects:**
- Linear regression from scratch (see the sketch below)
- PCA implementation
- Simple neural network
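
A compact sketch of least-squares regression on made-up data points. It uses `numpy.linalg.lstsq`, which minimizes ||Xθ - y||² and is numerically safer than forming (XᵀX)⁻¹ explicitly, then cross-checks against the normal equations:

```python
import numpy as np

# Toy data: fit y = m·x + b
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 6.0])

# Design matrix with a column of ones for the intercept
X = np.column_stack([x, np.ones_like(x)])

# Least squares solution of Xθ ≈ y
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
m, b = theta
print(m, b)  # ≈ 1.6, 1.1 for these points

# Same answer via the normal equations XᵀXθ = Xᵀy (fine for small problems)
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)
assert np.allclose(theta, theta_ne)
```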

---

## 📊 Topics Coverage Matrix

### Computational Topics (60%)
- Matrix operations and arithmetic
- Solving linear systems
- Computing inverses and determinants
- Eigenvalue/eigenvector computation
- Matrix decompositions
- Numerical methods

### Theoretical Topics (25%)
- Vector space theory
- Linear transformations
- Basis and dimension
- Rank-nullity theorem
- Spectral theory
- Proofs and theorems

### Applied Topics (15%)
- Machine learning applications
- Computer graphics
- Optimization
- Data analysis
- Real-world problem solving

---

## 🎓 Recommended Study Paths

### Path A: Theory-First (Pure Mathematics Background)
1. Start with abstract definitions
2. Prove theorems rigorously
3. Then see applications
4. Focus on "Linear Algebra Done Right" by Axler

### Path B: Computation-First (Engineering/CS Background) ⭐ RECOMMENDED
1. Start with vectors and matrices
2. Learn through computation
3. Build geometric intuition
4. Then add rigor
5. Focus on Gilbert Strang's materials

### Path C: Application-Driven (Data Science/ML Focus)
1. Start with applications (ML, graphics)
2. Learn theory as needed
3. Heavy Python/NumPy usage
4. Focus on David Lay's book

---

## 📐 Mathematical Prerequisites

### Required (High School Level)
- Basic algebra (equations, polynomials)
- Functions and graphing
- Arithmetic operations

### Helpful But Not Required
- Calculus (for some applications)
- Proof techniques (for theoretical parts)
- Programming (for computational practice)

### Will Learn From Scratch
- All linear algebra concepts
- Mathematical notation
- Proof methods (as needed)
- Computational techniques

---

## 💻 Computational Tools

### Recommended: Python + NumPy
```python
import numpy as np
import matplotlib.pyplot as plt  # for visualizing vectors and transformations

# Create vectors
v = np.array([1, 2, 3])
w = np.array([4, 5, 6])

# Dot product
dot = np.dot(v, w)

# Matrix operations
A = np.array([[1, 2], [3, 4]])
B = np.linalg.inv(A)                 # Inverse
eigvals, eigvecs = np.linalg.eig(A)  # Eigenvalues and eigenvectors

# SVD
U, S, Vt = np.linalg.svd(A)
```

### Alternative: MATLAB/Octave
- Industry standard
- Built for matrix computation
- Rich toolboxes
- Used in academia and industry

### Visualization: 3Blue1Brown Style
- matplotlib (Python)
- manim (Python animation)
- GeoGebra (interactive)
- Desmos (2D graphing)

---

## 🏆 Success Metrics

### Knowledge Metrics
- Can solve 90% of computational problems correctly
- Can prove 70% of key theorems
- Understand the geometric meaning of all operations
- Can implement algorithms efficiently

### Application Metrics
- Build working ML models
- Implement graphics transformations
- Solve real-world optimization problems
- Use linear algebra in projects

### Fluency Metrics
- Solve standard problems in <5 minutes
- Recognize patterns quickly
- Choose appropriate methods
- Debug matrix computations

---

## 🔗 Next Steps

1. Review detailed modules in `01_KNOWLEDGE_GRAPH.md`
2. Assess your level in `02_INITIAL_ASSESSMENT.md`
3. Set up Python + NumPy or MATLAB
4. Watch 3Blue1Brown's series (11 videos)
5. Start Module 1.1: Vectors Basics

---

## 📊 Estimated Timeline by Background

### No Math Background (Starting Fresh)
- **Phase 1:** 2-3 months
- **Phase 2:** 3-4 months
- **Phase 3:** 2-3 months
- **Phase 4:** 2 months
- **Total:** 9-12 months to applications

### Some Math Background (Comfortable with Algebra)
- **Phase 1:** 1.5-2 months
- **Phase 2:** 2-3 months
- **Phase 3:** 1.5-2 months
- **Phase 4:** 1-2 months
- **Total:** 6-9 months to applications

### Strong Math Background (Calculus/Proofs)
- **Phase 1:** 3-4 weeks (rapid review)
- **Phase 2:** 2 months
- **Phase 3:** 1.5 months
- **Phase 4:** 1 month
- **Total:** 4-6 months to specialization

---

## 🌟 Why Learn Linear Algebra?

### Foundation for Advanced Fields
- **Machine Learning:** PCA, neural networks, optimization
- **Computer Graphics:** All transformations are matrices
- **Data Science:** Dimensionality reduction, analysis
- **Quantum Computing:** States are vectors, gates are matrices
- **Optimization:** All modern optimization uses linear algebra
- **Physics:** Quantum mechanics, relativity

### Practical Applications
- Build ML models
- Create graphics engines
- Analyze large datasets
- Solve engineering problems
- Understand modern AI

### Career Opportunities
- Data Scientist
- Machine Learning Engineer
- Computer Graphics Programmer
- Quantitative Analyst
- Research Scientist
- Robotics Engineer

---

**Remember:** Linear algebra is the language of modern mathematics, science, and technology. Master it and unlock countless opportunities! 📐🚀
709
learning_plans/linear_algebra/01_KNOWLEDGE_GRAPH.md
Normal file
@@ -0,0 +1,709 @@
# Linear Algebra Knowledge Graph - Complete Dependency Map

## 🌳 Knowledge Tree Structure

This document maps all Linear Algebra concepts with their dependencies and optimal learning order.

---

## Level 1: Foundation Concepts (No Prerequisites)

### 1.1 Basic Algebra Review
```
┌──────────────────────────────┐
│ Arithmetic Operations        │
│ - Addition, Subtraction      │
│ - Multiplication, Division   │
│ - Order of operations        │
└──────────────────────────────┘
    │
    ├─> Algebraic Expressions
    ├─> Solving Linear Equations
    ├─> Polynomials
    └─> Summation Notation (Σ)
```

### 1.2 Coordinate Systems
```
┌──────────────────────────────┐
│ Cartesian Coordinates        │
│ - 2D plane (x, y)            │
│ - 3D space (x, y, z)         │
│ - Points and plotting        │
└──────────────────────────────┘
    │
    ├─> Distance Formula
    ├─> Midpoint Formula
    └─> Graphing Functions
```

---

## Level 2: Vectors - Geometric (Requires Level 1)

### 2.1 Vector Basics
```
┌──────────────────────────────┐
│ Vector Definition            │
│ - Magnitude and Direction    │
│ - Component Form             │
│ - Position Vectors           │
└──────────────────────────────┘
    │
    ├─> Vector Notation (bold, arrow)
    ├─> ℝ² and ℝ³ vectors
    ├─> n-dimensional vectors (ℝⁿ)
    └─> Column vs Row Vectors

┌──────────────────────────────┐
│ Vector Visualization         │
│ - Geometric arrows           │
│ - Head and tail              │
│ - Parallel vectors           │
└──────────────────────────────┘
```

### 2.2 Vector Operations (Geometric)
```
┌──────────────────────────────┐
│ Vector Addition              │
│ - Parallelogram Rule         │
│ - Tip-to-tail Method         │
│ - Component-wise Addition    │
└──────────────────────────────┘
    │
    ├─> Vector Subtraction
    ├─> Scalar Multiplication (Scaling)
    ├─> Linear Combinations
    └─> Zero Vector

┌──────────────────────────────┐
│ Special Vectors              │
│ - Unit Vectors               │
│ - Standard Basis (i, j, k)   │
│ - Normalization              │
└──────────────────────────────┘
```

---

## Level 3: Vector Products (Requires Level 2)

### 3.1 Dot Product (Inner Product)
```
┌──────────────────────────────┐
│ Dot Product                  │
│ - a · b = Σ aᵢbᵢ             │
│ - a · b = ||a|| ||b|| cos θ  │
│ - Scalar result              │
└──────────────────────────────┘
    │
    ├─> Vector Length (Norm): ||v|| = √(v · v)
    ├─> Distance: ||a - b||
    ├─> Angle Between Vectors
    ├─> Orthogonality (a · b = 0)
    ├─> Vector Projection
    └─> Cauchy-Schwarz Inequality

┌──────────────────────────────┐
│ Properties of Dot Product    │
│ - Commutative: a · b = b · a │
│ - Distributive               │
│ - Linearity                  │
└──────────────────────────────┘
```

### 3.2 Cross Product (3D Only)
```
┌──────────────────────────────┐
│ Cross Product (a × b)        │
│ - Vector result              │
│ - Perpendicular to both      │
│ - Right-hand rule            │
└──────────────────────────────┘
    │
    ├─> Magnitude: ||a × b|| = ||a|| ||b|| sin θ
    ├─> Area of Parallelogram
    ├─> Determinant Form
    ├─> Anti-commutative: a × b = -(b × a)
    └─> Triple Scalar Product

┌──────────────────────────────┐
│ Applications                 │
│ - Normal vectors             │
│ - Torque calculations        │
│ - Area and volume            │
└──────────────────────────────┘
```

---

## Level 4: Matrices - Basics (Requires Level 2-3)

### 4.1 Matrix Fundamentals
```
┌──────────────────────────────┐
│ Matrix Definition            │
│ - m × n array of numbers     │
│ - Rows and columns           │
│ - Matrix indexing Aᵢⱼ        │
└──────────────────────────────┘
    │
    ├─> Matrix Addition/Subtraction
    ├─> Scalar Multiplication
    ├─> Transpose (Aᵀ)
    ├─> Special Matrices (I, O, Diagonal)
    └─> Matrix Equality
```

### 4.2 Matrix Multiplication
```
┌──────────────────────────────┐
│ Matrix Product               │
│ - (AB)ᵢⱼ = Σ AᵢₖBₖⱼ          │
│ - Dimension compatibility    │
│ - Non-commutative            │
└──────────────────────────────┘
    │
    ├─> Properties (Associative, Distributive)
    ├─> Identity: AI = IA = A
    ├─> Matrix Powers: A², A³, ...
    ├─> Matrix as Linear Transformation
    └─> Block Matrix Multiplication
```

---

## Level 5: Linear Systems (Requires Level 4)

### 5.1 Systems of Linear Equations
```
┌──────────────────────────────┐
│ System Representation        │
│ - Ax = b                     │
│ - Augmented Matrix [A|b]     │
│ - Coefficient Matrix         │
└──────────────────────────────┘
    │
    ├─> Gaussian Elimination
    ├─> Row Operations
    ├─> Row Echelon Form (REF)
    ├─> Reduced Row Echelon Form (RREF)
    └─> Back Substitution

┌──────────────────────────────┐
│ Solution Types               │
│ - Unique Solution            │
│ - Infinite Solutions         │
│ - No Solution                │
└──────────────────────────────┘
    │
    ├─> Consistency
    ├─> Homogeneous Systems
    ├─> Parametric Solutions
    └─> Geometric Interpretation
```

---

## Level 6: Matrix Inverses & Determinants (Requires Level 5)

### 6.1 Matrix Inverse
```
┌──────────────────────────────┐
│ Inverse Definition           │
│ - AA⁻¹ = A⁻¹A = I            │
│ - Exists iff det(A) ≠ 0      │
│ - Unique if exists           │
└──────────────────────────────┘
    │
    ├─> Computing Inverses (Gauss-Jordan)
    ├─> Inverse Properties: (AB)⁻¹ = B⁻¹A⁻¹
    ├─> Inverse and Transpose: (Aᵀ)⁻¹ = (A⁻¹)ᵀ
    ├─> Solving Systems: x = A⁻¹b
    └─> Invertible Matrix Theorem
```

### 6.2 Determinants
```
┌──────────────────────────────┐
│ Determinant                  │
│ - det(A) or |A|              │
│ - Scalar value               │
│ - Invertibility test         │
└──────────────────────────────┘
    │
    ├─> 2×2: ad - bc
    ├─> 3×3: Rule of Sarrus or Cofactor
    ├─> n×n: Cofactor Expansion
    ├─> Properties: det(AB) = det(A)det(B)
    ├─> det(Aᵀ) = det(A)
    ├─> Row Operations Effect
    ├─> Cramer's Rule
    └─> Geometric Meaning (Area/Volume)
```

---

## Level 7: Vector Spaces (Requires Level 2-6)

### 7.1 Abstract Vector Spaces
```
┌──────────────────────────────┐
│ Vector Space Definition      │
│ - 10 Axioms                  │
│ - Closure under + and ·      │
│ - Examples: ℝⁿ, Polynomials  │
└──────────────────────────────┘
    │
    ├─> Subspaces
    ├─> Span of Vectors
    ├─> Linear Independence
    ├─> Linear Dependence
    ├─> Basis
    └─> Dimension

┌──────────────────────────────┐
│ Important Subspaces          │
│ - Null Space (Kernel)        │
│ - Column Space (Range)       │
│ - Row Space                  │
│ - Left Null Space            │
└──────────────────────────────┘
    │
    ├─> Rank of Matrix
    ├─> Nullity of Matrix
    ├─> Rank-Nullity Theorem
    └─> Fundamental Theorem of Linear Algebra
```

### 7.2 Basis & Dimension
```
┌──────────────────────────────┐
│ Basis                        │
│ - Linearly independent       │
│ - Spans the space            │
│ - Minimum spanning set       │
└──────────────────────────────┘
    │
    ├─> Standard Basis
    ├─> Dimension = # basis vectors
    ├─> Change of Basis
    ├─> Coordinates Relative to Basis
    └─> Uniqueness of Dimension
```

---

## Level 8: Linear Transformations (Requires Level 7)

### 8.1 Linear Transformations
```
┌──────────────────────────────┐
│ Transformation T: V → W      │
│ - T(u + v) = T(u) + T(v)     │
│ - T(cv) = cT(v)              │
│ - Matrix representation      │
└──────────────────────────────┘
    │
    ├─> Kernel (Null Space): ker(T) = {v : T(v) = 0}
    ├─> Range (Image): range(T) = {T(v) : v ∈ V}
    ├─> Rank-Nullity Theorem
    ├─> One-to-one Transformations
    ├─> Onto Transformations
    └─> Isomorphisms

┌──────────────────────────────┐
│ Standard Transformations     │
│ - Rotation                   │
│ - Reflection                 │
│ - Projection                 │
│ - Scaling                    │
└──────────────────────────────┘
    │
    └─> Composition of Transformations
```

---

## Level 9: Eigenvalues & Eigenvectors (Requires Level 6-8)

### 9.1 Eigen-Theory
```
┌──────────────────────────────┐
│ Eigenvalue Problem           │
│ - Av = λv                    │
│ - Characteristic Polynomial  │
│ - det(A - λI) = 0            │
└──────────────────────────────┘
    │
    ├─> Computing Eigenvalues
    ├─> Computing Eigenvectors
    ├─> Eigenspace
    ├─> Algebraic Multiplicity
    ├─> Geometric Multiplicity
    └─> Diagonalization

┌──────────────────────────────┐
│ Diagonalization              │
│ - A = PDP⁻¹                  │
│ - D diagonal (eigenvalues)   │
│ - P columns (eigenvectors)   │
└──────────────────────────────┘
    │
    ├─> Diagonalizable Matrices
    ├─> Similar Matrices
    ├─> Powers: Aⁿ = PDⁿP⁻¹
    └─> Applications: Differential Equations, Markov Chains
```
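
The power formula above is easy to verify numerically, and it shows why diagonalization pays off: only the diagonal factor gets raised to a power, which is what makes, for example, long-run Markov chain computations cheap. A small sketch with an arbitrary diagonalizable matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Diagonalize: A = PDP⁻¹ with the eigenvalues on the diagonal of D
lam, P = np.linalg.eig(A)
P_inv = np.linalg.inv(P)

# Powers via the decomposition: Aⁿ = PDⁿP⁻¹
n = 5
A_n = P @ np.diag(lam**n) @ P_inv
assert np.allclose(A_n, np.linalg.matrix_power(A, n))
```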

---

## Level 10: Orthogonality (Requires Level 3, 7)

### 10.1 Orthogonal Sets
```
┌──────────────────────────────┐
│ Orthogonality                │
│ - v · w = 0                  │
│ - Perpendicular vectors      │
│ - Orthogonal sets            │
└──────────────────────────────┘
    │
    ├─> Orthogonal Basis
    ├─> Orthonormal Basis
    ├─> Orthogonal Complement
    └─> Orthogonal Decomposition

┌──────────────────────────────┐
│ Gram-Schmidt Process         │
│ - Orthogonalization          │
│ - Creates orthonormal basis  │
│ - QR Decomposition           │
└──────────────────────────────┘
    │
    └─> Applications: Least Squares, QR Algorithm
```

### 10.2 Orthogonal Matrices
```
┌──────────────────────────────┐
│ Orthogonal Matrix Q          │
│ - QᵀQ = QQᵀ = I              │
│ - Columns orthonormal        │
│ - Preserves lengths          │
└──────────────────────────────┘
    │
    ├─> Rotation Matrices
    ├─> Reflection Matrices
    ├─> det(Q) = ±1
    └─> Q⁻¹ = Qᵀ
```
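
These properties are easy to check on a concrete rotation matrix (the angle is arbitrary):

```python
import numpy as np

theta = np.pi / 6  # a 30° rotation of the plane
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# QᵀQ = I, so the inverse is simply the transpose
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(np.linalg.inv(Q), Q.T)

# Orthogonal matrices preserve lengths, and det(Q) = ±1
v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))
print(np.linalg.det(Q))  # 1.0 for a rotation (-1 for a reflection)
```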

---

## Level 11: Inner Product Spaces (Requires Level 3, 7, 10)

### 11.1 Inner Products
```
┌──────────────────────────────┐
│ Inner Product ⟨u, v⟩         │
│ - Generalizes dot product    │
│ - 4 Axioms                   │
│ - Induces norm & metric      │
└──────────────────────────────┘
    │
    ├─> Cauchy-Schwarz Inequality
    ├─> Triangle Inequality
    ├─> Parallelogram Law
    ├─> Pythagorean Theorem
    └─> Norm: ||v|| = √⟨v, v⟩

┌──────────────────────────────┐
│ Applications                 │
│ - Function spaces            │
│ - Polynomial inner products  │
│ - Weighted inner products    │
└──────────────────────────────┘
```

---

## Level 12: Matrix Decompositions (Requires Level 6, 9, 10)

### 12.1 LU Decomposition
```
┌──────────────────────────────┐
│ LU Factorization             │
│ - A = LU                     │
│ - L: Lower triangular        │
│ - U: Upper triangular        │
└──────────────────────────────┘
    │
    ├─> Existence Conditions
    ├─> Computing LU
    ├─> Solving Systems with LU
    ├─> Computational Efficiency
    └─> PLU (with Pivoting)
```

### 12.2 QR Decomposition
```
┌──────────────────────────────┐
│ QR Factorization             │
│ - A = QR                     │
│ - Q: Orthogonal              │
│ - R: Upper triangular        │
└──────────────────────────────┘
    │
    ├─> Gram-Schmidt Method
    ├─> Householder Reflections
    ├─> Givens Rotations
    ├─> Least Squares Solutions
    └─> QR Algorithm for Eigenvalues
```

### 12.3 Eigenvalue Decomposition (Spectral)
```
┌──────────────────────────────┐
│ Spectral Decomposition       │
│ - A = QΛQᵀ                   │
│ - Symmetric matrices         │
│ - Real eigenvalues           │
└──────────────────────────────┘
    │
    ├─> Orthogonal Eigenvectors
    ├─> Spectral Theorem
    ├─> Applications
    └─> Positive Definite Matrices
```

### 12.4 Singular Value Decomposition (SVD)
```
┌──────────────────────────────┐
│ SVD: A = UΣVᵀ                │
│ - U: Left singular vectors   │
│ - Σ: Singular values         │
│ - V: Right singular vectors  │
└──────────────────────────────┘
    │
    ├─> Always Exists (any matrix)
    ├─> Singular Values
    ├─> Relationship to Eigenvalues
    ├─> Pseudoinverse (A⁺)
    ├─> Low-rank Approximation
    ├─> Image Compression
    ├─> Data Analysis (PCA)
    └─> Recommender Systems
```

---

## Level 13: Advanced Theory (Requires Level 7-12)

### 13.1 Abstract Algebra Connections
```
┌──────────────────────────────┐
│ Algebraic Structures         │
│ - Groups                     │
│ - Rings                      │
│ - Fields                     │
└──────────────────────────────┘
    │
    ├─> Vector Space as Module
    ├─> Linear Algebra over Fields
    └─> Quotient Spaces
```

### 13.2 Norms & Metrics
```
┌──────────────────────────────┐
│ Vector Norms                 │
│ - L¹ norm: Σ|vᵢ|             │
│ - L² norm (Euclidean)        │
│ - L∞ norm: max|vᵢ|           │
│ - p-norms                    │
└──────────────────────────────┘
    │
    ├─> Matrix Norms
    ├─> Frobenius Norm
    ├─> Operator Norm
    ├─> Condition Number
    └─> Error Analysis

┌──────────────────────────────┐
│ Metric Spaces                │
│ - Distance Function          │
│ - Metric Properties          │
│ - Induced by Norms           │
└──────────────────────────────┘
```

---

## Level 14: Applications - Machine Learning (Requires All Previous)

### 14.1 ML Fundamentals
```
┌──────────────────────────────┐
│ Linear Regression            │
│ - Normal Equations           │
│ - θ = (XᵀX)⁻¹Xᵀy             │
│ - Least Squares              │
└──────────────────────────────┘
    │
    ├─> Gradient Descent
    ├─> Ridge Regression (L2)
    ├─> Lasso Regression (L1)
    └─> Regularization

┌──────────────────────────────┐
│ Dimensionality Reduction     │
│ - PCA (Principal Components) │
│ - SVD for PCA                │
│ - Explained Variance         │
└──────────────────────────────┘
    │
    ├─> Eigenfaces
    ├─> Feature Extraction
    ├─> Data Visualization
    └─> Compression
```
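
A compact sketch of PCA via the SVD of the centered data matrix (the random data stands in for a real dataset):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))  # 100 samples, 5 features (toy data)

# Center the data; the principal directions are then the rows of Vt
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top-k principal components
k = 2
scores = Xc @ Vt[:k].T  # shape (100, 2)

# Explained variance per component is σᵢ²/(n-1); report it as a fraction
explained = s**2 / (s**2).sum()
print(scores.shape, explained[:k].sum())
```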

### 14.2 Neural Networks
```
┌──────────────────────────────┐
│ Neural Network Math          │
│ - Forward Pass: y = Wx + b   │
│ - Backpropagation            │
│ - Gradient Computation       │
└──────────────────────────────┘
    │
    ├─> Weight Matrices
    ├─> Activation Functions
    ├─> Loss Gradients
    └─> Optimization (SGD, Adam)
```

---

## Level 15: Applications - Graphics (Requires Level 4, 8)

### 15.1 Geometric Transformations
```
┌──────────────────────────────┐
│ 2D Transformations           │
│ - Translation                │
│ - Rotation                   │
│ - Scaling                    │
│ - Shearing                   │
└──────────────────────────────┘
    │
    ├─> Homogeneous Coordinates
    ├─> Transformation Matrices
    ├─> Composition
    └─> Inverse Transformations

┌──────────────────────────────┐
│ 3D Graphics                  │
│ - 3D Rotations               │
│ - View Transformations       │
│ - Projection (Orthographic)  │
│ - Projection (Perspective)   │
└──────────────────────────────┘
    │
    ├─> Camera Matrices
    ├─> Model-View-Projection
    ├─> Quaternions
    └─> Euler Angles
```
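
To see why homogeneous coordinates matter: translation, which is not linear in ordinary coordinates, becomes a matrix too, so whole transform chains compose by matrix multiplication. A small sketch with illustrative values:

```python
import numpy as np

def translation(tx, ty):
    return np.array([[1.0, 0.0, tx],
                     [0.0, 1.0, ty],
                     [0.0, 0.0, 1.0]])

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Compose: rotate 90° about the origin, then translate by (2, 0).
# Transforms apply right-to-left, so M = T · R.
M = translation(2.0, 0.0) @ rotation(np.pi / 2)

p = np.array([1.0, 0.0, 1.0])  # the point (1, 0) in homogeneous coordinates
print(M @ p)  # ≈ [2. 1. 1.]: rotated to (0, 1), then shifted to (2, 1)
```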

---

## 🔗 Dependency Map Summary

### Critical Learning Path
```
Level 1 (Algebra Review)
    ↓
Level 2 (Vectors - Geometric)
    ↓
Level 3 (Vector Products)
    ↓
Level 4 (Matrices - Basics)
    ↓
Level 5 (Linear Systems)
    ↓
Level 6 (Inverses & Determinants)
    ↓
Level 7 (Vector Spaces) [Theoretical Branch]
    ↓
Level 8 (Linear Transformations)
    ↓
Level 9 (Eigenvalues) ←─────────┐
    ↓                           │
Level 10 (Orthogonality) ───────┤ Can parallelize
    ↓                           │
Level 11 (Inner Products) ──────┘
    ↓
Level 12 (Decompositions)
    ↓
Level 13 (Advanced Theory) ←────┐
    ↓                           │ Can parallelize
Level 14 (ML Applications) ─────┤
    ↓                           │
Level 15 (Graphics Applications)┘
```

### Parallel Learning Opportunities
- Levels 9, 10, and 11 can be learned in parallel after Level 8
- Level 13 (theory) can run in parallel with Levels 14-15 (applications)
- Applications (14-15) depend on decompositions but can be learned in any order

---

## 📊 Prerequisite Matrix

| Topic | Must Know First | Can Learn In Parallel |
|-------|----------------|----------------------|
| Dot Product | Vector basics | Cross product |
| Matrices | Vectors | - |
| Matrix Mult | Matrix basics | Transpose |
| Linear Systems | Matrix multiplication | - |
| Determinants | Matrix multiplication | Inverses |
| Inverses | Determinants, Systems | - |
| Vector Spaces | Linear systems, Span | - |
| Eigenvalues | Determinants, Vector spaces | - |
| Orthogonality | Dot product, Basis | Inner products |
| SVD | Eigenvalues, Orthogonality | - |
| PCA | SVD, Statistics basics | - |

---

## 🎯 Learning Strategies

### Geometric First, Then Abstract
1. Start with 2D/3D vectors (you can visualize them)
2. Build geometric intuition
3. Generalize to n dimensions
4. Then study the abstract theory
5. Apply to real problems

### Computation Supports Theory
1. Solve many numerical examples
2. Use Python/MATLAB to verify
3. See patterns emerge
4. Then learn why (proofs)
5. Deepen understanding

### Applications Motivate Learning
1. See where linear algebra is used
2. Understand why we need it
3. Learn concepts to solve problems
4. Apply immediately
5. Build projects

---

This knowledge graph ensures you build strong foundations before tackling abstract concepts and applications!
307
learning_plans/linear_algebra/02_INITIAL_ASSESSMENT.md
Normal file
@@ -0,0 +1,307 @@
# Linear Algebra Initial Assessment

## 🎯 Purpose

This assessment will help determine your current Linear Algebra proficiency level and create a personalized learning path.

## 📋 Assessment Structure

### Part 1: Self-Assessment Questionnaire
### Part 2: Computational Problems
### Part 3: Knowledge Gap Analysis

---

## Part 1: Self-Assessment Questionnaire

Rate yourself honestly (0-4):
- **Level 0:** Never heard of it
- **Level 1:** Basic awareness
- **Level 2:** Can use with reference
- **Level 3:** Proficient, confident
- **Level 4:** Expert, can teach

### Vectors
| Topic | Level (0-4) | Notes |
|-------|-------------|-------|
| Vector definition & notation | | |
| Vector addition/subtraction | | |
| Scalar multiplication | | |
| Dot product | | |
| Cross product (3D) | | |
| Vector magnitude/norm | | |
| Unit vectors | | |
| Orthogonal vectors | | |
| Vector projection | | |
| Linear combinations | | |

### Matrices
| Topic | Level (0-4) | Notes |
|-------|-------------|-------|
| Matrix definition | | |
| Matrix addition/subtraction | | |
| Matrix multiplication | | |
| Transpose | | |
| Identity matrix | | |
| Inverse matrix | | |
| Determinants | | |
| Special matrices (diagonal, symmetric) | | |

### Linear Systems
| Topic | Level (0-4) | Notes |
|-------|-------------|-------|
| Systems of linear equations | | |
| Gaussian elimination | | |
| Row echelon form | | |
| RREF | | |
| Solution types (unique, infinite, none) | | |
| Homogeneous systems | | |
| Augmented matrices | | |

### Vector Spaces
| Topic | Level (0-4) | Notes |
|-------|-------------|-------|
| Vector space definition | | |
| Subspaces | | |
| Span | | |
| Linear independence | | |
| Basis | | |
| Dimension | | |
| Null space | | |
| Column space | | |
| Rank | | |

### Eigenvalues & Decompositions
| Topic | Level (0-4) | Notes |
|-------|-------------|-------|
| Eigenvalues | | |
| Eigenvectors | | |
| Characteristic polynomial | | |
| Diagonalization | | |
| LU decomposition | | |
| QR decomposition | | |
| SVD | | |
| Orthogonalization (Gram-Schmidt) | | |

### Applications
| Topic | Level (0-4) | Notes |
|-------|-------------|-------|
| Linear regression | | |
| PCA | | |
| Graphics transformations | | |
| Least squares | | |
| Optimization | | |

---

## Part 2: Computational Problems

### Problem 1: Vector Operations (Beginner)
Given vectors u = [2, -1, 3] and v = [1, 4, -2]:

a) Compute u + v
b) Compute 3u - 2v
c) Compute ||u|| (magnitude)
d) Compute u · v (dot product)
e) Are u and v orthogonal?

**Can you solve this?** ☐ Yes ☐ No ☐ Partially

---

### Problem 2: Matrix Multiplication (Beginner)
Compute AB where:
```
A = [1 2]    B = [5 6]
    [3 4]        [7 8]
```

**Can you solve this?** ☐ Yes ☐ No ☐ With formula

---

### Problem 3: Solve Linear System (Intermediate)
Solve using Gaussian elimination:
```
 x + 2y -  z =  3
2x -  y +  z =  1
3x +  y + 2z = 11
```

**Can you solve this?** ☐ Yes ☐ No ☐ With steps

---

### Problem 4: Matrix Inverse (Intermediate)
Find the inverse of:
```
A = [2 1]
    [5 3]
```

**Can you solve this?** ☐ Yes ☐ No ☐ With formula

---

### Problem 5: Eigenvalues (Advanced)
Find eigenvalues and eigenvectors of:
```
A = [3 1]
    [1 3]
```

**Can you solve this?** ☐ Yes ☐ No ☐ With steps

---

### Problem 6: Application - Linear Regression (Advanced)
Given data points: (1,2), (2,4), (3,5), (4,6)

Set up and solve the least squares problem to find the best-fit line y = mx + b using matrix methods.

**Can you solve this?** ☐ Yes ☐ No ☐ Know concept only

---

## Part 3: Knowledge Gap Analysis

### Based on Self-Assessment

**Count your scores:**
- Topics at Level 0: ___
- Topics at Level 1: ___
- Topics at Level 2: ___
- Topics at Level 3: ___
- Topics at Level 4: ___

**Total topics:** ___

### Based on Problems

**Problems solved:**
- Problem 1 (Vectors): ☐
- Problem 2 (Matrix Mult): ☐
- Problem 3 (Systems): ☐
- Problem 4 (Inverse): ☐
- Problem 5 (Eigenvalues): ☐
- Problem 6 (Application): ☐

**Total solved:** ___ / 6

---

## 📊 Proficiency Level Determination

### Absolute Beginner (0-20% Level 2+, 0-1 problems)
- **Start:** Phase 1 from Module 1.1
- **Timeline:** 10-12 months to applications
- **Focus:** Build from scratch, emphasize geometric intuition
- **Resources:** 3Blue1Brown, Khan Academy, Strang's "Introduction to Linear Algebra"

### Beginner (20-40% Level 2+, 1-2 problems)
- **Start:** Phase 1 with quick review, focus on Phase 2
- **Timeline:** 8-10 months to applications
- **Focus:** Strengthen basics, master systems and inverses
- **Resources:** Gilbert Strang lectures, "Linear Algebra and Its Applications"

### Intermediate (40-60% Level 2+, 3-4 problems)
- **Start:** Phase 2, review Phase 1 as needed
- **Timeline:** 6-8 months to applications
- **Focus:** Vector spaces, eigenvalues, decompositions
- **Resources:** Strang's book, MIT OCW

### Advanced (60-80% Level 2+, 5 problems)
- **Start:** Phase 3, skim Phases 1-2
- **Timeline:** 4-6 months to specialization
- **Focus:** Advanced theory and applications
- **Resources:** "Matrix Analysis", research papers

### Expert (80%+ Level 3+, 6 problems)
- **Start:** Phases 4-5 (Applications & Specialization)
- **Timeline:** 2-4 months to deep specialization
- **Focus:** Specialized applications, cutting-edge topics
- **Resources:** Research papers, advanced texts

---

## 🎯 Personalized Learning Path

### Your Starting Point
**Based on assessment:** _______________

### Recommended Phase
**Start at Phase:** _______________

### Topics to Review First
1. _______________
2. _______________
3. _______________

### Topics to Skip (Already Mastered)
1. _______________
2. _______________

### Weak Areas to Focus On
1. _______________
2. _______________

### Estimated Timeline to Advanced
**From your starting point:** ___ months

---

## 📝 Action Items

### Immediate (This Week)
1. ☐ Complete this assessment
2. ☐ Set up Python + NumPy or MATLAB
3. ☐ Watch 3Blue1Brown: "Essence of Linear Algebra" (video 1)
4. ☐ Review the recommended phase in the Master Plan
5. ☐ Join math communities (r/learnmath, Math Stack Exchange)

### First Month
1. ☐ Complete ____ modules
2. ☐ Solve 100+ practice problems
3. ☐ Watch all 3Blue1Brown videos (11 total)
4. ☐ Implement basic operations in code
5. ☐ Take the first monthly exam

---

## 🔄 Reassessment Schedule

- **Week 4:** Quick progress check
- **Month 3:** Comprehensive reassessment
- **Month 6:** Mid-journey assessment
- **Month 9:** Full reassessment
- **Month 12:** Expert-level check

---

## 📚 Additional Resources

### Video Series
- **3Blue1Brown:** "Essence of Linear Algebra" (MUST WATCH)
- **MIT OCW:** Gilbert Strang's 18.06
- **Khan Academy:** Linear Algebra playlist

### Interactive Tools
- **GeoGebra:** Visualize vectors and transformations
- **WolframAlpha:** Compute anything
- **MATLAB/Octave:** Numerical experiments
- **Python + NumPy:** Programming practice

### Problem Sources
- MIT OCW problem sets
- Gilbert Strang's textbook exercises
- "Linear Algebra Done Right" exercises
- Math Stack Exchange

---

**Date Completed:** _______________
**Next Reassessment:** _______________
**Notes:**
_______________________________________________
_______________________________________________
378
learning_plans/linear_algebra/README.md
Normal file
@@ -0,0 +1,378 @@
|
||||
# Linear Algebra Learning Plan
|
||||
|
||||
## 📐 Welcome to Your Linear Algebra Mastery Journey!
|
||||
|
||||
This comprehensive learning plan will guide you from basic vectors to advanced applications in machine learning, computer graphics, and data science.
|
||||
|
||||
---

## 📚 What's Included

### 1. Master Plan (`00_LINEAR_ALGEBRA_MASTER_PLAN.md`)
Your complete roadmap containing:
- **22 detailed modules** organized in 5 phases
- **From geometric intuition to abstract theory**
- **Applications in ML, graphics, data science**
- **Resource recommendations** (textbooks, videos, tools)
- **Milestone achievements** with project ideas
- **Specialization paths** (ML, Graphics, Quantum, Computational)

### 2. Knowledge Graph (`01_KNOWLEDGE_GRAPH.md`)
Complete dependency map showing:
- **15 knowledge levels** from basics to expert
- **Topic dependencies** clearly mapped
- **Parallel learning opportunities**
- **Visual knowledge tree**
- **Critical learning path**

### 3. Initial Assessment (`02_INITIAL_ASSESSMENT.md`)
Determine your starting point with:
- **Self-assessment** covering 40+ topics
- **6 computational problems** (beginner to expert)
- **Proficiency level determination**
- **Personalized recommendations**

### 4. Assessments Directory (`assessments/`)
Track your exam performance:
- **Personalized assessments** after each exam
- **Strengths and weaknesses** identified
- **Progress tracking** over time

---

## 🎯 Learning Path Overview

### Phase 1: Foundations (1.5-2 months)
**Goal:** Master vectors and matrices
- Module 1.1: Vectors Basics (geometric)
- Module 1.2: Dot Product & Vector Operations
- Module 1.3: Matrices Basics
- Module 1.4: Matrix Properties

### Phase 2: Core Theory (2-3 months)
**Goal:** Master systems, decompositions, eigenvalues
- Module 2.1: Systems of Linear Equations
- Module 2.2: Matrix Inverses
- Module 2.3: Determinants
- Module 2.4: Vector Spaces
- Module 2.5: Linear Transformations
- Module 2.6: Eigenvalues & Eigenvectors

### Phase 3: Advanced Topics (1.5-2 months)
**Goal:** Master orthogonality and decompositions
- Module 3.1: Orthogonality
- Module 3.2: Inner Product Spaces
- Module 3.3: Matrix Decompositions (LU, QR, SVD)
- Module 3.4: Norms & Conditioning

### Phase 4: Applications (1-2 months)
**Goal:** Apply to real-world problems
- Module 4.1: Machine Learning (PCA, regression)
- Module 4.2: Computer Graphics (transformations)
- Module 4.3: Optimization
- Module 4.4: Data Science

### Phase 5: Specialization (Ongoing)
**Choose your path:**
- Machine Learning Deep Dive
- Computational Linear Algebra
- Quantum Computing
- Advanced Applications

---

## 🚀 Quick Start

### Step 1: Prerequisites (Optional, 1-2 days)
- Review basic algebra if rusty
- Set up Python + NumPy OR MATLAB
- Test with simple calculations (the snippets under Recommended Tools below are a good start)

### Step 2: Assessment (1-2 hours)
1. Open `02_INITIAL_ASSESSMENT.md`
2. Complete the self-assessment
3. Try the computational problems
4. Determine your level

### Step 3: Build Intuition (1 week)
1. **WATCH:** 3Blue1Brown's "Essence of Linear Algebra" (11 videos, ~3 hours total)
2. The series builds remarkable geometric intuition
3. Watch it before heavy studying!

### Step 4: Study (Daily)
1. Read theory (30-40 min)
2. Solve problems (30-40 min)
3. Prove theorems (20-30 min)
4. Code implementations (optional)

---

## 💻 Recommended Tools

### Python + NumPy (Recommended for Programmers)
```python
import numpy as np

# Vectors
v = np.array([1, 2, 3])
w = np.array([4, 5, 6])
dot = np.dot(v, w)        # Dot product
norm = np.linalg.norm(v)  # Magnitude

# Matrices
A = np.array([[1, 2], [3, 4]])
B = np.linalg.inv(A)                 # Inverse
det = np.linalg.det(A)               # Determinant
eigvals, eigvecs = np.linalg.eig(A)  # Eigenvalues and eigenvectors

# Solve systems
b = np.array([5, 6])       # Example right-hand side
x = np.linalg.solve(A, b)  # Solve Ax = b

# Decompositions
U, S, Vt = np.linalg.svd(A)  # SVD
Q, R = np.linalg.qr(A)       # QR
```

### MATLAB/Octave (Industry Standard)
```matlab
% Matrices are first-class citizens
A = [1 2; 3 4];
B = inv(A);       % Inverse
det_A = det(A);   % Determinant
[V, D] = eig(A);  % Eigenvectors and eigenvalues

% Solve systems
b = [5; 6];       % Example right-hand side
x = A \ b;        % Solve Ax = b

% Decompositions
[U, S, V] = svd(A);  % SVD
[Q, R] = qr(A);      % QR
```

---

## 📚 Essential Resources

### Must-Watch Videos
1. **3Blue1Brown: "Essence of Linear Algebra"** (11 videos)
   - BEST visual intuition
   - Watch FIRST before anything else
   - Free on YouTube

### Textbooks (In Order)
1. **"Introduction to Linear Algebra"** by Gilbert Strang
   - Best overall introduction
   - Clear explanations
   - Many applications

2. **"Linear Algebra and Its Applications"** by David Lay
   - Very accessible
   - Application-focused
   - Great for beginners

3. **"Linear Algebra Done Right"** by Sheldon Axler
   - More theoretical
   - Avoids determinants initially
   - Beautiful proofs

4. **"Matrix Analysis"** by Horn & Johnson
   - Advanced reference
   - Comprehensive
   - For deep study

### Online Courses
- **MIT OCW:** Gilbert Strang's 18.06 (legendary!)
- **Khan Academy:** Linear Algebra series
- **Brilliant.org:** Interactive problems

---

## 🏆 Key Milestones

### Milestone 1: Vector & Matrix Fluency ✅
- **Timing:** Month 2
- **Skills:** All vector/matrix operations
- **Project:** Vector/matrix library in Python
- **Test:** Solve 20 problems in 30 minutes

### Milestone 2: Systems Mastery ✅
- **Timing:** Month 4-5
- **Skills:** Solve any linear system, compute inverses
- **Project:** Linear equation solver
- **Test:** Pass comprehensive exam (75%+)

### Milestone 3: Eigenvalue Mastery ✅
- **Timing:** Month 6-7
- **Skills:** Eigenvalues, eigenvectors, diagonalization
- **Project:** Markov chain simulator (a starter sketch follows)
- **Test:** Pass advanced exam (70%+)
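
For the Milestone 3 project, one possible starting point is sketched below. The two-state transition matrix is invented for illustration; the steady state is the eigenvector for eigenvalue 1, normalized to sum to 1.

```python
import numpy as np

# Invented 2-state chain; each column sums to 1 (column-stochastic)
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmin(np.abs(eigvals - 1))  # index of the eigenvalue closest to 1
steady = np.real(eigvecs[:, k])
steady /= steady.sum()              # rescale into a probability vector
print(steady)                       # ~[0.833, 0.167]
```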

### Milestone 4: SVD & Applications ✅
- **Timing:** Month 8-9
- **Skills:** SVD, PCA, graphics transforms
- **Project:** Image compression or PCA implementation (see the rank-k sketch below)
- **Test:** Apply to real data
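
For the Milestone 4 project, the core of the image-compression idea fits in a few lines. In this sketch a random matrix stands in for a real grayscale image, and k = 10 is an arbitrary rank choice:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))  # stand-in for a grayscale image

U, S, Vt = np.linalg.svd(img, full_matrices=False)

k = 10  # keep only the 10 largest singular values
approx = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# Relative reconstruction error in the Frobenius norm
err = np.linalg.norm(img - approx) / np.linalg.norm(img)
print(f"rank-{k} approximation, relative error {err:.3f}")
```

Storing `U[:, :k]`, `S[:k]`, and `Vt[:k, :]` instead of the full matrix is where the compression comes from.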

### Milestone 5: Specialization ✅
- **Timing:** Month 10+
- **Skills:** Deep expertise in chosen area
- **Project:** ML model, graphics engine, or quantum algorithm
- **Certification:** Professional portfolio

---

## 💡 Linear Algebra Learning Tips

### Do's ✅
- **Visualize everything** - Draw vectors and transformations
- **Use 3Blue1Brown** - Best intuition builder
- **Solve many problems** - Fluency requires practice
- **Implement in code** - Programming solidifies understanding
- **Prove key theorems** - Understand WHY, not just HOW
- **Connect to applications** - See real-world relevance
- **Start geometric** - Intuition before abstraction

### Don'ts ❌
- Don't memorize formulas without understanding
- Don't skip geometric interpretation
- Don't avoid proofs entirely
- Don't neglect computational practice
- Don't rush through fundamentals
- Don't study symbols in isolation (pair them with visualizations)

---

## 🎯 Why Learn Linear Algebra?

### Foundation for Modern Tech
- **Machine Learning:** PCA, neural networks, optimization
- **Computer Graphics:** ALL transformations are matrices
- **Data Science:** Dimensionality reduction, analysis
- **Quantum Computing:** Quantum states are vectors
- **Computer Vision:** Image processing, feature extraction
- **Natural Language Processing:** Word embeddings, transformers

### Real Applications
- Netflix recommendations (SVD, matrix factorization)
- Google PageRank (eigenvectors of the web graph)
- Face recognition (eigenfaces, PCA)
- 3D video games (transformation matrices)
- Self-driving cars (sensor fusion, optimization)
- ChatGPT/LLMs (attention is built from matrix multiplications!)

### Career Impact
- Required for ML engineer roles
- Essential for data science
- Critical for graphics programming
- Foundation for AI research
- Needed for quantitative finance

---

## 📊 Study Schedules

### Full-Time (3-4 hours/day)
- **Timeline:** 5-6 months to applications
- **Daily:** 1 hour theory + 1-2 hours problems + 1 hour coding
- **Projects:** 1-2 per week
- **Pace:** 1 module per week

### Part-Time (1.5-2 hours/day)
- **Timeline:** 8-10 months to applications
- **Daily:** 40 min theory + 40 min problems + 20 min review
- **Projects:** 1 per week
- **Pace:** 1 module per 1.5-2 weeks

### Casual (1 hour/day)
- **Timeline:** 12-15 months to applications
- **Daily:** 30 min theory + 30 min problems
- **Projects:** 2 per month
- **Pace:** 1 module per 2-3 weeks

---

## 🎓 Integration with Tech Learning

### Python Integration
Use NumPy to implement all concepts (a small transformation example follows this list):
- Vectors and matrices
- Linear transformations
- Eigenvalue computation
- SVD and PCA
- ML applications
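
As one concrete instance of the "linear transformations" item, the sketch below applies a 2D rotation with plain matrix-vector multiplication; the angle and input vector are arbitrary examples:

```python
import numpy as np

theta = np.pi / 2  # rotate 90 degrees counterclockwise
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
print(R @ v)  # ~[0, 1]: the x-axis unit vector lands on the y-axis
```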

### C++ Integration
Implement for performance:
- Matrix libraries
- Graphics transformations
- Game engine math
- Scientific computing

### Machine Learning
Linear algebra is EVERYWHERE:
- Data representation
- Model parameters
- Forward/backward pass
- Optimization
- Dimensionality reduction

---

## 🌟 What Makes This Plan Special

### Visual & Intuitive
- Emphasizes geometric understanding
- 3Blue1Brown integration
- Visualization tools
- Draw everything!

### Computation & Theory Balanced
- 60% computational practice
- 25% theoretical understanding
- 15% applications
- Learn by doing AND understanding

### Application-Driven
- See real uses immediately
- Build actual projects
- Connect to ML, graphics, data science
- Not just abstract math

### Modern & Practical
- Python/NumPy focus
- Industry-relevant skills
- Modern applications (ML, AI)
- Cutting-edge topics

---

## 🎯 Your Next Steps

1. ☐ Read this README
2. ☐ **WATCH:** 3Blue1Brown videos 1-3 (build intuition!)
3. ☐ Complete `02_INITIAL_ASSESSMENT.md`
4. ☐ Review `00_LINEAR_ALGEBRA_MASTER_PLAN.md`
5. ☐ Check `01_KNOWLEDGE_GRAPH.md` for dependencies
6. ☐ Set up NumPy or MATLAB
7. ☐ Start Module 1.1!

---

## 🌟 Inspiration

*"Linear algebra is the mathematics of data."*
— Gilbert Strang

*"You can't do machine learning without linear algebra."*
— Every ML engineer

*"The more I learn about linear algebra, the more I realize it's everywhere."*
— You, after completing this course!

---

**Linear algebra is the foundation of modern technology. Master it and unlock AI, graphics, data science, and more! 📐🚀**

**Last Updated:** October 21, 2025
**Status:** ✅ Complete learning plan
**Next Review:** January 2026

81
learning_plans/linear_algebra/assessments/README.md
Normal file
@@ -0,0 +1,81 @@

# Linear Algebra Assessments Directory

## 📁 Purpose

This directory contains all your personalized Linear Algebra exam assessments and performance reviews.

---

## 📊 What's Stored Here

### Exam Result Assessments
- Detailed analysis of your exam performance
- Problem-by-problem breakdown
- Strengths and weaknesses identified
- Personalized study recommendations
- Progress tracking over time

### Assessment Format
**Filename:** `howard_linear_algebra_{exam_id}_assessment.md`
**Example:** `howard_linear_algebra_basics_v1_assessment.md`

---

## 📝 Future Assessments

As you take Linear Algebra exams, this folder will contain:
- Foundations exam assessments
- Core theory exam assessments
- Advanced topics exam assessments
- Applications exam assessments
- Retake assessments showing improvement

---

## 🎯 How to Use These Assessments

### After Each Exam
1. Review the assessment file
2. Identify your strengths (celebrate!)
3. Note areas for improvement
4. Follow the recommended study plan
5. Track progress over time

### For Progress Tracking
- Compare assessments over time
- See improvement in weak areas
- Verify mastery before advancing
- Celebrate milestones

### For Study Planning
- Use weakness identification for focused study
- Follow recommended action plans
- Prioritize high-impact topics
- Optimize learning time

---

## 🔗 Integration with Learning Plan

Assessments directly reference:
- **Master Plan:** `/learning_plans/linear_algebra/00_LINEAR_ALGEBRA_MASTER_PLAN.md`
- **Knowledge Graph:** `/learning_plans/linear_algebra/01_KNOWLEDGE_GRAPH.md`
- **Initial Assessment:** `/learning_plans/linear_algebra/02_INITIAL_ASSESSMENT.md`

---

## 📊 Expected Contents Over Time

```
assessments/
├── README.md (this file)
├── howard_linear_algebra_foundations_v1_assessment.md (future)
├── howard_linear_algebra_intermediate_v1_assessment.md (future)
├── howard_linear_algebra_eigenvalues_v1_assessment.md (future)
├── howard_linear_algebra_applications_v1_assessment.md (future)
└── progress_summary.md (coming soon)
```

---

**Keep all your Linear Algebra assessments here for comprehensive progress tracking!** 📐✨