EEL 6825: Pattern Recognition and Intelligent Systems - 5-Week Course Prep
This is a strategic plan for preparing for EEL 6825: Pattern Recognition and Intelligent Systems, a course that sits at the intersection of statistics, machine learning, signal processing, and AI.
The plan is heavy on math and modeling, with applications in robotics, vision, speech, and bioinformatics.
🗓️ Study Plan Overview
- Week 1 (July 20–27): Introductory Foundations
- Weeks 2–3 (July 28–Aug 10): Intermediate Core
- Weeks 4–5 (Aug 11–Aug 24): Advanced Prep for Course Readiness
🎯 GOAL: Build strong grounding in the math behind pattern recognition.
🔢 1. Statistics & Probability Refresher
Key Topics:
- Bayes’ Theorem
- Conditional & joint probability
- PDFs, PMFs
- Expectation, variance, covariance
Practice:
- Solve Bayes classification problems on paper
- Simulate with Python: sample from `numpy.random` and compute empirical distributions
```python
# Basic class probability simulation: draw labels according to the priors
import numpy as np

# Assume class priors
priors = [0.3, 0.7]
samples = np.random.choice([0, 1], size=1000, p=priors)
print("Class 0:", np.sum(samples == 0))
print("Class 1:", np.sum(samples == 1))
```
📚 2. Linear Algebra Refresher
Key Topics:
- Vectors, dot products, matrix multiplication
- Eigenvectors/eigenvalues
- Orthogonality, projection
Practice:
- Use `numpy` or SymPy to find eigenvalues and singular values (see the sketch below)
- Visualize 2D vector projections
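A quick `numpy` sketch covering the practice items above (the matrix and vectors are arbitrary examples):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigen-decomposition (A is symmetric, so eigenvectors are orthogonal)
eigvals, eigvecs = np.linalg.eigh(A)
print("eigenvalues:", eigvals)

# Singular values via SVD
print("singular values:", np.linalg.svd(A, compute_uv=False))

# Projection of v onto u: proj_u(v) = (u·v / u·u) u
u = np.array([1.0, 0.0])
v = np.array([3.0, 4.0])
print("projection of v onto u:", (u @ v) / (u @ u) * u)
```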
✍️ 3. Basic Pattern Recognition Concepts
Key Topics:
- What is a pattern?
- Types of classifiers:
- Supervised vs. unsupervised learning
- Nearest neighbor and the (generative) Bayes classifier (see the 1-NN sketch below)
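As a first taste of classification, here is a minimal 1-nearest-neighbor sketch on a made-up 2D dataset (all points and labels are arbitrary):

```python
import numpy as np

def nn_classify(x, X_train, y_train):
    """Label x with the class of its nearest training point (1-NN)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

# Toy 2D data: class 0 near the origin, class 1 near (3, 3)
X_train = np.array([[0.0, 0.2], [0.5, -0.1], [3.0, 3.1], [2.8, 3.3]])
y_train = np.array([0, 0, 1, 1])
print(nn_classify(np.array([2.5, 2.5]), X_train, y_train))  # -> 1
```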
Read:
- Duda, Hart & Stork, Pattern Classification, Ch. 1–2 (classic)
- Or Bishop’s Pattern Recognition and Machine Learning, Ch. 1–2
🧠 WEEK 1 Outcomes:
Be able to:
- Classify data using basic rules (like Bayes or nearest neighbor)
- Interpret variance/covariance and apply probability to classification problems
📌 1. Bayesian Decision Theory
Key Topics:
- Loss function, risk minimization
- Classifiers under known distributions
- MAP (Maximum A Posteriori) and ML (Maximum Likelihood) rules
Practice:
- Derive classifiers given Gaussian class distributions
- Apply the MAP rule to a toy dataset (see the sketch below)
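A minimal sketch of the MAP rule with 1D Gaussian class conditionals; the means, standard deviations, and priors are assumed values for illustration:

```python
# MAP: pick the class maximizing P(x | c) * P(c); the evidence P(x) cancels.
from scipy.stats import norm

priors = {0: 0.3, 1: 0.7}
params = {0: (0.0, 1.0), 1: (2.0, 1.5)}  # (mean, std) per class, assumed

def map_classify(x):
    scores = {c: norm.pdf(x, *params[c]) * priors[c] for c in priors}
    return max(scores, key=scores.get)

print(map_classify(0.0))  # -> 0 (deep in class 0's region)
print(map_classify(2.5))  # -> 1
```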
📌 2. Parametric vs Nonparametric Models
Key Topics:
- Parametric: assume a distribution family (e.g., Gaussian)
- Nonparametric: k-NN, Parzen windows
Practice:
- Build a 2D Gaussian classifier in Python
- Compare with a k-NN model using `scikit-learn` (see the sketch below)
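One possible comparison, using `GaussianNB` as a simple Gaussian (generative) classifier against k-NN on synthetic blobs (all dataset parameters are arbitrary choices):

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Synthetic 2D data with some class overlap
X, y = make_blobs(n_samples=400, centers=2, cluster_std=2.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for model in (GaussianNB(), KNeighborsClassifier(n_neighbors=5)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, model.score(X_te, y_te))
```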
📌 3. Dimensionality Reduction
Key Topics:
- PCA (Principal Component Analysis)
- LDA (Linear Discriminant Analysis)
Practice:
- Use `sklearn.decomposition.PCA` to reduce data to 2D and visualize it (see the sketch below)
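A minimal sketch using the iris dataset as a convenient example (any labeled dataset would do):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

X, y = load_iris(return_X_y=True)
Z = PCA(n_components=2).fit_transform(X)  # project 4D features onto 2 PCs

plt.scatter(Z[:, 0], Z[:, 1], c=y)
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.show()
```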
📌 4. Feature Extraction
Key Topics:
- Hand-crafted (manual) vs. learned features
- Why good features often matter more than the choice of model
🧠 WEEKS 2–3 Outcomes:
Be able to:
- Derive a MAP classifier
- Reduce dimensions and explain decision boundaries
- Build and test basic classifiers on real data
🤖 1. Intro to Learning Algorithms
Key Topics:
- Logistic Regression
- Perceptron Algorithm
- Support Vector Machines (SVMs)
Practice:
- Implement the perceptron or logistic regression from scratch (see the sketch below)
- Use `sklearn.svm.SVC` on linearly separable data
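A minimal from-scratch perceptron sketch on a toy separable dataset (learning rate and epoch count are arbitrary):

```python
import numpy as np

def perceptron(X, y, epochs=20, lr=1.0):
    """Classic perceptron: y in {-1, +1}; assumes linearly separable data."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified -> update
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy separable data
X = np.array([[1.0, 1.0], [2.0, 1.5], [-1.0, -1.0], [-2.0, -1.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
print(np.sign(X @ w + b))  # should match y
```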
🧠 2. Neural Networks as Pattern Recognizers
Key Topics:
- Single-layer vs multi-layer perceptrons
- Activation functions
- Training with gradient descent
Practice:
- Use PyTorch or TensorFlow to build a small NN on MNIST
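A minimal PyTorch sketch; it trains on random "MNIST-shaped" tensors so it runs without a download (swap in `torchvision.datasets.MNIST` for the real thing):

```python
import torch
import torch.nn as nn

# A small multi-layer perceptron: 784 -> 128 -> 10
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

X = torch.randn(64, 1, 28, 28)   # fake batch of images
y = torch.randint(0, 10, (64,))  # fake labels
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(5):            # a few gradient-descent steps
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    print(step, loss.item())
```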
🧠 3. Unsupervised Learning
Key Topics:
- k-means clustering
- Gaussian Mixture Models (GMMs)
- Expectation Maximization (EM)
Practice:
- Compare k-means vs. GMM on the same dataset (see the sketch below)
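A minimal sketch comparing the two on the same synthetic blobs; since cluster IDs are arbitrary, agreement is measured with the adjusted Rand index rather than raw label equality:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score
from sklearn.mixture import GaussianMixture

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
gmm_labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)

print("agreement (ARI):", adjusted_rand_score(km_labels, gmm_labels))
```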
🧠 4. Evaluation Metrics
Key Topics:
- Confusion matrix, ROC curve, F1-score
- Cross-validation
Practice:
- Evaluate models using `sklearn.metrics` (see the sketch below)
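A minimal sketch; the labels and scores below are made-up values purely to show the API:

```python
from sklearn.metrics import confusion_matrix, f1_score, roc_auc_score

y_true = [0, 0, 1, 1, 1, 0, 1, 0]   # assumed ground truth
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]   # assumed model predictions
scores = [0.2, 0.6, 0.9, 0.8, 0.4, 0.1, 0.7, 0.3]  # assumed P(class 1)

print(confusion_matrix(y_true, y_pred))
print("F1:", f1_score(y_true, y_pred))
print("ROC AUC:", roc_auc_score(y_true, scores))
```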
🧠 WEEKS 4–5 Outcomes:
Be able to:
- Implement a classifier from scratch
- Interpret confusion matrices
- Explain the tradeoffs between discriminative and generative models
📅 Target by August 25:
- ✅ Understand theory of Bayes, MAP, ML classifiers
- ✅ Know how and why PCA and LDA are used
- ✅ Comfortable reading matrix derivations
- ✅ Can read academic papers in the field
- ✅ Know the why behind learning models, not just how
📘 Reading Resources
- Textbook: Bishop, Pattern Recognition and Machine Learning
- Textbook: Duda, Hart & Stork, Pattern Classification: More formal, foundational
- Online Course: MIT OCW: Machine Learning (Tommi Jaakkola, Reg. Ng)
- YouTube: “StatQuest” (great for Bayes, PCA, etc.)