C S 12B: DEEP LEARNING
Foothill College Course Outline of Record
| Heading | Value |
|---|---|
| Effective Term: | Winter 2026 |
| Units: | 4.5 |
| Hours: | 4 lecture, 2 laboratory per week (72 total per quarter) |
| Prerequisite: | C S 12A. |
| Degree & Credit Status: | Degree-Applicable Credit Course |
| Foothill GE: | Non-GE |
| Transferable: | CSU/UC |
| Grade Type: | Letter Grade (Request for Pass/No Pass) |
| Repeatability: | Not Repeatable |
Student Learning Outcomes
- Explain different optimization strategies and identify appropriate methods for specific tasks
- Recognize the potential for bias and analyze the ethical implications of deep learning applications in different domains
- Understand the need for explainability in certain domains and the challenges of interpreting increasingly complex networks
- Use Python packages to train and evaluate deep learning models, including feed-forward, convolutional, and recurrent neural networks
Description
Introduces the theory and practice of deep learning. Students apply selected concepts from linear algebra and calculus, then use publicly available Python packages to build, train, evaluate, and optimize feed-forward, convolutional, and recurrent neural networks, with additional study of generative models and the ethical and societal implications of deep learning applications.
Course Objectives
The student will be able to:
- Demonstrate foundational knowledge of machine learning principles
- Apply selected concepts from linear algebra and calculus to deep learning
- Create, train, and evaluate a feed-forward network using publicly available packages
- Identify techniques for optimizing models
- Create, train, and evaluate a recurrent neural network using publicly available packages
- Create, train, and evaluate a convolutional neural network using publicly available packages
- Explain the concept and use of generative models
- Articulate paths for future exploration of deep learning
- Recognize deep learning as a tool that can reduce or amplify problems in society
Course Content
- Machine learning foundations
- Training and evaluation
- Bias-variance tradeoff
- Logistic regression
- Perceptrons
- Concepts from linear algebra and calculus
- Vector and matrix operations
- Gradient and partial derivatives
- Chain rule for backpropagation
- Feed-forward networks
- Architectural components
- Forward propagation
- Loss and cost functions
- Backpropagation and gradient descent
- Learning rate and convergence/divergence
- Optimizations
- Hyperparameter tuning
- Stochastic gradient descent and mini-batching
- Regularization and dropout
- Weight initialization
- Batch normalization
- Recurrent neural networks
- Types of sequential data
- Simple recurrent architecture
- Vanishing/exploding gradients
- Long short-term memory networks
- Gated recurrent units
- Bidirectional RNNs
- Convolutional neural networks (CNNs)
- Computer vision applications
- Convolutional filters
- Activation maps
- Pooling layers
- Generative models
- Discriminative vs. generative tasks
- Generative adversarial networks
- Variational autoencoders
- Additional topics for exploration
- Transfer learning
- Deep reinforcement learning
- Self-supervised learning
- End-to-end learning
- Ethics and explainability
Lab Content
- Environment setup
- Navigating Jupyter notebooks or chosen environment
- Running code cells, troubleshooting common errors
- Installing and importing DL frameworks (e.g., PyTorch or TensorFlow)
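For illustration, a minimal environment check of the kind this setup lab might end with, assuming PyTorch as the chosen framework (a TensorFlow check would be analogous):

```python
# Sanity-check the deep learning environment (assumes `pip install torch`).
import torch

print(torch.__version__)                     # confirm the framework imports
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Training will run on: {device}")     # GPU if one is visible, else CPU
x = torch.randn(3, 3, device=device)         # small sanity-check tensor
print(x @ x.T)                               # a matrix multiply runs without error
```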
- Implementing basic neural networks
- Building a single-layer perceptron and training on a small dataset
- Adding hidden layers to form an MLP
- Evaluating performance with basic metrics
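A minimal sketch of the perceptron-to-MLP progression above, assuming PyTorch and a synthetic dataset (layer sizes and epoch count are illustrative):

```python
# A single hidden layer turns the perceptron into an MLP.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(200, 4)                      # 200 samples, 4 features (synthetic)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)  # toy labels: sign of the feature sum

model = nn.Sequential(
    nn.Linear(4, 16),                        # hidden layer added to the perceptron
    nn.ReLU(),
    nn.Linear(16, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()                          # backpropagation
    optimizer.step()                         # gradient descent update

with torch.no_grad():                        # basic evaluation metric: accuracy
    acc = ((model(X) > 0).float() == y).float().mean().item()
print(f"final loss {loss.item():.3f}, training accuracy {acc:.2%}")
```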
- Hyperparameter experiments
- Adjusting learning rate, batch size, and hidden units
- Tracking accuracy and loss over epochs
- Using a validation set for early stopping
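One way the learning-rate experiment and early stopping might look, sketched with PyTorch on synthetic data (the patience threshold of 10 epochs is an illustrative choice):

```python
# Sweep the learning rate; stop each run when validation loss plateaus.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(300, 4)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)
X_train, y_train, X_val, y_val = X[:240], y[:240], X[240:], y[240:]

for lr in (1.0, 0.1, 0.01):                  # the hyperparameter under study
    model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    best_val, patience = float("inf"), 0
    for epoch in range(200):
        optimizer.zero_grad()
        loss_fn(model(X_train), y_train).backward()
        optimizer.step()
        with torch.no_grad():
            val_loss = loss_fn(model(X_val), y_val).item()
        if val_loss < best_val - 1e-4:       # meaningful improvement resets patience
            best_val, patience = val_loss, 0
        else:
            patience += 1
        if patience >= 10:                   # early stopping
            break
    print(f"lr={lr}: stopped at epoch {epoch}, best val loss {best_val:.3f}")
```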
- Advanced training techniques
- Adding batch normalization and experimenting with different optimizers
- Trying different activation functions (ReLU, LeakyReLU)
- Comparing training stability and convergence speed
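A sketch of one such comparison, assuming PyTorch; the hidden width and step count are illustrative:

```python
# Same architecture (with batch norm and LeakyReLU), two optimizers.
import torch
from torch import nn

def make_model():
    return nn.Sequential(
        nn.Linear(4, 32),
        nn.BatchNorm1d(32),                  # normalize hidden activations
        nn.LeakyReLU(),
        nn.Linear(32, 1),
    )

torch.manual_seed(0)
X = torch.randn(256, 4)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)
loss_fn = nn.BCEWithLogitsLoss()

for name, opt_cls in (("SGD", torch.optim.SGD), ("Adam", torch.optim.Adam)):
    model = make_model()
    optimizer = opt_cls(model.parameters(), lr=1e-2)
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()
    print(f"{name}: final training loss {loss.item():.4f}")  # compare convergence
```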
- Working with CNNs
- Implementing a simple CNN for image classification
- Inspecting feature maps
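A small CNN sketch for this lab, assuming 1×28×28 grayscale inputs (MNIST-sized) and ten classes; the filter counts are illustrative:

```python
import torch
from torch import nn

cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 8 learned convolutional filters
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 28x28 -> 14x14
    nn.Conv2d(8, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                   # logits for 10 classes
)

images = torch.randn(4, 1, 28, 28)               # a dummy batch of images
print(cnn(images).shape)                         # torch.Size([4, 10])

# Inspect activation (feature) maps by running only the first conv layer.
with torch.no_grad():
    maps = cnn[0](images)
print(maps.shape)                                # torch.Size([4, 8, 28, 28])
```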
- Transfer learning
- Loading a pre-trained model
- Fine-tuning on a custom dataset
- Observing improvements in training time and accuracy
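A transfer-learning sketch, assuming recent torchvision and a hypothetical five-class custom dataset: freeze the pre-trained backbone and train only a new classification head.

```python
import torch
from torch import nn
from torchvision import models

# Load ImageNet-pre-trained weights (torchvision >= 0.13 API).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False              # freeze the pre-trained backbone

num_classes = 5                              # hypothetical custom dataset
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head

# Only the head's parameters go to the optimizer, so each epoch is far
# cheaper than training from scratch.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```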
- Sequence models
- Training a simple RNN or LSTM for time-series forecasting
- Adjusting sequence length and analyzing results
- Comparing RNN outputs to feed-forward network predictions
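An LSTM forecasting sketch for this lab, using a synthetic sine wave; the 30-step window is the sequence-length knob the lab asks students to adjust:

```python
import torch
from torch import nn

torch.manual_seed(0)
series = torch.sin(torch.linspace(0, 20, 500))   # synthetic time series
seq_len = 30                                     # window fed to the RNN

# Build (window, next value) training pairs from the series.
X = torch.stack([series[i:i + seq_len] for i in range(len(series) - seq_len)])
y = series[seq_len:].unsqueeze(1)
X = X.unsqueeze(-1)                              # (batch, seq_len, features=1)

class Forecaster(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=32, batch_first=True)
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        out, _ = self.lstm(x)                    # out: (batch, seq_len, hidden)
        return self.head(out[:, -1])             # predict from the last time step

model = Forecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
print(f"final MSE {loss.item():.5f}")
```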
- Exploring transformers
- Running inference with a pre-trained transformer model for classification
- Observing improvements over RNN-based models
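Inference with a pre-trained transformer can be as short as the sketch below, assuming the Hugging Face transformers library is installed; the default model downloaded and the score shown are illustrative:

```python
# `pip install transformers` provides the pipeline API used here.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default fine-tuned model
print(classifier("This lab was easier than training an RNN from scratch."))
# e.g., [{'label': 'POSITIVE', 'score': 0.99...}]
```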
- Generative modeling
- Implementing a simple GAN
- Examining generated samples and adjusting training steps
- Noting stability issues in GAN training
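A minimal GAN sketch on one-dimensional toy data (real samples drawn from a mean-4 Gaussian); network sizes, learning rates, and step count are illustrative, and even this small example can show the stability issues the lab notes:

```python
import torch
from torch import nn

torch.manual_seed(0)
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) + 4.0          # real data: mean-4 Gaussian
    fake = G(torch.randn(64, 8))             # generator maps noise to samples
    # Discriminator step: label real as 1, fake as 0.
    opt_d.zero_grad()
    d_loss = (loss_fn(D(real), torch.ones(64, 1))
              + loss_fn(D(fake.detach()), torch.zeros(64, 1)))
    d_loss.backward()
    opt_d.step()
    # Generator step: try to make the discriminator output 1 on fakes.
    opt_g.zero_grad()
    g_loss = loss_fn(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

with torch.no_grad():
    mean = G(torch.randn(1000, 8)).mean().item()
print(f"generated mean {mean:.2f} (target ~4.0)")
```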
- Data pipeline management
- Demonstrating shuffling, batching, caching, and prefetching with a chosen framework
- Handling categorical or structured data (conceptually with code snippets)
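With PyTorch as the assumed framework, shuffling, batching, and prefetching look like the sketch below (tf.data's shuffle/batch/cache/prefetch chain is the TensorFlow analogue):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

X = torch.randn(1000, 4)                     # synthetic features
y = torch.randint(0, 2, (1000,))             # synthetic integer labels
dataset = TensorDataset(X, y)

loader = DataLoader(
    dataset,
    batch_size=32,
    shuffle=True,            # reshuffle the data each epoch
    num_workers=2,           # load batches in background worker processes
    prefetch_factor=2,       # batches each worker prepares ahead of time
)

# Note: num_workers > 0 spawns worker processes; on Windows, run this
# loop inside an `if __name__ == "__main__":` guard.
for batch_X, batch_y in loader:              # the training step would go here
    pass
print(batch_X.shape)                         # last batch may be smaller than 32
```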
- Responsible artificial intelligence in practice
- Checking for bias in a dataset (e.g., class imbalance)
- Interpreting and explaining a model's behavior
- Discussing potential mitigations to bias and documenting findings
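One simple bias probe of the kind this lab describes, checking for class imbalance; the label list here is synthetic:

```python
from collections import Counter

labels = ["cat"] * 900 + ["dog"] * 100       # a deliberately skewed dataset
counts = Counter(labels)
total = sum(counts.values())
for cls, n in counts.most_common():
    print(f"{cls}: {n} samples ({n / total:.0%})")
# A 90/10 split means raw accuracy is misleading; per-class metrics,
# re-weighting, or resampling are candidate mitigations to document.
```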
- Project integration and review
- Combining steps: data preprocessing --> model building --> training --> evaluation
- Implementing a chosen project (e.g., image classification with a pre-trained model)
- Reporting results, plotting training curves, showing example predictions
Special Facilities and/or Equipment
1. The college will provide a website or course management system with an assignment posting component (through which all lab assignments are to be submitted) and a forum component (where students can discuss course material and receive help from the instructor). This applies to all sections, including on-campus (i.e., face-to-face) offerings.
2. When taught online, the college will provide a fully functional and maintained course management system through which the instructor and students can interact.
3. When taught online, students must have currently existing email accounts and ongoing access to computers with internet capabilities.
Method(s) of Evaluation
- Tests and quizzes
- Lab notebook
- Written laboratory assignments which include source code, sample runs, and documentation
- Reflective papers
- Final examination or project
Method(s) of Instruction
- Instructor-authored lectures which include mathematical foundations, theoretical motivation, and coding implementation of deep learning algorithms
- Detailed review of assignments, including model solutions and specific comments on student submissions
- Discussion which engages students and instructor in an ongoing dialog about deep learning
- Instructor-authored labs through which students demonstrate their ability to implement deep learning models
Representative Text(s) and Other Materials
Chollet, François. Deep Learning with Python, 2nd ed., 2021.
Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 3rd ed., 2022.
Raschka, Sebastian, Yuxi (Hayden) Liu, and Vahid Mirjalili. Machine Learning with PyTorch and Scikit-Learn, 2022.
Types and/or Examples of Required Reading, Writing, and Outside of Class Assignments
- Reading
- Assigned textbook reading averaging 30 pages per week
- Reading supplied handouts and modules averaging 10 pages per week
- Reading online resources pertinent to programming, as directed by the instructor through links
- Reading library and reference material as directed by the instructor through course handouts
- Writing
- Writing technical prose documentation that supports and describes the programs that are submitted for grades
