Neural Networks: Foundations and Modern Advances (1st Offering)
Presenter: Amangel Bhullar
Date: Thursday, June 26th, 2025
Time: 12:00 pm
Location: Workshop Space, 4th Floor - 300 Ouellette Ave., School of Computer Science, Advanced Computing Hub
Neural networks lie at the core of modern artificial intelligence and machine learning, powering applications from image recognition and natural language processing to autonomous vehicles and medical diagnosis. This workshop provides a comprehensive theoretical foundation in neural networks, beginning with their biological inspiration and progressing to modern architectures. We will explore how neural networks learn via backpropagation, analyze common activation functions and optimization strategies, and examine their limitations and future directions. Attendees will gain a robust understanding of multilayer perceptrons, training dynamics, overfitting, and regularization.
1. Introduction to Neural Networks
- Origins: Biological vs. artificial neurons
- Historical context: From perceptrons to deep learning
- Why neural networks are powerful function approximators
2. Mathematical Foundations
- Neuron computation
- Vectorized notation and matrix operations
- Role of weights, biases, and activations
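The neuron computation and vectorized notation above can be sketched in a few lines of NumPy; the layer width, input size, and sigmoid activation below are illustrative choices, not fixed workshop parameters:

```python
import numpy as np

def sigmoid(z):
    """Squash pre-activations into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input vector with 3 features
W = rng.normal(size=(4, 3))   # a layer of 4 neurons: one weight row each
b = np.zeros(4)               # one bias per neuron

z = W @ x + b                 # vectorized weighted sum (affine transform)
a = sigmoid(z)                # elementwise activation
print(a.shape)                # one activation per neuron: (4,)
```

The matrix form `W @ x + b` is exactly the per-neuron weighted sum written once for the whole layer, which is what makes GPU-friendly training possible.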
3. Network Architectures
- Single-layer vs. Multi-layer Perceptrons (MLPs)
- Depth vs. width in neural networks
- Feedforward vs. recurrent structures
4. Training Neural Networks
- Loss functions: Mean Squared Error, Cross-Entropy
- Gradient Descent and Backpropagation
- Chain Rule and computational graph perspective
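The pieces named in this section (an MSE loss, gradient descent, and backpropagation via the chain rule) fit together in a small hand-derived example; the network size, toy data, and learning rate are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 2))            # 8 samples, 2 features
y = np.sum(X, axis=1, keepdims=True)   # toy regression target

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr, losses = 0.05, []

for step in range(300):
    # Forward pass through the computational graph: X -> h -> y_hat -> loss
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    losses.append(np.mean((y_hat - y) ** 2))   # Mean Squared Error

    # Backward pass: the chain rule applied node by node, in reverse order
    d_yhat = 2 * (y_hat - y) / len(X)          # dL/dy_hat
    dW2, db2 = h.T @ d_yhat, d_yhat.sum(axis=0)
    d_z1 = (d_yhat @ W2.T) * (1 - h ** 2)      # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = X.T @ d_z1, d_z1.sum(axis=0)

    # Gradient descent step on every parameter
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])   # loss should drop as training proceeds
```

Note that each backward line reuses a quantity cached on the forward pass (`h`, `y_hat`), which is the computational-graph view of backpropagation in miniature.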
5. Activation Functions
- Sigmoid, Tanh, ReLU, Leaky ReLU
- Non-linearity and vanishing gradient problem
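The activations listed above are one-liners, and the vanishing gradient problem can be seen directly from the sigmoid's derivative; the sample points chosen are arbitrary:

```python
import numpy as np

def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))
def relu(z): return np.maximum(0.0, z)
def leaky_relu(z, alpha=0.01): return np.where(z > 0, z, alpha * z)
# tanh is available directly as np.tanh

# Vanishing gradients: sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)) peaks at
# 0.25 and decays toward 0 for large |z|, so deep stacks of sigmoids shrink
# gradients multiplicatively. ReLU keeps a gradient of 1 wherever z > 0.
z = np.array([-5.0, 0.0, 5.0])
sig_grad = sigmoid(z) * (1 - sigmoid(z))
print(sig_grad)   # tiny at the tails, 0.25 at z = 0
```

Leaky ReLU's small negative-side slope `alpha` is a common remedy for "dead" ReLU units whose gradient would otherwise be exactly zero.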
6. Optimization and Regularization
- Optimizers: SGD, Adam, RMSProp
- Techniques: Dropout, Weight Decay, Batch Normalization
- Early stopping and learning rate scheduling
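Two of the optimizer update rules listed above, sketched on a one-parameter quadratic loss; the toy objective and the hyperparameter values are illustrative defaults, not prescriptions:

```python
import numpy as np

def grad(w):
    # Gradient of the toy loss L(w) = w**2, minimized at w = 0
    return 2.0 * w

# Plain SGD: w <- w - lr * g
w_sgd, lr = 5.0, 0.1
for _ in range(100):
    w_sgd -= lr * grad(w_sgd)

# Adam: running estimates of the gradient's first and second moments
# give each parameter its own adaptive step size.
w_adam, m, v = 5.0, 0.0, 0.0
beta1, beta2, eps = 0.9, 0.999, 1e-8
for t in range(1, 101):
    g = grad(w_adam)
    m = beta1 * m + (1 - beta1) * g          # momentum (1st moment)
    v = beta2 * v + (1 - beta2) * g * g      # scale (2nd moment)
    m_hat = m / (1 - beta1 ** t)             # bias correction for zero init
    v_hat = v / (1 - beta2 ** t)
    w_adam -= lr * m_hat / (np.sqrt(v_hat) + eps)

print(w_sgd, w_adam)   # both approach the minimum at 0
```

RMSProp keeps only the second-moment estimate `v`; Adam's addition of the momentum term `m` and the bias corrections is what distinguishes the two.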
7. Challenges and Limitations
- Overfitting and underfitting
- Interpretability issues
- Generalization and robustness
8. Discussion and Future Directions
- Transition to deeper models such as CNNs, RNNs, and Transformers
- Role of neural networks in self-supervised and generative learning
- Open research problems and real-world applications
Prerequisites:
- Basic linear algebra and calculus
- Introductory knowledge of machine learning concepts
Amangel Bhullar is a Ph.D. candidate in Computer Science at the University of Windsor, specializing in artificial intelligence with a focus on knowledge representation, machine learning, social networks, and knowledge graphs. She currently serves as the President of the Graduate Student Society (GSS), where she leads initiatives to enhance the academic and social experience of graduate students.
In addition to her role at GSS, Amangel is the Director of the Lancer Sport and Recreation Center (LSRC) Corporation and serves as a Member of the Board of Governors at the University of Windsor. Her leadership contributions extend further as an Ex-Officio Member of the University Senate, where she brings a student-centred perspective to university policy and governance matters.