15-386/686 Neural Computation

Carnegie Mellon University

Spring 2024

Course Description

Neural Computation is an area of interdisciplinary study that seeks to understand how the brain learns and computes to achieve intelligence. It seeks to understand the computational principles and mechanisms of intelligent behaviors and mental abilities -- such as perception, language, motor control, decision making, and learning -- by building artificial systems and computational models with the same capabilities. This course explores computational principles at multiple levels, from individual neurons to circuits and systems, with a view to bridging brain science and machine learning. It covers basic models of neurons and circuits, and computational models of learning, memory, and inference in real and artificial systems. Concrete examples are drawn mostly from the visual system, with emphasis on relating current deep learning research to brain research, from hierarchical computation, attention, and recurrent neural networks to reinforcement learning. Students will learn to perform quantitative analyses as well as computational experiments using Matlab. No prior background in biology or machine learning is assumed. Prerequisites: basic knowledge of matrices and linear algebra, basic calculus (partial differentiation), and probability and statistics is required. 15-100, 21-120, or permission of instructor. 21-241 preferred but not required.
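For a flavor of the Matlab-based computational experiments, the sketch below simulates a leaky integrate-and-fire neuron, the kind of single-neuron model covered in the "Neurons and Membranes" and "Spikes and Cables" lectures. It is only an illustration, not part of the course materials, and all parameter values are assumptions chosen for demonstration.

% Leaky integrate-and-fire neuron (illustrative parameters only)
dt      = 0.1;                 % time step (ms)
T       = 200;                 % total simulation time (ms)
t       = 0:dt:T;              % time axis
tau_m   = 10;                  % membrane time constant (ms)
E_L     = -65;                 % resting (leak) potential (mV)
V_th    = -50;                 % spike threshold (mV)
V_reset = -70;                 % reset potential after a spike (mV)
R_m     = 10;                  % membrane resistance (MOhm)
I_ext   = 2.0;                 % constant injected current (nA)

V = E_L * ones(size(t));       % membrane potential trace
spike_times = [];              % recorded spike times (ms)

for k = 1:numel(t)-1
    % Forward-Euler step of tau_m * dV/dt = -(V - E_L) + R_m * I_ext
    V(k+1) = V(k) + dt * (-(V(k) - E_L) + R_m * I_ext) / tau_m;
    if V(k+1) >= V_th                      % threshold crossing: spike
        spike_times(end+1) = t(k+1);       %#ok<AGROW>
        V(k+1) = V_reset;                  % reset membrane potential
    end
end

plot(t, V); xlabel('time (ms)'); ylabel('membrane potential (mV)');
title('Leaky integrate-and-fire neuron (illustrative)');

Running the script plots the membrane potential charging toward threshold, firing, and resetting.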

Course Information

Instructor                 Office Hours                                               Email
Tai Sing Lee (Professor)   Friday 9:00-10:00 a.m.; Recitation (class Zoom)            taislee@andrew.cmu.edu
Tianqin Li (TA)            Wed and Fri 7:00-8:00 p.m. (class Zoom) + Fri Recitation   tianqinl@andrew.cmu.edu
Yung Ying Chen (TA)        Mon 7:00-8:00 p.m. and Tue 8:30-9:30 p.m. (class Zoom)     yungyinc@andrew.cmu.edu
Ziqi (Zack) Wen (TA)       Tue 7:00-8:00 p.m. and Wed 8:30-9:30 p.m. (class Zoom)     ziqiwen@andrew.cmu.edu

Recommended Supplementary Textbook

Classroom Etiquette

386 Grading Scheme

Evaluation        % of Grade
Assignments       60
Class Exercise    10
Midterm           10
Final Exam        20

  • Grading scheme: A: > 88%, B: > 75%, C: > 65%.

686 Grading Scheme

Evaluation        % of Grade
Assignments       60
Class Exercise    10
Midterm           10
Final Exam        20
Biweekly Journal Club and Term Paper    Required (you may miss one journal club)

  • Grading scheme: A+: > 96%, A: > 88%, B: > 75%, C: > 65%.

    Assignments

    Late Policy

    Examinations

    Syllabus

    Date Lecture Topic Relevant Readings Assignments
      Part 1: Neurons and Logics    
    W 1/17 1. Introduction and Overview NIH Brain Facts (chapter 1)  
    F 1/19 Journal Club Orientation  
    M 1/22 2. Neurons and Membranes Trappenberg Ch 1.1-2.2
    W 1/24 3. Spikes and Cables Trappenberg Ch 2 (C) HW1 out
    F 1/26 Recitation: Problem set 1 and Matlab Tutorial Trappenberg Math Appendix  
    M 1/29 4. Axons and Synapse Trappenberg Ch 3.1, 3.3  
    W 1/31 5. Dendrites and Logics Trappenberg 3.1,3.5 McCulloch and Pitts (1943)  
    F 2/2 Journal Club #1    
      Part 2: Learning and Representation    
    M 2/5 6. Synaptic plasticity Trappenberg Ch 4 Abbott and Nelson (2000)  
    W 2/7 7. Hebbian Learning Trappenberg Ch 4, HPK Ch 8 Oja (1982) HW1 due. HW2 out.
    F 2/9 Recitation: Problem Set 2    
    M 2/12 8. Sensory Representation  
    W 2/14 9. Source Separation Foldiak (1990)  
    F 2/16 Journal Club #2    
    M 2/19 10. Sparse Coding Olshausen and Field (1997) (2004)  
      Part 3: Association and Memories    
    W 2/21 11. Association Trappenberg Ch 10.3 Hinton and Salakhutdinov (2006) HW2 due. HW3 out.
    F 2/23 Recitation: Problem Set 3    
    M 2/26 12. Attractors Trappenberg Ch 8. Ch 9.4 Hopfield and Tank (1986)  
    W 2/28 Midterm  
    F 3/1 Journal Club #3    
    3/6-3/10 Midsemester and Spring break.  
    M 3/11 13. Memory Kumaran, Hassabis and McClelland (2016) Mid-semester grades due
    W 3/13 14. Computational Maps HPK Ch 9. Trappenberg 7.1-7.2 Kohonen (1982) Semester Drop deadline
    F 3/15 Journal Club #4    
      PART 4: Networks and Computation    
    M 3/18 15. Recurrent Network Marr and Poggio (1976) Samonds et al. (2013) Wang et al. (2018)  
    W 3/20 16. Hierarchy Trappenberg Ch 6, 10.1. Fukushima (1988), Yamins and DiCarlo (2016) HW3 due. HW4 out.
    F 3/22 Recitation: Problem Set 4    
    M 3/25 17. Feedback Trappenberg 5.1. Van Essen et al. (1992), Felleman and Van Essen (1991)  
    W 3/27 18. Hierarchical Inference Mumford (1992) Rao and Ballard (1998) Lee and Mumford (2003)  
    F 3/29 Journal Club #5    
      PART 5: Prediction and Decision    
    M 4/1 19. Predictive Coding Trappenberg Ch 10. Lotter et al. (2016), Colah (2015), Rao (2015)  
    W 4/3 20. Reinforcement Learning Trappenberg Ch 9. Niv (2009), Montague et al. (1996) HW4 due. HW5 out.
    F 4/5 Recitation: Problem set 5    
    M 4/8 22. Bayesian Integration Ernst and Banks (2002). Kording and Wolpert (2004) Weiss et al. (2002).  
    W 4/10 21. Population Codes Ma et al. (2006) Kersten and Yuille (2003)  
    F 4/12 Carnival No Journal Club    
    M 4/15 23. Probabilistic Inference Orban et al. (2016). Shivkumar et al. (2019)  
    W 4/17 24. Attention Trappenberg Ch 10. Vaswani et al. (2017) Lindsay (2020) Knudsen (2007) HW 5 due
    F 4/19 Journal Club #7    
    M 4/22 25. Consciousness Blum and Blum (2018) Koch (2018).  
    W 4/24 26. Review    
    F 4/26 Recitation: Term Paper Presentation    
    M 5/6 Final Exam 5:30 pm - 8:30 p.m In Person  

    Supplementary Reading List

    Retinal Computation

    Sparse Coding

    Logical computation in Neurons

    Biological Neural Circuits

    Neural Network models of Neural Circuits

    Sparse Coding on computation and memory

    Reinforcement Learning

    Biologically Plausible Deep Learning Algorithms

    Causal Inference

    Inverse Rational Control

    Spiking Bayesian Circuit

    Curiosity and Imagination

    Reinforcement Learning and Song Birds

    Emotion

    Consciousness

    Glia and their functions


    Questions or comments: contact Tai Sing Lee
    Last modified: spring 2024, Tai Sing Lee