**Spring 2024**: Neural Networks: Theory & Applications

**Course Instructor**: SueYeon Chung

**Course Description**: This course introduces mathematical methods and concepts used in theoretical neuroscience and the theory of neural networks. We will explore the computations and functions performed by the brain and how they are implemented by neurons and their networks. Topics covered will include normative theories of sensory representations, population coding, classical neural network theory, and deep learning theory. Concrete applications of these ideas to the brain will be discussed, with an emphasis on topics at the research frontier.

**Prerequisites**: MathTools (NEURL-GA.2201) or permission of the instructor.

**Location / Time**:

Lectures - Tuesdays, 1:30-3pm, Meyer 1010

Paper Discussions - Wednesdays, 1:30-3pm, Meyer 1010

Office Hours - on weeks when no one is scheduled to lead a paper discussion, the Wednesday slot becomes office hours.

**Grading**:

Paper presentation (35%)

Once during the term, each student, possibly in a pair, will lead a paper discussion.

Pedagogical literature review / blog posts (35%)

Students will write a short piece explaining a computational method or reviewing a body of literature in modern neuroscience. The piece can be, but does not have to be, a review of a paper discussed in class.

Attendance and participation in paper discussion (30%)

Students are expected to attend the roughly bi-weekly paper discussions. Permission to miss a discussion section should be requested as far in advance as possible.

**Dates / Topics**:

Jan 23 - Introduction, Mathematical Background

Jan 30 - Early Vision and Efficient Coding Theory

Feb 6 - Compressed Sensing and Sparse Coding

Feb 13 - Perceptrons I: Basics

Feb 20 - Perceptrons II: Support Vector Machines (SVMs)

Feb 27 - Non-Linear SVMs and The Kernel Method

Mar 5 - Spin-Glass Approach to Neural Networks

Mar 12 - Perceptron Capacity to Manifold Capacity

Mar 19 - Spring Break

Mar 26 - Generalization Error in Neural Networks

Apr 2 - Multilayer Networks (Deep Networks): Introduction

Apr 9 - Modern Topics: Deep Learning and Neuroscience

Apr 16 - Understanding Biological & Artificial Neural Networks: Neural Manifolds

Apr 23 - Understanding Biological & Artificial Neural Networks: Learning

Apr 30 - Finish Literature Review & Blog Posts

**Detailed schedule and reading list**

**Dates / Topics:**

- Jan 23 (Lecture) - Course Introduction
- Jan 24 (Lecture) - Mathematical Background
- Jan 30 (Lecture) - Early Vision and Efficient Coding Theory
- Jan 31 (Paper Discussion) - H. Barlow (1961), "Possible Principles Underlying the Transformation of Sensory Messages," Sensory Communication, 1(01), 217-33.
- Feb 6 (Lecture) - Compressed Sensing and Sparse Coding
  Additional Material: S. Kakade & S. Shakhnarovich, "Random Projections," Lecture Notes
- Feb 7 (Paper Discussion) - S. Ganguli & H. Sompolinsky (2012), "Compressed Sensing, Sparsity, and Dimensionality in Neuronal Information Processing and Data Analysis," Annual Review of Neuroscience, 35, 485-508.
- Feb 13 (Lecture) - Perceptrons I: Perceptron Capacity, Cover’s Theorem
  Additional Material: H. Sompolinsky, "Perceptron," Lecture Notes
- Feb 14 (Lecture) - Perceptrons II: Support Vector Machines (SVMs)
  Reference: C.J. Burges (1998), "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, 2(2), 121-67.
- Feb 20 (Lecture) - Non-Linear SVMs and The Kernel Method, Mercer’s Theorem
  Reference: C.J. Burges (1998), "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, 2(2), 121-67.
- Feb 21 (Paper Discussion) - Modern Topics in Kernel Methods
  Reference: A. Jacot, F. Gabriel & C. Hongler (2018), "Neural Tangent Kernel: Convergence and Generalization in Neural Networks," Advances in Neural Information Processing Systems, 31.
- Feb 27 (Paper Discussion) - Kernels and Similarity Matrices

  Reference: J. Diedrichsen & N. Kriegeskorte (2017), "Representational Models: A Common Framework for Understanding Encoding, Pattern-component, and Representational-similarity Analysis," PLoS Computational Biology, 13(4), e1005508.
- Feb 28 - No Meeting (Cosyne 2024). Work on Literature Review & Blog Posts
- Mar 5 - No Meeting (Cosyne 2024). Work on Literature Review & Blog Posts
- Mar 6 (Lecture) - Physicist’s Approach to Perceptron Capacity

  Reference: E. Gardner (1988), "The Space of Interactions in Neural Network Models," Journal of Physics A: Mathematical and General, 21(1), 257.
- Mar 12 (Lecture) - Manifold Capacity: Connecting Perceptron Capacity to Neural Manifold Geometry

  Reference: S. Chung, D.D. Lee & H. Sompolinsky (2018), "Classification and Geometry of General Perceptual Manifolds," Physical Review X, 8(3), 031003.
- Mar 13 (Paper Discussion) - Mapping Models in Cognitive Neuroscience

  Reference: A.A. Ivanova et al. (2022), "Beyond Linear Regression: Mapping Models in Cognitive Neuroscience Should Align with Research Goals," Neurons, Behavior, Data Analysis, and Theory.
- Mar 19 - Spring Break
- Mar 26 (Lecture) - Generalization Error Theory

  Reference: H.S. Seung, H. Sompolinsky & N. Tishby (1992), "Statistical Mechanics of Learning from Examples," Physical Review A, 45(8), 6056.
- Mar 27 (Paper Discussion) - Modern Topics in Generalization Error Theory

  Reference: A. Canatar, B. Bordelon & C. Pehlevan (2021), "Spectral Bias and Task-model Alignment Explain Generalization in Kernel Regression and Infinitely Wide Neural Networks," Nature Communications, 12(1), 2914.
- Apr 2 (Lecture) - Deep Networks: Introduction

  References: A.R. Barron (1994), "Approximation and Estimation Bounds for Artificial Neural Networks," Machine Learning, 14, 115-33; D.L. Yamins & J.J. DiCarlo (2016), "Using Goal-driven Deep Learning Models to Understand Sensory Cortex," Nature Neuroscience, 19(3), 356-65.
- Apr 3 (Paper Discussion) - A Deep Learning Framework for Neuroscience

  Reference: B.A. Richards et al. (2019), "A Deep Learning Framework for Neuroscience," Nature Neuroscience, 22(11), 1761-70.
- Apr 9 (Lecture) - Deep Learning for Neuroscience: Perspectives

  Reference: A. Saxe, S. Nelli & C. Summerfield (2021), "If Deep Learning Is the Answer, What Is the Question?" Nature Reviews Neuroscience, 22(1), 55-67.

**Possible papers / topics for blog posts**

Efficient Coding

Barlow HB. Possible principles underlying the transformation of sensory messages. Sensory communication. 1961 Sep;1(01):217-33.

(Decorrelation hypothesis) Atick JJ, Redlich AN. Towards a theory of early visual processing. Neural computation. 1990 Sep;2(3):308-20.

(ICA hypothesis) Bell AJ, Sejnowski TJ. Learning the higher-order structure of a natural sound. Network: Computation in neural systems. 1996 May 1;7(2):261.

Population Coding

Averbeck BB, Latham PE, Pouget A. Neural correlations, population coding and computation. Nature reviews neuroscience. 2006 May 1;7(5):358-66.

Pouget A, Dayan P, Zemel R. Information processing with population codes. Nature Reviews Neuroscience. 2000 Nov;1(2):125-32.

Compressed Sensing and Sparse Coding

Kakade S, Shakhnarovich S. Random projections (Lecture notes). https://home.ttic.edu/~gregory/courses/LargeScaleLearning/lectures/jl.pdf

Ganguli S, Sompolinsky H. Compressed sensing, sparsity, and dimensionality in neuronal information processing and data analysis. Annual review of neuroscience. 2012 Jul 21;35:485-508.

Mapping methods suitable for research goals

Ivanova AA, Schrimpf M, Anzellotti S, Zaslavsky N, Fedorenko E, Isik L. Beyond linear regression: mapping models in cognitive neuroscience should align with research goals. Neurons, Behavior, Data analysis, and Theory. 2022 Aug 8.

David Marr’s Levels of Abstraction

Marr D, Poggio T. From understanding computation to understanding neural circuitry.

Poggio T. The levels of understanding framework, revised. Perception. 2012 Sep;41(9):1017-23.

Biologically Plausible Learning

Lillicrap TP, Santoro A, Marris L, Akerman CJ, Hinton G. Backpropagation and the brain. Nature Reviews Neuroscience. 2020 Jun;21(6):335-46.

Statistical Physics of Learning in Deep Networks

Bahri Y, Kadmon J, Pennington J, Schoenholz SS, Sohl-Dickstein J, Ganguli S. Statistical mechanics of deep learning. Annual Review of Condensed Matter Physics. 2020 Mar 10;11:501-28.

Neural Manifold Framework

Chung S, Abbott LF. Neural population geometry: An approach for understanding biological and artificial neural networks. Current opinion in neurobiology. 2021 Oct 1;70:137-44.

Kriegeskorte N, Kievit RA. Representational geometry: integrating cognition, computation, and the brain. Trends in cognitive sciences. 2013 Aug 1;17(8):401-12.

Student’s topic of choice

Check with the instructor