See the problem sets page for information about homework availability and due dates.

You will notice that the lectures don't follow any particular text. Thus, although the textbook references below may be helpful, they won't necessarily correspond exactly to what is covered in class. MacKay, GK, HKP, Handbook, DA, Bishop, and Maass refer to the required or recommended books for the course (see the information page). For recommended handouts, go here.
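The opening lectures build up from linear threshold units and the Perceptron Learning Rule. As a rough illustration of that starting point, here is a minimal sketch in plain Python; it is not taken from the course materials, and the function names and toy AND dataset are my own:

```python
def train_perceptron(samples, epochs=100):
    """Perceptron Learning Rule sketch: on each misclassified example,
    move the weight vector toward (label * input).
    samples: list of (inputs, label) pairs with label in {-1, +1}."""
    n = len(samples[0][0])
    w = [0.0] * (n + 1)  # extra weight plays the role of the threshold
    for _ in range(epochs):
        errors = 0
        for x, y in samples:
            xb = list(x) + [1.0]                      # constant bias input
            activation = sum(wi * xi for wi, xi in zip(w, xb))
            if y * activation <= 0:                   # misclassified (or on boundary)
                w = [wi + y * xi for wi, xi in zip(w, xb)]
                errors += 1
        if errors == 0:                               # converged: data separated
            break
    return w

def predict(w, x):
    s = sum(wi * xi for wi, xi in zip(w, list(x) + [1.0]))
    return 1 if s > 0 else -1

# Linearly separable toy problem: logical AND on {0, 1}^2.
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = train_perceptron(data)
preds = [predict(w, x) for x, _ in data]
```

Because AND is linearly separable, the rule converges in finitely many updates; a non-separable function such as XOR would keep cycling, which is the limitation the Sep 28 lecture addresses.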

- Sep 26, 2006. No class.

Download the syllabus and do the assigned reading for the Thursday class. *Brain stats, neuronal densities, and linear algebra handouts recommended.*

- Sep 28, 2006. Linear Threshold Units in high dimensions.

Linear separability. Perceptron Learning Rule. Noise tolerance for pattern recognition. Functions uncomputable by LTUs, with and without thresholds. **MacKay 38. HKP 1.1-1.3, 5.1-5.3. Two articles: autoencoders.pdf and correlated_states.pdf.** See the problem sets page for details.

- Sep 29, 2006. *Homework 0 due: Background warm-up (self-graded)*

- Oct 3, 2006. Layered architectures. *(Response to readings due in class.)*

What they can compute. Vision examples. **Riesenhuber & Poggio, Nature Neuroscience, 2: 1019-1025 (1999).**

- Oct 5, 2006. Gradient descent for multi-layer learning I.

Approximation vs. classification. Analog vs. discrete. Error landscapes in weight space. Single-unit gradient descent learning, comparison to the PLR. Adaptive step size, local minima. **MacKay 39. HKP 5.4-5.5.**

- Oct 6, 2006. *Homework 1 due: Perceptron, High-dimensional spaces*

- Oct 10, 2006. Gradient descent for multi-layer learning II.

Back-propagation algorithm. Other objective functions. **HKP 6.1-6.3.**

- Oct 12, 2006. Generalization and Inference.

Why least-squares? Why minimize it? The overfitting issue. Training and validation sets. Early stopping. Bias vs. variance. Bayesian inference, maximum likelihood, Gaussians, and squared error. **HKP 6.4-6.6. MacKay 2.1-2.3, 3, 41.1-41.2, 44.** *Geman et al., 1992.*

- Oct 13, 2006. *Homework 2 due: Gradient descent, Multilayer computation*

- Oct 17, 2006. Capacity of LTU.

Quantification of limitations: the number of functions computable by an LTU. **MacKay 40. HKP 5.7.**

## Recurrent Networks
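The lectures in this section revolve around recurrent dynamics, Lyapunov (energy) functions, and associative memory. As a hedged, minimal illustration of the common thread — a symmetric network whose asynchronous updates can only lower a Lyapunov function — here is a Hopfield-style sketch in Python; the helper names and the toy pattern are my own, not from the course materials:

```python
import random

def hebb_weights(patterns):
    """Hebb's rule: W[i][j] = sum over patterns of p_i * p_j, zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def energy(W, s):
    """Lyapunov function E = -1/2 * sum_ij W_ij s_i s_j (W symmetric)."""
    n = len(s)
    return -0.5 * sum(W[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def recall(W, s, sweeps=10, seed=0):
    """Asynchronous updates: each flip follows the local field,
    so every accepted change lowers (or preserves) the energy."""
    rng = random.Random(seed)
    s = list(s)
    n = len(s)
    for _ in range(sweeps):
        for i in rng.sample(range(n), n):          # random update order
            h = sum(W[i][j] * s[j] for j in range(n))   # local field
            s[i] = 1 if h >= 0 else -1
    return s

stored = [1, -1, 1, -1, 1, -1]
W = hebb_weights([stored])
noisy = list(stored)
noisy[0] = -noisy[0]            # corrupt one bit
recovered = recall(W, noisy)
```

With a single stored pattern the corrupted state sits inside that pattern's attractor basin, so one sweep restores it; storing many patterns introduces the crosstalk terms discussed in the Oct 31 lecture.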

- Oct 19, 2006. Recurrent network dynamics.

Feedforward vs. recurrent architectures. Rate equations. Recurrent net example: flip-flop. Linear stability analysis. Lyapunov functions. Stability vs. oscillations in a natural circuit. **HKP 3.4.**

- Oct 20, 2006. *Homework 3 due: Backpropagation & Limitations*

- Oct 24, 2006. Computation by energy minimization in recurrent nets.

Lyapunov function for symmetric networks. Computation of hard problems by energy minimization. **HKP 3.3, 4.1-4.4.** *MacKay 42.9.*

- Oct 26, 2006. Stochastic networks & Ising model.

Metropolis & Glauber dynamics, detailed balance. Simulated annealing. **MacKay 29.6, 30.3, 31, 33.1-33.3.**

- Oct 27, 2006. *Homework 4 due: Generalization and Lyapunov analysis*

- Oct 31, 2006. Associative memories I.

The associative memory problem. Single-memory attractor basins. Hebb's rule for multiple memories. Crosstalk terms, stability probabilities for random memories. **HKP 2.1-2.2. MacKay 42.1-42.6.**

- Nov 2, 2006. Associative memory II.

Statistical capacity; improved learning for associative memories. **HKP 2.3-2.5. MacKay 42.7-42.8.**

- Nov 3, 2006. *Homework 5 due: Computing by energy min, Stochastic networks*

- Nov 7, 2006. The Boltzmann machine.

Probabilistic models, hidden units, learning rule. **HKP 7.1. MacKay 43.**

## Spiking Networks
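The final block of lectures moves from rate models to spiking neurons, with the integrate-and-fire model as the workhorse. As a hedged, minimal illustration — the parameter values are illustrative placeholders of my own, not taken from GK — here is a leaky integrate-and-fire neuron driven by constant current and integrated with forward Euler:

```python
def lif_spike_times(I, T=0.5, dt=1e-4, tau=0.02, R=1.0,
                    v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: tau * dV/dt = -(V - v_rest) + R*I.
    When V crosses v_thresh, a spike is recorded and V resets.
    Returns the list of spike times within the window [0, T]."""
    v = v_rest
    spikes = []
    for k in range(int(T / dt)):
        v += (dt / tau) * (-(v - v_rest) + R * I)   # forward Euler step
        if v >= v_thresh:
            spikes.append(k * dt)
            v = v_reset                              # fire and reset
    return spikes

# Sketch of the f-I relation: stronger drive -> more spikes per window.
low = len(lif_spike_times(I=1.5))
high = len(lif_spike_times(I=3.0))
```

Sweeping the input current and counting spikes traces out the f-I curve discussed in the Nov 14 lecture; adding noise to the drive smooths the sharp firing onset.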

- Nov 9, 2006. Compartmental modeling of neurons and Integrate-&-fire model.

Electronic circuit models. Linear dendritic tree. HH model. I&F model. Single-cell computation overview. **GK 1 and 2.**

- Nov 10, 2006. *Homework 6 due: Associative memory*

- Nov 14, 2006. Mean rate behavior of the I&F model.

I-f curve, noise. Integrate-and-fire <--> analog firing rate <--> electrical circuits. **GK 4, 5.9.1, 6.4.** *Mean-field equation derivation handout.*

- Nov 16, 2006. Restoration and equivalence of computation.

Analog --> M-P (value restoration). I&F --> M-P (temporal restoration). General restoration to an attractor.

- Nov 17, 2006. *Homework 7 due: Integrate and Fire Neuron, Stochastic networks*

- Nov 21, 2006. Computing with I&F units.

Sound localization, integrate-and-fire units with delays, spike timing. Analog match problem & invariants for recognition. **Hopfield, Nature, v376, 33-36 (1995); Carr and Konishi, J. Neurosci. v10, 3227-3246 (1990). GK 12.5.**

- Nov 23, 2006. THANKSGIVING!!!! NO CLASS!!!!

- Nov 28, 2006. Oscillation and synchronization in I&F units.

Synchronization to a signal. Synchronization of a network, locking. **GK 9.4.**

- Nov 30, 2006. Computation by synchronization.

Synchronization for pattern recognition (Most-Approx-Equal, speech example). **Hopfield and Brody, PNAS 97, 13919-13924 (2000) and PNAS 98, 1282-1287 (2001).** *Hopfield, Brody, Roweis, NIPS 10, 166-172 (1998).*

- Dec 1, 2006. *Homework 8 due: Temporal restoration, spike timing*