Introduction to Neural Networks

Syllabus


Course Philosophy


The subject will focus on basic mathematical concepts for understanding nonlinearity and feedback in neural networks, with examples drawn from both neurobiology and computer science. Most of the subject is devoted to recurrent networks, because recurrent feedback loops dominate the synaptic connectivity of the brain. There will be some discussion of statistical pattern recognition, but less than in the past, because this perspective is now covered in Machine Learning and Neural Networks. Instead, the connections to dynamical systems theory will be emphasized.

Modern research in theoretical neuroscience can be divided into three categories: cellular biophysics, network dynamics, and statistical analysis of neurobiological data. This subject is about the dynamics of networks, but excludes the biophysics of single neurons, which will be taught in 9.29J, Introduction to Computational Neuroscience.
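
As a concrete illustration of the dynamical-systems viewpoint described above, the short MATLAB sketch below simulates a small recurrent network with feedback connections and a saturating nonlinearity. It is not part of the course materials; the network size, weights, input, nonlinearity, and Euler integration scheme are arbitrary choices made here for illustration.

% Illustrative sketch only: a small continuous-time recurrent network,
% dx/dt = -x + W*f(x) + b, integrated with a forward Euler step.
% W, b, f, and the step size are arbitrary choices for illustration.
n  = 3;                      % number of units
W  = 0.5 * randn(n);         % recurrent (feedback) weights
b  = ones(n, 1);             % constant external input
f  = @(x) tanh(x);           % saturating nonlinearity
dt = 0.01;                   % Euler step size
T  = 1000;                   % number of time steps
x  = zeros(n, T);            % state trajectory, one column per step
for t = 2:T
    x(:, t) = x(:, t-1) + dt * (-x(:, t-1) + W * f(x(:, t-1)) + b);
end
plot(dt * (1:T), x');        % activity of each unit over time
xlabel('time'); ylabel('activity');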



Prerequisites


  • Permission of the instructor
  • Familiarity with linear algebra, multivariate calculus, and probability theory
  • Knowledge of a programming language (MATLAB® recommended)


Course Requirements


  • Problem sets
  • Midterm exam
  • Final exam


Textbook


The following text is recommended:

Hertz, John, Anders Krogh, and Richard G. Palmer. Introduction to the Theory of Neural Computation. Redwood City, CA: Addison-Wesley Pub. Co., 1991. ISBN: 9780201515602.

