CS128/PSYCH128 Computational Models of Learning
Spring 1999, Tuesdays 6:30 - 9:30pm
Trotter 215 (discussion) and Trotter 117 (lab)
Robert Dufour, Papazian 323, x8417 (rdufour1)
Lisa Meeden, Sproul 1, x8565 (meeden@cs)
This course will cover computer-based representational formalisms and
algorithms that facilitate learning behaviors and that are inspired by
biological models. We will focus on connectionist models based on
neural network abstractions. We will also cover evolutionary
approaches such as genetic algorithms, genetic programming, and
evolutionary programming, discussing the kinds of learning behaviors
they facilitate. The course is made up of two components: laboratory
and discussion. Every week, students will have the opportunity to work with
various models and algorithms in the development of learning behaviors.
Readings relevant to the models to be covered that week will also be
assigned and used as the basis for seminar-style discussions.
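To give a flavor of the connectionist models the course works with, here is a minimal sketch of the classic perceptron learning rule, one of the simplest neural-network learning algorithms. The Python implementation, the OR task, and the learning-rate and epoch settings are purely illustrative choices, not course materials:

```python
# Illustrative sketch of the perceptron learning rule (not course code).
# A single threshold unit learns logical OR from labeled examples.

def step(x):
    # Threshold activation: fire (1) if net input is non-negative.
    return 1 if x >= 0 else 0

def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]   # one weight per input
    b = 0.0          # bias (threshold)
    for _ in range(epochs):
        for inputs, target in samples:
            out = step(sum(wi * xi for wi, xi in zip(w, inputs)) + b)
            err = target - out
            # Perceptron update: nudge weights toward the correct output.
            w = [wi + lr * err * xi for wi, xi in zip(w, inputs)]
            b += lr * err
    return w, b

OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(OR)
print([step(w[0] * x0 + w[1] * x1 + b) for (x0, x1), _ in OR])
# prints [0, 1, 1, 1]
```

The laboratory texts explore far richer models (competitive learning, backpropagation, recurrent networks), but all share this basic shape: adjust connection weights incrementally in response to error.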
- Understanding Neural Networks, Volume 1: Basic Networks,
Caudill and Butler, MIT Press, 1992.
- Rethinking Innateness: A Connectionist Perspective on
Development, Elman, Bates, Johnson, Karmiloff-Smith, Parisi, and Plunkett,
MIT Press, 1996.
- Exercises in Rethinking Innateness: A Handbook for
Connectionist Simulations, Plunkett and Elman, MIT Press.
- Various other articles to be provided in a reading packet.
Your performance in this course will be evaluated along four
parameters:
1. Class Participation & Reaction Notes: 20%
Reaction notes, your reactions to that week's readings, will be due by
5pm on Mondays via email. These notes will be posted to the
web page to facilitate discussion. They should not be summaries of
the papers; instead, they should be the product of the reading process
(e.g., questions that were raised, points that were not clear, links
to previous material you might have read, etc.). You are expected to
be an active participant in class discussions and to come
prepared for class every week.
Each student will also serve as discussion leader for one of the
class meetings.
2. Homework: 20%
One of the goals of this course is to provide students with a
framework for getting hands-on experience with various
computational models. Although we will do some of that work in class,
some of that experience will have to be gained through homework.
Homework will be due by 5pm on Fridays via email.
3. Midterm Project: 20%
There will be a mid-semester paper, a
write-up of one of the computational simulations available in
Exercises in Rethinking Innateness. Assigned February 23. Due
March 19.
4. Term Project: 40%
Each student will be asked to design a project involving a computational
approach to learning. A series of topics will be provided at the beginning
of the semester. These projects may be carried out by pairs of students.
Each student will be asked to present their project to the class at
the end of the semester as well as turn in a written report.
Week 01: Tuesday January 19
Topic: Introduction to Neural Networks
Lab:
- Understanding Neural Networks, chapters 1, 2, 3, & 4.
Week 02: Tuesday January 26
Topic: Competitive learning and attractor networks
Reading:
- Anderson, J.A. (1995). An Introduction to Neural Networks.
Chapters 1 & 2 (pp. 1 - 60). Cambridge, MA: MIT Press.
- Reactions
Lab:
- Understanding Neural Networks, chapters 6, 7, 8, & 9
Week 03: Tuesday February 2
Topic: Backpropagation Networks
Reading:
- Rethinking Innateness, chapters 1 & 2
- Reactions
Lab:
- Understanding Neural Networks, chapters 10 & 11.
- Exercises in Rethinking Innateness, chapters 1, 2, & 3
Week 04: Tuesday February 9
Topic: Issues of Representation
Reading:
- Blank, D.S., Meeden, L.A., and Marshall, J.B. (1992). Exploring the
Symbolic/Subsymbolic Continuum: A Case Study of RAAM. In J. Dinsmore
(ed.), The Symbolic and Connectionist Paradigms: Closing the Gap, LEA,
chapter 6, pp. 113 - 148.
- Clark, A. (1993). Associative Engines, MIT Press, chap. 3 & 4, pp. 41 - 86.
- Jagota, Plate, Shastri, & Sun, editors (1999). Connectionist Symbol
Processing: Dead or Alive? Neural Computing Surveys, Volume 2,
pages 1-40.
- Reactions
Lab:
- Exercises in Rethinking Innateness, chapters 4 & 5
Week 05: Tuesday February 16
Topic: Backpropagation networks (continued)
Lab:
- Exercises in Rethinking Innateness, chapters 6 & 7
Week 06: Tuesday February 23
Topic: Recurrent Networks
Reading:
- Rethinking Innateness, chapters 4 & 5
- Reactions
Lab:
- Exercises in Rethinking Innateness, chapters 8 & 9
Week 07: Tuesday March 2
Topic: Recurrent Networks (continued)
Reading:
- Rethinking Innateness, chapters 6 & 7
- Do we have it in us? A review of Rethinking
Innateness by Jerry Fodor, TLS, May 1997
- Reactions
Lab:
- Exercises in Rethinking Innateness, midterm projects
Week 08:
SPRING BREAK
Week 09: Tuesday March 16
Topic: Critiques of connectionism
Reading:
- Marcus, Gary (1995). The acquisition of the English past tense in
children and multi-layered connectionist networks.
Cognition, Volume 56, Number 3, pages 271-279.
- Marcus, Gary (1998). Can connectionism save constructivism?
Cognition, Volume 66, Number 2, pages 153-182.
- Reactions
Week 10: Tuesday March 23
Topic: Rule extraction and insertion
Reading:
- Omlin, C.W. & Giles, C.L. (1996). Extraction of rules from
discrete-time recurrent neural networks. Neural Networks,
Volume 9, Number 1, pages 41-52.
- Omlin, C.W. & Giles, C.L. (1996). Rule revision with recurrent
neural networks. IEEE Transactions on Knowledge and Data
Engineering, Volume 8, Number 1.
- Giles, C.L., Lawrence, S. & Tsoi, A.C. (1997). Rule inference for
financial prediction using recurrent neural networks. In The
Proceedings of IEEE Conference on Computational Intelligence for
Financial Engineering, IEEE Press, Piscataway, NJ, pages 253-259.
Homework: Prepare a one-page final project
proposal, due at the next class meeting.
Week 11: Tuesday March 30
Topic: Experimental design for projects
Reading:
- Tarassenko, L. (1998). A Guide to Neural Computing
Applications, Chapters 6 & 7. John Wiley & Sons, New York, NY.
- Cohen, P. (1995). Empirical Methods for Artificial
Intelligence, Chapters 3 & 4. MIT Press, Cambridge, MA.
Be prepared to give a brief summary of your
final project proposal for the class.
Week 12: Tuesday April 6
Topic: Connectionist Models
Reading:
- Hinton, G.E. & Shallice, T. (1991). Lesioning a connectionist
network: Investigations of acquired dyslexia. Psychological
Review, Volume 98, pages 74-95.
- Jacobs, R.A., Jordan, M.I., & Barto, A.G. (1991). Task
decomposition through competition in a modular connectionist
architecture: The what and where vision tasks. Cognitive
Science, Volume 15, Number 2, pages 219-250.
- Plunkett, K., Sinha, C., Moller, M.F., & Strandsby,
O. (1992). Symbol grounding or the emergence of symbols? Vocabulary
growth in children and a connectionist net. Connection Science,
Volume 4, Numbers 3-4, pages 293-312.
- Reactions
Week 13: Tuesday April 13
Topic: Dynamic Connectionist Models
Reading:
- Beer, R.D. (1995). Computational and dynamical languages for
autonomous agents. In Minds as motion: Explorations in the dynamics
of Cognition, edited by Port, R.F. & van Gelder, T., MIT Press,
Cambridge, MA, pages 121-148.
- Shastri, L. & Fontaine, T. (1999). Recognizing handwritten digit
strings using modular spatio-temporal connectionist
networks. Connection Science, Volume 7, Number 3.
- Reactions
Week 14: Tuesday April 20
Final project presentations by students
- David P.
- Charlie
- Martine
- Ben
Week 15: Tuesday April 27
Final project presentations by students
- Martin and Simon
- Chaos
- David A.
- Jon
- Nik and Nathaniel
- Craig
- Josh
Week 16: Tuesday May 4
Dinner at Lisa's 6:30pm