DCS 530 -- Principles of Artificial Intelligence

Fall 2001



Course People
Matthew Stone
Rong Zhang

Schedule

Class Tuesday/Thursday 4:30-5:50, Hill 254


Announcements

  • Dec 16
    Reminder: the exam is at either 9am or 4pm on December 18 in Hill 254.
    Caveat: there is a typo in the justification of the ANOVA in Cohen, on page 193. The between-group variance should include an additional factor Nj, the number of data points in group j.
    Good luck.

  • Nov 27
    Short Written Exercises
    Due December 6.

  • Nov 13
    Final paper - a short proposal
    Due December 11
    (Plenty of time to think about it.)
    Reminder: no office hours Nov 14; guest lecture by Dimitris Metaxas Nov 15.

  • Nov 8
    Written Exercises
    Due November 20
    Updated and synchronized syllabus on this page.

  • Oct 4
    Correction to Homework One
    Small typo fixed in mystery algorithm of extra credit A

  • Oct 3
    Written Exercises
    Due October 18

  • Sep 27
    Updated and synchronized syllabus on this page.

  • Sep 25
    Homework One
    Due October 11

  • Aug 29
    Who should take this class?
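
    For reference, the ANOVA correction in the Dec 16 announcement can be written out. In standard notation (J groups, group j containing Nj data points with mean x̄_j, grand mean x̄) -- our notation here, not necessarily Cohen's -- the between-group variation carries the group-size factor Nj:

    ```latex
    % Between-group ("treatment") variation, with the group-size factor N_j included:
    SS_{\mathrm{between}} = \sum_{j=1}^{J} N_j \,(\bar{x}_j - \bar{x})^2,
    \qquad
    MS_{\mathrm{between}} = \frac{SS_{\mathrm{between}}}{J - 1}
    ```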


Lecture Schedule, AI Events, Notes

  • Sep 4
    What is AI?

  • Module 1: A Prototypical Case Study in AI
    Reading: Agents in the Real World
    Homework: Decision Analysis
    Lectures: Sep 6-Sep 20

  • Sep 6
    Decision analysis as a computational model (1)
    Actions, observations, outcomes; probability and utility.

  • Sep 11
    United we stand

  • Sep 13
    Decision analysis as a computational model (2)
    Interpreting models; solving for and carrying out policies in agents.

  • Sep 18
    Decision analysis as a computational model (3)
    Design and evaluation; sensitivity analysis, statistical hypotheses about running agents.

  • Sep 20
    Decision analysis as a computational model (4)
    Computational complexity; training data; model estimation and model induction; generalization, model selection, data sparsity.

  • Module 2: Perception and Bayesian Inference
    Reading: Chris Bishop, Neural Networks for Pattern Recognition Chapters 1-3
    Homework: Written Exercises
    Lectures: Sep 25-Oct 16.

  • Sep 25
    Bayesian analysis and models for classification (1)
    Motivation for a Bayesian approach. (Ch 1.1-1.3; 1.8)

  • Sep 27
    Bayesian analysis and models for classification (2)
    Naive Bayes inference (discrete case). Text classification. (Ch 3.1.4)

  • Oct 2
    Bayesian analysis and models for classification (3)
    Continuous variables, normal distributions, linear classifiers. (Ch 1.8; Ch 3.1; Ch 3.2)

  • Oct 4
    Bayesian analysis and models for classification (4)
    Learning from training data. Maximum likelihood. (Ch 1.9 and 2.1-2.4)

  • Oct 9
    Guest Lecture: Haym Hirsh
    Current research in text classification.

  • Oct 11
    Bayesian analysis and models for classification (5)
    General density estimation: nearest neighbor classification. (Ch 2.5)

  • Oct 16
    Bayesian analysis and models for classification (6)
    Clustering; k-means; expectation maximization.

  • Module 3: Time
    Reading: Eugene Charniak, Statistical Language Learning Chapters 1-7
    Maybeck. Introduction, from Stochastic Models, Estimation and Control
    Russell and Norvig. Chapter 15, Probabilistic Inference, from Artificial Intelligence, a Modern Approach
    Homework: Written Exercises
    Lectures: Oct 18-Nov 15.

  • Oct 18
    Models of time (1)
    Markov models and hidden Markov models.
    Charniak Chapter 2 and 3.1-3.2.

  • Oct 23
    Models of time (2)
    HMM decoding. Part-of-speech tagging.
    Charniak Chapter 3.3 and Chapter 4.1.

  • Oct 25
    Midterm.

  • Oct 30
    Models of time (3)
    HMM inference. Forward/backward.
    Charniak Chapter 4.2

  • Nov 1
    HMM inference: Training. Speech and gesture recognition.
    Graphical representations of probabilistic models.
    Charniak Chapter 4.3; Russell and Norvig Chapter 15.

  • Nov 6
    Models of time (4)
    The Kalman filter. Tracking and learning with Gaussian priors.
    Maybeck, Introduction.

  • Nov 8
    Models of hierarchical structure. Trees, CFGs and PCFGs.
    Charniak Chapter 5.

  • Nov 13
    Algorithms for PCFGs.
    Charniak Chapter 6.

  • Nov 15
    Guest Lecture: Dimitris Metaxas.
    Current research in visual tracking and recognition.

  • Module 4: Planning
    Reading: Russell and Norvig. Chapters 16, 17 and 20, from Artificial Intelligence, a Modern Approach

  • Nov 20
    General probabilistic inference: influence diagrams.

  • Nov 27
    Markov decision processes: Value iteration.

  • Nov 29
    Markov decision processes: Policy iteration.

  • Module 5: Evaluation
    Reading: Cohen. Chapters 3 and 6, from Empirical Methods for Artificial Intelligence

  • Dec 4
    Evaluation (1).
    Pitfalls and methodology.

  • Dec 6
    Evaluation (2).
    Performance metrics.

  • Dec 11
    Evaluation (3).
    Training and test data. Reliability. Cross-validation.

  • Dec 18
    4-7 pm, Hill 254: Final.
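
To make the Module 4 material concrete, here is a minimal sketch of value iteration on a toy two-state MDP. The states, actions, transition probabilities, and rewards below are invented for illustration; this is not an example from the course readings.

```python
# Value iteration on a toy 2-state, 2-action MDP (illustrative numbers only).
# Bellman optimality update: V(s) <- max_a sum_{s'} P(s'|s,a) * (r + gamma*V(s'))

# P[s][a] = list of (probability, next_state, reward) triples.
P = {
    0: {"stay": [(1.0, 0, 0.0)],
        "go":   [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
    1: {"stay": [(1.0, 1, 3.0)],
        "go":   [(1.0, 0, 0.0)]},
}
gamma = 0.9  # discount factor

def value_iteration(P, gamma, tol=1e-8):
    """Iterate the Bellman optimality update until values stop changing."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            best = max(
                sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
                for a in P[s]
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

V = value_iteration(P, gamma)

# Read off the greedy policy from the converged values.
policy = {
    s: max(P[s], key=lambda a: sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a]))
    for s in P
}
```

Policy iteration, the Nov 29 topic, instead alternates a full evaluation of the current policy with a greedy improvement step, rather than taking the pointwise max on every sweep.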

