This page will be updated shortly before the midterm and final exams to reflect what we actually covered this semester.

Midterm exam

Reading Guide:

Listed below is the minimum you should be prepared to do. The list is not all-inclusive, but you should at least be able to do the following:

Homework 1 and 2

==> Logistic Regression, Linear Regression

  • How do you update the parameters for linear/logistic regression?
  • What is the cost function and what is the log-likelihood?
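For a concrete reference, here is a minimal NumPy sketch (not the course's code; function names are illustrative) of the batch gradient-ascent update for logistic regression, whose log-likelihood is sum_i [ y_i log h(x_i) + (1 - y_i) log(1 - h(x_i)) ]:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_update(theta, X, y, lr=0.1):
    """One batch gradient-ascent step on the log-likelihood.
    The gradient of the log-likelihood w.r.t. theta is X^T (y - h)."""
    h = sigmoid(X @ theta)            # predicted probabilities h(x_i)
    return theta + lr * X.T @ (y - h)

# toy data: two 2-D points with labels 0 and 1
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([0.0, 1.0])
theta = np.zeros(2)
for _ in range(100):
    theta = logistic_update(theta, X, y)
```

Linear regression follows the same update pattern, with h(x) = theta . x and the squared-error cost in place of the log-likelihood.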

==> Naive Bayes (NB), Laplace smoothing

  • How to estimate parameters for the likelihood functions using Bayes' rule?
  • How to calculate/predict class labels in generative models?
  • How do you handle continuous and discrete X in NB?
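As a concrete sketch of the discrete case (toy data and function names are my own; this assumes binary features and add-one Laplace smoothing):

```python
import numpy as np

def nb_fit(X, y, k=2):
    """Bernoulli NB parameters with Laplace smoothing.
    X: (n, d) binary matrix; y: (n,) labels in {0, 1}.
    theta[c, j] = P(x_j = 1 | y = c) = (count + 1) / (n_c + k)."""
    theta = np.zeros((2, X.shape[1]))
    prior = np.zeros(2)
    for c in (0, 1):
        Xc = X[y == c]
        theta[c] = (Xc.sum(axis=0) + 1) / (len(Xc) + k)
        prior[c] = len(Xc) / len(X)
    return prior, theta

def nb_predict(x, prior, theta):
    """argmax_c  log P(y = c) + sum_j log P(x_j | y = c)."""
    logp = np.log(prior) + (np.log(theta) * x
                            + np.log(1 - theta) * (1 - x)).sum(axis=1)
    return int(np.argmax(logp))

X = np.array([[1, 0], [1, 1], [0, 1], [0, 0]])
y = np.array([1, 1, 0, 0])
prior, theta = nb_fit(X, y)
pred = nb_predict(np.array([1, 0]), prior, theta)
```

For continuous X, the per-feature likelihoods P(x_j | y) are instead modeled with a parametric density (typically a Gaussian with per-class mean and variance).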

Generative vs discriminative

  • How many parameters are needed for generative classifiers?
  • How does the NB assumption reduce that number?
  • How do you calculate conditional probabilities from joint probabilities?
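A short numeric illustration of both points (the joint-probability values below are hypothetical, chosen only so the arithmetic is easy to follow):

```python
# Parameter counting: a full joint over d binary features needs
# 2^d - 1 parameters per class; the NB assumption
# P(x | y) = prod_j P(x_j | y) reduces this to d per class.
d = 20
full_joint_params = 2 ** d - 1   # 1,048,575 per class
nb_params = d                    # 20 per class

# Conditional from joint: P(y | x) = P(x, y) / sum over y' of P(x, y').
p_joint = {0: 0.1, 1: 0.3}       # hypothetical P(x, y=0), P(x, y=1) for a fixed x
p_y1_given_x = p_joint[1] / (p_joint[0] + p_joint[1])   # = 0.75
```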

Kernel

Stanford lecture notes (SML) 5.1, 5.2, 5.4 (page 53, 54, 55)
  • Focus on: what kinds of functions K(·, ·) can correspond to some feature map φ?
  • How to calculate φ(x) from x?
  • How to calculate the weight parameters (w/theta) given the decision function? You will find examples in the sample exams.
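As a worked sketch (my own toy numbers; the dual coefficients at the end are hypothetical), the 2-D quadratic kernel K(x, z) = (x . z)^2 has an explicit feature map, and the primal weights can be recovered from dual coefficients via w = sum_i alpha_i y_i phi(x_i):

```python
import numpy as np

def K(x, z):
    """Polynomial kernel K(x, z) = (x . z)^2 for 2-D inputs."""
    return float(np.dot(x, z)) ** 2

def phi(x):
    """Explicit feature map: phi(x) = [x1^2, sqrt(2) x1 x2, x2^2],
    chosen so that phi(x) . phi(z) = (x . z)^2."""
    x1, x2 = x
    return np.array([x1 * x1, np.sqrt(2) * x1 * x2, x2 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 1.0])
# both routes give (1*3 + 2*1)^2 = 25
via_kernel = K(x, z)
via_map = float(phi(x) @ phi(z))

# Recovering primal weights from (hypothetical) dual coefficients:
alphas, ys, points = [0.5, 0.5], [1.0, -1.0], [x, z]
w = sum(a * yi * phi(xi) for a, yi, xi in zip(alphas, ys, points))
```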

SVM

  • Impact of offset, impact of C and slack variable (slides show examples)
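A small numeric sketch of the slack variables and the role of C (toy points and names are my own; this evaluates the soft-margin objective at a fixed w, b rather than solving the SVM):

```python
import numpy as np

def slack(w, b, X, y):
    """Slack variables xi_i = max(0, 1 - y_i (w . x_i + b)):
    zero for points outside the margin, positive for violations."""
    return np.maximum(0.0, 1.0 - y * (X @ w + b))

def svm_objective(w, b, X, y, C):
    """Soft-margin objective (1/2)||w||^2 + C * sum_i xi_i.
    Larger C penalizes margin violations more heavily."""
    return 0.5 * np.dot(w, w) + C * slack(w, b, X, y).sum()

X = np.array([[2.0, 0.0], [-2.0, 0.0], [0.5, 0.0]])
y = np.array([1.0, -1.0, 1.0])
w, b = np.array([1.0, 0.0]), 0.0
xis = slack(w, b, X, y)            # third point is inside the margin: xi = 0.5
obj_small_C = svm_objective(w, b, X, y, C=0.1)   # violation barely matters
obj_large_C = svm_objective(w, b, X, y, C=10.0)  # violation dominates
```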

Bias - variance, and Cross-validation

  • How is model complexity connected to bias, variance, and test error?
  • How do L1 and L2 regularization affect classifiers?
  • How does cross-validation help us generalize better?
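The mechanics of k-fold cross-validation can be sketched in a few lines (a minimal version, assuming contiguous folds and no shuffling; the function name is illustrative):

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k folds; each fold serves once as the
    held-out validation set, with the remaining points as training data."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    splits, start = [], 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        splits.append((train, val))
        start += size
    return splits

splits = kfold_indices(10, 5)
# every point is validated exactly once across the 5 folds,
# so the averaged validation error estimates test error without
# touching a separate test set
```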

Multiclass classification

  • How is multiclass classification connected to Logistic Regression (the general idea)?
  • What is their loss function (i.e., how do you train a multiclass classifier)?
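For reference, a minimal sketch of the softmax probabilities and the cross-entropy training loss (logistic regression is the two-class special case; the toy scores are my own):

```python
import numpy as np

def softmax(scores):
    """Turn per-class scores into probabilities that sum to 1."""
    e = np.exp(scores - scores.max())   # shift by max for numerical stability
    return e / e.sum()

def cross_entropy(scores, label):
    """Multiclass training loss for one example: -log P(y = label | x)."""
    return -np.log(softmax(scores)[label])

scores = np.array([2.0, 1.0, 0.1])   # toy scores for 3 classes
p = softmax(scores)
loss = cross_entropy(scores, 0)      # small when class 0 gets high probability
```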

Sample Questions

I am providing some sample exams. Please be reminded that not all topics in the questions have been covered in our classes; I have crossed those out. If you still see a question on an unfamiliar topic, do not focus on it. Some topics, such as bias-variance, may appear in similar questions but with classifiers/models we have covered. Some old midterms from the CMU 10-701 course: