This tutorial reviews Probability starting right at ground level. It is, arguably, a useful investment to be completely happy with probability before venturing into advanced algorithms from data mining, machine learning or applied statistics. In addition to setting the stage for techniques to be used over and over again throughout the remaining tutorials, this tutorial introduces the notion of Density Estimation as an important operation, and then introduces Bayesian Classifiers such as the overfitting-prone Joint-Density Bayes Classifier and the overfitting-resistant Naive Bayes Classifier.
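To make the contrast concrete, here is a minimal sketch (not taken from the slides) of a Naive Bayes classifier on discrete attributes: instead of estimating the full joint density P(x1,...,xm | class) from counts of whole attribute vectors, which overfits quickly, it assumes the attributes are conditionally independent given the class and multiplies per-attribute estimates. All function names and the toy data below are illustrative assumptions.

```python
from collections import Counter, defaultdict

def train_naive_bayes(records, labels):
    """Estimate P(class) and P(attribute_j = value | class) from counts."""
    class_counts = Counter(labels)
    attr_counts = defaultdict(Counter)      # key (class, j) -> Counter of attribute values
    for x, y in zip(records, labels):
        for j, v in enumerate(x):
            attr_counts[(y, j)][v] += 1
    n = len(labels)
    priors = {c: class_counts[c] / n for c in class_counts}
    return priors, attr_counts, class_counts

def predict_naive_bayes(x, priors, attr_counts, class_counts):
    """Pick the class maximizing P(class) * prod_j P(x_j | class)."""
    best_class, best_score = None, -1.0
    for c, prior in priors.items():
        score = prior
        for j, v in enumerate(x):
            score *= attr_counts[(c, j)][v] / class_counts[c]
        if score > best_score:
            best_class, best_score = c, score
    return best_class

# Tiny usage example with made-up binary attributes.
X = [(1, 0), (1, 1), (0, 1), (0, 0)]
y = ["spam", "spam", "ham", "ham"]
model = train_naive_bayes(X, y)
print(predict_naive_bayes((1, 0), *model))  # prints "spam"
```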
PowerPoint Format: The PowerPoint originals of these slides are freely available to anyone who wishes to use them for their own work, or who wishes to teach using them in an academic institution. Please email Andrew Moore at awm@cs.cmu.edu if you would like him to send them to you. The only restriction is that they are not freely available for use as teaching materials in classes or tutorials outside degree-granting academic institutions.
Advertisement: I have recently joined Google, and am starting up the new Google Pittsburgh office on CMU's campus. We are hiring creative computer scientists who love programming, and Machine Learning is one of the focus areas of the office. If you might be interested, feel welcome to send me email: awm@google.com .