How Siri Works: iPhone's 'Brain' Comes from Natural Language Processing, Stanford Professors to Teach Free Online Course
Ever wonder how Siri understands you? The technology is called natural language processing, and starting in January, two of the field's leaders from Stanford will teach a free online course on how to build these kinds of leading-edge programs.
Here's the scoop from Stanford's Online Education Site (all rights Stanford):
The course plan calls for a wide overview of natural language processing topics, running the gamut from word and sentence tokenization, text classification, and sentiment analysis to spelling correction, information extraction, parsing, meaning extraction, and question answering.
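To give a flavor of the very first topic on that list, here is a minimal word- and sentence-tokenization sketch. This is not course material, just an illustrative toy using regular expressions; real tokenizers handle many more edge cases (abbreviations, URLs, contractions) than this does.

```python
import re

def word_tokenize(text):
    # Crude first pass: keep runs of word characters as tokens,
    # and split off each punctuation mark as its own token.
    return re.findall(r"\w+|[^\w\s]", text)

def sentence_tokenize(text):
    # Naive split on sentence-final punctuation followed by whitespace.
    return re.split(r"(?<=[.!?])\s+", text.strip())

print(word_tokenize("Siri, what's the meaning of life?"))
# ['Siri', ',', 'what', "'", 's', 'the', 'meaning', 'of', 'life', '?']
print(sentence_tokenize("One. Two! Three?"))
# ['One.', 'Two!', 'Three?']
```

Note how even the contraction "what's" trips up this simple pattern, splitting it into three tokens; deciding what counts as a word is exactly the kind of question the course digs into.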
In other words, everything from spell-checker-style functions to Siri's enigmatic responses to philosophical questions. According to the pair, "We will also introduce the underlying theory from probability, statistics, and machine learning that are crucial for the field, and cover fundamental algorithms like n-gram language modeling, naive Bayes and maxent classifiers, sequence models like Hidden Markov Models, probabilistic dependency and constituent parsing, and vector-space models of meaning."
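If "n-gram language modeling" sounds intimidating, the core idea is just counting. Here is a hedged, minimal sketch (my own toy illustration, not code from the course) of a bigram model: estimate the probability of the next word given the previous one by counting pairs in a corpus.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Estimate P(next | prev) from a list of token lists by counting pairs."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        # <s> and </s> mark sentence start and end.
        tokens = ["<s>"] + sentence + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    # Convert raw counts to maximum-likelihood probabilities.
    return {prev: {w: c / sum(nxts.values()) for w, c in nxts.items()}
            for prev, nxts in counts.items()}

corpus = [["i", "love", "nlp"], ["i", "love", "siri"]]
model = train_bigram(corpus)
print(model["love"])  # {'nlp': 0.5, 'siri': 0.5}
print(model["<s>"])   # {'i': 1.0}
```

In this tiny corpus, "love" is followed by "nlp" half the time and "siri" the other half, so the model assigns each a probability of 0.5; real systems train on billions of words and smooth the counts to handle unseen pairs.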
OK, so this is a real (read: tough) course. If you aren't prepared to do the work, you will be lost as fast as any English major who ever wandered into a computational linguistics class. That said, these guys are MAJOR leaders in the field, and this course can level the playing field for you if you are determined enough to keep up your A-game.
The pair of profs have major street cred. According to their writeup on the Stanford site, "Professors Jurafsky and Manning are the leading natural language processing educators, through their textbooks on natural language processing, speech, and information retrieval."
Here's what they look like and their brief bios, again from Stanford:
Dan Jurafsky is Professor of Linguistics and Professor by Courtesy of Computer Science at Stanford University. Dan received his bachelor's degree in Linguistics in 1983 and his Ph.D. in Computer Science in 1992, both from the University of California at Berkeley, and taught at the University of Colorado, Boulder before joining the Stanford faculty in 2004. He is the recipient of a MacArthur Fellowship and has served on a variety of editorial boards, corporate advisory boards, and program committees. Dan's research extends broadly throughout natural language processing as well as its application to the behavioral and social sciences.
Christopher Manning is an Associate Professor of Computer Science and Linguistics at Stanford University. Chris received a bachelor's degree and University Medal from the Australian National University and a Ph.D. from Stanford in 1994, both in Linguistics. Chris taught at Carnegie Mellon University and the University of Sydney before joining the Stanford faculty in 1999. He is a Fellow of the American Association for Artificial Intelligence, and is one of the most cited authors in natural language processing, for his research on a broad range of statistical natural language topics from tagging and parsing to grammar induction and text understanding.
The class starts January 23, 2012, so you have that much time to tear yourself away from office parties and brush up on Bayesian logic and all the other things you will be well advised to have at least an introductory grasp of before embarking on this adventure. Classes will be delivered by video, two hours a week, with ungraded homework (and yes, you'd better do it), real programming exercises, and then a machine-graded final exam. They also hope to transcribe the lectures into text to make them more accessible for those not fluent in English. But as to that, they advised, "Stay tuned."
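Speaking of brushing up on Bayesian logic, the one formula worth having cold before day one is Bayes' rule. Here is a quick refresher with made-up numbers (a classic spam-filter toy, not anything from the course itself):

```python
def bayes(p_evidence_given_h, p_h, p_evidence):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    return p_evidence_given_h * p_h / p_evidence

# Toy numbers, invented for illustration: 80% of spam contains the word
# "free", 20% of all mail is spam, and 30% of all mail contains "free".
p_spam_given_free = bayes(0.8, 0.2, 0.3)
print(round(p_spam_given_free, 3))  # 0.533
```

So a message containing "free" is spam with probability about 53% under these toy numbers; the naive Bayes classifiers on the syllabus apply exactly this rule, word by word, across a whole message.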
If you get lost, there will be a Q&A forum, which should help. The teaching staff will monitor it, and if no one else can answer your question, they will.
© Copyright IBTimes 2024. All rights reserved.