Book: The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World

Chapter Seven

Frank Abagnale details his exploits in his autobiography, Catch Me If You Can, cowritten with Stan Redding (Grosset & Dunlap, 1980). The original technical report on the nearest-neighbor algorithm by Evelyn Fix and Joe Hodges is “Discriminatory analysis: Nonparametric discrimination: Consistency properties”* (USAF School of Aviation Medicine, 1951). Nearest Neighbor (NN) Norms,* edited by Belur Dasarathy (IEEE Computer Society Press, 1991), collects many of the key papers in this area. Locally linear regression is surveyed in “Locally weighted learning,”* by Chris Atkeson, Andrew Moore, and Stefan Schaal (Artificial Intelligence Review, 1997). The first collaborative filtering system based on nearest neighbors is described in “GroupLens: An open architecture for collaborative filtering of netnews,”* by Paul Resnick et al. (Proceedings of the 1994 ACM Conference on Computer-Supported Cooperative Work, 1994). Amazon’s collaborative filtering algorithm is described in “Amazon.com recommendations: Item-to-item collaborative filtering,”* by Greg Linden, Brent Smith, and Jeremy York (IEEE Internet Computing, 2003). (See Chapter 8’s further readings for Netflix’s.) Recommender systems’ contribution to Amazon and Netflix sales is referenced in, among others, Mayer-Schönberger and Cukier’s Big Data and Siegel’s Predictive Analytics (cited earlier). The 1967 paper by Tom Cover and Peter Hart on nearest-neighbor’s error rate is “Nearest neighbor pattern classification”* (IEEE Transactions on Information Theory).
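For readers who want to see the idea behind Fix and Hodges’ report in code, here is a minimal sketch of k-nearest-neighbor classification in Python: classify a query point by majority vote among the k closest training examples. The data and function names are illustrative, not taken from any of the papers above.

```python
# Minimal k-nearest-neighbor sketch: predict the majority label among
# the k training points closest to the query. All names illustrative.
from collections import Counter
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    """train is a list of (feature_vector, label) pairs."""
    neighbors = sorted(train, key=lambda ex: euclidean(ex[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy usage: two small clusters in the plane.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.2, 0.1)))  # -> "a"
```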

The curse of dimensionality is discussed in Section 2.5 of The Elements of Statistical Learning,* by Trevor Hastie, Rob Tibshirani, and Jerry Friedman (2nd ed., Springer, 2009). “Wrappers for feature subset selection,”* by Ron Kohavi and George John (Artificial Intelligence, 1997), compares attribute selection methods. “Similarity metric learning for a variable-kernel classifier,”* by David Lowe (Neural Computation, 1995), is an example of a feature weighting algorithm.
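The curse of dimensionality is easy to demonstrate numerically. The following sketch (arbitrary sample size, dimensions, and seed, chosen only for illustration) draws uniform points in the unit hypercube and shows that, as the dimension grows, a query’s nearest and farthest neighbors become nearly equidistant, so “nearest” carries less and less information.

```python
# As dimension d grows, the gap between the nearest and farthest
# neighbor shrinks relative to the distances themselves.
import numpy as np

rng = np.random.default_rng(0)
for d in (2, 10, 100, 1000):
    points = rng.random((1000, d))   # uniform points in the unit hypercube
    query = rng.random(d)
    dists = np.linalg.norm(points - query, axis=1)
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"d={d:5d}  relative contrast={contrast:.3f}")  # falls toward 0
```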

“Support vector machines and kernel methods: The new generation of learning machines,”* by Nello Cristianini and Bernhard Schölkopf (AI Magazine, 2002), is a mostly nonmathematical introduction to SVMs. The paper that started the SVM revolution was “A training algorithm for optimal margin classifiers,”* by Bernhard Boser, Isabelle Guyon, and Vladimir Vapnik (Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 1992). The first paper applying SVMs to text classification was “Text categorization with support vector machines,”* by Thorsten Joachims (Proceedings of the Tenth European Conference on Machine Learning, 1998). Chapter 5 of An Introduction to Support Vector Machines,* by Nello Cristianini and John Shawe-Taylor (Cambridge University Press, 2000), is a brief introduction to constrained optimization in the context of SVMs.
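As a concrete illustration of the maximum-margin idea, here is a minimal sketch using scikit-learn’s SVC on toy data; this is not the original Boser-Guyon-Vapnik code, and the large C value is used only to approximate a hard margin on separable data.

```python
# Train a linear maximum-margin classifier on two separable classes.
from sklearn.svm import SVC

X = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.2]]  # toy data
y = [0, 0, 1, 1]

clf = SVC(kernel="linear", C=1e6)  # large C approximates a hard margin
clf.fit(X, y)
print(clf.support_vectors_)        # the boundary depends only on these points
print(clf.predict([[0.1, 0.0], [1.1, 0.9]]))  # -> [0 1]
```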

Case-Based Reasoning,* by Janet Kolodner (Morgan Kaufmann, 1993), is a textbook on the subject. “Using case-based retrieval for customer technical support,”* by Evangelos Simoudis (IEEE Expert, 1992), explains its application to help desks. IPsoft’s Eliza is described in “Rise of the software machines” (Economist, 2013) and on the company’s website. Kevin Ashley explores case-based legal reasoning in Modeling Legal Arguments* (MIT Press, 1991). David Cope summarizes his approach to automated music composition in “Recombinant music: Using the computer to explore musical style” (IEEE Computer, 1991). Dedre Gentner proposed structure mapping in “Structure mapping: A theoretical framework for analogy”* (Cognitive Science, 1983). “The man who would teach machines to think,” by James Somers (Atlantic, 2013), discusses Douglas Hofstadter’s views on AI.

The RISE algorithm is described in my paper “Unifying instance-based and rule-based induction”* (Machine Learning, 1996).
