The Minimum Description Length Principle

by Peter D. Grünwald
Original price: $60.00

Temporarily Out of Stock Online

Please check back later for updated availability.


A comprehensive introduction and reference guide to the minimum description length (MDL) principle that is accessible to researchers dealing with inductive inference in diverse areas including statistics, pattern classification, machine learning, data mining, biology, econometrics, and experimental psychology, as well as philosophers interested in the foundations of statistics.

The minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well-suited for dealing with model selection, prediction, and estimation problems in situations where the models under consideration can be arbitrarily complex, and overfitting the data is a serious concern. This extensive, step-by-step introduction to the MDL Principle provides a comprehensive reference (with an emphasis on conceptual issues) that is accessible to graduate students and researchers in statistics, pattern classification, machine learning, and data mining, to philosophers interested in the foundations of statistics, and to researchers in other applied sciences that involve model selection, including biology, econometrics, and experimental psychology.
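The blurb's core claim, that the best explanation is the one permitting the greatest compression of the data, can be illustrated with a crude two-part code. The sketch below is not from the book; it is a minimal, simplified illustration that charges each model its parameter cost (taken here as 0.5 log2 n bits per parameter, an assumption) plus the cost of the data under the fitted model, and compares a one-parameter Bernoulli model against a plain one-bit-per-symbol code:

```python
import math

def two_part_code_length(data):
    # Crude two-part MDL: parameter cost (assumed 0.5 * log2(n) bits
    # per parameter) plus data cost under the maximum-likelihood
    # Bernoulli parameter. Illustrative only.
    n = len(data)
    k = sum(data)
    p = k / n
    if p in (0.0, 1.0):
        data_bits = 0.0
    else:
        data_bits = -(k * math.log2(p) + (n - k) * math.log2(1 - p))
    param_bits = 0.5 * math.log2(n)  # one free parameter
    return param_bits + data_bits

def uniform_code_length(data):
    # Baseline "no model": one bit per binary symbol.
    return float(len(data))

biased = [1] * 90 + [0] * 10  # highly regular data
fair = [1, 0] * 50            # empirical frequency exactly 1/2

# The Bernoulli model compresses the biased sequence well below 100 bits,
# so MDL selects it there; for the fair sequence the parameter cost buys
# nothing, and the plain uniform code wins.
assert two_part_code_length(biased) < uniform_code_length(biased)
assert two_part_code_length(fair) > uniform_code_length(fair)
```

This toy comparison shows the overfitting guard the blurb alludes to: a richer model is chosen only when its better fit saves more bits than its parameters cost.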

Part I provides a basic introduction to MDL and an overview of the concepts in statistics and information theory needed to understand MDL. Part II treats universal coding, the information-theoretic notion on which MDL is built, and part III gives a formal treatment of MDL theory as a theory of inductive inference based on universal coding. Part IV provides a comprehensive overview of the statistical theory of exponential families with an emphasis on their information-theoretic properties. The text includes a number of summaries, paragraphs offering the reader a "fast track" through the material, and boxes highlighting the most important concepts.

Product Details

ISBN-13: 9780262529631
Publisher: MIT Press
Publication date: 03/23/2007
Series: Adaptive Computation and Machine Learning series
Pages: 736
Sales rank: 1,074,977
Product dimensions: 6.90(w) x 9.10(h) x 1.60(d)
Age Range: 18 Years

About the Author

Peter D. Grünwald is a researcher at CWI, the National Research Institute for Mathematics and Computer Science, Amsterdam, the Netherlands. He is also affiliated with EURANDOM, the European Research Institute for the Study of Stochastic Phenomena, Eindhoven, the Netherlands.

Customer Reviews

The Minimum Description Length Principle: 5 out of 5 based on 1 rating. 1 review.
mdreid on LibraryThing More than 1 year ago
I have not read this book from cover to cover (and doubt I will), but the sections I have read so far (universal codes, exponential families, and MDL in context) have all been excellent. I would recommend the treatment of exponential families to anyone who wants to grasp the basic ideas intuitively before moving on to more technical treatments. The MDL principle is presented in an enthusiastic but considered light. I was particularly impressed with the sober analysis of its shortcomings and myriad directions for future research. I expect this will be one of those books I will return to over the years, obtaining new insights each time.