Essentials of Statistical Inference - Cambridge Statistical & Probabilistic Mathematics Series #16 | Data Analysis Textbook for Researchers & Students | Perfect for Academic Studies, Research Papers & Statistical Modeling
$97.50
$130
Save 25%
Delivery & Return: Free shipping on all orders over $50
Estimated Delivery: 10-15 days international
SKU: 12917330
Guaranteed safe checkout
Accepted payments: Amex, PayPal, Discover, Mastercard, Visa, Apple Pay, Shop
Description
This textbook presents the concepts and results underlying the Bayesian, frequentist, and Fisherian approaches to statistical inference, with particular emphasis on the contrasts between them. Aimed at advanced undergraduates and graduate students in mathematics and related disciplines, it covers basic mathematical theory as well as more advanced material, including such contemporary topics as Bayesian computation, higher-order likelihood theory, predictive inference, bootstrap methods, and conditional inference.
Shipping & Returns

Free shipping is offered on all orders exceeding 100 USD.

Returns are accepted on unworn items for up to 10 days from the Customer's receipt or tracking number. Customers must inform us via email before returning an item.

Otherwise, standard shipping charges apply. Check out our delivery Terms & Conditions for more details.

Reviews
Rating: 5/5 (Verified Buyer)
This short book covers all the major topics in statistical theory. I feel the authors strike the perfect balance between building intuition and mathematical detail. Most standard results are accompanied by proofs; however, instead of full mathematical generality, the proofs are often presented under simplified assumptions that still bring out the essence of the main ideas. Measure theory is kept to a minimum; whenever it is needed, the authors use a discrete setting instead.

The book could be used as a refresher, a reference, or a study guide to accompany another statistics book (such as Casella & Berger). It is definitely geared towards readers who have studied statistical inference before. The minimum mathematical prerequisites are calculus at the level of elementary analysis, linear algebra, discrete and continuous probability, and familiarity with convergence of random variables. Basic group theory is used in one place to define transformation families.

It's incredible how much the authors cover in this short book. It starts out with decision theory in Chapter 2, which unifies the frequentist and Bayesian approaches to statistical inference under finding decisions that minimize risk. Chapter 3 is fully devoted to Bayesian analysis and covers conjugate priors, Bayesian confidence intervals, empirical Bayes, hierarchical models, prediction, and computational techniques. There is also a nice description of the James-Stein estimator, a surprising result in which seemingly unrelated data are used to improve an estimator. Chapters 4-8 then cover the standard statistical inference topics from the frequentist perspective: hypothesis testing and interval estimation, sufficient and ancillary statistics, the conditionality principle, exponential models, maximum likelihood, and Fisher information. I especially enjoyed Chapter 8 on maximum likelihood; the authors do a superb job of intuitively justifying the asymptotic normality of maximum likelihood estimators.

The next three chapters deal with topics not usually encountered in first courses on statistics. Chapter 9 is a dense chapter on asymptotic density approximations. It deals with approximations such as the CLT, but with more terms to improve the convergence speed. There is not much motivation or intuition here; most results are simply stated without proofs. It could serve as a good first exposure to this material, but any deeper understanding will require consulting other books devoted to the subject.

Chapter 10 deals with predicting random variables and their distributions, rather than inference about unknown parameters. This is probably the closest chapter to what is currently known as "machine learning". Here it is shown that better predictions can be made directly, instead of by first estimating the parameters and then plugging them into the distribution to make predictions. Usually the topic of formulating predictive distributions is covered only from the Bayesian viewpoint, but this chapter shows how it can also be done using the frequentist approach. It covers several such methods: pivots, predictive likelihood, and the bootstrap. This chapter is also short on intuition and proofs, and only scratches the surface of the subject. It could serve as a first exposure to build your interest, but you'll have to explore the references to make it useful in applied work. The section on predictive likelihood looks very similar to the two papers "Predictive inference: a review" by Bjornstad and "Predictive inference" by Hinkley. I suggest reading those papers instead, since the book doesn't do such a good job of motivating these ideas.

The last chapter, Chapter 11, deals with the bootstrap. This has now become a very popular method of building empirical distributions for estimators without relying on analytic derivations or asymptotic approximations. Notation is very important when presenting bootstrap methods, since it must be kept clear which distribution is the model, which distributions the bootstrap samples are drawn from, and what the various parameters and their estimates are. I feel the authors do a good job of presenting this material without ambiguity.

Each chapter is relatively short and usually focuses on a few key concepts. The authors do a very good job of using examples to motivate and elucidate the material. I would actually like to see even more examples, especially of various sufficient and ancillary statistics. The exercises following each chapter are not incredibly hard; they focus on applying the theory in various special cases, providing more examples for the reader.

In summary, this is a very readable and concise summary of the main topics in statistical inference, with just enough mathematical detail to be interesting and elucidating, while not becoming too dry. I would definitely recommend it to anyone who has studied statistical theory before and would like another source of this material for reference.
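As a rough illustration of the nonparametric bootstrap the review describes for Chapter 11, here is a minimal Python sketch. It is not code from the book; the simulated data, the choice of the median as the statistic, and the 95% confidence level are arbitrary assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(data, statistic, n_resamples=5000, alpha=0.05):
    # Illustrative sketch only (not from the book): percentile bootstrap
    # confidence interval. The empirical distribution of `statistic` is
    # built by resampling the observed data with replacement, with no
    # analytic derivation or asymptotic approximation.
    data = np.asarray(data)
    estimates = np.empty(n_resamples)
    for b in range(n_resamples):
        resample = rng.choice(data, size=data.size, replace=True)
        estimates[b] = statistic(resample)
    lower, upper = np.quantile(estimates, [alpha / 2, 1 - alpha / 2])
    return lower, upper

# Example usage: bootstrap 95% interval for the median of a simulated sample.
sample = rng.normal(loc=10.0, scale=2.0, size=50)
print(bootstrap_ci(sample, np.median))

The resampling distribution here plays the role of the "distribution for drawing bootstrap samples" the reviewer mentions, while the simulated normal sample stands in for data from the unknown modeling distribution.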

You Might Also Like