Preface

The provocative and alarming title of Ioannidis' research article from 2005, "Why Most Published Research Findings Are False," addressed a growing concern that the current practice of statistics could, to a significant extent, be flawed. In his book Willful Ignorance: The Mismeasure of Uncertainty (2014), Herbert Weisberg pointed out the nearly abyssal difference in perspective between quantitative and qualitative studies and reflected on the historical evolution of statistics, displeased by the fact that practicing scientists nowadays systematically avoid discussing the meaning of probability, despite its having been intensely debated for over two centuries. These doubts were echoed by Alex Reinhart, who in his book Statistics Done Wrong: The Woefully Complete Guide (2015) discussed a number of questionable ways in which statistical analyses of today are performed.

The famous statistician R.A. Fisher remarked in the early 20th century: "The tendency of modern scientific teaching is to neglect the great books, to lay far too much stress upon relatively unimportant modern work, and to present masses of detail of doubtful truth and questionable weight in such a way as to obscure principles." In a modern society flooded not only by massive unleashed computational power but also by data-driven artificial intelligence and unprecedented means of crafting stunning graphics, it is not at all surprising that people at large struggle to distinguish facts from fakes. Fisher's concern from the 1930s suddenly appears more alive than ever before. The need to re-evaluate statistical practices thus seems to grow steadily the more they are utilized.

These observations suggest that current statistical methodologies may have drifted too far toward an engineering practice applied with minimal questioning. As an example, multiple tests are often launched in scientific studies to enhance the likelihood of finding a significant result, although this should be impossible …
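The multiple-testing effect mentioned above can be illustrated with a short simulation (an illustrative sketch of the standard family-wise error rate argument, not an example from any of the chapters): if m independent tests are run at significance level α and every null hypothesis is in fact true, the chance of at least one spuriously "significant" result is 1 − (1 − α)^m, which grows quickly with m.

```python
import random

def prob_at_least_one_false_positive(m, alpha=0.05, trials=10_000, seed=1):
    """Estimate by simulation the chance that at least one of m
    independent null-true tests comes out 'significant' at level alpha."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Under a true null hypothesis, a p-value is uniform on [0, 1].
        if any(rng.random() < alpha for _ in range(m)):
            hits += 1
    return hits / trials

# Analytic family-wise error rate: 1 - (1 - alpha)^m.
# For m = 20 tests at alpha = 0.05 this is about 0.64.
print(prob_at_least_one_false_positive(1))   # close to 0.05
print(prob_at_least_one_false_positive(20))  # close to 0.64
```

With twenty independent tests, a spurious finding is thus more likely than not, which is precisely why unreported multiple testing undermines published significance claims.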

In the modern age pervaded by data in various forms, statistical processing affects almost everyone, whether we like it or not. Internet-based analyses operate on sample sets almost as large as the populations they are drawn from. Behavioral, political, and customer surveys are conducted frequently. Despite this, the basic principles and practices of statistics have remained remarkably intact over the years. In stark contrast stands the rapid evolution of computer programming methodologies and the development of convenient, flexible, high-level, and broadly accessible software tools for statistical analysis and presentation, such as Python, R, and MATLAB/Octave. To fully utilize their potential rather than be distracted and paralyzed by their breathtaking performance, good statistical methodologies and practices are of greater value than ever before. This context provided the basic motivation for this book.

A selection of aspects of statistical methodology is presented here by independent authors. The chapters are not meant to be exhaustive or representative, even though each contribution is self-contained and complete within its task.
