
Many psychologists were trained in practices that are now recognised as problematic because they generate non-reproducible results. Particular problems arise when no clear distinction is made between hypothesis generation and hypothesis testing, so that the same data are used both to formulate and to test a hypothesis. Furthermore, it has not been customary to check statistical power before conducting a study, and many resources are wasted on studies that give inconclusive or erroneous results. In addition, confirmation bias leads us to disregard evidence that does not fit a preconceived idea, resulting in the canonization of false facts. I will discuss a range of solutions, including the use of simulated data in statistical training, collaborative working, open data, pre-registration of studies, and changes to the incentive structure for scientists.
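The power check mentioned above can be illustrated with a short simulation, in the spirit of the "simulated data in statistical training" solution the abstract proposes. The sketch below is an illustrative assumption, not material from the talk: it estimates the power of a two-group comparison by Monte Carlo, using a simple z test on the difference of means (a normal approximation rather than a full t test).

```python
import math
import random
import statistics

def estimate_power(effect_size, n_per_group, alpha_z=1.96,
                   n_sims=2000, seed=1):
    """Monte Carlo estimate of power for a two-group comparison.

    Simulates n_sims experiments in which one group has mean 0 and the
    other has mean `effect_size` (both with SD 1), then counts how often
    a two-sided z test on the difference of means rejects the null.
    """
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sims):
        a = [rng.gauss(0.0, 1.0) for _ in range(n_per_group)]
        b = [rng.gauss(effect_size, 1.0) for _ in range(n_per_group)]
        # Standard error of the difference of the two sample means.
        se = math.sqrt(statistics.variance(a) / n_per_group
                       + statistics.variance(b) / n_per_group)
        z = (statistics.mean(b) - statistics.mean(a)) / se
        if abs(z) > alpha_z:
            rejections += 1
    return rejections / n_sims

# A medium effect (d = 0.5) with 64 participants per group gives power
# of roughly 0.8; a small effect with the same n is badly underpowered.
print(estimate_power(0.5, 64))
print(estimate_power(0.2, 64))
```

Running such a check before data collection makes it obvious when a planned sample size can only yield the inconclusive results the abstract warns about.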


Spaces are limited; please book your place by emailing Angela Fox (