“It is utterly implausible that a mathematical formula should make the future known to us, and those who think it can would once have believed in witchcraft.” Jakob Bernoulli (1655–1705), Ars Conjectandi (1713)
Biographical Information
I specialise in bridging the divide between expert knowledge and analytical application, ensuring qualitative insights are fully leveraged by quantitative modellers and key decision makers. I have applied this skill across diverse sectors, including government policy, industrial planning, and scientific discovery, to transform complex information into strategic value.
More generally, I am an applied statistician with experience in developing novel statistical methods within a Bayesian framework. Immediately before joining Durham in 2022, I worked for two-and-a-half years in industry modelling large-scale flood events for JBA Risk Management Ltd. Until late 2019, I was an associate professor of statistics and director of research and innovation in the School of Mathematics at the University of Leeds. I also provide consultancy services on Bayesian statistics, expert elicitation and accounting for uncertainty.
Before joining Leeds, I spent three years as a civil servant at the Food and Environment Research Agency. My role there was to develop new methodologies to help characterise and propagate uncertainties in risk assessments.
My formal training in statistics took place at the University of York, where Peter Lee taught many of my undergraduate statistics courses, and then at the University of Sheffield under the supervision of Jeremy Oakley and Tony O’Hagan. At Sheffield, I spent three years as a PhD student and three years as a postdoctoral researcher working on expert elicitation and the analysis of complex computer models. Even further back, before taking up my PhD position, I almost turned my back on studying altogether to become a pub landlord (but that is a different story…).
Current Research Interests
My research to date has focused on what to do in situations where data are sparse: how can we use computer simulators and expert knowledge when making decisions? In particular, I have looked at:
- Structured elicitation of expert beliefs and the potential for large language models in helping experts to make judgements;
- Uncertainty and sensitivity analysis of complex computer models;
- Robustness of Bayesian inference to changes in the model specification (see Michael Goldstein’s 2006 article for the standpoint I find most compelling);
- Modelling of preferences through networks (specifically CP-nets).
This list is not exhaustive, and I am particularly interested in pursuing methodological developments that will help to increase the uptake of the Bayesian and subjectivist standpoints. Much of my previous research was motivated by the reduction of the use of animals in toxicological risk assessment. I was the principal investigator on the NC3Rs- and EPSRC-funded project “Uncertainty and confidence in applying mathematical models and in vitro data in toxicological safety assessments”.