

Research projects


Non-parametric Bayesian updates by kernel densities


One of the big attractions for people adopting Bayesian methods is the promise of "updating" their parameter estimates and predictions as more data arrive: yesterday's posterior becomes today's prior. In practice, this is not always simple, requiring at the very least a complete set of sufficient statistics, random samples from an unchanging population, and prior distributions whose form does not change. Sometimes, one would like to update without imposing an a priori distributional form on yesterday's posterior and without estimating lots of statistics. I discuss a kernel approach, which is easily incorporated in Stan by an additional target+= statement with uniform proposal densities. I compare this with parametric updates, and explore the potential to reduce computation by using kernels weighted by counts of posterior draws inside hypercubes of parameter space.
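As a rough illustration of that single target+= statement, here is a minimal Stan sketch for one scalar parameter. The Gaussian kernel, the Gaussian likelihood and all variable names are illustrative assumptions rather than the method in the talk; the full approach, with uniform proposal densities, is in the linked R code.

data {
  int<lower=1> N;          // size of the new data batch
  vector[N] y;             // new observations
  int<lower=1> M;          // number of stored draws from yesterday's posterior
  vector[M] theta_old;     // yesterday's posterior draws
  real<lower=0> h;         // kernel bandwidth
  real<lower=0> sigma;     // observation SD, treated as known for simplicity
}
parameters {
  real theta;              // gets only Stan's implicit flat prior
}
model {
  // kernel density estimate of yesterday's posterior, evaluated at theta,
  // added to the log target in place of a parametric prior
  vector[M] lk;
  for (m in 1:M)
    lk[m] = normal_lpdf(theta | theta_old[m], h);
  target += log_sum_exp(lk) - log(M);

  // likelihood of the new batch of data
  y ~ normal(theta, sigma);
}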

This is being presented at StanCon 2019 in Cambridge, UK, and Bayes Comp 2020 in Gainesville, FL, USA.

You can access more detail and all the code (in R) here.




Rehabilitation after spinal fusion surgery


A collaboration with University College Hospital and St George's, University of London, to design and run a clinical trial of an enhanced physiotherapy rehabilitation programme. Rather than just running a simple hypothesis test on a single endpoint, we are helping to design a Bayesian structural equation model that takes the physical, psychological and social aspects of recovery into account together.




Simulation methods in agent-based models for sustainability transitions


We are contributing statistical advice to this Global Climate Forum project. The goal is reliable but complex modelling of how social networks and economies evolve as environmentally sustainable technologies are adopted, for example the switch from hydrocarbon-fuelled to electric cars.




More research is coming soon, including meta-analysis code, algorithms to handle huge numbers of group-level parameters efficiently, and alternatives to random forests or xgboost when you have a small dataset.


You can contact Robert at robert@bayescamp.com to discuss bespoke training for your team, or one-to-one coaching.



BayesCamp Ltd is incorporated in England and Wales, number 10666858. The term 'BayesCamp' and the Gaussian tent logo are registered trademarks.