About Me

I am currently a fourth-year PhD student in the Department of Statistics at the University of Oxford, supervised by Patrick Rebeschini and Yee Whye Teh, and funded through the joint Centre for Doctoral Training in Statistical Science run by the Universities of Warwick and Oxford. In the summer of 2020 I was an intern at Facebook AI Research (FAIR) Montréal, working with Mike Rabbat. Prior to starting my PhD I completed a bachelor's and a master's degree at the University of Warwick, where I worked with Adam Johansen on particle methods.

Research Interests

I am interested in understanding the interplay between statistics and computation in the context of machine learning. Recently, I have been investigating the performance of estimators arising from gradient descent in settings where the learning problem is more benign. I have also worked on the interplay between statistics and network topology in the context of decentralised machine learning.

Publications
  • Comparing Classes of Estimators: When does Gradient Descent Beat Ridge Regression in Linear Models? (with E. Dobriban and P. Rebeschini) (submitted) [Paper]
  • Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel (with I. Kuzborskij) (NeurIPS 2021) [Paper]
  • Distributed Machine Learning with Sparse Heterogeneous Data (with S. N. Negahban and P. Rebeschini) (NeurIPS 2021)
  • Learning with Gradient Descent and Weakly Convex Losses (with M. Rabbat) (AISTATS 2021) [Paper]
  • Asymptotics of Ridge(less) Regression under General Source Condition (with J. Mourtada and L. Rosasco) (AISTATS 2021) [Paper]
  • Decentralised Learning with Random Features and Distributed Gradient Descent (with P. Rebeschini and L. Rosasco) (ICML 2020) [Paper] [Code]
  • Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up (with P. Rebeschini) (NeurIPS 2019) [Paper]
  • Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent (with P. Rebeschini) (JMLR) [Paper] [Code]

Talks
  • 16/10/2020 - Asymptotics of Ridge(less) Regression under General Source Condition (Department of Applied Mathematics and Theoretical Physics, University of Cambridge)
  • 22/07/2020 - Asymptotics of Ridge(less) Regression under General Source Condition (Montréal Machine Learning and Optimization Group)
  • 04/06/2019 - Decentralised Sparse Multi-Task Regression (Numerical Analysis Group Internal Seminar, Mathematical Institute, University of Oxford)
  • 04/05/2019 - Optimal Statistical Rates for Non-parametric Decentralised Regression with Distributed Gradient Descent (Yale Probabilistic Networks Group, Yale University)


Email: Dominic [dot] Richards [at] spc [dot] ox [dot] ac [dot] uk

Department of Statistics
University of Oxford
24-29 St Giles'
Oxford, OX1 3LB
United Kingdom