Since September 2016, I have been an assistant professor at the Center for Applied Mathematics (CMAP) at Ecole Polytechnique, near Paris. My research interests focus on theoretical statistics and machine learning, with a particular emphasis on nonparametric estimation. I did my PhD thesis on random forests, a machine learning algorithm, under the supervision of Gérard Biau (LSTA - Paris 6) and Jean-Philippe Vert (Institut Curie).
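As a minimal illustration of the kind of model studied in my thesis, the sketch below fits a random forest regressor with scikit-learn; the synthetic dataset and hyperparameter choices are purely illustrative and not taken from any of the papers listed on this page.

```python
# Illustrative random forest regression on synthetic data.
# Assumes scikit-learn is installed; all choices below are arbitrary examples.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.RandomState(0)
X = rng.uniform(size=(500, 2))                       # 500 points in [0, 1]^2
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=500)  # noisy signal

# An ensemble of 100 randomized trees, averaged at prediction time.
forest = RandomForestRegressor(n_estimators=100, random_state=0)
forest.fit(X, y)

pred = forest.predict(X[:5])
print(pred.shape)  # (5,)
```

Each tree is grown on a bootstrap sample with random feature selection at every split, and the forest prediction is the average of the individual tree predictions.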
A new graduate degree in Artificial Intelligence opened in September 2018 at Ecole Polytechnique.
The official program website is here, and additional information on the scientific content can be found here.
A short summary is also available here (presentation from March 2019).
If you are interested, applications can be submitted here.
- Winner of the Jacques Neveu 2016 Prize for a thesis in the field of probability or statistics.
Jaouad Mourtada (2016-2020)
Ph.D. student co-supervised with Stéphane Gaïffas
Nicolas Prost (2018-2019)
Ph.D. student co-supervised with Julie Josse and Gaël Varoquaux
Clément Bénard (2018-2021)
Ph.D. student co-supervised with Gérard Biau and Sébastien Da Veiga
Ludovic Arnould (2020-)
Ph.D. student co-supervised with Claire Boyer
Bénédicte Colnet (2020-)
Ph.D. student co-supervised with Julie Josse and Gaël Varoquaux
Alexis Ayme (2021-)
Ph.D. student co-supervised with Claire Boyer and Aymeric Dieuleveut
- J. Josse, N. Prost, E. Scornet, G. Varoquaux. On the consistency of supervised learning with missing values, 2019.
- B. Colnet, J. Josse, G. Varoquaux, E. Scornet. Reweighting the RCT for generalization: finite sample analysis and variable selection, 2022.
- A. Ayme, C. Boyer, A. Dieuleveut, E. Scornet. Naive imputation implicitly regularizes high-dimensional linear models, 2023.
- Scornet, E., Biau, G. and Vert, J.-P. (2015). Consistency of random forests, The Annals of Statistics, Vol. 43, pp. 1716-1741 (Supplementary materials).
- Scornet, E. (2016). On the asymptotics of random forests, Journal of Multivariate Analysis, Vol. 146, pp. 72-83.
- Scornet, E. (2016). Random forests and kernel methods, IEEE Transactions on Information Theory, Vol. 62, pp. 1485-1500.
- Biau, G., Scornet, E. (2016). A Random Forest Guided Tour, TEST, Vol. 25, pp. 197-227. (Discussion).
- Scornet, E. (2016). Promenade en forêts aléatoires, MATAPLI, Vol. 111.
- E. Bernard, Y. Jiao, E. Scornet, V. Stoven, T. Walter and J.-P. Vert (2017) Kernel multitask regression for toxicogenetics, Molecular Informatics, Vol. 36.
- J. Mourtada, S. Gaïffas, E. Scornet (2017). Universal consistency and minimax rates for online Mondrian Forest, NIPS 2017 (Supplementary materials).
- Scornet, E. (2017). Tuning parameters in random forests, ESAIM Procs, Vol. 60, pp. 144-162.
- R. Duroux, E. Scornet (2018) Impact of subsampling and tree depth on random forests, ESAIM: Probability and Statistics, Vol. 22, pp. 96-128.
- G. Biau, E. Scornet, J. Welbl (2018). Neural Random Forests, Sankhya A, pp. 1-40.
- J. Mourtada, S. Gaïffas, E. Scornet (2020). Minimax optimal rates for Mondrian trees and forests, The Annals of Statistics, 48(4), 2253-2276.
- M. Le Morvan, N. Prost, J. Josse, E. Scornet & G. Varoquaux (2020). Linear predictor on linearly-generated data with missing values: non consistency and solutions, AISTATS.
- M. Le Morvan, J. Josse, T. Moreau, E. Scornet, G. Varoquaux (2020). Neumann networks: differential programming for supervised learning with missing values, NeurIPS (oral communication).
- C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2021). SIRUS: Stable and Interpretable RUle Set for Classification, Electronic Journal of Statistics, Vol. 15, pp. 427-505.
- C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2021). Interpretable Random Forests via Rule Extraction, AISTATS.
- J. Mourtada, S. Gaïffas, E. Scornet (2021). AMF: Aggregated Mondrian Forests for Online Learning, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 83(3), 505-533.
- L. Arnould, C. Boyer, E. Scornet (2021). Analyzing the tree-layer structure of Deep Forests, ICML.
- M. Le Morvan, J. Josse, E. Scornet, G. Varoquaux (2021). What's a good imputation to predict with missing values?, NeurIPS.
- E. Scornet (2021). Trees, forests, and impurity-based variable importance, Annales de l'Institut Henri Poincaré.
- C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2022). SHAFF: Fast and consistent SHApley eFfect estimates via random Forests, AISTATS.
- C. Bénard, S. Da Veiga, E. Scornet (2022). MDA for random forests: inconsistency, and a practical solution via the Sobol-MDA, Biometrika.
- A. Ayme, C. Boyer, A. Dieuleveut, E. Scornet (2022). Near-optimal rate of consistency for linear models with missing values, ICML.
- B. Colnet, J. Josse, E. Scornet, G. Varoquaux (2022). Generalizing a causal effect: sensitivity analysis and missing covariates, accepted for publication in Journal of Causal Inference.
- L. Arnould, C. Boyer, E. Scornet (2023). Is interpolation benign for regression random forests?, AISTATS.
- P. Lutz, L. Arnould, C. Boyer, E. Scornet (2023). Sparse tree-based initialization for neural networks, ICLR.
- B. Iooss, R. Kenett, P. Secchi, B.M. Colosimo, F. Centofanti, C. Bénard, S. Da Veiga, E. Scornet, S. N. Wood, Y. Goude, M. Fasiolo (2022). Interpretability for Industry 4.0: Statistical and Machine Learning Approaches, Editors: A. Lepore, B. Palumbo, J.-M. Poggi, Springer.
- PhD thesis: Learning with random forests, defended on Monday, November 30, 2015.
- HDR manuscript: Random forests, interpretability, neural networks and missing values, defended on December 17, 2020.
- Vintage Neural Networks - Part 1 - Slides
- Vintage Neural Networks - Part 2 (same set of slides)
- Vintage Neural Networks 2.1
- Vintage Neural Networks 2.2
- Vintage Neural Networks 2.3
- Vintage Neural Networks 2.4
- Vintage Neural Networks 2.5
- Vintage Neural Networks 2.6
- Vintage Neural Networks 2.7
- Vintage Neural Networks 2.8
- Vintage Neural Networks 2.9
- Vintage Neural Networks 2.10
- Vintage Neural Networks 2.11
- Vintage Neural Networks 2.12
- Vintage Neural Networks 2.13
- Vintage Neural Networks 2.14
- Convolutional Neural Networks
- Convolutional Neural Networks 3.1
- Convolutional Neural Networks 3.2
- Convolutional Neural Networks 3.3
- Convolutional Neural Networks 3.4
- Convolutional Neural Networks 3.5
- Convolutional Neural Networks 3.6
- Convolutional Neural Networks 3.7
- Convolutional Neural Networks 3.8
- Convolutional Neural Networks 3.9
- Convolutional Neural Networks 3.10
- Convolutional Neural Networks 3.11
- Convolutional Neural Networks 3.12
- Convolutional Neural Networks 3.13
- Convolutional Neural Networks 3.14
- Convolutional Neural Networks 3.15
- Convolutional Neural Networks 3.16
- Convolutional Neural Networks 3.17
- Convolutional Neural Networks 3.18
- Recurrent Neural Networks
- Generative Modelling
- Word Embedding (under construction)
- Email: email@example.com (without hyphens).
- Office: 136, Turing Building, Route de Saclay, Palaiseau.
- Phone number: +33 1 77 57 80 80