Short Bio

Since September 2023, I have been a professor (lecturer) at LPSM and SCAI at Sorbonne Université (previously known as Paris 6, in the center of Paris). Before that, I was an assistant professor at the Center for Applied Mathematics (CMAP) of École Polytechnique, near Paris. My research interests focus on theoretical statistics and Machine Learning, with a particular emphasis on nonparametric estimation. I did my PhD thesis on a particular Machine Learning algorithm, random forests, under the supervision of Gérard Biau (LSTA, Paris 6) and Jean-Philippe Vert (Institut Curie).

Keywords: statistical learning, nonparametric estimation, random forests, decision trees, variable importance, missing data, neural networks, causal inference.

Curriculum Vitae

Google Scholar

Awards and distinctions

Students

  1. Jaouad Mourtada (2016-2020)
    Ph.D. student co-supervised with Stéphane Gaïffas
  2. Nicolas Prost (2018-2019)
    Ph.D. student co-supervised with Julie Josse and Gaël Varoquaux
  3. Clément Bénard (2018-2021)
    Ph.D. student co-supervised with Gérard Biau and Sébastien Da Veiga
  4. Ludovic Arnould (2020-2023)
    Ph.D. student co-supervised with Claire Boyer
  5. Bénédicte Colnet (2020-2023)
    Ph.D. student co-supervised with Julie Josse and Gaël Varoquaux
  6. Alexis Ayme (2021-2024)
    Ph.D. student co-supervised with Claire Boyer and Aymeric Dieuleveut
  7. Abdoulaye Sakho (2023-)
    Ph.D. student (CIFRE at Artefact) co-supervised with Emmanuel Malherbe
  8. Ahmed Boughdiri (2023-)
    Ph.D. student co-supervised with Julie Josse

Publications

Preprints

  1. B. Colnet, J. Josse, G. Varoquaux, E. Scornet. Risk ratio, odds ratio, risk difference... Which causal measure is easier to generalize?, 2023.
  2. A. Sakho, E. Malherbe, E. Scornet. Do we need rebalancing strategies? A theoretical and empirical study around SMOTE and its variants, 2024.
  3. A.D. Reyero Lobo, A. Ayme, C. Boyer, E. Scornet. A primer on linear classification with missing data, 2024.
  4. A. Boughdiri, J. Josse, E. Scornet. Quantifying Treatment Effects: Estimating Risk Ratios via Observational Studies, 2024.
  5. J. Näf, J. Josse, E. Scornet. What Is a Good Imputation Under MAR Missingness?, 2024.

Accepted/Published papers

  1. Scornet, E., Biau, G. and Vert, J.-P. (2015). Consistency of random forests, The Annals of Statistics, Vol. 43, pp. 1716-1741 (Supplementary materials).
  2. Scornet, E. (2016). On the asymptotics of random forests, Journal of Multivariate Analysis, Vol. 146, pp. 72-83.
  3. Scornet, E. (2016). Random forests and kernel methods, IEEE Transactions on Information Theory, Vol. 62, pp. 1485-1500.
  4. Biau, G., Scornet, E. (2016). A Random Forest Guided Tour, TEST, Vol. 25, pp. 197-227 (Discussion).
  5. Scornet, E. (2016). Promenade en forêts aléatoires [A walk in random forests], MATAPLI, Vol. 111.
  6. E. Bernard, Y. Jiao, E. Scornet, V. Stoven, T. Walter and J.-P. Vert (2017) Kernel multitask regression for toxicogenetics, Molecular Informatics, Vol. 36.
  7. J. Mourtada, S. Gaïffas, E. Scornet (2017). Universal consistency and minimax rates for online Mondrian Forest, NIPS 2017 (Supplementary materials).
  8. Scornet, E. (2017). Tuning parameters in random forests, ESAIM Procs, Vol. 60 pp. 144-162.
  9. R. Duroux, E. Scornet (2018) Impact of subsampling and tree depth on random forests, ESAIM: Probability and Statistics, Vol. 22, pp. 96-128.
  10. G. Biau, E. Scornet, J. Welbl (2018). Neural Random Forests, Sankhya A, pp. 1-40.
  11. J. Mourtada, S. Gaïffas, E. Scornet (2020). Minimax optimal rates for Mondrian trees and forests, The Annals of Statistics, 48(4), 2253-2276.
  12. M. Le Morvan, N. Prost, J. Josse, E. Scornet & G. Varoquaux (2020). Linear predictor on linearly-generated data with missing values: non consistency and solutions, AISTATS.
  13. M. Le Morvan, J. Josse, T. Moreau, E. Scornet, G. Varoquaux (2020). NeuMiss networks: differentiable programming for supervised learning with missing values, NeurIPS (oral communication).
  14. C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2021). SIRUS: Stable and Interpretable RUle Set for Classification, Electronic Journal of Statistics, Vol. 15, pp. 427-505.
  15. C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2021). Interpretable Random Forests via Rule Extraction, AISTATS.
  16. J. Mourtada, S. Gaïffas, E. Scornet (2021). AMF: Aggregated Mondrian Forests for Online Learning, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 83(3), 505-533.
  17. L. Arnould, C. Boyer, E. Scornet (2021). Analyzing the tree-layer structure of Deep Forests, ICML.
  18. M. Le Morvan, J. Josse, E. Scornet, G. Varoquaux (2021). What's a good imputation to predict with missing values?, NeurIPS.
  19. E. Scornet (2021). Trees, forests, and impurity-based variable importance, Annales de l’Institut Henri Poincaré.
  20. C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2022). SHAFF: Fast and consistent SHApley eFfect estimates via random Forests, AISTATS.
  21. C. Bénard, S. Da Veiga, E. Scornet (2022). MDA for random forests: inconsistency, and a practical solution via the Sobol-MDA, Biometrika.
  22. A. Ayme, C. Boyer, A. Dieuleveut, E. Scornet (2022). Near-optimal rate of consistency for linear models with missing values, ICML.
  23. B. Colnet, J. Josse, E. Scornet, G. Varoquaux (2022). Generalizing a causal effect: sensitivity analysis and missing covariates, accepted for publication in Journal of Causal Inference.
  24. L. Arnould, C. Boyer, E. Scornet (2023). Is interpolation benign for regression random forests?, AISTATS.
  25. P. Lutz, L. Arnould, C. Boyer, E. Scornet (2023). Sparse tree-based initialization for neural networks, ICLR.
  26. A. Ayme, C. Boyer, A. Dieuleveut, E. Scornet (2023). Naive imputation implicitly regularizes high-dimensional linear models, ICML.
  27. J. Josse, J.M. Chen, N. Prost, E. Scornet, G. Varoquaux (2024, first submission in 2019). On the consistency of supervised learning with missing values, accepted for publication in Statistical Papers.
  28. B. Colnet, J. Josse, G. Varoquaux, E. Scornet (2024). Reweighting the RCT for generalization: finite sample analysis and variable selection, accepted for publication in JRSS-A.
  29. A. Ayme, C. Boyer, A. Dieuleveut, E. Scornet (2024). Random features models: a way to study the success of naive imputation, ICML.

Book

  1. B. Iooss, R. Kenett, P. Secchi, B.M. Colosimo, F. Centofanti, C. Bénard, S. Da Veiga, E. Scornet, S.N. Wood, Y. Goude, M. Fasiolo (2022). Interpretability for Industry 4.0: Statistical and Machine Learning Approaches (Editors: A. Lepore, B. Palumbo, J.-M. Poggi), Springer.

Academic publications

  1. PhD thesis: Learning with random forests, defended on November 30, 2015.
  2. HDR manuscript: Random forests, interpretability, neural networks and missing values, defended on December 17, 2020.

Teaching

  1. Decision Trees: Slides and lectures in English or in French
  2. Random Forests and Tree Boosting: Slides and lectures in English or in French
  3. Introduction to neural networks: Slides and lectures in English or in French
  4. Hyperparameter tuning in neural networks: Slides and lectures in English or in French
  5. Convolutional Neural Networks: Slides and lectures in English
  6. Applications of Convolutional Neural Networks: Slides and lectures in English
  7. Recurrent Neural Networks: Slides and lectures in English
  8. A detour through unsupervised learning: Slides and lectures in English
  9. Generative Models: Slides and lectures in English
  10. Word Embedding: Some slides (under construction)

Statistics and Video games - Stone's theorem

    You need to download the corresponding game package (Mac or PC) and launch the executable (you may need to download Ren'Py).
  1. Stone's theorem (statement); a sketch of the theorem follows this list
  2. Game package for Mac
  3. Game package for PC
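
    For reference, here is a minimal LaTeX sketch of Stone's theorem in its classical simplified form, with nonnegative local-averaging weights that sum to one (following Györfi, Kohler, Krzyżak and Walk, A Distribution-Free Theory of Nonparametric Regression); the exact formulation stated in the game may differ.

    \documentclass{article}
    \usepackage{amsmath,amssymb,amsthm}
    \newtheorem{theorem}{Theorem}
    \begin{document}
    % Setting: an i.i.d. sample (X_1, Y_1), ..., (X_n, Y_n) and a local-averaging
    % estimate m_n(x) = sum_i W_{n,i}(x) Y_i of the regression function
    % m(x) = E[Y | X = x].
    \begin{theorem}[Stone, 1977; simplified form]
    Let the weights $W_{n,i}(x) = W_{n,i}(x; X_1, \dots, X_n) \geq 0$ satisfy
    $\sum_{i=1}^{n} W_{n,i}(x) = 1$, and assume that:
    (i) there exists $C > 0$ such that, for every nonnegative measurable $f$
    with $\mathbb{E}[f(X)] < \infty$,
    $\mathbb{E}\left[ \sum_{i=1}^{n} W_{n,i}(X) f(X_i) \right] \leq C \, \mathbb{E}[f(X)]$;
    (ii) for all $a > 0$,
    $\sum_{i=1}^{n} W_{n,i}(X) \mathbf{1}_{\|X_i - X\| > a} \to 0$ in probability;
    (iii) $\max_{1 \leq i \leq n} W_{n,i}(X) \to 0$ in probability.
    Then $m_n$ is universally consistent: for every distribution of $(X, Y)$
    with $\mathbb{E}[Y^2] < \infty$,
    $\mathbb{E}\left[ (m_n(X) - m(X))^2 \right] \to 0$.
    \end{theorem}
    \end{document}

    Standard examples satisfying (i)-(iii) include k-nearest-neighbor weights (with k → ∞ and k/n → 0) and kernel weights (with bandwidth h_n → 0 and n h_n^d → ∞).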

Talks

  1. Random Forests
  2. General overview of AI
  3. AI for health
  4. Trees, forests, and impurity-based variable importance
  5. Is interpolation benign for random forest regression? (Paper here)
  6. Pour une botanomancie rigoureuse: lire l'importance dans les feuilles des forêts (aléatoires) et en extraire des préceptes élémentaires [Towards a rigorous botanomancy: reading importance in the leaves of (random) forests and extracting elementary precepts from them], StatLearn23.

Contact
  1. Email: prenom.nom@po-ly-tech-ni-que.edu (firstname.lastname, without the hyphens).
  2. Office 214, Tour 15-25, Jussieu Campus.