Short Bio

Since September 2016, I have been an assistant professor at the Center for Applied Mathematics (CMAP) at Ecole Polytechnique, near Paris. My research focuses on theoretical statistics and Machine Learning, with a particular emphasis on nonparametric estimation. I wrote my PhD thesis on a Machine Learning algorithm called random forests, under the supervision of Gérard Biau (LSTA - Paris 6) and Jean-Philippe Vert (Institut Curie).
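
For readers unfamiliar with the method, here is a minimal sketch of fitting a random forest with scikit-learn. It is illustrative only (not taken from my papers); the synthetic data and all parameter values are assumptions made for the example.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.RandomState(0)
    X = rng.uniform(size=(500, 5))                 # 500 points in [0, 1]^5
    y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=500)  # noisy signal in the first coordinate

    # An ensemble of randomized trees, averaged to form the prediction
    forest = RandomForestRegressor(n_estimators=100, random_state=0)
    forest.fit(X, y)
    print(forest.predict(X[:5]))                   # predictions at the first five points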

Curriculum Vitae

Graduate Degree "Artificial Intelligence and Advanced Visual Computing"

    A new graduate degree on Artificial Intelligence opened in September 2018 at Ecole Polytechnique. The official training website is here, and additional information on the scientific content can be found here. A short summary is also available here (presentation of March 2019).

Awards and distinctions

Students

  1. Jaouad Mourtada (2016-2020)
    Ph.D. student co-supervised with Stéphane Gaïffas
  2. Nicolas Prost (2018-2019)
    Ph.D. student co-supervised with Julie Josse and Gaël Varoquaux
  3. Clément Bénard (2018-2021)
    Ph.D. student co-supervised with Gérard Biau and Sébastien Da Veiga
  4. Ludovic Arnould (2020-)
    Ph.D. student co-supervised with Claire Boyer
  5. Bénédicte Colnet (2020-)
    Ph.D. student co-supervised with Julie Josse and Gaël Varoquaux
  6. Alexis Ayme (2021-)
    Ph.D. student co-supervised with Claire Boyer and Aymeric Dieuleveut

Publications

Preprints

  1. J. Josse, N. Prost, E. Scornet, G. Varoquaux. On the consistency of supervised learning with missing values, 2019 (see the short sketch after this list).
  2. B. Colnet, J. Josse, E. Scornet, G. Varoquaux. Generalizing a causal effect: sensitivity analysis and missing covariates, 2021.
  3. L. Arnould, C. Boyer, E. Scornet. Is interpolation benign for random forests?, 2022.
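
The first preprint above studies when supervised learning remains consistent after imputing missing entries. As a rough, hedged illustration (my own toy sketch, not code from the paper; the data, missingness rate, and estimator choices are assumptions for the example), here is an impute-then-predict pipeline in scikit-learn:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.impute import SimpleImputer
    from sklearn.pipeline import make_pipeline

    rng = np.random.RandomState(0)
    X = rng.normal(size=(500, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=500)
    X[rng.uniform(size=X.shape) < 0.2] = np.nan    # make 20% of entries missing completely at random

    # Mean imputation followed by a forest: one impute-then-predict strategy
    # of the kind whose consistency is analyzed in the preprint
    model = make_pipeline(SimpleImputer(strategy="mean"),
                          RandomForestRegressor(n_estimators=100, random_state=0))
    model.fit(X, y)
    print(model.score(X, y))                       # in-sample R^2, just to show the pipeline runs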

Accepted/Published papers

  1. Scornet, E., Biau, G. and Vert, J.-P. (2015). Consistency of random forests, The Annals of Statistics, Vol. 43, pp. 1716-1741 (Supplementary materials).
  2. Scornet, E. (2016). On the asymptotics of random forests, Journal of Multivariate Analysis, Vol. 146, pp. 72-83.
  3. Scornet, E. (2016). Random forests and kernel methods, IEEE Transactions on Information Theory, Vol. 62, pp. 1485-1500.
  4. Biau, G., Scornet, E. (2016). A Random Forest Guided Tour, TEST, Vol. 25, pp. 197-227. (Discussion).
  5. Scornet, E. (2016). Promenade en forêts aléatoires, MATAPLI, Vol. 111.
  6. E. Bernard, Y. Jiao, E. Scornet, V. Stoven, T. Walter and J.-P. Vert (2017) Kernel multitask regression for toxicogenetics, Molecular Informatics, Vol. 36.
  7. J. Mourtada, S. Gaïffas, E. Scornet (2017) Universal consistency and minimax rates for online Mondrian Forest, NIPS 2017 (Supplementary materials).
  8. Scornet, E. (2017). Tuning parameters in random forests, ESAIM: Proceedings and Surveys, Vol. 60, pp. 144-162.
  9. R. Duroux, E. Scornet (2018) Impact of subsampling and tree depth on random forests, ESAIM: Probability and Statistics, Vol. 22, pp. 96-128.
  10. G. Biau, E. Scornet, J. Welbl (2018) Neural Random Forests, Sankhya A, pp. 1-40.
  11. J. Mourtada, S. Gaïffas, E. Scornet (2020) Minimax optimal rates for Mondrian trees and forests, The Annals of Statistics, 48(4), 2253-2276.
  12. M. Le Morvan, N. Prost, J. Josse, E. Scornet, G. Varoquaux (2020) Linear predictor on linearly-generated data with missing values: non consistency and solutions, AISTATS.
  13. M. Le Morvan, J. Josse, T. Moreau, E. Scornet, G. Varoquaux (2020) Neumann networks: differential programming for supervised learning with missing values, NeurIPS (oral communication).
  14. C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2021) SIRUS: Stable and Interpretable RUle Set for Classification, Electronic Journal of Statistics, Vol. 15, pp. 427-505.
  15. C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2021). Interpretable Random Forests via Rule Extraction, AISTATS.
  16. J. Mourtada, S. Gaïffas, E. Scornet (2021). AMF: Aggregated Mondrian Forests for Online Learning, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 83(3), 505-533.
  17. L. Arnould, C. Boyer, E. Scornet (2021). Analyzing the tree-layer structure of Deep Forests, ICML.
  18. M. Le Morvan, J. Josse, E. Scornet, G. Varoquaux (2021). What's a good imputation to predict with missing values?, NeurIPS.
  19. E. Scornet (2021). Trees, forests, and impurity-based variable importance, accepted for publication in Annales de l’Institut Henri Poincaré (see the short illustration after this list).
  20. C. Bénard, G. Biau, S. Da Veiga, E. Scornet (2022). SHAFF: Fast and consistent SHApley eFfect estimates via random Forests, AISTATS.
  21. C. Bénard, S. Da Veiga, E. Scornet (2022). MDA for random forests: inconsistency, and a practical solution via the Sobol-MDA, accepted for publication in Biometrika.
  22. A. Ayme, C. Boyer, A. Dieuleveut, E. Scornet (2022). Near-optimal rate of consistency for linear models with missing values, ICML.
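
Publication 19 above concerns impurity-based (Mean Decrease in Impurity) variable importance. As a hedged illustration of the quantity being analyzed (my own toy sketch, not code from the paper; the data-generating process is an assumption), scikit-learn exposes MDI importances through feature_importances_:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.RandomState(0)
    X = rng.normal(size=(1000, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # only the first two features are informative

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(clf.feature_importances_)                # MDI importances, one value per feature

On this toy problem, the first two importances should dominate, since only the first two features enter the response.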

Academic publications

  1. PhD thesis Learning with random forests, defended on Monday, November 30, 2015.
  2. HDR manuscript Random forests, interpretability, neural networks and missing values, defended on December 17, 2020.

Teaching

Deep Learning course (slides + videos)
  1. Vintage Neural Networks - Part 1 - Slides
    1. Vintage Neural Networks 1.1
    2. Vintage Neural Networks 1.2
    3. Vintage Neural Networks 1.3
    4. Vintage Neural Networks 1.4
    5. Vintage Neural Networks 1.5
    6. Vintage Neural Networks 1.6
    7. Vintage Neural Networks 1.7
    8. Vintage Neural Networks 1.8
    9. Vintage Neural Networks 1.9
  2. Vintage Neural Networks - Part 2 (same set of slides)
    1. Vintage Neural Networks 2.1
    2. Vintage Neural Networks 2.2
    3. Vintage Neural Networks 2.3
    4. Vintage Neural Networks 2.4
    5. Vintage Neural Networks 2.5
    6. Vintage Neural Networks 2.6
    7. Vintage Neural Networks 2.7
    8. Vintage Neural Networks 2.8
    9. Vintage Neural Networks 2.9
    10. Vintage Neural Networks 2.10
    11. Vintage Neural Networks 2.11
    12. Vintage Neural Networks 2.12
    13. Vintage Neural Networks 2.13
    14. Vintage Neural Networks 2.14
  3. Optimization
  4. Convolutional Neural Networks (see the PyTorch sketch after this list)
    1. Convolutional Neural Networks 3.1
    2. Convolutional Neural Networks 3.2
    3. Convolutional Neural Networks 3.3
    4. Convolutional Neural Networks 3.4
    5. Convolutional Neural Networks 3.5
    6. Convolutional Neural Networks 3.6
    7. Convolutional Neural Networks 3.7
    8. Convolutional Neural Networks 3.8
    9. Convolutional Neural Networks 3.9
    10. Convolutional Neural Networks 3.10
    11. Convolutional Neural Networks 3.11
    12. Convolutional Neural Networks 3.12
    13. Convolutional Neural Networks 3.13
    14. Convolutional Neural Networks 3.14
    15. Convolutional Neural Networks 3.15
    16. Convolutional Neural Networks 3.16
    17. Convolutional Neural Networks 3.17
    18. Convolutional Neural Networks 3.18
  5. Recurrent Neural Networks
  6. Generative Modelling
  7. Word Embedding (under construction)
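
As a hedged companion to the Convolutional Neural Networks chapter (a toy sketch of mine, not course material; the architecture, layer sizes, and input shape are arbitrary assumptions), here is a small convolutional network in PyTorch:

    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        """A toy convolutional network for 28x28 grayscale images."""
        def __init__(self, n_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                   # 28x28 -> 14x14
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                   # 14x14 -> 7x7
            )
            self.classifier = nn.Linear(32 * 7 * 7, n_classes)

        def forward(self, x):
            # Convolutional feature maps, flattened, then a linear classifier
            return self.classifier(self.features(x).flatten(1))

    net = SmallCNN()
    print(net(torch.randn(4, 1, 28, 28)).shape)    # torch.Size([4, 10])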

Talks

  1. Random Forests
  2. General overview of AI
  3. AI for health
  4. Trees, forests, and impurity-based variable importance

Contact
  1. Email: firstname.lastname@po-ly-tech-ni-que.edu (without the hyphens).
  2. Office: 136, Turing Building, Route de Saclay, Palaiseau.
  3. Phone number: +33 1 77 57 80 80