Gaspard Beugnot

3rd year PhD student in Machine Learning and Optimization.


Welcome to my personal page. Learn more about my research interests and my latest news here!

I’m currently a third-year PhD student at ENS and Inria, under the supervision of Julien Mairal and Alessandro Rudi. My research focuses on the theoretical properties of kernel methods in all their flavours, but I’m also interested in providing fast and reliable implementations of my research projects.

Currently, I’m interested in non-convex optimization, and I’m enthusiastic about Elixir, Julia, and various flavors of functional programming languages.

Before that, I graduated from Ecole Polytechnique (X2016) and earned a master’s degree in Mathematics, Vision and Machine Learning (Master MVA) from École Normale Supérieure in 2020.


Sep 28, 2023 Our GloptiNets paper was awarded a spotlight presentation at NeurIPS 2023 🎉 See you there!
Jul 3, 2023 Check out my latest preprint on GloptiNets, an algorithm that provides certificates for non-convex optimization problems!
Jun 23, 2023 I stumbled upon Google’s Foobar challenge. I doubt it’ll actually get me an interview, but it’s definitely a sneaky way to trap you into spending a ton of time on some tricky brain teasers.
Apr 1, 2023 I’m working on polynomial optimization (with certificates!) – if you have real-life problems involving it, reach out to me!
May 20, 2022 Our paper on the influence of the learning rate on generalization was accepted at COLT 2022! See you there!
Apr 15, 2022 The learning rate impacts generalization when training neural networks. Did you know this can occur in convex settings too? Check out our latest preprint!
Sep 28, 2021 My paper on the proximal point method for self-concordant loss functions was awarded a Spotlight at NeurIPS 2021!

latest posts

Mar 30, 2023 From Python to Julia

selected publications

  1. NeurIPS23 Spotlight
    GloptiNets: Scalable Non-Convex Optimization with Certificates
    Gaspard Beugnot, Julien Mairal, and Alessandro Rudi
  2. NeurIPS21 Spotlight
    Beyond Tikhonov: faster learning with self-concordant losses, via iterative regularization
    Gaspard Beugnot, Julien Mairal, and Alessandro Rudi
    In Advances in Neural Information Processing Systems, 2021
  3. COLT22
    On the Benefits of Large Learning Rates for Kernel Methods
    Gaspard Beugnot, Julien Mairal, and Alessandro Rudi