Welcome to my personal page. Learn more about my research interests and my latest news here!
I’m currently a third-year PhD student at ENS and Inria, under the supervision of Julien Mairal and Alessandro Rudi. My research focuses on the theoretical properties of kernel methods in all their flavours, and I’m also interested in providing fast and reliable implementations of my research projects.
| Date | News |
| --- | --- |
| Jul 3, 2023 | Check out my latest preprint on GloptiNets, an algorithm that provides certificates for non-convex optimization problems! |
| Jun 23, 2023 | I stumbled upon Google’s Foobar challenge. I doubt it will actually get me an interview, but it’s definitely a sneaky way to trap you into spending a ton of time on some tricky brain teasers. |
| Apr 1, 2023 | I’m working on polynomial optimization (with certificates!). If you have real-world problems of that kind, reach out to me! |
| May 20, 2022 | Our paper on the influence of the learning rate on generalization was accepted at COLT 2022! See you there! |
| Apr 15, 2022 | The learning rate impacts generalization when training neural networks. Did you know this can happen in convex settings too? Check out our latest preprint! |
| Sep 28, 2021 | My paper on the proximal point method for self-concordant loss functions was awarded a Spotlight at NeurIPS 2021! |
| Mar 30, 2023 | From Python to Julia |
- GloptiNets: Scalable Non-Convex Optimization with Certificates. arXiv preprint, 2023.
- Beyond Tikhonov: faster learning with self-concordant losses, via iterative regularization. In Advances in Neural Information Processing Systems (NeurIPS), 2021. Spotlight.
- On the Benefits of Large Learning Rates for Kernel Methods. In Conference on Learning Theory (COLT), 2022.