Sébastien Rouault

$ whoami

About my work...

I currently research and develop applied security solutions for machine learning.

I defended my PhD thesis at EPFL, in the Distributed Computing Laboratory (DCL), where I studied robust, distributed machine learning algorithms and privacy mechanisms. In particular, my academic work emphasizes developing and testing practical algorithms that can be used to improve actual ML systems.

My scientific background spans from mathematics and statistics to the design principles of computer systems. My interest in understanding (and later building) automated systems goes back to my childhood, and has only kept growing through the study of mathematics, the foundations of software, software system engineering and hands-on networked system administration.

...from an academic angle

I have co-published conference papers at the following venues¹:

  1. Collaborative Learning in the Jungle (Decentralized, Byzantine, Heterogeneous, Asynchronous and Nonconvex Learning)
    El-Mahdi El-Mhamdi, Sadegh Farhadkhani, Rachid Guerraoui, Arsany Guirguis, Lê Nguyên Hoang, Sébastien Rouault
    NeurIPS 2021 — 35th Conference on Neural Information Processing Systems
    Virtual only, December 7–10, 2021
  2. Differential Privacy and Byzantine Resilience in SGD: Do They Add Up?
    Rachid Guerraoui, Nirupam Gupta, Rafaël Pinot, Sébastien Rouault, John Stephan
    PODC 2021 — ACM 40th Symposium on Principles of Distributed Computing
    Salerno, Italy, July 26–30, 2021
  3. Garfield: System Support for Byzantine Machine Learning
    Rachid Guerraoui, Arsany Guirguis, Jérémy Max Plassmann, Anton Alexandre Ragot, Sébastien Rouault
    DSN 2021 — 51st IEEE/IFIP International Conference on Dependable Systems and Networks
    Taipei, Taiwan, June 21–24, 2021
  4. Distributed Momentum for Byzantine-resilient Stochastic Gradient Descent
    El-Mahdi El-Mhamdi, Rachid Guerraoui, Sébastien Rouault
    ICLR 2021 — 9th International Conference on Learning Representations
    Vienna, Austria, May 4–8, 2021
  5. Aksel: Fast Byzantine SGD
    Amine Boussetta, El-Mahdi El-Mhamdi, Rachid Guerraoui, Alexandre Maurer, Sébastien Rouault
    OPODIS 2020 — 24th International Conference on Principles of Distributed Systems
    Strasbourg, France, December 14–16, 2020
    Best Student Paper award
  6. Fast and Robust Distributed Learning in High Dimension
    El-Mahdi El-Mhamdi, Rachid Guerraoui, Sébastien Rouault
    SRDS 2020 — IEEE 39th International Symposium on Reliable Distributed Systems
    Shanghai, China, September 21–24, 2020
  7. Genuinely Distributed Byzantine Machine Learning
    El-Mahdi El-Mhamdi, Rachid Guerraoui, Arsany Guirguis, Lê Nguyên Hoang, Sébastien Rouault
    PODC 2020 — ACM 39th Symposium on Principles of Distributed Computing
    Salerno, Italy, August 3–7, 2020
  8. AggregaThor: Byzantine Machine Learning via Robust Gradient Aggregation
    Georgios Damaskinos, El-Mahdi El-Mhamdi, Rachid Guerraoui, Arsany Guirguis, Sébastien Rouault
    MLSys 2019 — 1st Conference on Machine Learning and Systems
    Palo Alto, CA, USA, March 31–April 2, 2019
  9. The Hidden Vulnerability of Distributed Learning in Byzantium
    El-Mahdi El-Mhamdi, Rachid Guerraoui, Sébastien Rouault
    ICML 2018 — 35th International Conference on Machine Learning
    Stockholm, Sweden, July 10–15, 2018
    Accepted with a "Long Talk"
  10. On The Robustness of a Neural Network
    El-Mahdi El-Mhamdi, Rachid Guerraoui, Sébastien Rouault
    SRDS 2017 — IEEE 36th Symposium on Reliable Distributed Systems
    Hong Kong, China, September 26–29, 2017

¹ Please note that, in the DCL, author names are always written in alphabetical order.


I gave the following talks:

  • Distributed Momentum for Byzantine-resilient Stochastic Gradient Descent
    ICLR 2021 — 9th International Conference on Learning Representations
  • Genuinely Distributed Byzantine Machine Learning
    PODC 2020 — ACM 39th Symposium on Principles of Distributed Computing
    I presented the first half (background on ML and the asynchronous algorithm)
  • Byzantine Resilient Machine Learning: Algorithms to cure poisoned SGD
    EcoCloud 2019 — A Center for Sustainable Cloud Computing
    Invited talk

I have (co-)implemented the following research software:

  • Distributed Momentum for Byzantine-resilient Stochastic Gradient Descent
    ICLR 2021 — 9th International Conference on Learning Representations
    Authored everything
    (Substantial) parts were also reused in:
    • Differential Privacy and Byzantine Resilience in SGD: Do They Add Up?
    • Garfield: System Support for Byzantine Machine Learning
    • Aksel: Fast Byzantine SGD (best student paper award)
    • Fast and Robust Distributed Learning in High Dimension
    • Genuinely Distributed Byzantine Machine Learning
  • Garfield: System Support for Byzantine Machine Learning
    DSN 2021 — 51st IEEE/IFIP International Conference on Dependable Systems and Networks
    Authored the parallelized CPU and CUDA (GPU) implementations of the Byzantine-resilient gradient aggregation rules (GARs) for both PyTorch and TensorFlow (see the sketch after this list)
    In addition, this paper reuses code I wrote for:
    • Distributed Momentum for Byzantine-resilient Stochastic Gradient Descent
    • AggregaThor: Byzantine Machine Learning via Robust Gradient Aggregation
  • AggregaThor: Byzantine Machine Learning via Robust Gradient Aggregation
    MLSys 2019 — 1st Conference on Machine Learning and Systems
    Authored everything, except the MPI- and UDP-related code additions and the optional TensorFlow patches
    (Substantial) parts were also reused in:
    • Garfield: System Support for Byzantine Machine Learning
    • Genuinely Distributed Byzantine Machine Learning
  • On The Robustness of a Neural Network
    SRDS 2017 — IEEE 36th Symposium on Reliable Distributed Systems
    Authored everything
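
To give an idea of what a Byzantine-resilient GAR computes: a plain average of the workers' gradients can be arbitrarily corrupted by a single malicious worker, so a robust rule aggregates the gradients in a way that a minority of poisoned inputs cannot skew. Below is a minimal Python sketch of one classic rule from the literature, the coordinate-wise median; it is only an illustrative example, not the specific implementations shipped with the papers above:

    import numpy as np

    def coordinate_wise_median(gradients):
        """A classic Byzantine-resilient GAR: aggregate the workers' gradients
        (shape: n_workers x dim) by taking the median of each coordinate,
        instead of the (easily poisoned) arithmetic mean."""
        return np.median(gradients, axis=0)

    rng = np.random.default_rng(0)
    honest = rng.normal(loc=1.0, scale=0.1, size=(7, 4))  # 7 honest workers
    byzantine = np.full((2, 4), 1e6)                      # 2 poisoned gradients
    gradients = np.concatenate([honest, byzantine])

    print(np.mean(gradients, axis=0))         # the mean is destroyed (~2.2e5)
    print(coordinate_wise_median(gradients))  # the median stays close to 1.0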


I started my PhD in September 2017. In September 2021, I had the privilege to defend my thesis before my committee.

Before the PhD, I completed two MSc degrees.

As a side note, I remember that, as an MSc student, there were two kinds of course projects: those with an automated submission system, and those without any. As part of my teaching assistant (TA) duties, I decided to develop a submission system for the EPFL MSc course Concurrent Algorithms. In this project, students write their own Software Transactional Memory (STM) library and test it against a common workload on a shared high-performance, multi-core server. The submission system manages the sequential testing of the student implementations on this hardware, along with many good-to/must-have features: tight sandboxing via firejail, extensive logging, submission rate limiting, automated credential broadcast to students via e-mail, and letting students overwrite and download their own/best submissions. The satisfaction of receiving (very) positive student feedback while substantially "optimizing" my TA workload 😉 seems enough to justify the place of this work in this section.
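
For flavor, here is a minimal sketch of the sandboxed, sequential testing step, in Python; the directory layout, the run.sh entry point and the exact firejail options are assumptions for illustration, not the actual system:

    import subprocess
    from pathlib import Path

    def test_submission(submission_dir: Path, timeout_s: int = 300) -> str:
        """Build and run one student's submission inside a firejail sandbox.
        Assumed layout: each submission directory contains a run.sh that
        builds the STM library and runs the common workload."""
        command = [
            "firejail", "--quiet",
            "--net=none",                        # no network inside the sandbox
            "--private=" + str(submission_dir),  # the jail only sees these files
            "sh", "run.sh",
        ]
        try:
            result = subprocess.run(command, capture_output=True, text=True,
                                    timeout=timeout_s)
        except subprocess.TimeoutExpired:
            return "timeout"
        return result.stdout  # e.g. the measured throughput of the STM library

    # Submissions are tested one at a time, so they never compete for the
    # multi-core server and the performance measurements stay comparable.
    for submission in sorted(Path("submissions").iterdir()):
        print(submission.name, test_submission(submission))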

...from a personal angle

I have been officially involved in the following student associations:

  • EPIC
    The graduate student association of the faculty. It organizes mountain outings, board game nights, informative panels and talks, and more. I was the treasurer for the year 2019; to be fair, it was nothing overwhelming 🙂. We, the six members of the association, won the EPFL IC faculty's Distinguished Service Award for our steadfast help and involvement during the EDIC Open House 2019.
  • Supélec Rézo Rennes
    I was a member of the association during both of my years at Supélec, and was elected president for my second academic year, from April 2014 to March 2015. The association acts as an Internet and service provider for the students on the campus (5 residential buildings + 1 "social" building), and manages its own hardware and software. This was a great experience, and the knowledge I acquired there still serves me today. During my mandate, the association renewed its stock of aging (even failing) switches, added TV over multicast, and deployed a few other services. In particular, we worked on a quality-of-service problem: the upstream bandwidth provided by our (many) ADSL modems was not always shared evenly between our many users, sometimes leading to increased latencies for everyone. My part of the project was to design the algorithms and write the kernel module ensuring fair upstream sharing, while my teammate worked on its integration with the existing software architecture (a sketch of the underlying idea follows this list).
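
To give the intuition behind fair upstream sharing: a classic family of algorithms for this problem is deficit round-robin scheduling, where each user's queue receives the same byte budget per round, so a heavy uploader cannot starve the others. Here is a minimal Python sketch of the idea (an illustration only, not the actual kernel module):

    from collections import deque

    def deficit_round_robin(queues, quantum=1500):
        """Serve per-user packet queues fairly: each user receives the same
        byte budget (quantum) per round, and unused budget carries over, so
        a heavy uploader cannot starve the lighter ones.
        queues maps a user id to a deque of packet sizes, in bytes."""
        deficits = {user: 0 for user in queues}
        while any(queues.values()):
            for user, queue in queues.items():
                if not queue:
                    deficits[user] = 0  # idle users do not hoard credit
                    continue
                deficits[user] += quantum
                while queue and queue[0] <= deficits[user]:
                    size = queue.popleft()
                    deficits[user] -= size
                    yield user, size  # "send" this packet upstream

    # One heavy uploader and one light user: both get served every round.
    queues = {"heavy": deque([1500] * 4), "light": deque([200] * 4)}
    for user, size in deficit_round_robin(queues):
        print(user, size)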

...and the work of others

Do you know the metaphor of "standing on the shoulders of giants"? That is what I do all day long.

As for my PhD work, its (upper-layer) foundations were largely laid by the authors of Machine Learning with Adversaries: Byzantine Tolerant Gradient Descent. One of these authors, El Mahdi El Mhamdi, advised me on all the scientific aspects during my junior years, under the supervision of Rachid Guerraoui, who gave us carte blanche on this then-new line of research for his unit.

As for my personal work, well, there are definitely too many factors that contributed to what I have done so far, and to what I can do today. Assuredly, growing up in a "fully developed" country, sheltered from all vital needs, with parents who introduced me to Mr. Computer very early (age 4 for the keyboard, age 8 for coding), has been a tremendous stroke of luck. The stellar resources out there are likely another factor, so perhaps I can take advantage of this section to highlight some pieces of work that (I think) are really worth a glance, to learn about specific subjects:

  • Preshing on Programming
    I find the blog entries on C11/C++11 concurrent programming limpid, which really is a feat for such a technical subject. And there is more to this blog than concurrent programming.
  • Ralf's Ramblings
    The author is an expert (PhD!) on Rust, and one of the main contributors to the project. I find that his posts on the language tackle specification and internal details that can be critical to know in order to write unsafe Rust properly.
  • Git Internals
    Just in case you have missed it: if you generally prefer to understand things in detail first, so as to reason about and use them better, this part of the git documentation is excellent.
  • Distill
    From the about page: "Machine Learning Research Should Be Clear, Dynamic and Vivid. Distill Is Here to Help."
  • WordReference.com
    For word/expression translation, this is my go-to resource (I use it mostly for EN/FR).

This list is by no means exhaustive; I add entries as I think of them, in no particular order.