Meetup 1 -- Guillaume Dalle & François Pacaud

Details
First meeting of the Julia in Paris meetup -- come and meet users of the Julia programming language, whether you are simply curious, a novice, or an expert!
The meetup will take place in the Denisse room at the Observatoire de Paris (entrance at 77, avenue Denfert-Rochereau); admission is free of charge, but reservation is mandatory at least 72h in advance for security reasons.
We will be welcoming Guillaume Dalle and François Pacaud; you can find their talk abstracts below. The presentations will be in French or English depending on the audience, and they will be followed by an informal social gathering.
------------------------
Guillaume Dalle -- Automatic differentiation, a tale of two languages
It's 2025 -- are you still computing gradients by hand? Then let's free up some of your time for more rewarding hobbies! Thanks to automatic differentiation, you can build awesome machine learning models and let the computer worry about derivatives. Well... mostly.
Not every piece of code is amenable to autodiff. Writing differentiable programs is a non-trivial task, which can only be mastered by knowing what the computer does under the hood. It is also essential to understand the features and limitations of each machine learning stack.
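As a small appetizer (not taken from the talk itself), here is a minimal sketch of the kind of pitfall involved, using ForwardDiff.jl and Zygote.jl as two illustrative Julia backends: a function that mutates an array runs fine on its own and under forward mode, but Zygote's reverse mode cannot trace the mutation.

    using ForwardDiff  # forward-mode autodiff via dual numbers
    using Zygote       # reverse-mode autodiff

    # In-place mutation: perfectly fine for ordinary execution...
    function sum_of_squares!(buffer, x)
        for i in eachindex(x)
            buffer[i] = x[i]^2   # mutates `buffer`
        end
        return sum(buffer)
    end

    x = [1.0, 2.0, 3.0]

    # ...and fine for ForwardDiff, which merely overloads arithmetic on dual numbers:
    ForwardDiff.gradient(v -> sum_of_squares!(similar(v), v), x)   # [2.0, 4.0, 6.0]

    # ...but Zygote's reverse mode does not support array mutation and would throw an error:
    # Zygote.gradient(v -> sum_of_squares!(similar(v), v), x)

    # A mutation-free formulation differentiates without trouble under both backends:
    sum_of_squares(v) = sum(abs2, v)
    Zygote.gradient(sum_of_squares, x)   # ([2.0, 4.0, 6.0],)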
The first part of the talk will explain how computer code is transformed to compute derivatives. We will clarify the sometimes confusing terminology of this field (numeric / symbolic / algorithmic differentiation) and push back against common misconceptions (gradients obtained by autodiff are usually exact up to floating point errors!). We will also discuss the difference between forward and reverse mode and the extension to higher-order derivatives.
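To give a taste of the exactness claim before the talk (a quick sketch with our own illustrative function, not the speaker's material): numeric differentiation perturbs the input and loses several digits, while forward-mode autodiff, here with ForwardDiff.jl, applies the chain rule to the actual operations and matches the hand-computed derivative up to floating point rounding.

    using ForwardDiff, Zygote

    f(x) = sin(x^2)
    f_exact(x) = 2x * cos(x^2)                   # derivative computed by hand, for reference

    x0, h = 1.5, 1e-6
    numeric     = (f(x0 + h) - f(x0)) / h        # numeric differentiation: truncation error
    algorithmic = ForwardDiff.derivative(f, x0)  # forward-mode algorithmic differentiation

    abs(numeric     - f_exact(x0))               # a few times 1e-6: several digits are lost
    abs(algorithmic - f_exact(x0))               # zero up to the last floating point bit

    # Forward mode costs roughly one extra pass per input dimension, reverse mode
    # one per output dimension; that is why reverse mode (here Zygote.jl) is the
    # default in machine learning, where many parameters feed one scalar loss.
    loss(w) = sum(abs2, w)
    Zygote.gradient(loss, randn(1000))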
The second part of the talk will provide a comparison between two high-level programming languages with excellent autodiff ecosystems: Python and Julia. We will cover Python's trinity of autodiff frameworks (TensorFlow, PyTorch, JAX) and show that the situation in Julia is very different: the code is not constrained by the choice of autodiff backend. In particular, there is no need to restrict yourself to a specific tensor type or a specific family of operations. We will comment on this crucial distinction, analyzing its upsides and downsides. Finally, we will give some recommendations on how to pick the right language and autodiff library for your own projects.
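As a hedged illustration of this point (the function and values are ours, not the speaker's): a completely ordinary Julia function, written with no autodiff framework in mind, can be handed as-is to several independent backends.

    using ForwardDiff, Zygote

    # A plain loop over a plain Vector{Float64}: no special tensor type required.
    function rosenbrock(x)
        s = zero(eltype(x))
        for i in 1:length(x)-1
            s += 100 * (x[i+1] - x[i]^2)^2 + (1 - x[i])^2
        end
        return s
    end

    x = [0.5, 0.5, 0.5]
    ForwardDiff.gradient(rosenbrock, x)   # forward mode: dual numbers flow through the loop
    Zygote.gradient(rosenbrock, x)[1]     # reverse mode: same function, no rewrite needed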
------------------------
François Pacaud -- MadNLP: leveraging Julia for large-scale nonlinear optimization
We present the solver MadNLP, an implementation of the interior-point method in pure Julia. MadNLP is a port of the solver Ipopt that exploits the many benefits of the Julia language, from multiple dispatch to cutting-edge automatic differentiation libraries. Notably, MadNLP is among the first optimization solvers running seamlessly on Graphics Processing Units (GPUs). In this talk, we will share our experience in pushing nonlinear optimization to the very-large-scale regime using Julia, illustrated by concrete examples.
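For readers who would like to try the solver before the talk, here is a minimal sketch (our own example, assuming MadNLP.jl and JuMP are installed; the GPU path additionally requires the MadNLPGPU package and CUDA hardware):

    using JuMP, MadNLP

    # Minimize the Rosenbrock function with MadNLP through JuMP's modeling layer.
    model = Model(MadNLP.Optimizer)
    @variable(model, x, start = 0.0)
    @variable(model, y, start = 0.0)
    @NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
    optimize!(model)
    value(x), value(y)   # both approach 1.0 at the optimum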
