Interested in the latest technologies, sharing your passion for code, 👨‍💻👩‍💻 or just looking for some drinks, 🍺 food, 🍕 and good company? You've come to the right place! We host regular talks on all things programming, with in-house speakers, in our offices, with our very own bar tap and kitchen. What more could you ask for? 😉
For the first time ever, Picnic is hosting its very own meetup. 🎉 Join us on July 4th for a great evening about Machine Learning and Software Engineering. 😍 As always, 🍺 and 🍕 are on us!
- 18:00: Doors open 🚪
- 19:00: Talk by Bas Vlaming (Picnic): Machine Learning in Logistics: Optimising distribution one delivery at a time
- 19:45: Talk by Thiago de Faria (LINKIT): ML Lifecycle, Continuous Evaluation & DevOps - scaling your ML efforts
- 20:30: Networking
- Bas Vlaming
Bas Vlaming is a data scientist at Picnic. He holds a Ph.D. in physics and worked at MIT and Ocado before joining the team at Picnic Technologies two years ago. He has a keen interest in gaining data-driven insights into processes in complex environments and using them to optimize those processes, leveraging a wide variety of computational techniques.
In this talk, he will explain how neural networks can be used to better predict how much time should be allotted to grocery deliveries, and how this leads to a much more efficient distribution model. We will go through the full development cycle: from gathering relevant data and dealing with unreliable inputs, via feature selection, model development and optimization, to building a robust and scalable framework so that the model predictions can be used in Picnic's operation.
- Thiago de Faria
Thiago de Faria is the Head of Solutions Engineering at LINKIT, a knowledge-driven organization of IT experts who support partners with a transparent path toward mitigating pain points and solving business problems.
Abstract: "You are here because your team is building ML models… but when do we use ML? To solve problems that machines can find patterns without explicitly programming them to do so. And you may have great ML/AI Engineers building great models, but do you feel that after the model is “ready” still take too much time to reach Production? Do you think you could move faster? Help them use more Testing principles? Maybe share what they have done?
That’s where a devops mindset comes in: reduce the batch size, continuous-everything, a culture of failure/experimentation, monitoring & sharing. These principles can be translated into your data team & improve your ML lifecycle.
It is important to know that CI/CD is essential, but we need to talk about CE – Continuous Evaluation -, reduce waste of time and resources while reworking models.
In the end, I will show how the workflow of ana ML Engineer/Data Scientist can be in real life with a live demo!
Van Marwijk Kooystraat[masked] AG Amsterdam