In collaboration with Visser & van Baars (https://www.visservanbaars.nl), we would like to invite you to share your experience and questions about Microservices Architecture and Machine Learning/AI.
18:30 Doors open + 🍔🍔🍔
19:00 Speaker 1
20:00 Speaker 2
20:45 End + drinks
Microservices Architecture and Machine Learning/AI are two popular topics nowadays. So how can we fit ML models into a Microservices Architecture?
Each service in a Microservices Architecture can expose APIs to serve its consumers. API development is a separate expertise and can be challenging for Data Scientists as well as Data Engineers. Companies often hire Data Scientists to analyze data and build ML models, and Data Engineers to set up data infrastructure with pipelines for efficiently collecting, processing, and storing data. After setting up the data infrastructure and building models, the next step is to industrialize those models. There are several ways to do this: a model could be deployed in batch mode for one-off predictions, integrated into streaming applications for continuous, near real-time predictions, or deployed as an API for on-demand predictions. In this demo presentation we will talk about the industrialization of ML models in general and show you how you can build an API to expose your model.
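To give a flavor of the API approach described above, here is a minimal sketch of exposing a model behind an HTTP prediction endpoint. It uses only the Python standard library, and the "model" is a toy linear regressor with made-up weights standing in for a real trained artifact; the endpoint name and payload shape are our own assumptions, not part of the talk.

```python
# Minimal sketch: serving a (toy) ML model as an HTTP prediction API.
# In practice the weights would be loaded from a trained model file,
# and you would likely use a framework such as Flask or FastAPI.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread

# Hypothetical trained parameters (placeholders for a real model artifact).
WEIGHTS = [0.5, 1.5]
BIAS = 0.1

def predict(features):
    """Score one feature vector with the toy linear model."""
    return sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Accept JSON like {"features": [2.0, 4.0]} on POST /predict.
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for the demo.
        pass

def serve(port=8000):
    """Start the prediction server on a background thread."""
    server = HTTPServer(("127.0.0.1", port), PredictHandler)
    Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A consumer (another microservice, for instance) would then POST a feature vector to `/predict` and receive a JSON prediction back, which is the on-demand mode contrasted with batch and streaming above.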
I am a software engineer and trainer with almost 15 years of hands-on experience in designing, developing, and maintaining low-latency, high-availability, scalable software applications. I moved to the Netherlands from Turkey 12 years ago. I worked as a freelance software engineer for about 3.5 years before co-founding Pivot Horizon.
Dennis de Weerdt:
I am a recently graduated software engineer from the Netherlands, with a specialization in machine learning and formal verification. I obtained my Master's degree in Denmark. I joined Pivot Horizon in August this year, and I seek to develop myself as a professional skilled in both data science and data engineering.