For our third Snacks&Hacks meetup, we will be playing around with BERT and transfer learning.
BERT's key technical innovation is applying the bidirectional training of the Transformer, a popular attention-based model, to language modelling. This is in contrast to previous approaches, which read a text sequence either from left to right, or combined separate left-to-right and right-to-left training.
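To get a feel for what "bidirectional" means here, the toy sketch below mimics BERT's masked-word prediction task using simple co-occurrence counts instead of a neural network: a hidden word is guessed from the words on both sides of it, rather than from the left context alone. The corpus and scoring are made up for illustration; this is not how BERT itself works internally.

```python
# Toy illustration of the masked-language-modelling idea behind BERT:
# predict a hidden word using context on BOTH sides of the blank,
# unlike a purely left-to-right model that only sees the preceding words.
# (Hypothetical mini-corpus and count-based scoring, for illustration only.)
from collections import Counter

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat sat on a mat".split(),
]

def fill_mask(left, right, sentences):
    """Score candidate words by how often they occur with BOTH the given
    left and right neighbours, i.e. using bidirectional context."""
    scores = Counter()
    for sent in sentences:
        for i in range(1, len(sent) - 1):
            if sent[i - 1] == left and sent[i + 1] == right:
                scores[sent[i]] += 1
    return scores.most_common(1)[0][0] if scores else None

# "cat [MASK] on" -> the word seen most often between "cat" and "on"
print(fill_mask("cat", "on", corpus))  # -> sat
```

A left-to-right model in the same setting would only see "cat" before the blank; using the right-hand neighbour as well is what narrows the prediction down, which is the intuition behind BERT's bidirectional training.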
We will provide you with more information and materials in the coming weeks.
18:00 - 18:30 Walk-in (food and drinks)
18:30 - 19:00 Introduction to BERT & Transfer Learning
by Anchormen's CTO Jeroen Vlek
19:00 - 21:00 Hacking!
21:00 - 21:15 Wrap up
Bring your own device; we will arrange the rest!
P.S. If you have any questions, please call us on this number:
+31 [masked] and ask for Yordan (facility questions) or Jeroen (technical questions).