
Details

Dear Deep Learning community,

Finally, after way too long, we have our first in-person meeting again – we are extremely excited to return to our real-world meeting format!

This time, we will take a look at the most interesting fields of research presented at the ICLR conference (International Conference on Learning Representations) and will have a presentation on the practical aspects of building one of the largest open-source NLP models.

This event is graciously hosted by A1 at Lassallestraße 9, 1020 Wien.

Our agenda is as follows:

18:30 - 18:40 Welcome & Introduction
18:40 - 19:45 ICLR 2022 - Trends & interesting highlights
19:45 - 20:15 Break
20:15 - 20:25 Announcements
20:25 - 21:00 GPT-NeoX-20B - An Open-Source Autoregressive Language Model
21:00 Networking

The talks:

ICLR 2022 - Trends & interesting highlights
Rene Donner, Medical Volume Annotator mva.ai & contextflow

We will present a distilled overview of this year's International Conference on Learning Representations – one of the largest academic conferences on deep learning. We will look at the most active areas of research, point to the most interesting and approachable tutorials and highlight specific papers and demos!

GPT-NeoX-20B - An Open-Source Autoregressive Language Model
Michael Pieler, EleutherAI

A high-level overview of the GPT-NeoX-20B setup, the largest dense autoregressive language model with publicly available code and weights. We will discuss the distributed training setup, some practical aspects, and the implications and future trends.

Very much looking forward to seeing you all again,
the VDLM organizers

Related topics

Artificial Intelligence
Deep Learning
Machine Learning
Neural Networks
