
Attention Is All You Need

Hosted by Lee and Howard.

Details

Welcome to the DC/NoVA Papers We Love meetup! This month's paper is "Attention Is All You Need."

Papers We Love is an international organization centered around the appreciation of computer science research papers. There's so much we can learn from the landmark research that shaped the field and the current studies that are shaping our future. Our goal is to create a community of tech professionals passionate about learning and sharing knowledge. Come join us!

New to research papers? Watch The Refreshingly Rewarding Realm of Research Papers (https://www.youtube.com/watch?v=8eRx5Wo3xYA) by Sean Cribbs.

Ideas and suggestions are welcome: fill out our interest survey here (https://docs.google.com/forms/d/e/1FAIpQLSeJwLQhnmzWcuyodPrSmqHgqrvNxRbnNSbiWAuwzHwshhy_Sg/viewform) and let us know what motivates you!

// Schedule

• 7:00-7:30–Informal paper discussion

• 7:30-7:35–Introduction and announcements

• 7:35-8:40–Paper and Discussion:
Attention Is All You Need ( https://arxiv.org/abs/1706.03762 ), a paper about machine learning and neural networks, presented by Jenny Ching

• 8:40-9:00–Informal paper discussion

// Directions

CustomInk Cafe (3rd Floor)
Mosaic District, 2910 District Ave #300
Fairfax, VA 22031

When you get here, you can come in via the patio; don't be scared by the metal gate and sign. It's accessible via the outside stairs near True Food. There is a parking garage next door for those arriving by car, and a walkway to the patio on the 3rd floor of the garage nearest MOM's Organic Market.

Metro: The Dunn Loring Metro station is about 0.7 miles from our meetup location. It's very walkable, but if you'd prefer a bus, the 402 Southbound and 1A/1B/1C Westbound leave from Dunn Loring Station about every 5-10 minutes (see the schedule for a detailed timetable).

If you're late, we totally understand; please still come! (Entering via the patio is best.) Just be sure to slip in quietly if a speaker is presenting.

// Papers

Attention Is All You Need

Web page with abstract: https://arxiv.org/abs/1706.03762
PDF: https://arxiv.org/pdf/1706.03762.pdf

The attention mechanism is not new; it is a concept that has already helped improve the performance of neural machine translation systems. The dominant sequence transduction models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder, and the best-performing models also connect the encoder and decoder through an attention mechanism. In this paper, however, the model is based solely on attention: we will look at the Transformer, an architecture that dispenses with recurrence and convolutions entirely and uses attention to boost the speed with which these models can be trained.
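
If you want a concrete feel for the core building block before the meetup, below is a minimal sketch of the scaled dot-product attention that the Transformer stacks into multi-head attention, following Equation 1 of the paper: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. This is plain NumPy, not the paper's code; the toy shapes and variable names are our own.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Eq. 1 in the paper)."""
        d_k = Q.shape[-1]
        # Similarity scores between queries and keys, scaled by sqrt(d_k)
        scores = Q @ K.T / np.sqrt(d_k)
        # Softmax over the key dimension gives attention weights for each query
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output is a weighted average of the value vectors
        return weights @ V

    # Toy example: 3 query positions, 4 key/value positions, d_k = d_v = 8
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(3, 8))
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)

The scaling by sqrt(d_k) keeps the dot products from growing large with dimension, which would push the softmax into regions with very small gradients; the paper discusses this choice in Section 3.2.1.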
