MLT @DeepCon: CNN Architectures


Details
DeepCon is a Deep Learning conference in Otemachi, Tokyo, organized by Adam Gibson (Skymind), Junpei Hirono (Microsoft), and Hiroshi Maruyama (Preferred Networks). Find all workshops here: https://deepcon.github.io/
The MLT workshop on CNN Architectures is conducted by our MLT Core Team Engineers Dimitris Katsios, Mustafa Yagmur, and Alisher Abdulkhaev.
-- Part 1 --
-- A Historical Review of Deep CNNs
-- Part 2 --
-- Popular CNN Architectures (interactive implementation)
-- VGG-Net: 3x3 vs 11x11 Convolution
-- Inception-Net: “1x1 convolution” vs “Fully Connected”
-- Xception: Separable Convolutions in Inception-Networks
-- MobileNet: Depthwise Separable Convolutions for Training Lightweight Models (sketched after this list)
-- ResNet: Residual Connections in Convolutional Networks (also sketched after this list)
-- DenseNet: Dense Connections Between Convolutional Layers
-- SqueezeNet: Compact Networks with Fewer Parameters (and Cheaper Distributed Training)
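For anyone who wants to tinker before (or after) the session, the following is a minimal PyTorch sketch of the depthwise separable convolution behind Xception and MobileNet; the framework and the class name are our own choices, not necessarily what the workshop notebooks use. A 3x3 depthwise convolution filters each input channel on its own, and a 1x1 pointwise convolution then mixes information across channels.

import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """A 3x3 depthwise convolution (one filter per channel) followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        # groups=in_ch makes the 3x3 convolution act on each channel independently (depthwise).
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        # The 1x1 (pointwise) convolution mixes information across channels.
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.bn(self.pointwise(self.depthwise(x))))

x = torch.randn(1, 32, 56, 56)
print(DepthwiseSeparableConv(32, 64)(x).shape)  # torch.Size([1, 64, 56, 56])

The saving is easy to check by hand: a standard 3x3 convolution from 32 to 64 channels needs 3*3*32*64 = 18,432 weights, while the depthwise + pointwise pair needs 3*3*32 + 32*64 = 2,336, which is why MobileNet-style models stay so light.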
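In the same spirit, here is a minimal sketch of the residual (skip) connection behind ResNet, again in PyTorch by assumption: the block learns a correction F(x) and adds the unchanged input x back, which keeps gradients flowing through very deep stacks.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic ResNet-style block: output = ReLU(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # the skip connection: add the input back before the final ReLU

x = torch.randn(1, 32, 56, 56)
print(ResidualBlock(32)(x).shape)  # torch.Size([1, 32, 56, 56])

DenseNet pushes the same idea further by concatenating, rather than adding, the feature maps of all earlier layers within a block.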
-- Part 3 --
-- Advanced Deep CNN Architectures (short summary only)
-- ShuffleNet
-- Squeeze-and-Excitation Networks (SENet), sketched after this list
-- Feature Pyramid Networks (FPNs)
-- Neural ODEs
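As a small taste of the advanced part, below is a minimal PyTorch sketch (the framework is again our assumption) of the Squeeze-and-Excitation block: global average pooling "squeezes" each feature map to one number, a small bottleneck MLP turns those numbers into per-channel weights, and the weights rescale ("excite") the original channels.

import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation block: reweight channels by globally pooled statistics."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: (N, C, H, W) -> (N, C, 1, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel weights in (0, 1)
        )

    def forward(self, x):
        n, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(n, c)).view(n, c, 1, 1)
        return x * w  # excite: rescale each channel of the input

x = torch.randn(2, 64, 28, 28)
print(SEBlock(64)(x).shape)  # torch.Size([2, 64, 28, 28])

The block is cheap and architecture-agnostic, which is why SE modules are often dropped into existing ResNet or Inception backbones.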
-- THANK YOU --
Thank you to Adam Gibson (Skymind), Junpei Hirono (Microsoft), and Hiroshi Maruyama (Preferred Networks) for having us.
-- MLT PATRON --
Become an MLT Patron and help us keep MLT meetups like this one inclusive and free of charge. https://www.patreon.com/MLTOKYO
-- MLT RESOURCES --
Slack: https://goo.gl/WnbYUP
Github: https://github.com/Machine-Learning-Tokyo
YouTube: https://www.youtube.com/MLTOKYO
