• E. Feuilleaubois (curator of twitter.com/Deep_In_Depth): "Transfer Learning with Convolutional Neural Network - Off the shelf top notch performances"
• M. Ducoffe (CNRS-Lab I3S): "Learning Wasserstein Embeddings"
E. Feuilleaubois:
The application of Convolutional Neural Networks (CNNs) to image recognition has led to highly effective open-source models (e.g. VGG16, ResNet50, SqueezeNet, InceptionV3, …, freely available in the Keras core) that can discriminate between thousands of object classes with extremely good accuracy. These models were trained on very large databases with high computing power, neither of which is necessarily available for other tasks. Transfer learning (TL), which uses a model developed for one task as a starting point for a different task, makes this performance available to custom image classification tasks and avoids many of the hurdles that can hamper building a CNN classification model from scratch.
The TL process specific to CNNs will be presented, including fine-tuning techniques that can significantly improve the first results obtained by the TL process.
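As a minimal sketch of the TL pattern (not the talk's actual code), the snippet below freezes a feature extractor and trains only a new classification head on top. A fixed random projection stands in for a real pretrained conv base such as `keras.applications.VGG16(include_top=False)`; the data and labels are synthetic, chosen so that the frozen features are informative for the new task, which is the working assumption behind transfer learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained conv base (e.g. VGG16 without its top
# layers): a fixed random projection followed by a ReLU. The weights are
# never updated during training.
W_frozen = rng.normal(size=(100, 20)) / 10.0

def extract_features(x):
    """Frozen feature extractor: no gradient ever flows into W_frozen."""
    return np.maximum(x @ W_frozen, 0.0)

# Synthetic "new task" data whose labels are predictable from the frozen
# features -- the assumption that makes transfer learning work.
X = rng.normal(size=(200, 100))
F = extract_features(X)
v_true = rng.normal(size=20)
y = (F @ v_true > 0).astype(float)

# Train ONLY the new head (logistic regression) by gradient descent.
w, b, lr = np.zeros(20), 0.0, 0.1
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))   # sigmoid predictions
    w -= lr * (F.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

acc = np.mean(((F @ w + b) > 0) == (y == 1))
print(f"train accuracy of new head on frozen features: {acc:.2f}")
```

Fine-tuning, mentioned above, would be the next step: unfreeze some of the top layers of the base and continue training with a small learning rate.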
M. Ducoffe:
The Wasserstein distance has recently received a lot of attention in the machine learning community because it is a powerful tool for comparing data distributions, with wide applications in image processing, computer vision and machine learning. It has already found numerous applications in several hard problems, such as domain adaptation, dimensionality reduction and generative models. However, its use is still limited by a heavy computational cost. We provide an approximation mechanism that overcomes this inherent complexity and makes it possible to solve optimization problems in the Wasserstein space extremely fast.
Numerical experiments supporting this idea are conducted on the MNIST digit dataset and the Google doodle drawings dataset. They show the wide potential benefits of the presented method.
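The core idea, an embedding in which plain Euclidean distances approximate Wasserstein distances, can be sketched in a self-contained way. The paper trains a siamese network for this; the snippet below uses classical MDS as a minimal linear stand-in for the same principle, on synthetic 1-D distributions (not MNIST), where the Wasserstein-1 distance has an exact closed form via sorted samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# 30 one-dimensional empirical distributions (100 samples each) with
# shifting means -- a toy stand-in for a real image dataset.
n, m = 30, 100
mus = np.linspace(-3.0, 3.0, n)
samples = np.sort(rng.normal(loc=mus[:, None], scale=1.0, size=(n, m)), axis=1)

# Exact 1-D Wasserstein-1 distance between equal-size empirical
# distributions: mean absolute difference of their order statistics.
D = np.mean(np.abs(samples[:, None, :] - samples[None, :, :]), axis=2)

# Embed into R^2 so that Euclidean distances approximate Wasserstein
# ones (classical MDS: double-centre the squared distances, then take
# the top eigenvectors of the resulting Gram matrix).
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
emb = vecs[:, -2:] * np.sqrt(np.clip(vals[-2:], 0.0, None))

# Euclidean distances in the embedding vs. true Wasserstein distances.
D_emb = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
iu = np.triu_indices(n, k=1)
corr = np.corrcoef(D[iu], D_emb[iu])[0, 1]
print(f"correlation between embedded and true W1 distances: {corr:.3f}")
```

Once such an embedding is learned, Wasserstein-space operations (nearest neighbours, barycenters) reduce to cheap Euclidean ones, which is the source of the speed-up claimed above.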
Courty, Flamary & Ducoffe (2017): "Learning Wasserstein embeddings"