Details

This Friday we'll have two talks followed by drinks.

17:00 Benyou Wang (University of Padova): Quantum formulations for language: understanding words as particles

Quantum Information Retrieval was proposed a decade ago to unify logic, vector spaces, and probability in a single framework. Recently, the framework has also attracted attention as a way to understand language. The basic idea is to use probability to derive the training/inference procedure in continuous vector spaces (e.g. in neural networks, discrete word indices are necessarily embedded as continuous vectors). With explicit probabilities in neural networks, one can design more interpretable components by borrowing mature mathematical tools from physics: the same language that probabilistically describes particles can be tailored to model words. A neural network driven by quantum probability theory, built on this idea, achieves performance comparable to state-of-the-art neural networks such as CNNs and RNNs.
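To make the "words as particles" idea concrete, here is a minimal, hypothetical sketch (not the speaker's actual model): each word is a unit vector (a pure quantum state) in a semantic space, a sentence is a density matrix (a mixture of its words' states), and matching a query word against the sentence follows the Born rule of quantum probability. The embeddings and word list below are toy values chosen for illustration.

```python
import numpy as np

def normalize(v):
    """Unit vector: a word as a pure quantum state |w>."""
    return v / np.linalg.norm(v)

# Toy 3-dimensional "embeddings" (hypothetical, not trained values).
words = {
    "cat": normalize(np.array([1.0, 0.2, 0.0])),
    "dog": normalize(np.array([0.9, 0.4, 0.1])),
    "car": normalize(np.array([0.0, 0.1, 1.0])),
}

def density_matrix(tokens, weights=None):
    """Sentence as a mixed state rho = sum_i p_i |w_i><w_i|.

    rho is positive semi-definite with unit trace, so it defines a
    valid probability distribution over measurement outcomes.
    """
    n = len(tokens)
    p = np.full(n, 1.0 / n) if weights is None else np.asarray(weights)
    return sum(pi * np.outer(words[t], words[t])
               for pi, t in zip(p, tokens))

def match_prob(rho, query):
    """Born rule: probability of 'observing' the query word,
    tr(rho |q><q|) = <q| rho |q>."""
    q = words[query]
    return float(q @ rho @ q)

rho = density_matrix(["cat", "dog"])
print(float(np.trace(rho)))              # a density matrix has trace 1
print(match_prob(rho, "cat") > match_prob(rho, "car"))
```

In the actual models discussed in the talk, such probabilities become differentiable network components, which is what makes them interpretable through the measurement formalism rather than as opaque activations.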

Bio:
Benyou Wang is a second-year PhD student at the University of Padova, Italy. Before starting his PhD, he received his master's degree from Tianjin University (the first modern Chinese university) and worked at Tencent, one of the world's largest IT companies. Early in his research career he has already received several respected awards in related fields, including the SIGIR 2017 Best Paper Honorable Mention and the NAACL 2019 Best Explainable NLP Paper award. His research interests lie in interpreting and designing well-motivated, mathematically sound models, algorithms, and components for NLP and IR problems.