This Friday we'll have two talks followed by drinks.
17:00 Benyou Wang (University of Padova) Quantum formulations for language: understanding words as particles
Quantum Information Retrieval was proposed a decade ago to unify logic, vector spaces, and probability in a single framework. Recently, the framework has also attracted attention as a way to understand language. The basic idea is to use probability theory to derive training/inference procedures in continuous vector spaces (e.g., in neural networks, discrete word indices are necessarily embedded as continuous vectors). With explicit probabilities in neural networks, one can design more interpretable components by borrowing mature mathematical tools from physics. The language used to probabilistically describe particles can be tailored to model words in natural language as well. A neural network driven by quantum probability theory, built on these ideas, achieves performance comparable to state-of-the-art neural networks such as CNNs and RNNs.
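To make the "words as particles" intuition concrete, here is a minimal numpy sketch (a hypothetical illustration, not the speaker's actual model): each word vector is treated as a pure quantum state (a unit vector), and a sentence becomes a probability-weighted mixture of the corresponding projectors, i.e., a density matrix. The dimensionality, weights, and function names below are made up for illustration.

```python
import numpy as np

def density_matrix(word_vecs, weights):
    """Mix unit-normalized word vectors |w_i> into a density matrix
    rho = sum_i p_i |w_i><w_i|  (symmetric, trace 1, positive semidefinite)."""
    dim = word_vecs.shape[1]
    rho = np.zeros((dim, dim))
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                       # make the weights a distribution
    for w, p_i in zip(word_vecs, p):
        w = w / np.linalg.norm(w)         # pure state: unit vector
        rho += p_i * np.outer(w, w)       # projector weighted by p_i
    return rho

def measure(rho, v):
    """Probability of observing state |v> under rho: <v|rho|v>, in [0, 1]."""
    v = v / np.linalg.norm(v)
    return float(v @ rho @ v)

# toy 4-dimensional "sentence" of three words
rng = np.random.default_rng(0)
words = rng.normal(size=(3, 4))
rho = density_matrix(words, [0.5, 0.3, 0.2])
print(np.trace(rho))            # ~1.0: a valid density matrix
print(measure(rho, words[0]))   # how compatible word 0 is with the sentence
```

Because the probabilities are explicit in `rho`, measurements like `measure` give interpretable quantities, which is the kind of transparency the quantum formulation aims for.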
Benyou Wang is a second-year PhD student at the University of Padova, Italy. Before starting his PhD, he received his master's degree from Tianjin University (the first modern Chinese university) and worked at Tencent (the sixth-largest IT company in the world). His early-stage research has already earned several respected awards in related fields, including the SIGIR 2017 Best Paper Honorable Mention Award and the NAACL 2019 Best Explainable NLP Paper award. His research interest lies in interpreting and designing well-motivated, mathematically sound models, algorithms, and components for NLP and IR problems.
17:30 Wanyu Chen (ILPS, University of Amsterdam) A dynamic co-attention network for session-based recommendation
Session-based recommendation is the task of recommending the next item a user might be interested in given partially known session information, e.g., part of a session or recent historical sessions. An effective session-based recommender should be able to exploit a user’s evolving preferences, which we assume to be a mixture of her short- and long-term interests. Existing session-based recommendation methods often embed a user’s long-term preference into a static representation, which plays a fixed role when dealing with her current short-term interests. This is problematic because long-term preferences may be more or less important for predicting the next conversion depending on the user’s short-term interests. We propose a Dynamic Co-attention Network for Session-based Recommendation (DCN-SR). DCN-SR applies a co-attention network to capture the dynamic interactions between the user’s long- and short-term interaction behavior and generates co-dependent representations of the user’s long- and short-term interests. For modeling a user’s short-term interaction behavior, we also design a Contextual Gated Recurrent Unit (CGRU) network that takes actions like “click”, “collect” and “buy” into account. Experiments on e-commerce datasets show significant improvements of DCN-SR over state-of-the-art session-based recommendation methods.
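The co-attention idea in the abstract can be sketched as follows. This is a hypothetical numpy illustration under simplifying assumptions, not the authors' DCN-SR implementation: given item embeddings for the long-term history L and the current session S, an affinity matrix scores every long/short-term item pair, and softmax weights in both directions yield co-dependent summaries of each side. All names and dimensions below are invented for the example.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def co_attention(L, S):
    """Co-dependent summaries of long-term (L: m x d) and short-term
    (S: n x d) behavior, coupled through a shared affinity matrix."""
    A = L @ S.T                        # (m, n) pairwise affinity scores
    attn_L = softmax(A.max(axis=1))    # each long-term item weighted by its
    attn_S = softmax(A.max(axis=0))    # best short-term match, and vice versa
    long_repr = attn_L @ L             # (d,) short-term-aware long-term summary
    short_repr = attn_S @ S            # (d,) long-term-aware short-term summary
    return long_repr, short_repr

rng = np.random.default_rng(1)
L = rng.normal(size=(5, 8))   # 5 historical items, embedding dim 8
S = rng.normal(size=(3, 8))   # 3 current-session items
long_repr, short_repr = co_attention(L, S)
print(long_repr.shape, short_repr.shape)   # (8,) (8,)
```

The key property is that each summary depends on the *other* side's content, so the long-term representation is no longer static: it is re-weighted for every session, which is the dynamic behavior the paper targets.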