Crossing boundaries: from cross-lingual learning to creative thought
The first part of her presentation is concerned with multilingual transformer-based language models. The central question is what kind of representations these models learn, in particular when they cross language boundaries. She and her colleagues performed several experiments in which they fine-tuned mBERT in multiple ways for two tasks, cross-lingual part-of-speech tagging and natural language inference, and studied the effect on the language-specific and language-neutral representations in mBERT.

In the second part of her talk, she argued more generally that, in order to encompass a larger share of what human intelligence entails, we need models that learn more from less data and extrapolate beyond the training distribution, possibly drawing inspiration from human cognition. For example, creative thinking is a human ability that has been underexplored in computational modelling, even though it is an important ingredient for engaging technology, ethical AI, and innovation. She presented some of their recent work on computational creativity, more specifically on novel concept generation.
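To give a concrete flavour of the cross-lingual fine-tuning setup described in the first part of the talk, here is a minimal, illustrative sketch using mBERT and the Hugging Face Transformers library. The data, tag set, and training loop are toy placeholders, and this is not the speaker's actual experimental code; it only shows the general pattern of fine-tuning on one language and evaluating zero-shot on another.

```python
# Illustrative sketch only: fine-tune mBERT as a POS tagger on (toy) English data,
# then apply it zero-shot to a sentence in another language.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "bert-base-multilingual-cased"  # mBERT
POS_TAGS = ["NOUN", "VERB", "ADJ", "ADV", "PRON", "DET", "ADP", "PUNCT"]  # toy tag set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=len(POS_TAGS))

# One toy English training sentence with word-level POS labels (hypothetical data).
words = ["The", "cat", "sleeps", "."]
labels = ["DET", "NOUN", "VERB", "PUNCT"]

# Tokenize into subwords and align word-level labels to the first subword of each word;
# special tokens and continuation subwords get -100 so they are ignored by the loss.
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
label_ids, previous_word = [], None
for word_id in encoding.word_ids():
    if word_id is None or word_id == previous_word:
        label_ids.append(-100)
    else:
        label_ids.append(POS_TAGS.index(labels[word_id]))
    previous_word = word_id

# A single gradient step; a real experiment would iterate over a full treebank.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**encoding, labels=torch.tensor([label_ids]))
outputs.loss.backward()
optimizer.step()

# Zero-shot cross-lingual transfer: tag a German sentence with the tagger
# fine-tuned only on English data.
model.eval()
with torch.no_grad():
    test = tokenizer(["Die", "Katze", "schläft", "."],
                     is_split_into_words=True, return_tensors="pt")
    predictions = model(**test).logits.argmax(dim=-1)
```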
You can now watch or rewatch this talk here.
And you can find the program of this Research Seminar in Language Technology here.