Speaker: Prof. Flora Salim from UNSW Sydney
Location: EE Conference room
Abstract:
In this talk, I will present our self-supervised learning (SSL) pretraining approaches for multimodal sensor data, along with our recent work on multimodal self-supervision. I will show why the Transformer architecture, originally designed for sequence-to-sequence modelling with its multi-head attention mechanism, has become a popular choice for time-series data and for spatio-temporal forecasting tasks. I will also present recent work on leveraging Large Language Models (LLMs), and on benchmarks and datasets, for time-series modelling, with applications such as traffic forecasting and energy demand forecasting. Finally, I will present our ongoing work on model explainability and robustness.
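As background for the attention claim above, here is a minimal sketch of a Transformer encoder applied to univariate time-series forecasting, where each attention head can attend to different lags or periodicities in the input window. It assumes PyTorch; the model name, hyperparameters, and forecasting setup are illustrative placeholders, not the speaker's actual architecture.

# Minimal sketch: Transformer encoder for one-step time-series forecasting.
# Illustrative only; TinyTimeSeriesTransformer and its hyperparameters are
# hypothetical, not the method presented in the talk.
import torch
import torch.nn as nn

class TinyTimeSeriesTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2, horizon=1):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)   # embed each scalar observation
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, horizon)   # forecast from the last position

    def forward(self, x):                 # x: (batch, seq_len, 1)
        h = self.encoder(self.input_proj(x))
        return self.head(h[:, -1, :])     # (batch, horizon)

# Usage: forecast the next value of a univariate series from a 96-step window.
model = TinyTimeSeriesTransformer()
window = torch.randn(8, 96, 1)            # 8 series, 96 past observations each
print(model(window).shape)                # torch.Size([8, 1])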
Bio:
Flora Salim is a Professor in the School of Computer Science and Engineering (CSE) at the University of New South Wales (UNSW) Sydney, the inaugural Cisco Chair of Digital Transport & AI, and the Deputy Director (Engagement) of the UNSW AI Institute. Her research focuses on machine learning for time-series and multimodal sensor data, and on trustworthy AI. She has received several prestigious fellowships, including the Humboldt-Bayer Fellowship, the Humboldt Fellowship, the Victoria Fellowship, and the ARC Australian Postdoctoral (Industry) Fellowship.