Systems & Control PhD Seminar Series | In-Context Learning for Estimation and Control

Wednesday, 18 June 2025 | 11:45
Dipartimento di Elettronica, Informazione e Bioingegneria - Politecnico di Milano
Sala Conferenze "Emilio Gatti" (Edificio 20)
Speaker: Alessandro Colombo (Politecnico di Milano)
Contact: Prof. Simone Formentin | simone.formentin@polimi.it
Abstract
In-Context Learning (ICL) refers to extrapolating the underlying input-output relationships across a class of systems, rather than for a single system in isolation. By exploiting transformer-based model architectures, Large Language Models (LLMs), such as ChatGPT and DeepSeek, can respond instantly to a given input with a reasonably accurate output, without any retraining or adaptation. In this seminar, we show how a modified transformer architecture, which receives a time series as input instead of language tokens, can be used to estimate the dynamic variables of different systems belonging to the same class. Training data for the model are obtained through simulation: in such a controlled environment, system parameters can be perturbed arbitrarily, effectively generating different individual instances of a class of systems. The presented case study shows how this type of filter can achieve performance comparable to, or even surpassing, that of classical filtering methods fed with the same input-output data.
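The data-generation idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the speaker's actual setup): it assumes a simple scalar linear system class whose parameters are perturbed at random, producing one noisy measurement/state trajectory per instance; a transformer filter would then be meta-trained on many such (measurement, state) pairs. All function names and parameter ranges here are invented for illustration.

```python
import numpy as np

def sample_system(rng):
    # Draw one instance of the (hypothetical) system class
    # x[k+1] = a*x[k] + b*u[k] by perturbing its parameters at random.
    a = rng.uniform(0.5, 0.99)
    b = rng.uniform(0.1, 1.0)
    return a, b

def simulate(a, b, u, x0, noise_std, rng):
    # Simulate one trajectory; the noisy measurements y play the role
    # of the time-series tokens fed to the transformer filter.
    x = np.empty(len(u) + 1)
    x[0] = x0
    for k in range(len(u)):
        x[k + 1] = a * x[k] + b * u[k]
    y = x[:-1] + noise_std * rng.standard_normal(len(u))
    return y, x[:-1]  # (measurements, true states)

def make_dataset(n_systems, horizon, seed=0):
    # Each element is a (y, x) pair from a different system instance;
    # training across many instances is what lets the model estimate
    # in context for a new, unseen member of the class.
    rng = np.random.default_rng(seed)
    data = []
    for _ in range(n_systems):
        a, b = sample_system(rng)
        u = rng.standard_normal(horizon)
        y, x = simulate(a, b, u, x0=0.0, noise_std=0.1, rng=rng)
        data.append((y, x))
    return data

dataset = make_dataset(n_systems=100, horizon=50)
```

Under this sketch, the transformer never sees the parameters (a, b) directly: it must infer the system's behavior from the context window itself, which is the essence of the in-context filtering approach described in the abstract.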