Talk Details

Towards Flexible and Efficient Learning

As artificial intelligence systems increasingly operate in dynamic, resource-constrained environments, they must adopt fundamentally different learning principles. How can we design AI models that are powerful yet capable of adapting efficiently, in ways more closely aligned with natural intelligence? This talk outlines a research trajectory focused on developing learning systems that combine flexibility across time scales and structures with efficiency in computation, energy, and representation. Inspired by neuroscience and dynamical systems, our research centers on recurrent architectures as a core computational principle, reflecting the deeply recurrent nature of the brain across spatial and temporal levels. We explore architectural principles that promote modularity, sparsity, temporal abstraction, and local plasticity. I will highlight selected works that exemplify this perspective, including innovations in minimal and scan-compatible recurrent networks, spiking models with oscillatory dynamics, and recent refinements of foundational components such as activation functions. These contributions serve as building blocks in a broader vision for flexible, scalable, and more sustainable learning, grounded in the temporal structure of the world.
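As a rough illustration of what "scan-compatible" means here: when a recurrence is linear in the previous hidden state, the whole sequence can be computed with a parallel associative scan rather than a step-by-step loop. The sketch below assumes a minGRU-style gated linear recurrence; `min_gru_scan` and its parameters are hypothetical names for illustration, not the speaker's implementation.

```python
# Minimal sketch of a scan-compatible recurrent unit in JAX.
# Assumption: a minGRU-style gated linear recurrence,
# h_t = (1 - z_t) * h_{t-1} + z_t * h_tilde_t, which is linear in h_{t-1}
# and therefore parallelizable over time with an associative scan.

import jax
import jax.numpy as jnp

def min_gru_scan(x, w_z, w_h):
    """x: (T, d_in); w_z, w_h: (d_in, d_hid). Returns hidden states (T, d_hid)."""
    z = jax.nn.sigmoid(x @ w_z)   # update gate, computed in parallel over time
    h_tilde = x @ w_h             # candidate state, also parallel over time
    # Factor the recurrence as h_t = a_t * h_{t-1} + b_t:
    a = 1.0 - z
    b = z * h_tilde

    # Composing two affine maps h -> a2 * (a1 * h + b1) + b2 gives
    # (a1 * a2) * h + (a2 * b1 + b2); this combine is associative.
    def combine(left, right):
        a_l, b_l = left
        a_r, b_r = right
        return a_l * a_r, a_r * b_l + b_r

    # Parallel prefix over time; with h_0 = 0 the b-component of each
    # prefix is exactly h_t.
    _, h = jax.lax.associative_scan(combine, (a, b), axis=0)
    return h

# Usage check: the scan matches the sequential recurrence.
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
x = jax.random.normal(k1, (6, 4))
w_z = jax.random.normal(k2, (4, 3))
w_h = jax.random.normal(k3, (4, 3))
h_par = min_gru_scan(x, w_z, w_h)

h = jnp.zeros(3)
for t in range(6):
    z_t = jax.nn.sigmoid(x[t] @ w_z)
    h = (1.0 - z_t) * h + z_t * (x[t] @ w_h)
print(jnp.allclose(h_par[-1], h, atol=1e-5))  # True
```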
As a rule, the talks are part of lecture series at the University of Bremen and are not open to the public. If you are interested in attending, please contact the secretariat at sek-ric(at)dfki.de.