Title
TFML Talk: (Sparse) Compositionality
Speaker
Tomaso Poggio - Massachusetts Institute of Technology
This talk is part of the TFML PhD School, but open to everyone interested. You can view the full program below.
Abstract
Let us define learnability of a function f as follows: f is learnable when (i) f admits a poly(d)-parameter approximation with provable generalization and (ii) those parameters are efficiently recoverable from data. We know that every efficiently Turing-computable f can be represented by a compositionally sparse DAG of low-arity constituent functions, so a sparse deep network of polynomial width satisfies (i). Achieving (ii) is harder: gradient-based end-to-end training is in general intractable and, when it works, produces task-specific constituent functions. Transfer arises only if the training data include examples of each sub-function, i.e., a curriculum that teaches primitives before compositions, mirroring infant development. Thus theory, computation and cognition converge on sparse compositionality, and on curricula targeting reusable primitives, as prerequisites for scalable intelligence.
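To make the notion of compositional sparsity concrete, here is a minimal, hypothetical sketch (not from the talk): a function of d = 8 variables built as a binary tree of 2-ary constituent functions, each with a constant number of parameters, so the total parameter count grows linearly in d. The specific constituent networks and their sizes are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def make_constituent():
    # A 2-ary constituent h(u, v): R^2 -> R with a constant number of parameters
    # (a tiny one-hidden-layer network: 4x2 hidden weights + 4 output weights = 12).
    W = rng.standard_normal((4, 2))
    w = rng.standard_normal(4)
    return lambda u, v: w @ np.tanh(W @ np.array([u, v]))

# Binary-tree composition over d = 8 inputs:
# f(x1..x8) = h7( h5(h1(x1,x2), h2(x3,x4)), h6(h3(x5,x6), h4(x7,x8)) )
h = [make_constituent() for _ in range(7)]

def f(x):
    a = h[0](x[0], x[1]); b = h[1](x[2], x[3])
    c = h[2](x[4], x[5]); e = h[3](x[6], x[7])
    return h[6](h[4](a, b), h[5](c, e))

x = rng.standard_normal(8)
print(f(x))  # scalar output; 7 constituents x 12 parameters = 84 parameters, linear in d

The point of the sketch is condition (i) of the abstract: because each node of the DAG depends on only a few inputs, the whole function is captured with a number of parameters polynomial (here linear) in d, whereas a generic dense function of all 8 variables would not admit such a compact, provably generalizing approximation.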
Bio
Tomaso A. Poggio is the Eugene McDermott Professor in MIT's Department of Brain and Cognitive Sciences and the director of the NSF Center for Brains, Minds and Machines at MIT. He is a founding member of the McGovern Institute and of the Computer Science and Artificial Intelligence Laboratory. A former Corporate Fellow of Thinking Machines Corporation and former director of PHZ Capital Partners, Inc. and of Mobileye, he has been involved in starting or investing in several other high-tech companies, including Arris Pharmaceutical, nFX, Imagen, Digital Persona, DeepMind and Orcam. He is one of the most cited computational scientists and has mentored PhD students and postdocs who are among the current leaders in the science and engineering of intelligence.
When
Wednesday, June 25, 2025, 11:30
Where
Room 509, UniGe DIBRIS/DIMA, Via Dodecaneso 35