The Next Generation of AI
A brief history
From purely symbolic systems and perceptrons in the 1970s to today's deep learning and LLM architectures, AI has made tremendous progress over the last 50 years. Yet we believe that scaling alone is not enough: new ideas are now needed to reach the next generation of AI, one capable of synthesizing the advances of the last decades into approaches that are more robust, grounded in interaction, and capable of real understanding.
Where we are today
Today's frontier AI research focuses on key limitations of current deep learning and LLM approaches: understanding causality, grounding shared meaning, reasoning and abstraction formation, embodiment, interaction, and life-long learning. The dominant paradigm for tackling these issues is "end-to-end differentiable learning", which requires that all of them be framed within models that can be trained by gradient descent.
At Developmental Labs, we challenge this paradigm and instead advocate hybrid approaches, combining the best of neural network technologies with explicit cognitive architectures that make use of discrete representations, black-box optimization techniques, algorithmic methods, and formal approaches.
We also pursue a developmental approach to AI, in which AI systems are embodied in a physical and social world and build knowledge incrementally through interaction with humans. Language emergence is a cornerstone of this approach: it is central to the process by which grounded meaning is socially negotiated between agents, out-of-distribution goals are explored, and joint attention and joint plans are established.
Combine the best of deep learning and algorithmic symbolic approaches into a hybrid cognitive architecture.
Train AIs in VR simulations, interacting with humans to build grounded common sense and language.