year: 2021/12
paper: https://arxiv.org/pdf/2201.10346
website:
code:
connections: michael levin, collective intelligence, cognitive light cone, self, cognition, basal cognition, GRN, emergence
What is a true Agent?
For example, one view holds that only biological, evolved forms have intrinsic motivation, while software AI agents merely fake it through functional performance (they don't actually care). But which biological systems really do care: fish? Single cells? Do mitochondria (once free-living organisms) have true preferences about physiological states? The lack of consensus on this question even among purely biological systems highlights the futility of binary categories.
→
There is no magical jump to "true cognition"
For any putative dividing line between a creature said to have true preferences, memories, and plans and one said to have none, we can now construct in-between, hybrid forms that make it impossible to say whether the resulting being is an Agent or not.
→
Given the gradualist nature of the framework, the key questions for any agent are "how well", "how much", and "what kind" of capacity it has along each of these key aspects (preferences, memories, plans), which in turn allows agents to be compared directly in an option space.
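A minimal sketch of what such an option-space comparison could look like, assuming toy axes (spatial reach, temporal horizon, memory depth, loosely inspired by the cognitive light cone); the `AgentProfile` and `compare` names and all numeric values are illustrative, not taken from the paper:

```python
from dataclasses import dataclass

# Illustrative sketch: place each agent in a continuous "option space" of
# capacities instead of a binary agent / non-agent category. The axes are
# assumptions, not the paper's formalism.

@dataclass
class AgentProfile:
    name: str
    spatial_reach: float     # spatial scale of the largest goal it can pursue (arbitrary units)
    temporal_horizon: float  # how far into the future its goals extend (arbitrary units)
    memory_depth: float      # how far back its past states shape behaviour (arbitrary units)

    def capacity_vector(self) -> tuple[float, float, float]:
        return (self.spatial_reach, self.temporal_horizon, self.memory_depth)


def compare(a: AgentProfile, b: AgentProfile) -> dict[str, float]:
    """Per-axis ratio a/b: 'how much' more or less of each capacity a has than b."""
    axes = ("spatial_reach", "temporal_horizon", "memory_depth")
    return {
        axis: va / vb
        for axis, va, vb in zip(axes, a.capacity_vector(), b.capacity_vector())
    }


if __name__ == "__main__":
    # Toy values chosen only to show differences of degree, not of kind.
    cell = AgentProfile("single cell", spatial_reach=1.0, temporal_horizon=1.0, memory_depth=10.0)
    dog = AgentProfile("dog", spatial_reach=1e6, temporal_horizon=1e4, memory_depth=1e6)
    print(compare(dog, cell))
```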
No privileged substrate is required for a self.
Self
A self is an agent that pursues goals, …
Scaling small agents into larger agents.
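A toy sketch of this scaling step, under the assumption that a collective's goal space grows with the combined reach of its members and the strength of their coupling (e.g. gap-junction-style information sharing in a tissue); `scale_up` and the `coupling` bonus are hypothetical, not the paper's model:

```python
from dataclasses import dataclass

@dataclass
class Agent:
    spatial_reach: float     # spatial scale of the largest goal it can pursue (arbitrary units)
    temporal_horizon: float  # temporal scale of the largest goal it can pursue (arbitrary units)


def scale_up(members: list[Agent], coupling: float = 0.5) -> Agent:
    """Compose small agents into a larger one.

    Toy rule: the collective covers at least the combined territory of its
    members, and the coupling term (0..1) adds a bonus for how strongly
    members share information and goals.
    """
    total_reach = sum(m.spatial_reach for m in members)
    max_horizon = max(m.temporal_horizon for m in members)
    return Agent(
        spatial_reach=total_reach * (1.0 + coupling),
        temporal_horizon=max_horizon * (1.0 + coupling),
    )


cells = [Agent(spatial_reach=1.0, temporal_horizon=1.0) for _ in range(1000)]
tissue = scale_up(cells, coupling=0.8)
print(tissue)  # a larger agent with a wider goal space than any single cell
```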