Consciousness likely has this role in our brain, but this is not the case at the level of our society or the state under capitalism.


Coherence is a way of letting N processing elements do the work of more than N processing elements, because there is some emergent communication happening between them.
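A minimal toy sketch of that claim (the shared latent, the noise level, and the single averaging round are illustrative assumptions, not anything from the note above): N elements each hold one noisy local observation of a shared quantity, and a single round of communication gives every element an estimate it could otherwise only reach with roughly N observations of its own.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16          # number of processing elements
sigma = 1.0     # observation noise per element
latent = 3.0    # shared quantity every element is trying to estimate

# Each element gets one noisy, purely local observation of the latent.
observations = latent + sigma * rng.standard_normal(N)

# Without communication: each element's estimate is just its own observation,
# so the per-element squared error stays at roughly sigma**2.
isolated_error = np.mean((observations - latent) ** 2)

# With communication: the elements exchange observations (one averaging round)
# and each adopts the consensus value, whose squared error is sigma**2 / N in
# expectation: each of the N elements now performs like an element that had
# N observations of its own.
consensus = observations.mean()
coherent_error = (consensus - latent) ** 2

print(f"isolated per-element error: {isolated_error:.3f}")
print(f"coherent per-element error: {coherent_error:.3f}")
```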

One challenge with the notion of coherence / consistency / minimization of constraint violations is that it is a semantic notion: it presupposes a level at which a representation has already been established…

→ Can we map this onto a syntactic criterion that we can establish in a general enough way?
→ Or is there a good proxy, e.g. an energy-based notion: How many operations does a model need to stay current when connected to a processing stream? How well can you predict the data that you care about? (A toy version of this proxy is sketched after this list.)
→ In a system with a fixed amount of resources: Can you organize the system so that it performs the most valuable computations with the resources available? Something like coherence might then emerge as a sub-goal, something the system would need to optimize in order to produce the desired performance.
→ It could also be that the system has to develop a market in order to do that: agents need to decide which software to run, but an individual agent does not have the capacity, and is not entangled enough with the larger environment, to make that decision on its own. Intermediate layers arise that decide how many compute credits each agent gets (see the credit-allocation sketch after this list).
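One way to make the energy-based proxy from the second item above concrete, as a sketch under invented assumptions (the drifting stream, the exponential-moving-average predictor, and the update schedule are all for illustration only): count how many update operations a model spends to stay current with a stream, and set that against how well it predicts the stream.

```python
import numpy as np

rng = np.random.default_rng(1)

# A slowly drifting data stream that the system "cares about".
T = 2_000
drift = np.cumsum(0.01 * rng.standard_normal(T))
stream = drift + 0.1 * rng.standard_normal(T)

def track(stream, update_every):
    """Exponential-moving-average predictor that only updates its estimate
    every `update_every` steps. Returns (mean squared prediction error,
    number of update operations): a crude 'energy' ledger."""
    estimate, ops, sq_err = 0.0, 0, 0.0
    for t, x in enumerate(stream):
        sq_err += (x - estimate) ** 2     # how well did we predict this sample?
        if t % update_every == 0:         # spend one operation to stay current
            estimate += 0.2 * (x - estimate)
            ops += 1
    return sq_err / len(stream), ops

# Spending more operations (a smaller update interval) buys lower prediction
# error; the resulting ops-vs-error trade-off is measurable without knowing
# what the data means.
for k in (1, 5, 25, 125):
    mse, ops = track(stream, k)
    print(f"update every {k:>3} steps -> ops={ops:>5}, prediction MSE={mse:.4f}")
```

The trade-off is purely syntactic: it can be measured without interpreting the data, which is what would make it a candidate proxy for the semantic notion of coherence.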
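And a deliberately crude sketch of the market idea from the last item (agent names, values, costs, and the greedy rule are all invented): agents propose computations together with an expected value, e.g. how much they would improve prediction of the data the system cares about, and an intermediate layer with a fixed credit budget decides which proposals actually get to run.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    agent: str
    expected_value: float   # e.g. expected reduction in prediction error
    cost: int               # compute credits the computation would consume

def allocate(proposals, budget):
    """Intermediate layer: greedily fund the proposals with the best
    value-per-credit ratio until the fixed credit budget is exhausted."""
    funded, remaining = [], budget
    ranked = sorted(proposals, key=lambda p: p.expected_value / p.cost, reverse=True)
    for p in ranked:
        if p.cost <= remaining:
            funded.append(p)
            remaining -= p.cost
    return funded, remaining

proposals = [
    Proposal("visual-tracker",   expected_value=9.0, cost=4),
    Proposal("long-term-memory", expected_value=5.0, cost=5),
    Proposal("idle-speculation", expected_value=1.0, cost=6),
    Proposal("motor-planner",    expected_value=6.0, cost=2),
]

funded, left = allocate(proposals, budget=10)
print("funded:", [p.agent for p in funded], "| credits left:", left)
```

Under a fixed budget the allocator ends up rewarding exactly the computations that keep the system predictive, which is one way coherence could fall out of resource competition as a sub-goal rather than being optimized directly.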

Energy in the mind is compute credits.
