You find general patterns by not optimizing for specifics.

The Weakness Principle

Compression doesn't cause generalisation.

Generalisation is a consequence of “weak” constraints implied by function, not form.
If function is determined by a goal-directed process that favours adaptability (e.g. natural selection), then finite compute forces weak constraints to take simple forms.
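A toy sketch of the weakness intuition (illustrative only, not Bennett's formalism; all names and numbers below are made up): among policies that fit the same training data, a weaker policy, one with a larger extension, is consistent with more completions of the task, so it is more likely to contain the true function.

```python
from itertools import product

# Toy model: decisions are (input, output) pairs over small finite sets.
# A "policy" is the set of pairs it permits; it fits the training data if it
# permits every observed pair; its "weakness" is the size of that permitted
# set (its extension).
INPUTS = range(4)          # 4 possible inputs
OUTPUTS = range(2)         # binary outputs
train = {(0, 1), (1, 0)}   # observed (input, output) pairs

# Three policies that all fit the training data, in order of weakness:
strong = set(train)                         # permits only what it saw
mid = train | {(2, 0), (3, 0)}              # commits to guesses off-train
weak = train | {(x, y) for x in (2, 3) for y in OUTPUTS}  # stays agnostic

def generalises(policy, target):
    """A policy generalises to a target function if it permits every
    (input, output) pair the target produces."""
    return all((x, target[x]) in policy for x in INPUTS)

# Enumerate every completion of the training data to a full function.
completions = [{0: 1, 1: 0, 2: y2, 3: y3}
               for y2, y3 in product(OUTPUTS, repeat=2)]

for name, policy in [("strong", strong), ("mid", mid), ("weak", weak)]:
    hits = sum(generalises(policy, t) for t in completions)
    print(f"{name}: weakness={len(policy)}, "
          f"covers {hits}/{len(completions)} completions")
```

The strong policy (weakness 2) covers no completion, the middling one (weakness 4) covers one, and the weakest (weakness 6) covers all four: coverage of the unseen task grows monotonically with weakness, with no appeal to compression anywhere.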

Todo

https://www.youtube.com/watch?v=K18Gmp2oXIM&t
every-definition-of-intelligence-is-wrong-here-s-why-michael-bennett
is-complexity-an-illusion
Was trying to unify the Maximum Occupancy Principle, w-maxing, and WGCBP; connection between Schmidhuber's curiosity/interest and WGCBP, …
605d38a3-e5c1-4a42-9349-4b5401a8e710
Need to first go into each of the source materials a bit more and let the thoughts marinate.

Untangle this confusion (related: the SETI callout in the compression note; Why Greatness Cannot Be Planned; intrinsic motivation to occupy future action-state path space):


In other words, if we view weakness itself as a constraint, then constraints are a feature, a necessary condition for general intelligence to evolve:
Human biological limits, like our tiny working memory and shallow calculation depth, are actually a feature. They force us to abstract, compress, intuit. If we had infinite resources, we would never have needed intelligence.