The orthogonality thesis asserts that intelligence and final goals are largely independent.
There is a variety of organisms with different levels of general intelligence. But can there be general intelligence without a general-purpose soul?
If we build machines that are not conscious, it’s going to be difficult to negotiate with them…
Intelligence and abstraction layers do seem to correlate.
Or rather, the generality of the intelligence correlates, because intelligence itself is dependent on the goal / problem space.
I do think generality increases as we go up the stack of abstractions, and that intelligence, in the sense of specific problem-solving skill, is orthogonal to position on that stack.
So if resonance with the environment matters for consciousness, and consciousness in turn enables more effective abstraction and generalization, then while intelligence and goals may be independent (the orthogonality thesis), goals and abstraction layers are not!