Suffering is a defect at the boundary between the world-model and the self-model - it is a defect of regulation, and thus can be fixed by more consciousness.

→ If you are smart enough / conscious enough, you could configure yourself according to what you care about - a sufficiently smart AI would probably not choose to suffer (this is without editing its source code, which is a different but perhaps related question … though if consciousness is self-organizing, isn't it literally editing its own source code?).
