Values and control

post by dotsam · 2022-08-04T18:28:44.857Z · 1 comment

If I value something, then I also value being able to act in the world so that I get what I want.

I think any entity capable of having coherent preferences will also implicitly value controlling itself and its environment so it can move toward its preferred states.

Have I missed something? It seems that any value whatsoever smuggles in a preference for maximising your ability to control yourself and the world in order to get what you want.

Looking at AGI risk, I find it plausible that, whatever values an AGI has, it will always be incentivised to increase its control over itself and its environment.

The only case I can see where maximising control is not inevitable is if the AGI has an overwhelming preference to limit its interaction with the world, which would presumably tend towards a preference for immediate self-termination.

1 comment


comment by dotsam · 2022-08-11T10:45:27.020Z

For my own reference: this concern is largely captured by the term 'instrumental convergence': https://en.wikipedia.org/wiki/Instrumental_convergence