The gap between prestige and impact

post by Adam Binks, ojorgensen · 2022-08-04T21:38:27.688Z · EA · GW · 1 comments

Contents

  Examples of proxy goals that might be misaligned with big goals:
  What to do about this
1 comment

Here’s a pattern that we noticed. You probably have some large goal you’re optimising for, like “learn to build great software”. But day-to-day, you optimise for a proxy goal, like “write code which my supervisor says is good”.

Proxy goals are often useful: they’re easier to evaluate, which makes it easier to make quick decisions.

For example, if your big goal is to write a bestselling novel, this is hard to evaluate: to find out whether you're achieving it, you might need to survey readers or even publish the novel. It's easier to use the proxy of writing a novel that you think is great.


But sometimes it's unclear whether your proxy actually tracks your big goal.

For example, if your big goal is to help solve the alignment problem, you might find yourself using the proxy goal of doing things that other people in the EA community think are prestigious.

However, there's uncertainty here: prestige within the community doesn't always align with impact. Working on an established research agenda might be immediately impressive, while developing your own research agenda might look confusing or illegible to other community members.

Examples of proxy goals that might be misaligned with big goals:

What to do about this

  1. Be aware that you are using proxy goals.
  2. Periodically step back to notice how your proxy goals misalign with your big goals.
  3. Re-engineer your proxy goals or incentive environment to make sure that the work you’re doing actually promotes your big goal.


Thanks to Isaac Dunn and Jarred Filmer for great conversations that led to this.

1 comment

Comments sorted by top scores.

comment by Spencer Becker-Kahn · 2022-08-04T22:02:21.567Z · EA(p) · GW(p)

So I think I agree with the general point, but having thought about this a fair bit recently too, I think that there is something tricky that is missing from your presentation. You write:

  1. Periodically step back to notice how your proxy goals misalign with your big goals.

  2. Re-engineer your proxy goals or incentive environment to make sure that the work you’re doing actually promotes your big goal.

The issue is that lots of things actually do require deep, sustained effort to make progress on, or require deep, sustained effort to build the necessary skills. So to some extent you can't just periodically re-adjust... you really do need to place a bet on something and double down on it, or you may forever be adjusting your trajectory and never really having any impact on anything. This is really hard to figure out.

More speculatively, there is a broader critique of EA here too, one mentioned recently in other posts: it just feels nicer, cooler, and more interesting to keep 'EA' as your central thing - as the core part of your identity - rather than deciding 'OK, the best thing to do is to get a job in, say, this government department' and just doubling down on that. Once you do the latter, it might feel like you've left EA behind more than you wanted to.