Thoughts on "trajectory changes"

post by evelynciara · 2021-04-07T02:18:36.962Z · EA · GW

"Trajectory changes" are "very persistent/long-lasting improvements to total value achieved at every time" conditional on not having an existential catastrophe - in other words, interventions that increase the chances of good futures as opposed to merely okay ones (Forethought Foundation).

From the vantage point of the present, it would be very hard to reliably make trajectory changes, because we cannot foresee whether the long-run effects of our actions will be good. However, we can take actions that empower future generations to create the futures they want. These include:

This is similar to the taxonomy of EA focus areas proposed here [EA · GW].

This approach may also be valuable from a deontological perspective: if future generations have a right to choose their own future, then we ought to help them make that decision themselves; making irreversible trajectory changes on their behalf would violate that right.

Based on this focus area taxonomy, I think the most pressing area is global priorities research. Compared to the other focus areas, it seems highly neglected relative to its importance [EA · GW], whereas many people inside and outside EA are already doing object-level work on x-risks, forecasting, governance, etc.
