Thoughts on "trajectory changes" — post by evelynciara · 2021-04-07T02:18:36.962Z · EA Forum
"Trajectory changes" are "very persistent/long-lasting improvements to total value achieved at every time" conditional on not having an existential catastrophe - in other words, interventions that increase the chances of good futures as opposed to merely okay ones (Forethought Foundation).
From the vantage point of the present, it would be very hard to reliably make trajectory changes, because we can't foresee whether the long-run effects of our actions will be good or bad. However, we can take actions that empower future generations to create the futures they want. These include:
- Doing global priorities research so future generations have more knowledge they can use to decide what matters
- Improving future generations' ability to predict the effects of their actions
- Improving governance so future generations can more easily decide what matters and act on their collective values
- Improving humanity's general capacity to innovate and solve problems (e.g. science/tech policy and infrastructure, public goods markets)
This approach may also be valuable from a deontological perspective: if future generations have a right to choose their own future, then we ought to help them make that decision themselves; making irreversible trajectory changes on their behalf now would violate that right.
Based on this focus area taxonomy, I think the most pressing area is global priorities research. Compared to the other focus areas, it seems highly neglected relative to its importance, whereas lots of people inside and outside EA are already doing object-level work on x-risks, forecasting, governance, etc.