Comment by Oliver Balfour on Why AGI Timeline Research/Discourse Might Be Overrated · 2022-07-04T03:39:36.868Z · EA · GW

I'd guess most 'placing well-intentioned people at important-seeming AI companies' efforts to date have been net-negative.

Could you please elaborate on this? The reasoning here seems non-obvious.

Comment by Oliver Balfour on Global health is important for the epistemic foundations of EA, even for longtermists · 2022-06-09T02:00:33.809Z · EA · GW

Absolutely. I'm exactly in that boat - I became convinced of some basic EA principles after reading Singer's work in first-year uni last year, but I don't think I would have committed to donating a large chunk of my salary, and stuck to it, if GWWC didn't exist. I wouldn't be here if the community hadn't made it so tractable. I was also initially skeptical of the longtermist perspective - had EA been presented to me in terms other than the power-law distribution of global health charity effectiveness, it's much less likely I'd be here (I'm now a longtermist :P)