Posts

LW4EA: Is Success the Enemy of Freedom? (Full) 2022-06-21T15:07:12.464Z
LW4EA: Value of Information: Four Examples 2022-06-14T02:26:13.451Z
LW4EA: Sabbath hard and go home 2022-06-07T02:46:59.138Z
LW4EA: How to Be Happy 2022-05-31T22:42:25.126Z
LW4EA: 16 types of useful predictions 2022-05-24T03:19:12.133Z
LW4EA: Some cruxes on impactful alternatives to AI policy work 2022-05-17T03:05:35.057Z
LW4EA: Beyond Astronomical Waste 2022-05-10T14:50:24.358Z
LW4EA: How to Not Lose an Argument 2022-05-03T02:43:38.871Z
Are you willing to talk about your experience attending EAG or EAGx with someone who's considering applying? 2022-04-28T12:42:53.615Z
On the fence about applying to EAG or EAGx? Talk to someone (me?) who went! 2022-04-28T12:38:20.517Z
LW4EA: Philosophical Landmines 2022-04-26T02:53:28.258Z
Is the EA Librarian still a thing? If so, what is the current turnaround? 2022-04-21T16:03:19.938Z
LW4EA: How Much is Your Time Worth? 2022-04-19T02:03:42.079Z
LW4EA: Can the Chain Still Hold You? 2022-04-12T18:26:05.309Z
Should the forum have posts (or comments) only viewable by logged-in forum members 2022-04-04T17:40:57.202Z
LW4EA: Yes Requires the Possibility of No 2022-04-04T16:00:48.252Z
LW4EA: Shoulder Advisors 101 2022-03-29T03:16:28.585Z
LW4EA: Being the (Pareto) Best in the World 2022-03-22T02:58:23.863Z
LW4EA: Paper-Reading for Gears 2022-03-14T17:26:02.141Z
LW4EA: Privileging the Question 2022-03-07T16:38:18.618Z
LW4EA: Humans are not automatically strategic 2022-02-28T18:39:16.942Z
Low-Commitment Less Wrong Book (EG Article) Club 2022-02-10T15:25:45.067Z
Post about risks of movement building? 2022-01-30T14:42:23.569Z
Activism to Make Kidney Sales Legal 2019-04-09T18:11:02.250Z
I'll Fund You to Give Away 'Doing Good Better' - Surprisingly Effective? 2019-03-19T18:21:04.086Z
EA/X-risk/AI Alignment Coverage in Journalism 2019-03-09T21:17:28.479Z

Comments

Comment by Jeremy (captainjc) on LW4EA: Is Success the Enemy of Freedom? (Full) · 2022-06-21T15:09:30.608Z · EA · GW

Of particular relevance, I think, to people coming to EA mid-career.

And brilliantly, succinctly summarized by Villiam in the top comment.

Slack enables exploration.

Exploration enables exploitation.

Exploitation destroys slack.

Comment by Jeremy (captainjc) on Mastermind Groups: A new Peer Support Format to help EAs aim higher · 2022-06-06T16:20:23.361Z · EA · GW

This is a great idea!

Comment by Jeremy (captainjc) on EARadio Returns - Suggest Episodes and Shoutouts · 2022-06-03T14:25:43.206Z · EA · GW

I didn’t realize this existed. I have often wished it were easier to listen to YouTube videos while on the go. Great idea!

Comment by Jeremy (captainjc) on My slack budget: 3 surprise problems per week. · 2022-06-01T20:59:51.405Z · EA · GW

I’m adding the LessWrong For EA tag to this, since it is a perfect example of the type of thing I endeavor to repost under that tag.

Comment by Jeremy (captainjc) on LW4EA: 16 types of useful predictions · 2022-05-24T18:20:29.394Z · EA · GW

I agree. When I was facilitating the In Depth virtual program, people often had difficulty finding practical ways to make predictions. It would have been helpful to be able to refer them to this. I emailed to suggest that it be added to the syllabus. 

Comment by Jeremy (captainjc) on Should U.S. donors give to an EA-focused PAC like Guarding Against Pandemics instead of individual campaigns? · 2022-05-21T12:52:16.360Z · EA · GW

I’ve seen it referred to as a hybrid PAC, but I’m not sure what that means exactly. I guess one part of it is unlimited in funding but can’t donate to candidates directly, while the other part is limited and can.

Comment by Jeremy (captainjc) on Should U.S. donors give to an EA-focused PAC like Guarding Against Pandemics instead of individual campaigns? · 2022-05-21T11:25:02.961Z · EA · GW

I believe the comment you linked to in 1 is referring to the Protect Our Future super PAC, which was, in Carrick’s case, buying ads for him and could not donate to his campaign directly.

My understanding is that the GAP (non-super) PAC donates directly to candidates (up to $5000), who can then spend those funds the same as any other campaign contribution.

The benefit, as it was explained to me, was that GAP is in contact with the candidates, does some amount of vetting, and the candidates see that the money comes from them. An individual donation would not carry any association with preventing pandemics. That matters because these are candidates who are not EA-aligned or necessarily that committed to pandemic preparedness.

I believe that is the basic case for it. That said, it seems unlikely to be anywhere close to as impactful as a donation to an EA-aligned candidate (not sure there are any of those right now though), and I am not aware of any kind of cost-effectiveness analysis comparing such a donation to AMF or anything like that.

There is also a $5000 limit on what you can donate to GAP itself.

There was this post from GAP about it a while back, but I didn’t find that it made a very strong case.

https://forum.effectivealtruism.org/posts/Btm562wDNEuWXj9Gk/guarding-against-pandemics

Comment by Jeremy (captainjc) on Should U.S. donors give to an EA-focused PAC like Guarding Against Pandemics instead of individual campaigns? · 2022-05-21T11:08:19.762Z · EA · GW

My understanding is that this is different (maybe a PAC rather than a super PAC?) and that, the way it is set up, it actually donates directly to the candidates, but is limited to $5000 per candidate, and $5000 per person donating to GAP.

Comment by Jeremy (captainjc) on EA and the current funding situation · 2022-05-18T15:24:46.677Z · EA · GW

Strongly agree as well!

Comment by Jeremy (captainjc) on Bad Omens in Current Community Building · 2022-05-13T13:45:36.277Z · EA · GW

I am looking to have the 1-to-2 hour long, 2-to-5 person thoughtful conversation, on literally dozens of existing and EA-adjacent topics.

I sympathize with this, as there don't currently seem to be a ton of opportunities like this.

This forum, unfortunately, has presented me with consistent misrepresentations and fallacies, which the commentators refuse to address when I point them out.

This is a pretty strong statement that seems like it would benefit from some examples to support it - though maybe it is beside the point as the forum probably isn't going to be the "1-to-2 hour long, 2-to-5 person thoughtful conversation" you are looking for anyway.

Comment by Jeremy (captainjc) on Bad Omens in Current Community Building · 2022-05-13T13:15:51.352Z · EA · GW

My intuitive understanding of the Alice personality type (independent, skeptical, etc.) is that such people are often very entrepreneurial (a skill EA desperately needs), but not usually "joiners". I have no doubt that a lot could be improved about community building, but there may always be some tension there that is difficult to resolve.

It may be that the best we can hope for in a lot of those cases is people who understand EA ideas and use them to inform their work, but don't consider themselves EAs. That seems fine to me. For instance, person 1 in your real-life example seems like a big win, even if they don't consider themselves EA. If the EA intro talk she attended helped get her on that track, then it "worked" for her in some sense.

Comment by Jeremy (captainjc) on US Citizens: Targeted political contributions are probably the best passive donation opportunities for mitigating existential risk · 2022-05-06T17:28:26.449Z · EA · GW

I have more-or-less come to this same conclusion. As I mentioned in a reply below, Guarding Against Pandemics has a PAC that can receive up to $5000 from individual donors, and can, in turn, donate up to $5000 directly to campaigns. 

As the linked post explains, "Donations to the PAC would go towards supporting candidates who are champions for pandemic preparedness in Congress and in state and local offices" - so they are not necessarily EA-aligned in other ways. They could also be from either political party.

This seems potentially pretty impactful, but probably more risky than donating directly to an EA aligned candidate. I am curious what others think about this, or if anyone has done an analysis or anything.

Comment by Jeremy (captainjc) on US Citizens: Targeted political contributions are probably the best passive donation opportunities for mitigating existential risk · 2022-05-06T17:19:50.383Z · EA · GW

Zach, can you provide a source that it is "about zero"?

Comment by Jeremy (captainjc) on US Citizens: Targeted political contributions are probably the best passive donation opportunities for mitigating existential risk · 2022-05-06T17:18:35.511Z · EA · GW

My understanding is there are different kinds of PACs. Guarding Against Pandemics has a PAC that cannot receive more than $5000 from individual donors, but can donate up to $5000 directly to campaigns.

Comment by Jeremy (captainjc) on The case for not pursuing a career in an EA organization · 2022-04-26T02:55:05.030Z · EA · GW

Will do!

Comment by Jeremy (captainjc) on The case for not pursuing a career in an EA organization · 2022-04-25T20:04:27.598Z · EA · GW

I am thinking along similar lines, Miranda, and I may have some of that comparative advantage too :)

I don't like to talk about plans too much before actually getting down to doing, but I am working on a project to find ways to support people coming to EA mid-career/mid-life (as I did). I expect to write a top level post about this in the next few weeks.

The goals are crystallizing a bit:

1. helping to keep people engaged and feeling like a part of the community even if they can't (aren't a good fit for, or aren't yet ready to) consider a high impact career change
2. helping people figure out how to have the most impact in the immediate term, within current constraints
3. helping people work towards higher impact, even if it's in the longer term 

Some ideas for how to do it: 

1. compiling and organizing resources that are specifically relevant for the demographic
2. interfacing with EA orgs (80k, local groups, EA Anywhere, Virtual Programs, etc.) in appropriate, mutually beneficial ways 
3. peer-based support (because situations mid-career/life vary widely) - i.e. probably taking the form of a group to start, and then hopefully figuring out what kind of 1-on-1 stuff could work too (mentorship, buddy system, etc.)

Comment by Jeremy (captainjc) on Is the EA Librarian still a thing? If so, what is the current turnaround? · 2022-04-21T17:59:19.567Z · EA · GW

Sorry to hear about your illness. I hope you feel better! It was very helpful the one time I used it, so I hope you guys can find a way to keep it going. Thanks!

Comment by Jeremy (captainjc) on How much current animal suffering does longtermism let us ignore? · 2022-04-21T13:55:00.587Z · EA · GW

Related thoughts in the recent 80k SBF interview (which I have only half finished, but is excellent). This link should take you directly to the audio of that part, or you can expand the full transcript on the page and ctrl/cmd-f to "Near-term giving" to read.

Comment by Jeremy (captainjc) on The case for not pursuing a career in an EA organization · 2022-04-21T13:31:26.496Z · EA · GW

I think having 1% of humanity lightly engaged in EA-related activities is more valuable than having 0.0001% deeply engaged.

I agree that this is the crux, but I don't think it's an either-or scenario. I guess the question may be how to prioritize recruiting for high-priority EA jobs, while also promoting higher-absorbency roles to those who can't work in the high-priority ones.

Being "mediocre, average, or slightly above average" is not always going to be a permanent state. People develop career capital and skills, and someone who isn't a good fit for a priority role out of university (or at any particular moment), may become one over time. Some of Marc's suggestions could be thought of as stepping stones (he mentioned this in a few places, but it seems worth calling out).

Related to that, the EA jobs landscape is going to change a lot in the next few years as funding pours in, and projects get started and need to staff-up. It seems worthwhile to keep the "collateral damage" engaged and feeling like a part of the EA community, so that they can potentially help fill the new roles that are created.

Comment by Jeremy (captainjc) on How I failed to form views on AI safety · 2022-04-17T17:47:12.110Z · EA · GW

I really appreciated this post as well. One thought I had while reading it - there is at least one project to red-team EA ideas getting off the ground. Perhaps that’s something that would be interesting to you and could come closer to helping you form your views. Obviously, it would not be a trivial time commitment, but it seems like you are very much qualified to tackle the subject.

Comment by Jeremy (captainjc) on LW4EA: Can the Chain Still Hold You? · 2022-04-17T13:59:35.390Z · EA · GW

Yeah, the nuclear example is definitely one way it could not be a good thing. Reading through the LW comments (once you get past the hundreds-of-comments tangent about killing males and radical feminism), it seems a lot of people thought it was vague. The author said it was just intended as general inspiration.

Another commenter (Gwern?) made the interesting point that the analogy works better if we think of ourselves as the baboons and AGI as humans.

Comment by Jeremy (captainjc) on Unsurprising things about the EA movement that surprised me · 2022-04-07T01:37:23.240Z · EA · GW

This is great. I remember a similar post from maybe a year or two ago, but I am unable to find it. Something along the lines of "things that it took me way too long to figure out about EA". Anyone else remember this?

Comment by Jeremy (captainjc) on 5-Minute Advice for EA Global · 2022-04-06T16:41:56.265Z · EA · GW

I think this would have been more useful to read before attending EAGx Boston this past weekend than any of the other posts that were recommended (though it's possible that the other posts gave me context that was important for appreciating this one).

I might add one thing: to me, 30 minutes felt a bit rushed, and the best 1:1s I had were the ones where neither of us had something afterwards and they tended to naturally wrap up in about 45 minutes. It's probably not possible to schedule them all like that (and you have no control over the other person's schedule), but you may want to leave an empty space after a few that you feel are particularly important.

Comment by Jeremy (captainjc) on Should the forum have posts (or comments) only viewable by logged-in forum members · 2022-04-04T20:53:19.934Z · EA · GW
  • First point is very good and I hadn't thought of it. I guess hiding something lessens the chance you get discovered, but always makes you appear more guilty if/when you are. That is more relevant the more people you think you have digging around for dirt.
  • Second point: Karma? But that does require you to be logged in of course.
  • Third: This could be addressed by not making it the choice of the poster, but by requiring a certain number of readers to click a "make this non-public" button. Then it's more of a community-decides kind of thing. Of course, if you want to make a non-public post about a controversial topic, you have to rely on others to make it so.

Another approach would be to make posts less findable (I'll add these ideas to the original post too):

  • There could be a check box for users that adds a noindex tag to the post, so search engines skip it (see the sketch below)
  • If someone doesn't want to draw attention from outside the community, they could use a codeword (and request that others do as well) for obvious search keywords - initials of a politician, etc. This is probably not all that reliable - and has the same issue as you mention in your first bullet point.
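
As a rough illustration of the noindex idea, here is a minimal sketch of how a forum backend might conditionally emit the tag - assuming a hypothetical noIndex flag on posts; the Forum's actual data model and rendering code may look quite different:

```typescript
// Hypothetical sketch: emit a robots meta tag for posts whose author
// checked a "hide from search engines" box. The `noIndex` field is
// assumed for illustration and is not an actual Forum API.
interface Post {
  title: string;
  noIndex: boolean;
}

// Search engines that honor robots directives will skip indexing any
// page whose <head> contains this tag.
function robotsMetaTag(post: Post): string {
  return post.noIndex ? '<meta name="robots" content="noindex">' : "";
}

console.log(robotsMetaTag({ title: "Sensitive topic", noIndex: true }));
// -> <meta name="robots" content="noindex">
```
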
Comment by Jeremy (captainjc) on Future-proof ethics · 2022-04-01T03:18:20.181Z · EA · GW

Yup, works for me now.

Comment by Jeremy (captainjc) on Future-proof ethics · 2022-03-31T23:00:11.785Z · EA · GW

Huh, maybe someone else wants to weigh in? When I view it in an incognito window, it prompts me to log in. When I view it logged in, it says "You need access. Ask for access, or switch to an account with access." I'm not sure if you are the owner, but if so, you likely just need to click on "Share", then "Restricted" in the Get Link dialog (it doesn't really look like you can click there, but you can), then change the setting to "Anyone with the link".

Comment by Jeremy (captainjc) on Future-proof ethics · 2022-03-30T12:40:02.296Z · EA · GW

The link to Chapter 2 of On the Overwhelming Importance of Shaping the Far Future at the end links to a non-public Google Drive file.

Comment by Jeremy (captainjc) on LW4EA: Shoulder Advisors 101 · 2022-03-30T02:19:37.744Z · EA · GW

Also meant to point out that the link in your second bullet point is the same as the first :)

Comment by Jeremy (captainjc) on LW4EA: Shoulder Advisors 101 · 2022-03-30T02:19:02.110Z · EA · GW

I'm glad you are finding the series helpful :) 

Yes, it was pretty fascinating to me too. I remembered when it came up in HPMOR and it sort of puzzled me then. I've never had much ability to see images in my mind (I guess it's called aphantasia), and then at some point while reading the post, I realized that I am pretty sure I've only ever heard one voice in my mind as well. A shoulder advisor may not be in the cards for me, but at least I learned about a universal human experience I may be missing out on.

Comment by Jeremy (captainjc) on EAGxBoston: Updates and Info from the Organizing Team · 2022-03-29T12:43:15.287Z · EA · GW

This post has a lot of information that's not on the event page. I keep trying to find it by searching the Forum for EAGx Boston and it doesn't come up in the results. I think it's because the post only mentions EAGx in the combined word "EAGxBoston". If you could edit the title or body to include "EAGx Boston", I think that would help fix it.

Comment by Jeremy (captainjc) on Brain preservation to prevent involuntary death: a possible cause area · 2022-03-24T03:42:40.868Z · EA · GW

Thank you for the thoughtful reply. I jotted the original comment down on my phone and I am realizing it came across as more argumentative than I intended. I apologize for that.

I have similar intuitions that creating a new person doesn't make up for the badness of someone dying, but if it is better, I would like to have an idea of how much better and why.

Assuming we could create new people for some cost, and that those new people have value, it would be important to be able to compare that with the cost/value of reviving someone, to most efficiently spend limited resources.

Focusing on the subject of the intervention, the value of 1000 years lived to a new person would be the same as the value of 1000 years lived to the revived person, no?

The only difference would seem to be the value to anyone else - other people who care about them. 

I can't say precisely how you would quantify that, but additional relevant factors might be

  • how long the technology might take to develop and, by that point, how many preserved people would still have anyone who cared about them
  • the probability of revival technology working

I'm sure there's more I haven't thought of.

Comment by Jeremy (captainjc) on Brain preservation to prevent involuntary death: a possible cause area · 2022-03-23T17:34:17.670Z · EA · GW

So if I am understanding right, the main benefit of reviving a preserved brain, over creating a new person, would be “relational and psychological factors”. I am interpreting that loosely as: friends and family wouldn’t be sad that they are dead.

Is that enough to justify the costs? Presumably, by the time of reviving, the friends and family of most people whose brains have been preserved would either be dead or have had their own brains preserved. The obvious exception being those who died just before the technology developed.

It seems like the question is maybe, how many QALYs would you estimate from friends/family in that situation and how does the cost effectiveness work out for that?

Comment by Jeremy (captainjc) on LW4EA: Paper-Reading for Gears · 2022-03-20T13:37:57.662Z · EA · GW

I feel a bit weird copy/pasting the whole thing as I am not able to contact the authors first and I don’t think I can assume they would want their post completely reproduced on another forum.

Comment by Jeremy (captainjc) on LW4EA: Paper-Reading for Gears · 2022-03-15T15:59:30.601Z · EA · GW

Great ideas, thanks!

Comment by Jeremy (captainjc) on LW4EA: Privileging the Question · 2022-03-11T17:35:37.667Z · EA · GW

This is a great idea. I'm a slow writer, so this resolves most of my reluctance about the summary idea. I think I will do this for the time being and then leave a note in each post in case someone wants to write a summary - that is, if you don't mind me stealing your idea :)

Comment by Jeremy (captainjc) on I want Future Perfect, but for science publications · 2022-03-09T12:28:55.712Z · EA · GW

I might add that I think this is a fantastic idea!

Comment by Jeremy (captainjc) on I want Future Perfect, but for science publications · 2022-03-09T12:26:58.126Z · EA · GW

I’m not sure why it took me so long to find this, but this (second part) is what I was thinking of. I’m not sure if it’s explicitly for EA/X-risk stuff, but it is “… to facilitate investigative efforts into conquering issues of the current COVID-19 pandemic, biosecurity and public health preparedness.”

https://www.coinspeaker.com/ftx-future-fund-1b-improving-humanity/?amp

Comment by Jeremy (captainjc) on I want Future Perfect, but for science publications · 2022-03-08T18:23:26.510Z · EA · GW

This is not going to be very helpful, but I thought I remembered hearing about a recent launch of a new EA column somewhere. Perhaps by Future Fund or FTX Foundation. It could have been ProPublica, but searching didn’t seem to turn anything up. Hopefully someone knows more.

Comment by Jeremy (captainjc) on LW4EA: Privileging the Question · 2022-03-08T17:07:40.119Z · EA · GW

I've gotten feedback that a "summary of the post and/or why...it's interesting to an EA audience" would improve these reposts. I agree, but this has to be a pretty low-effort thing for me to be able to keep it up.

If anyone is able to do this once a week, I could get you the links a week ahead of time (or a few at a time if that's easier).

Comment by Jeremy (captainjc) on Organizations Encouraging Russian Desertion? · 2022-03-03T14:40:50.310Z · EA · GW

Bryan Caplan had some interesting ideas about this. It seems right that safe passage to the EU would be essential. https://betonit.blog/2022/03/02/make-desertion-fast/

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-28T18:45:57.748Z · EA · GW

First post is here.

Comment by Jeremy (captainjc) on What (standalone) LessWrong posts would you recommend to most EA community members? · 2022-02-28T18:44:56.243Z · EA · GW

It took me a while to get rolling, but I have done a first Less Wrong repost here and will continue weekly as long as there is enough interest. 

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-28T18:41:33.500Z · EA · GW

The first post is here.

Comment by Jeremy (captainjc) on Nuclear attack risk? Implications for personal decision-making · 2022-02-27T18:06:45.695Z · EA · GW

The Metaculus link suggests it may also be worth considering (especially if you are in the US) how a cyber attack might affect you.

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-23T13:49:50.254Z · EA · GW

Sure thing!

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-22T13:52:19.746Z · EA · GW

I guess I was using the term "group" in a loose sense - maybe it's not the right word, as my plan would not require signing up or joining or anything (keeping in the spirit of low commitment), just an article posted each week that people can comment on freely. 

That's about all I have time for at the moment. If someone wants to do something more formal in parallel, I am all for it. You are welcome to use the tag (which I will update soon), or I am happy to coordinate if that would be useful. If there's not much engagement with the posts, that might indicate something more structured would have more success. 

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-21T16:03:14.080Z · EA · GW

For the tag description, I am thinking something like 

"Any EA-relevant LessWrong post, including posts from the weekly Less Wrong repost & low-commitment discussion group."

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-21T15:40:07.895Z · EA · GW

So I think I spent too much time thinking about the name, but, in the end, it works well to have something simple and short that gets the basic idea across, with more details given in the post body/tag description.

I think "Less Wrong for EA" accomplishes this. Each post title can be of the format "LW4EA: Post Title from Less Wrong" and the tag "Less Wrong for EA" isn't overly long either.

Below is what I plan to write in the post body (I will also edit/rewrite the tag). I would love any feedback. 

--

This link post is part of Less Wrong for EA, a Less Wrong repost & low-commitment discussion group (inspired by this comment). Each week I will revive an EA-relevant post from the Less Wrong Archives, more or less at random, from a list I started from the comments here (additional suggestions welcome via PM or in the comments).

Please feel free to,

Initially, I talked about hosting a Zoom discussion for those who were interested, but I think it’s a bit more than I can take on right now (not so low-commitment). If anyone wants to organize one, comment or PM me and I will be happy to coordinate for future posts.

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-13T13:19:11.147Z · EA · GW

Looks good, thanks!

Comment by Jeremy (captainjc) on Low-Commitment Less Wrong Book (EG Article) Club · 2022-02-13T13:16:19.819Z · EA · GW

I think I misunderstood what you meant here at first. Adding it now.