Announcing "Naming What We Can"! 2021-04-01T10:17:28.990Z
Asaf Ifergan's Shortform 2020-06-14T10:25:36.023Z
Outreach concerns - What are your biggest and how to avoid them? 2020-04-30T08:50:21.511Z
Audience Targeting for EA communities 2020-04-12T07:22:12.487Z


Comment by Asaf Ifergan on Announcing "Naming What We Can"! · 2021-04-03T08:00:58.893Z · EA · GW

Or, you could change your name to Wise Julia. This will also allow you to signify your intellectual superiority.

Tail risk: if EA ends up voting for a top leader, and you get elected, this could sound pretty culty. If that risk seems significant to you, I would advise avoiding the obvious choice here - Julia the Wise - which is even worse.

Comment by Asaf Ifergan on Yale EA’s Fellowship Application Scores were not Predictive of Eventual Engagement · 2021-01-28T09:09:12.310Z · EA · GW

I don't have a strong opinion about this in the context of fellowships, but I can speak to setting a high entry bar when recruiting community members and volunteers in general, and specifically by asking them to invest time in reading content. I hope this helps and isn't completely off-topic.

Though EA is a complex set of ideas and we want people to have a good understanding of what it's all about, demanding a lot from new people can be fairly off-putting and counterproductive.

From my experience, the people with the highest potential to become both highly engaged and highly valuable to the community are often quite busy, and also relatively normal human beings.
As for the first point, if you send someone a long list of content, they might just say, "This is too demanding, I can't handle this right now." As for the second point, we have to accept that people have much shorter attention spans than we would like to imagine, especially when they are not yet familiar with the content.

Gidon Kadosh and I from EA Israel have thought long and hard about how to lower the perceived effort for people who come to our website, which led to this "Learn More" page. Though it's in Hebrew, you might be able to understand what we tried to do on a structural level. We plan to make it even more attractive for readers, possibly by splitting this page into individual pages that each focus on a specific subject and allowing the user to conveniently move on to the next or previous subject. This way, we both lower the perceived effort of reading the content and create a feeling of progress for the user.

I'm really not sure someone's willingness to invest a lot of time in reading content, or in filling out a long application before they have a clear view of the value in doing so, correlates with their eventual engagement. Going back to recruiting new community members and volunteers: there are brilliant people who are value-aligned but just don't have the time (or are not willing) to fill out a highly demanding application, or to read 20 articles about something they are not yet sure they care about.

Comment by Asaf Ifergan on Kelsey Piper on "The Life You Can Save" · 2021-01-21T12:26:20.761Z · EA · GW

Thanks for posting this! It can be pretty helpful for figuring out from which angle to approach broader audiences and people who are more skeptical about our ability to make a change.

Comment by Asaf Ifergan on Focusing on Career & Cause Movement Building · 2020-11-09T16:49:02.204Z · EA · GW

Hey David, I enjoyed reading this post, so thank you for investing your time in putting it together.
One thing that isn't clear to me: is the goal of these communities to bring together people who are interested in, say, animal welfare, and then try to expose them to more EA content?
Or is it to bring together people who are already interested in EA, but are more focused on one area than on others?

Also, this made me think of an idea: building teams of EAs who are professionals from the same field (finance, law, marketing, operations) that can provide advice and build strategies for community building. I'll think and talk about this some more, and will try to see if there's any benefit in doing that.

Comment by Asaf Ifergan on Prioritization in Science - current view · 2020-10-31T19:01:06.886Z · EA · GW

Thank you for writing this Edo, it's really interesting to read about these topics as someone who's not really knowledgeable in research and academia. 

"it's not clear to me how much productivity loss is there when scientists are working on stuff they are less intrinsically interested in. The situation seems to be fine in commercial companies..."

I would assume there's a major difference between why most researchers in academia do what they do (interest and sheer curiosity, along with prestige) and why most professionals in the private sector do what they do (money and career development). This is not to say you're wrong about that, but I think it's important to keep in mind the difference in the motivations that drive people's work in different work environments.


Comment by Asaf Ifergan on Desperation Hamster Wheels · 2020-10-31T16:37:35.272Z · EA · GW

Thank you so much for sharing this with us and investing time in writing this.
I found this really insightful and helpful, and I can empathize with a lot of what you've felt throughout this journey.

"I’m sad that I’m not better or smarter than I grew up hoping I might be."
I feel like this is a thinking pattern that many people of our generation have, which is problematic because it's simply a fact that not everybody can be the most X person in the world, be it the most impactful, most beautiful, most talented, or most wealthy. I also feel it's not true on an individual level: we tend to estimate our potential selves while neglecting vital personal preferences. Some of us just want to work less than others, and while some people feel good working all the time, for others it's demotivating and depressing; they are much happier spending more time with friends and family, or watching Netflix on weekends, instead of working and studying diligently.

One of the biggest struggles for me, and I would assume for other people too, is that this pattern can prevent us from noticing and celebrating our own progress, because it always feels like we're still miles away from the finish line and not fulfilling our potential. Then we're demotivated, and that surely doesn't help.

Comment by Asaf Ifergan on Excited altruism · 2020-10-31T13:47:46.930Z · EA · GW

Great article!

Comment by Asaf Ifergan on [deleted post] 2020-10-31T13:34:33.144Z

I fully agree with you on that, and from my humble experience, it's rare for people in EA to be interested in doing good purely from a cold, calculated point of view. A lot of us probably had the will to do good much earlier in life, long before we got to EA, and for us Effective Altruism is just the way we follow our ever-present passion for doing good.
I also think we should make sure people who stumble upon us don't get the idea that we lack passion for what we do. That impression can and does alienate a pretty substantial number of people who discover EA, based on my own anecdotal experiences with friends and newer community members.

Highlighting content that talks about motivation and excitement, and presenting it to people who are new to EA, might help us to:
1. Prevent people from feeling disconnected from our mission.
2. Be more appealing to people who have a strong desire to do good but are not very analytical or comfortable with the type of content we usually highlight. After we appeal to their emotion and establish common ground - we're all hopeful and excited to do good - then we can start talking about the HOW.

Comment by Asaf Ifergan on Asaf Ifergan's Shortform · 2020-06-14T10:25:36.316Z · EA · GW

Effective Altruism Israel and LessWrong Israel present a new talk - Introduction to existential risk from Artificial Intelligence with Vanessa Kosoy.

In this talk, which assumes no prior knowledge in artificial intelligence, Vanessa will explain the problem in question, and how researchers in the field are trying to solve it. Vanessa is a Research Associate with the Machine Intelligence Research Institute (MIRI) studying the mathematical formalization of general intelligence and value alignment.

The talk will be in English. It's not technical, very accessible, and quite comprehensive, and it's great both for EAs and for non-EA friends who you think should know about this topic.

Here's the Facebook event. We start at 19:00 Israel Daylight Time (16:00 GMT).
See the time in your timezone here.

See you there!

Comment by Asaf Ifergan on Effective Altruism Stipend: A Short Experiment by EA Estonia · 2020-06-14T10:09:57.740Z · EA · GW

Exciting project!

I really love how it enables a lot of different things at once: producing content, offering a "trial period" to examine the potential of prospects, acquiring highly engaged and highly informed community members, and building the local community.

I'm waiting to hear about the longer-term effects, but it already seems quite worthwhile.

Comment by Asaf Ifergan on Audience Targeting for EA communities · 2020-05-03T10:43:00.089Z · EA · GW

Hi Prabhat!

First things first: I'm also relatively new to EA (approximately 8 months in), and I think it's of great value to take into consideration the ideas of new community members who still have something of an 'outsider view' on things.

By and large, I agree, and I've actually started working on strategies to target people who are involved in relevant cause areas or might be more open to EA's concept of expanding the moral circle.

There are a few assumptions that can serve as the basis for this strategy:

  • Communities that have a moral underpinning:
  1. Might be more inclined to be interested in effective altruism in general.
  2. Might be more open to long-term moral arguments, and possibly more easily convinced by them.
  3. Might already have a relatively ‘expanded moral circle’ (e.g., animal welfare activists, climate change activists). This can make expanding their moral circle further easier than with other people.
  • Attracting people who are already interested in one of EA’s cause areas, using content that relates to that cause area, can help build credibility with them and make them feel more comfortable. This, in turn, can enhance their openness and willingness to read further about other EA causes.
  • Existing communities enable us to reach a great number of people semi-organically and at a low cost.

Having said that, I think we should be careful with popular causes like climate change and animal welfare. The reason is that a considerable number of the people who support these causes do so for reasons that are not compatible with EA, don't really have reasoning behind their views, or are even aggressive towards people who think differently.

It's completely anecdotal, but yesterday, when I mapped relevant Facebook communities, I noticed some groups explicitly state that they shame meat-eaters, or are conspiracy-based.

Comment by Asaf Ifergan on Binding Fuzzies and Utilons Together · 2020-04-28T18:20:11.894Z · EA · GW

Although I'm all for variance in opinions within the community, in the case of outreach and marketing I'm kind of happy that we agree (:

Comment by Asaf Ifergan on Binding Fuzzies and Utilons Together · 2020-04-28T06:11:17.602Z · EA · GW

First of all, I want to make clear that entering the broader charity market could simply mean a different website design. I don't know exactly how this should play out, and I believe we need to be very careful about spending budgets, but I do think there could be a way for organizations to appeal to both EAs and non-EAs without investing too much in marketing. It doesn't necessarily mean competing with big, well-funded charities that spend enormous amounts of money on marketing; it could simply mean learning what they do well and implementing small changes, so that at the very least it's easier for you and me to convince our friends to donate to effective charities.

Furthermore, I want to address the second point you raised. I also think that emotional appeal can boost motivation within EA. Things like GDLive give me a boost in motivation, not because the numbers aren't sufficient to make a strong case, but simply because some EAs, like me, are more feeling-oriented than others, and I personally want them on board as well.

In turn, if what we are proposing in the post is successful, this gain in motivation among EAs and EA-aligned people could lead them to be more eager to learn about EA, donate more, and maybe even change their career plans to work on EA cause areas.

I think that alone can be a good enough reason to make an effort to seem more emotionally appealing.

Comment by Asaf Ifergan on Which person-person introductions could be highly impactful, COVID related and otherwise? · 2020-04-27T09:06:03.405Z · EA · GW

Curious to know why you think Bill Gates meeting the Israeli prime minister would be extraordinarily beneficial (:

Comment by Asaf Ifergan on Binding Fuzzies and Utilons Together · 2020-04-26T07:47:25.995Z · EA · GW

I agree with the main premise of this post, and I have been thinking about this a lot over the last few months. Having said that, I think these marketing strategies should be utilized mostly within EA-aligned charities, and not within EA itself.

A very strong case for producing more emotional content is that a certain amount of money is already being donated by people, and it's better that this money goes to effective charities than to ineffective ones. I think it's also very important to do this in "saturated markets" that get a lot of donations, simply because all we need to do there is redirect funds to better charities, as opposed to convincing people that the cause area is important in the first place.

I do think this is something of a blind spot within EA. If we really want to do the most good and be as effective as we can at doing it, we cannot rely strictly on the work and donations of the minority who will be drawn to the core ideas of EA. The entry of effective charities into the everyday donation market is, in my opinion, a no-brainer, though I'm not sure when and how this should happen.

What I do think is very important is that there be a clear separation between EA and EA-aligned non-profits, both to avoid harming the culture within EA and to make sure that the marketing of foundations related to us stays honest and "morally sound".

To summarize, I would aim to keep EA relatively small (but maybe more inviting than it is right now) and harness the power of EA - the framework of finding the best opportunities to do good - to redirect donors and volunteers to better, more effective charities. If the way to do this is more emotional marketing, then emotional marketing it is. The ends definitely justify the means, in my opinion.

Comment by Asaf Ifergan on Audience Targeting for EA communities · 2020-04-13T18:46:22.237Z · EA · GW

That's pretty good for personal outreach, and I would agree that these assumptions can be helpful when trying to reach people who will have a positive disposition towards EA.

Having said that, it's pretty unclear to me how you would translate that into ad targeting considering:

1. It's difficult to clearly target "rational and logical" people when you're trying not to restrict yourself to a specific audience. I can obviously target engineers, mathematicians, and philosophy students, but that excludes everybody else who is logical and rational, and assumes others don't possess these traits. It can also further decrease the variety of opinions and talents in our community. I might be more sensitive about this because I'm not the typical EA character (no academic background, not much of a technology guy).

2. The goal is to build the community, not necessarily to find people who will donate. This means that a lot of different people can be relevant for us, and I'd like to broaden our reach while still keeping it within a reasonable and logical audience.

Then again, sometimes you just can't win everything. Maybe it would be a good idea to target specific audiences with different posts, showing each audience content that is more palatable and interesting for them.