Comments

Comment by austen_forrester on Projects I'd like to see · 2018-01-10T19:11:36.232Z · EA · GW

I didn't mean to imply that it was hopeless to increase charitable giving in China. Rather the opposite: it's so bad it can only go up! Besides that, I agree with all your points.

The Chinese government already provides foreign aid in Africa to further its interests in the region. I was thinking of how we could possibly get it to expand that aid. The government seems almost impossible to influence, but perhaps EAs could influence African governments to solicit more foreign aid from China? It could have a negative consequence, however, in that receiving more aid from China may make African countries more susceptible to accepting bad trade deals, etc.

I don't know how to engage with China, but I do strongly feel that it holds huge potential for both altruism and GCRs, which shouldn't be ignored. I like CEA's approach of seeking expertise on China from generalist experts. There are a number of existing Western-China think tanks that could be useful to the movement, but I think that a "China czar" for EA is a necessity.

Comment by austen_forrester on The extraordinary value of ordinary norms · 2018-01-02T17:04:09.883Z · EA · GW

I agree with you. "Effective altruists" are not interested in helping others, only in furthering their elite white atheist demographic or showing that they are intellectually and morally superior as individuals. They will steal my ideas and recommendations because they know they are robust, while shunning me and using the downvoting system to hide my comments because I'm outside their demographic.

People use the concept of EA, especially x-risks, as a front for world destruction, their true goal. Literally. Who would suspect that the very same people who are supposedly trying to save the world are themselves the ones looking to destroy it using weapons of mass destruction? They are the most dangerous group in the world.

Comment by austen_forrester on The extraordinary value of ordinary norms · 2017-12-16T20:04:48.885Z · EA · GW

You're right, I changed "subsidiary" to "affiliates."

What is it about public safety that you find so objectionable? I take it you are for enacting the morals of FRI and destroying all life on earth?

Comment by austen_forrester on The extraordinary value of ordinary norms · 2017-11-14T17:38:46.206Z · EA · GW

Am I the only one who thinks that it should be taboo among EAs to promote how great death is and how important it is to take action on the realization that death is great (in other words, encouraging mass murder)?

I am, aren’t I? I thought so.

CEA pretends to promote action against global catastrophic risks to give itself a good rep while performing its real mission: encouraging cultists to wipe out all life on earth through affiliates like the Foundational Research Institute. You are a fraud and the public should boycott you and your pseudo-movement! The "effective altruism" movement is unquestionably the most dangerous and fraudulent group in the world.

Comment by austen_forrester on Inadequacy and Modesty · 2017-10-29T16:38:46.563Z · EA · GW

I agree that financial incentives/disincentives result in failures (i.e. social problems) of all kinds. One of the biggest reasons, as I'm sure you mention at some point in your book, is corruption, e.g. the beef/dairy industry paying off environmental NGOs and governments to stay quiet about its environmental impact.

But don't you think that non-financial rewards/punishments also play a large role in impeding social progress, in particular social rewards/punishments? E.g. people don't wear enough to stay warm in the winter because others will tease them for being uncool, people bully others because they are then respected more, etc.

Comment by austen_forrester on Why to Optimize Earth? (post 1/3) · 2017-10-26T05:28:58.345Z · EA · GW

It could be a useful framing. "Optimize" to some people may imply making something already good great, such as making the countries with the highest HDI even better, or helping emerging economies become high income, rather than helping the countries with more suffering catch up to the happier ones. It could be viewed as helping a happy person become super happy rather than helping a sad person become happy. I know this narrow form of altruism isn't your intention; I'm just saying that "optimize" does have this connotation. I personally prefer "maximally benefit/improve the world." It's almost the same as your expression but without the make-good-even-better connotation.

I think EAs have always thought about the impact of collective action, but it's just really hard, or even impossible, to estimate how your personal efforts will further collective action and to compare that to more predictable forms of altruism.

Comment by austen_forrester on EA Survey 2017 Series: Cause Area Preferences · 2017-09-09T04:36:08.352Z · EA · GW

Of course, I totally forgot about the "global catastrophic risk" term! I really like it, and it doesn't only suggest extinction risks. Even its acronym sounds pretty cool. I also really like your "technological risk" suggestion, Rob. Referring to GCRs as "long-term future" is a pretty obvious branding tactic by those who prioritize GCRs. It is vague, misleading, and dishonest.

Comment by austen_forrester on EA Survey 2017 Series: Cause Area Preferences · 2017-09-04T14:25:08.138Z · EA · GW

For "far future"/"long-term future," you're referring to existential risks, right? If so, I would think calling them existential risks or x-risks would be the clearest and most honest term to use. Any systemic change affects the long term: factory farm reforms, policy change, changes in societal attitudes, medical advances, environmental protection, and so on. I therefore don't feel it's that honest to refer to x-risks as the "long-term future."

Comment by austen_forrester on Nothing Wrong With AI Weapons · 2017-09-01T23:54:23.214Z · EA · GW

I'm sure promoting killer robots will be popular among "effective altruists"/ISIS, as it is a way to kill as many people as possible while making it look like an accident. "EAs" aren't fooling anyone about their true intentions.

Comment by austen_forrester on Blood Donation: (Generally) Not That Effective on the Margin · 2017-08-06T23:43:41.371Z · EA · GW

By regular morals, I mean basic morals such as treating others how you would like to be treated, i.e. rules such that you would be a bad person if you failed to abide by them. While I don't consider EA supererogatory, neither do I think that not practicing EA makes someone a bad person; thus, I wouldn't put it in the category of basic morals. (Actually, that is the standard I hold others to; for myself, I would consider it a moral failure if I didn't practice EA!) I think it actually is important to differentiate between basic and, let's say, more "advanced" morals, because if people think that you consider them immoral, they will hate you. For instance, promoting EA as a basic moral that one is a "bad person" for not practicing will just result in backlash from people discovering EA. No one wants to be judged.

The point I was trying to make is that EAs should be aware of moral licensing, which means giving oneself an excuse to be less ethical in one department because you see yourself as being extra-moral in another. If there is a tradeoff between exercising basic morals and doing some high-impact EA activity, I would go with the EA (assuming you are not actually creating harm, of course). For instance, I don't give blood because the last time I did I was lightheaded for months. Besides decreasing my quality of life, it would also hurt my ability to do EA. I wouldn't say giving blood is an act of basic morality, but it is still an altruistic action that few people can confidently say they are too important to consider doing. Do you not agree that if doing something good doesn't prevent you from doing something higher impact, then it would be morally preferable to do it? For instance, treating people with kindness... people shouldn't stop being kind to others because it won't result in some high global impact.

Comment by austen_forrester on Blood Donation: (Generally) Not That Effective on the Margin · 2017-08-06T07:13:05.072Z · EA · GW

I think it may be useful to differentiate between EA and regular morals. I would put donating blood in the latter category. For instance, treating your family well isn't high impact on the margin, but people should still do it because of basic morals, see what I mean? I don't think that practicing EA somehow excuses someone from practicing good general morals. I think EA should be in addition to general morals, not replace them.

Comment by austen_forrester on The marketing gap and a plea for moral inclusivity · 2017-07-19T18:34:17.051Z · EA · GW

Perhaps I got it wrong, but I thought that the premise of your position (that EA outreach should proportionally represent the favourite causes of people who identify as EAs) is that EAs (however "effective altruist" is defined) are morally and intellectually superior to the public. I know for a fact that this is the prevailing attitude EAs have. I would really like to know why it is not enough to educate the public on EA-related issues. Why should the public care what the favourite cause is of an upper-class 25-year-old who has donated $500 a year to the same charity since before he learned about effective altruism, discusses computer science concepts with his friends, and denies the reality that people in poor countries are themselves best positioned to solve their problems? How is that person special?

It's hard for me to imagine a more prejudiced group of people than EAs. You literally hate everyone different from you, e.g. people who love God or have a different background or social class. Above all, EAs are extremely racist, denying that people in low-income countries themselves have the power to solve their problems and perpetuating the colonial myth that improving the world is the sole realm of privileged white people. Most people in the movement have little empathy for others and are just using it to validate their feelings of superiority and further the dominance of their social class/race. (I am referring to EAs' attitudes. I don't mean to suggest that helping others is itself condescending/bad in any way.)

It is the public that should be teaching morals to “EAs”, not the other way around. God bless.

Comment by austen_forrester on An argument for broad and inclusive "mindset-focused EA" · 2017-07-18T19:04:23.907Z · EA · GW

My point was that EAs probably should exclusively promote full-blown EA, because that has a good chance of leading to more uptake of both full-blown and weak EA. Ball's issue with people choosing to go part-way after hearing the veg message is that it often leads to more animals being killed, due to people replacing beef and pork with chicken. That's a major impetus for his direct "cut out chicken before pork and beef" message. It doesn't undermine veganism, because chicken-reducers are more likely to continue on towards that lifestyle, probably even more so than someone who went vegetarian right away. Vegetarians have a very high drop-out rate, but many believe that those who transitioned gradually last longer.

I think that promoting effectively giving 10% of one's time and/or income (for the gainfully employed) is a good balance between promoting a high-impact lifestyle and being rejected due to high demandingness. I don't think it would be productive to lower the bar on that (i.e. by saying cause neutrality is optional).

Comment by austen_forrester on An argument for broad and inclusive "mindset-focused EA" · 2017-07-18T02:25:59.032Z · EA · GW

One thing to keep in mind is that people often (or usually, even) choose the middle ground by themselves. Matt Ball often mentions how this happens in animal rights, with people deciding to reduce meat after learning about the merits of vegetarianism, and notes that Nobel laureate Herb Simon is known for this insight that people opt for sub-optimal decisions.

Thus, I think that in promoting pure EA, most people will practice weak EA (i.e. not cause neutral) of their own accord, so perhaps the best way to proliferate weak EA is by promoting strong EA.

Comment by austen_forrester on The marketing gap and a plea for moral inclusivity · 2017-07-11T03:14:04.705Z · EA · GW

Certainly, no one should be expected to promote things they don't believe. That is why, if you're like many in the community, using EA to promote your pre-existing atheist agenda, you should not do outreach, nor call your meetup an "effective altruism" group.

It is your EA community that considers the public stupid, Michael. I completely disagree! Perhaps if your group respected the public more, they might listen to you.

My second point was that the public, being smart, recognizes that the EA community has no moral authority and therefore doesn't care what their favourite causes are. EAs should thus use logic, not authority, to influence the public.

Comment by austen_forrester on The marketing gap and a plea for moral inclusivity · 2017-07-08T23:04:16.839Z · EA · GW

I totally understand your concern that the EA movement is misrepresenting itself by not promoting issues proportional to their representation among people in the group. However, I think that the primary consideration in promoting EA should be what will hook people. Very few people in the world care about AI as a social issue, but extreme poverty and injustice are very popular causes that can attract people. I don't actually think it should matter for outreach what the most popular causes are among community members. Outreach should be based on what is likely to attract the masses to practice EA (without watering it down by promoting low impact causes, of course). Also, I believe it's possible to be too inclusive of moral theories. Dangerous theories that incite terrorism like Islamic or negative utilitarian extremism should be condemned.

Also, I'm not sure to what extent people in the community even represent people who practice EA. Those are two very different things. You can practice EA, for example by donating a chunk of your income to Oxfam every year, but not have anything to do with others who identify with EA, and you can be a regular at EA meetups who discusses related topics often (i.e. a member of the EA community) without donating or doing anything high impact. Perhaps the most popular issues acted on by those who practice EA are different from those discussed by those who like to talk about EA. Being part of the EA community doesn't give one any moral authority in itself.

Comment by austen_forrester on Getting to the Mainstream · 2017-07-05T00:06:02.383Z · EA · GW

I don't see how TYLCS is selling out at all. They have the same impact-maximizing message as other EA groups, just with a more engaging feel that also appeals to emotions (the only driver of action in almost all people).

Matt Ball is more learned and impact-focused than anyone in the animal rights field. One Step for Animals and the Reducetarian Foundation were formed to save as many animals as possible -- complementing, not replacing, vegan advocacy. Far from selling out, One Step and Reducetarian are exceptions among animal rights groups, most of which have traded their compassion for animals for feelings of superiority.

Comment by austen_forrester on Changes to the EA Forum · 2017-07-04T23:52:50.419Z · EA · GW

I really respect the moderators of this forum for allowing me to advocate for public safety (ie. criticize NUE) and removing comments that could endanger public safety (ie. advocating suicide)!

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-06-19T22:39:56.213Z · EA · GW

Those radicalization factors you mentioned increase the likelihood of terrorism but are not necessary for it. Saying that people don't commit terror from reading philosophical papers, and thus those papers are innocent and shouldn't be criticized, is a pretty weak argument. Of course such papers can influence people. The radicalization process starts with philosophy, so to say that the first step doesn't matter because the subsequent steps aren't yet publicly apparent shows that you are knowingly trying to allow this form of radicalization to flourish. That said, NUEs do in fact meet the other criteria you mentioned. For instance, I doubt that they have confidence in legitimately influencing policy (i.e. convincing the government to burn down all the forests).

FRI and its parent EA Foundation state that they are not philosophy organizations and exist solely to incite action. I agree that terrorism has not in the past been motivated purely by destruction. That is something that atheist extremists who call themselves effective altruists are pioneering.

I am not a troll. I am concerned about public safety. My city almost burned to ashes last year due to a forest fire, and I don't want others to have to go through that. Anybody read about all the people in Portugal dying from a forest fire recently? That's the kind of thing that NUEs are promoting and I'm trying to prevent. If you're wondering why I don't elaborate on my position on "EAs" promoting terrorism/genocide, it is for two reasons. One, it is self-evident if you read Tomasik and FRI materials (not all of them, but some articles). And two, I could easily cause a negative effect by connecting the dots for those susceptible to the message or giving them destructive ideas they may not have thought of.

Comment by austen_forrester on Projects I'd like to see · 2017-06-19T22:06:38.850Z · EA · GW

Have you considered combining the "GiveWell for impact investing" idea with the Effective Altruism Funds idea and creating an EA impact investing biz within your charity? You could hire staff to find the best impact investing opportunities and create a few funds for different risk tolerances. Theoretically, it could pay for itself (or make serious money for CEA if successful enough) with a modest management fee. I'm not sure if charities are allowed to grant to businesses, but I know they can operate their own businesses as long as they're related to their mission.

Comment by austen_forrester on Projects I'd like to see · 2017-06-14T02:34:36.503Z · EA · GW

Entering China would be awesome. So many people with money and no one's donating it. It ranks dead freaking last on the World Giving Index. Which in a way is a good thing... it means lots of room to grow!

China's domestic charities are usually operated and funded by the government (basically part of the government). And starting this year, the government has basically taken control of foreign NGOs in China.

Often, rich Chinese elect to donate to foreign NGOs because they are more credible. Besides being government-controlled, charities in China are not known for being reputable, prompting billionaire Jack Ma to famously quip, "It's harder to donate money to Chinese charities than to earn it." The China Foundation Center was created a few years ago to promote transparency in the nonprofit sector.

India is also a good target. As in China, no one there trusts charities. Probably because they're all scams? But there is an organization called Credibility Alliance that accredits the more transparent ones. I'm a big fan of Transparency International India. They do so much on a shoestring in the single most important issue in the country (corruption), and are the most credible/transparent.

Comment by austen_forrester on Fact checking comparison between trachoma surgeries and guide dogs · 2017-06-06T23:08:45.258Z · EA · GW

Blind people are not a discriminated group, at least not in the first world. The extreme poor, on the other hand, often face severe discrimination -- they are mistreated and have their rights violated by those with power, especially if they are Indians of low caste.

Comparative intervention effectiveness is a pillar of EA, distinct from personal sacrifice, so they are not interchangeable. I reject the idea that there is some sort of prejudice in choosing to help one group over another, whether the groups are defined by physical condition, location, etc. One always has to choose. No one can help every group. Taking the example of preventing blindness vs. assisting the blind, clearly the former is the wildly superior intervention for blindness, so it is absurd to call it prejudiced against the blind.

Comment by austen_forrester on Fact checking comparison between trachoma surgeries and guide dogs · 2017-06-06T22:10:58.608Z · EA · GW

Peter, even if a trachoma operation cost the same as training a guide dog, and didn't always prevent blindness, it would still be an excellent cost comparison because vision correction is vastly superior to having a dog.

Comment by austen_forrester on Considering Considerateness: Why communities of do-gooders should be exceptionally considerate · 2017-06-02T16:10:13.477Z · EA · GW

I believe in respecting people who are respectable and care about others, like Gleb Tsipursky, and standing up to frauds. Very few people in your EA movement are sincere. Most are atheist extremists hijacking EA to advance their religion, which often includes the murder of all life on Earth. CEA is in the latter group because it promotes and financially supports these atheist terrorism cells (e.g. the "Effective Altruism Foundation").

CEA isn't being very considerate when it promotes killing everyone. In fact, I would say that is decidedly INconsiderate! Couldn't help but notice that "genocide" is not on your list of considerate behaviours. You sickos consider that an act of altruism.

I, for one, am okay if “EAs” don't want to work with me. Knowing that they want to kill as much as possible, I would be happy if I never met one!

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-15T02:24:17.662Z · EA · GW

They encourage cooperation with other value systems to further their apocalyptic goals, but mostly to prevent others from opposing them. That is different from tempering "strong NU" with other value systems to arrive at more moderate conclusions.

LOOOOL about your optimism that people won't follow FRI's advocacy as purely as they want! Let's hope so, eh?

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-15T02:17:01.466Z · EA · GW

It's the only negative utilitarianism promoting group I know of. Does anyone know of others (affiliated with EA or not)?

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-15T02:03:18.904Z · EA · GW

I know they don't actually come out and recommend terrorism publicly... but they sure go as far as they can to entice terrorism without being prosecuted by the government as a terrorist organization. Of course, if they were explicit, they'd immediately be shut down and jailed by authorities.

I promise you this – all those who endorse this mass-termination-of-life ideology are going to pay a price. Whether by police action or public scrutiny, they will be forced to publicly abandon their position at some point. I implore them to do it now, of their own volition. No one will believe them if they conveniently change their minds about no-rules negative utilitarianism after facing public scrutiny or the law. Now is the time. I warned CEA about this years ago, yet they still promote FRI.

I actually respect austere population control to protect quality of life, even through seemingly drastic means such as forced sterilization (in extreme scenarios only, of course). However, atheists don't believe in any divine laws such as the sin of killing, and are thus not bound by any rules. The type of negative utilitarianism popular in EA is definitely a brutal, no-rules, mass-killing-is-okay type. It is important to remember, also, that not everyone has good mental health. Some people have severe schizophrenia and could start a forest fire or kill many people to "prevent suffering" without thinking through all of the negative aspects of doing this. I think that the Future of Humanity Institute should add negative utilitarian atheism to its list of existential risks.

Anti-spirituality: this doesn't have anything to do with NU or FRI; I probably should have left it out of my comment. It just means that many EAs use EA as a means to promote atheism/atheists. Considering that about 95% of the world's population are believers, they may have an issue with this aspect of the movement.

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-15T01:54:55.411Z · EA · GW

LOL. Typical of my comments. Gets almost no upvotes, but I never receive any sensible counterarguments! People use the forum vote system to persuade (by social proof) without having a valid argument. I have yet to vote on a comment (up or down) because I think people should think for themselves.

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-15T01:50:38.169Z · EA · GW

Their entire website boils down to one big effort at brainwashing people into believing that terrorism is altruistic.

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-15T01:48:50.078Z · EA · GW

I raised my concerns with CEA years ago about promoting (albeit indirectly) negative utilitarianism, and the mass terror that will arise from it. I am sorry to see that nothing has changed and that I have to move my security concerns up to the level of dealing directly with authorities. I would like to see how long CEA's 501(c)(3) status lasts once it is on the FBI's terrorism watch list.

Comment by austen_forrester on Introducing CEA's Guiding Principles · 2017-03-10T05:44:00.551Z · EA · GW

Those guiding principles are good. However, I wish you had included one against doing massive harm to the world. CEA endorses the "Foundational Research Institute," a pseudo-think tank that promotes dangerous ideas of mass termination of human and non-human life, not excluding extinction. By promoting this organization, CEA is promoting human, animal, and environmental terrorism on the grandest scale. Self-styled "effective altruists" try to pass themselves off as benevolent, but the reality is that they themselves are one of the biggest threats to the world, promoting terrorism and anti-spirituality under the cloak of altruism.

Comment by austen_forrester on Increasing Access to Pain Relief in Developing Countries - An EA Perspective · 2017-02-03T18:01:15.707Z · EA · GW

I'm a little confused as to why you are trying to promote a cause that you think is low priority and financially inefficient. Anyhow, I don't find your anti-corporate stance convincing. Lack of corporate involvement (i.e. to distribute analgesics) is the missing link preventing some countries from having functional palliative care, according to Dr. Foley. It's important to work with all stakeholders for progress in any space. The affordable anti-retroviral movement made progress by working with pharma. The risks of working with industry in the public's interest can be minimized with appropriate controls.

Access to properly regulated mobile phone, internet, and financial services has greatly helped the poor and requires corporate involvement. Unfortunately, these services are underutilized because SJWs like to maintain their purity and reject corporate involvement. I hope your palliative care movement doesn't suffer from the same self-defeating ideology.

Comment by austen_forrester on Increasing Access to Pain Relief in Developing Countries - An EA Perspective · 2017-02-02T23:17:13.768Z · EA · GW

One good thing about this space is that, unlike so much other policy work, access to pain relief doesn't have corporations interfering by paying off government, etc. If anything, corporations would stand to gain by increasing access to pharmaceuticals. So much other policy advocacy is stifled by corporate interference, so palliative care has a huge advantage in that regard. Would it be possible for advocates to work with pharma corporations to lobby for increased access? I know that sometimes governments have good regulations in place but can't find corporations willing to supply/distribute the country with the meds, which I find baffling.

Do you think that an effective strategy for pain relief would be to first convince a Ministry of Health of the importance of palliative care? Rather than putting drugs at the forefront of advocacy, perhaps getting government to agree to the principle of palliative care and pain control first would be more productive, because once they agree to that, it is a given that narcotics are necessary.

Increasing pain relief is a notable cause in so many ways. It is a major issue in moderate-income countries such as the former Soviet nations. Africa may be the worst, but pain relief restriction is by no means limited to the poorest regions of the world. This just shows that the best altruistic opportunities aren't always in the poorest countries. I would think that the more developed countries would be a priority target for advocacy because they actually have functional health care systems that would permit implementation of increased palliative care.

From what I've studied so far, I don't see how you can say that increased analgesic access is low-medium in neglectedness and tractability. Dr. Kathleen Foley says that University of Wisconsin's fellows only spend 15% of their time on this and usually make progress in their respective countries. If true, that demonstrates that this issue is severely neglected and tractable with long-term pay-offs, at least in some countries.

Is it possible for existing major global health initiatives to lead this cause? PEPFAR is well-funded and pain relief is part of AIDS treatment. I know you mentioned them, but perhaps they haven't put an appropriate portion of their funding towards this area for political reasons.

Comment by austen_forrester on Increasing Access to Pain Relief in Developing Countries - An EA Perspective · 2017-02-01T17:22:02.515Z · EA · GW

Thank you, Lee, for this eye-opening and thorough introduction to the issue of lack of access to analgesics. I can't believe the scale of the problem! With the immense scale and striking neglectedness of the problem, and the potential for leaps in gains with changes to state/national policies, I'm sure it deserves a high priority for changemakers.

Causes like this are why I've always thought that effective altruism is just as important to take up in poor countries as in rich ones – internal changemakers are invaluable here, as you've stated. University of Wisconsin's fellowship program does look promising. I'm sure they would accept external money if there was enough interest. Good luck with this important cause, Lee! Don't let any close-minded person tell you increasing access to analgesics isn't a suitable cause for EAs because it's not easily quantifiable.

Comment by austen_forrester on We Must Reassess What Makes a Charity Effective · 2017-01-06T20:01:49.016Z · EA · GW

I appreciate you posting on this forum, carneades. Your take on international development is in line with economic principles and with what I've learned from people from Africa and India. EA badly needs this type of debate. What I am not hearing from you or others who take your point of view, however, is solutions. While your general criticisms of international aid are valid, what are the solutions? How do we help people in poor countries to develop and be more independent? There are charities like One Acre Fund that seem to have only a positive impact because they increase self-sufficiency. Should poverty philanthrobucks focus on those? What specific charities or interventions would you recommend?

Comment by austen_forrester on Concerns with Intentional Insights · 2016-10-25T20:37:33.729Z · EA · GW

I don't know much about Intentional Insights, so I won't comment about the organization. However, I'd like to say that I have found Gleb's comments on the forum to be consistently of the highest quality, so I don't want to see him stop posting. (Even though I never up/downvote out of principle – I think people should determine the quality of a post for themselves.) He is the voice of reason in an extremely elitist and conservative community that punishes anyone who is not likewise close-minded. Gleb acts in the public interest, trying to get the public to maximize their impact, whereas most in the EA community are seemingly pursuing personal interests, using EA to justify their belief in technology as the supreme power and to discredit spirituality.

If you don't think InIn produces good-quality content, it's not a given that they should be boycotted. Another solution would be to support them more to improve their quality. CEA, on the other hand, has critical moral issues, such as shunning people outside of its elite atheist demographic, even if they are GWWC members.

Comment by austen_forrester on All causes are EA causes · 2016-10-10T18:37:56.373Z · EA · GW

I've always thought the same as you, Ian. Great point about foundations, BTW. Very few people are willing to give only to the highest-EV charity across all causes and countries; therefore, they might as well give as effectively as possible within whatever criteria they have (e.g. domestic, homeless). The only argument to the contrary is that there is a counterfactual if the all-or-nothing purist form of EA is broken and people who would have given to the best cause-neutral charity instead donate to top domain-specific charities. I doubt there is much of a counterfactual, because the EA community can still promote cause-neutral charities while passively recommending domain-specific charities on the internet. It can even be a baby step on the way to practicing a stricter EA – people who wouldn't consider neutrality get used to the idea of effectiveness within their favourite domain. After having accepted the effectiveness doctrine, they start to open up to cause neutrality, increasing donation size, etc.

While it is good to have top domain-specific charity and intervention suggestions available for those who seek them, there remain two questions in my mind: 1) whether they should be branded as "effective altruism" even if they are in a low-potential cause, and 2) how much EA outreach and research should focus on them.

Ideally, those within the domain would find and promote the top charities in the domain themselves, so there is no counterfactual from the EA community doing it. I wouldn't want the limited resources within EA going to low-to-medium-potential domain-specific research and promotion; however, perhaps EA leaders could incite those within the domain to do it themselves (particularly foundations, as you mentioned). That seems like it could be quite high impact. Another point is that perhaps domain-specific discussion will bring new people into EA who otherwise wouldn't be interested. These new people could then promote effectiveness within their preferred domains. I think this is very plausible.

Regarding the responses about it being more impactful to persuade someone to give slightly more to a high-impact cause-neutral charity than significantly more to a domain-specific one, I think that depends on the situation. For a person in Europe, for instance, I would think that it would be more impactful to put effort into persuading them to give to a top overall charity than to the best charity in their favourite personal cause. However, for people in poor countries, like India, I think outreach would have the greatest impact by persuading them to give to the most effective Indian charities. I find that people from poor countries are primarily concerned about the poverty and problems in their home country.

Comment by austen_forrester on Philanthropy Advisory Fellowship: Mental Health in Sub-Saharan Africa · 2016-07-23T03:18:52.829Z · EA · GW

Very good report, James. I have a few comments:

  1. The DALYs calculated for mental health don't factor in the huge effect that mental health has on physical health. This may be labour-intensive to estimate, but should at least be considered. You also mentioned that people with MNS issues are often treated horrendously by their family/society, but that hasn't been factored into the DALY cost estimate either. An MNS disorder with a 0.4 disability weight could really have a 0.9 weight when you factor in mistreatment (see the sketch after this list). I realize this is probably impossible to do precisely, but it is important to recognize that socialization side effects have huge impacts on DALYs.
  2. Sri Lanka pesticide ban cost per DALY: $1000 is pretty high. Eddleston estimated it at $2 per YLL using the actual costs of running Sri Lanka's pesticide regulation department. That figure doesn't even factor in savings in health care costs. Also, Sri Lanka and other countries have only banned a few HHPs. A total ban of HHPs could yield drastically different cost estimates. I should note that only a fraction of pesticides are classified as highly hazardous. A total HHP ban still leaves farmers with lots of choices to buy pesticides, in addition to non-chemical forms of pest control.
  3. No choice for donating to advocate for pesticide bans: later this year, I expect that the Global Initiative for Pesticide Poisoning Prevention will begin its anti-HHP campaign. It takes a long time to do the initial steps of receiving charity status and input from all the experts in the field.
  4. Limited room for more funding doesn't make sense as a concern for a program like StrongMinds, because it can be scaled up to LMICs around the world.
  5. I don't understand what the mental health charities have to do with children. Do StrongMinds and BasicNeeds treat children?
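
A minimal sketch of the arithmetic behind points 1 and 2, using purely hypothetical placeholder numbers (the 0.4 to 0.9 uplift and the cost figures are illustrative only, not estimates from the report):

```python
# Hypothetical illustration: how inflating a disability weight to account for
# social mistreatment changes the implied burden averted, and how that
# propagates to a cost-per-DALY figure. All numbers are placeholders.

def yld_averted(people_treated, disability_weight, years_affected, mistreatment_uplift=0.0):
    """Years lived with disability averted; mistreatment_uplift crudely inflates the weight."""
    effective_weight = min(1.0, disability_weight * (1.0 + mistreatment_uplift))
    return people_treated * effective_weight * years_affected

def cost_per_daly(total_cost, dalys_averted):
    return total_cost / dalys_averted

base = yld_averted(1_000, disability_weight=0.4, years_affected=1)
adjusted = yld_averted(1_000, disability_weight=0.4, years_affected=1,
                       mistreatment_uplift=1.25)  # 0.4 -> 0.9

print(cost_per_daly(400_000, base))      # $1,000 per DALY at the face-value weight
print(cost_per_daly(400_000, adjusted))  # ~$444 per DALY once mistreatment is counted
```

The same more-than-twofold shift is why a headline figure like $1,000/DALY can be misleading if the underlying weights ignore social side effects.
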
Comment by austen_forrester on Philanthropy Advisory Fellowship: Mental Health in Sub-Saharan Africa · 2016-07-23T03:06:02.219Z · EA · GW

Excellent paper! One important factor in LMIC mental health work is sustainability. Take helplines. As far as I know, they are locally funded in poor countries, yet there are very few of them. A foreign NGO or individual could have an extremely high impact founding a helpline in a location, turning the fundraising and operation over to the local community once it gets going, and then repeating the process in subsequent cities. Dependency on foreign donors is always a last resort. The absolute cost of running a helpline is less important than the ability of the local community to support it on their own once it's set up, although startup costs are a major consideration.

Also, rather than trying to extrapolate the cost of running a helpline in Africa using Australian data, I think it would be more accurate to just call one up and ask them. Or call Befrienders International, they should know. I'm sure they'd be thrilled to hear from you!

Comment by austen_forrester on $500 prize for anybody who can change our current top choice of intervention · 2016-07-18T16:51:49.879Z · EA · GW

Hi pasha,

Suicide prevention is an extremely neglected area, and I believe it has many high-impact opportunities that are not yet taken up. 85% of suicides are in the developing world, little of which is covered by helplines, so I would think proliferating helplines would be high impact, especially when you factor in the low opportunity cost of this volunteer-based activity. India, in particular, is desperate for more helplines. One way of reducing cost is by having calls to the helpline automatically directed to volunteers' phones, so that they don't need to have a call centre.

Means restriction has by far the best evidence behind it for reducing suicide rates, particularly prohibition of highly toxic pesticides (currently the second most common means of suicide). Pesticide bans have decreased Sri Lanka's suicide rate by three quarters and China's suicide rate has decreased by more than half due to reduced pesticide access from bans and urbanization. My NGO, the Global Initiative for Pesticide Poisoning Prevention is currently being formed to advocate for more bans of highly poisonous pesticides around the world. Our website should be live in a few weeks at www.pesticidepoisoning.org.

Comment by austen_forrester on Is effective altruism overlooking human happiness and mental health? I argue it is. · 2016-06-26T05:36:03.862Z · EA · GW

When you mention the $1000/DALY by "Stronger Minds," are you referring to strongminds.org? I asked them if they had a cost estimate for DALYs but never received a reply. If it does refer to StrongMinds, do you know if they have predicted a cost per DALY once their viral group therapy model of treating depression grows significantly? Mayberry says in his TED talk that he expects it to become cheaper as it grows.

Also, does your discussion of DALYs for mental health interventions only include YLD, or also YLL? I would think there could be a large difference between the two considering the huge impact depression has on morbidity and mortality (comparable to obesity).
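
For reference, the decomposition the question points at, stated here from the standard WHO definition rather than from the post above:

```latex
\[
\text{DALY} = \text{YLL} + \text{YLD}, \qquad
\text{YLL} = N \times L_{\mathrm{exp}}, \qquad
\text{YLD} = I \times DW \times L_{\mathrm{dur}}
\]
% N: number of deaths; L_exp: standard life expectancy at age of death;
% I: incident cases; DW: disability weight; L_dur: average duration of the condition.
```

If an analysis counts only YLD, it omits any excess mortality attributable to depression, which is the gap the question is asking about.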

Comment by austen_forrester on Evaluation Frameworks (or: When Importance / Neglectedness / Tractability Doesn't Apply) · 2016-06-18T23:43:24.639Z · EA · GW

I disagree about the cause area and organization being more important than the intervention. To me, it's all about the intervention in the end. Supporting people that you "believe in" in a cause that you think is important is basically a bet that a high-impact intervention will spring forth. That is one valid way of going about maximizing impact; however, working the other way – starting with the intervention and then supporting those best suited to implement it – is also valid.

The same is true for your point about a funder specializing in one knowledge area so they are in the best position to judge high-impact activity within that area. That is a sensible approach to discovering high-impact interventions; however, as with the strategy of supporting people, the reverse method can also be valid. It makes no sense to reject a high-potential intervention you happen to come across (assuming its value is fairly obvious) simply because it isn't in the area that you have been targeting. You're right, this is what Open Phil does. Nevertheless, rejecting a high-potential intervention simply because it wasn't where you were looking for it is a bias and counter to effective altruism. And I object to your dismissal of interventions from surprising places as "random." Again, this is completely counter to effective altruism, which is all about maximizing value wherever you find it.

Comment by austen_forrester on Evaluation Frameworks (or: When Importance / Neglectedness / Tractability Doesn't Apply) · 2016-06-11T07:42:24.200Z · EA · GW

I think INT would be clearer and more useful if it were applied to the highest-impact intervention within the cause, rather than to the cause itself. A cause can be low in scale and neglectedness but high in tractability if there is one very high-impact intervention that has been ignored or simply isn't known about. Or vice versa – the scale and neglectedness could be high while its best intervention isn't that promising (thus the cause has low tractability). So the importance in this usage would be that of the best intervention of the cause being successful (including side benefits like gaining knowledge applicable to other areas), the tractability would be the odds of it being successful (estimated using the strength of the evidence for the intervention), and neglectedness would refer to how much attention that top intervention has received, as opposed to the cause in general. I think this is basically what you are arguing, but phrased differently.

Comment by austen_forrester on Philosophical Critiques of Effective Altruism by Prof Jeff McMahan · 2016-06-06T05:00:04.652Z · EA · GW

Absolutely. That is such a common tactic. I think all of the criticisms against EA use one cheap rhetorical trick or another. Someone needs to make up a definitive web page that lists all the criticisms of EA with responses, and most importantly, calls out the rhetorical device that was used. It's mostly the same tired, discredited criticisms and persuasive tricks that are used over and over, so rather than responding to each individually, we can simply refer people to the web page.

Comment by austen_forrester on Call for papers for a special journal issue on EA · 2016-03-14T20:55:11.916Z · EA · GW

Can anybody submit an essay or do authors have to meet certain qualifications?

Comment by austen_forrester on Call for papers for a special journal issue on EA · 2016-03-14T17:34:13.252Z · EA · GW

What is the desired range of length, if any? Is there any requirement for originality when submitting to the journal? I want to avoid writing something that is similar to something already published. I'll double-check myself, of course, but I could still miss it.

Are you SURE that's the deadline? ;^)

Comment by austen_forrester on Do EAs underestimate opportunities to create many small benefits? · 2016-01-30T18:47:39.041Z · EA · GW

The most common way of helping many people by a small amount is through business. Whether it's one you've started or one you work for, in a large business you can make many, many people's lives slightly (at least) better by improving an existing service or product, or by inventing a new, superior one. Businesses can scale up far more than charities. And if you create a social-impact-focused business, you can also significantly improve each of your clients' lives. Businesses have improved the world far more than charities have, through economic growth and technological progress.

Comment by austen_forrester on EA is elitist. Should it stay that way? · 2016-01-27T06:02:03.238Z · EA · GW

The fact of the matter is that people in the EA community are prejudiced against anyone different from them and look for any justification to keep them out. Since there are no genuine justifications, it generally takes the form of instilling fear of the unknown into others, no different from any other type of bigotry: “But if we let blacks in this school, who knows what will happen! We must keep them out because, well, you just never know what horrible things may happen!”

The whole premise of the debate is prejudiced. EAs being more accepting of people different from them (including others' levels of commitment to EA) is "lowering the bar"? That is clearly bigoted – viewing people different from you as necessarily inferior. Of course, prejudiced people measure inferiority by how much others differ from them. I'm sure many people who are different from EAs view EAs as inferior.

Comment by austen_forrester on The Important/Neglected/Tractable framework needs to be applied with care · 2016-01-24T18:00:23.788Z · EA · GW

It may be more technically correct not to have neglectedness as a separate criterion, but I find that it is the single most important factor in cause prioritization, despite the fact that its only utility is its effect on the other two factors. Just my personal observation. For example, the causes that I think have the highest expected value (pesticide poisoning, self-harm/suicide, depression in the Third World, loneliness) are all great in magnitude and have huge potential for progress precisely because they have been severely neglected. That's why I've come to see neglectedness as the starting point in cause selection, even though I now view it as a subfactor rather than an independent factor. So technical correctness may not necessarily lead to the most usefulness.

Perhaps someone will come up with a linear process for strategic cause prioritization one day, rather than the current practice of referring to static criteria. For instance, the process could start with brainstorming what you think the biggest sources of suffering (or impediments to flourishing) in the world are, followed by an examination of why those sources exist, then move on to how they can be solved, and so on. It could be represented visually in a diagram. A process rather than set criteria could also be useful in unexpected ways. For example, I consider loneliness (lack of positive socialization or a romantic partner) to be one of the biggest social issues in the world because it is one of the largest sources of sorrow. However, most people wouldn't even think about taking action on it because they don't even view it as a social issue. Thinking logically, however, it must be a social issue because it causes so much suffering, but cause prioritization using set criteria would be unlikely to lead you to that conclusion without using some sort of critical-thinking process.

Similar benefits could probably be had without using a process by branching off each criterion into sub-components. For example, the importance bubble could have bubbles around it of “Why are people not happier?”, “What prevents flourishing?”, “Why is a society not better than it currently is?”, “What is an advancement in society that would lead to other improvements?”

Comment by austen_forrester on The Important/Neglected/Tractable framework needs to be applied with care · 2016-01-24T16:32:16.174Z · EA · GW

I completely agree that neglectedness is often the root cause of the size and tractability, but that's the whole point. Neglectedness only matters inasmuch as it affects the other two criteria and is included on its own mostly to aid analysis of tractability – it has no value on its own. For instance, if a cause is low in importance/scale and tractability, it wouldn't matter what the neglectedness is at all. I think the neglectedness factor only comes into play if a problem is important and tractable (including on the margin), yet crowded. In that case, even though it would be a high priority for an organization to enter because the only two relevant factors (scale and tractability) are high, if it is a very popular cause, it may become less tractable/important in the near future with more entrants into the space.

I think that the type of importance analysis depends on whether the intervention is scalable or systemic, not on whether it is new. In systemic action, you are tackling the entire issue, so you want to judge the importance of the issue in its entirety. Scalable action works on a per-unit basis, so the severity of the issue per person/animal/acre, etc., is what matters, as you mentioned regarding GiveDirectly.

Lastly, I think that flow-through effects should always be included among cause prioritization criteria, because (cheese warning) what matters at the end of the day is how much the entire world benefits from an intervention, not just how much it benefits the targeted cause. For instance, research intended to progress one subject often ends up providing unforeseen benefits in other areas.