Posts

EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship 2020-06-27T00:16:58.808Z · score: 43 (18 votes)
EA Angel Group: Applications Open for Personal/Project Funding 2019-03-19T18:29:32.777Z · score: 37 (24 votes)
EAs and EA Orgs Should Move Cash from Low-Interest to High-Interest Options 2019-02-23T12:24:02.970Z · score: 51 (26 votes)
Requesting community input on the upcoming EA Projects Platform 2018-12-10T17:41:32.212Z · score: 23 (16 votes)
Ideas for Improving Funding for Individual EAs, EA Projects, and New EA Organizations 2018-07-10T06:12:29.993Z · score: 22 (21 votes)

Comments

Comment by brendon_wong on How Much Leverage Should Altruists Use? · 2020-07-07T05:23:22.150Z · score: 1 (1 votes) · EA · GW
I don't really understand what you're saying here.

I meant that if a donor isn't going to make generic grants, like funding a GiveWell top charity, and will instead do things like fund small EA projects that might not otherwise be funded, then pursuing a more reliable investing approach would be a better bet.

If a non-generic donor pursues a risk-neutral approach or invests in a single asset class, bad investment outcomes could jeopardize their grantmaking. From the donor's perspective, the downside of being unable to fund a considerable number of projects likely outweighs the expected benefit of fractionally shifting the EA community as a whole towards more unusual asset classes.

That's true, I said that in the post. QMHIX/EQCHX might also have a worse ex-ante Sharpe ratio than GAA. The argument for investing in managed futures is that it has positive expected return (not guaranteed) and has basically zero correlation with stocks and bonds.

Right, and I stated that to emphasize the other approaches I mentioned later in my comment, which might have had, and may continue to have, decent returns with zero correlation.

VC is highly correlated with equities, and as an asset class, historically it has performed worse than the S&P 500.
Gold has only performed well relatively recently. In the long run, there is no reason to expect gold to have a real return above 0% because it doesn't generate any cash flows like stocks and bonds do, and it doesn't gain value over time except via inflation.

There are a wide variety of views on whether it is wise to invest in every single asset class, from Bitcoin, to gold and venture capital which I mentioned, to managed futures which you mentioned.

I don't have strong asset class views because Antigravity Investments follows the approach of using quantitative/evidence-based investing to allocate different amounts to different asset classes at various points in time rather than sticking with a particular one for the long haul. I think that writing off entire asset classes may not be a good approach due to the inherently challenging-to-predict nature of future investment returns.

Regarding venture capital, this document from Invesco (which is not trying to sell a VC investment) notes there was a -0.06 correlation between venture capital and large-cap equities from 1990-2014. That document also notes that "top quartile absolute returns for venture capital have historically exceeded those for other asset classes." Top-quartile outperformance in VC is especially interesting because unlike equity funds in recent decades, it seems like VC funds may experience consistent outperformance across time. Whether that's due to skill, simply having access to better networks and deal flow, or some combination of both is the question.

Gold definitely struggles with some of the issues commodities and currencies as a whole have (debatable long-term value), but it's also recommended by the founder of the largest hedge fund in the world, so I don't think there's no case to be made for it (not saying there is, either, since I don't hold strong asset class views). A 7.8% non-inflation-adjusted return from 1972 to 2020 with 0.02 market correlation doesn't seem that terrible, although there are rather awful, extended drawdowns of course. I'm not saying that gold is or isn't a good long-term investment, but clearly gold and other things that have uncertain intrinsic value like Bitcoin can be good investments if held during the appropriate times.

Comment by brendon_wong on Long-term investment fund at Founders Pledge · 2020-07-07T04:39:49.071Z · score: 3 (2 votes) · EA · GW

Thanks for sharing your thoughts! I think that making some fund distributions in the present also serves to demonstrate the decision making and grantmaking capabilities of the fund's grantmakers. Some donors might hesitate to donate to a fund that has not made any grants for, say, three decades, whereas having the fund make microgrants, or offering a version of the fund that makes grants, demonstrates that the fund has made and will continue to make a positive social impact.

Comment by brendon_wong on How Much Leverage Should Altruists Use? · 2020-07-04T01:58:14.533Z · score: 7 (2 votes) · EA · GW

I'm the founder of Antigravity Investments, an EA social enterprise and SEC-registered investment advisor with the mission of donating millions of dollars to high-impact causes by increasing returns on charitable capital held by donors, nonprofits, foundations, etc. We've advised over $20 million in charitable capital and been supported by CEA's EA Grants program, the Berkeley SkyDeck accelerator, American Express, and Ashoka.

If anyone would like implementation assistance, I think we're a good alternative to Alpha Architect—we also operate in the evidence-based investing space, and we provide free advising to EAs along with lower-cost investment management. We're an advisor on the Interactive Brokers platform and also support other brokerage firms like Vanguard.

I like the breadth of content covered in this post. Regarding implementation details, if a small donor is going to fund things that are different than what other EAs would typically fund—an approach that various EAs have advocated for and one that I personally support—then I think there's a strong argument to not "invest all your altruistic funds into a managed futures fund." Separately, I think there's a high likelihood that this approach (i.e. 100% in QMHIX or EQCHX) will underperform a balanced portfolio, GAA, and a lot of other approaches over a short-term, medium-term, and long-term timeframe.

If someone is taking the approach of diversifying into other assets that most of the money in EA is not invested in, I'm more enthusiastic about speculating in asset classes that have historically experienced good returns (venture capital or even gold), or perhaps more promising, investing in a market that isn't that efficient or that the investor believes they might have an edge in (cryptocurrencies, prediction markets, angel investing, etc).

I believe that Good Ventures' investment data may be available on their Form 990. I am writing an upcoming article on how EAs can use Form 990 data to increase funding for charitable causes, potentially by millions of dollars with only a few hours of effort. I will try to update this comment when my article is out.

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-07-01T05:28:15.872Z · score: 8 (3 votes) · EA · GW

That's great, I'm happy fiscal sponsorship exists within EA now! I'll definitely refer any projects I'm aware of. Now I'm wondering how long it'll take for DAFs to pop up!

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-06-28T19:21:37.451Z · score: 1 (1 votes) · EA · GW

I agree that the second bullet point is likely more novel/compelling. Regarding the first point, I think that barriers like "high minimums to create a DAF, high annual fees, high minimum grant amounts, high minimum maintenance amounts, and limited investment options" mentioned in my post may reduce the counterfactual amount that would otherwise go to DAFs from EA by a considerable degree, well beyond the $200,000 figure mentioned. For example, the minimum to create a DAF at Vanguard Charitable is $25,000, which is a large amount of capital for some people to set aside just to invest in something safe, like a 2% savings account or money market fund, prior to donation.

I think that some DAF applications I mentioned are more novel/compelling than others, such as allowing people to easily invest intended donations, fund their own future charitable work in a tax-deductible manner, and create additional "EA funds" offering a wider range of cause areas and methodologies beyond what the existing EA Funds offer.

I think that the second bullet point is viable so long as all appropriate best practices are followed, such as giving fair compensation for the level of future work (e.g. not paying people $500,000 a year for work that charities normally pay $50,000 a year for) and ensuring that all future work is actually charitable and appropriately documented.

I find it unlikely this will cause any PR issues unless this is actually broadly advertised to the general public, and even if so, it's important to note that this idea requires people to actually do charitable work in the future at a lower pay rate as opposed to simply saving for retirement in a 401(k) which offers similar tax and investing benefits. It only seems amazing to us because we would actually like to work full-time on charitable work in the future at a low pay rate—this is not an idea that seems popular in the mainstream.

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-06-28T18:50:52.283Z · score: 8 (3 votes) · EA · GW

Regarding fiscal sponsorship, some EA organizations like CEA and CFAR have done something similar, albeit to a highly limited extent. CEA and CFAR have occasionally hosted efforts seen as independent initiatives under their legal entity. I think this demonstrates the value fiscal sponsorship can bring to the community.

Fiscal sponsorship generally refers to the service of hosting independent efforts under the same legal entity, with the associated expectations that the sponsor:

  1. Explicitly offers fiscal sponsorship as a service
  2. Allows independent efforts to remain under the umbrella organization for an indefinite period of time
  3. Allows independent efforts to migrate their assets to another organization at any point in time
  4. Allows independent efforts to independently fundraise under their own brand name
  5. Offers an administrative portal, tools (such as expense reporting and fundraising portals), and procedures to reduce the administrative overhead of offering such a service
  6. Accepts applications to use the service on an ongoing basis
  7. Accepts a significant number of applicants onto the service to fulfill the service's goal of making it easier for people to launch social impact efforts
  8. Offers the service at scale to a large number of independent efforts
  9. Tracks the finances of each independent effort separately from other independent efforts
  10. Is financed through flat monthly fees and/or a percentage fee charged on incoming donations

CEA may have offered something similar at some point in time, but it doesn't seem like they are currently focused on doing fiscal sponsorship. To my understanding, CEA is not advertising or accepting applications for such a service and is only hosting a very small number of efforts that could be seen as independent, which are the most important distinctions. CEA probably offers at most expectations 2, 3, 4, 5, and 9.

Also, CEA does not look like it's in the business of offering DAFs (I can provide an enumerated list of DAF provider expectations if that would help clarify), although the EA Funds are vaguely reminiscent of "collective DAFs."

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-06-28T03:05:41.959Z · score: 1 (1 votes) · EA · GW

Have you chatted with John Beshir? He mentioned to me that he was working on setting this up as a trust in the UK in mid-2019.

Update: Some difficulties came up, so John is not actively pursuing this right now.

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-06-27T22:48:54.130Z · score: 1 (1 votes) · EA · GW

Thanks for sharing your thoughts! Which of the applications of fiscal sponsorship seem most promising to you?

Regarding DAFs, I'd have to do the math, but I think that the benefit of the $15,000–$25,000 setup cost could be recouped extremely quickly. For example:

  • If just $200,000 in intended donations for 2022 or later were counterfactually invested at a 7.5% ROI rather than held in cash, the initial investment would be recouped in a year ($200,000 * 0.075 = $15,000)
  • If someone in California was earning $250,000 a year and saved $50,000 in a DAF in both 2020 and 2021 to do direct work in 2022 and 2023, paying themselves $50,000 a year from their DAF, they would pay a tax rate of 19.46% in 2022 and 2023 on the $50,000 per year instead of paying a marginal tax rate of 46.65% if they saved $50,000 in 2020 and 2021 without depositing it into a DAF, leading to a $27,190 tax reduction ($100,000 * (0.4665-0.1946) = $27,190); see the sketch below
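
For reference, a minimal sketch of the arithmetic in the two examples above (the 7.5% ROI, tax rates, and dollar amounts are the illustrative assumptions used in this comment, not advice):

```python
# Rough verification of the two examples above. All figures are illustrative
# assumptions taken from this comment, not financial or tax advice.

# Example 1: investing intended donations instead of holding them in cash.
intended_donations = 200_000      # donations earmarked for 2022 or later
annual_roi = 0.075                # assumed 7.5% return vs. ~0% on cash
first_year_gain = intended_donations * annual_roi
print(f"Extra return in year one: ${first_year_gain:,.0f}")      # $15,000

# Example 2: deferring income via a DAF to fund future direct work.
saved_via_daf = 50_000 * 2        # $50,000 saved in each of 2020 and 2021
marginal_rate_now = 0.4665        # assumed marginal rate on a $250,000 salary
effective_rate_later = 0.1946     # assumed rate on $50,000/year of DAF-funded work
tax_savings = saved_via_daf * (marginal_rate_now - effective_rate_later)
print(f"Approximate tax reduction: ${tax_savings:,.0f}")         # $27,190
```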

I think it's highly unlikely an existing DAF provider will customize their offerings (for example by offering to pay individuals a salary directly to do charitable work instead of just regranting to 501(c)(3)s) because their expected benefit from doing so is simply too low to justify the time investment.

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-06-27T22:30:10.089Z · score: 1 (1 votes) · EA · GW

Are you referring to the DAF or FS side of things, or both? My prior was that it would be fairly straightforward because there are UK DAFs in existence, and CEA does both DAF-like and FS-like things to a limited extent (sponsoring EA orgs and running EA funds).

While CEA might have charitable purposes that seem restrictive, it doesn't seem like that's impacting their ability to try to do everything under the sun.

You tried to create a trust to do this before, but it was rejected because the charitable objects were too broad?

Comment by brendon_wong on EA could benefit from a general-purpose nonprofit entity that offers donor-advised funds and fiscal sponsorship · 2020-06-27T03:09:50.337Z · score: 1 (1 votes) · EA · GW

It's interesting you mention that! In my post I link to the article Long-term investment fund at Founders Pledge which relates to patient philanthropy, and I also left a comment on that article.

I think one of the many things an EA DAF provider would make possible is enabling people to set up DAFs that are designed to impact the far future. It might or might not be better to run a centralized fund as a separate entity. A separate entity might be able to focus more on survivability; however, DAF providers are by their very nature designed to last a long time, so offering a long-term fund as part of a DAF provider's offerings might be an even better way to guarantee that the fund lasts into the future. For example, Vanguard Charitable has billions of dollars within it, and is thus very likely to have an outsized impact and last for many years to come.

Comment by brendon_wong on EA Forum feature suggestion thread · 2020-06-27T01:50:38.630Z · score: 1 (1 votes) · EA · GW

Making it possible for people to add a bio in their profile (that supports external links) so people can get a better idea of someone's background and interests when reading posts and comments.

Comment by brendon_wong on Non-Profit Insurance Agency · 2020-06-26T22:21:09.457Z · score: 1 (1 votes) · EA · GW

I run Antigravity Investments, an SEC-registered RIA. Let me know if you'd be interested in collaborating!

Comment by brendon_wong on EA is risk-constrained · 2020-06-26T22:10:19.692Z · score: 9 (4 votes) · EA · GW

I think this could be set up by launching a 501(c)(3) as a donor-advised fund and fiscal sponsor and then setting up funds inside the entity that support specific purposes. For example, there could be a fund that pays a UBI to people working on high-impact entrepreneurship.

I welcome anyone to get in touch with me if they're interested in collaborating on and/or funding such a proposal (estimated setup cost of the entity and necessary legal work: $15,000–$25,000).

Edit: Was inspired to write an EA Forum post on this!

Comment by brendon_wong on What EA questions do you get asked most often? · 2020-06-26T21:52:41.067Z · score: 2 (2 votes) · EA · GW

Here are the early-stage funding opportunities I am aware of:

  • CEA's EA Funds, some of which provide early-stage funding, with occasional posts on the EA Forum announcing new rounds
  • CLR's Fund (formerly EAF Fund), which only funds longtermist projects

EA Grants and BERI Grants are no longer active.

Quite a few EAs, including me, have written about significant perceived bottlenecks in early-stage project funding.

Comment by brendon_wong on Problem areas beyond 80,000 Hours' current priorities · 2020-06-23T18:37:48.364Z · score: 6 (5 votes) · EA · GW

They are referring to financial investments (stocks, bonds, etc) as covered in the linked podcast episode with Philip Trammell.

Comment by brendon_wong on What EA questions do you get asked most often? · 2020-06-23T18:34:45.223Z · score: 2 (2 votes) · EA · GW

Unfortunately EA Grants is no longer operating.

Comment by brendon_wong on Aligning Recommender Systems as Cause Area · 2020-06-23T09:48:27.141Z · score: 1 (1 votes) · EA · GW
It’s likely that the flow-through effects on the rest of users’ lives will be even greater, if the studies showing effects on mental health, cognitive function, relationships hold out, and if aligned recommender systems are able to significantly assist users in achieving their long term goals. Even more speculatively, if recommender systems are able to align with users’ extrapolated volition this may also have flow-through effects on social stability, wisdom, and long-termist attitudes in a way that helps mitigate existential risk.

I am very interested in these sorts of positive effects of aligned recommender systems. In addition to improving people's effectiveness at large, I think they can be a valuable tool for improving individual/organizational decision making and personal productivity, which are EA focus areas.

I think that building a collaborative search engine is a tractable starting point and has the potential to improve information discovery within EA and in general—if anyone is interested in collaborating on this, please get in touch!

Comment by brendon_wong on Long-term investment fund at Founders Pledge · 2020-06-23T09:33:44.786Z · score: 8 (3 votes) · EA · GW

Here are a few ideas that come to mind.

It could be interesting to explore/offer funds with different distribution thresholds (for example, saving all funds for 100+ years out versus donating a small percentage every year or nearly every year while still letting assets compound) for donors that have different distribution preferences. Knowing your money will be used to better the world every year in the present while also compounding indefinitely into the future to help future generations may be appealing.

As an alternative to a fixed set of people governing the fund, it could be interesting to consider a model of collaborative democracy/liquid democracy in which donors influence fund decisions and distributions, with each donor's voting power determined by equal weighting, donation weighting, or some other mechanism. Succession could be easily incorporated into such a system, with one's votes being distributed in an even or preference-weighted fashion to living donors/stakeholders.
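
A toy sketch of what donation-weighted voting with succession could look like (the donors, amounts, and even-redistribution rule are hypothetical illustrations, not a concrete proposal):

```python
# Toy sketch of donation-weighted voting for fund decisions, with a deceased donor's
# voting power redistributed evenly among living donors. All names, amounts, and the
# redistribution rule are hypothetical illustrations.

donors = {
    "alice": {"donated": 10_000, "alive": True,  "vote": "distribute_1pct"},
    "bob":   {"donated": 40_000, "alive": True,  "vote": "hold"},
    "carol": {"donated": 50_000, "alive": False, "vote": None},  # votes pass on
}

living = {name: d for name, d in donors.items() if d["alive"]}
deceased_weight = sum(d["donated"] for d in donors.values() if not d["alive"])

# Each living donor's weight = own donations + an even share of deceased donors' weight.
weights = {
    name: d["donated"] + deceased_weight / len(living)
    for name, d in living.items()
}

tally = {}
for name, d in living.items():
    tally[d["vote"]] = tally.get(d["vote"], 0) + weights[name]

print(tally)  # {'distribute_1pct': 35000.0, 'hold': 65000.0}
```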

Having the fund structured as a corporate entity could be an interesting possibility as well; it seems some corporations have lasted for over 1,000 years. It should also be possible to set up different legal entities in different countries for maximum continuity (which also easily supports donors from different countries).

The fund could have staff that explore impact investing, such as directly funding high-impact startups or impacting the direction of corporations (private equity, shareholder activism, etc), so that the assets can be used to do good even as they compound indefinitely.

Edit to include a recent EA Forum post I wrote: Having the long-term investment fund be hosted within a more generalized entity (ideally one that is controlled by or aligned with the fund management), such as a provider of donor-advised funds, might increase the chances of the fund surviving into the far future, because the host entity would have a lot more assets and also a lot more living stakeholders at every point in time.

Comment by brendon_wong on EA Forum feature suggestion thread · 2020-06-23T08:10:15.729Z · score: 3 (2 votes) · EA · GW

I was initially thinking of including a link to the tags page in the sidebar on the home page, but that is another good idea as well. Including tags in the metadata subheader under article titles on the home page would also increase the prominence/usage of this feature.

Comment by brendon_wong on EA Forum feature suggestion thread · 2020-06-23T08:04:03.051Z · score: 1 (1 votes) · EA · GW

Can tags be linked to (this page) for easy access? How about grouping the tags into a hierarchy for ease of use and discovery, rather than just organizing them alphabetically?

Comment by brendon_wong on Investing to Give Beginner Advice? · 2020-06-23T07:41:48.786Z · score: 1 (1 votes) · EA · GW

I think that even within EA people will have varying opinions on investing, with a bent towards using standard low-cost index funds, employing leverage, and/or doing factor investing. I second the recommendation for Bogleheads to learn about implementing a standard investing approach. This 80,000 Hours post titled Common investing mistakes in the effective altruism community provides an introduction to alternative asset allocations, leverage, and factor investing.

I wrote an EA forum post that focuses on advising EAs to move cash into higher-interest accounts, but it covers various aspects of investing from donating appreciated securities to asset location in the appendix. I hope that is a helpful resource!

Using a donor-advised fund could make sense for donating later to gain some immediate tax benefits and to have the money compound with zero taxes.

I personally believe in using evidence-based investment approaches at an asset class level rather than at a securities level (for example, tolerance band rebalancing). The 80,000 Hours post references this at the end. Unfortunately, most of these approaches range in difficulty from being inconvenient to requiring an investing algorithm to implement. That's one of the reasons why I started an EA-aligned investment firm, Antigravity Investments, which helps EA organizations and donors address implementation barriers. Feel free to get in touch.
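
As an illustration of what tolerance band rebalancing involves, here is a minimal sketch; the 60/40 target weights and the 20% relative band are assumptions for the example, not a recommendation:

```python
# Minimal sketch of tolerance band rebalancing: only trade back to target weights
# when an asset class drifts outside its band (here, +/-20% of its target weight).

def rebalance_if_needed(holdings, targets, relative_band=0.20):
    """holdings: current dollar values by asset class; targets: weights summing to 1."""
    total = sum(holdings.values())
    weights = {asset: value / total for asset, value in holdings.items()}

    drifted = any(
        abs(weights[asset] - target) > relative_band * target
        for asset, target in targets.items()
    )
    if not drifted:
        return holdings  # within bands: do nothing (fewer trades, lower costs/taxes)

    # Outside a band: trade every asset class back to its target weight.
    return {asset: total * target for asset, target in targets.items()}

# Example: a 60/40 portfolio where stocks have drifted to 75% of the portfolio.
holdings = {"stocks": 75_000, "bonds": 25_000}
targets = {"stocks": 0.60, "bonds": 0.40}
print(rebalance_if_needed(holdings, targets))
# -> {'stocks': 60000.0, 'bonds': 40000.0}
```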

Comment by brendon_wong on EAs and EA Orgs Should Move Cash from Low-Interest to High-Interest Options · 2020-06-23T07:13:07.583Z · score: 1 (1 votes) · EA · GW

I'm happy you discovered my post! The general recommendations (e.g. high-interest accounts are better than low-interest accounts, particularly if both accounts have identical risk) hold true in essentially all market and interest rate conditions. The specific recommendations for bank accounts and investments can vary with time and interest rate changes.

For instance, some banks will offer higher yields than other banks at certain times due to their cost of borrowing, revenue when lending, and desired profit level. The expected returns and risk of investments can also change. Feel free to get in touch if you have specific questions or would like our latest guidance!

Comment by brendon_wong on EAGxVirtual Unconference (Saturday, June 20th 2020) · 2020-06-09T23:06:38.306Z · score: 31 (21 votes) · EA · GW

How financial improvements can counterfactually increase funding for EA charities by tens of thousands to millions of dollars per charity

I run Antigravity Investments, an EA social enterprise with the mission of indirectly donating millions to charity by helping charities invest more effectively. Last year, we published this EA Forum article explaining why charities should shift cash from low-interest to high-interest accounts: https://forum.effectivealtruism.org/posts/YjN6cGoXxPZeqCh4Z/eas-and-ea-orgs-should-move-cash-from-low-interest-to-high.

This talk will cover new research by Antigravity Investments on approximating the opportunity costs that charities incur by not following best practices in cash management, including applying our opportunity cost estimation methodology to selected EA charities as well as to a data set of over 300,000 U.S. charities.
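
For intuition, here is a minimal sketch of one way such an opportunity-cost estimate could be computed; the exact methodology is not described here, and the balance and yields below are illustrative assumptions:

```python
# Illustrative opportunity-cost estimate: how much interest a charity forgoes by
# holding cash in a low-yield account instead of a higher-yield alternative.
# All figures are assumptions for the example, not actual charity data.

def forgone_interest(cash_balance, current_yield, benchmark_yield):
    return cash_balance * max(benchmark_yield - current_yield, 0.0)

# Example: $2,000,000 in reserves earning 0.1% when ~2% is available elsewhere.
print(f"${forgone_interest(2_000_000, 0.001, 0.02):,.0f} per year")  # $38,000 per year
```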

We will also cover how our outreach strategy has fared over the past year, and perhaps most importantly, recommend concrete steps EA community members and operations/finance staff at EA organizations can take to increase funding for high-impact causes.

I am based on the West Coast and would prefer the late sessions.

Comment by brendon_wong on Do we have any recommendations for financial advisers for earning to give? · 2019-04-14T20:27:07.319Z · score: 2 (2 votes) · EA · GW

I run Antigravity Investments, an EA-aligned investing firm that helps EA nonprofits and individuals with investing. Our EA Forum article with public recommendations is mostly focused on cash management, although Appendix B discusses higher-EV investing options.

We give free advice and typically charge a low fee for directly managing portfolios. Feel free to reach out at support@antigravityinvestments.com.

For a DIY approach in the United States, we recommend a portfolio of low-fee ETFs. Holding ETFs directly makes it possible to donate investments that have gone up in value without paying capital gains taxes, which is an optimal way to donate.
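
As a rough illustration of why donating appreciated shares directly is optimal, here is a minimal sketch; the cost basis, market value, and capital gains rate are assumptions for the example, not tax advice:

```python
# Donating appreciated shares vs. selling them and donating the after-tax proceeds.
# Figures are illustrative assumptions, not tax advice.

cost_basis = 10_000          # assumed purchase price of the ETF shares
market_value = 15_000        # assumed current value at donation time
cap_gains_rate = 0.15        # assumed long-term capital gains rate

# Sell first, then donate: capital gains tax reduces what the charity receives.
tax_if_sold = (market_value - cost_basis) * cap_gains_rate
donation_if_sold = market_value - tax_if_sold

# Donate the shares directly: no capital gains tax, so the full value goes to the
# charity (and the donor can generally still deduct the full market value).
donation_if_shares = market_value

print(f"Donate cash after selling: ${donation_if_sold:,.0f}")    # $14,250
print(f"Donate shares directly:    ${donation_if_shares:,.0f}")  # $15,000
```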

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-28T18:16:28.694Z · score: 1 (1 votes) · EA · GW

Thanks for asking! At this time we do not specifically limit the types of early-stage high-impact activities that can apply. Early-stage nonprofits, for-profits, and personal projects would all fall under the scope of acceptable activity types.

Comment by brendon_wong on $100 Prize to Best Argument Against Donating to the EA Hotel · 2019-03-28T07:47:13.282Z · score: 21 (13 votes) · EA · GW

The following arguments are ideas and have not been thoroughly researched. They may not reflect my actual views. Counterarguments are not mentioned because the OP is "mainly interested in seeing critiques." I may post counterarguments after the reward deadline has passed.

Claim to argue against: "$172,000 to the EA Hotel has at least as much EV as $172,000 distributed randomly to grantees from EA Meta Fund grantees or EA grants grantees."

Argument 1: The EA Hotel has a low counterfactually-adjusted impact

In this post, the EA Hotel states:

Out of 19 residents, 15 would be doing the same work counterfactually, but the hotel allows them to do, on average, 2.2 times more EA work -- as opposed to working a part time job to self-fund, or burning more runway.

This datapoint supports the view that most EA Hotel residents would be doing the same work whether or not they stay at the hotel. The claim that "the hotel allows them to do, on average, 2.2 times more EA work" could be incorrect. To gain more certainty about this, the EA Hotel should track what residents that are not accepted actually end up doing instead.

EA Hotel residents have many options to consider to do the same work while not staying at the hotel. For example, depending on the time and location requirements of the work, they could do some combination of: (1) part-time work to finance their living expenses, (2) living with parents, friends, or another location with near-zero living expenses, or (3) living in very low-cost housing that resembles the cost of the EA Hotel.

If someone pursues option (2), funding their stay at the EA Hotel is negative EV, because they could choose a free option instead of the EA Hotel, which consumes community funds.

If someone pursues options (1) and (3), they might only have to work a very limited amount of time. For example, I believe I recently heard of someone who was able to find a one-bedroom living arrangement in a large house in Berkeley, CA for $500 a month, although they have to share a bathroom with many people. So someone might only need to do paid work 25% of the time and can do EA work 75% of the time. This suggests that the "2.2 times more EA work" figure greatly overstates the benefit of the EA Hotel in terms of reducing living expenses. Pursuing options (1) and (3) seems to be feasible for the vast majority of people.
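
A rough sketch of the arithmetic behind the "paid work 25% of the time" claim, reusing figures from elsewhere in this comment; the $15/hour wage and ~$7,900/year of living costs are illustrative assumptions:

```python
# Back-of-the-envelope check: what fraction of a full work year covers living costs
# comparable to an EA Hotel stay? Figures are illustrative assumptions from this
# comment, not a budget.

living_costs_per_year = 7_900     # roughly the cost of a one-year EA Hotel stay
hourly_wage = 15                  # assumed part-time wage
work_year_hours = 2_000           # ~40 hours/week for 50 weeks

hours_needed = living_costs_per_year / hourly_wage
paid_fraction = hours_needed / work_year_hours
print(f"Paid work needed: {paid_fraction:.0%} of a full work year")  # ~26%
```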

If direct funding allows people to pursue option (3) and secure low-cost housing, and if the cost is around the same as the EA Hotel, there may be no need for the EA Hotel itself to exist. The question becomes: what is the counterfactually-adjusted impact of funding living expenses at the EA Hotel compared to option (3)? Adjustments should be made for things like missing out on the benefits of living somewhere other than Blackpool, as well as relocation time and expenses, which would further reduce counterfactual impact. The EA Hotel community certainly provides benefits, although coworking out of REACH may offer similar benefits.

Argument 2: The EA Hotel should charge users directly instead of raising funding

Rather than fundraising from EAs, the hotel should try to directly charge people who are benefiting from their services and community, which is an argument against donating to the hotel.

There doesn't seem to be a need to fund people who can afford the hotel. It's not clear what proportion of people fall under this category, but considering that it only takes 13 weeks of full-time work at $15/hour to pay $7,900 for a one-year stay at the hotel, it is possible that the majority of residents can already afford to stay at the hotel.

For people who cannot afford the EA Hotel, applicants to funding organizations like EA Grants can note that they are requesting funding for living expenses and include EA Hotel expenses as part of their requested grant funding. EA Grants evaluators and other funders may be better equipped than EA Hotel staff to evaluate the EV of the projects people are working on. If EA Grants can already cover this, there is no need to donate to the EA Hotel.

Argument 3: Funding projects has a higher impact than funding living expenses

I assume that EA Grants funds applicants' project expenses as well as their personal salary and living expenses. This could be higher impact than solely funding living expenses. Working at the EA Hotel with an unfunded project may be quite unproductive, particularly if the project requires funding to get anywhere. Seeking early-stage EA project funding seems to consist mostly of waiting for long periods of time (perhaps months) for funders to get back to you, rather than working full-time trying to acquire funding.

Argument 4: People should not donate to the EA Hotel until they improve their impact metrics and reporting

The EV estimation for the EA Hotel is highly mathematical and commenters have expressed that it is difficult to follow. Actual impact reporting appears to consist of testimonials which are hard to evaluate. It's even trickier to evaluate the counterfactually-adjusted impact.

Comment by brendon_wong on Mental support · 2019-03-28T06:34:47.363Z · score: 1 (1 votes) · EA · GW

There is probably a nontrivial number of people who do not seek support due to the presence of a fee, even if they can theoretically afford it (see trivial inconveniences). Unfortunately, I've seen this happen in practice.

Comment by brendon_wong on Bayesian Investor proposes you can predictably beat the market by ~3% following a simple and easy strategy · 2019-03-27T19:13:29.507Z · score: 2 (2 votes) · EA · GW

The potential downside (and upside) of diversifying by adding some tilts and consistently sticking with them is limited, so I don’t see a major problem with “non-advanced investors” following the advice. Investors should be aware of things like rebalancing and capital gains tax; perhaps “intermediate investor” is a better term.

Comment by brendon_wong on Bayesian Investor proposes you can predictably beat the market by ~3% following a simple and easy strategy · 2019-03-27T06:06:39.175Z · score: 2 (2 votes) · EA · GW

It takes a certain degree of investment knowledge and time to form an opinion about the historical performance of different factors and expected future performance. It also requires knowledge and time to determine how to appropriately incorporate factors into a portfolio and how to adjust exposure over time. For example, what should be done if a factor underperforms the market for a noticeable period of time? An investor needs to decide whether to reduce or eliminate exposure to a factor or not. Holding an investment that will continue to underperform is bad, but selling an investment that is experiencing cyclical underperformance is a bad timing decision which will worsen performance each time such an error is made.

As a concrete example, the momentum factor has had notable crashes throughout history that could cause concern and uncertainty among investors who were not expecting that behavior. Decisions to add factors to portfolios need to take into account maintaining an appropriate level of diversification, tax concerns (selling a factor fund could incur capital gains taxes, and factor mutual funds will pass the capital gains they incur while following factors on to investors, whereas factor ETFs almost certainly won't), and the impact of fees, among other considerations.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-22T19:55:47.134Z · score: 5 (5 votes) · EA · GW

This post was intended as a grant application announcement that also happened to contain some information about new funder-friendly and applicant-friendly policies we are adopting. I did not include any information about our evaluation process or risk reduction process in the body of the post, so I would not expect the post to convey much awareness of either of the reasons why long-termist applications don't get funded.

I am curious which of the ideas we included you think address your first point about grantmakers being unable to vet the project. I'm not sure if application sharing, rolling applications, or providing feedback to grant applicants address your first or second points.

To elaborate more on risk, I wrote in another comment on this post that:

We have several layers of checks to help reduce risks and improve grant decision making including initial staff review of incoming applications, angels sharing their evaluations with one another and talking with external contacts/experts if appropriate, and hearing opinions of external grantmakers on grant applications we have received (we still need to talk with grantmakers to set this up).

I think that an initial staff review can help detect risks, and if we notice a large problem with downside risk in incoming projects, we can enhance the initial staff review process. The angel evaluation period is where a lot of nuanced considerations about risk can come up, since angels can share their perspectives on a grant proposal with other angels and external experts, and we have angels with significant experience in areas like meta and AI. Finally, this wasn't mentioned in the post, but we are aiming to share evaluations both ways with funders in EA. I think this can go a long way towards making all funders aware of all of the potential risks of a project.

Angels in the group seem to actively avoid funding projects that they feel they are not qualified to evaluate. Angels can point out funding behavior that they perceive is risky from other angels, although from what I've seen, our angels lean more on the side of risk avoidance than anything else.

High-quality grant applications tend to get funded quickly and are thereby eliminated from the pool of proposals available to the EA community, while applicants with higher-risk proposals tend to apply/pitch to lots of funders. This means that on average, proposals submitted to funders will be skewed towards high-downside-risk projects, and funders could themselves easily do harm if they end up supporting many of them. I'd be interested in your thoughts on that.

As Denise mentioned in a post on Jan's project evaluation idea, there is a category of project that is "projects which are simply bad because they do have approximately zero impact, but aren't particularly risky. I think this category is the largest of the four." This lines up with many of the applications I am seeing. This might be different with long-term/x-risk projects specifically, but since we are a general funding group with individual EA funders with a wide variety of backgrounds and experiences, we are not receiving a large number of such applications relative to the entire pool of applications.

Therefore, I wouldn't say that our applications are likely to be "skewed towards high-downside-risk projects." I expect to continue to receive a large number of projects that may have very low impact, just like other funders are likely receiving. As Oliver mentioned, "in practice I think people will have models that will output a net-positive impact or a net-negative impact, depending on certain facts that they have uncertainty about, and understanding those cruxes and uncertainties is the key thing in understanding whether a project will be worth working on." I think that other EA funders will fund projects that match their own models, but because people's models differ wildly and are very likely wrong in many cases (as evidenced by the high failure rate of startups funded by even the most successful VCs), I don't know if other funders are actually funding a significant fraction of the opportunities that end up having the highest impact.

To my understanding, EA Grants is the only other funder making general grants, with BERI Grants and the EAF Fund focusing exclusively on long-term projects, and the EA Funds focusing on their respective areas and funding larger organizations as well. Since EA Grants is currently closed for applications (I support rolling applications rather than application rounds), we are receiving applications that have not been funded by other funders because the only other funder isn't accepting applications right now. This is why I support funder application sharing: with it, funders will be able to see the entire pool of proposals, rather than the pool minus the projects other funders have already funded. This will also help each funder evaluate the quality of the projects they are funding relative to the quality of the projects other funders have funded.

I really like that you're providing feedback to applicants! In general, I wish the EA community was more proactive with providing critical feedback.

Thanks! I completely agree.

Comment by brendon_wong on Request for comments: EA Projects evaluation platform · 2019-03-22T19:31:42.841Z · score: 3 (3 votes) · EA · GW
I think it is fair to say you expected very low risk from creating an open platform where people would just post projects and seek volunteers and funding, while I expected with minimum curation this creates significant risk (even if the risk is coming from small fraction of projects). Sorry if I rounded off suggestions like "let's make an open platform without careful evaluation and see" and "based on the project ideas lists which existed several years ago the amount of harmful projects seems low" to "worrying about them is premature".

The community has already had many instances of openly writing about ideas, seeking funding on the EA Forum, Patreon, and elsewhere, and posting projects in places like the .impact hackpad and the currently active EA Work Club. Since posting about projects and making them known to community members seems to be a norm, I am curious about your assessment of the risk and what, if anything, can be done about it.

Do you propose that all EA project leaders seek approval from a central evaluation committee or something similar before talking with others about and publicizing the existence of their project? This would greatly concern me because I think it's very challenging to predict the outcomes of a project, as evidenced by the fact that people have wildly different opinions on how good an idea or startup is. Such a system could be very negative EV: it could greatly reduce the number of projects being pursued by providing initial negative feedback that doesn't reflect how a project would have turned out, or decrease the success of projects because other people are afraid to support a project that did not get backing from the evaluation system. I expect significant inaccuracy from my own project evaluation system as well as the project evaluation systems of other people and evaluation groups.

Thanks - both of that happened after I posted my comment, and also I still do not see the numbers which would help me estimate the ratio of projects which applied and which got funded. I take as mildly negative signal that someone had to ask, and this info was not included in the post, which solicits project proposals and volunteer work.
In my model it seems possible you have something like chicken-and-egg problem, not getting many great proposals, and the group of unnamed angels not funding many proposals coming via that pipeline.
If this is the case and the actual number of successfully funded projects is low, I think it is necessary to state this clearly before inviting people to work on proposals. My vague impression was we may disagree on this, which seems to indicate some quite deep disagreement about how funders should treat projects.

I wrote about the chicken and the egg problem here. As noted in my comments on the announcement post, the angels have significant amounts of funding available. Other funders do not disclose some of these statistics, and while we may do so in the future, I do not think it is necessary before soliciting proposals. The time cost of applying is pretty low, particularly if people are recycling content they have already written. I think we are the first grantmaking group to give all applicants feedback on their application, which is valuable even if people do not get funded.

The whole context was, Ryan suggested I should have sought some feedback from you. I actually did that, and your co-founder noted that he will try to write the feedback on this today or tomorrow, on 11th of Mar - which did not happen. I don't think this is large problem, as we had already discussed the topic extensively.

Ben commented on your Google Document that was seeking feedback. I wouldn't say we've discussed the topic "extensively" in the brief call that we had. The devil is in the details, as they say.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-22T07:23:23.213Z · score: 1 (1 votes) · EA · GW

John Maxwell brought up some interesting points. He suggests that platforms can experience the chicken and egg problem when it comes to getting started, and that intensive networking is a way to overcome this issue. I agree that platforms often have this problem, but the EA Angel Group resolved this not by networking intensely but instead by offering a lot of value to angels. This incentivizes them to join the platform even without a large number of existing grant applicants, which in turn incentivizes grant applicants to apply.

Of course, we do need a stream of incoming grant applications to remain viable, and unfortunately we encountered some unexpected issues when attempting to collaborate with EA Grants and speak to many community members as part of several strategies to acquire grant applications. As mentioned in my progress update comment, I am currently pursuing alternate strategies to achieve this objective which involve steps that I have greater control over (and fewer steps that require the approval of entities whose decisions I cannot influence). That being said, I think networking and collaboration are highly valuable, and I am scaling those up even as I pursue strategies that do not require networking to succeed.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-22T07:04:47.552Z · score: 2 (2 votes) · EA · GW

I wrote a progress update comment regarding the EA Angel Group which covered our grant opportunity discovery activities over the last few months. We spoke with EA Grants several months ago, and to the best of my knowledge they are still determining whether to send and receive grant applications with other funders. At least one major funding group has expressed significant interest in sending and receiving grant applications with the EA Angel Group, and we are in the process of talking with various funders about this.

I mentioned the one concern I heard and my response to it in my progress update comment:

One objection to sharing grant applications among funders is that a funder would fund all of the grant proposals they felt were good and classify all other grant proposals as not suitable to be funded. From the funder's perspective, sharing the unfunded grant proposals would be bad since other organizations could subsequently fund them, and the funder classified those grant proposals as not worth funding. I personally disagree with this objection because the argument assumes that a funder has developed a grant evaluation process that can actually identify successful projects with a high degree of accuracy. Since the norm in the for-profit world involves large and successful venture capital firms with lots of experienced domain experts regularly passing on opportunities that later become multibillion-dollar companies, I find it unlikely that any EA funding organization will develop a grant evaluation process that is so good it justifies hiding some or all unfunded applications.

Can you elaborate on:

I think for example that a ‘just-another-universal-protocol’ worry would be very reasonable to have here.

Are you suggesting that funders may be concerned about adopting a protocol which ends up providing limited value? As I've stated in several other comments, I think sharing grant applications can be of considerable value since arbitrarily limiting the pool of projects seems pretty suboptimal.

To avoid that I think we need to do the hard work of reaching out to involved parties and have many conversations to incorporate their most important considerations and start mutually useful collaborations. I.e. consensus building.

I agree. I did some initial outreach at first and will begin additional outreach shortly.

Comment by brendon_wong on Request for comments: EA Projects evaluation platform · 2019-03-22T06:06:58.377Z · score: 8 (5 votes) · EA · GW

Thanks for pointing that out! Jan and I have also talked outside the EA Forum about our opinions on risk in the EA project space. I've been more optimistic, in that I think negative EV projects are less prevalent, so I thought there was a chance that this greater optimism was being misinterpreted as a lack of concern about negative EV projects, which isn't my position.

Comment by brendon_wong on Request for comments: EA Projects evaluation platform · 2019-03-21T20:08:04.055Z · score: 14 (9 votes) · EA · GW
We had some discussion with Brendon, and I think his opinion can be rounded to "there are almost no bad projects, so to worry about them is premature". I disagree with that.

I do not think your interpretation of my opinion on bad projects in EA is aligned with what I actually believe. In fact, I stated my opinion in writing in a response to you two days ago, and it deviates significantly from your interpretation.

I never said that there are "almost no bad projects." I specifically said I don't think that "many immediately obvious negative EV projects exist." My main point was that my observations of EA projects in the entire EA space over the last five years do not line up with a lot of clearly harmful projects floating around. This does not preclude the possibility of large numbers of non-obviously bad projects existing, or small numbers of obviously bad projects existing.

I also never stated anything remotely similar to "to worry about [bad projects] is premature." In fact, my comment said that the EA Angel Group helps prevent the "risk of one funder making a mistake and not seeking additional evaluations from others before funding something" because there is "an initial staff review of projects followed by funders sharing their evaluations of projects with each other to eliminate the possibility of one funder funding something while not being aware of the opinion of other funders."

I believe that being attentive to the risks of projects is important, and I also stated in my comment that risk awareness could be of even higher importance when it comes to projects that seek to impact x-risks/the long-term future, which I believe is your perspective as well.

Also, given the Brendon's angel group is working, evaluating and funding projects since October, I would be curious what projects were funded, what was the total amount of funding allocated, how many applications they got.

Milan asked this question and I answered it.

Based on what I know I'm unconvinced that Brendon or BERI should have some outsized influence how evaluations should be done; part of the point of the platform would be to serve broader community.

I'm not entirely sure what your reasons are for having this opinion, or what you even mean. I am also not exactly sure what you define as an "evaluation." I am interpreting evaluations to mean all of the assessments of projects happening in the EA community from funders or somewhat structured groups designed to do evaluations.

I can't speak for BERI, but I currently have no influence on how evaluations should be done, and I also currently have no interest in influencing how evaluations should be done. My view on evaluations seems to align with Oliver Habryka's view that "in practice I think people will have models that will output a net-positive impact or a net-negative impact, depending on certain facts that they have uncertainty about, and understanding those cruxes and uncertainties is the key thing in understanding whether a project will be worth working on." I too believe this is how things work in practice, and evaluation processes seem to involve one or more people, ideally with diverse views and backgrounds, evaluating a project, sometimes with a more formalized evaluation framework taking certain factors into account. Then, a decision is made, and the process repeats at various funding entities. Perhaps this could be optimized by having argument maps or a process that involves more clearly laying out assumptions and assigning mathematical weights to them, but I currently have no plans to try to go to EA funders and suggest they all follow the same evaluation protocol. Highly successful for-profit VCs employ a variety of evaluation models and have not converged on a single evaluation method. This suggests that perhaps evaluators in EA should use different evaluation protocols since different protocols might be more or less effective with certain cause areas, circumstances, types of projects, etc.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-21T19:51:17.603Z · score: 11 (5 votes) · EA · GW

That is correct! The EA Angel Group is designed to help individual funders who are already making grants discover more opportunities and hear from other funders about the possible benefits and risks of individual funding opportunities. Many people in the angel group have been heavily involved with the EA community for many years and have a history of making successful grants. Analogous to a for-profit angel group, we do not force angels to do everything through our group; we just seek to add value by helping people fund better opportunities through improving opportunity discovery, evaluation, and funding processes.

We have several layers of checks to help reduce risks and improve grant decision making including initial staff review of incoming applications, angels sharing their evaluations with one another and talking with external contacts/experts if appropriate, and hearing opinions of external grantmakers on grant applications we have received (we still need to talk with grantmakers to set this up).

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-21T19:26:18.346Z · score: 1 (1 votes) · EA · GW

Thanks for the suggestion Remmelt! I just added your primary wording recommendation to the post.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-21T19:19:39.732Z · score: 6 (5 votes) · EA · GW

According to information that we requested from angels around our launch in October 2018, our individual funders had ~$600,000 in available capital to make early-stage grants for the remainder of 2018. Angels have been making grants during the time the group has been operating, although I am not sure of the exact volume aside from the fact that one angel recently made a grant of ~$25,000 to a project.

I am not sure of the exact volume because angels have not yet made a grant to a project that came through our grant application form. This is because we had lower-than-expected grant application volume: we were unexpectedly delayed for many months pursuing grant sharing with EA funders and trying to launch the EA Projects Platform, rather than doing a public call for applications and working with volunteers to source evaluations. We are now switching to doing public requests for proposals and active grant opportunity sourcing, which I expect will significantly increase the number of grant opportunities we can present to angels. We are continuing to talk with EA funders about grant sharing, and one major funder just expressed an interest in sharing grant applications, so things may be moving forward on that front.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-21T03:07:02.422Z · score: 9 (4 votes) · EA · GW

Bringing up that possible concern is a good point Remmelt! My paragraph was specifically suggesting that established EA funders should share applications with one another. As I mentioned in my comment to Ruth, if the application systems of 5 funders capture equal fractions of all projects in existence, each funder would only be able to make funding decisions with a pool of projects that is 1/5 the size of the total number of opportunities. Arbitrarily limiting the pool of projects to evaluate seems clearly suboptimal.

I agree that people may be concerned about inexperienced EA funders making unwise funding decisions. People with that concern should actually be supporting the EA Angel Group, because if they had read our introductory article or my recent comment about this, they might have realized that:

the EA Angel Group [has] an initial staff review of projects followed by funders sharing their evaluations of projects with each other to eliminate the possibility of one funder funding something while not being aware of the opinion of other funders.

We help individual funders of all experience levels avoid issues like the unilateralist’s curse by benefiting from the perspectives of other funders. Funders can point out potential risks or downsides of a project and strongly warn each other against funding a project that appears to have a material chance of causing significant harm.

But that’s just a guess and I don’t really know. I do share in the sentiment that the option to downvote something is too easy for people who pattern-match abstract EA ideas like that, instead of putting in the somewhat strenuous and vulnerable work of sharing their impressions and asking further in the comment section about how the platform concretely works.

It is unfortunate that people may be downvoting without engaging in what is actually being proposed. I think that asking good questions or commenting is far better for everyone involved than giving a strong downvote based on a quick impression (possibly wiping out several standard upvotes) and leaving.

@Brendon, I thought you tried to address possible risks of a making applications available online in a previous post.

That is correct; I wrote about that in my post about the EA Projects Platform, which, as I recently mentioned, has been indefinitely delayed. The EA Angel Group does not make projects available online and was not designed to do so.

How do you think right now about how to address funder blindspots in built-up knowledge and evaluation frameworks – for both established EA grantmakers and new venture capitalist-style funders (who might have valuable for-profit start-up experience to build on)?

I don't have a readily prepared analysis of how to address funder blindspots. One thing that might help reduce them would be having funders share evaluations with one another, so that if one funder recognizes a potential risk that is hard to detect, other funders can factor it into consideration as well. To prevent groupthink, funders should use a process where they conduct an initial or full evaluation before seeing what other funders think about a proposal.

Can you elaborate on what a "new venture capitalist-style funder" is? I'm not sure what this refers to; I believe the EA early-stage funding space currently consists of a small number of entities like EA Grants and BERI grants and a larger number of individual donors.

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-21T02:42:42.591Z · score: 8 (4 votes) · EA · GW

Thanks for sharing your thoughts, Ruth! I agree; I was surprised both by the negative votes and by the lack of comments, particularly since our original article announcing the EA Angel Group was received quite positively. I linked to the EA Forum post introducing the EA Angel Group at the beginning of the article. I felt that if people had thoughts or concerns about the idea of the angel group they could comment or vote on the original angel group article, but that article had no new votes or comments.

Regarding the many different funding systems and separate application forms that currently exist across EA, I wholeheartedly agree with your perspective. Simplifying a bit, if we assume there are 3 EA funders and 15 EA projects, and each funder's application captures an equal fraction of all projects, then each funder can only make funding decisions from its pool of 5 projects rather than the 15 projects that exist. Choosing the best projects to fund out of a smaller set that is effectively randomly selected from the larger pool seems clearly suboptimal.
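To make that intuition concrete, here is a minimal simulation sketch (my own illustration, not from the original discussion) in which project "impact scores" are hypothetical random numbers: when applications are split evenly across three funders that don't share them, the best project each funder can find is, on average, noticeably worse than the best project in the full pool.

import random

# Illustrative only: hypothetical uniform "impact scores", 15 projects,
# 3 funders, applications split evenly and not shared.
random.seed(0)

def average_best(num_projects=15, num_funders=3, trials=10_000):
    split_total, pooled_total = 0.0, 0.0
    for _ in range(trials):
        scores = [random.random() for _ in range(num_projects)]
        pool_size = num_projects // num_funders
        split_total += max(scores[:pool_size])   # a funder sees only its own slice
        pooled_total += max(scores)              # shared applications: full pool
    return split_total / trials, pooled_total / trials

print(average_best())  # roughly (0.83, 0.94) under these assumptions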

Comment by brendon_wong on EA Angel Group: Applications Open for Personal/Project Funding · 2019-03-21T02:21:00.792Z · score: 6 (5 votes) · EA · GW

Our main objective is to fund the highest impact projects regardless of form. We have historically received several applications from EAs working on projects that are structured as for-profit entities.

Comment by brendon_wong on Sharing my experience on the EA forum · 2019-03-19T08:09:06.785Z · score: 6 (6 votes) · EA · GW

I feel similarly! I like your idea of a prompt before downvoting new users. Perhaps, in general, a message requiring no user action could appear whenever a downvote is made, encouraging people to pair the downvote with an explanatory comment when the reason for downvoting isn't obvious (i.e., it hasn't already been expressed in a comment).

Comment by brendon_wong on Concept: EA Donor List. To enable EAs that are starting new projects to find seed donors, especially for people that aren’t well connected · 2019-03-19T07:33:10.423Z · score: 14 (14 votes) · EA · GW

I'd like to point out for the benefit of other forum readers that EAs have different views on the average expected value of projects, variance in expected value of projects, and prevalence and severity of negative expected value projects. Based on the applications that the EA Angel Group has received, as well as lengthy lists of projects that have existed or currently exist in the EA community, at present I do not think that many immediately obvious negative EV projects exist (it is possible that people come up with negative EV projects but then receive feedback on potential harms prior to the project's existence becoming known to many people). I have seen a lot of projects that could potentially have a near-zero EV by failing to achieve their intended objectives or underperforming a top-rated EA charity, but people will often have highly varied opinions on a project's EV.

Jan has a focus on x-risk and the long-term future. A project seeking to directly impact x-risks by doing something like AI safety research has not yet applied to the EA Angel Group, and I have rarely if ever seen projects like that in EA project lists. It is possible that people behind such projects are already aware of the risks of sharing information, or do not see the need to share the project's existence with many people or to apply for early-stage funding from funders that do not focus exclusively on x-risk. It is possible that complex projects doing direct work to impact the long-term future have a greater potential to create harm and should be reviewed more rigorously.

If most EA projects are EV-positive and in need of funding, then this article's suggestion is likely net positive. Also, essentially all the individual funders I've spoken with already consult other funders and experts as needed before making funding decisions. If this is the norm, which I think it is, the unilateralist's curse is much less likely to occur in practice with regard to EA project funding.

Most importantly, this article’s proposal will probably only have a marginal impact on project and funder discoverability. Historically, many resources have existed online to enable EAs and funders to discover projects, like the .impact Hackpad (which shut down when Hackpad was acquired), various lists of projects that have popped up on the EA forum and elsewhere on the internet, and the EA Work Club. Announcing that a project exists or is seeking funding is simply sharing information, and there doesn't appear to be any easy way to prevent people from sharing information if they want to.

Therefore, I do not think it is fair to label this proposal a "bad idea." Implementing it only makes it marginally easier for funders to learn about projects than existing methods do, such as posting a project idea (and even a funding request) directly on the EA Forum, as has been done many times in the past. Someone sufficiently motivated to seek funding can simply speak with many of the EAs they encounter and ask directly, sidestepping this article's list of funders entirely.

Nevertheless, there still may be a risk of one funder making a mistake and not seeking additional evaluations from others before funding something. That is why I created the EA Angel Group, which has an initial staff review of projects followed by funders sharing their evaluations with each other to eliminate the possibility of one funder funding something while unaware of the opinions of other funders. A setup like the EA Angel Group is safer than publicly posting everyone's contact information online and seems to achieve the same overall objectives as this article's proposal.

Comment by brendon_wong on Bayesian Investor proposes you can predictably beat the market by ~3% following a simple and easy strategy · 2019-03-17T17:18:21.585Z · score: 3 (3 votes) · EA · GW

There are various frameworks, like the transtheoretical model (TTM), that try to explain why individual behavior change is difficult. There are many prerequisites to change: making people aware there may be an issue in the first place, convincing them that the possible problem is an actual problem, persuading them that the issue is urgent enough to work on in the near future, and helping them develop an effective plan of action. There are reasons people stall at every step, like smokers believing that smoking is not harmful to them, or a perceived lack of urgency or time to implement changes in the near future. This problem may be magnified within organizations, because multiple people often need to agree that change is necessary and should be implemented before anything gets done, and anyone in the chain of command who disagrees can prevent the intended change from happening.

Comment by brendon_wong on EA is vetting-constrained · 2019-03-17T12:28:28.208Z · score: 8 (5 votes) · EA · GW

I am unclear on whether or not the main constraint of evaluating EA projects in general is the "time of senior people with domain expertise." For-profit venture capitalists are usually not the world's leading experts in a particular area. Domain familiarity is valuable, but it does not seem like a "senior" or "expert" level of domain knowledge is all that helpful in assessing the likelihood of something succeeding or not. Like VCs, many EA funders I've spoken with rely strongly on factors that do not require a high level of domain familiarity to determine whether or not to fund a project, such as the strength of the founding team. Some amount of domain expertise may be helpful in evaluating certain types of highly complex or research-heavy projects, but most of the projects that I've seen and that other funders are funding do not seem to involve this level of deep domain complexity.

Comment by brendon_wong on EA is vetting-constrained · 2019-03-17T11:43:19.099Z · score: 13 (10 votes) · EA · GW

To provide more information on the status of the EA Angel Group: Benjamin Pence and I are working together on the EA Angel Group (and its parent project Altruism.vc). The EA Angel Group is operating, although it has received a lower-than-expected number of referrals from angels within the group, which has significantly reduced the benefit that the group currently provides to its members.

I anticipated this concern months ago and tried to resolve the issue, but was delayed by ~5 months in our attempt to discuss sharing grant proposals with EA Grants. I felt that sharing grant proposals would be more efficient than launching our own "competing" grant application. I think a common app with rolling submissions is a much more sensible idea than many separate applications, none of which share the proposals they receive with other funders. To my understanding, EA Grants currently doesn't have an opinion on whether sharing grant applications with other funders is a good idea, and it is unclear when they will develop one.

One objection to sharing grant applications among funders is that a funder will fund every proposal they consider good and classify the rest as not worth funding; from that funder's perspective, sharing the unfunded proposals would be bad, since other organizations could subsequently fund projects the funder had judged not worth funding. I personally disagree with this objection because it assumes that a funder has developed a grant evaluation process that can identify successful projects with a high degree of accuracy. Given that the norm in the for-profit world is for large, successful venture capital firms with many experienced domain experts to regularly pass on opportunities that later become multibillion-dollar companies, I find it unlikely that any EA funding organization will develop a grant evaluation process so good that it justifies hiding some or all unfunded applications.

Around the time I became more concerned that application sharing with EA Grants would be indefinitely delayed, I began to think an EA Project Platform would be a great way to share not only grant opportunities but also other project-related opportunities, like volunteering, with the community. After we built a prototype and sought feedback, much of it positive, one EA tried to unilaterally block our platform from launching, for reasons such as wanting a central organization like CEA to back such a platform rather than a newer team like Ben and me. I personally disagreed with their reasoning, since no major organization appears to have indicated substantial interest in launching such a platform in the near future, and launching the platform would not preclude CEA or another organization from taking a key role in it later. Not wanting to upset this person, I decided to pause work on the EA Project Platform.

Ben and I are currently evaluating whether or not we want to work on a common app for funders or defer that plan and launch our own separate grant application.

Since our project to improve the EA project space is itself an EA project, our project also has the same capacity and funding constraints as other EA projects. If anyone would like to collaborate with us or provide some funding, please let me know!

Comment by brendon_wong on Bayesian Investor proposes you can predictably beat the market by ~3% following a simple and easy strategy · 2019-03-17T10:09:40.115Z · score: 3 (2 votes) · EA · GW

Bayesian Investor's recommendations are actually pretty similar to the more advanced portfolio recommended in Ben Todd's post Common investing mistakes in the effective altruism community. The article recommends "[adding] tilts to the portfolio for value, momentum and low volatility (either through security selection or asset selection or adding a long-short component) and away from assets owned for noneconomic reasons" as an advanced move that should only be done if "you know what you’re doing."

Likewise, Bayesian Investor's recommended portfolio heavily involves low-volatility and fundamentally weighted (value-tilted) ETFs.

These articles reach fairly similar conclusions because academic research indicates that these strategies have historically outperformed market capitalization–weighted indexes (commonly known as "the market"). Various theories exist about why these strategies outperformed historically and whether they can be expected to outperform in the future.

Your observation that investing is important for EA because it can significantly increase funding for the community is why I'm working on Antigravity Investments, a social enterprise with the goal of improving investment returns in the EA community. Right now, we're picking the lowest-hanging fruit by recommending that EA organizations move low-interest cash reserves into high-interest, low-risk savings options (see my EA Forum article), which is essentially a guaranteed 2.5% improvement in returns every year at current interest rates. If we shift $15 million in cash, that's roughly another $1 million in direct funding for high-impact charities over three years.
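As a rough sketch of the arithmetic behind that claim (the $15 million, 2.5%, and three-year figures come from the paragraph above; simple, non-compounding interest is my simplifying assumption):

cash_moved = 15_000_000   # dollars shifted from near-0% accounts
yield_uplift = 0.025      # extra annual interest at current rates
years = 3

extra_funding = cash_moved * yield_uplift * years
print(f"${extra_funding:,.0f}")  # $1,125,000, i.e. roughly $1 million over three years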

Interestingly enough, our most recommended option is both safer and higher-yielding than storing large amounts of cash in a checking account.

While there may be obvious things that EAs should be doing, it is unfortunately very difficult to bring about behavior change. My current approach to behavior change with regard to investing is to have an organization with specific expertise in investing help other EAs and EA organizations implement sensible recommendations. This approach seems more effective than writing articles online, since it removes the prerequisites of having adequate expertise and time to learn about and implement sensible investing practices.

I wrote the high-yielding cash equivalents article because the recommendation seems particularly easy and obvious to implement. So far, although the article was well received, I haven't heard from any EA organization that has attempted to implement the recommendation based on reading it; organizations I've directly reached out to in the past, however, have implemented it. I'm currently in the (very slow) process of doing more direct outreach to EA organizations to determine for them (and for us) whether our recommendation is worth implementing.

To answer your question about whether the advice is worth following, my personal opinion is that some of Bayesian Investor's recommendations are worth diversifying (tilting) into at a level that reflects each investor's confidence in how likely the anomaly is to persist into the future. The low-volatility factor in particular has achieved very high out-of-sample risk-adjusted and absolute returns, which is promising, but of course a prolonged period of underperformance could be on the horizon; hence the importance of diversifying.
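As one minimal sketch of what "tilting at a level that reflects your confidence" could look like in practice (the 60/40 baseline, the 20% maximum tilt, and the fund names are hypothetical placeholders of mine, not a recommendation from this thread):

# Illustrative only: scale a factor tilt by confidence that the anomaly persists.
def tilted_weights(confidence: float, max_tilt: float = 0.20) -> dict:
    """Shift part of a plain market-cap equity allocation into a
    low-volatility/value-tilted fund, scaled by confidence in [0, 1]."""
    tilt = max(0.0, min(confidence, 1.0)) * max_tilt
    return {
        "total_market_index": 0.60 - tilt,
        "low_vol_value_tilt_fund": tilt,
        "bonds": 0.40,
    }

print(tilted_weights(confidence=0.5))
# {'total_market_index': 0.5, 'low_vol_value_tilt_fund': 0.1, 'bonds': 0.4}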

Comment by brendon_wong on Concept: EA Donor List. To enable EAs that are starting new projects to find seed donors, especially for people that aren’t well connected · 2019-03-17T09:19:51.099Z · score: 8 (9 votes) · EA · GW

Hi there! Ben Pence and I launched the EA Angel Group several months ago, which seems related to your proposal. We wrote an EA Forum post announcing the launch. It'd be great to jump on a call and compare thoughts on what we're working on and how we might be able to collaborate! One thought: some funders may be uncomfortable with being publicly listed (perhaps due to concerns about lots of people contacting them), but a certain subset of funders could be quite on board with the idea.

Comment by brendon_wong on EAs and EA Orgs Should Move Cash from Low-Interest to High-Interest Options · 2019-02-26T06:59:01.080Z · score: 5 (4 votes) · EA · GW

Based on our preliminary research into nonprofit financial documents, there may be quite a few organizations within or adjacent to EA that have significant cash reserves, so we think there is a chance we can increase funding for EA causes by millions of dollars relatively quickly.

Last year, we directly reached out to two well-known EA organizations and recommended that they move cash to a money market fund; this recommendation was more complex than our current StoneCastle recommendation. One of them implemented our recommendation with millions of dollars. The other felt they weren't large enough to gain a significant benefit (I am not sure of the exact amount of cash they had on hand when making this judgment).

It is hard from our end to determine exactly how much an organization can gain, because financial documents like the public Form 990 nonprofit tax return can be outdated (the latest Form 990s available are from 2016) and may not clearly indicate how effectively an organization is managing its cash or exactly how much cash it has on hand.

We are currently evaluating whether to pursue a slow, networking-based outreach approach versus something like emailing staff members at dozens of EA and EA-aligned organizations to try to accelerate the rate of adoption.

We are being cautious in planning our outreach strategy because in some cases a single charity may be holding tens of millions of dollars, so a single accepted or rejected recommendation could affect funding by millions of dollars over the course of one or several years.

Any insight into how we should approach outreach to maximize our impact would be appreciated! Perhaps emailing a lot of organizations, as you mention, is the best option.

Another interesting thought: while EA is focused on directing a limited amount of funding to the most effective charities, we can essentially "create funding out of thin air," so to speak. That means we may be able to have a very high impact even if we advise organizations that are likely a bit lower impact than, say, GiveWell's top charities. If anyone knows of a convenient list of dozens or hundreds of charities that are high-impact enough that advising them would be a high-impact use of time, I'd love to see it!

Comment by brendon_wong on EAs and EA Orgs Should Move Cash from Low-Interest to High-Interest Options · 2019-02-25T16:32:14.999Z · score: 1 (1 votes) · EA · GW

Thanks, Cullen! Your post was great as well! At the beginning of February an EA reached out to us with a question about how to implement your recommendation, so it seems like people are following your advice :)