EA Hotel Fundraiser 5: Out of runway!

post by CEEALAR (EA Hotel) · 2019-10-25T14:16:41.079Z · EA · GW · 90 comments

UPDATE 5th Nov 2019: since this post went up, we have received ~£9,500 in donations and a further ~£4,000 in backdated payments from residents. Following the latter, and adjusting costs for the last 6 months downward to ~£5,100/month, our runway now extends ~3 months, to the beginning of Feb. Thanks for all the support; this is a great start to our latest fundraiser. We are working on more posts in the series listed below (we will link to them in the list as they are completed).
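As a rough sanity check, the runway arithmetic in the update can be sketched as follows. The donation, payment, and cost figures are taken from the update itself; treating the new donations plus backdated payments as the bulk of available funds is an assumption made purely for illustration:

```python
# Figures from the 5th Nov update; "available funds" is an assumption,
# not a stated figure, so this is an illustration only.
new_donations = 9_500       # GBP received since the post went up
backdated_payments = 4_000  # GBP in backdated payments from residents
monthly_costs = 5_100       # GBP/month, adjusted for the last 6 months

funds = new_donations + backdated_payments
runway_months = funds / monthly_costs
print(f"~{runway_months:.1f} months of runway")  # prints "~2.6 months of runway"
```

This lands at roughly the "~3 months, to the beginning of Feb" stated above, with the gap plausibly covered by funds already on hand before the post.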

This is a quick post to say that our financial situation is looking pretty dire right now. We were going to wait until “Giving Season” starts in December to launch a fresh fundraiser, but now can’t afford to do that. We have several things in the pipeline to bolster our case (charity registration, compiling more outputs, more fundraising posts detailing the case for the hotel, a hiring round for the Community & Projects Manager, refining internal systems), but unfortunately they may not reach fruition in time. If you are interested in donating, now is the perfect time to have a large impact on the project! Happy to answer questions in the comments (we have anticipated some below).

Relatively small amounts would allow us to keep things going long enough to gain the information needed to determine whether the experiment is/was worthwhile. As it is, if the project fails due to lack of funds now, we feel that we are leaving a lot of Value of Information on the table. Our costs are ~£5,700/month (based on the last 5 months’ spending). Even 1 month of funding would give us some breathing space to get some of the work done.

To donate, and for more info (past posts making the case for the hotel), see: eahotel.org/fundraiser.

List of proposed posts for the continuation of this series (renewed runway permitting; provisional; linked when complete):


Comments sorted by top scores.

comment by Eli Rose (reallyeli) · 2019-10-28T18:41:40.154Z · EA(p) · GW(p)

I donated $1000 since it seems to me that something like the EA Hotel really ought to exist, and it would be really sad if it went under.

I'm posting this here so that, if you're debating donating, you have the additional data point of knowing that others are doing so.

Replies from: Will Kirkpatrick, KevinO
comment by marswalker (Will Kirkpatrick) · 2019-10-29T21:49:05.488Z · EA(p) · GW(p)

Thank you both for donating. And I'll add another +1 to the number of people who have donated $1,000. I'm hoping that an update on the status of the hotel comes soon.

Replies from: EA Hotel, EA Hotel
comment by CEEALAR (EA Hotel) · 2019-10-30T12:20:50.552Z · EA(p) · GW(p)

It's looking like we now have runway until the end of the year, thanks to you three and others who have donated in the last few days :) Will post a more detailed update (and update the fundraiser page) in the coming days.

comment by CEEALAR (EA Hotel) · 2019-11-05T16:04:02.978Z · EA(p) · GW(p)

OP and fundraiser page now updated. tl;dr: runway now ~3 months :)

comment by KevinO · 2019-10-28T20:48:02.562Z · EA(p) · GW(p)

I also made a similarly sized donation.

comment by Greg_Colbourn · 2019-10-25T18:42:37.617Z · EA(p) · GW(p)

Going to say that personally, I still very much think this is the best use of EA money on the margin, considering the low cost per person-year of work, hits-based giving, community building and network effects, and room for more funding (i.e. the current acute need for funding). Especially in the current situation, I think it's an outstanding opportunity for small/medium-sized donations to move the needle.

However, I'm at the stage where I'm having to consider losing my own financial independence / potential for investing in the future (including in EA things) if I want to give further financial support to the EA Hotel. And I'm not quite ready to do that.

Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-26T23:20:41.579Z · EA(p) · GW(p)

On what do you base that claim? Effective Altruism should be evidential, rational, and objective where possible. What is the value of the measurable output or utility of the EA Hotel, in particular compared to alternatives?

The fact that there are only 18 total donations, totaling less than $10k, is concerning; it shows that the EA Hotel does not have a broad base of support and is likely not sustainable, i.e. society or its environment does not calculate it as providing net value.

Either way, whether the project survives or fails, a deep retrospective should probably be performed to learn from the situation.

Replies from: John_Maxwell_IV, reallyeli, EA Hotel
comment by John_Maxwell (John_Maxwell_IV) · 2019-11-01T04:13:48.518Z · EA(p) · GW(p)

The fact that there are only 18 total donations totaling less than $10k is concerning

If you are well-funded, they'll say: "You don't need my money. You're already well-funded." If you aren't well-funded, they'll say: "You aren't well-funded. That seems concerning."

Replies from: Open_Thinker
comment by Open_Thinker · 2019-11-02T15:02:51.091Z · EA(p) · GW(p)

Oh, found your 2nd reply to me.

This is an astute point, I fully acknowledge and recognize the validity of what you are saying.

However, it is not that simple; it depends on the expected yield curve of the specific effort and its specific context. In some cases that are already "well-funded," there is high value generation which is still below full potential and should be funded further, e.g. due to economies of scale; in other cases, there are diminishing returns and they should not be funded further.

Similarly, the same is true for "not well-funded" efforts. There are some efforts which have high potential and should be lifted off the ground, and there are others which should be neglected and left to die.

So that is the difference, in general terms. Which case a specific example is in takes some careful consideration to the details to determine.

comment by Eli Rose (reallyeli) · 2019-10-28T21:37:11.806Z · EA(p) · GW(p)

(I think the tone of this comment is the reason it is being downvoted. Since we all presumably believe that EA should be evidential, rational and objective, stating it again reads as a strong attack, as if you were trying to point out that no assessment of impact had been done, even though the original post links to some.)

Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-29T01:14:18.804Z · EA(p) · GW(p)

Noted; however upon lightly reviewing said information, it seems to be lacking. Hence the request for further information.

It does not seem like this is expected until update 10, as I noted previously. The fact that it is not a higher priority for an EA organization is a shortcoming, in my opinion.

Replies from: John_Maxwell_IV, Greg_Colbourn
comment by John_Maxwell (John_Maxwell_IV) · 2019-11-01T04:12:49.077Z · EA(p) · GW(p)

This seems like a disagreement that goes deeper than the EA Hotel. If your focus is on rigorous, measurable, proven causes, great. I'm very supportive if you want to donate to causes like that. However, there are those of us who are interested in more speculative causes which are less measurable but we think could have higher expected value, or at the very least, might help the EA movement gather valuable experimental data. That's why Givewell spun out Givewell Labs which eventually became the Open Philanthropy Project. It's why CEA started the EA Funds to fund more speculative early stage EA projects. Lots of EA projects, from cultured meat to x-risk reduction to global priorities research, are speculative and hard to rigorously measure or forecast. As a quick concrete example, the Open Philanthropy Project gave $30 million to OpenAI, much more money than the EA Hotel has received, with much less public justification than has been put forth for the EA Hotel, and without much in the way of numerical measurements or forecasts.

If you really want to discuss this topic, I suggest you create a separate post laying out your position - but be warned, this seems to be a fairly deep philosophical divide within the movement which has been relatively hard to bridge. I think you'll want to spend a lot of time reading EA archive posts before tackling this particular topic. The fact that you seem to believe EAs think contributing to the sort of relatively undirected, "unsafe" AI research that DeepMind is famous for should be a major priority suggests to me that there's a fair amount you don't know about positions & thinking that are common to the EA movement.

Here are some misc links which could be relevant to the topic of measurability:

And here's a list of lists of EA resources more generally speaking:

Replies from: Open_Thinker
comment by Open_Thinker · 2019-11-02T14:57:50.824Z · EA(p) · GW(p)

Hey, thanks for the reply, it looks like there is a lot of interesting / useful information there. Also, it looks like you replied twice to me based on notifications, but I can only find this comment, so sorry if I missed something else.

With all due respect, I think there is a bit of a misunderstanding on your part (and others voting you up and me down).

If your focus is on rigorous, measurable, proven causes, great. I'm very supportive if you want to donate to causes like that. However, there are those of us who are interested in more speculative causes which are less measurable but we think could have higher expected value, or at the very least, might help the EA movement gather valuable experimental data.

First of all, I am interested in rigorous and evidential efforts--what one could perhaps call "scientific" in method; however, I am not exclusively interested in such. It is therefore not correct to claim that Open_Thinker is not "interested in more speculative causes" with deep, far-reaching potential consequences, similar to your apparent position. The difference (e.g. with OpenAI) is that the EA Hotel is a relatively minor organization (at least based on existing budget and staffing levels), and in comparison is less likely to have much effective impact than organizations such as DeepMind and OpenAI (as well as academic groups) with concrete achievements and credentials--e.g. AlphaGo and Elon Musk's achievements and former involvement, respectively. This should not be underestimated, as it is a critical component; the EA Hotel's list of achievements looks fairly non-existent by comparison, and from what I can tell by skimming ongoing activities, it may remain so (at least for the near future). It is not only I who thinks this way, as this directly explains the funding gap.

So secondarily, no, I think we do not have a fundamental disagreement vis-a-vis fundamental priorities, although certainly we do regarding implementations, details, specific methods, etc.

Actually, I am being quite on topic, because the topic of this thread specifically is about funding the EA Hotel, which has been my focus throughout--specifically in the context of the top comment that the EA Hotel is "the best use of EA money," which I was directly responding to and questioning. I am merely expressing my skepticism and performing due diligence; is that wrong? Based on a specific claim like that, where is the evidence to support it?

So, just so there is clear understanding, I am really only interested here in whether or not I personally should help fund the EA Hotel, because I am willing to do so if there is convincing logic or evidence. Up until this point I still do not see it; however, the EA Hotel also has the funding it needs from other sources now, so perhaps we should just leave the matter--though I am still willing to continue, albeit increasingly less so, as there are diminishing apparent returns for all involved, in my estimation.

However, thank you again for the response and information above, I will take some time to peruse it.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-11-07T18:55:40.781Z · EA(p) · GW(p)
which is less likely to have much of an effective impact, compared to other organizations such as DeepMind and OpenAI..

Do you think this is true even in terms of impact/$ (given they are spending ~1,000-10,000x what we are)?
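The impact/$ framing can be made concrete with purely hypothetical numbers (none of these impact figures come from the thread; the Hotel's annual budget is back-of-envelope from the ~£5,700/month in the OP). Even if a large lab's total impact were, say, 100x the Hotel's, a ~1,000x budget difference would still leave the Hotel ahead per pound spent:

```python
# Hypothetical illustration of impact per pound, not a real estimate.
hotel_budget = 5_700 * 12        # ~GBP/year, from the OP's monthly costs
lab_budget = hotel_budget * 1_000  # low end of the ~1,000-10,000x range

hotel_impact = 1.0               # arbitrary units
lab_impact = hotel_impact * 100  # assume the lab has 100x the total impact

print(hotel_impact / hotel_budget > lab_impact / lab_budget)  # prints True
```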

however, the EA Hotel also has the funding it needs from other sources now

We now have ~3 months worth of runway. It's a good start to this fundraiser, but is hardly conducive to sustainability (as mentioned below [EA(p) · GW(p)], we would like to get to 6 months runway to be able to start a formal hiring process for our Community & Projects Manager. The industry standard for non-profits is 18 months runway).

comment by Greg_Colbourn · 2019-10-30T12:41:36.396Z · EA(p) · GW(p)

I would appreciate it if you could review the information a bit more thoroughly. Perhaps you could generate your own estimate using the framework developed in Fundraiser 3 [EA · GW] and the outputs listed here. Fundraiser 10 was listed last because I want to try and do a thorough job of it (but also have other competing urgent priorities with respect to the hotel). There are also many considerations [EA · GW] as to why any such estimates will be somewhat fuzzy and perhaps not ideal to rely on too heavily for decision making (hoping to go into detail on this in the post).

Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-31T01:31:05.938Z · EA(p) · GW(p)

The 3rd update was reviewed; that was what led me to search for Part 2, which is expected in update 10. Frankly, I am personally not interested in the 3rd update's calculations, because simply based on personal time allocations I would prefer a more concise estimate to having to tediously go through the calculations myself.

Please understand that this is not an insult, and I think it is a reasonable point in fact--for example, with other [in some ways competing] efforts (e.g. startups), it would not be acceptable to most venture capitalists to present slides of incomplete calculations in a pitch deck and ask them to go through it manually on their own time rather than having the conclusive points tidily summarized. It is likely just not worth the effort in most cases.

It looks like the EA Hotel has obtained funding through 2019 though now, so I congratulate your team on that. If you would like to continue the discussion, I suggest replying to my other comment (below) so that there are not diverging threads.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-10-31T20:41:42.016Z · EA(p) · GW(p)

See reply below [EA(p) · GW(p)].

comment by CEEALAR (EA Hotel) · 2019-10-27T05:55:34.149Z · EA(p) · GW(p)

Please read the posts linked to on eahotel.org/fundraiser (and as stated in the OP, we have more in the pipeline).

See also the totaliser on that page (it will be updated soon): total donations (in addition to those made by founder Greg Colbourn) are currently ~£36k from >50 individuals, and they have come through various means (the PayPal MoneyPool, GoFundMe, Patreon, and privately).

UPDATE 6th Nov 2019: the fundraiser page has now been updated, and a histogram of donations added: https://eahotel.org/fundraiser/

Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-27T18:53:31.381Z · EA(p) · GW(p)

Yes, I see that that page is more up-to-date, I was looking at this page https://donations.vipulnaik.com/donee.php?donee=EA+Hotel#doneeDocumentList which is linked on eahotel.org. The inconsistency is itself a little concerning.

The question remains on the value or utility of the EA Hotel though, it looks like this is not expected to be answered until a future update 10. In my opinion this is a mistake, a clear estimate should have been provided in a prospectus prior to the initial launch of the EA Hotel as part of due diligence, and then this forecast should have been measured against continuously in the early months of the project.

Replies from: riceissa
comment by riceissa · 2019-10-27T20:58:39.843Z · EA(p) · GW(p)

The inconsistency is itself a little concerning.

I am one of the contributors to the Donations List Website (DLW), the site you link to. DLW is not affiliated with the EA Hotel in any way (although Vipul, the maintainer of DLW, made a donation to the EA Hotel). Some reasons for the discrepancy in this case:

  • As stated in bold letters at the top of the page, "Current data is preliminary and has not been completely vetted and normalized". I don't think this is the main reason in this case.
  • Pulling data into DLW is not automatic, so there is a lag between when the donations are made and when they appear on DLW.
  • DLW only tracks public donations.
Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-29T01:11:32.914Z · EA(p) · GW(p)

That is understandable, however when presenting information (i.e. linking to it on your homepage) there is an implicit endorsement of said information; else it should not be presented. This is irrespective of whether the source is formally affiliated or not--its simple presence is already an informal affiliation. The simple fact that the EA Hotel does not have a better presentation of information is itself meta information on the state of the organization and project.

However, this was not really the main point; it was only a "little" concern as I previously wrote. The more significant concern is that there does not seem to be a ready presentation of the effort's value (expected or real), and that this is not expected until update 10--as I wrote previously, this was in my opinion a mistake, as it should be one of the primary priorities for an EA organization.

Replies from: riceissa
comment by riceissa · 2019-10-29T23:22:34.612Z · EA(p) · GW(p)

Can you give some examples of EA organizations that have done things the "right way" (in your view)?

Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-30T06:26:55.993Z · EA(p) · GW(p)

Thanks for asking, that's a good question.

It basically comes down to yield, or return on investment (ROI). Utilitarianism and effective altruism are commonly related, and the former involves some quantification of value: a ratio of output per unit of input. One might say that the most ethical stance is to find the min/max optimization that produces the highest return. Whether EA demands such an optimal maximum, or whether a suboptimal effectiveness is still ethical, is an interesting but separate discussion.

So in a lot of the animal welfare threads, there is commonly some idea that they produce superior yield in ethical utility, usually because there is simply more biomass, it is much cheaper, etc. Even if I usually don't agree, there is still the basic quantification that provides a foundation for such an ethical claim.

Another example is environmental organizations such as Cool Earth, which report x acres of rainforest preserved and y units of oxygen production or carbon sequestration per $ given. That is not exactly utility per se, but it is a good measure that could probably be converted into some units of generic utility.

For the EA Hotel, I am not sure what the yield is for $ given. In order to make a claim that x is the "best use" of resources, this sort of consideration is required and must be clear, IMO.

Consider that if allocation of resources for yield is an ethical decision, then asking for some clarification is not intended to be rude at all or an off-topic question, it is simply due diligence. Even if I have the funds to donate to the EA Hotel, if the yield is lower than the utility that can be produced by an alternative (and in fact competing) effort, then it is my ethical obligation to fund the alternative. Is it not?

Perhaps there is still a misunderstanding that my ask is overly aggressive or impolite; however, what is worse is for someone simply not to care and not to engage in the discussion. But from my perspective, the EA Forum is seemingly giving a pass to one of our own. Again, my ask is simply due diligence. For context, at £5.7k/month, I could fund the EA Hotel for over a year. However, does the EA Hotel provide better benefit than animal welfare efforts? Polio? Global warming? Political campaigns? Poverty alleviation?

The answer does not seem clear to me. Without that, it is difficult to proceed in making an ethical decision.

Replies from: Halffull, Greg_Colbourn
comment by Halffull · 2019-10-31T15:32:02.764Z · EA(p) · GW(p)

I think there's a clear issue here with measurability bias. The fact of the matter is that the most promising opportunities will be the hardest to measure (see for instance investing in a startup vs. buying stocks in an established business) - The very fact that opportunities are easy to measure and obvious makes them less likely to be neglected.

The proper way to evaluate new and emerging projects is to understand the landscape and do a systems-level analysis of the product, process, and team to see if you think the ROI will be high compared to other hard-to-measure projects. This is what I attempted to do with the EA Hotel here: https://www.lesswrong.com/posts/tCHsm5ZyAca8HfJSG/the-case-for-the-ea-hotel [LW · GW]

Replies from: Open_Thinker
comment by Open_Thinker · 2019-11-01T03:57:56.637Z · EA(p) · GW(p)

This point is reasonable, and I fully acknowledge that the EA Hotel cannot have much measurable data yet in its ~1 year of existence. However, I don't think it is a particularly satisfying counter response.

If the nature of the EA Hotel's work is fundamentally immeasurable, how is one able to objectively establish that it is in fact being effectively altruistic? If it is not fundamentally immeasurable, but is not measured when it could have been, then that is likely simply incompetence. Is it not? Either way, it would be impossible to evidentially state that the EA Hotel has good yield.

Further, the idea that the EA Hotel's work is immeasurable because it is a meta project or has some vague multiplier effects is fundamentally dissatisfying to me. There is a page full of attempted calculations in update 3, so I do not believe the EA Hotel assumes it is immeasurable either, or at least it originally did not. The more likely answer, a la Occam's Razor, is that there has simply been insufficient effort in resolving the quantification. There are, after all, plenty of other more pressing and practical challenges to be met on a day-to-day basis; and [surprisingly] it does not seem to have been pressed much as a potential issue before (per the other response by Greg_Colbourn).

Even if it is difficult to measure, a project (particularly one which aspires to be effective--or greatly effective, or even the most effective) must as a requirement outline some clear goals against which its progress can be benchmarked in my opinion, so that it can determine its performance and broadcast this clearly. It is simply best practice to do so. This has not been done as far as I can tell--if I am mistaken, please point me to it and I will revise my opinions accordingly.

There are a couple additional points I would make. Firstly, as an EA Hotel occupant, you are highly likely to be positively biased in its favor. Therefore, you are naturally inclined to calculate more generously in its favor; and certainly the article you wrote and linked to is positive in its support indeed. Is this refutable? It is also likely an objective fact that your interests align with the EA Hotel's, and someone whose interests were less aligned could easily weight the considerations you stated less heavily. You are therefore not an objective or the best judge of the EA Hotel's value, despite (or because of) your first-hand experience.

The other point, which I think is common throughout the EA community, is that it is somewhat elitist to think that the EA way is the best (and perhaps only) way--I believe there is some credibility to this claim, as it was noted in the recent EA survey. For example, is Bill Gates an EA? He does not visit the EA Forum much AFAIK, focuses on efforts that differ somewhat from EA priorities, etc. But I would think that his net positive utility undeniably outweighs the entire EA Forum's, even if he does not follow EA, or at least does not follow it strictly. Bill Gates does not (to my knowledge) support the EA Hotel, and if he does, it is not at a level that would make it financially sustainable in perpetuity. Should he--and if he does not, is he wrong for not doing so? If you believe that the EA Hotel is the best use of funds (as has been claimed at the top of this thread and is supported in your article), then yes, you would probably conclude that he is wrong, based on an inaccurate allocation of resources which results in a sub-ideal outcome in terms of ethical utility. This logic is misguided, in my opinion.

Contrary to EA puritanism, the fact in my opinion is that there are EAs well beyond EA borders, e.g. celebrities like Bill Gates and Elon Musk, but also plenty of anonymous people in general. Is the "Chasm" you described real? I am not sure that it is, or at least not so acutely. In particular for the EA Hotel's context, there are plenty of other richly-funded organizations, such as Google's DeepMind, that are active and contributing significantly to the same fields the EA Hotel is interested in (from my understanding). The EA Hotel's contributions in such an environment are therefore likely not to be a large multiplier (although the opposite is not impossible, and I am open to that possibility), but relatively small. It is a possibility that contributing to the EA Hotel is actually suboptimal or even unethical, because its incremental contributions yield diminished returns relative to what could result via alternative avenues. This is not a definite conclusion, but I am noting it for completeness, or inclusivity of contrary viewpoints.

To be clear, none of what I have written is intended as an insult in any way. The point is only that it is not clear that the EA Hotel is able to substantiate its claim to being effectively altruistic (e.g. via lack of measurability, which seems to be your argument), particularly "very" or even "the most" effective (in terms of output per resource input). Based on this lack of clarity, I find that I cannot personally commit to supporting the project.

However, it looks like the EA Hotel already has the funding it needs now, so perhaps we may simply go our separate ways at this point. My aim throughout was to be constructive. Hopefully some of it was useful in some way.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-11-07T19:01:07.976Z · EA(p) · GW(p)
Bill Gates does not (to my knowledge) support the EA Hotel

We are far too small to be on Bill Gates' radar. It's not worth his time looking at grants of less than millions of $. (Who knows though, maybe we'll get there eventually?)

comment by Greg_Colbourn · 2019-10-30T18:05:31.705Z · EA(p) · GW(p)
does the EA Hotel provide better benefit than animal welfare efforts? Polio? Global warming? Political campaigns? Poverty alleviation?

The EA Hotel is a meta level project, as opposed to the other more object level efforts you refer to, so it's hard to do a direct comparison. Perhaps it's best to think of the Hotel as a multiplier for efforts in the EA space in general. We are enabling people to study and research topics relevant to EA, and also to start new projects and collaborations. Ultimately we hope that this will lead to significant pay-offs in terms of object level value down the line (although in many cases this could take a few years, considering that most of the people we host are in the early stages of their careers).

Replies from: Open_Thinker
comment by Open_Thinker · 2019-10-31T01:20:47.175Z · EA(p) · GW(p)

That is understandable; however it is dissatisfactory in my personal opinion--I cannot commit to funding on such indefinite and vague terms. You (and others) clearly think otherwise, but hopefully you can understand this contrary perspective.

Even if "the EA Hotel is a meta level project," which to be clear I can certainly understand, there should still be some understanding or estimate of what the anticipated multiplier should be, i.e. a range with a target within a +/- margin. From what I can see upon reviewing current and past guests' projects, I am not confident that there will be a high return of ethical utility for resource inputs.

Unless it can be demonstrated contrarily, my default inclination (similar to Bill Gates' strategy) is that projects in developing regions are generally (but certainly not always) significantly higher in yield than in developed regions; which is not unlike the logic that animal welfare efforts are higher yield than human efforts. The EA Hotel is in a developed society and seems focused on field(s) with some of the highest funding already, e.g. artificial intelligence (AI) research. Based on this, it seems perhaps incorrect (or even unethical) to allocate to this project. This isn't necessarily conclusive, but evidence to the opposite has not been clear.

Hopefully you understand that what I am describing above is not meant as a personal insult by any means or the result of rash emotions, but rather the result of rational consideration along ethical lines.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-10-31T20:40:32.177Z · EA(p) · GW(p)

[Replying to above thread] One reason I asked you to plug some numbers in is that these estimates will depend a lot on what your priors are for various parameters. We will hopefully provide some of our own numerical estimates soon, but I don't think that too much weight should be put on them (Halffull makes a good point about measurability above [EA(p) · GW(p)]). Also consider that our priors may be biased relative to yours.

I'll also say that a reason for Part 2 of the EV estimate being put on the back burner for so long was that Part 1 [EA · GW] didn't get a very good reception (i.e. people didn't see much value in it). You are the first person to ask about Part 2!

[Replying to this thread]

projects in developing regions are generally (but certainly not always) significantly higher in yield than in developed regions

I think this is missing the point. The point of the EA Hotel is not to help the residents of the hotel, it is to help them help other people (and animals). AI Safety research, and other X-risk research in general, is ultimately about preventing the extinction of humanity (and other life). This is clearly a valuable thing to be aiming for. However, as I said before, it's hard to directly compare this kind of thing (and meta level work), with shovel-ready object level interventions like distributing mosquito nets in the developing world.

Replies from: Open_Thinker
comment by Open_Thinker · 2019-11-01T03:02:54.290Z · EA(p) · GW(p)

The fact that others are not interested in such due diligence is itself a separate concern that support for the EA Hotel is perhaps not as rigorous as it should be; however, this is not a concern against you or the EA Hotel, but rather against your supporters. This to me seems like a basic requirement, particularly in the EA movement.

I think this is missing the point. The point of the EA Hotel is not to help the residents of the hotel, it is to help them help other people (and animals). AI Safety research, and other X-risk research in general, is ultimately about preventing the extinction of humanity (and other life). This is clearly a valuable thing to be aiming for. However, as I said before, it's hard to directly compare this kind of thing (and meta level work), with shovel-ready object level interventions like distributing mosquito nets in the developing world.

No, I recognize, understand, and appreciate this point fully; but I fundamentally do not agree, that is why I cannot in good conscience support this project currently. Because it is such a high value and potentially profitable family of fields (e.g. AI research in particular) to society, it has already attracted significant funding from entrenched institutions, e.g. Google's DeepMind. In general, there is a point of diminishing returns for incremental investments, which is a possibility in such a richly-funded area as this one specifically. Unless there is evidence or at least logic to the contrary, there is no reason to reject this concern or assume otherwise.

Also, as part of my review, I looked at the profiles of the EA Hotel's current and past occupants; the primary measurable output seems to be progress in MOOCs and threads posted on the EA Forum. This output is frankly ~0 in my opinion--this is again not an insult at all, it is simply my assessment. It may be that the EA Hotel is actually fulfilling the role of a remedial school for non-competitive researchers who are not attracting employment from the richly-funded organizations in their fields; such an effort would likely be low yield--again, this is not an insult, it is just a possibility suggested by market-based logic. There are certainly other, more positive potential roles (which I am very open to; otherwise I would not bother continuing this discussion to this thread depth), but these have not yet been demonstrated.

Re: the measurement bias response, this is an incomplete answer. It is fine not to have much supporting data, as the project is only ~1 year old at this point; however, some data should be generated. More importantly, the project should have a charter with an estimate or goal for anticipated measurable effects, against which data can be recorded (whether favourable or not) to show how well the organization is doing in its efforts. How else will you know whether you are being effective, or successful?

How do you know that the EA Hotel is being effectively altruistic (again particularly against competing efforts), in the context of your given claim at the top about being effectively "the best use of money"?

These issues still remain open in my opinion. Hopefully these critiques will at least be some food for thought to strengthen the EA Hotel and future endeavors.

comment by Gregory Lewis (Gregory_Lewis) · 2019-10-26T07:02:09.470Z · EA(p) · GW(p)

We were going to wait until “Giving Season” starts in December to start a fresh fundraiser, but now can’t afford to do that. We have several things in the pipeline to bolster our case (charity registration, compiling more outputs, more fundraising posts detailing the case for the hotel, hiring round for the Community & Projects Manager, refining internal systems), but they may not reach fruition in time unfortunately.

I'd expect the EA Hotel to have fairly constant operating costs (and thus reliable forecasts of runway remaining). So I'd be keen to know what happened to leave the EA Hotel in a position where its planned fundraising efforts would occur only after it had already run out of money.

More directly, I'm concerned that the already-linked FB group discussion suggests the EA Hotel bought the derelict building next door. I hesitate to question generosity, and it is unclear when this happened - or how much it cost - but CapEx when one has scarce reserves and uncertain funding looks inopportune (especially in this case, where the CapEx doesn't secure more capacity but only an option on more capacity, as further investment is needed to bring it online).

Replies from: Greg_Colbourn, KevinO, Khorton
comment by Greg_Colbourn · 2019-10-26T12:56:48.150Z · EA(p) · GW(p)

I bought the hotel next door with my own money, and I've not spent any of the EA Hotel's money on it. Given its relatively low cost, I see it as a decent investment largely independent of the EA Hotel (i.e. even if the EA Hotel fails, I think there's a reasonable chance property prices will go up in Blackpool in the next 5-10 years).

Perhaps in terms of maximising my positive impact it would've been best to donate the money to the EA Hotel; I think that remains to be seen, though. In hindsight, I think I probably was a little over-optimistic about the funding prospects for the EA Hotel at the time.

(Note the timing of the purchase wasn't ideal - it came up for auction. Strategically, I didn't want to lose the opportunity to enable the EA Hotel to easily expand (i.e. by knocking through the wall and sharing the same resources in terms of kitchen, appliances, stock etc) in the event it becomes successful enough to warrant it.)

comment by KevinO · 2019-10-26T09:22:23.988Z · EA(p) · GW(p)

From the body of the top post and this page: https://eahotel.org/fundraiser/, it sounds like they had estimated they would spend £5000 per month but instead spent £5700 per month. That may have contributed to running out early.

Replies from: KevinO
comment by KevinO · 2019-10-26T13:27:01.064Z · EA(p) · GW(p)

I guess the question could remain what accounted for the extra £700/month.

Replies from: Pablo_Stafforini
comment by Pablo (Pablo_Stafforini) · 2019-10-26T14:18:59.693Z · EA(p) · GW(p)

Is this really important? A discrepancy of £700 relative to the £5000 projection seems acceptable to me.

Replies from: Greg_Colbourn, KevinO
comment by Greg_Colbourn · 2019-10-26T17:15:29.034Z · EA(p) · GW(p)

The £5000/month was an estimate based on earlier spending. Our costs are variable dependent on occupancy, hours worked by staff, random maintenance costs etc. It's unfortunate that I didn't adjust the totaliser earlier based on the actual spend, and I considered just paying out of my own pocket to hide the mistake, given that it's likely to be a black mark against me/the EA Hotel (and there seems to be very little tolerance for mistakes in EA these days). I hope at least some people appreciate the honesty.

comment by KevinO · 2019-10-26T22:12:06.551Z · EA(p) · GW(p)

I was just thinking that I didn't really answer Gregory_Lewis's question.

comment by Kirsten (Khorton) · 2019-10-26T10:35:26.562Z · EA(p) · GW(p)

The Facebook link isn't working for me. Can someone from the EA Hotel confirm: did you buy the building next door?

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-10-26T18:34:30.248Z · EA(p) · GW(p)

I bought it, for £47k (see above [EA(p) · GW(p)]).

comment by richard_ngo · 2019-10-25T15:24:12.705Z · EA(p) · GW(p)

I'm planning to donate to the EA hotel. Given that it isn't a registered charity, I'm interested in doing donation swaps with EAs in countries where charitable donations aren't tax deductible (like Sweden) so that I can get tax deductions on my donations. Reach out or comment here if interested.

Replies from: Lukas T
comment by Lukas Trötzmüller (Lukas T) · 2019-10-25T18:41:47.585Z · EA(p) · GW(p)

You could use the donation swap system at EAHub: https://donationswap.eahub.org

comment by David_Moss · 2019-10-25T20:03:53.655Z · EA(p) · GW(p)

This is pretty sad to hear. Had there been any space, I would already have used the EA Hotel at least once this year (when I moved back to the UK, before I found somewhere to rent) and more recently when looking for somewhere to rent in Blackpool (as it happens, I expect I may be renting in Blackpool in the near future, partly because the EA Hotel is there). So I would like to see the EA Hotel expanding, rather than at risk of shutting down.

Replies from: Milan_Griffes
comment by Milan_Griffes · 2019-10-25T21:26:35.147Z · EA(p) · GW(p)


It reminds me of the situation the Berkeley REACH was in last summer: 1, 2 [LW · GW]

(I don't know where the REACH receives funding from now. It was awarded a $5,000 grant from the Meta Fund in November 2018.)

comment by CEEALAR (EA Hotel) · 2019-10-25T14:20:27.214Z · EA(p) · GW(p)

Here's an idea: sponsorship of individual rooms - £500/month, minimum commitment of 12 months, voided if the hotel ceases to exist in its current form in the meantime. You'd get to name the room - e.g. “x suite” - and perhaps specify a cause area it's to be reserved for. If you're interested in this, let's talk.

Replies from: Grue_Slinky
comment by Grue_Slinky · 2019-10-25T20:36:27.714Z · EA(p) · GW(p)

Not sure Greg officially approves of this, but there's also an octagon-shaped common room which we typically call "The Octagon". If you want to help financially and also troll all of us to no end, you could stipulate that we rename it to some other shape, e.g. "The Triangle".

Replies from: Greg_Colbourn, Khorton
comment by Greg_Colbourn · 2019-10-26T16:54:16.311Z · EA(p) · GW(p)

I approve. Anyone want to start the bidding?

comment by Kirsten (Khorton) · 2019-10-25T21:30:11.791Z · EA(p) · GW(p)

Upvoted because it makes me laugh

Replies from: willbradshaw
comment by Will Bradshaw (willbradshaw) · 2019-10-26T12:39:20.080Z · EA(p) · GW(p)

Just wanted to note that the room is, in fact, ten-sided.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-10-26T18:44:03.313Z · EA(p) · GW(p)

It is, in fact, an irregular decagon (or 7 sides of an octagon with fitted seating, and a passage to the door :)

comment by Greg_Colbourn · 2019-10-25T14:22:21.532Z · EA(p) · GW(p)

As things stand, I'm likely to give people notice soon to start paying rent or leave from 1 Dec. I feel that this could then cause a death spiral: people leaving, cost per person increasing, further people leaving because of that, etc. :(

comment by CEEALAR (EA Hotel) · 2019-10-26T19:12:09.058Z · EA(p) · GW(p)

Another idea: we'd be happy to offer a value-aligned significant donor a seat on our Board of Trustees (kind of like Holden Karnofsky and OpenAI, but on a smaller scale). Let us know if you'd like to discuss this with us.

comment by CEEALAR (EA Hotel) · 2019-10-25T14:29:50.936Z · EA(p) · GW(p)

"What about EA Grants / EA Funds?" We are still talking to EA Grants and the EA Long Term Future Fund. They have one or two sticking points that we're hoping to resolve with their input. We need funds to bridge the gap in the mean time.

Replies from: Nicole_Ross, Stefan_Schubert, Elizabeth
comment by Nicole_Ross · 2019-11-07T16:20:48.564Z · EA(p) · GW(p)

Hey all,

I'm the EA Grants evaluator. We don't usually comment publicly on reasons for not granting to something, but Greg gave us permission and encouragement in this case given the community interest. At this point I'm not excited to fund the EA Hotel's general costs. My concerns are:

- Hotel management generally (including selection of guests/projects)
- Potential for community health issues, and concern about handling of a staffing issue
- Some concern about the handling of past PR situations; I think these were very difficult situations, but I think an excellent version of the hotel would have handled these better

I'm still happy to help with finding an excellent Project and Community Manager, and to consider topping up/extending funding for the right candidate, should the EA Hotel find general operating funding elsewhere.

Replies from: Greg_Colbourn, Greg_Colbourn, Greg_Colbourn, EA Hotel
comment by Greg_Colbourn · 2019-11-07T18:20:53.340Z · EA(p) · GW(p)
Some concern about the handling of past PR situations; I think these were very difficult situations, but I think an excellent version of the hotel would have handled these better

I think this is a little unfair. It would be good to know exactly what we (or an excellent version of the hotel) could've (would've) done better regarding the PR situations (I assume this is referring to the Economist and Times articles). Oliver Habryka says here [EA(p) · GW(p)] "I still think something in this space went wrong", but doesn't say what (see my reply [EA(p) · GW(p)] to Habryka for detail on what happened with the media). Jonas Vollmer says in reply [EA(p) · GW(p)] to Habryka's comment:

... "better than many did in the early stages (including myself in the early stages of EAF) but (due to lack of experience or training) considerably worse than most EA orgs would do these days." There are many counterintuitive lessons to be learnt, many of which I still don't fully understand, either.

but doesn't elaborate. I have also talked to someone at CEA at length about media, including what happened with the hotel, and they didn't suggest anything that we could've done better given the situation (of the media outlets publishing whether we liked it or not). So I'm genuinely curious here. Although, ok, I guess maybe we could’ve removed the flipboard sheet from the wall before the journalist came in, even though it was a surprise visit.

Replies from: Julia_Wise
comment by Julia_Wise · 2019-11-07T21:07:08.111Z · EA(p) · GW(p)

Hi Greg - from my perspective, CEA did discuss all of these points with you. For example, last week when you emailed and asked what the Hotel could have done better on media, I replied about something I saw as a mistake and what I thought should have been done differently. We’ve also discussed the staffing issue. I'm happy to discuss more by email or call if you'd like.

I understand we may view these situations differently and that you may disagree with CEA's recommendations for improvements. We might also have different views about how much explicit direction (versus just advice) it’s appropriate for us to give an external org. I don't think it's accurate, though, to indicate that we haven't provided feedback or suggestions.

Replies from: Greg_Colbourn, Greg_Colbourn
comment by Greg_Colbourn · 2019-11-07T21:15:45.216Z · EA(p) · GW(p)

Hi Julia, ok but to me the point you raised about media was tangential, i.e. it was not directly related to the PR situations themselves. For those curious - I missed a meeting with a professional communications advisor at EAG London last year, on account of missing an email (in which the meeting was arranged for me) sent the day before whilst I was driving to London. I was overwhelmed at the time with interest in the hotel, and that wasn't the only email (or meeting) I missed.

Replies from: Julia_Wise
comment by Julia_Wise · 2019-11-07T21:30:25.070Z · EA(p) · GW(p)

The point I raised was not about the round of media stories from Sept 2018, but was about preparing for future media inquiries. So you're right that it's not about past media situations, but about how further situations might be handled.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-11-07T21:51:44.642Z · EA(p) · GW(p)

Fair point, but Nicole refers to:

Some concern about the handling of past PR situations

Also it's worth mentioning our actual subsequent track record over the past year (i.e. 0 further PR situations).

comment by Greg_Colbourn · 2019-11-07T21:30:55.858Z · EA(p) · GW(p)

Regarding explicit direction vs advice - for me it was the fact that something I thought had been dealt with acceptably seems to have - unbeknownst to me - remained a live issue in terms of it affecting funding decisions. More explicit direction at the time, along the lines of "if you want to get funding from CEA you need to do this", seems like it would've been better in hindsight.

Replies from: Julia_Wise
comment by Julia_Wise · 2019-11-07T21:57:45.589Z · EA(p) · GW(p)

From my perspective, I repeatedly gave you information about a situation that I saw as a problem. How you decided to handle the problem as the manager of the project was up to you. We don't see it as a good idea for funders to make ultimatums about the staffing decisions of potential grantees. But as Nicole said, this was one of the things among many she considered when looking back at the history of the project.

Replies from: Julia_Wise
comment by Julia_Wise · 2019-11-08T21:52:48.012Z · EA(p) · GW(p)

After talking more with Greg, I realized I should clarify that I don't mean I think something went badly wrong at the Hotel.

As Nicole said above, CEA would be happy to help the Hotel with finding an excellent Project and Community Manager, and to consider helping to fund these roles if there's another source of general funding.

I have no objections to other donors supporting the Hotel. The default is that projects don't get EA Grants, and this situation should be seen as "the default happened" rather than "the Hotel did something unusually bad to disqualify itself from funding it would otherwise have gotten."

Replies from: Milan_Griffes
comment by Milan_Griffes · 2019-12-14T18:06:58.413Z · EA(p) · GW(p)
As Nicole said above, CEA would be happy to help the Hotel with finding an excellent Project and Community Manager, and to consider helping to fund these roles if there's another source of general funding.

Is CEA helping out with this?

Replies from: Greg_Colbourn, Nicole_Ross
comment by Greg_Colbourn · 2020-08-10T08:51:15.613Z · EA(p) · GW(p)

We now have general funding for the next few months and are hiring for both a Community & Projects Manager and an Operations Manager, with input from Nicole and others at CEA. Unfortunately, with the winding down of EA Grants [EA · GW], the possibility of funding for the Community & Projects Manager salary has gone. If anyone would like to top up the salaries for either role (currently ~£21.5k/yr pro rata, including free accommodation and food), please get in touch!

comment by Nicole_Ross · 2019-12-20T19:01:50.939Z · EA(p) · GW(p)

We are still happy to help with hiring, and to consider helping to fund the role if there's enough general funding. We haven't received new info from Greg about whether there is enough general funding to make it worth moving forward with hiring, so we're currently on standby.

comment by Greg_Colbourn · 2019-11-07T18:10:34.839Z · EA(p) · GW(p)
Potential for community health issues, and concern about handling of a staffing issue

It's true that there is potential for community health issues whenever a group of people live together, but I think we have generally fared well in this regard so far. It has been suggested that there is significant reputational risk involved in funding a project such as the EA Hotel, given the interpersonal dynamics of a large group of people living together, and that it might therefore be better for it to be funded by individuals rather than grant-making organisations. As a counter-point, however: most universities provide massively communal student accommodation.

Regarding the staffing issue, I'm afraid there's not much I can say publicly. Although it was my understanding at the time that we dealt with it appropriately, after taking advice from prominent community members.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2019-11-25T01:05:25.267Z · EA(p) · GW(p)

I'm not convinced community health issues are uniquely problematic when you have people living together. I feel like one could argue just as easily that conferences are risky for community health. If something awkward happens at EA Global, you'll have an entire year to chew on that before running into the person next year. (Pretty sure that past EA Global conferences have arranged shared housing in e.g. dormitories for participants, by the way.) And there is less shared context at a conference because it happens over a brief period of time. One could also argue that having the community be mostly online runs risks for community health (for obvious reasons), and it's critical for us to spend lots of time in person to build stronger bonds. And one could argue that not having much community at all, neither online nor in person, runs risks for community health due to value drift. Seems like there are risks everywhere.

If people really think there are significant community health risks with EA roommates, then they could start a charity which pays EAs who currently live with EA roommates to live alone. To my knowledge, no one has proposed a charity like that. It doesn't seem like a very promising charity to me. If you agree, then by the reversal test, it follows that as a community we should want to move a bit further in the direction of EAs saving money by living together.

Replies from: Gregory_Lewis
comment by Gregory Lewis (Gregory_Lewis) · 2019-11-25T07:48:09.788Z · EA(p) · GW(p)

The reversal test doesn't mean 'if you don't think a charity for X is promising, you should be in favour of more ¬X'. I may not find homeless shelters, education, or climate change charities promising, yet not want to move in the direction of greater homelessness, illiteracy, or pollution.

If (like me) you'd prefer EA to move in the direction of 'professional association' rather than 'social movement', this attitude's general recommendation to move away from communal living (generally not a feature of the former, given the emphasis on distinguishing between personal and professional lives) does pass the reversal test, as I'd forecast holding the same view even if the status quo was everyone already living in a group house (or vice versa).

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2019-11-25T08:42:30.156Z · EA(p) · GW(p)

The reversal test doesn't mean 'if you don't think a charity for X is promising, you should be in favour of more ¬X'. I may not find homeless shelters, education, or climate change charities promising, yet not want to move in the direction of greater homelessness, illiteracy, or pollution.

Suppose you're the newly appointed director of a large charitable foundation which has allocated its charitable giving in a somewhat random way. If you're able to resist status quo bias, then usually, you will not find yourself keeping the amount allocated for a particular cause at exactly the level it was at originally. For example, if the foundation is currently giving to education charities, and you don't think those charities are very effective, then you'll reduce their funding. If you think those charities are very effective, then you'll increase their funding.

Now consider "having EAs live alone in apartments in expensive cities" as a cause area. Currently, the amount we're spending on this area has been set in a somewhat random way. Therefore, if we're able to resist status quo bias, we should probably either be moving it up or moving it down. We could move it up by creating a charity that pays EAs to live alone, or move it down by encouraging EAs to move to the EA Hotel. (Maybe creating a charity that pays EAs to live alone would be impractical or create perverse incentives or something, this is more of an "in principle" intuition pump sort of an argument.)

Edit: With regard to the professionalism thing, my personal feelings on this are something like the last paragraph in this comment [EA(p) · GW(p)] -- I think it'd be good for some of us to be more professional in certain respects (e.g. I'm supportive of EAs working to gain institutional legitimacy for EA cause areas), but the Hotel culture I observed feels mostly acceptable to me. Probably some mixture of not seeing much interpersonal drama while I was there, and expecting the Hotel residents will continue to be fairly young people who don't occupy positions of power (grad student housing comes to mind). FWIW, my personal experience is that the value of professionalism comes up more often in Blackpool EA conversations than Bay Area EA conversations. With the Bay Area, you may very well be paying more rent for a less professional culture. Just my anecdotal impressions.

Replies from: Khorton, Gregory_Lewis
comment by Kirsten (Khorton) · 2019-11-25T09:16:39.911Z · EA(p) · GW(p)

I find this thought experiment really weird because I don't think EAs living together should be centrally managed. It seems really obvious to me that EA as a movement faces fewer risks when a few friends who met through EA decide to move in together than when people apply to an 'EA house' with social programmes where they don't know anyone.

Like, if a couple living in the EA Hotel break up, there's a good chance they'll both continue living there and it'll be very awkward. In a flatshare, I'd expect one of them to move out ASAP. The social norms are just so different.

comment by Gregory Lewis (Gregory_Lewis) · 2019-11-25T09:49:33.692Z · EA(p) · GW(p)

I agree it would be surprising if EA happened upon the optimal cohabitation level (although perhaps not that surprising, given individuals can act in their own best interest, which may reasonably approximate the global optimum). Yet I maintain the charitable-intervention hypothetical is a poor intuition pump, as most people would be dissuaded from 'intervening' to push towards the 'optimal cohabitation level' for 'in practice' reasons - e.g. the much larger potential side-effects of trying to twiddle this dial, preserving the norm of leaving people to manage their personal lives as they see best, etc.

I'd probably want to suggest the optimal cohabitation level is below what we currently observe (e.g. besides the issue Khorton mentions, cohabitation with your employees/bosses/colleagues or funder/fundee seems to run predictable risks), yet be reluctant to 'intervene' any further up the coercion hierarchy than expressing my reasons for caution.

comment by Greg_Colbourn · 2019-11-07T17:30:20.882Z · EA(p) · GW(p)

Thanks for commenting Nicole. To address your points (will post a separate comment for each):

Hotel management generally (including selection of guests/projects)

In terms of general management, I agree that there is always room for improvement, but I don't think things have been too bad so far.

Regarding the selection of guests/projects, I have a lot to say about this, which I hope to cover in EA Hotel Fundraiser 10: Estimating the relative Expected Value of the EA Hotel (Part 2), and possibly also a separate post focusing more on my personal opinions. For now I will say that I think there might be some philosophical disagreement between us, although I can't be certain as I don't know the specifics of which guests/projects you are referring to in particular.

comment by CEEALAR (EA Hotel) · 2019-11-07T17:22:27.331Z · EA(p) · GW(p)

The Project and Community Manager (or Community & Projects Manager) is a role that largely involves overseeing the EA-focused work being done at the Hotel, facilitating productivity and offering practical and strategic advice to guests, in order to help maximise the value of their work to the world.

Other tasks for this role include: answering email enquiries; video calls with applicants; coordinating with Trustees and Advisors to vet applicants; helping maintain community morale at a high level, and resolving conflict if it arises, in coordination with the Operations Manager; developing overall strategy for the EA Hotel, in coordination with Trustees.

We hope to do a hiring round for the role as and when we get back to 6 months' runway of general operating costs, and appreciate Nicole's interest in potentially funding the role. Denisa Pop is filling the role in the interim.

comment by Stefan_Schubert · 2019-10-25T15:21:42.367Z · EA(p) · GW(p)

Can you say what those sticking points are? I guess that could be relevant to know for other potential donors.

Replies from: EA Hotel, Greg_Colbourn
comment by CEEALAR (EA Hotel) · 2019-10-25T16:43:26.664Z · EA(p) · GW(p)

Not sure how much we're allowed to say. Will ask the grantmakers.

comment by Greg_Colbourn · 2019-11-07T18:25:26.809Z · EA(p) · GW(p)

See Nicole's comment [EA(p) · GW(p)] in the parent thread.

comment by Elizabeth · 2019-10-25T20:49:04.797Z · EA(p) · GW(p)

One of the fund managers published some thoughts here [EA(p) · GW(p)] six months ago.

Replies from: Greg_Colbourn
comment by Greg_Colbourn · 2019-10-26T17:34:12.659Z · EA(p) · GW(p)

Yes, where I say above [EA(p) · GW(p)] that I "probably was a little over-optimistic about the funding prospects for the EA Hotel at the time [when buying the hotel next door]", it was largely based on this exchange.

comment by CEEALAR (EA Hotel) · 2019-10-25T14:34:49.093Z · EA(p) · GW(p)

"How might you get on a sustainable footing in terms of funding?" We're hopeful that within 2-5 years we could be sustained by alumni donating back amounts higher than their stay cost.

comment by CEEALAR (EA Hotel) · 2019-10-25T14:33:20.983Z · EA(p) · GW(p)

"Tax deductibility?" We're still in the process of trying to get a charity registered. This has ended up being a lot more complex and time-consuming than initially expected down to the uniqueness of our project. We do have some potential donors waiting on it, but again, we need funds to bridge the gap to getting charitable status (best case scenario: this could happen by the end of 2019).

Replies from: Peter_Hurford
comment by Peter Wildeford (Peter_Hurford) · 2019-10-25T15:38:48.905Z · EA(p) · GW(p)

Could you get a fiscal sponsor? I guess at this point it makes sense to just wait until the end of the year...

Replies from: EA Hotel
comment by CEEALAR (EA Hotel) · 2019-10-25T16:41:12.912Z · EA(p) · GW(p)

We have talked to people about this but it often comes down to the fact that even if they could hold the money for us, they'd only be able to give it to us if we get non-profit status (and this isn't a certainty).

Replies from: Peter_Hurford
comment by Peter Wildeford (Peter_Hurford) · 2019-10-25T19:13:25.619Z · EA(p) · GW(p)

Huh, maybe it's a UK thing? In my US / CA experience, you can get money (and offer tax deductions) from a fiscal sponsor even if you are not a non-profit.

Replies from: Denkenberger
comment by Denkenberger · 2019-10-28T22:25:20.848Z · EA(p) · GW(p)

Would CEA be willing to accept donations and route them to EA hotel so the donors in the US or UK can get the tax advantage?

comment by CEEALAR (EA Hotel) · 2019-10-25T14:17:52.003Z · EA(p) · GW(p)

"Why not just charge people?" I think that would end up missing most of the counterfactual value. We are providing grants in the form of free accommodation and board for those working full time on EA-related endeavors (at a very low cost per person-year of work - ~£6k). Having a default of charging would curtail the interest of the people and projects most likely to benefit (and the value produced by them). It kind of goes against most of the point of the project (like trying to save a scholarship by asking the recipients to pay). See this thread for further discussion: https://www.facebook.com/groups/EAHotel/permalink/2617815558273857/?comment_id=2618854804836599 [Note: closed group; people are welcome to join]

Replies from: Liam_Donovan, David_Moss, Milan_Griffes
comment by Liam_Donovan · 2019-10-26T18:01:00.203Z · EA(p) · GW(p)

What if rooms at the EA Hotel were cost-price by default, and you allocated "scholarships" based on a combination of need and merit, as many US universities do? This might avoid a negative feedback cycle (because you can retain the most exceptional people) while reducing costs and making the EA Hotel a less attractive target for unaligned people to take resources from.

Replies from: EA Hotel
comment by CEEALAR (EA Hotel) · 2019-10-27T08:08:59.173Z · EA(p) · GW(p)

With the charity structure we're setting up, charging cost price will also amount to a grant in the form of a partial subsidy. Charging anyone less than market rate (~double cost price) means they are a beneficiary of the charity. So in practice everyone will have to apply for a grant of free accommodation, board and stipend, and the amount given (total or partial subsidy) will depend on their need and merit.

comment by David_Moss · 2019-10-25T19:53:19.005Z · EA(p) · GW(p)

I think that would end up missing most of the counterfactual value... It kind of goes against most of the point of the project (like trying to save a scholarship by asking the recipients to pay).

There could be significant value to some people in having subsidised, much-cheaper-than-usual rent (in a hotel with a ready-made dedicated EA community), even if it's not free. Of course, it's a further question whether there are enough such people to sustain the hotel in the short term if it transitions away from fully covering expenses.

I think it would be interesting to see how many current/potential guests could/would pay some small sum. Going forward, one could also have some kind of honour-based system, where people indicate whether they would be able to pay some rent while staying at the hotel or whether they would require full coverage plus a stipend.

Replies from: EA Hotel
comment by CEEALAR (EA Hotel) · 2019-10-26T17:27:44.375Z · EA(p) · GW(p)

We already have an honour-based system where people with an income or >24 months runway in savings are asked to pay cost price. We could perhaps tighten this up, but don't really want to end up with a system where people with very limited resources would feel obliged to pay and thus don't apply.

comment by Milan_Griffes · 2019-10-25T21:29:04.467Z · EA(p) · GW(p)

People tend to value things more if they pay for them than if they're free. [Citation needed]

Replies from: Liam_Donovan
comment by Liam_Donovan · 2019-10-26T17:55:32.115Z · EA(p) · GW(p)

What does this mean in the context of the EA Hotel? In particular, would your point apply to university scholarships as well, and if not, what breaks the analogy between scholarships and the Hotel?