Posts

Introduction to Effective Altruism + Socialising (Free Pizza!) 2022-10-01T20:31:09.777Z
EA Germany Meetup @TEAMWORK 2022-09-30T15:44:03.816Z
Learning from Matching Co-Founders for an AI-Alignment Startup 2022-09-24T13:11:43.430Z
Worldbuilding Reading Picnic 2022-06-01T09:16:20.608Z
Mastermind Groups: A new Peer Support Format to help EAs aim higher 2022-05-31T18:37:14.213Z
Consider Changing Your Forum Username to Your Real Name 2022-04-26T17:07:10.211Z
Proposal for Forecasting Givewell-Charity Impact-Metrics 2022-04-13T10:21:34.484Z
Social Meetup in March 2022-02-25T16:44:35.225Z

Comments

Comment by Patrick Gruban (gruban) on Report on German Local Groups · 2022-09-04T10:58:46.042Z · EA · GW

Thank you for doing the research and writing it up, I found this very helpful!

Comment by Patrick Gruban (gruban) on My EA Failure Story · 2022-07-12T08:24:44.585Z · EA · GW

Thank you for sharing; I think your post will resonate with many people and show them they are not alone in their struggles. I've gone through similar phases of depression, guilt and feelings of rejection, and of not allowing myself to seek help or complain because I'm much better off than 99% of the world. This sucks.

The Celebrating Failure session at the fUnconference that Ludwig mentioned felt cathartic to me, as people shared professional and personal failures. We're an unusual community in EA: we try hard and constantly fall short of our own expectations. This goes for people applying for jobs as well as for leaders of EA orgs. A coach recently told me that most of their calls turn to mental health problems at some point.

I hope you will find a community that supports you. I have proposed Masterminds as a format and am in talks to see how we can make it happen. I've also heard of similar remote formats and interventions being planned and am hopeful we will see some soon.

As a side note: what helped me in the last few months were the 80K talk in which Will MacAskill discusses his depression and the one with Sam Bankman-Fried, where he agrees that most people who try will fail:

So I think there are really compelling reasons to think that the “optimal strategy” to follow is one that probably fails — but if it doesn’t fail, it’s great. But as a community, what that would imply is this weird thing where you almost celebrate cases where someone completely craps out — where things end up nowhere close to what they could have been — because that’s what the majority of well-played strategies should end with. I don’t think that we recognize that enough as a community, and I think there are lots of specific instances as well where we don’t incentivize that.

Also really helpful was the book by Julian Simon where he talks about overcoming depression in a very relatable way (link via Rob Wiblin).

I shudder to think that someone who donates any amount to AMF would ever feel bad about it, yet I know how hard it can be to convince oneself otherwise.

I recently read Notes From a Pledger, written by someone who is similarly far away from a hub and is OK with donating. The comment by Michelle Hutchinson touched me, as it brought back the realisation that by donating we're already doing so much more than most people. It's great to aim high and try to get a job in EA, but there is no shame in failing, getting a normal job and continuing with donations.

Just trying and failing is something to be celebrated as most people never try. Thank you for trying!

Comment by Patrick Gruban (gruban) on Cape Town Summer EA Office · 2022-06-24T15:38:44.582Z · EA · GW

Having spent two weeks in Cape Town this March and met Jordan there, I can confirm that it's a great city to be in, and I would love to see an EA coworking space there. Thank you for taking the initiative!

Comment by Patrick Gruban (gruban) on Mastermind Groups: A new Peer Support Format to help EAs aim higher · 2022-06-01T08:36:34.298Z · EA · GW

Thanks, done 

Comment by Patrick Gruban (gruban) on Mastermind Groups: A new Peer Support Format to help EAs aim higher · 2022-06-01T08:07:38.540Z · EA · GW

I can see your concern, and coming up with a new name could be nice. On the other hand, I suspect most EAs wouldn't be too concerned if we use a tool internally in a way that works for us while it's also being used by others for less useful purposes.

Comment by Patrick Gruban (gruban) on Mastermind Groups: A new Peer Support Format to help EAs aim higher · 2022-06-01T07:52:58.363Z · EA · GW

I read your post while researching this one and found it very interesting. To me, it seemed that you were describing something bigger and more encompassing than a Mastermind, which is restricted in size, topics and frequency of meetings. But there is definitely some overlap, and yours is one of the few posts on the forum about deliberate groups and their setup.

Comment by Patrick Gruban (gruban) on Mastermind Groups: A new Peer Support Format to help EAs aim higher · 2022-06-01T07:48:00.359Z · EA · GW

Thanks, I've made it bold at the top.

Comment by Patrick Gruban (gruban) on "Big tent" effective altruism is very important (particularly right now) · 2022-05-20T05:04:52.836Z · EA · GW

Thank you for this post; I was thinking along similar lines and am grateful that you wrote this down. I would like to see growth in the number of people who make decisions around career, donations and volunteering based on the central EA question, regardless of whether they call themselves EAs. More than a billion people live in high-income countries alone, and I find it conceivable that 1-10% would be open to making changes in their lives, depending on the action they can take.

But for EA to accommodate 10-100 million people, I assume we would need different shopfronts in addition to the backend capabilities (having enough charities that can handle vast amounts of donations, having pipelines for charity entrepreneurship that can help these charities grow, consulting capacity to help existing organizations switch to effectiveness metrics, etc.). If we look at the movement from the perspective of scaling to these numbers, I assume we will see a relatively short-term saturation in longtermist cause areas. Currently we don't seem to be funding-constrained in that area, and I don't see a world where millions working on these problems would be better than thousands. So from this perspective I would like us to take a longer view and build the capacity now for a big EA movement that will be less effective on the margin, while advocating for the most effective choices in parallel.

Comment by Patrick Gruban (gruban) on Most students who would agree with EA ideas haven't heard of EA yet (results of a large-scale survey) · 2022-05-20T04:27:41.433Z · EA · GW

Why do you think a conversion rate of 5% is shockingly low? Depending on the intervention, this can be a high rate in marketing. A fellowship is a relatively small commitment, while changing one's career is a relatively high ask. As we're not emphasizing earning to give as much as before, I would also expect many people not to find high-impact work.

Comment by Patrick Gruban (gruban) on DeepMind’s generalist AI, Gato: A non-technical explainer · 2022-05-17T04:59:53.996Z · EA · GW

Thank you for writing this! I found it very helpful, as I had only seen headlines about Gato before and am not watching developments in AI closely. I liked the length and style of writing very much and would appreciate similar posts in the future.

Comment by Patrick Gruban (gruban) on EA and the current funding situation · 2022-05-11T04:51:48.262Z · EA · GW

I share your worries about the effects on culture. At the same time I don't see this vision as bad:

For many months, they will sit down many days a week and ask themselves the question "how can I write this grant proposal in a way that person X will approve of" or "how can I impress these people at organization Y so that I can get a job there?", and they will write long Google Docs to their colleagues about their models and theories of you, and spend dozens of hours thinking specifically about how to get you to do what they want, while drawing up flowcharts that will include your name, your preferences, and your interests.

Imagine a global health charity that wants to get on the GiveWell Top Charities list. Wouldn't we want it to spend a lot of time thinking about how to get there, ultimately changing the way it works in order to produce the evidence needed to get included? For example, Helen Keller International was founded more than 100 years ago, and its vitamin A supplementation program is recommended by GiveWell. I would love to see more external organisations change in order to get EA grants instead of us trying to reinvent the wheel where others might already be good.

Organisations getting started or changing based on the available funding of the EA community seems like a win to me. As long as they have a mission that is aligned with what EA funders want and they are internally mission-aligned, we should be fine. I don't know enough about Anthropic, for example, but they just raised $580M, mainly from EAs, while not intending to make a profit. This could be a good signal to more organisations out there to set up a model that is interesting to EA funders.

In the end, it comes down to the research and decision making of the grantmaker. GiveWell has a process where they evaluate charities based on effectiveness. In the longtermism and meta space, we often don't have such evidence, so we may sometimes rely more on the value alignment of people. Ideally, we would want to reduce this dependence and see more ways to independently evaluate grants regardless of the people getting them.

Comment by Patrick Gruban (gruban) on EA and the current funding situation · 2022-05-10T10:25:43.608Z · EA · GW

I was also surprised to see managing and scaling organisations described as "rarely people's favourite activities"; this seems like a strong claim. For me, it's the most motivating activity, and I'm trying to find an organisation where I can contribute in this area.

Comment by Patrick Gruban (gruban) on EA and the current funding situation · 2022-05-10T10:22:14.822Z · EA · GW

He might be referring to Gary Wang, whom he mentions later in the text, but I'm not sure about this.

Comment by Patrick Gruban (gruban) on Do you offset your carbon emissions? · 2022-05-05T21:46:05.490Z · EA · GW

For travel, I calculate what offsetting would have cost, take that amount from my travel budget and donate it to EA-recommended climate charities (via Effektiv Spenden in Germany).
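
To make the arithmetic concrete, here is a minimal sketch of that calculation; the emissions figure and offset price below are made-up placeholders, and only the general approach reflects what I actually do:

    # Sketch of the "what offsetting would have cost -> donate instead" calculation.
    # Both numbers below are illustrative placeholders, not my real figures.
    flight_emissions_tonnes = 1.5        # hypothetical CO2e for a return flight
    offset_price_eur_per_tonne = 25.0    # hypothetical offset price per tonne
    donation_eur = flight_emissions_tonnes * offset_price_eur_per_tonne
    print(f"Move {donation_eur:.2f} EUR from the travel budget to a climate charity")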

Comment by Patrick Gruban (gruban) on How do we create a culture of ambition without deteriorating the community’s mental health? · 2022-04-30T11:19:22.049Z · EA · GW

Thank you so much, Max, for writing this! I started a draft forum post for a proposal just yesterday. My idea was to have groups of EAs that aim high, fail often and support each other. Knowing that others are in similar situations and having a smallish group to discuss the strain and celebrate trying might make things easier; I at least would like it. I was planning to share the draft with you anyway and would love to get your take on it.

Comment by Patrick Gruban (gruban) on Increasing Demandingness in EA · 2022-04-29T14:48:17.877Z · EA · GW

Thank you for this post that touches on the important point of demandingness. Personally, I can see it in two ways.

On a global level, giving 10% to effective causes is relatively rare. Giving What We Can has grown impressively, but still fewer than 1 in every 50,000[1] of the world's high-income population have taken the pledge. 10% is also higher than average donation rates, which are below 2% of GDP. Even in the EA survey, only a third said they donate at least this amount. While some of the top areas in EA seem less funding-constrained, there is still much room for spending until, for example, GiveDirectly can't give away any more money. In that sense, I'm very grateful to anyone who is able and willing to commit to giving 10% or more of their income and would not want to exclude them from seeing themselves as Effective Altruists. If we've funded everything that is equivalent to GiveDirectly's impact, or we have at least 50 million people donating 10+%, then I'd revisit this, but currently there is still enough to do.

On a personal level, the concept of demandingness has no limit. 10% is just a Schelling point: something that is easy to communicate to people new to the movement, a goal to be reached. Doing good better doesn't stop there, and it doesn't stop at thinking about donations. I prefer the framing of excited altruism, or of altruism as a central purpose. Another framing could be that of aiming higher: continuously stretching for ways to have more impact while taking care of oneself. Each of these framings will have its supporters, and I would encourage anyone to select the one that motivates them best. At the same time, the community and its support structure are very important to keep people healthy and motivated when they feel they are failing at their self-set goals.

  1. ^

    Taking the figure of 500 million high-income people in the world and 8,500 GWWC members
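
As a quick sanity check of the "1 in 50,000" figure, a rough calculation using the two numbers from the footnote:

    # Sanity check of the ratio using the two figures above.
    high_income_population = 500_000_000  # rough estimate of high-income people worldwide
    gwwc_members = 8_500                  # GWWC members at the time of writing
    ratio = high_income_population / gwwc_members
    print(f"About 1 in {ratio:,.0f}")     # ~1 in 58,824, i.e. fewer than 1 in 50,000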

Comment by Patrick Gruban (gruban) on Is it possible to change user name? · 2022-04-28T08:21:59.946Z · EA · GW

Try contacting the forum team. I haven't seen a written policy of when it's possible to change your username so there might be some restrictions that apply.

Comment by Patrick Gruban (gruban) on Mid-career people: strongly consider switching to EA work · 2022-04-28T08:19:27.469Z · EA · GW

It’s interesting that this comment talks about more generalist roles being mentioned at EAG that haven’t been publicised. I wonder if it is more likely that specialist roles get ‘officially’ publicised, while the more generalist ones are likelier to not be, maybe to the extent of only living in someone’s head in the style, ‘we could really do with someone to help us out on operations…’

As I was only looking for operations roles, I don't know whether it's different for specialists. At the moment there seems to be a lot of movement, with orgs getting new funding and being able to expand quickly. People at the orgs might be able to tell you they are in the process of writing a job post, or they might already have a document but not have posted it publicly. Also, for some jobs I assume it might be easier to approach people or networks before posting them publicly and then dealing with many applications. But this is only speculation.

What I would find really useful as more of a generalist is advice around ‘here’s how to use your skill stack to get a job in EA’.

My impression is that co-founders of organisations often don't themselves know what a generalist might be doing in a year, as everything is changing quickly. This seems very similar to startups. When hiring, I would always point out that a job title in a contract should be seen as a starting point and might have little overlap with the actual job a few months in.

The upside is that as a generalist in a small and growing organisation you can bring your specific talents to the table and have the chance to change the role so that it fits your strengths. You can then help outsource or hire talent that can cover your weaknesses.

For mid-career people, it feels like runway may be less of an impact relative to the knowledge you may be giving up something with a guaranteed impact, even if it may not be optimal, on the basis of uncertain factors.

In terms of giving up something, you might try to get a sabbatical at your current company to try out direct EA work for a year. If this doesn't work out, you might discuss quitting on good terms so that they'd be willing to hire you again if they have a job open after a year. It might be useful to research how likely this would be to work out for you.

For the general framing of impact, I personally ask myself: how can I increase the expected value of the EA community having a bigger impact? Especially in longtermist organisations, the additional dollar donated might be much less useful at the moment than being a co-founder or an early employee of a new organisation. This can still be true if the organisation has a high risk of failure but might do a lot of good if it succeeds.

I see that this can make it hard for many mid-career people to change jobs and leave a secure position. But by being willing to do it, you're filling a neglected gap. The counterfactual expected value of your work might be one or two orders of magnitude higher than earning to give.

Comment by Patrick Gruban (gruban) on Consider Not Changing Your Forum Username to Your Real Name · 2022-04-28T07:42:25.020Z · EA · GW

I like your framing of public/private EAs and your pointing out that this differs from the question of switching from pseudonyms (where you could easily find out the real name) to real names. I could see other cases where completely private names might be useful:

  • You have a stalker and don't want them to see when you post or what you are currently thinking about
  • You're discussing career change without wanting your current employer to know about it
  • Any kind of whistleblower situation where you're pointing out bad behaviour
  • Discussing things that the community is ok with discussing but that have reputational risks outside

In some cases, a separate anonymous account parallel to your public one might be good.

In terms of the points that you raised about politicians and other public figures, I would advocate for real-name usage. Especially as a politician you often visit and engage with many different groups, and as long as your message is consistent between the platforms, doing this anonymously should not be encouraged.

In general, we should try to make EA as broad a platform as is feasible given our principles, in order not to be associated with one political party or a few donors. The more different people we have interacting with us under their real names, the better - at least as long as we can uphold good discussion norms.

Similarly, if you're thinking about starting EA 2.0, it would be good to first openly engage with the community here to see if there is support for changes from within. If you then start EA 2.0, it should either be consistent with your writing here, or you should be able to argue why you changed your mind.

Comment by Patrick Gruban (gruban) on Consider Not Changing Your Forum Username to Your Real Name · 2022-04-28T07:19:36.780Z · EA · GW

This reminds me of Julia Wise's comment:

My coworkers got me a mug that said "Sorry, I'm not Julia Galef" to save me from having to say it so much at conferences.

So it seems this also comes up with real names, but I do agree with your point.

Comment by Patrick Gruban (gruban) on Consider Not Changing Your Forum Username to Your Real Name · 2022-04-28T06:22:56.784Z · EA · GW

After reading this I immediately had the thought that your first name might not be Evelyn as I had assumed but perhaps Eve or Evel (as in Evel Knievel). So yes, I agree with this point.

Comment by Patrick Gruban (gruban) on Mid-career people: strongly consider switching to EA work · 2022-04-26T17:20:09.923Z · EA · GW

>doing research projects in an EA hub

I meant staying for some time in a location where many other EAs are (like the Bay Area, Oxford, London) and working on a research project from there. This could combine the exchange with other people with self-study and deep thinking.

>joining an 80k career group for people of the same age

I would like to have a group of mid-career professionals who are all taking the 8-week 80k career course, so we can encourage each other and discuss questions that come up.

Comment by Patrick Gruban (gruban) on Is it possible to change user name? · 2022-04-26T16:23:37.462Z · EA · GW

There seems to have been an update since this was posted, as I was now able to change my username without any additional checks.

Comment by Patrick Gruban (gruban) on Mid-career people: strongly consider switching to EA work · 2022-04-26T12:41:25.580Z · EA · GW

Thank you so much for writing this up, Ben! There are many things that ring true to me, being in my 40s. I'd like to add other misconceptions I experienced:

  • Misconception: EA is only for young people
    • Reality: Although joining a local group can make you feel as if you're sticking out age-wise, this will change if you attend EAG conferences, especially the mid- and late-career meetups there. Conferences are a great place to start meeting people in similar career positions. Also, directly emailing people you're interested in often leads to a call. The 80k career call is also a good place to get introduced.
  • Misconception: I will be viewed differently because of my age
    • Reality: After overcoming my own fear of being judged for my age, I found that EAs are generally welcoming and open. I now find myself joining EA meetings more often than ones with my former peers because of the shared interest and good discussion culture. I see myself as neither penalized nor given special privileges for being older. For example, job applications seem very standardized and fair.
  • Misconception:  Working for an EA org means moving if I'm not in an existing hub
    • Reality: While some organisations want to have staff on the ground, others are fully or partially remote. With more co-working spaces popping up it may even be possible to join EAs in your city while working remotely for an EA org in your specialist area.
  • Misconception: EA orgs are small and mainly need specialists
    • Reality: While many EA orgs have 5 or fewer employees, more and more are getting funding to scale up. This means there is more demand for knowledge around organisation building and scaling, as well as additional support staff roles. At EAG London I heard about jobs that hadn't been published yet and was able to apply. Prior to the conference, I hadn't even considered these roles, as operations roles had seemed very limited in scope.
  • Misconception: I can do more good by earning to give
    • Reality: This is really hard to evaluate, but after talking to some experienced people at EAG I suspect there are still many people who undervalue their potential for direct work. With new organisations scaling up and funders' demand for bigger projects ("megaprojects") not being met, I think we're seeing a skills gap, especially in entrepreneurship and management. Some people should forgo their current income in order to take jobs that funders are very willing to pay for.

Some things that come to mind that might help mid-career people:

  • Meetings/Calls with people that were in similar situations and are now working in EA-aligned jobs
  • Possibilities to explore EA work while taking a sabbatical, e.g. by doing research projects in an EA hub, joining an 80k career group for people of the same age, or doing skilled volunteering
  • Writeups of more success stories of mid-career people changing jobs
  • Talks from mid-career people in EA jobs at EA job place groups
  • Retreats for mid-career people interested in switching with speakers that have done the transition
  • More volunteering opportunities that enable getting to know organisations

I hope to be able to contribute to getting more mid-career people into direct EA work and have offered High Impact Professionals my support in that area. I'm also always happy to chat with people about this and to make introductions.

Comment by Patrick Gruban (gruban) on Altruism as a central purpose · 2022-04-07T20:38:07.135Z · EA · GW

I like this framing and it resonates with me. As an entrepreneur, I derived meaning through my companies but since engaging more with EA this has shifted to effective altruism. Now my company is a means to a bigger end which is more satisfying to me. Similarly, my volunteering in EA community building and EA software development is more satisfying than comparable activities I did before as it aligns with this purpose.

By writing "a central purpose" I assume you to leave open the possibility of people having multiple purposes, perhaps even some ranking higher? It seems that in most societies people derive the primary meaning or purpose from family, with occupation and career coming in second. 

This could be similar for people having careers in cause areas that are important to them. Someone working for an effective animal charity could see their purpose in the research they are doing (but would derive a similar purpose in a less effective academic position) or in the cause of saving animals. However, they could also see their purpose in doing good better, being willing to change their job if other cause areas seem more important, tractable and neglected. I would only see the last case as someone who has EA as their purpose, but it would be interesting to hear other views.

Comment by Patrick Gruban (gruban) on The Future Fund’s Project Ideas Competition · 2022-03-08T09:01:19.840Z · EA · GW

I agree that there is a risk that this leads to additional burden without meaningful impact. 

Given the number of certifications currently deployed that are used both publicly for marketing and to reduce supply-chain risks (see for example this certifier), I would put the chance that longtermist concerns like biosecurity risks will be incorporated into existing standards, or launched as new standards, within the next 10 years at 70%.

We could preempt this by building one or more standards based on actual expected impact instead of mere box-ticking. If this bet works out, we might make a counterfactual impact; however, I would also like to see the organisation shut down after doing its research if it doesn't see a path to a certification having impact.

Comment by Patrick Gruban (gruban) on The Future Fund’s Project Ideas Competition · 2022-03-02T17:40:33.875Z · EA · GW

Longtermist risk screening and certification of institutions

Artificial Intelligence, Biorisk and Recovery from Catastrophe

Companies, nonprofits and government institutions participate and invest in activities that might significantly increase global catastrophic risk, such as gain-of-function research or research that might increase the likelihood of unaligned AGI. We’d like to see an organisation that evaluates and proposes policies and practices that should be followed in order to reduce these risks. Institutions that commit to following these practices and submit themselves to independent audits could be certified. This could help investors and funders screen institutions for potential risks. It could also be used in future corporate campaigns to move companies and investors toward adopting responsible practices.

Comment by Patrick Gruban (gruban) on We need 40,000h or maybe even 20,000h · 2022-02-19T12:30:42.487Z · EA · GW

That's a valid point, and as my comment also included the offer of a private conversation, I will try to add a few thoughts here. Generally, if someone mid-career asks me for EA career advice, I would suggest the following:

  • Look at the 80K content and especially the 8-week career course. I think it can be a misconception that this is only for students, early-career people or those who want to completely change their career path. For me, it was very helpful, and there are several places that discuss different options depending on prior experience and seniority. Thinking about what a fulfilling career looks like and doing cause prioritization is something that can be helpful for anyone.
  • Apply for an 80K 1:1 call when you're ready. I had one with Habiba, and she was able to connect me to other mid-career people who were recently hired or contracted by EA orgs. Talking to them helped me a lot to see that there are many opportunities to help organisations have an impact, and very different paths to them.
  • Take your time: the 80K advice "If you can increase the impact of your career by just 1%, it would be worth spending up to 800 hours learning how to do that.[1]" translates to 400 or 200 hours for shorter careers, which is still a lot of time.
  • Try things out. I keep coming back to the book Designing Your Life [2], especially the tips around
    • Tracking the amount of engagement and energy different activities bring, in order to better assess what kinds of work could energize you.
    • Planning and trying out smaller prototypes in order to find out what is a good fit. For me this was volunteering.
  • Get some slack: If you don't have the time in your life to think about these topics and try things out then I would recommend trying to free up time by:
    • Reducing your job hours if possible (perhaps there is a way to reduce your donations or spending for some time to only work 4 days/week?)
    • Reducing other activities for some time or using vacation
    • Temporarily moving to a place where you spend less and taking a remote job
    • Setting up a plan to free up time for when you are able to (e.g. when the children reach a certain age, when you have a certain amount saved up, etc.)
  • If this all seems too much at the moment, then give yourself some slack. Rushing into a new job that does not suit you is not only bad for you but also for the organisation hiring you. For me it took 6 years from reading about EA to starting to volunteer. I had to overcome many misconceptions around the demandingness of EA and might have dropped out if it wasn't for meeting people and seeing that everyone is human and that many people struggle with the question of how much of their lives they should dedicate to doing good.
  • For me it was and is very important to meet people and to see how big and diverse, along many dimensions, the movement is. I can't recommend enough applying for EAG and EAGx conferences and talking to many people. Especially the mid- and late-career speed meetings at EAG London were very inspiring.
  • The last point brings me back to the private conversations: as good as communicating on the forum is, I would recommend having these too. If anybody wants to have a chat, please send me a message; I would be happy to take a call.
  1. ^
  2. ^

    80K career advisor Michelle Hutchinson also found it useful, as I saw in this comment

Comment by Patrick Gruban (gruban) on We need 40,000h or maybe even 20,000h · 2022-02-18T14:06:37.214Z · EA · GW

Thank you for posting this! I'm in my mid-40s and it took me some years to get more into the EA movement due to the age difference and my not being sure what I could contribute. So I can very much understand this and also think supporting mid- and late-career people can be very helpful. For me, EAGs were the place to find and connect with other senior people and I now see many more possible ways I can play a growing role in the movement. Currently, I'm volunteering in a role I have much experience with and I see there is a need for this kind of work. 

I'm happy to see that High Impact Professionals (mentioned in another comment already) is taking on parts of this space. Giving What We Can is also adding more general advice that could address some of your questions around donations. At least for me, joining GWWC meetups helped me see that many people have anxieties around driving themselves to donation and income levels that may not be sustainable in the long term.

Fewer than 9% of respondents in the 2019 EA survey said they are over 45, and the 2020 survey looks similar. Missing out on a big part of the population - one that includes many people with high salaries, career capital, large networks and also more free time as their children grow up - seems like a waste. It also seems to me that there are more people with grown children who are looking for meaning while having fewer financial obligations, for whom more engagement in EA could be a good fit.

I'd be happy to chat and will send you a message.

Comment by Patrick Gruban (gruban) on EA/Rationalist Safety Nets: Promising, but Arduous · 2021-12-31T10:51:44.283Z · EA · GW

I think this is a good point. One possibility for addressing this could be at the level of local EA groups, giving organizers the tools and training to identify struggling members and help them better. As a local organizer, I would find additional resources helpful, especially if they are very action-oriented.

Comment by Patrick Gruban (gruban) on EA/Rationalist Safety Nets: Promising, but Arduous · 2021-12-31T10:42:14.848Z · EA · GW

For people who have taken the Further Pledge, an increase in salary would be less valuable than insurance paid by the employer. This might be a case that is only relevant for a few people; however, they might also be part of the most dedicated group.

Comment by Patrick Gruban (gruban) on EA/Rationalist Safety Nets: Promising, but Arduous · 2021-12-31T10:37:14.336Z · EA · GW

Having listened to the 80K podcast with Howie Lempel, it seems that for him it was important to get out, for a time, of a context where he was with EAs for both work and friendship in order to recover. So I'm not sure for which cases this would actually be a good solution.

Comment by Patrick Gruban (gruban) on EA/Rationalist Safety Nets: Promising, but Arduous · 2021-12-31T10:33:22.065Z · EA · GW

I wasn't as precise as I could have been and will try to clarify:

  • The German health, care and pension insurance system is set up so that employees and employers each pay 50% of the contributions. The contribution is defined as a percentage of income. High-income earners subsidise low-income workers in this way.
  • The KSK is a system on top of this, only for self-employed artists, who would otherwise have to cover the 50% share that an employer would normally pay. Under the KSK, 50% of the contribution is paid by the artist (the same as an employee would pay), the government subsidises 20%, and clients cover 30%.
  • Clients have to pay without knowing whether the artist is part of the KSK, so there is some additional subsidising.
  • The additional paperwork for clients could be reduced if artists were allowed to collect the payments themselves, which I would prefer.

I'm not in favour of how the KSK system works and wouldn't recommend it as a model. However, I think their way of identifying an artist by type of work and minimum revenue from this work area is an interesting input.

Comment by Patrick Gruban (gruban) on EA/Rationalist Safety Nets: Promising, but Arduous · 2021-12-30T07:25:53.257Z · EA · GW

Thank you for the overview! What comes to mind as similar is the Künstlersozialkasse (KSK) in Germany, which is governed by a special law, the Künstlersozialversicherungsgesetz.

This artists' social fund is open to anyone who works self-employed in an artistic job (visual artists, authors, journalists, musicians, etc.) and doesn't have employees. To apply, you fill out a 9-page form stating what work you have already done and that you are above the minimum income from artistic work of €3,900/year. In the first three years of your working life you don't have to prove this minimum, and this requirement was also suspended during Covid times.

If you get accepted, the KSK will pay the cost of your health, care and pension insurance, which includes, for example, doctors, clinics, medicine, psychotherapy, rehabilitation clinics and dentists. You will have to state your income yearly and pay a portion to the KSK.

The KSK is financed by three sources:

  • Payments by artists (50%)
  • Payment by the government (20%)
  • Payment by clients that employ the artists (30%)

My company has to list all artist invoices that we paid in a year (from graphic designers, photographers, make-up artists, etc.) and submit the total to the KSK. We are then charged a percentage (currently 4.2%) of this amount. Every company and self-employed person in Germany has to do this.
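
To illustrate how the levy calculation works, here is a minimal sketch; the invoice total is a made-up example, and only the 4.2% rate is the real one mentioned above:

    # Illustration of the KSK artists' social levy calculation.
    # The invoice total is hypothetical; 4.2% is the current levy rate.
    artist_invoices_total_eur = 10_000.0
    ksk_levy_rate = 0.042
    levy_due_eur = artist_invoices_total_eur * ksk_levy_rate
    print(f"Levy due to the KSK: {levy_due_eur:.2f} EUR")  # 420.00 EUR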

An analogue in EA could be a system where you:

  • have to prove that you
    • either received EA funding for your work,
    • are working at an EA org without insurance, or
    • are in the first 3 years of your EA career
  • pay a portion of your EA salary for the insurance
  • the insurance covers health insurance and other insurance-like services
  • Funders fill the remaining gap in the payments

This model would still have the issue of vetting applicants, but one clear criterion would be that you can only get in for the first three years without showing that you have minimum funding through grant approvals or EA-aligned jobs. If you don't earn any EA money after that you would get excluded.

Comment by Patrick Gruban (gruban) on Announcing our 2021 charity recommendations · 2021-12-13T20:31:44.432Z · EA · GW

ACE has now, after two weeks, posted an explanation on their blog. I'm also surprised that it took so long to provide an explanation.

Comment by Patrick Gruban (gruban) on Announcing our 2021 charity recommendations · 2021-12-13T20:24:52.975Z · EA · GW

ACE has now posted an explanation on their blog stating "The crux of our decision for each organization was related to culture issues that we identified during our evaluation process."

Comment by Patrick Gruban (gruban) on Meditations on Caring · 2021-11-29T21:18:56.249Z · EA · GW

Thank you for coming up with the idea and trying it out! I have had good experiences with metta meditation, and including an EA mindset appeals to me. I wonder if there is an approach where you could try out different phrases with different people to optimize them before potentially making a recorded guided meditation.

Comment by Patrick Gruban (gruban) on Announcing our 2021 charity recommendations · 2021-11-24T15:16:49.715Z · EA · GW

Albert Schweitzer Foundation and Good Food Institute were both Top Charities in 2020 and are now neither Top nor Standout Charities. What changed for you to make this update?

Update 12/13: ACE has now posted an explanation on their blog stating "The crux of our decision for each organization was related to culture issues that we identified during our evaluation process."

Comment by Patrick Gruban (gruban) on Questions for Howie on mental health for the 80k podcast · 2021-08-20T13:51:47.995Z · EA · GW

I just recently read this meta-analysis of studies comparing the effectiveness of eCBT with CBT for treating depression, and I was surprised by how well eCBT does. I was wondering if there are already resources listing self-help options for which RCTs have been done, in order to recommend them.

There is a paragraph at the end of page 36 of this HLI report, but as a potential user it can be hard to find this information. There is some information on EA Hub, but it might be outdated, as for example MindEase is listed as free; similarly this document from the Mental Health Navigator Pilot. Both list tools but no evidence of their effectiveness.

So, in short, my question would be: what is the best resource for an EA to get up-to-date information about self-help options that have been proven to be effective? And if there is none, is this something the CEA community health team is working on, or would this kind of research be a good fit for a volunteer role?

Comment by Patrick Gruban (gruban) on Part 3: Comparing agency organisational models · 2021-07-26T11:48:35.448Z · EA · GW

Avoiding VAT losses: If you buy a service from an agency, they have to add VAT, typically around 20%

Can't charities get the VAT refunded like businesses in the UK? In Germany they do; only government institutions and banks don't have this option, as far as I know.

Comment by Patrick Gruban (gruban) on Part 4: Intra-organizational and non-tech agencies · 2021-07-26T11:24:42.039Z · EA · GW

Thank you very much for this sequence! I've been thinking about the tech agency model for EA and was even contemplating writing a post about it, but I'm glad you did a much better job than I would have been able to do.

  • Software developers: how appealing do you find the idea of working at a low bono vs donor-funded agency vs in-house at an EA org vs sticking with non-EA work?

I've worked as a developer in my own small agency and at a client for 10 years, and this year I started volunteering on two web development projects for EAs, plus a bit of mentoring for a startup charity. From this experience I would very much welcome an agency approach. For me the biggest upsides would be:

  • Having other developers to talk about projects
  • Having others to do code reviews (and vice-versa)
  • Having partners that can cover for me if I get sick or am on vacation (especially around DevOps issues)
  • Having people who both are EA-aligned and value high quality software development

I would love to put my volunteer work under this model and could see the agency mixing different funding cases:

  • Doing work for (lower-end) market-rates for established EA orgs
    • If an org is good at getting funds that may be easier than fundraising for a new org
    • Just having EA-aligned people in it (with experience in working for non-profits) might be enough of an incentive for the org
    • For this case it would still be motivating for a developer to choose this path instead of a slightly higher paying company and stick around for longer
    • In addition to development this could also include recruiting, training and mentorship for developers working at orgs (also giving them a team to talk about tech issues)
    • Also, I see consulting and business analysis as promising areas. Companies are often quick to request a software solution when the problem actually starts at the process and coordination level. I expect EA orgs could have similar issues.
  • Donor-funded work on specific projects like
    • EA-wide infrastructure (resources several orgs would use but no single one would want to finance)
    • Mentoring of tech people in the community
    • Training for (non-tech) product owners in orgs on writing user stories etc.
    • Workshops and retreats for the EA tech community (including tech people from orgs)
  • Low-Bono work 
    • for EA charity startups that are still in the trial phase. This could also be seen as an investment as the org will be able to pay market rates if it gets funding.
    • for experimental projects to fill a funding gap
    • for anything developers think they'd donate to anyway (although this is the weakest case for me)
  • Volunteer work
    • I'm over 40, and at my point in life doing an additional 10-20 hours per week as a volunteer seems best suited to me now. I expect there are more people in similar situations, especially among older EAs.

Also, one model I like that wasn't mentioned is that of a cooperative of freelancers. I've been doing some work with one in Munich, and for developers who want to stay independent while also sharing responsibility in a project, it seems like a good combination. The coop that I know chooses their clients based on their values, also does pro-bono work on the side, and donates all their profits. They seem pretty happy with that.

How much difference would it make if you were involved in the prioritisation process at a donor-funded org with a remit to find the highest value tech projects?

I'd personally be happy to work on any cause area, although I'd want to make sure that the project I'm working on has impact and is not just a "nice to have". But the more the client pays, the less I would want to interfere, so I could imagine some orgs paying market rate for lower-value projects.

Comment by Patrick Gruban (gruban) on As an EA, Should I renounce my US citizenship? · 2021-04-19T09:21:21.877Z · EA · GW

I renounced my US citizenship two years ago (I still have a German one and live in Germany) because of the restrictions on opening bank accounts, the uncertainty around taxes on capital gains and inheritance, and the yearly cost of paying an additional CPA for the US return. The process was pretty straightforward (going to the consulate, paying the exit fee), and at the time they told me at the consulate that many people were doing this. I've since travelled to the US once and wasn't questioned at the border.

Not having the option to work in the US is a significant downside, so I wouldn't take the decision lightly. However, once you start having more assets outside the US (especially if you start investing in companies), the risks and tax requirements can be significant.

If you can manage to open a bank account in the US it might be easier to invest there but usually you need a permanent address.

In short, I think it's worthwhile to invest some time (and perhaps money in advisors) in further researching the options you have before making a decision that either reduces your work options or exposes you to unknown financial risks.

Comment by Patrick Gruban (gruban) on Avoiding Munich's Mistakes: Advice for CEA and Local Groups · 2020-10-15T10:07:10.753Z · EA · GW

I think this post could have benefited from explaining the word "deplatforming", as used in the sentence "Recently, EA Munich decided to deplatform Robin Hanson", as described in "3 suggestions about jargon in EA".

As one of the organisers of EA Munich, I would find it helpful to know more clearly what is meant by this, as I could read it as us trying to "shut down" a speaker. It could also just be a synonym for "disinvite". I think that especially when criticizing members of the community we should be as precise as possible.

Larks was so kind as to share this article with us before posting, and I pointed out this objection, as my personal opinion, in my reply to him.

Comment by Patrick Gruban (gruban) on Avoiding Munich's Mistakes: Advice for CEA and Local Groups · 2020-10-15T09:51:12.981Z · EA · GW

As one of the organisers of the EA Munich group, this was the first thing I thought of when we heard about the press coverage of Robin Hanson: what can we in EA learn from the controversies around Peter Singer? I was thinking of your comment and of Ben Todd's quote: "Once your message is out there, it tends to stick around for years, so if you get the message wrong, you’ve harmed years of future efforts." I think much harm can be done by canceling, but it should be weighed against the potential harm of hurting the movement in a country where values and sentiments can be different than in the English-speaking world.

For me, the Robin Hanson talk would have been my first event as a co-organiser, and seeing a potential cooperation partner unearth the negative press about Robin Hanson and tell us that they would not be able to work with us if we hosted him was an indication that we shouldn't rush to hold this talk. Oliver Habryka summarised this pretty well:

Having participated in a debrief meeting for EA Munich, my assessment is indeed that one of the primary reasons the event was cancelled was due to fear of disruptors showing up at the event, similar to how they have done for some events of Peter Singer. Indeed almost all concerns that were brought up during that meeting were concerns of external parties threatening EA Munich, or EA at large, in response to inviting Hanson. There were some minor concerns about Hanson's views qua his views alone, but basically all organizers who spoke at the debrief I was part of said that they were interested in hearing Robin's ideas and would have enjoyed participating in an event with him, and were primarily worried about how others would perceive it and react to inviting him.

I just looked up what I wrote internally after the decision and still think this is a good summary:

In an ideal world we would have known about the issues beforehand, would have talked them through internally, and, if we had invited him, would have known how to address them in a way that is not harmful to the EA community. However, given the short time, we saw more risk of alienating people than of getting them interested in EA through the talk.

The monthly talks we host are public and posted on Meetup and Facebook, so our audience consists of people who are new to the community. As EA local groups, we are the first impression many people get of the community and the faces of the community in our region, so I would argue we should be well prepared for, and versed in, potential controversies before hosting talks, especially with prominent people and on a video platform where all statements can be recorded and shared. As a group that had just one female speaker in the last 15 talks, I think this is especially the case if press coverage suggests that the speaker holds views that may make women feel less welcome.

At the time it seemed riskier to try to assess and reduce the risks of potential negative consequences around the talk than to cancel it. However, my error was in not assessing the risks around signaling in terms of Cancel Culture.