4 Ways to Give Feedback to Job or Grant Applicants
Post by Kirsten (Khorton) · 2022-08-05
I have previously posted about my belief that all EA organisations should provide candidates with feedback [EA · GW]. Some people responded [EA(p) · GW(p)] by suggesting that providing feedback to every job or grant applicant would be very costly and take a lot of staff time.
I have a more flexible view of feedback! I think a lot of things can be considered feedback, and it's worth considering how each organisation can provide more feedback and more value to the community without creating a disproportionate burden on itself.
I've listed these methods of giving feedback in order, from the least work for organisations to the most. All of these options can be mixed and matched. Some have downsides, which I don't go into here, but I hope hiring managers will also consider the upsides of including several of these methods at different points in their hiring or grant application review process.
Telling applicants whether or not they progressed to the next stage or got the job.
Knowing how far you've progressed with an organisation is feedback and is useful to candidates. It's important to promptly update candidates who were unsuccessful as well as successful candidates; in general, EA organisations are good at this.
If it's been longer than expected but you haven't made a decision yet, it can be helpful to update applicants, especially for grants, as people may start to believe you dislike their project idea.
Providing concrete information on your hiring process.
Small pieces of factual information can help applicants understand how much they should update from your acceptance or rejection.
"We invited 60 applicants to complete this one-hour work test. 42 applicants completed the test, and we invited the top 20 for an initial interview."
In this case, the applicant knows they were in the top half of work test results, which is useful - it's pretty different from being in the top 5% or top 80%.
"Your grant application was determined to be complete and within the scope of our fund. After careful review, we have decided not to offer a grant at this time."
In this situation, the grant applicant knows that it's worth applying to the same grantmaker on similar topics, which is valuable information.
Giving standardized responses for why people didn't progress to the next stage.
Interviewers, grantmakers, and assessors can use a pass/fail or Likert scale in clear categories to assess applications and tell applicants how they did.
"We assessed your CV and cover letter for understanding of our organisation's mission, working knowledge of Python and data analysis techniques, and relevant work experience. We felt you had a good understanding of our organisation's mission and relevant work experience. We did not see evidence of knowledge of Python and data analysis techniques, so we will not be progressing with your application. Thank you for applying, and please feel free to apply for roles with us in the future."
In this situation, the applicant knows the organisation was looking at three categories - hopefully categories that were mentioned in the job advertisement! - and that they met two of them. If they want to apply again for a similar job, they know they need to learn Python first, or mention on their CV that they already know it!
My employer, the Civil Service, tells applicants in advance which categories they'll be assessed on at interview (for example Delivering at Pace), provides a rubric for how that category will be assessed, and then provides scores at the end (averaged from 2-4 interviewers). You can learn more about Civil Service interviews here [EA · GW].
Personalizing feedback for candidates (either all candidates, or those who ask).
Of course, the most helpful and most costly feedback is personalized to the individual. This can be combined with providing scores or pass/fail in standardized categories, or it can stand on its own.
"Our interviewers noted they would have liked to hear more about your previous leadership and collaboration experiences."
This is helpful because it's very actionable for future interviews and is probably directly connected to the reason the person didn't get hired. Phrasing things as if the person may well have the relevant experience, but you didn't get to see evidence of it, can help you avoid situations where the applicant comes back and says, "Actually, I have plenty of leadership experience!" That said, I have heard that some organisations (Ought was mentioned) provide feedback partly because they want applicants to correct them if they've missed important information.
"Your grant application was very clear about your idea, but we didn't get a clear idea about who the team executing this idea would be. If you decide to apply to a future round, either with an iteration of this idea or with another idea, we'd like to understand that better."
This would be a super useful piece of feedback for the person receiving it. It's extremely actionable and strongly signals the organisation's openness to future grant proposals.
It's worth deciding in advance under what circumstances you're willing to provide feedback and how much feedback you're willing to provide, and then communicating that clearly to candidates. People interested in doing good will find it more worthwhile to apply for your grants or jobs if they know they'll receive some feedback during the process.
What have I missed? What other methods of giving feedback have you seen and liked or disliked? Please comment below with your views.