Can/should we define quick tests of personal skill for priority areas?

post by jaison-thomas · 2019-06-09T20:30:16.773Z · score: 43 (21 votes) · EA · GW · 5 comments

This is a question post.


Early in a career, if you're uncertain about what you're good at, exploration of your skills/abilities is necessary. Choosing quicker/cheaper tests at this time can help you do this more efficiently.

However, for assessing skill/personal-fit in our priority areas, a lot of the advice we give is "major in X in undergrad" or "be able to get into this type of job/grad school." To my mind, these aren't efficient tests - by the time you've gotten to higher level classes that truly test your ability to potentially move the field forward/get into the right job or grad-school, it's pretty late to pivot. Also, this advice only applies to EAs currently in college.

Instead, for priority paths, could/should 80,000 Hours and the EA community curate sets of tests, ordered from cheap to expensive, that one can use to rapidly gauge their skills? For instance, for technical AI safety research, I lay out the following hypothetical example (heads up - I'm no AI expert)[1]

The advantage of Test 1 is that you've found a way to test the fundamental skill of flexible technical thinking without investing a ton of time just learning accessory information (how vectors/matrices work, how TensorFlow works, etc.). You could arguably figure this out in one summer instead of over many years. The potential downsides are:

[1] Again, I'm no expert on technical AI research. Feel free to dispute this example if inaccurate, but I'd ask you to try and focus on the broad concept of "could a more accurate set of ranked tests exist and actually be useful for EAs?"

Answers

answer by casebash · 2019-06-11T14:14:32.943Z · score: 3 (2 votes) · EA · GW

I definitely think this is worth experimenting with to see if we can effectively identify those who should pursue a particular path.

answer by Linch · 2019-06-14T04:14:47.859Z · score: 1 (1 votes) · EA · GW

Definitely agree on "should," assuming it's tractable. As for "can," one possible approach is to hunt down the references in Hunter and Schmidt[1], or similar/more recent meta-analyses, disaggregate by career fields that are interesting to EAs, and look at what specific questions are asked in things like "work sample tests" and "structured employment interviews."

Ideally you want questions that are a) predictive, b) relatively uncorrelated with general mental ability[2], and c) reasonable to ask earlier on in someone's studies[3].
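Criterion b) can be made concrete with a partial correlation: how well a test predicts an outcome once general mental ability is regressed out of both. The sketch below is a minimal, hypothetical illustration on synthetic data (the function name and the data-generating setup are my own assumptions, not anything from the meta-analyses cited here):

```python
import numpy as np

def partial_correlation(test, outcome, gma):
    """Correlation between test scores and an outcome after
    regressing general mental ability (GMA) out of both arrays."""
    def residuals(y, x):
        # Least-squares residuals of y after regressing on x (with intercept).
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return y - X @ beta
    return np.corrcoef(residuals(test, gma), residuals(outcome, gma))[0, 1]

# Synthetic example: a "test" that only proxies GMA adds little beyond it,
# while a test that taps a separate skill retains predictive power.
rng = np.random.default_rng(0)
n = 5000
gma = rng.standard_normal(n)
skill = rng.standard_normal(n)  # field-specific ability, independent of GMA

gma_proxy_test = gma + rng.standard_normal(n)          # mostly measures GMA
skill_test = skill + gma                               # measures skill + GMA
outcome = skill + gma + rng.standard_normal(n)         # career success

r_proxy = partial_correlation(gma_proxy_test, outcome, gma)
r_skill = partial_correlation(skill_test, outcome, gma)
```

On this setup, `r_proxy` comes out near zero (the test says something about absolute but not comparative advantage, as in footnote [2]), while `r_skill` stays substantial.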

One reason to be cynical of this approach is that personnel selection is well-researched and would be economically very lucrative for for-profit companies to figure out, and yet very good methods do not already exist.

One reason to be optimistic is that if we're trying to help EAs figure out their own personal skills/comparative advantages, this is less subject to adversarial effects.


[1] https://pdfs.semanticscholar.org/8f5c/b88eed2c3e9bd134b46b14b6103ebf41c93e.pdf

[2] Because if the question just tests how smart you are, it says something about absolute advantage but not comparative.

[3] Otherwise this will defeat the point of cheap tests.

5 comments

Comments sorted by top scores.

comment by imben · 2019-06-15T21:51:46.438Z · score: 6 (2 votes) · EA · GW

Really cool idea. If this were possible would we expect to see big companies using similar tests to recruit undergraduates early before competitors do?

comment by antimonyanthony · 2019-06-11T15:16:47.428Z · score: 3 (2 votes) · EA · GW

Agree on the "should" part! As for "can": a potentially valuable side project someone (perhaps myself, with the extra time I'll have on my hands before grad school) might want to try is looking for empirical predictors of success in priority fields. Something along these lines, although unfortunately the linked paper's formula wouldn't be of much use to people who haven't already entered academia.

comment by tamgent · 2019-06-13T13:40:18.075Z · score: 1 (1 votes) · EA · GW

I am interested in this. It can be very costly and difficult to pivot when you make commitments on the order of years, such as what to study at university. However, the sheer size of the commitment also has value as a costly signal, which is why society relies on it so much. I think cheap tests like you describe are great to do before embarking on commitments on the order of years, as is tracking timing and directionality: i.e. which opportunities might be better to take at another time, how reversible a pivot is, and what keeps my options open. I wish I had figured all that out earlier, ideally in high school. Telling people earlier, say in high school, to do cheap tests is probably pretty valuable.