Against the "smarts fetish"

post by Magnus Vinding (MagnusVinding) · 2022-04-08T07:54:32.217Z · EA · GW · 33 comments

Contents

  One can have a high IQ while still not…
    Being knowledgeable and widely read
    Being good at scrutinizing one’s own ideas and convictions
    Being willing to face unpleasant views and inconvenient conclusions
    Being willing to think independently
    Being resistant to excessive contrarianism
    Being resistant to other signaling-related distortions
    Being willing to explore fundamental issues
    Being driven (by altruistic impact)
    Being in touch with common sense
    Displaying interpersonal kindness and respect
  Exceptional combinations of skills can provide exceptional opportunities for impact
  Runaway IQ signaling: A potential explanation and pitfall
  Other important and neglected traits?

It seems to me that the effective altruism community has a tendency to overemphasize smarts and to underemphasize other important traits. (Some related remarks on this forum are found here, here, and here.) Yes, smarts do matter greatly, and IQ tests are indeed predictive of various outcomes and achievements. But something can be both important and overemphasized at the same time.

By analogy, vitamin C is no doubt necessary for our health, yet to focus on vitamin C to such an extent that one neglects most other vitamins can risk deficiencies of those other vitamins. A focus on smarts to the exclusion of other important traits and capacities may likewise lead to “deficiencies” along those other dimensions. 

To clarify, my claim here is not that anyone holds the cartoonish view that “IQ is everything, period”. My claim is simply that the relative emphasis on smarts versus other important traits seems quite far from optimal, and that it would therefore be beneficial to focus more on those other traits, some of which I will outline below. [ETA: And beyond neglectedness, a reason to focus more on these other important traits relative to IQ — at the level of what we seek to develop individually and incentivize collectively — is that many of these other traits and skills probably are more elastic and improvable than is IQ.]

Note also that my aim here is not to point fingers at anyone; I think most people, myself included, occasionally fall into the trap of being too focused on smarts compared to other things — and that is no sin. (Intelligence can be fascinating, after all.) The point is just that we would do well to focus (more) on promoting a broader range of important traits and virtues.[1]

One can have a high IQ while still not…

Below are various traits that all seem necessary for optimal altruistic impact, and which are plausibly worth emphasizing more relative to smarts (on the current margin). Many of these traits are likely correlated with IQ, but that does not negate the point that one can overemphasize IQ at their expense, and that one can have a high IQ and still completely fail to develop these other traits and virtues. (Needless to say, the following list of important traits is far from exhaustive; I hope you will add to it in the comments.)

Being knowledgeable and widely read

Being good at scrutinizing one’s own ideas and convictions

Being willing to face unpleasant views and inconvenient conclusions

Being willing to think independently

Being resistant to excessive contrarianism

Being resistant to other signaling-related distortions

Being willing to explore fundamental issues

Being driven (by altruistic impact)

Being in touch with common sense

Displaying interpersonal kindness and respect

Exceptional combinations of skills can provide exceptional opportunities for impact

By becoming an outlier on many of the traits listed above — or even by doing decently well on a great number of them — one can likely come to embody a combination of traits and skills that precious few have mastered before, and which may open the door to unique opportunities for impact (even if one does not have a sky-high IQ).

A case in point might be George Orwell, about whom Christopher Hitchens said the following:

The remarkable thing about Orwell — and the encouraging thing — was he is not a genius. He lived to only 46 years. He never went to university. He never had a steady job. He usually didn’t have a steady publisher. He will never be forgotten because he managed to disprove imperialism, Stalinism and fascism in one lifetime and made some imperishable raids on its territory that no one is ever going to forget. All the time ill. All the time poor. It shows how much difference a person of really average integrity and intelligence and education can make if they have a little courage and a little intellectual honesty. The shortcomings of the individual you can see in him too. But he basically won his own battle against his own prejudices.

I think Hitchens was wrong in implying that Orwell was of “really average integrity” (and he probably was not of “really average intelligence” either). In fact, Orwell seems to have been a clear outlier in terms of integrity and intellectual honesty, which were arguably among his most distinguishing qualities. But that only speaks to the potential value of developing such neglected and less “shiny” traits and virtues.

Runaway IQ signaling: A potential explanation and pitfall

What might explain the apparent overemphasis on IQ compared to other important traits? This is an open question, but one hypothesis that may be a part of the answer is that many people are unduly concerned with IQ signaling (perhaps because high IQ has become the pre-eminent marker of status).

This dynamic was hinted at in a 2016 talk by Geoffrey Miller, in which he highlighted “runaway IQ signaling” as a potential pitfall among aspiring effective altruists:

I’m very concerned that [the effective altruism community] doesn’t go the same path I’ve seen many other fields go, which is: when you have bright people, they start competing for status on the basis of brightness, rather than on the basis of actual contributions to the field. …

EA is prone to runaway signaling of intelligence and openness. So if you include a lot more math than you really strictly need to, or more intricate arguments, or more mind-bending counterfactuals, that might be more about signaling your own IQ than solving relevant problems.

Again, the claim here is not that IQ signaling is ruining everything, but merely that it might be a biasing factor for many of us.

Other important and neglected traits?

Which additional traits distinct from IQ do you think are important and worth prioritizing more? Feel free to comment below. :)[2]

 

  1. ^

    ETA: To clarify, the goal of this post is not to assert the general claim that "EA overrates IQ". My claim is rather that, in terms of the traits we seek to develop and incentivize, IQ seems overemphasized relative to many other important traits and virtues that deserve greater emphasis, such as the ten traits I list below. (And that IQ may be correlated with many of those other traits is not a strong reason to emphasize IQ more relative to those other traits.) This claim is consistent with thinking that IQ is underrated or underemphasized relative to many other factors, including factors that are often given great importance, such as formal titles and prestige.

  2. ^

    For their feedback on this post, I’m grateful to Teo Ajantaival, Tobias Baumann, Timothy Chan, James Faville, Winston Oswald-Drummond, and Sebastian Schmidt.


Comments sorted by top scores.

comment by Linch · 2022-04-08T21:58:20.486Z · EA(p) · GW(p)

The article claims that an important trait (smarts) is overrated as a precondition to impact, while giving some caveats and mostly specific reasons for why smarts is not maximally predictive. But this is not much evidence that a factor is overrated! (Unless you are trying to argue against a correlation factor of 1, which, as the OP noted, nobody actually believes.) The only exception here is the runaway IQ signaling point, which is indeed an argument (a bias) for why we might be wrong about the relevant values, rather than just a claim about absolute values. However, the OP does not consider biases that may cause us to underrate smarts, which makes this not very helpful even as a qualitative judgment.

 Without a numerical score for what you think the current community is at with regard to how smarts is rated, plus a numerical score for what is correct or where you think the community should be, it's very hard for me to evaluate the correctness of this claim. In the absence of a quantitative score, I'd have benefited from a rank ordering or some more precise qualitative claims. 

More precisely, I'd like to see:

  1. How much you think "smarts" explains absolute variance in impact among EAs.
  2. How much you think "smarts" explains predictable variance in impact among EAs (if smarts explains 10%, but 90% is noise, then smarts is the best and in fact only metric we care about)
  3. How much you think the community currently believes "smarts" explain absolute variance in impact among EAs.
  4. How much you think the community currently believes "smarts" explains predictable variance in impact among EAs

I'm guilty of the pattern of making a relative claim without mentioning levels myself (see point #4), so it feels hypocritical to point this out. Nonetheless, we can grow stronger as a community if people are willing to be hypocritical for the greater good. :)

Replies from: Linch, MagnusVinding, Peterslattery
comment by Linch · 2022-04-08T22:27:52.439Z · EA(p) · GW(p)

Here are my own attempts to answer this:

Qualitatively, I think the appropriate claim, based on both my (shallow) understanding of the intelligence ∩ work performance literature and some other literature on related topics, plus personal impressions/anecdotes/intuitions, goes:

Intelligence (general mental ability) is the most general predictive feature for performance that we have, but it's still not all that predictive in absolute terms.

Quantitatively, my current best estimate is that correlation between intelligence and impact* among self-identified highly-engaged EAs is ~0.55** (explains ~30% of variance).  My guess is that we do not have substantial data to do better than ~0.7 (~50% of variance explained). 
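The variance-explained figures here follow directly from squaring the correlation coefficient; a quick sanity check of the arithmetic:

```python
# Under a linear model, the share of variance a predictor explains
# is the square of its correlation with the outcome (r^2).
for r in (0.55, 0.7):
    print(f"r = {r}: ~{r**2:.0%} of variance explained")
```

(0.55² ≈ 0.30 and 0.7² = 0.49, matching the ~30% and ~50% above.)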

I don't know whether other EAs agree with me here. My current guess is that numerically sensitive ones probably have numbers that aren't too far off (maybe slightly lower?), while people who are less numerically/statistically sensitive will initially claim correlations that are higher.

However, this (if true) would likely be a general bias, rather than an intelligence-specific bias. I would further predict that EAs (at least ones who haven't read this comment) will systematically overestimate the importance of other predictors as well, across a wide range of fields.

I think these numbers may seem pretty low compared to our intuitions for how important smarts are. I don't know how to reconcile these intuitions exactly, except to again note that there are many other fields where intuitions dramatically overestimate correlations relative to reality.

*Here impact is operationalized loosely as "on a log-scale, what prediction-evaluation setups would say about someone's past impact five years from now."

**precision of numbers does not imply confidence.

comment by Magnus Vinding (MagnusVinding) · 2022-04-09T10:26:58.241Z · EA(p) · GW(p)

Thanks for your comment, Linch. :)

It's a fair point that my post was quite vague on some key points, and your comment provides a great invitation for me to try to clarify my claims and views a bit.

The article claims that an important trait (smarts) is overrated as a precondition to impact

I actually wouldn't say that that's my core claim, although I do agree with it.

My claim about overemphasis relates more to the level of actions, norms, and practical focus than it relates to predictions about how much variance in impact IQ accounts for. (This is somewhat apropos the distinction between procedural vs. declarative knowledge as well as the intention-behavior gap.)

That is, it's possible that we're mostly right about how much variance different factors predict (or at least that we would be right on reflection, cf. your note in the other comment about how our immediate intuitions might be wrong), yet that we're nonetheless off in terms of how much we focus on developing and selecting for those respective factors in practice (including, and perhaps especially, when it comes to less tangible "focus promoters" such as norms, informal prestige conferral, and daydreams).

So I think IQ is probably somewhat descriptively overrated (more on this below), but I think the degree to which it is overemphasized at the level of norms, actions, and salient decision criteria is considerably stronger. One line of evidence I have for this is how often I see references to smarts, including in internal discussions related to career and hiring decisions, compared to other important traits.

How much do I think these other things are underemphasized, in quantitative terms? It is difficult for me to put a precise number on it, but my sense is that it would be good if most of the other traits and virtues I listed were to receive at least twice as much attention as they currently do, both in terms of how much time people devote to cultivating them in personal development efforts and in terms of how often these virtues are emphasized in the broader discourse among aspiring effective altruists. And beyond neglectedness, a reason to focus more on these other traits relative to smarts at the level of what we seek to develop individually and incentivize collectively is that those other traits and virtues likely are more elastic and improvable than is IQ — which isn't to say that IQ cannot also be improved.

 

How well does IQ predict "impact"?

Next, regarding the question of how well IQ predicts impact, I think this depends critically on how we define "impact". This may feel like a trivial point, but please bear with me as I try to explain where I'm coming from. :)

I like that you specified the following in your other comment, namely that you estimated impact roughly in terms of "what prediction-evaluation setups would say about someone's past impact five years from now". That's a clearly specified point in time.

However, I think it's likely that impact assessments will diverge substantially depending on the timeframe (cf. our vast uncertainty over time and the "Three Mile Island effect"). This also relates to the virtues I listed in the post. 

For example, I think it's possible (perhaps ~10 percent likely) that the community ends up going in a highly suboptimal direction due to focusing too exclusively on metrics such as "number of publications" or "useful theoretical insights provided" over, say, a five-year period, while neglecting less tangible factors such as interpersonal kindness and social health, which may gradually — in less noticeable ways that might only become apparent over longer timespans — lead to corrosion, burnout, or conflicts. (And the lack of emphasis on such less tangible factors might also be driving people away in the short term, in ways that are probably easy to miss by potential evaluators of impact.)

Likewise, it could be that factors such as "attention to social aspects" explain relatively little individual variation in impact, yet that they are nonetheless critical in terms of the community's success or failure. (Similar to how individual variation in some traits is less predictive of certain outcomes than is country-level variation. Indeed, individual-level success is not always conducive to collective success — sometimes it's even detrimental to it; altruistic behaviors that are too babbler bird-esque might be a concrete example of that.)

Finally, I think the point about clarifying fundamental issues, specifically fundamental values, is critical. After all, an impact evaluation that is made relative to some pre-specified set of values (that is held constant) may diverge greatly from an evaluation — even a five-year evaluation — that also factors in moral reflection, and which evaluates impact based on the updated values endorsed on reflection. Such reflection and consequently updated evaluative criteria may even flip the sign of one's impact.

I'd expect IQ to be significantly better correlated with impact based on the former kind of evaluation (where I might roughly agree with your estimates in the case of a five-year assessment*) vs. the latter evaluation (which in idealized terms one could think of as "an impact evaluation made relative to the values that the person would endorse if they had focused chiefly on value exploration their entire life" — something that more limited value reflection efforts could presumably approximate).

In the latter case, IQ might still come close to being the main predictor, but I suspect that a construct tracking "focus on fundamental values" might do even better among aspiring EAs (not least because changes in fundamental values can change the consequent evaluations a lot). That's one of the reasons I think it's worth focusing much more on fundamental values. :)

Replies from: Halstead
comment by John G. Halstead (Halstead) · 2022-04-11T18:18:27.206Z · EA(p) · GW(p)

Like Linch, I do not see how you present any arguments for your main conclusion in the post. You argue that EA overrates IQ but present no arguments that this is the case. Your response also doesn't present any arguments for that conclusion.

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-12T10:26:08.289Z · EA(p) · GW(p)

You argue that EA overrates IQ

As noted above, my main claim is not that "EA overrates IQ" at a purely descriptive level, but rather that other important traits deserve more focus in practice (because those other important traits seem neglected relative to smarts, and also because — at the level of what we seek to develop and incentivize — those other traits seem more elastic and improvable).

I noted in the comment above that:

one line of evidence I have for this is how often I see references to smarts, including in internal discussions related to career and hiring decisions, compared to other important traits.

Without directly quoting anyone, I can, to be more specific, say that I've seen relatively senior people in EA imply that certain EA organizations (including CRS, where I work) will be eager to hire applicants if they are extremely smart. That's the kind of sentiment I feel I've seen quite often, and with which I strongly disagree, because being "extremely smart" is far from being sufficient, even if the person in question has altruistic values.

Replies from: Halstead
comment by John G. Halstead (Halstead) · 2022-04-12T11:14:50.270Z · EA(p) · GW(p)

"my main claim is not that "EA overrates IQ" at a purely descriptive level, but rather that other important traits deserve more focus in practice"

The claim that EA overrates IQ is the same as the claim that other traits deserve more attention.

comment by PeterSlattery (Peterslattery) · 2022-04-11T06:20:20.335Z · EA(p) · GW(p)

More precisely, I'd like to see:

  1. How much you think "smarts" explains absolute variance in impact among EAs.
  2. How much you think "smarts" explains predictable variance in impact among EAs (if smarts explains 10%, but 90% is noise, then smarts is the best and in fact only metric we care about)
  3. How much you think the community currently believes "smarts" explain absolute variance in impact among EAs.
  4. How much you think the community currently believes "smarts" explains predictable variance in impact among EAs
     

 

A very quick response by someone not very numerical and lacking much recent information on the relevant literature related to IQ: 

1/2: a lot (say 50%) if you assume we measure impact via something like research publications, and assume the presence of mediators such as individual and independent tasks (i.e., no collaboration), good (mental) health, static agents (e.g., no feedback loops from agents engaging in regular reflection/self-improvement/recalibration and changing career paths), motivation, etc. Maybe 10% beyond an IQ of 120 if you assume a variety of impacts (e.g., introducing highly competent people/organisations to EA, doing operations work to amplify the impact of intelligent people, and taking personal risks to set up needed projects that have high expected value), while not assuming that any of the above mediators (e.g., mental health) are present.

3/4: 50%, but without realising the assumptions that are plugged in and mentioned above. Most of us know smarter people who are not able to work with others, not in good mental health, not strongly EA-aligned, not very motivated to do work, or not very interested in improving themselves or changing their minds on things.

As this suggests, I think that EAs tend to assume that intelligence is more sufficient for impact than they should. Part of this is my expectation that they tend to i) think of simple single impact/assessment scenarios and ii) assume the presence of other needed ingredients.

Some tangential thoughts:

Much if not most impact probably comes via collaboration with other smart people. However, some of the smartest people I know could not easily collaborate in a startup-type setting and were therefore, from an entrepreneurial perspective, less valuable than less intelligent but more socially skilled/patient/humble alternatives. In such cases, hiring based on intelligence could produce bad outcomes.

As I see it, many of the highest impacts in EA come from bringing good people into the community rather than from directly doing work that is seen as high value. This does not seem to load much on intelligence and is instead more about other competencies, such as social skills, access to networks, and networking interest and ability. However, my experience of hiring decisions here suggests that signals of intelligence are overweighted relative to social skills.

comment by Rohin Shah (rohinmshah) · 2022-04-10T06:34:42.370Z · EA(p) · GW(p)

Huh, I would have taken nearly all of the qualities listed here as a reason to prioritize "smarts", because they seem so correlated with "smarts" to me (exceptions: being driven, interpersonal kindness and respect). Like, if I generate examples of people who are high on the skills listed here, they tend to be among the smartest people I know; and if I generate examples of smart people, each example seems to have many but not all of these qualities.

If I were listing useful-to-EA qualities that were reasons to think less about "smarts", I would include:

  • Willingness to do "grunt work"
  • Ability to network well, including with non-EAs
  • Taking initiative (see micro-entrepreneurship)
  • Mental stamina (i.e. can you do 10 hours of focused work a day? I get the sense that a few rare people can, and it's not that related to how smart they are)
Replies from: MagnusVinding, FCCC
comment by Magnus Vinding (MagnusVinding) · 2022-04-10T18:38:20.890Z · EA(p) · GW(p)

Thanks for your comment and for listing those traits and skills; I strongly agree that those are all useful qualities. :)

One might argue that willingness to do grunt work, taking initiative, and mental stamina all belong in a broader "drive/conscientiousness" category, but I think they are in any case important and meaningfully distinct traits worth highlighting in their own right.

Likewise, one could perhaps argue that "ability to network well" falls under a broader category of "social skills", in which interpersonal kindness and respect might also be said to fall (as a somewhat distinct trait or ability, cf. the cognitive vs. affective empathy distinction; networking ability probably draws more strongly on cognitive empathy while [genuine] interpersonal kindness probably relies more on affective empathy). A related trait one could list in that category is skill in perspective-taking.

Regarding the correlation point, I agree that IQ is likely correlated with many of the traits I listed, but I don't believe that this is a strong reason to think that we are not overemphasizing IQ relative to these other traits. Moreover, as noted in another comment, a reason to focus more on these other traits relative to IQ at the level of what we seek to develop individually and incentivize collectively is that many of these other traits and skills probably are more elastic and improvable than is IQ.

As for how many of these traits are correlated significantly with IQ, it's worth noting that — beyond "being driven" and "interpersonal kindness" — myside bias (also) appears to show “very little relation to intelligence”. And I likewise doubt that IQ has much of a correlation with a willingness to face unpleasant and inconvenient conclusions, or resistance to signaling-related distortions. (Some relevant albeit weak and indirect evidence regarding IQ and signaling-related distortions — specifically when it comes to distortions due to partisan/tribal loyalties — is that greater knowledge of political matters, which is presumably a decent proxy of IQ, does not seem to improve people's ability to provide an accurate representation of the opposite side's views, even when subjects are given a financial incentive.)

comment by FCCC · 2022-04-10T18:18:27.391Z · EA(p) · GW(p)

I think you have to be smart to have all the OP’s listed traits, so sure, there’s going to be correlation. But what’s the phrase? “Science advances one funeral at a time.” If that’s true, then there are plenty of geniuses who can’t bring themselves to admit when someone else has a better theory. That would show that traits 2 and 3 are commonly lacking in smart people, which yes, makes those people dumber than they otherwise would be, but they’re still smart.

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-10T18:56:58.823Z · EA(p) · GW(p)

“Science advances one funeral at a time.” If that’s true,

If that were literally true, then science wouldn't ever advance much. :)

It seems that most scientists are in fact willing to change their minds when strong evidence has been provided for a hypothesis that goes against the previously accepted view. The "Planck principle" seems more applicable to scientists who are strongly invested in a given hypothesis, but even in that reference class, I suspect that most scientists do actually change their minds during their lifetime when the evidence is strong. And even if that were not the case, I don't think it would count as compelling evidence that IQ isn't strongly correlated with less confirmation bias. (E.g. non-scientists might still do far worse.)

I think stronger evidence for a weak or non-existent correlation between IQ and resistance to confirmation bias is found in the psychological studies on the matter. :)

Replies from: FCCC
comment by FCCC · 2022-04-10T19:47:35.245Z · EA(p) · GW(p)

The “Planck principle” seems more applicable to scientists who are strongly invested in a given hypothesis

Yep, that’s why I referred to your 2nd and 3rd traits: A better competing theory is only an inconvenient conclusion if you’re invested in the wrong theory (especially if you yourself created that theory).

I know IQ and these traits are probably correlated (again, since some level intelligence is a prerequisite for most of the traits). But I’m assuming the reason you wrote the post is that a correlation across a population isn’t relevant when you’re dealing with a smart individual who lacks one of these traits.

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-12T11:11:52.716Z · EA(p) · GW(p)

I think it's important to stress that it's not just that some people with an extremely high IQ fail to change their minds on certain issues, and more generally fail to overcome confirmation bias  (which I think is fairly unsurprising). A key point is that there actually doesn't appear to be much of a correlation at all between IQ and resistance to confirmation bias.

So to slightly paraphrase what you wrote above: I didn't just write the post because a correlation across a population is of limited relevance when you're dealing with a smart individual who lacks one of these traits, but also because for a number of these traits (e.g. interpersonal kindness, being driven, and limiting confirmation bias), there appears to be virtually no correlation in the first place. And also because these other skills are likely easier to improve than IQ, implying that there is a tractability case for focusing more on developing and incentivizing these other traits.

Replies from: Stefan_Schubert
comment by Stefan_Schubert · 2022-04-12T11:52:47.033Z · EA(p) · GW(p)

I think the studies you refer to may underrate the importance of IQ for good epistemics.

First, as I mentioned in my other comment [EA(p) · GW(p)], the correlation between IQ-like measures and the most comprehensive test of rationality was as high as 0.695. This is especially noteworthy considering the fact that Stanovich in particular (I haven't followed the others' work) has for a long time argued along your lines - that there are many things that IQ tests miss. So if anything one would expect him to be biased in the direction of a too low correlation.

Second, psychological studies of confirmation bias  and other biases tend to study participants' reactions to short vignettes. They don't follow participants over longer periods of time. And I think this may lead them to underrate the importance of intelligence for good epistemics; in particular in communities like the effective altruism and rationalist communities.

I think that people can to some extent (though certainly not fully) overcome confirmation bias and other biases through being alert to them (not least in interpersonal discussions), through forming better mental habits, through building better epistemic institutions, and so on. This work is, however, quite cognitively demanding, and I would expect more intelligent people to be substantially better at it. Less intelligent people are likely not as good at engaging in the kind of reflection on their own and others' thought processes needed to get these kinds of efforts off the ground. I think that the effective altruist and rationalist communities are unusually good at it: they are constantly on the lookout for biased reasoning, and often engage in meta-discussions about their own and each other's reasoning, e.g. whether they show signs of confirmation bias. And I think a big reason why that works so well is that these communities are composed of so many intelligent people.

In general,  I think that IQ is tremendously important and not overrated by effective altruists.

comment by Nathan Young (nathan) · 2022-04-08T09:42:28.062Z · EA(p) · GW(p)

I agree that many of these traits are really important, but I guess most (other than altruism) are well correlated with IQ tests. I don't know much about this, but it's my sense that most positive things are well correlated with IQ, and that's just a pill we have to swallow.

If someone thinks differently, feel free to say.

If we were doing tests, I would hope we'd try to build tests that best capture the features we want, but I think if we could only do one test, it would be an intelligence test. And I don't like that, but I think it's true.

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-08T10:10:12.310Z · EA(p) · GW(p)

I agree on the correlation point. :)

But I don't think that undermines the point about a potential overemphasis on smarts:

Many of these traits are likely correlated with IQ, but that does not negate the point that one can overemphasize IQ at their expense, and that one can have a high IQ and still completely fail to develop these other traits and virtues.

Moreover, for some of the traits, there doesn't appear to be much of a positive correlation, such as in the case of conscientiousness:

studies suggest that there is a weak negative relationship between conscientiousness and IQ, possibly because people with a high IQ have less of a need to develop conscientiousness in order to perform well.

or myside bias:

studies have even found that “the magnitude of the myside bias shows very little relation to intelligence”.

These findings suggest that the notion that "most positive things are well-correlated with IQ" is often overstated. (Though it's also frequently understated, to be sure.  :)

Replies from: Stefan_Schubert, nathan
comment by Stefan_Schubert · 2022-04-08T10:28:48.363Z · EA(p) · GW(p)

Stanovich, West, and Toplak (who are cited in your last link) developed maybe the most ambitious general rationality test to date. As measured by that test, rationality exhibited a strong correlation with IQ-type tests - 0.695. See Stuart Ritchie's informative review.

Replies from: Telofy, Larks, MagnusVinding
comment by Denis Drescher (Telofy) · 2022-04-08T15:33:22.580Z · EA(p) · GW(p)

It feels a bit silly to post this as a reply to you, because you’re probably either aware of all of this or, otherwise, could convince me that I’m mistaken. But your comment prompted me to think of these points. So I think it’s more useful for other readers who come across this thread. 

  1. I kind of tend to be a bit disappointed with correlations in reality, maybe because of this effect (or #2 and #3 below).
  2. You probably have excellent intuitions when it comes to what correlations really mean, but I suspect that that takes a lot of statistical training. When I think that something is correlated, the scatter plot I have in my mind is around r = 0.95. Then I need to remind myself that that’s not how reality usually works, so it’s probably just r = 0.7 or even r = 0.5, at which point it’s starting to get hard to even see the correlation in the scatter plot.  Just eyeballing it from an online correlation simulator it looks like you get samples that are below average on one dimension even among just the top 5% of samples along the other dimension. I’m guessing that a lot of people intuitively overestimate how correlated r = 0.7 really is and update too strongly on it.
  3. This seems to be compounded by the halo effect. Sometimes I meet really impressively smart people, and then I’m so shocked when they turn out to be quite irrational about a lot of stuff, at least according to me. ^.^ My level of intuitive shockedness in turn strikes me as a bit irrational too, so I suspect the above overestimation of correlations and some halo effect are at work there.
  4. There’s probably a lot of value in combining multiple virtues just as there is great value in combining multiple skills.  I’ve read an amazing article somewhere (maybe by Ozzie or Kat?) that argued that if you master five skills to a normal level of mastery, you may plausibly be the one best-qualified person in the world for jobs that combine these skills.
  5. I wish I knew how to quantify this right, but my intuition is that the 11 qualities in this article would have to be very, very highly correlated or else there'll be almost no one who excels at all of them. Maybe there'll be a few more people who are at least decent at all of them. But some of them also function as multipliers of someone's power, so if someone is very powerful because of the qualities they do have but then lacks self-reflection, kindness, or respect, or is addicted to contrarianism or signaling, that can be catastrophic. So the badness of failures of the correlation reasoning can make it worthwhile to not rely on it exclusively. (To use an unnecessarily violent metaphor: A kind, self-reflective, respectful person with a (for our community) normal IQ may be more like a well-aimed shotgun. A high-IQ person who lacks just one virtue may be like a high-powered unaimed sniper rifle with a loose trigger.)
  6. The researchers have probably thought of this and you know all of this stuff better than me anyway, but, not having had the time to investigate it, I idly wonder how many of these correlation experiments really measure some fairly uninteresting common cause.  Like, I looove solving Raven’s matrices! They give me the most amazing flow experience, and when I can’t solve them in the allotted time, I can’t help but keep trying for hours afterwards. A friend of mine hated them. Sure enough she got a lower score in the same test, but I don’t think I can really derive anything from that that I didn’t already know. So these studies probably need to correct for countless things like need for cognition and such.
  7. Focusing on one metric because all the others are correlated with it anyway is also quite exploitable. If someone is low on altruism but really high on IQ, they may be particularly drawn to exploiting mechanisms for personal gain that focus too much on IQ. So such mechanisms would then lead to a distribution of outcomes where there is one bump for the desired outcome and one bump for all the people willing and able to exploit the mechanism. That may be fine if the desired people add much more value than the exploitative people can take away, but in communities and among employees it’s usually the opposite.
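A minimal stdlib-Python sketch of points 2 and 5 above (my own illustration with assumed parameters, not something from the thread): how "leaky" r = 0.7 is at the top of a distribution, and how rare it is to be at least one standard deviation above average on all eleven traits even when every pair of traits correlates at 0.7.

```python
import random, math

random.seed(0)
r = 0.7
n = 100_000

# (a) Bivariate standard normal with correlation r:
# Y = r*X + sqrt(1 - r^2) * independent noise.
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [r * x + math.sqrt(1 - r * r) * random.gauss(0, 1) for x in xs]
cutoff = sorted(xs)[int(0.95 * n)]           # top-5% threshold on X
top = [y for x, y in zip(xs, ys) if x >= cutoff]
leak = sum(1 for y in top if y < 0) / len(top)
print(f"Top 5% on X who are below average on Y: {leak:.1%}")

# (b) Eleven equicorrelated traits via a one-factor model:
# T_i = sqrt(r)*F + sqrt(1 - r)*Z_i has pairwise correlation r.
k, hits = 11, 0
for _ in range(n):
    f = random.gauss(0, 1)
    if all(math.sqrt(r) * f + math.sqrt(1 - r) * random.gauss(0, 1) >= 1.0
           for _ in range(k)):
        hits += 1                            # +1 sigma on every trait
print(f"Above +1 sigma on all {k} traits (r = {r}): {hits / n:.2%}")
```

With these settings, a few percent of the top-5% group typically land below average on the correlated trait, and only on the order of a percent of people clear +1 sigma on all eleven traits; with fully independent traits, that last figure would be roughly 0.16^11, i.e., effectively zero.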
comment by Larks · 2022-04-08T16:41:16.506Z · EA(p) · GW(p)

strong correlation with IQ-type tests - 0.695

In case anyone else misunderstood, this is a positive correlation of +0.695

Replies from: Stefan_Schubert
comment by Stefan_Schubert · 2022-04-08T22:16:25.693Z · EA(p) · GW(p)

Thanks, yeah, that wasn't intended to be a minus sign but just a standard dash. Sorry about any potential confusion.

comment by Magnus Vinding (MagnusVinding) · 2022-04-08T10:45:57.509Z · EA(p) · GW(p)

Thanks for sharing, I wasn't aware of that. :)

I'm not surprised that there's a strong correlation between those measures. However, I think it's worth keeping in mind that such "general rationality", in the sense of reasoning correctly in various tests, is still quite different from, and still doesn't capture, many of the traits and virtues listed above, such as being highly driven (let alone altruistically driven), resisting signaling drives in high-stakes social contexts, and displaying interpersonal kindness.

Replies from: Stefan_Schubert
comment by Stefan_Schubert · 2022-04-08T10:50:44.424Z · EA(p) · GW(p)

I wasn't saying it has anything to do with, e.g., interpersonal kindness. It was a comment on the relationship between rationality and intelligence, which your last point related to.

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-08T10:53:12.228Z · EA(p) · GW(p)

Yeah, I know; I didn't mean to imply otherwise. :)

comment by Nathan Young (nathan) · 2022-04-08T11:01:41.436Z · EA(p) · GW(p)

Yeah you've changed my mind a bit.

comment by PeterSlattery (Peterslattery) · 2022-04-11T04:34:34.532Z · EA(p) · GW(p)

Thanks for this! Some very quick and somewhat poorly qualified responses:

  • I agree with this. I have felt that IQ/intelligence is overrated by EA for a long time.
    • As an analogy, I think that IQ is like the top speed of a single drone. It's easy to measure the drone's speed and think that it matters most, but other factors are often much more important (e.g., networking capacity, range, and efficiency). Once you have a drone with a top speed of X km/h, you probably don't care much about making it faster, unless you are doing some particularly speed-demanding task. If you have a tough work environment, long-term durability is more important. If you have to work with multiple drones, then their networking and "social intelligence" are probably more important. If you are selling the drones, then appearance and brand appeal are probably more important. In many of these cases, however, it may be much easier to just measure the drones' top speed and use that to extrapolate their performance.
  • Similarly, I think that EA has a big evaluability bias related to competence assessment - that it probably focuses on IQ/signals of intelligence because these are easier to access and understand than other factors that matter.
    • This probably sometimes leads to suboptimal outcomes. E.g., If you go to Oxford and excel at writing about EA on the forum, you might be as, or more, likely to be hired as  a movement builder for an area than if you get a degree from a less prestigious university where you lead 3+ highly successful social groups and an EA group.
  • As an aside, I worry that EA doesn't actually optimise well for finding smarts, because it seems to favour selecting people based on signals of intelligence that aren't as good as getting them to do an IQ or performance test.
    • Many very smart people I know didn't really try hard at school, care about prestige, or focus on status building growing up. Many are very humble and bad at the marketing that gets you status. It seems that this situation often condemns them to be disadvantaged by (EA/most) selection criteria. It may also lead to an over-representation of talented self-marketers in EA settings (which is probably the case in all professional settings) and perhaps also a tendency towards more overconfident or arrogant people being hired (I have only very limited anecdotal evidence for this ever happening and could be totally wrong).
    • Admittedly, there are good reasons to select 'high intelligence signalling' people over those who lack those signals but are actually as or more intelligent, as such signals are persuasive to most audiences. It's also not normal to post about your IQ scores but fine to imply them via marks, awards, or degrees. So maybe this approach makes sense as part of a larger theory of change focused on EA professionals looking smart/credible.
    • I will admit that I often make the mistake of instinctively thinking that X person is probably not super competent (or as competent as Y) because of where they went to uni or something like that. It's very hard to avoid. It's also generally a good rule of thumb to think that people who attended X institute are smart, etc. So maybe it is unavoidable, or something that works on aggregate and saves time over more demanding approaches.
  • Some quick suggestions re: "what additional traits distinct from IQ do you think are important and worth prioritizing more?"
    • Calibration/Forecasting/betting performance
    • Sustained performance under pressure
    • Sustained commitment to EA
    • Charisma (key for many roles)
    • Appearance (sadly, this matters much more than it should in social settings)
    • Network size (e.g., for marketing)
    • Emotional intelligence/social intelligence (e.g., performance on face recognition or reading-the-eyes tests)
    • Extraversion/interpersonal warmth/social novelty needs (key to movement building success)
    • Risk tolerance (e.g., for starting new projects)
    • Status indifference (e.g., for low status, high importance 'grunt' work like being an exec assistant or movement builder)
  • As a general comment related to the above - I think that EA is going to need a lot of people of pretty average intelligence for accelerating outputs and spreading messages across networks. I think it should stop holding out for so many elite-level members.
    • As an example, many star performers in research have a huge pool of support from less competent or intelligent researchers, who will produce a first draft, etc., of a paper for them so that they can spread their genius more widely across many such papers. If someone like Will thinks that having an exec assistant can double his output/impact (or something similar), then we might be missing out on a lot of impact multipliers by failing to hire such people.
    • Related to that, I sometimes feel that we are trying to slowly recruit teams of 'geniuses' (who may in fact be particularly poorly suited to work with each other), when we more urgently need large teams of people to help 'geniuses'.
comment by Holly_Elmore · 2022-04-14T17:44:33.381Z · EA(p) · GW(p)

And beyond neglectedness, a reason to focus more on these other important traits relative to IQ — at the level of what we seek to develop individually and incentivize collectively — is that many of these other traits and skills probably are more elastic and improvable than is IQ

This is the most important thing to me. We're burning a lot of fuel proving that we have a good (basically) fixed trait, and what's the point? What do we actually gain by knowing the exact smartness ranking of the people in EA? Just seems like a waste of time compared to learning things, gaining skills, forming new collaborations, etc. 

It also disturbs me that being found to be smart seems to be its own reward, instead of an instrument for having a positive impact.

comment by Stefan_Schubert · 2022-04-08T08:11:31.243Z · EA(p) · GW(p)

Fwiw it seems to me that effective altruists do emphasise several of the other mentioned traits a lot. E.g. that seems true regarding "Being good at scrutinizing one’s own ideas and convictions", "Being willing to face unpleasant views and inconvenient conclusions", "Being willing to explore fundamental issues", and "Being driven (by altruistic impact)".

With regards to the first three, effective altruists have celebrated the Scout Mindset [? · GW], cause-neutrality [? · GW], and "weird" ideas; and are more generally emphasising the importance of good epistemics a lot. Similarly, effective altruists emphasise the importance of being driven by impact.

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-08T08:30:11.516Z · EA(p) · GW(p)

I agree that most of these other traits do get emphasized, and I probably should have acknowledged that. :)

But I still think it would be beneficial to emphasize most of these traits much more (which is part of the reason why a book like The Scout Mindset is so important). For example, it seems to me [EA · GW] that exploration of fundamental issues is extremely neglected relative to its importance. It also seems to me that potential inconvenience and unpleasantness biases warrant more attention in discussions about the importance of non-human animals and risks of very bad future outcomes.

comment by Teo Ajantaival · 2022-04-12T12:01:31.155Z · EA(p) · GW(p)
a reason to focus more on these other important traits relative to IQ — at the level of what we seek to develop individually and incentivize collectively — is that many of these other traits and skills probably are more elastic and improvable than is IQ

+1. To the extent that IQ may be difficult to improve, it seems good to focus on improving the other important virtues. Yet perhaps people might (for some roles) select heavily for IQ precisely because it — unlike the more improvable virtues — can not so easily be improved after the selection.

(This might also in part explain how commenters might be sometimes talking of different things, i.e. "what to cultivate" versus "what to select for".)

comment by Sami Kassirer · 2022-04-09T01:43:36.416Z · EA(p) · GW(p)

Great post, thanks for sharing! I think you might find Igor Grossmann's work on the psychology of wisdom particularly interesting (https://igorgrossmann.com/projects/wisdom/), if you haven't already been exposed to it :)

Replies from: MagnusVinding
comment by Magnus Vinding (MagnusVinding) · 2022-04-09T10:44:56.443Z · EA(p) · GW(p)

Thanks, it looks interesting. :)

comment by samuel · 2022-04-08T22:22:24.733Z · EA(p) · GW(p)

I've been thinking about this lately, especially since I've started to apply to EA-specific opportunities. It does seem like EA orgs use intelligence as a main filter for hiring, which makes sense given the work (and is far better than plain old credentialism), but I sometimes wonder if they're filtering out valuable candidates whose strengths lie more in cleverness, empathy, or doggedness than in raw IQ. Most EA organizations are small, so I expect this will change as the community scales and becomes more inclusive of the full spectrum of skillsets. Note that this is a perspective from the outside looking in and is completely anecdotal; I could be mistaken.

comment by Simon Skade · 2022-04-11T09:21:07.282Z · EA(p) · GW(p)

Of the alternative important skills you mentioned, I think many are highly correlated with each other, and I think the relevant stuff roughly boils down to rationality (and perhaps also ambition).

Being rational itself is also correlated with being an EA and with being intelligent, and overall I think intelligence and rationality (and ambition) are traits that are really strong predictors of impact.

The impact curve is very heavy-tailed, and smarter people can have OOMs more impact than people with 15 IQ points less. So no, I don't think EA is focusing too much on smart people; indeed, it would surprise me if it had reached a level where it wouldn't be good to focus even more on intelligence. (Not that I claim to have sufficiently argued for this, but I can say that it is true in my world model.)
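As a hedged illustration of the heavy-tail point (my own sketch; the lognormal form and the sigma value are assumptions, not something the comment specifies), a small simulation shows how much of the total impact a heavy-tailed distribution concentrates in the top 1% of people:

```python
import random, math

random.seed(1)
n = 100_000
sigma = 2.0                          # spread of log-impact (assumed)
impacts = sorted(math.exp(random.gauss(0.0, sigma)) for _ in range(n))

total = sum(impacts)
top1_share = sum(impacts[int(0.99 * n):]) / total
print(f"Share of total impact from the top 1%: {top1_share:.0%}")
```

Under these assumptions, the top 1% accounts for a large minority of the total impact, and increasing sigma concentrates it further; whether real-world impact follows anything like this shape is, of course, the substantive question.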