The Stereotype Trap

[Image: StereotypeMetaLab.png]

That women in science, and in the professional world in general, are subject to gender biases with real consequences (lower pay, fewer career opportunities) goes without saying. In this context, I find it important to be aware of how easy it is to be biased myself. Not in order to justify, but to better understand. I have recently had two experiences with my own and fellow female researchers’ biases, in situations where I somewhat slipped into a man’s skin.

I haven’t grown chest hair, but one thing one can do to slip into a man’s skin is to work on data- and computation-heavy projects. I know this is terribly stereotypical, but please go and compare the female-to-male ratio at any given computational modeling conference with that at an infant cognition one. In any case, there is only one such project I am involved in, MetaLab, an online platform that assembles meta-analyses on language development, makes them publicly available, and allows data visualization and some computations like power analysis.

Now, slapping on my man skin, aka the MetaLab poster for the BCCCD conference (see picture above in case you are not quite sure what this would look like), I got one female researcher who looked, mumbled “Uh, this looks complicated!”, and left. There was another one who actually started talking to me, but who stated, before I could even open my mouth: “I am not sure I will understand this. It looks very difficult.” Granted, this is a sample of N=2 (although Christina just told me that she got similar reactions, exclusively from women, on a similar poster recently), and there were many others who did not say anything like this. Nevertheless, I had never gotten any such reaction on any other project. So man skin experience #1 showed me a few examples of women having a that’s-too-complicated-for-me bias against themselves.

Well, you might say, this man skin isn’t too convincing. But I have an even better one. It’s my first name. First names ending in ‘o’ are, across many cultures, associated with men rather than women. I think I first became painfully aware of this when a Russian family friend persistently called me “Shoa”, because he just didn’t want to deal with the fact that a little girl’s name ended with an “o”. Fast-forward, and I keep receiving countless pieces of mail addressed to “Mr./Herr/Dhr./M. Tsuji”. Recently, this involuntary man skin, in combination with the MetaLab man skin, culminated in me being imagined as an “over 40-year-old single guy who watches porn movies in his free time.”

The female researcher had been in email contact with me in the context of contributing data to MetaLab. When we met for the first time, she told me she had imagined me to be said single guy, and, so I understood, among other things because she had, in her mind, compared a person who voluntarily chooses to work on inputting hundreds of papers into spreadsheets and then analyzing them to some male programmer acquaintances of hers.

Since I had been thinking of writing a piece about how women can be as biased as men, I contacted her and asked whether I could use that story. Now, what follows is a somewhat complex but good lesson in biased thinking, combined with my generation of around-30-year-old women’s attitude towards feminism, which (to digress a bit) is highlighted very well in the context of the current US presidential debate.

The female researcher replied promptly, further explaining that, first, it was my name that had misled her into thinking I was male, and, second, once she had assumed I was male, she had linked the spreadsheet-analyzing project to her programmer stereotype. She agreed with me that women can be as prone to gender stereotypes as men, and encouraged me to describe our encounter. Great, I thought, wrote my piece and sent it back to her. In my draft, I compared this incident to man skin experience #1, implying that women can be biased against women (man skin experience #1: women not assuming women to be able to do this type of work; man skin experience #2: women not assuming women to be likely to do this type of work), and that they can in addition have stereotypical assumptions about guys, like programmer = porn watcher.

No, she said, that’s not quite what happened. You’re right I had the second bias, the programmer-porn one. But you’re wrong about the first, I didn’t think it unlikely at all that a woman was working on your project, it was just your name that made me assume you were a guy in the first place, and linking you to the second bias would have never happened without the first.

At first I didn’t get it. I got the logic, of course, but not why this distinction was important for my main point: that women can be as biased as men. And on some level I still think so. Is being biased something where it matters which or how many groups of people you are biased against? But after some more thinking, I conclude this was a valuable lesson in how easily we are all trapped in our own presumptions and biases. First, it is true that the name bias was downplayed in my mind, even though she had told me about it very explicitly and this involuntary man skin is a very salient feature of my daily life. It was overridden by the lure of the direct and sensational MetaLab-porn-guy link. This has to do with my presumption (fueled by man skin experience #1) that people associate male rather than female researchers with a project like MetaLab, which made me immediately jump to the conclusion that it was the nature of the project rather than my name driving her man assumption (I think at this point it is worth mentioning that we are one man and six women on this project).

Second, it illustrates the important point that we might more readily allow ourselves to have, and admit to having, biases against, say, white male programmers than against other groups. This is important for two very different reasons: (a) this is not a “better” kind of bias to have, and (b) given our knowledge that many of our biases are implicit, “allowing” ourselves to be explicitly biased against some groups, but not others, does not mean that we are, in fact, less biased against those other groups.

Third (and here it gets really confusing, in case you are not confused yet), in contrast to the point I just made, where I assumed we do not allow ourselves to be explicitly biased against women, I return to my digression about the US elections: our generation of women, partly living the feminist dream, has the luxury NOT to vote for a female president just because she is a woman, or to write this very blog post about how women are biased. But I am very aware that none of this diminishes the real consequences women in the professional world face based on explicit or implicit, wanted or unwanted, gender biases. Rather, it is a lesson in one of the biggest dangers of (gender) stereotypes: that they are so easy to adopt and relate to.
