Every time we wake Siri, a lovely female voice says hi to us. When the question we ask gets answered, Siri's gentle voice sounds almost heavenly. Why does Siri speak in a woman's voice? Because people simply prefer female voices? And why is the icon for a website's live customer service always a smiling young woman wearing a headset? At bottom, it is because people assume that women are "naturally" the ones who provide help. If we want to set this tilted scale right, why not start by changing the voice of our voice assistants...
When the makers of Apple's Siri unveiled Viv at TechCrunch Disrupt NYC last month (editor's note: the article was first published in June 2016), the crowd—and press—swooned. Pitched as “the intelligent interface for everything,” Viv is a personal digital assistant armed with a nearly transcendent level of sophistication. She is designed to move seamlessly across services and to fulfill complex tasks such as “Find me a place to take my peanut-free uncle if it rains tomorrow in Cleveland.” Viv is also just the latest virtual helpmeet with a feminine voice and a female name. In addition to Siri (Norse for “beautiful woman who leads you to victory”), her sorority sisters include Amazon's Alexa and Microsoft's Cortana (named after a voluptuous character in the video game “Halo” who wears a “holographic body stocking”).
Why are digital assistants overwhelmingly female? Some say that people prefer women's voices, while others note that in our culture, secretaries and administrative assistants are still usually women. Regardless, this much is certain: consistently representing digital assistants as female matters a great deal in real life, because it hard-codes a connection between a woman's voice and subservience.
As social scientists explore the question of why women lag so far behind men in workplace leadership, there's increasing evidence that unconscious bias plays an important role. According to Erika Hall, a professor at Emory University's Goizueta Business School, unconscious bias has its origins in the “cultural knowledge” we absorb from the world around us. This knowledge can come from movies and television, from teachers and family members; we acquire it almost osmotically by living in our society. Unconscious bias happens when we then engage in discriminatory behaviors because we unwittingly use this knowledge to guide our actions.
And this knowledge is everywhere: Our society largely depicts women as supporters and assistants rather than leaders and protagonists. A recent study found that women accounted for only 22 percent of protagonists in the top-grossing films of 2015 (and only 13 percent of protagonists in films directed by men). A comprehensive review of video game studies found that female characters are predominantly supporting characters, often “assistants to the leading male character.” And a study of prime-time television found that women make up the majority of aides and administrative-support characters. These portrayals create “descriptive stereotypes” about what women are like—that women are somehow innately more “supporter-like” than “l(fā)eader-like.”
Because Viv and her fellow digital assistants are female, their usage adds to the store of cultural knowledge about who women are and what women do. Every time you say, “Viv, order me a turkey club” or “Viv, get me an Uber,” the association between “woman” and “assistant” is strengthened. According to Calvin Lai, a Harvard University post-doc who studies unconscious bias, the associations we harbor depend on the number of times we are exposed to them. As these A.I. assistants improve and become more popular, the number of times we're exposed to the association between “woman” and “assistant” increases.
The real-world consequences of these stereotypes are well documented: Research has shown that people tend to prefer women as supporters and men as leaders. A study of engineering undergraduates at the University of Michigan found that when students presented work, the men tended to present the material while the women tended to play the role of “supporter of the male expert.” In another study, when people were shown identical resumes with either male or female names for a lab-manager position, they rated the male candidate significantly more competent and hirable. A third study found that saleswomen earned less than salesmen in part because they'd been denied support staff—why would a supporter need a supporter, after all?
While “descriptive stereotypes” lead to women not being perceived as suitable for leadership positions, stereotypes can be prescriptive, too: Women are expected to conform to the stereotype of being a supporter or helper, and are rejected or punished for failing to do so. Linguist Kieran Snyder's study of performance reviews in tech companies showed that women are routinely criticized for personality traits that don't conform to feminine stereotypes. Women, but not men, were consistently docked for being “abrasive” and for not “l(fā)etting others shine.” In other words, they were punished for not being good helpers and supporters.
In a study by New York University psychologist Madeline Heilman, a woman who stayed late to help a colleague was rated less favorably than a man who stayed to help—but she was penalized more than a man when she declined to stay and help. Indeed, because women are expected to be helpers, they don't actually accrue any reward for helping—they're simply living up to the expectation. But if they decline to help, they are seen as selfish. Women are aware of this expectation, too: In a study of medical residents, one female resident reported that when leading others, “The most important thing is that when I ask for things they should not sound like orders.”
Ultimately, the more our culture teaches us to associate women with assistants, the more real women will be seen as assistants, and penalized for not being assistant-like. At this moment in culture, when more and more attention is being paid to women's roles in the workplace, it's essential to pay attention to our cultural inputs, too. Let's eschew the false choice between male and female voices. If these A.I. assistants are meant to lead us into the future, why not transcend gender entirely—perhaps a voice could be ambiguously gendered, or shift between genders? At the very least, the default settings for these assistants should not always be women. Change Viv to Victor, and maybe one fewer woman will be asked to be the next meeting's designated note-taker.