The AI Slur “Clanker” Has Become a Cover for Racist TikTok Skits
Source: https://www.wired.com/story/the-ai-slur-clanker-has-become-a-cover-for-racist-tiktok-skits/
Summary:
A wave of anti-AI sentiment centered on the word “clanker” (evoking the clanking noise of machinery) has recently swept social media. The term, borrowed from science fiction, is used as a slur against artificial intelligence, but it has increasingly been twisted into a cover for racism.
It began in July with a comedy skit by 19-year-old Black creator Harrison Stewart: playing a disapproving father in a fictional 2044, he berates his daughter’s AI boyfriend as a “dirty clanker.” After the skit drew millions of views, fans dubbed him “the clanker guy.” In August, however, Stewart abruptly announced he was ending the series after racist variants of the slur began appearing in his comments, aimed at him rather than at AI; once the insults targeted him personally, he said, the joke stopped being funny.
As “clanker” became a trending term, drawing more than 2 million Google searches over the past three months, more creators began filming satirical skits about robots facing discrimination. Some of these videos plainly recycle segregation-era scenarios from before the US civil rights movement: one creator, dressed as a police officer, tells a robot that clankers belong in the back of the bus, while another refuses a robot service at a diner while protesting “I have robot friends.” Creator Samuel Jacob acknowledges drawing on Jim Crow history but insists the skits are merely “rage-bait” entertainment.
Moya Bailey, a professor at Northwestern University, argues that such content is essentially an excuse for racist jokes: laughing at robots being cast as second-class citizens, she says, reflects how deeply anti-Blackness is bound up with ideas of labor, service, and servitude. AI itself carries related harms: OpenAI’s Sora model has been criticized for reinforcing stereotypes about minority groups, and the majority-Black Boxtown neighborhood in Memphis has suffered environmental injustice from an xAI data center, both signs of structural discrimination in the technology’s development.
Stewart says the outcome pains him: his intent was to satirize the ethics of runaway technology, not to hand others a shield for spreading hate. An online craze waged in the name of opposing AI has ended up exposing the enduring undercurrent of racial prejudice in the digital age.
Original article:
In July, just after receiving an email pitch about the “perfect” AI girlfriend, content creator Harrison Stewart made a TikTok skit using the anti-AI slur “clanker.”
Pretending to be a disapproving father, he confronted his daughter’s robot boyfriend in the year 2044. “What’s your name? No it’s not. It’s model number 626 S Series. That’s your name, you dirty clanker,” Stewart says in the video.
As one of the original creators to make clanker-themed TikToks, Stewart, who goes by Chaise online, was dubbed the “clanker guy” by his fanbase after racking up millions of views. But in August, the 19-year-old content creator, who is Black, announced that he would no longer be publishing any more videos on the subject. The joke, he said, and responses to it, had become racist.
“When I go into my comment section and people are starting to call me ‘cligger’ and ‘clanka’ or ‘you’re a dirty clanker’—not voicing those slurs at AI and electronics, but at me—I don’t find that entertaining or funny at all,” Stewart explains in the video.
The origins of clanker date back to late 1950s author William Tenn, who used the word to describe robots from science fiction films, but its adoption as a sort of slur came from the Star Wars franchise, where it was used as a derogatory term toward the antagonist droids and troopers. In recent months, it has become a protest of sorts against the rapid implementation of AI into virtually every aspect of society.
Over the past three months, the term has garnered over 2 million Google searches and at least hundreds of thousands of social media posts. In an X post in July, Senator Ruben Gallego of Arizona wrote, “Sick of yelling ‘REPRESENTATIVE’ into the phone 10 times just to talk to a human being? My new bill makes sure you don’t have to talk to a clanker if you don’t want to.”
On TikTok and Instagram, however, the ongoing backlash against AI has taken on the form of short video skits, envisioning a future where robots have been fully incorporated into society. The terms “clanker,” “tinskins,” “wirebacks,” and “oil bleeders” are used as pejoratives in these skits. But some of these skits appear to be using clankers as stand-ins for Black people, perpetuating racist tropes and scenarios that harken back to a pre–Civil Rights era.
In one skit, creator Samuel Jacob dresses up in a police officer’s uniform and throws out phrases such as, “Don’t you know clankers sit in the back of the bus, Rosa Sparks?” and “Come on George Droid, looks like it’s jail time for you, rust monkey.” Another skit by TikTokker Stanzi Potenza depicts a waitress at a diner acting out a scenario in which she’s refusing service to the subject with the words “pov: you’re a clanker in 2050” sprawled across the screen. Speaking in a Southern drawl, she tells the camera, “Didn’t you see the sign outside? We don’t serve clankers here.” The caption underneath the video is a variation of a common phrase often used by people to defend their own prejudices: “Don’t worry, I have robot friends.”
Jacob tells WIRED the parallel between his skit and racial segregation in the US is intentional.
“It’s pretty obvious what it’s based off of, in terms of the historical standpoint, like everything that was happening in the 1950s and such with the Jim Crow laws and stuff,” he says. “I thought it was a funny idea of ‘history repeats itself sometimes,’ but at least it would be against robots.” He adds that while he’s engaging in a “little bit of rage-baiting,” he doesn’t take the skit seriously nor does he hold the beliefs portrayed. The backlash, he says, is likely to remain a part of what he does and is “something I gotta learn to deal with and just move on from.” Potenza declined WIRED’s request for comment.
Moya Bailey, a professor at Northwestern University who specializes in the representation of race and gender in the media, says the anti-Black subtext to some clanker skits suggests that some are using the term as a justification for racist jokes.
“I think the folks that go that route of racist humor honestly wanted an excuse—and it's a pretty good one—to make some jokes that I think they just wanted to make and felt clever in making those connections,” she says. Several of the skits show clankers being treated as second-class citizens. “For me, the racism very much shows just how embedded and how tied anti-Blackness is with our ideas of work and labor and service and servitude.”
On TikTok, people defending the trend argue it’s not an example of racism because it's depicted in the context of a make-believe universe in which humans aren’t being targeted. “It’s not that deep” is a comment that is commonly thrown around under videos dissecting the seemingly racist references in clanker videos.
“I didn’t think too deeply about this,” Jacob says. “I was pretty much riding the trend like a lot of other creators were. Now it’s slowly died out, and I’m just going to keep moving on to the next thing. I don’t think people should harp on it any longer or give any more energy to it. Don’t want to keep perpetuating it into something bigger than it needs to be.”
Stewart says he’s bothered that people used his skits to provide cover for the offensive ones.
“What got me more upset than anything was people were justifying it by saying, ‘[Stewart] is Black and he made clanker videos, so what’s your problem?’” he says. “I was poking fun of the fact that if things got really that advanced, how would we deal with it? It’s not supposed to be a racist thing.”
AI has already been used to generate racist imagery that draws from stereotypes featured in memes and TikToks. The generative AI tool Sora from OpenAI has come under fire for perpetuating biases against people of color, women, disabled people, and other minority groups in the content being prompted by users.
Bailey also points out that racism within the AI industry goes as far as the actual methods used to power it. She references the negative health impact of xAI data centers in a Memphis neighborhood called Boxtown, which is 90 percent Black, as an example of environmental racism inflicted by the AI industry.
Though not all clanker videos are offensive, Bailey says people should carefully consider the tropes and references they’re using to take an anti-AI stance. “Jokes create an in-group and an out-group,” she says. “That's part of what makes it humorous, is like, ‘I get this. We're on the same team, we're on the same side.’ And so I think people really need to take some time to think through who it is that they're aligning with when they say that certain things are funny.”
Stewart echoed a similar perspective: “I see a pattern with how Black people are portrayed in the media and how we're the butt of the joke at the end of the day.”