White Supremacy and Artificial Intelligence
In her new book Race After Technology: Abolitionist Tools for the New Jim Code, Ruha Benjamin breaks down the “New Jim Code,” technology design that promises a utopian future but serves racial hierarchies and racial bias. When people change how they speak or act in order to conform to dominant norms, we call it “code-switching.” And, like other types of codes, the practice of code-switching is power-laden. Justine Cassell, a professor at Carnegie Mellon’s Human-Computer Interaction Institute, creates educational programs for children and found that avatars using African American Vernacular English led Black children “to achieve better results in teaching scientific concepts than when the computer spoke in standard English.” But when it came to tutoring the children for class presentations, she explained that, “We wanted it [the avatar] to practice with them in ‘proper English.’ Standard American English is still the code of power, so we needed to develop an agent that would train them in code-switching.” This reminds us that whoever defines the standard expression exercises power over everyone else, who must fit in or risk getting pushed out. But what is the alternative?

When I first started teaching at Princeton, a smartphone app, Yik Yak, was still popular among my students. Founded in 2013, it allowed users within a five-mile radius to post anonymously and to vote others’ posts “up” or “down.” It was especially popular on college campuses and, like other social media sites, the app reinforced and exposed racism and anti-Black hatred among young people. As in internet comments sections more broadly, people often say on Yik Yak what they would not say in person, and so all pretense of racial progress is washed away by spending just five minutes perusing the posts.
But the difference from other virtual encounters is that users know that the racist views on Yik Yak are held by people in close proximity—those you pass in the dorm, make small talk with in the dining hall, work with on a class project. I logged on to see what my students were dealing with, but quickly found the toxicity to consist overwhelmingly of … racist intellectualism, false equivalences, elite entitlement, and just plain old ignorance in peak form. White supremacy upvoted by a new generation … truly demoralizing for a teacher. So I had to log off. Racism, I often say, is a form of theft. Yes, it has justified the theft of land, labor, and life throughout the centuries. But racism also robs us of our relationships, stealing our capacity to trust one another, ripping away the social fabric, every anonymous post pilfering our ability to build community. I knew that such direct exposure to this kind of unadulterated racism among people whom I encounter every day would quickly steal my enthusiasm for teaching. The fact is, I do not need to be constantly exposed to it to understand that we have a serious problem—exposure is no straightforward good. My experience with Yik Yak reminded me that we are not going to simply “age out of” White supremacy (as Jessie Daniels demonstrates in Cyber Racism), because the bigoted baton has been passed, and a new generation is even more adept at rationalizing racism. Yik Yak eventually went out of business in 2017, but what I think of as NextGen Racism is still very much in business, though more racially coded than what we typically find in anonymous posts. Coded speech, as we have seen, reflects particular power dynamics that allow some people to impose their values and interests upon others. As one of my White male students, Will Rivitz, wrote in solidarity with the Black Justice League, a student group that was receiving hateful backlash on social media after campus protests:
“To change Yik Yak, we will have to change the people using it. To change those people, we will have to change the culture in which they—and we—live. To change that culture, we’ll have to work tirelessly and relentlessly towards a radical rethinking of the way we live—and that rethinking will eventually need to involve all of us.” I see this as a call to rewrite dominant cultural codes rather than simply to code-switch. It is an appeal to embed new values and new social relations into the world, because as Safiya Noble writes in Algorithms of Oppression, “an app will not save us.” Whereas code-switching is about fitting in and “leaning in” to play a game created by others, perhaps what we need more of is to stretch out the arenas in which we live and work to become more inclusive and just. If, as Cathy O’Neil writes in her book Weapons of Math Destruction, “Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide,” then what we need is greater investment in socially just imaginaries. This, I think, would have to entail a socially conscious approach to tech development that would require prioritizing equity over efficiency, social good over market imperatives. Given the importance of training sets in machine learning, another set of interventions would require designing computer programs from scratch and training AI “like a child” (as Jason Tanz wrote in an article for Wired) so as to make us aware of social biases. The key is that all this takes time and intention, which runs against the rush to innovate that pervades the ethos of tech marketing campaigns. But, if we are not simply “users” but people committed to building a more just society, it is vital that we demand a slower and more socially conscious innovation.
The nonprofit AI research company OpenAI offers a practical model for this approach: it says it will stop competing with, and start assisting, another project if that project is value-aligned and safety-conscious, because continuing to compete usually short-changes “adequate safety precautions” and, I would add, justice concerns. Ultimately we must demand that tech designers and decision-makers become accountable stewards of technology, able to advance social welfare. For example, the Algorithmic Justice League has launched a Safe Face Pledge that calls on organizations to take a public stand “towards mitigating the abuse of facial recognition analysis technology. This historic pledge prohibits lethal use of the technology, lawless police use, and requires transparency in any government use” and includes radical commitments such as “show value for human life, dignity, and rights.” Tellingly, none of the major tech companies has been willing to sign the pledge to date. Nevertheless, there are some promising signs that more industry insiders are acknowledging the complicity of technology in systems of power. For example, thousands of Google employees recently condemned the company’s collaboration on a Pentagon program that uses AI to make drone strikes more effective. And a growing number of Microsoft employees oppose the company’s contract with US Immigration and Customs Enforcement: “As the people who build the technologies that Microsoft profits from, we refuse to be complicit” (Frenkel, New York Times, June 19, 2018). Much of this reflects the broader public outrage surrounding the Trump administration’s policy of family separation, which rips thousands of children from their parents and holds them in camps reminiscent of the racist regimes of a previous era. The fact that computer programmers and others in the tech industry are beginning to recognize their complicity in making the New Jim Code possible is a worthwhile development.
It also suggests that design is intentional and that political protest matters in shaping internal debates and conflicts within companies. This kind of “informed refusal” expressed by Google and Microsoft employees is certainly necessary as we build a movement to counter the New Jim Code, but we cannot wait for worker sympathies to sway the industry. Where, after all, is the public outrage over the systematic terror exercised by police in Black neighborhoods, with or without the aid of novel technologies? Where are the open letters and employee petitions refusing to build crime production models that entrap racialized communities? Why is there no comparable public fury directed at the surveillance techniques, from the prison system to the foster system, that have torn Black families apart long before Trump’s administration? The selective outrage follows long-standing patterns of neglect and normalizes anti-Blackness as the weather (as Christina Sharpe describes in her book In the Wake), whereas non-Black suffering is treated as a crisis. This is why we cannot wait for the tech industry to regulate itself on the basis of popular sympathies.

This edited excerpt from Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin (2019) appears by permission of the author and Polity Press.