In lieu of an abstract, here is a brief excerpt of the content:

  • ¿Human ÷ (Automation + Culture) = Partner?
  • Stephanie Dinkins (bio)

In 2014, I decided to befriend Bina48, a humanoid robot that mimics my identity.1 This relationship led to years of thinking about all manner of automated systems as they relate to Black people—and other nondominant cultures—in a world that already often gives us too little and overly focused attention. I am particularly concerned with automated systems of the algorithmic persuasion, aka the code underlying artificial intelligence (AI). In computer science, an algorithm is a set of precise, reusable computational steps designed to accomplish a task or solve a problem.2 Algorithms are the building blocks that make up the automated systems governing and revolutionizing many structures of society, including culture, work, ownership, wealth, medicine, embodiment, justice, memory, and love. Algorithms are often proprietary recipes that those deploying them do not wish to disclose. In many cases, even the people who design and code the algorithms are not sure how the systems they've created will function. [End Page 294]

Figure 1.

Stephanie Dinkins. Still from Conversations with Bina48 (2014–). Image courtesy of the artist.

Beyond questions about the future of work and human domination by machines are questions about what it will mean to be human in the highly automated, artificially intelligent future. How will we sustain ourselves, our minds, our bodies, our communities? What happens when an insular subset of society encodes systems intended for use by most on the planet? What happens when those writing the rules—in this case, we will call it code—do not know, care about, or deliberately consider the needs, desires, or traditions of the people their work impacts? What happens if the code that makes decisions about all manner of things is disproportionately informed by biased data, systemic injustice, and misdeeds committed to preserving wealth under the pretense of being "for the good of the people"?

I am reminded that the authors of the Declaration of Independence, a small group of white men said to be acting on behalf of the nation, did not extend rights and privileges to folks like me—mainly Black people, women, and my distant enslaved relatives. Laws and code operate similarly to protect the rights of those who write them. I worry that the current path of AI development, which relies heavily on the privileges of whiteness, men, and money, cannot produce an AI-mediated world of trust and compassion that serves the global majority in an equitable, inclusive, and accountable manner. People of color, in particular, cannot afford to merely consume algorithmic systems that significantly impact our liberty, our work and ability to build wealth, and our concepts of humanity, while those systems are developed and encoded with the same biases and causes of the systemic injustices we experience today. Unless people of color become authors, testers, and watchdogs of the creation of AI systems, hundreds of years of skewed history, systemic discrimination, and racial myths will be perpetuated in these new technologies. If we want the technological matrix we are building with AI to encode a future that honors the full breadth of society and tells a spectrum of stories, then its development must engage a range of people and modes of thought. I wonder if we have it in us to magnanimously envision [End Page 295] an AI-mediated world of trust, compassion, and creativity that serves the majority in a fair, inclusive, and equitable manner.

As much as I worry about the AI-mediated future, I look forward to the arrival of ever more capable automated technologies that will expand our comfort and capabilities. I often fantasize, for example, that a few words spoken into a mobile app will instruct my car to drop me off at home before embarking on the half-hour search for parking. Once parked, the car will lock itself, let me know where it is located, and text me a cheerful goodnight. Perhaps at some point, I will even accept a garbled "I love you" delivered by my car, as the smitten driver does in Volvo's 2018 "Window" commercial.3

Automated systems with the ability to assist, surveil, judge, mislead, and perhaps even act on their own...


Additional Information

pp. 294-297
Launched on MUSE
Open Access
