
When Androids Dream of Electric Black Sheep

Representation matters. While this phrase has been bandied about so often recently as to be rendered almost a cliché, it nonetheless retains its intrinsic worth and basic truth. Many readers are of course aware of the recent controversy surrounding Google and its Gemini AI program. In summary, Google temporarily paused the AI’s ability to generate images of people after it was found to be generating historically inaccurate images, such as, perhaps most notoriously and most click-bait-worthy, racially diverse Nazi-era German soldiers.

Of course, the images were inaccurate; no, there was most likely never a female pope; and unfortunately, no, there were no early Black Americans visibly among the founding fathers. But while these images are undoubtedly inaccurate, they are not erasure, they are not insidiously corrective as part of an overarching conspiracy, and they are certainly not about to rewrite the entrenched past.

They are simply pieces of art, generated by a program that, for all the hype, is an amalgamation of unthinking algorithms directed to depict a world in which representation is key to inclusion, on behalf of a living world whose historical relationship with diversity is deeply complicated. Finding the right balance for truly scaled and systemic representation, in a manner that respects the achievements of the past and the increasingly varied stakeholders of the future, without covering ugly blemishes or ignoring established realities, will necessarily be iterative and at times messy. Gemini is attempting to dream a better world; it just hasn’t sorted this dream from our waking reality yet. Reckoning with a shared sense of self and identity will take time, and may be a constantly moving target; relying on new, playful technology to act as a salve or substitute for critical thought will not get us there any faster.

The backlash also ignores a key, deeply empathetic insight: it hurts to be ignored. How does it feel when you’re the one overlooked? In many arenas, history still avoids reckoning with itself. For example, it was not until 2018 that the Texas State Board of Education decided that the curriculum should be changed to emphasize that slavery was a primary cause of the Civil War; as of 2015, a state-adopted textbook still referred to enslaved people as “immigrant workers.”[1] Google’s images are a technical mistake; they are not the problem.

Certain images have in many cases become metonyms for certain concepts, such as an emaciated African child for poverty and starvation (“just one dollar a day…”), despite many other nations and peoples experiencing the same plight. Here, however, the backlash is generally silence: the connotation is negative and, at least in the US, safely distant, and no one wants to be the poster race for hunger.

Many are using this incident to point to the perceived corrosive nature of diversity. We believe it represents an opportunity to reflect upon the opposite: to exclude is to reduce and diminish, and history needs to make space for all who exist within it. Yes, Google, fix the algorithm (who knows when someone will need a realistic, machine-generated image of Martin Van Buren), but to everyone who took umbrage, please don’t forget how this exclusion made you feel, and then press on to the question of why.