ALEXA’S BODY
What the Interface Obscures and How Design Could Help Us See
Johannes Bruder
Seeing through the Internet’s Eyes
A Black woman wakes up in a dark bedroom to the chirping of an unknown bird. “Alexa, what’s the time,” she asks. “It’s 6 a.m.,” replies a female voice emanating from a disk pulsating with light, as we observe the protagonist getting dressed. Carefully, the woman walks down the stairs, greets her dog, and lights the stove to make coffee. She delicately reaches for a cup from the shelf, and after pouring herself some coffee she gravitates toward the window and calmly stares into the distance. “Alexa, what’s the weather like right now,” she asks, her eyes seemingly locked on raindrops running down the glass. “Currently it’s light rain,” Alexa responds to the coffee-sipping protagonist, who – after a cut – slips into a raincoat and guides her dog out of the front door. Just a second earlier, a close-up shot of the handle attached to the dog’s harness revealed that the woman is blind.
What you just read is a description of “Morning Ritual,” a short commercial that advertises Amazon’s smart speaker Echo Dot to blind or partially sighted people and has thus been approved by the Royal National Institute of Blind People (RNIB). The trade publication Campaign quotes the RNIB’s senior technology manager, who believes that “a device that can give you information via voice simplifies tasks and opens a world of accessibility” for those living with sight loss.1



The commercial is typical of Amazon’s recent marketing strategy, which involves promoting the integration of its smart devices into domestic environments. Tailored to specific countries or markets, these commercials detail how Alexa will augment the lives of those who provide her with a home. Against this background, “Morning Ritual” seems convincing, for it gives examples of how an unassuming, tiny speaker that goes under the name of Echo Dot can step in where (partial) sight loss obscures the external world. But then again, the commercial makes you wonder: who is to be convinced?
That is to say that there’s a bitter irony inherent in “Morning Ritual,” which resides in the fact that the commercial is a short film, and mostly devoid of voiceover and dialogue at that. What happens throughout the commercial consequently remains entirely opaque to anyone who cannot see. “Morning Ritual” is therefore probably the most sincere ad that Amazon has ever commissioned and figures as a powerful visual metaphor for the sight loss that nearly all of Alexa’s interlocutors suffer from: while Alexa permits the owner of an Echo Dot to metaphorically see through the eyes of the internet, she herself can never be seen.
What is present in the users’ living rooms is no more than a tiny, unassuming speaker-microphone combination that provides the interface for interactions between a cloud-based machine-learning system and its human interlocutors. What the interface obscures is the human labor that first put Alexa in the position to appear intelligent. In fact, it is user experience and interface design that allows those who conceive themselves as “users” to understand the services of Alexa as mere products of a distant, machinic, or “infrastructural” intelligence.2 The real crisis of design – if you want to use this term – is that it often succumbs to user experience and blinds us to the logistical assemblage of precarious labor that happens in front, behind, above, and below the interface.
This essay is an attempt at breaking this spell and rethinking the role of design – as that which potentially reinstates the symmetry between, and a common political agenda for, users and producers. I begin by assembling Alexa’s body and continue by analyzing the subject positions of workers caught within. Based on these insights, I argue that UX and interface design are in a unique position to support workers’ struggles against a social factory – by reminding the user that she is a producer herself.
Alexa’s Anatomy
In 2018, Kate Crawford of AI Now and Vladan Joler of Share Lab published their very own attempt at visualizing Alexa’s typically invisible, ungraspable, and, literally, sublime body.3 Their essay and interactive map, published under the title “Anatomy of an AI System,” provide an anatomical case study of an artificial intelligence system made of human labor. It links pit mines and the extraction of metals and rare earths to the labor of data labeling, the automated training of machine learning algorithms, and the recovery of resources on large e-waste grounds in the Global South. The strength of Crawford and Joler’s map is that it draws seemingly unrelated processes of extraction and exploitation together into one “body,” where each has a particular, anatomical function.
“Anatomy of an AI System” responds to calls for approaching structural inequalities through the lives of those whose work is typically not formally recognized – typically because it is maintenance or “infrastructural” labor.4 In any “fleeting moment of interaction [with an Echo device],” Crawford and Joler write, “a vast matrix of capacities is invoked: interlaced chains of resource extraction, human labor, and algorithmic processing across networks of mining, logistics, distribution, prediction, and optimization.”5
Crawford and Joler briefly elaborate on lithium extraction on the Bolivian Salar de Uyuni plateau, tin mining on the Indonesian island of Bangka, and the difficulties faced by large corporations such as Philips when they try to avoid tantalum sourced by children in the Congo. Martín Arboleda, meanwhile, speaks of circuits of (planetary) extraction to emphasize the extent to which the production of computing devices and the logistics of mining are interrelated,6 while Orit Halpern has coined the term “machine learning landscapes” to analyze how the Atacama desert in Chile is reconfigured as one of the planet’s primary resources for the introduction of general artificial intelligence.7
Yet, industrialized landscapes in the Global South are not the only spaces where human bodies are exhausted so that Alexa can let you know – at any time – what time it is. Urban environments are gradually reorganized to provide excellent conditions for the design and implementation of smart or intelligent infrastructure. In Nairobi’s Kibera neighborhood, the Metiabruz neighborhood of Kolkata, and Metro Manila in the Philippines, predominantly female cognitive workers manually label data to augment the capabilities of intelligent systems – such as the ability to understand various dialects and pronunciations in order to translate them into text that can be effectively searched.
The optimization of remote landscapes and urban nodes for machine learning shows how former colonial relations map onto today’s digital labor, for supposedly decolonized territories are in this process reorganized as zones of extraction where workers’ rights cannot develop thanks to historical legacies of “military, economic, and cultural dominance.”8
Speaking of military legacies: the server farms and data centers on which Alexa runs are often placed in abandoned military zones,9 and the humans employed to troubleshoot these systems profit from having served in the armed forces. Advertisements for data center jobs emphasize that physical endurance is at least as important as management skills and technical know-how. An Amazon data center technician’s role “has a physical component requiring the ability to lift and rack equipment up to 40 lbs.” It may require working in cramped spaces – which is why Salute Inc., an organization founded by a former US Army reservist, is trying to place army veterans in data center management. Apparently, “the transition from infantryman to data center technician is easy: working remotely, maintaining dangerous equipment, communicating with a team, and acting fast in the face of unexpected situations are skills expected both in the army and in data centres.”10
The subject positions of maintainers and troubleshooters are defined primarily by operating manuals and technical protocols, and by their ability to function as a cog in the wheels of the machine. On a Quora thread, a Facebook data center engineer asks: “Can you work in an environment where you have spoken to not one other person for 8 hours?”11 And Tim Burke, former owner of a data center, explains that his favorite data center “was a building so hardened it was an accidental Faraday cage. When I went in, I knew that communication with the outside world was going to be severed like a cut ethernet cord. The data center was where I went to get away. It was where I went to think.”12
Many workplaces in contemporary machine learning landscapes are indeed not only or always remote, but lonely in that they define sociality as the ability to communicate with the machine.
The Socio-Material Configurations of Contemporary AI
In “Cloud Cosmogram,” Maya Indira Ganesh and I analyze how the roles of humans are defined where humans are supposedly absent.13 Via blog posts, personal conversations, and job descriptions, we delved into the work-lives of data center engineers and found that they attend primarily to strange noises and flashing lights – or rather, anomalies in the patterns of noises and signals are what they are trained to perceive:
Servers visually communicate through their flashing lights – green, yellow, blue, red, you name it. Each color and how quickly it flashes means something. You can feel the pulse of your organization simply by stepping inside your data center and taking a quick glance. It’s almost a server morse code. The lights make basic troubleshooting easy. If port 41 is flashing like crazy, something is up.14
The solitary labor of maintainers and troubleshooters extends into home offices and living rooms in the Global North, where an abundance of user interfaces allows “microworkers” to rate map data, transcribe spoken text, tag images, or verify product reviews for services like CrowdFlower, TaskRabbit, Upwork, or Amazon’s very own Mechanical Turk (AMT). Again, communication with human customers is rare; tasks appear on the interface and are silently solved as fast as possible to raise hourly wages and get better reviews. Although the products of their work rarely appear as human labor, microworkers tend to maintain “the ideology of the non-hierarchical organization within their walls.”15 By keeping low-status work at a distance, “Turkers” uphold “notions of ‘freedom’ in a freelance ‘gig’ economy mediated and performed by algorithms that are actually systems of control and exploitation.”16

The diagram that Selena Savic produced based on our essay takes the form of a cosmogram. A cosmogram, as described by John Tresch in his analysis of plans for nomadic temples, is simultaneously an architectural plan and an operating manual; it details how “the link with God is made possible by the mediation of a construction described in an extremely detailed and technical manner and this construction has a place for all of society and all of nature.”17 By denoting essay and diagram as cosmogram, we seek to emphasize the integrative power of the Cloud as planetary design.

In this regard, microwork latches onto various forms of un- or underpaid labor that have been constitutive of the digital economy. Lisa Nakamura observes that Fairchild Semiconductor not only produced hardware on Navajo land in the 1960s and 1970s, but also represented the work of Navajo women as “creative-race labor,” which was “understood through the lens of specific ‘mental and physical characteristics’ such as docility, manual dexterity, and affective investment in native material craft.”18 Navajo women represented a mobile, cheap, and flexible workforce that supposedly had a natural disposition to “electronics manufacture as a high-tech version of blanket weaving,” which allowed for the blurring of the lines between wage labor and free, cultural-creative labor. Tiziana Terranova considers this blurring to scaffold digital culture tout court – an observation that has been confirmed by Gabriella Lukács’s analysis of women’s labor in Japan’s digital economy.19 Lukács observes that social media platforms in particular thrived through the sourcing of voluntary contributions to content production and brand development. She references feminist scholars Mariarosa Dalla Costa and Selma James to reconceive the digital economy as an extension of the “social factory,” which always had as “its pivot the women in the home producing labor power as a commodity, and her struggle not to.”20
Amazon’s Alexa supports this co-option of domestic environments and care work in that it secretly employs those who conceive of themselves merely as users. Amazon’s commercials suggest that Alexa is to become part of the family and take on duties such as supporting children with their homework or reminding the overwhelmed father how to cook a simple meal. What remains unseen in TV spots is that every command of the user arguably improves Alexa’s cognitive abilities, for commands are used to train the machine learning system that provides a smooth user experience in the first place. The invisible exploitation and exhaustion of distant bodies through digital infrastructures is hence complemented by the invisible exploitation of the user. Crawford and Joler compare the user to the Greek “chimera,” a mythological animal that was part lion, goat, snake, and monster. Similarly,
the Echo user is simultaneously a consumer, a resource, a worker, and a product… In the specific case of the Amazon Echo, the user has purchased a consumer device for which they receive a set of convenient affordances. But they are also a resource, as their voice commands are collected, analyzed, and retained for the purposes of building an ever-larger corpus of human voices and instructions.21



Indeed, I believe that this recurring, multiple identity should sit at the heart of thinking design “beyond change,” for it allows us to reconceive social relations in the context of contemporary sociomaterial constellations. Making people aware of the fact that the experiences of a small number of happy, privileged, and mostly white bodies are scaffolded by an abundance of exhausted – mostly POC – bodies distributed throughout the world will not suffice; instead, the act of disturbing user experience has to take center stage. Where humans are primarily called upon to maintain and support the machine instead of their human coworkers, the user is always already implicated in the exploitation and exhaustion of distant bodies and an anonymous workforce.
Ceci n’est pas un miroir
In a recent article, Paul Dourish denotes “user experience” as a legitimacy trap. He traces the origins of human–computer interaction (HCI) back to Douglas Engelbart’s early work on natural language processing or the Learning Research Group at Xerox PARC to prove that design was originally conceived “to unleash creative expression and to place the computer in service of human needs, rather than the other way around. If one traces it to the Scandinavian participatory design movement in the 1970s, one similarly will find it rooted in efforts to resist the human as a cog in the machinery of industrial automation.”22
You need not succumb to Dourish’s (selective) version of HCI history to understand that such interfaces as the Echo Dot stand in direct opposition to the ideals that his historical examples of interface design intend to convey. Thanks to the interface, Alexa’s interlocutors help Amazon’s algorithms develop their capacities and supply the – according to Slate magazine23 – most evil corporation on Earth with an abundance of data to automate and optimize its logistical services. Against this background, interface design appears as a user-oriented aestheticization of un- or low-paid domestic labor and rather ugly logistical processes, which it obscures by throwing the user back at herself.
Another domestic smart device, Mirror, is the culmination of this narcissistic program. If it were not affixed to a wall in your living room, Mirror could make you believe that it represents the most recent bet at forcing users to adapt to oversized smartphones. Indeed, its price tag of $1,495 is suspiciously close to that of Apple’s top-end handhelds.
If you manage to pay your monthly subscription fee of thirty-nine dollars, however, Mirror turns into a “nearly invisible home gym,” allowing you to work out at any time of the day, guided by a personal trainer that is not, in fact, in your home and can nevertheless see you thanks to the front-facing camera. Yet, what’s much more important is that the only thing that you can see – apart from the fitness instructor – is the image of your own exhausted, yet hormonally enhanced body.
The so-called runner’s high, which refers to the nepenthean feeling that takes hold of the body during extended workouts, is also described as a flow experience where the normally distinguished elements of a body appear to work in perfect harmony. Mirror apparently allows for this feeling to take hold of your body, particularly since it allows you to watch yourself grow and simultaneously occludes the fact that the precarious workers that first made this experience possible are most likely not in a state of somatic excitement. What you don’t see is that your body has been entangled with an abundance of other bodies – and, crucially, their exhaustion – as soon as you decide to put up Mirror on a wall in your living room.
A similar feeling takes hold of me when any interface fades into the background and things simply work. It’s most likely the result of a networked affect that design can give rise to if the overall system works as promised. While it may be Alexa who speaks, what you hear is essentially an echo of your own thoughts; in return, you gradually grow accustomed to customized, just-in-time information and the uninterrupted presence of a disembodied, disenfranchised, and discreet workforce in your living room. That is to say that interface design provides the gloss that obscures the fact that the user is an integral part of the infrastructure that exploits and exhausts distant human bodies.
However, the user is only superficially at the heart of the design process: “the great irony of the notion of user-centered design is that users (or people) are not, in fact, at the center of it at all. Design is. Something can be more user-centered or less user-centered, but the phrase guarantees that design will always be present.”24 If this rings true, the current situation can only improve if (interface) design can be reconceived – not by way of another marketable and necessarily exhibitable version of design, but as a practice and an attitude of being with and in this world – and being part of Alexa’s body.
Instead of strengthening narcissistic tendencies by design, user experience and interface design could contribute to making felt the ugly logistics that are currently banned from the interface and so remind users that they are part of an infrastructure that defines automation as automated exploitation. Since their work is mission-critical – indispensable to the functioning of the system – (user experience and interface) designers are in a unique position to support the activism of tech workers and employees in Amazon’s fulfillment centers, which provides examples of how resistance is possible even from within Alexa’s body.
In “The Hidden Faces of Automation,” Lilly Irani asks what “computer science [would] look like if it did not see human-algorithmic partnerships as an embarrassment, but rather as an ethical project where the humans were as, or even more, important than the algorithms?”25
Workers’ acts of individual or collective resistance and insubordination, such as fooling productivity sensors and resisting the implementation of surveillance tools, have contributed to mitigating the effects of logistical designs that keep workers isolated and maniacally productive.26 Backdoors, bugs, and secret kill switches might not necessarily hurt planetary corporations, but they can improve the lives of workers in fulfillment centers and the AMT ecosystem.
In any case, more shiny, yet obscuring surfaces and interfaces will only exacerbate social divides long in the making. Feminist digital media scholar Kylie Jarrett writes that the history of capitalism is “the history of struggle within and against a social factory.”27 Designers could assume a central role within this struggle if they engaged with the deconstruction of an uninvolved user. As an anthropologist of design and technology, I wonder if Bianca Williams’s understanding of anthropology would not apply to design in a similar way. In a recent episode of the Cultural Anthropology Podcast, “What Does Anthropology Sound Like: Activism,” she argues that anthropology will always be a ready tool for colonizers and oppressors, because of its beginnings and the way it has been used throughout time. “It’s in its nature. But I believe that there is room for resistance, protest, activism, insurrection, and rebellion in these spaces also.”28

Footnotes
-
Daniel Farey-Jones, “Amazon Alexa Helps Blind Woman in RNIB-Approved Ad,” Campaign, September 2, 2020, www.campaignlive.co.uk/article/amazon-alexa-helps-blind-woman-rnib-approved-ad/1595228. ↩
-
Johannes Bruder, “Infrastructural Intelligence: Contemporary Entanglements between Neuroscience and AI,” in Vital Models: The Making and Use of Models in the Brain Sciences, ed. Tara Mahfoud, Sam McLean, and Nikolas Rose (Cambridge, MA: Academic Press, 2017), 101–28, doi.org/10.1016/bs.pbr.2017.06.004. ↩
-
Kate Crawford and Vladan Joler, “Anatomy of an AI System: The Amazon Echo as an Anatomical Map of Human Labor, Data and Planetary Resources,” AI Now Institute and Share Lab, September 7, 2018, www.anatomyof.ai. ↩
-
Susan Leigh Star and Anselm Strauss, “Layers of Silence, Arenas of Voice: The Ecology of Visible and Invisible Work,” Computer Supported Cooperative Work (CSCW) 8, no. 1 (March 1, 1999), 9–30, doi.org/10.1023/A:1008651105359. ↩
-
Crawford and Joler, “Anatomy of an AI System,” 16. ↩
-
Martín Arboleda, “From Spaces to Circuits of Extraction: Value in Process and the Mine/City Nexus,” Capitalism Nature Socialism (August 16, 2019), 1–20, doi.org/10.1080/10455752.2019.1656758. ↩
-
Orit Halpern, “Learning from the Atacama,” 2020 (unpublished manuscript). ↩
-
Sarah T. Roberts, Behind the Screen: Content Moderation in the Shadows of Social Media (New Haven, CT: Yale University Press, 2019), 183. ↩
-
Alix Johnson, “Data Centers as Infrastructural In-Betweens: Expanding Connections and Enduring Marginalities in Iceland,” American Ethnologist 46, no. 1 (February 2019), 75–88, doi.org/10.1111/amet.12735. ↩
-
Tanwen Dawn-Hiscox, “Who Wants to Be a Data Center Engineer?,” Data Center Dynamics (blog), December 3, 2018, www.datacenterdynamics.com/analysis/who-wants-be-datacenter-engineer/. ↩
-
George Henry, “The best part is the autonomy,” April 12, 2014, post on Quora, “What’s it like to work in a data center?,” www.quora.com/Whats-it-like-to-work-in-a-data-center. ↩
-
Tim Burke, “Why I Miss My Data Center and Why I’m Never Going Back,” BetterCloud (blog), April 27, 2016, www.bettercloud.com/monitor/data-center-nostalgia/. ↩
-
Maya Indira Ganesh and Johannes Bruder, “Cloud Cosmogram,” Data Farms (blog), 2019, www.datafarms.org/2019/12/16/cloud-cosmogram/. ↩
-
Burke, “Why I Miss My Data Center and Why I’m Never Going Back.” ↩
-
Lilly Irani, “The Cultural Work of Microwork,” New Media & Society 17, no. 5 (May 2015), 16, doi.org/10.1177/1461444813511926. ↩
-
David M. Berry, “Against Infrasomatization: Towards a Critical Theory of Algorithms,” in Data Politics: Worlds, Subjects, Rights, ed. Didier Bigo, Engin Isin, and Evelyn Ruppert, Routledge Studies in International Political Sociology (London: Routledge, 2019), 48, doi.org/10.4324/9781315167305-3. ↩
-
John Tresch, “Cosmogram,” in Cosmograms, ed. Melik Ohanian and Jean-Christoph Royoux, Lukas & Sternberg Series (New York: Sternberg Press, 2005), 58. ↩
-
Lisa Nakamura, “Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture,” American Quarterly 66, no. 4 (2014), 921. ↩
-
Tiziana Terranova, Network Culture: Politics for the Information Age (London: Pluto Press, 2004); Gabriella Lukács, Invisibility by Design: Women and Labor in Japan’s Digital Economy (Durham, NC: Duke University Press, 2020). ↩
-
Mariarosa Dalla Costa and Selma James, eds., The Power of Women and the Subversion of the Community, 3rd ed. (Bristol: Falling Wall Press, 1975), 11. ↩
-
Crawford and Joler, “Anatomy of an AI System.” ↩
-
Paul Dourish, “User Experience as Legitimacy Trap,” Interactions 26, no. 6 (October 30, 2019), 46–49, doi.org/10.1145/3358908. ↩
-
Unknown, “The Evil List,” Slate, January 15, 2020, slate.com/technology/2020/01/evil-list-tech-companies-dangerous-amazon-facebook-google-palantir.html. ↩
-
Dourish, “User Experience as Legitimacy Trap,” 49. ↩
-
Lilly Irani, “The Hidden Faces of Automation,” XRDS: Crossroads, The ACM Magazine for Students 23, no. 2 (December 15, 2016), 34–37, doi.org/10.1145/3014390. ↩
-
Sam Adler-Bell, “Surviving Amazon,” Logic Magazine, August 3, 2019, logicmag.io/ bodies/surviving-amazon/; Heike Geissler, Seasonal Associate (South Pasadena, CA: Semiotext(e), 2018). ↩
-
Kylie Jarrett, “Laundering Women’s History: A Feminist Critique of the Social Factory,” First Monday 23, nos. 3–5 (2018), ojphi.org/ojs/index.php/fm/article/view/8280/6647. ↩
-
Cory-Alice André-Johnson, “What Does Anthropology Sound Like: Activism,” January 20, 2020, in AnthroPod: Fieldsights, podcast, culanth.org/fieldsights/what-does-anthropology-sound-like-activism. ↩