Becoming Cyborgs
“I am not worried about robots becoming more like humans. I am worried about humans becoming more like robots.”
– Marilyn Strathern
While I was a graduate student at UMass, I volunteered as a tutor in various jails across western Massachusetts. I was not allowed to take my phone inside the jail. But throughout the two-hour tutoring sessions, I would find myself reaching, again and again, absentmindedly into my back pocket. Each time, I was surprised to find it empty. I never got used to being without my phone. It always felt like I was missing an essential limb.
Since then, I’ve been thinking a lot about how the smartphone is less a technological artifact and more a bodily appendage, a mechanical attachment we’ve surgically secured onto our bodies. Smartphones physically accompany us in our daily lives. They are our extended selves and have become enduring corporeal fixtures. And when we have to navigate life without them, we immediately feel lost and confused.
Just as the iPhone has become an abbreviated third arm, a phantom limb, so too is AI becoming something akin to an external brain. My friends who have embraced artificial intelligence tell me as much: “It thinks for me.” “It is kind of like my second brain.” Because I have always loved books – and because I have always understood reading as an inherently relational activity linking me with humans both dead and alive – I don’t see much revolutionary potential in technologies that pretend to replace our (uniquely human) proclivity to think, to write, and to create. The threat of AI is not (only) the “art” it supposedly “creates” and the soulless words it strings together, but rather the way that it draws us further into the hell of digital technology where nothing is real and everything is impersonal and devoid of humanity.
As an undergraduate and graduate student in anthropology, I, like many others, was assigned Donna Haraway’s “A Cyborg Manifesto” in several classes. In the text, Haraway deploys the concept of the “cyborg” – a human whose body incorporates mechanical elements – as a metaphor to explore the “blurred boundaries” between humans, machines, and nature. I didn’t understand much of it as an undergraduate, mostly because of Haraway’s dense, jargony, and self-indulgent writing style. Haraway’s ultimate argument is that the “cyborg” presents radical possibilities for rethinking traditional categories such as gender, race, and sexuality, and offers a vision of a “more inclusive and equitable future.” Although I don’t think I agree with Haraway, I find the concept of the “cyborg” useful for thinking about humans today.
“In a recent blog post with that title, [Sam] Altman wrote that ‘ChatGPT is already more powerful than any human who has ever lived. Hundreds of millions of people rely on it every day and for increasingly important tasks.’ In his telling, the human is merging with the machine, and his company’s artificial-intelligence tools are improving on the old, soggy system of using our organic brains.”
— Kyle Chayka, “A.I. Is Homogenizing Our Thoughts”
Inspired by Haraway, “cyborg anthropology” is a sub-discipline of anthropology that explores the production of humanness through machines. But I am more interested in, and more concerned by, the inverse: the way in which our enchantment with rapidly advancing technology has led us to meld into these technologies ourselves, becoming more like them in how we relate to each other, live our lives, and navigate our neighborhoods and worlds.
TECHNIQUE.
Jacques Ellul, a French philosopher, sociologist, and Christian anarchist, wrote in his book The Technological Society about the concept of “technique,” which he defined as “the totality of methods rationally arrived at and having absolute efficiency in every field of activity” (pg. xxvi). In other words, “technique” refers to any set of “standardized means for attaining a predetermined result” (pg. vi). Technique converts “spontaneous and unreflective behavior” into behavior that is “deliberate and rationalized” (Ibid.). As “technical humans,” we have become so obsessed with the allure of efficient results that we forego human connection, creativity, spontaneity, and critical thought. In this very concrete way, technique dehumanizes us. Our relentless pursuit of efficiency transforms our behavior from the deeply reflective and creative to the merely technical: reflexes committed to the “neverending search for the ‘one best way’ to achieve any designated objective” (pg. vi). The Technical Human thus “cannot help admiring the spectacular effectiveness of nuclear weapons of war.”
“When technique enters into every arena of life, including the human, it ceases to be external to man and becomes his very substance. It is no longer face to face with man but is integrated with him, and it progressively absorbs him.”
– Jacques Ellul, The Technological Society, pg. 6
Relatedly, anthropologist Natasha Dow Schüll, in her study of gamblers’ attachment to their slot machines, coined the term “machine zone” to describe a state of mind in which people “don’t know where they begin and the machine ends” (Turkle, pg. 72). One of the gamblers Schüll interviewed said, “I’m almost hypnotized into being that machine.” Perhaps we are constantly in a machine zone – for do we really know where we end and where our smartphones begin?
Indeed, describing smartphones as bodily or mental appendages assumes that they are additions to our autonomous and creative human bodies, when in actuality – if you really think about it – perhaps it is the other way around. Perhaps we are the ones becoming appendages and afterthoughts to the digital technologies that populate and animate our lives. Rather than human beings – emotive, empathetic, relational, creative – we are becoming robots. And this is evident when we take a closer look at how we manage our relationships to our bodies, our selves, and our friends.
TRACKING.
“Then he told us some of the things his educator [watch] could do other than just educate. Heart rate, bloods, steps, nutritional breakdown of what you’re eating, internet everything, camera, phone. Transform voice to text. Instantaneous translation but only forty languages (next model up more expensive does one hundred and thirty). Stream anything streamable. Tell the time.”
– Ali Smith, Gliff, pg. 84
On my 20th birthday, I received a Fitbit watch from my boyfriend. When I wrapped the watch around my wrist, it tracked the number of steps I walked, my heart rate, and the approximate number of calories I burned throughout the day. I thought it was fantastic. Clearly this magical device was going to help me become the healthiest version of myself. A couple of years later, my cousin recommended an app that would track the hours of REM sleep I achieved per night if I left my phone lying on the bed next to my pillow. I became obsessed with tracking myself in these ways. My days became quantified and quantifiable, summarized by a series of numbers. Before long, I realized that everything else about me could be similarly measured. Spotify informs us annually of the number of minutes we spent listening to music, summarizing in attractive graphics the data it has collected from us without our noticing. It has become fashionable to publicly set “reading goals” – the number of books one reads per year – on websites like Goodreads. The applications I download to track my period are never solely focused on my menstrual cycle but constantly inundate me with requests for more data: minutes of exercise per day, number of bowel movements, “inflammatory” habits in which I may have partaken (smoking, drinking, etc.), and much more. I am tempted to fill in this data because it draws a rather pretty picture in my head – the healthiest, most glowing version of myself, a super-health-goddess – but it feels a bit too tedious.
Natasha Dow Schüll has written extensively on the “datafication” of human health and self-care. In her work, she builds on the legacy of Foucault, who famously coined the term “biopower” to describe a “positive” technology of power that produces techniques of disciplining or regulating the body to maintain a healthy life (Foucault, pg. 138). Disciplining the body, here, refers to “optimizing” the usefulness and capabilities of the body. In The History of Sexuality, Foucault delineates how techniques of “self-discipline” – minute and habituated forms of health management and optimization of the body – seep into people’s everyday lives. As a result, how we live becomes understood, regulated, and controlled on a statistical level; private life thus becomes a matter of public interest. Schüll develops this seminal argument, contending that self-tracking, which takes place on an “informatic-behavioral register” rather than a molecular or genetic one, constitutes a datafication of biopower: the collection of small units of data about the minute ways people conduct their lives (pg. 11).
The number of steps and sips taken in a day, the speed at which bites are taken during a meal, degrees of slouching, rate of respiration, quality of sleep, physical and cardiovascular stress, calories burned, blood oxygen level – these are all bits of data that the “Whoop” watch, a device I recently discovered on the wrist of a friend at a social gathering, tracks. The Whoop wearer can also supplement this plethora of information through a “Behavior Journal” where she can log her behaviors and “link them to physiological trends and insights.”
Evidently, we live in an era of datafication and self-tracking that turns us into persons without qualities, in which we treat each other and ourselves as “uniform, averaged, smoothed out” (Schüll, pg. 910). Through such digital technologies, the self is “sliced and diced” into decontextualized parts, divided into “ever more granular bits,” sorted into and tracked through data sets with the aim of “algorithmically steering its behavior” (Schüll, pg. 910). Datafication, and the digital technology that facilitates it, decomposes the person, eroding human agency and self-image. This is the ultimate proof of Ellul’s claim that technique, when it enters into every arena of life, “ceases to be external to man and becomes his very substance” (pg. 6).
SURVEILLANCE.
Having an opinion means posting it on Instagram. This is not something I learned over time but something I practiced, enthusiastically, from the moment I first downloaded Instagram as a teenager. The rush of serotonin I felt when I authored a political take or vented my anger at the “isms” of the world through an Instagram ‘post’ or ‘story’ was unparalleled. Now this is commonplace: Instagram has become an – or the – arena of politics, where everyone is posting about the breaking-news-update or the news-of-the-day, often accompanied by their own hastily written analysis. Until I deleted the application a few months ago, I never stopped posting, rather haphazardly, on Instagram. I seldom knew why exactly I was posting what I did, but I knew that for many of the people I encountered on social media, “to be silent meant complicity with the enemy” (whoever the enemy of the moment was), and “being silent” meant not posting anything. I knew this because I myself abided by the same cultish doctrine until rather recently.
I once got into a heated political argument, over Instagram, with an old friend. The argument spiraled on and on, until it reached absurd levels, culminating in the friend accusing me of not posting anything about a political issue she cared deeply about, which, to her, signified my complicity with “the enemy.” I was flabbergasted that she had seemingly kept a record of what I was or was not posting. Yet I, too, used to conduct the same exercise: I also kept track of who was posting about what and accordingly rated them on a scale of moral righteousness. Social media makes it ridiculously easy for us to surveil each other in these sinister ways, building a repository of what everyone posts and (seems to) care about, which, again, turns us into robots. This depersonalizes and dehumanizes our relationships, making us more willing to disregard the human being, in all her complexity, behind the digital profile.
I wonder if we can extend Foucault’s concept of “panopticism” to our usage of social media and how it encourages us to police and discipline one another in these ways. In Discipline and Punish, Foucault provides a detailed overview of Jeremy Bentham’s “panopticon,” the architectural figure of the modern prison whereby prisoners are observed by a single security guard, without knowing whether or not they are being watched. Foucault offers this background to put forward his conception of “panopticism,” or the application of the logic of the panopticon to all sorts of societal institutions and practices. He writes: “Whenever one is dealing with a multiplicity of individuals on whom a task or a particular form of behavior must be imposed, the panoptic schema may be used. It is – necessary modifications apart – applicable ‘to all establishments whatsoever, in which a number of persons are meant to be kept under inspection’” (Foucault, pg. 205-6). Panopticism, thus, is “the general principle of a new ‘political anatomy’ whose object and end are not the relations of sovereignty but the relations of discipline” (Foucault, pg. 208). In short, the panopticon is “an architectural plan” (Foucault, pg. 200) while panopticism is a set of general ideas about the control of populations.
OBJECTIFICATION.
I’ve written elsewhere on the way in which digital technologies – and particularly social media and dating applications – make it so that we see and experience one another as objects. I won’t belabor the subject here.
The objectifying power of digital tech and its corollaries is clearly visible in a TikTok my sister sent me. A mother and daughter sit side by side as the mother scrolls through the daughter’s dating application. Screen captures of the profiles the mother surveys are shown to the viewer: names, photos, ages, occupations, and all the other accompanying information. Together the mother and daughter assess, then dismiss, each profile for one reason or another. The TikTok had thousands of likes and hundreds of comments. But I was horrified. What makes people so comfortable publicly denigrating random strangers whom they have never met and will probably never know? Why are people so comfortable publicizing information about strangers that should theoretically be – and until very recently has been – kept private?
A common refrain from defenders of digital technology is that these practices have always existed – in the form of matchmaking, say, or the blind date – and that digital technology simply develops and augments them. But there is a clear difference.
The difference is that when you meet someone in person, you can look into their eyes, sense their presence beside you, feel their body temperature rising, watch the breath going in and out of their body: you see them as a human being, and so you are more likely to treat them as such – you humanize them because they present themselves as a human being, flawed yet real flesh and bones, in front of you.
When you are swiping through dating profiles on the Internet, in a manner that in essence commoditizes the potential dating partners you are surveying, you are more likely to objectify and dehumanize them because there is nothing human about a person who has condensed their humanity into a series of numbers, categories, and likes/dislikes.
SCROLLING.
One would think that we would find it jarring to scroll through Instagram reels or TikTok and watch a heartwarming video about an abandoned animal being rescued, or a “day in the life” vlog, followed by scenes from a warzone, or a talking head providing political analysis on the latest violent developments in a country thousands of miles away. But we don’t. In fact, what I’ve just described is commonplace and totally normal in our modern world; something I myself have experienced in the hundreds of hours of scrolling I’ve accumulated in my life.
This is not a completely new phenomenon. In fact, it probably dates back to the invention of the television, or maybe even the telegraph. Neil Postman writes about the “Now… This” phenomenon in his 1985 book Amusing Ourselves to Death: the reporting of a horrific event on television – such as a rape or a five-alarm fire or global warming – “is followed immediately by the anchor’s cheerfully exclaiming ‘Now ... this,’ which segues into a story about Janet Jackson’s exposed nipple or a commercial for lite beer” (pg. 6). This creates a “sequencing of information so random, so disparate in scale and value, as to be incoherent, even psychotic” (pg. 6). “Now… This” is commonly used on radio and television shows to signal that what one has just heard or seen has no relevance to what one is about to hear or see – a means of acknowledging that “the world as mapped by the speeded-up electronic media has no order or meaning and is not to be taken seriously” (Postman, pg. 87). “There is no murder so brutal, no earthquake so devastating, no political blunder so costly—for that matter, no ball score so tantalizing or weather report so threatening—that it cannot be erased from our minds by a newscaster saying, ‘Now ... this.’”
We don’t even hear the words “Now… This” anymore; there is not enough time as our thumbs flick the screen upwards, desperate for endless and mind-numbing stimulation. Our brains have been programmed to accept the sequencing of events and information as the algorithms of our digital technologies have ordered them. We have become thoroughly accustomed to such discontinuities and are “no longer struck dumb, as any sane person would be, by a newscaster who having just reported that a nuclear war is inevitable goes on to say that he will be right back after this word from Burger King” (Postman, pg. 91). I wonder about the extent to which this juxtaposition desensitizes us to scenes of violence, impoverishment, brutality, and injustice – scenes that constantly populate our screens right alongside advertisements, skincare tips, morning routines, and celebrity gossip. This, too, turns us into automatons; it makes us machine-like in our inability to process and sit with the violence and staggering loss of life beamed into the screens in our back pockets.
Works Cited
Foucault, Michel. 1978. The History of Sexuality, Volume 1: An Introduction. Translated by Robert Hurley. New York: Pantheon Books.
Foucault, Michel. 1977. Discipline and Punish: The Birth of the Prison. Translated by Alan Sheridan. New York: Pantheon Books.
[Others are cited in the reading list below…]
I am certainly not being very original when I write about technology and society. On the contrary: I’ve been deeply inspired by poets, writers, artists, and filmmakers in thinking about the dystopian effects of technologies on our sense of self, our relationship to our bodies, and society at large.
A Reading List for Luddites, or humans who want to remain humans
Books:
Ellul, Jacques. 1964. The Technological Society. Translated by John Wilkinson. New York: Alfred A. Knopf.
Illouz, Eva. 2007. Cold Intimacies: The Making of Emotional Capitalism. Cambridge: Polity Press.
Postman, Neil. 1985. Amusing Ourselves to Death: Public Discourse in the Age of Show Business. New York: Viking.
Schüll, Natasha Dow. 2012. Addiction by Design: Machine Gambling in Las Vegas. Princeton, NJ: Princeton University Press.
Turkle, Sherry. 2015. Reclaiming Conversation: The Power of Talk in a Digital Age. New York: Penguin Press.
Articles:
Chayka, Kyle. 2025. “A.I. Is Homogenizing Our Thoughts.” The New Yorker, June 25, 2025. https://www.newyorker.com/culture/infinite-scroll/ai-is-homogenizing-our-thoughts.
Schüll, Natasha Dow. 2016. “Data for Life: Wearable Technology and the Design of Self-Care.” BioSocieties 11 (3): 317–33. https://doi.org/10.1057/biosoc.2015.47.
Schüll, Natasha Dow. 2016. “The Data-Based Self: Self-Quantification and the Data-Driven (Good) Life.” In The Quantified Self: A Sociology of Self-Tracking, edited by Btihaj Ajana, 17–40. London: Polity Press.
Vadukul, Alex. 2023. “This Teen Gave Up Her Smartphone. It Changed Her Life.” The New York Times, February 2, 2023. https://www.nytimes.com/2023/02/02/opinion/teen-luddite-smartphones.html.
Novels:
The Years (2017), by Annie Ernaux.
Bonding (2024), by Mariel Franklin.
Brave New World (1932), by Aldous Huxley.
That Hideous Strength (1945), by C. S. Lewis.
Gliff (2024), by Ali Smith.
Player Piano (1952), by Kurt Vonnegut.
Other:
Lamm, August. 2025. You Don’t Need a Smartphone: A Practical Guide to Downgrading and Reclaiming Your Life. Self-published pamphlet.
Berry, Wendell. 1990. “Why I Won’t Buy a Computer.” In What Are People For?: Essays. San Francisco: North Point Press. First published 1987.
One of my favorite movie scenes:
“I grew up in an age without Internet and mobile phones. The world that I knew has disappeared. For me it was all about going to stores. Record stores. I’d take the tram to Voices record store in Grünerløkka. Leaf through used comics at Pretty Price. I can close my eyes and see the aisles at Video Nova in Majorstua. I grew up in a time when culture was passed along through objects. They were interesting because we could live among them. We could pick them up. Hold them in our hands. Compare them.”
— The Worst Person In The World (2021)
A quote by one of my favorite writers:
“It is one of the evils of rapid diffusion of news that the sorrows of all the world come to us every morning. I think each village was meant to feel pity for its own sick and poor whom it can help and I doubt if it is the duty of any private person to fix his mind on ills which he cannot help. (This may even become an escape from the works of charity we really can do to those we know.) A great many people do now seem to think that the mere state of being worried is in itself meritorious. I don’t think it is. We must, if it so happens, give our lives for others: but even while we’re doing it, I think we’re meant to enjoy Our Lord and, in Him, our friends, our food, our sleep, our jokes, and the birds’ song, and the frosty sunrise. About the distant, so about the future. It is very dark: but there’s usually light enough for the next step or so.”
– C. S. Lewis, in “Letter to Bede Griffiths” dated 20 December 1946.