Christoffer - liminal
Embodying Liminal Data Lives: Encoding the Aesthetics of Trans Bodies as Algorithmic Distance
Abstract
The claim that algorithms are infinitely improving our lives is presented as an axiomatic truth, but for trans people, algorithms are violent and, at worst, deadly. Behind the veil of neoliberal techno-optimism, algorithms perpetuate colonial and cisnormative legacies that anchor a binary idea of human life, wherein the only possible ‘human’ becomes the white, cisgender human, violating trans lives that do not fit the binary codes embedded into and making up algorithmic systems. Instead of complying with neoliberal beliefs in algorithms or stopping at critique, this article theorises the aesthetics of trans lives as embodied liminal data lives: a strategy of sensing distance to algorithms grounded in the tactical uncodeability of transness in opposition to the binary confinements of algorithmic technologies. Taking this stance, this article asks: How can we create spaces of distance to algorithms in a world inherently entangled with them, and how can the liminality of trans data lives allow us to consider (im)possible ways of living and distance as resistance to the multifaceted reality of algorithmic violence in which we exist?
Introduction: Algorithmic Ordering of Trans Lives
The claim that algorithms are objectively beneficial to our lives stands as an axiomatic truth presented by Big tech companies and global governments. Popularly, algorithms are sold as tools to fix, tweak, improve, and exponentially advance our lives, but to trans people, this promise is neither a given nor a truth; for trans people, algorithms, and the spaces they enable, are violent, and at worst, deadly. Behind the neoliberal veil of techno-optimism fuelled by international politics, nation states and Big tech companies, algorithms learn from, reinvigorate and perpetuate colonial and cisnormative legacies of violence that anchor a binary default (Amaro; Hoffmann). Within this default in algorithmic systems, the only possible ‘human’ becomes the white, cisgender human - forcing transness out of existence for not fitting the encoded template and binary codes making up the valorisation of human life.
Contrary to the belief that ‘AI’ technologies are inherently novel, progressive and revolutionary, recent scholarship on trans experiences of algorithms has critically taken up how they reinscribe binary colonial markers of gender essentialism. For example, algorithms enable facial recognition software to reject trans faces (Keyes; Scheuerman et al.), encode trans bodies as dangerous deviant threats in airport security scanners (Costanza-Chock; Wilcox), deny trans people access to crucial state welfare services, delete trans health data and create messy bureaucratic problems (Amnesty International UK; Hicks; Waldman), intimately surveil trans identities (Keyes & Austin; Shelton et al.), erase trans existence through binary digital identification systems (Andersen "Wrapped Up in the Cis-Tem"; Dixon-Román; Shah), and enact transphobic feedback loops on social media (Rauchberg). Together, these algorithmic technologies share a reiteration of the colonial classification of humans along binary lines of life: they essentialise physio-phrenological traits of the body as corresponding to the gender binary as the singular comprehensible unit of algorithmic recognition, reinforcing the systemic marginalisation of trans people and locating trans bodies as territories for surveillance and violence.
Meanwhile, scholarship also attends to how both queer and trans bodies enact “small but playful forms of disruption such as the error or glitch” (Gaboury), glitch out algorithmic technologies, resist cisnormative systems of surveillance and establish epistemologies of ‘glitching’ (Elwood & Leszczynski; Russell; Shabbar), productively ‘fail’ in algorithmic technologies to unsettle the categories of ‘naming’ (Bridges) and embrace the inherent subversive potential of embodying failure (Campanioni). This article extends both the current scholarship on 1) unveiling and criticising the embodied and sociopolitical impacts of algorithmic violence for trans lives, and 2) analysing the productivity of ‘glitchy’ encounters of troubling, messing with or failing through the algorithmic codes. Rather than merely focusing on the glitches, errors or failures of trans bodies in their encounters with algorithms, this article seeks to conceptualise this relationship along a different axis. It is interested in theorising the aesthetic promise and potential of trans lives and their data reality of liminality—the state of existing in-between spaces simultaneously as equally invisible/visible, visual/invisual, codeable/uncodeable and liveable/unliveable—to investigate what the digital fleshiness of trans lives entails as an embodied practice of distance to algorithmic technologies. This foregrounds trans techniques of refusal, misuse, and disruption that work with, through and against contemporary algorithmic technologies, and as such, establishes a trans critique of algorithms.
Instead of conceptualising algorithms as neutrally coded artefacts able to present objective truths about the world, this article theorises algorithms as sociopolitically contingent artefacts that fundamentally “engage in fabulation – they invent a people, write them into being as a curious body of correlated attributes, grouped into clusters derived from data that are themselves fabulatory devices” (Amoore Cloud Ethics 58), constructing fabricated, hierarchised imaginaries of the world and, as a result, of the subjects entangled with them. Within this algorithmic age, these “flows of personal data—abstracted information—are sifted and channelled in the process of risk assessment, to privilege some and disadvantage others, to accept some as legitimately present and to reject others” (Lyon 674). Highlighting the algorithmically curated imaginaries of both people and the world reveals the contingent nature of trans people locked into their data shadows, where algorithms invent specific fabulas about trans lives. Algorithms invent and present trans lives as inherently unreadable, or even impossible, within the systems, claiming that they are incompatible with the idea of the human because they cannot be correctly rendered by the very systems that exclude trans lives from being understood along the lines of humanness in the first place. The act of constructing trans lives as unrecognisable entities in algorithmic systems is derived from data limited to telling a certain story – data that are thus fabulatory devices dis/allowing specific truths about the world to appear. Essentially, this invention of the impossibility of recognising trans lives, leveraged by algorithms, likewise becomes a fabulatory device itself – a device that serves to legitimise a story about trans people as ‘impossible’ and to dismiss trans lives, presenting this dismissal as a naturally given and inevitable reality that cannot be otherwise despite the histories and lived reality of trans people.
In addition to living in one’s own data shadow, not only have some bodies been historically ostracised through data and from their own data, but this data is ultimately part of larger sociopolitical relations and interconnected technological networks, where it is profiled, circulated and translated across several databases – administrative systems, digital databases, bureaucratic documents, biometric technologies and computational predictions that break the body down into coded digits. These sociopolitical relations inhabit the differentiation and hierarchisation that have disciplined the humanity of bodies (Weheliye), where this making and marking of difference enacts the multiplicity and specific production, circulation, cementation and thus possibilities of human data (and datafied humans) across various algorithmic assemblages. By weaving together algorithms and trans lives, it becomes possible to consider how trans existence contributes to novel spatiotemporal forms of thinking about the algorithmic augmentation of social order and human differentiation in our digital societies: not only is ‘trans’ a technology for mapping deviance in reference to binary life, but it is further a tool for ordering, classifying, and controlling the embodiment of the human solidified through the colonial imposition of gender binarity. In this sense, the question is: can this function of difference that ‘trans’ encapsulates be made productive as a rupture that creates distance to the embodied forms of algorithmic violence?
At a crucial moment in time, when algorithms are exponentially embedded into every facet of our everyday lives, and when they both prime global political imaginaries of human value and reinfuse taxonomical colonial hierarchies of power, the disproportionate implications for trans lives must be investigated, but so must strategic techniques for productively curating distance to the algorithmic technologies themselves. This article is situated between two intersecting branches of scholarship – that on algorithmic violence and that on trans experiences of and resistance against it – with the aim of contributing a spatiotemporal digital orientation of trans bodies as liminal data lives in order to unveil the forces of algorithmic violence, as well as to provide a theory of the productive encounters that occur when the uncodeability of transness is inserted into the algorithmic equations.
Living with algorithms while trans presents an inescapable reality of precarious unliveability, one only predicted to intensify the impossibility of trans lives. The question becomes: how do we carve out liminal spaces in proximity to, but away from, the algorithmic gaze of death? How can we create productive spaces of distance to algorithmic violence in a world inherently entangled with algorithms? I suggest an alternative coded rupture from transness itself: conceptualising the aesthetics of living as trans and trans lives as liminal data lives—lives that inherently inhabit a digital space in-between the two states of being targeted and dismissed—which makes possible a productive strategy of sensing distance to algorithms by keeping with the complex uncodeability of transness in opposition to the binary limits of algorithmic technologies. In doing so, how might this shift from mere ‘error’ and ‘failure’ towards uncodeability allow us to consider (im)possible ways of living and distance as resistance against algorithmic technologies, towards encoding trans liveability?
Coded Flesh, Coded Death: Algorithmic Violence and Binary Valorisation of Life
Algorithms are maps of technical instructions that order and classify objects and humans into fixed categories, embodied by the humans who code them and through the humans implicated by them. Algorithms are immaterial infrastructures of predictions, yet “need to be embodied in some combination of human and/or machine […] in relation to the systems of interpretation and to the bodies that do the interpreting and reacting to the information they provide” (Wilcox 16-17). Crucially, in relation to bodies, transness—with its infiniteness, messiness and mutability—works against the operational principle of algorithms and their binary definiteness, fixedness, and immutability, which renders trans people either hypervisible as a deviance or invisible and erased. This imposes a violent gendering of the human in accordance with colonial cisnormative rules of classification—a decision over life and a distinction of who should live and who must die, made by “performatively enacting themselves/ourselves as being human, in the genre specific terms of each such codes’ positive/negative system of meanings” (Wynter 30). Under the contemporary code of the algorithmic reality, the white cisgender human represents a positive symbolic meaning of living, while transness characterises a negative impossibility of life. Algorithms essentially represent a computational figuration of the politics of classification: the act of classifying and sorting bodies as objects into neatly defined categories, which inevitably entails the overwriting and exclusion of those who cannot be fitted into strict categories.
Trans people exist as neither-nor in a liminal space within the computational order of life. On one side, they exist as codeable by being hypervisible in deviating from binary code, which positions trans people as targets for violence through their failure to conform to the necropolitical norms and logics underlying the order of life and death in the algorithmic. On the other side, they exist as uncodeable in their authentic and fleshy entirety, as algorithms cannot comprehend transness but instead neglect it and compute it as not existing in the first place—a non-life left to die outside the territory of life. In both instances of (in)visibility, transness is fundamentally uncodeable. In this sense, the algorithmic entails “identifying norm and multiple deviations from the norm [by deploying] an ‘architecture of enmity’, a drawing of the lines between self/other; us/them; safe/risky; inside/outside” (Amoore "Algorithmic War" 51). These affective senses of ‘improper life’ stick to transness in its aberrations from normative binary structures; hence the trans body is subjected to coded operations of elimination that mark the flesh and strip the trans body of its human possibility as a coded death. If algorithms resemble a war-like architecture of enmity, then trans represents the compulsory fleshy reference for enabling the algorithmic distinction of value. In the current algorithmic reality, if “war at a distance” produced the subject position of a viewer, “war as big data produces the subject position of a user, that is, a subject that actively participate[s] in securing the system as a whole” (Hu 113). Trans thus functions as a digital flesh to securitise the structures of algorithmic technologies and legitimise their war on certain ‘othered’ bodies as a whole, through interlinked assemblages of information, data and digits that do not correspond to trans existence but rather render and interpret trans lives as a computational incomprehensibility.
Here, I strategically utilise the term digital flesh to “reflect the structure of digital phenomena as a continuum of reality, instead of an empty space lacking reality” (Yoon 585) to emphasise how bodies are inscribed into the algorithmic systems that co-construct their embodiment. In securing the future, if “algorithmic techniques are concerned with anticipating [and curating] an uncertain future, then the logic of algorithmic war is one of identifying norm and multiple deviations from the norm” (Amoore "Algorithmic War" 55). The logic of war needs deviations to be identified in advance, and this is precisely what happens in the encoding of the trans body as an existing difference and deviance from the norm – the assumedly stable and secure cisnormative body template of life.
Importantly, what I draw attention to here is an overarching differentiating order of embodiment predicated on the instrumentalised sequences of algorithmic necropolitical functions designed to configure trans subjects as ontologically killable flesh, as already uncodeable to the system, and as imminently deceptive, where “identity and subjectivity are stripped away from bodies; persons are objectified as their fleshy, material bodies” (Wilcox 104). Transness, I argue, represents an epiphenomenon of algorithmic processes of classification, sorting and ordering through abstracted code, references and proximity that turn trans bodies into data formations deviating from the norms installed within the systems. This process presupposes the rendering of trans bodies as ‘threats’, which legitimises their co-existing attribute of being coded for exposure rather than coded as human. Through the operations of algorithmic technologies, which revolve around the “logistics in massive technical systems that work through the ability to abstract and optimize” (Parikka 31), algorithms appropriate the binary order of code as the framework of readable life, hence abstracting trans lives as malfunctioning data formations incompatible with the system.
Expressed through code as technical objects, trans bodies are rendered unintelligible in the symbolic order of the binary code, and thus alienated from themselves and their flesh by not being readable as trans, as life, as human. As Pugliese puts it, “Not to produce a template is equivalent to having no legal ontology, to being a non-being; you are equivalent to subjects who cannot be represented and whose presence can only be inferred by their very failure to be represented” (14). Instead, trans bodies come to represent coded signs of falling through the cracks, as something apart from what constitutes the human and what the human is supposed to be. In this framing, transness is rather—through its computational uncodeability in its own right—read as an absence of the human that must be eliminated due to this lack of humanness. To the human witnessing algorithmic violence, this “radical absence [of humanness] is crucial to witnessing what is not there, or fails to materialise, or is destroyed, or has died” (Richardson 153). In the algorithmic age in which we are situated, one must be algorithmically readable in order to exist and live. Far from distancing the human partiality from that of the code, algorithmic technologies insist on executing a predetermined configuration of the human based on colonial legacies of binary gender, which “embed the discursive, affective, and fantastic logics of war in all their racializing and gendering dimensions into the algorithm at every stage of its design, training, and operation” (Richardson 103). They are, in this way, inseparable from the violent production of gender that formalises and exercises the impossibility of certain lives as digital flesh coded for death, while employing an algorithmically augmented valorisation and upholding of the liveability of binary lives.
With these facets of coded violence in mind, by attending to the aesthetic-political potential of the liminality of trans lives rather than framing algorithmic technologies as simply failing to capture transness, how might we interpret this act of failure that trans flesh embodies—and the inherent partiality it reveals—as central to our unveiling and knowledge production of algorithms? I suggest that the coded trans flesh unveils a liminal data life whose liminality illuminates a unique property that the algorithmic system cannot expect, predetermine or fully calculate: a fluidity of life that runs between the codes.
It is exactly at this liminality between the physical and the digital that the trans body arrives as digital flesh: at once appropriated and used by algorithmic systems, which claim its unrecognisability to target and legitimise war on the trans body, as the logic of algorithmic warfare on deviant bodies relies on their presumed deviance to justify the war itself. At the same time, this liminality also enables the trans body to remove itself from its physical flesh and into the digital cracks as a liminal data life, to speculate and simmer as a possibility of something different outside the calculable range of algorithmic systems. This liminality encoded through data creates a rupture where the possibilities of identification and life exceed the binary limitations of embodiment in the system and the digitally mediated boundaries within which life can be lived.
Liminal Data Lives: Aestheticising the Digital Trans Flesh as Algorithmic Distance
No system can enforce a fixed, undisrupted narration and computation of truth without cracks. Algorithmic technologies—despite their glaring appearance as territories of unambiguous domination—are places of messiness, friction, interference and disruption. This reality is often concealed behind the myriad efforts needed to make an artificial system of binary logics appear fortified as the truth, and is thus not articulated as a feature or productive fragility at the core of the systems themselves. Through the disruptive potential of trans data lives, a rupture and opening into this fragility of binary code can be located and exposed through the inherent uncodeability of transness, which creates a liminal distance to algorithmic code and its upholding of binary life. The question is: how do we critically utilise this liminal data space that trans people embody to create distance and inscribe another possible sensing of algorithms?
As Fuller and Weizman argue, aesthetic investigations—in this article, through the lived experiences of transness—have a twofold aim, as they “are at the same time investigations of the world [algorithmic violence] and enquiries into the means of knowing it [trans lives]” (15). Utilising the aesthetics of trans lives as a means of sensing the world of algorithms and critically questioning the harmful colonial politics underlying its expansion involves “sensing – the capacity to register or to be affected, and sense-making – the capacity for such sensing to become knowledge” (Fuller and Weizman 33). This operationalisation of aesthetics enables us to attend to the affective facets of trans lived experiences with algorithms and translate these experiences into productive knowledge for refusal against algorithmic systems. In this aspect, trans existence is infinitely “wielded […] as an invaluable mapping tool, a means by which origins and boundaries are simultaneously traced and constructed and through which the visible traces of the body are tied to allegedly innate invisible characteristics” (Chun 10). By default, this both marks binarity as an ontological necessity and operationalises a spatiotemporal colonial reiteration of a hierarchised social order: ‘trans’, then, is not only a tool for ordering, classifying, and controlling, solidified through the imposition of the gender binary that is mirrored by algorithmic code; trans lives also inflict disruption by existing as a mapping technology for locating destructible deviance and resistance in algorithmic technologies.
Trans bodies embody and curate a liminal data space—simmering simultaneously between two different places and states of being in and with data: visibility as targets of violence and invisibility from going under the coded radar. Firstly, this takes form as codeability, from being rendered visibly ‘deviant’, and uncodeability, from the computational inability to comprehend trans existence in its holistic authenticity. The idea of codeability speaks to the fact that, despite the seeming algorithmic inability to read trans lives, data is still produced about the trans body – in this instance, as a deviance, where the data generated come in the format of registered deviance from the systems’ norms. Meanwhile, there is an inherent uncodeability of trans lives in algorithmic systems, where they are not rendered and understood on their own terms in a holistic sense due to the algorithmic inability to comprehensively represent them. While some data is always produced about trans people in their encounters with algorithmic technologies, they cannot be fully and holistically rendered and coded in their total legitimacy without misrecognition, flaws, exposure to significant risks, or being held to a cisgendered comparison.
Secondly, this tension relates to the liveability of trans people in their data. Liveability refers to the “holistic quality of life located at the trans body as situated in an algorithmic world, and in which ways algorithms complicate the degrees of (un)liveability under which trans lives are subjugated (…) [and] concerns how trans liveability is affected and through which different systematic, sociopolitical and structural hierarchies of power encoded into algorithmic detection and decision-making” (Andersen "Beyond Fairness" 3). Liveability exists as a mode of inhabiting data that is always rendered in its perpetual precariousness and surveillant assemblages inscribed with precoded hierarchies of power, where trans people are not represented as liveable on their own terms in the code despite living in their own right. In comparison to other lives, trans lives are especially targeted, which positions them in a state of unliveability due to that same data manifesting as coded procedures of exposure, exclusion and death.
In this way, trans lives—trapped within binary codes of life—inhabit a liminal yet powerful space of simmering and sensing the algorithmic world between the visible/invisible; codeable/uncodeable; liveable/unliveable as iterative modes of being that illustrate a significant and inescapable relationship between how trans bodies exist in the world and how algorithms interpret this existence as a constant coded negotiation between targeting and erasure.
This relation between algorithms and trans bodies as a co-produced liminal distance begins at the point of dismissing, rejecting or omitting transness from the categories necessary for the binary logics that undergird the operationality of algorithms. Existentially and algorithmically, this is essentially the coded trap that trans subjects find themselves in, or, put differently, the space they inhabit; and in this praxis of living, they ‘sense’ and ‘refuse’, trouble, delay, distort and glitch algorithmic infrastructures. As trans lives interact with algorithmic systems—whether in facial recognition software having trouble representing and verifying trans faces, body scanners at the border being stunted by the nonconformity of trans bodies, or state welfare systems glitching out on granting trans citizens access—the systems are inconvenienced by trans existence, as this form of existence does not correspond with the preprogrammed space that lives are expected to inhabit. Altogether, in their various technical operations and attempts at rendering a tangible subject, the algorithmic systems are troubled, delayed and stunned by the interference from trans embodiment that they cannot account for, which speaks to the aesthetic potential of the liminal distance enacted by trans lives.
Critically, this space “require[s] ways of knowing and being that refuse to be reduced to the limits of normative digital-social orders (…) [where] queer life originates in desiring and doing that which normative social orders situate as impossible” (Elwood 213). The conditions of ‘error’ or ‘erasure’ in contrast to cisnormative data lives encode a distance that encourages strategic fugitive tactics of refusal, through which algorithmic infrastructures can be resisted and reimagined despite seeming impossible under the current neocolonial techno-optimism; a space where algorithmic infrastructures are troubled, delayed, distorted, and glitched by how transness exists in/against the code. Transness embodies a particular kind of ‘in-betweenness’ that at once infiltrates the binary code, renders it futile as a universal truth and effectuates distance to the reductionist algorithmic readability of humanness towards redefining what it means to be(come) human. By not fitting into binary code, transness strategically falls through the coded cracks of life. Despite the rigid boundaries of binary code, the ambivalent liminality of trans data lives allows transness, as digital flesh, to become fluid and fugitive between the algorithmic codes. In this way, transness activates a fugitive resistance against algorithmic violence through an embodied investment in failure, occupying a spatiotemporal position at both sides of the threshold of code utilised by algorithmic technologies; cutting over, falling through, going against and obscuring binary flows of code. At this dual threshold, a certain kind of productive and disruptive relationship is generated—one that alters what we understand as distance to algorithms while inevitably remaining in proximity with them, and one that only trans bodies can catalyse.
This points to a crucial technical intersection between the lived experiences and capacities of trans bodies and the systemic conditions of algorithms: their interfaces, systems and infrastructures. As an embodied tactic of trans lives, this in-betweenness operates at the level of the trans body in its interference with the systemic conditions of algorithms. Through embodying difference, trans bodies fall through the coded lines that cannot capture their lives, obscure the efficiency of code by not fitting into the system, work directly against and expose the absurdity of binary reductionism, and cut over binary code by embodying more than what the binary can encapsulate. Trans lives introduce a disruptive plasticity to algorithmic systems through “their very gaps and indefiniteness (avoiding over-prescriptive recommendations), adaptability (being able to reset, forget or stay still), and overlaps (preferring repetitions to reduce risk and increase security)” (Chevillon 5), which embrace the multifaceted and unpredictable connections of trans lives and their data traces. The tactical breakages arising from this in-betweenness act as operations that contrast with what is otherwise considered legible lives in the infrastructures and outcomes of algorithms. Instead, this reveals how these operations conflict with the rigidity of algorithmic technologies, enabling a productive distance to the algorithms themselves through the ways in which trans people occupy a constant in-between space as lives never fully rejected or accepted by the systems.
By conflicting with binary code, what kind of algorithmic distance do trans lives produce, and what does this liminality generate for the relationship between bodies and algorithms? Regardless of how encounters between trans bodies and algorithms occur, they exemplify the aesthetic operations as tactics of difference that trans people employ: When facial recognition software fails on and dismisses trans faces as part of its authentication process, the unrecognisability attributed to trans faces disrupts facial detection programmed on binary metrics. When automated gender recognition algorithms operationalise the ‘essence’ of gender solely by essentialising it as binary, trans people utilise the visual aesthetics of difference to reject the auto-encoded singular logic of binarity. When body scanners at the border immanently locate risky deviance on trans bodies that do not fit the binary gendered template the scanners are engineered to execute, trans bodies appropriate the space between the generated visuality of the scanner and the sociopolitical gendered expectations inscribed into the system. When nation state administrative data systems lose trans data upon legal gender change because they rely on the fortification of computable binary gender to function, trans lives upset not only the digitalisation processes but also the rigid nation state conceptualisations of what categories of gender and citizenship mean.
Taken together, these encounters make visible a fractionated relationship always in proximity, where trans bodies can reach and sense algorithms but are only tentatively computed and never understood, and where the promise of ‘life’ is rendered at a distance without constituting a fully liveable life—a relationship that nonetheless works to decode and expose the inherent limitations and coded violence of algorithms. By design, collection, translation, operations and gaze, algorithms mould certain bodies not only for exposure but also as never possible as ‘human’ in the first place (Wilcox), as always-already incompatible with, deathable within and uncomputable to the systems, in order to propagate, disseminate, and commodify global political imaginaries of hierarchised human value and liveability.
In the case of facial recognition, where the algorithm persistently fails not only to recognise trans faces but, through this computational inability, also forwards an absence of humanness, a looming uncoded presence is created that can only be inferred by the very failure to be represented. As Trinh Minh-ha writes, “invisibility is built into each instance of visibility, and the very forms of invisibility generated within the visible are often what is at stake in a struggle” (Minh-ha 132), forcing an acknowledgement of the constitutive outside of the binary gaze and rendering a distance to algorithms. Similar to documentary practices and the recording gaze of ‘seeing’, this idea of ‘making visible’ accelerates exponentially with contemporary algorithmic technologies for “seeing faster, all at once, and always more” (Minh-ha 131). This is translatable to the all-encompassing gaze of algorithmic systems, where there has to be an exclusion for there to be an inclusion in the system, as they are inseparable conditions enabling each other.
Facial recognition technologies assume seamless and accurate detectability, while presuming and maintaining an immutable conception of binary gender (Danielsson et al.; Keyes; Thieme et al.). Globally and across Europe, facial recognition software has largely been seen as an effective tool by governments and agencies to ensure security, direct war, protect borders, and make identification easier (Guo; Opiah; Wagner; Wilson). This optimism persists despite several international organisations (see e.g. Buolamwini; Amnesty International; DIHR; Harding) consistently warning against the embedded injustices and underlying forces of harm that facial recognition technologies reinscribe when utilised for border surveillance, state welfare access, military warfare, crime detection, and immigration policies that fortify racial and gendered violence. For trans faces, facial recognition technologies cause problems across everyday life as they are implemented at access points between state infrastructures, international borders and spaces of movement. Between these points of access, trans people experience infringements on their human rights: facial recognition misgenders, targets and directly fails on trans faces, denies their personhood, limits equal access, and excessively profiles trans faces as a problem of unsolvable illegibility, making facial recognition technologies “dangerous when they fail and harmful when they work” (Crawford).
Yet, trans lives reveal further configurations than the mere split between visibility/invisibility. Between the lives seen, registered and recorded by algorithms, whether as legitimate or as targets for violence, and those not seen, whether through invisibility or erasure, there is also the power of the in-betweenness: the art of living between the coded lines, the illegible absences, and the digital silences that make up the space between each code. This intersection of trans bodies, data and colonial relations of binarity reiterated through facial recognition algorithms persuasively alters what it means to essentialise and secure ‘truth’ through the essence of binary gender. In doing so, it further establishes a productive distance to the algorithms themselves, and in this case, to the aim of recognition through image generation. In these encounters, trans lives redefine the spatial dynamics of recognition, confuse traditional claims of material visibility, and expose the profound dissonances that determine the relationship between trans identities and algorithmic perception of humans. The spatiotemporal dynamics originally intended by this algorithmic governance are disrupted by trans faces in ways that neither the infrastructures of transnational Big tech companies suspected nor national legislative agendas can accomplish, essentially reconfiguring the spatiotemporal dynamics of recognition by turning them into something unrecognisable.
Regarding trans encounters with body scanners implemented at international borders and airport security checkpoints, as Shachar and Mahmood highlight, “Treating the body as the site of regulation and control of mobility is no longer a matter of science fiction. It is the reality of the here and now” (126). In this way, the body scanners put forward a move into the coded tactility of the flesh. These algorithmic body scanners all work on an essentialising coded template of binary gender: when people stand in and walk through them, the scanner renders a visual image of the body silhouette and compares it against the outline of the default cisgender body. Held against this visual of the binary body, if any additional body parts are present that do not fit this template, or if an absence of normatively expected parts is detected, a mechanism is catalysed that flags the body as ‘suspicious’ and as a potential security threat needing further inspection. Upon walking into the scanner, trans bodies are dematerialised as flesh and reassembled into misrepresenting code that the algorithm reads and flags as deceptive, based on an encoded template corresponding strictly to a normative cisgender body as the location from which everything else is rendered in dangerous deficit to (inter)national security (Beauchamp; Clarkson; Currah and Mulqueen; Quinan). This encounter remediates the relationship between bodies and algorithms: the trans body’s physical position in the scanner triggers its rendering as a ‘risky threat’ for not correlating to the programmed binarity of the system. As Drage and Frabetti note, this threat “is often rendered analogous to the concealed sex/gender of a trans person in airport security who must be “outed” and surveilled to maintain public safety” (90).
However, despite this technical rendering of the trans body as a threat in the automatic comparison to the constructed safety of the cisgender body, the trans body catalyses an alternative form of embodiment that challenges the system. To the system, the material tactility of the trans body creates a liminal distance that halts the body in proximity to the algorithmic operations of locating (in)security. Within this operation, transness becomes an embodiment demanding a comprehension the system cannot achieve, dissolving the system’s appearance of perfectibility by exposing its insufficient comprehension of human bodies. This redefinition reveals an algorithmic fragility and proneness to cracks, as these systems fail precisely “at the task which they have been set: to read the body perfectly” (Magnet 50). The embodiment emerging from the trans body complicates this encoded binary body template and reorients the algorithmic imaginaries of the body itself. The liminality of trans data lives means that they are simultaneously misrecognised while also exceeding the computational bounds of algorithms and the codified idea of the human body. Similar to the ways in which trans faces reveal further configurations of in/visibility and spatiotemporal dynamics within facial recognition technologies, trans bodies disrupt and unveil the artificiality of the cisgender body as the default body template programmed into these scanners and aesthetically stretch the boundaries of what it means to algorithmically ‘know a body’.
Crucially, as argued by Os Keyes, “if these systems cannot conceptualize that you exist, then you are essentially living in a space that is constantly misgendering you and informing you that you’re not real” (cited in Cockerell). Together, the algorithmic technologies brought forward by this article highlight the shared algorithmic violation of trans bodies and showcase the inherent tension of the liminality that trans data lives embody through their entanglement with binary code. Resistance to these encoded modes of unliveability begins at the exact point of exposing the instability of the categories from which trans lives, through the flesh and through data, are dismissed and rejected due to the binary logics that undergird and effectuate the functionality and operations of algorithmic technologies. The liminality of trans data lives allows for an ‘aesthetic trick’ of, within the very acts of being positioned as targets for erasure and exclusion, defying the gaze of the code and slipping through the systems. By attending to this simultaneous reality of trans data lives, it becomes clear how trans lives are at once positioned for violent exposure by algorithmic code, yet defy these bounds through a distance to algorithms as a way of living anyway in-between the coded lines.
Conclusion
Despite the global claim, delivered by Big tech companies, nation states and far right lobbying efforts, that algorithms are revolutionary and possess an unprecedented perfectibility to improve human lives, trans lives effectively locate an unexpected, oppositional and unsolvable flaw in the binary code: they embrace the fluidity, instability and messiness of gender beyond the colonial binary encoded into the fabric of algorithmic technologies, essentially exposing the limitations of these technologies for computing and comprehending life and the category of the human as wider than the default white cisgender human that cements hierarchised taxonomies of humanness. Fundamentally, algorithmic technologies “echo the imperialist ideologies that underpinned the development of physiognomy and other scientific projects of classification, meaning that these contemporary technologies have the potential to reify racist, sexist, and cisnormative beliefs and practices” (Scheuerman 2), which vindicates and reinforces global political imaginaries of colonial power intended to strengthen prior practices of exclusion through algorithmic force.
Theorising the aesthetics of trans lives as liminal data lives directs critical attention to the ways in which the apparent impossibility of transness in binary algorithmic technologies interacts, interferes and simmers distractingly in-between the coded lines of algorithmic assemblages that at once produce performative effects of violence, but also of disruption located in the very trans bodies that algorithms cannot comprehend. This redirected attention not only disrupts popular narratives of algorithms as hegemonic and neutral, but pushes queer and trans scholarship on glitches and errors to consider the liminality of trans data lives as they reveal crucial cracks, faults and flaws in the systems, flaws that can be utilised strategically to unveil and resist modes of algorithmic violence by establishing a distance while in proximity arising from the lived experience of uncodeability by design. In doing so, transness, as digital flesh, embodies a lived contrast and differentiating relationship to the algorithmic rendering of life by occupying a spatiotemporal position at both sides of the threshold of algorithmic code; cutting over, falling through, and obscuring the binary flows of code and confusing their anticipated outcomes.
Situated at this inception, the questions for future research hence become: which imaginaries, thresholds, distances and embodied forms of resistance can the digital fleshiness of trans bodies and their lives, as inherently situated between the (im)possible, between (in)visibility, (un)codeability and (un)liveability, unveil and produce for curating fugitive procedures and operations against algorithmic violence and subverting the binary gaze on life? How can the potential of trans data lives be utilised to envision and engineer trans and gender affirming algorithmic technologies and imaginaries that do not limit, but rather multiply, the lived realities outside of the binary restrictions and technical confinements of current sociopolitical systems? As this article grapples with the possibility of creating distance to algorithmic technologies while simultaneously always being entangled with and existing in proximity to them, it calls for future interventions into how this tensional space embedded in the liminality of trans data lives can be made productive from the perspectives of trans lives themselves.
Works Cited
Amaro, Ramon. The Black Technical Object: On Machine Learning and the Aspiration of Black Being. Sternberg Press, 2022.
Amnesty International UK. “Denmark: New Report - Mass Surveillance and Discrimination in Automated Welfare State.” Amnesty International, 13 Nov. 2024, www.amnesty.org.uk/press-releases/denmark-new-report-mass-surveillance-and-discrimination-automated-welfare-state.
Amnesty International. “EU: AI Act at Risk as European Parliament May Legitimize Abusive Technologies.” European Institutions Office, 12 June 2023, www.amnesty.eu/news/eu-ai-act-at-risk-as-european-parliament-may-legitimize-abusive-technologies.
Amoore, Louise. "Algorithmic War: Everyday Geographies of the War on Terror." Antipode, vol. 41, no. 1, 2009, pp. 49–69. https://doi.org/10.1111/j.1467-8330.2008.00655.x.
———. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press, 2020.
Andersen, Christoffer Koch. "Wrapped Up in the Cis-Tem: Trans Liveability in the Age of Algorithmic Violence." Atlantis: Critical Studies in Gender, Culture & Social Justice, vol. 46, no. 1, 2025, pp. 24–41. https://atlantisjournal.ca/index.php/atlantis/article/view/5790.
———. "Beyond Fairness: Trans Unliveability in European Algorithmic Assemblages." European Workshop on Algorithmic Fairness, PMLR, vol. 294, 2025, pp. 295–302. https://proceedings.mlr.press/v294/andersen25a.
Beauchamp, Toby. Going Stealth: Transgender Politics and US Surveillance Practices. Duke University Press, 2019.
Bridges, Lauren E. "Digital Failure: Unbecoming the ‘Good’ Data Subject through Entropic, Fugitive, and Queer Data." Big Data & Society, vol. 8, no. 1, 2021. https://doi.org/10.1177/2053951720977882.
Buolamwini, Joy. “Civil Rights Implications of the Federal Use of Facial Recognition Technology.” Algorithmic Justice League, 8 Mar. 2024, https://www.ajl.org/civil-rights-commission-written-testimony.
Campanioni, Chris. "The Glitch of Biometrics and the Error as Evasion: The Subversive Potential of Self-Effacement." Diacritics, vol. 48, no. 4, 2020, pp. 28–51. https://dx.doi.org/10.1353/dia.2020.0028.
Chevillon, Guillaume. "The Queer Algorithm." SSRN, 2024. http://dx.doi.org/10.2139/ssrn.4742138.
Chun, Wendy Hui Kyong. "Introduction: Race and/as Technology; or, How to Do Things to Race." Camera Obscura, vol. 24, no. 1 (70), 2009, pp. 7–35. https://doi.org/10.1215/02705346-2008-013.
Clarkson, Nicholas L. "Incoherent Assemblages: Transgender Conflicts in US Security." Surveillance & Society, vol. 17, no. 5, 2019, pp. 618–630. https://doi.org/10.24908/ss.v17i5.12946.
Cockerell, Isobel. "Facial Recognition Systems Decide Your Gender for You. Activists Say It Needs to Stop." Rappler, 2021, https://www.rappler.com/technology/features/facial-recognition-automated-gender-coda-story.
Costanza-Chock, Sasha. "Design Justice, AI, and Escape from the Matrix of Domination." Journal of Design and Science, no. 3.5, 2018, pp. 1–14. https://doi.org/10.21428/96c8d426.
Crawford, Kate. "Halt the Use of Facial-Recognition Technology Until It Is Regulated." Nature, vol. 572, no. 7771, 2019, pp. 565–566. https://www.nature.com/articles/d41586-019-02514-7.
Currah, Paisley, and Tara Mulqueen. "Securitizing Gender: Identity, Biometrics, and Transgender Bodies at the Airport." Social Research: An International Quarterly, vol. 78, no. 2, 2011, pp. 557–582. https://dx.doi.org/10.1353/sor.2011.0030.
Danielsson, Karin, et al. "Queer Eye on AI: Binary Systems Versus Fluid Identities." Handbook of Critical Studies of Artificial Intelligence, Edward Elgar Publishing, 2023, pp. 595–606. https://doi.org/10.4337/9781803928562.00061.
Danish Institute for Human Rights (DIHR). “Facial Recognition to Combat Crime.” The Danish Institute for Human Rights, 20 Feb. 2020, www.humanrights.dk/publications/facial-recognition-combat-crime.
Dixon-Román, Ezekiel. "Algo-Ritmo: More-than-Human Performative Acts and the Racializing Assemblages of Algorithmic Architectures." Cultural Studies ↔ Critical Methodologies, vol. 16, no. 5, 2016, pp. 482–490. https://doi.org/10.1177/1532708616655769.
Drage, Eleanor, and Federica Frabetti. "Copies Without an Original: The Performativity of Biometric Bordering Technologies." Communication and Critical/Cultural Studies, vol. 21, no. 1, 2024, pp. 79–97. https://doi.org/10.1080/14791420.2023.2292493.
Elwood, Sarah. "Digital Geographies, Feminist Relationality, Black and Queer Code Studies: Thriving Otherwise." Progress in Human Geography, vol. 45, no. 2, 2021, pp. 209–228. https://doi.org/10.1177/030913251989973.
Fuller, Matthew, and Eyal Weizman. Investigative Aesthetics: Conflicts and Commons in the Politics of Truth. Verso Books, 2021.
Gaboury, Jacob. "Critical Unmaking: Toward a Queer Computation." The Routledge Companion to Media Studies and Digital Humanities, Routledge, 2018, pp. 483–491.
Guo, E. “The US Wants to Use Facial Recognition to Identify Migrant Children as They Age.” MIT Technology Review, 19 Aug. 2024, www.technologyreview.com/2024/08/14/1096534/homeland-security-facial-recognition-immigration-border.
Harding, Xavier. “Facial Recognition Bias: Why Racism Appears in Face Detection Tech.” Mozilla Foundation, 7 Aug. 2023, www.mozillafoundation.org/en/blog/facial-recognition-bias.
Hicks, Mar. "Hacking the Cis-tem." IEEE Annals of the History of Computing, vol. 41, no. 1, 2019, pp. 20–33. https://doi.org/10.1109/MAHC.2019.2897667.
Hoffmann, Anna Lauren. "Terms of Inclusion: Data, Discourse, Violence." New Media & Society, vol. 23, no. 12, 2021, pp. 3539–3556. https://doi.org/10.1177/1461444820958725.
Hu, Tung-Hui. A Prehistory of the Cloud. MIT Press, 2015.
Keyes, Os, and Jeanie Austin. "Feeling Fixes: Mess and Emotion in Algorithmic Audits." Big Data & Society, vol. 9, no. 2, 2022. https://doi.org/10.1177/20539517221113772.
Keyes, Os. "The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition." Proceedings of the ACM on Human-Computer Interaction, vol. 2, CSCW, 2018, pp. 1–22. https://doi.org/10.1145/3274357.
Leszczynski, Agnieszka, and Sarah Elwood. "Glitch Epistemologies for Computational Cities." Dialogues in Human Geography, vol. 12, no. 3, 2022, pp. 361–378. https://doi.org/10.1177/20438206221075714.
Lyon, David. "Technology vs ‘Terrorism’: Circuits of City Surveillance Since September 11th." International Journal of Urban and Regional Research, vol. 27, no. 3, 2003, pp. 666–678. https://doi.org/10.1111/1468-2427.00473.
Magnet, Shoshana Amielle. When Biometrics Fail: Gender, Race, and the Technology of Identity. Duke University Press, 2020.
Minh-Ha, Trinh T. "The Image and the Void." Journal of Visual Culture, vol. 15, no. 1, 2016, pp. 131–140. https://doi.org/10.1177/1470412915619458.
Opiah, A. “Biometric Surveillance Testing Data Sparks Ethics Concerns in Germany.” Biometric Update | Biometrics News, Companies and Explainers, 13 May 2024, www.biometricupdate.com/202405/biometric-surveillance-testing-data-sparks-ethics-concerns-in-germany.
Parikka, Jussi. Operational Images: From the Visual to the Invisual. U of Minnesota Press, 2023.
Pugliese, Joseph. "In Silico Race and the Heteronomy of Biometric Proxies: Biometrics in the Context of Civilian Life, Border Security and Counter-Terrorism Laws." Australian Feminist Law Journal, vol. 23, no. 1, 2005, pp. 1–32. https://doi.org/10.1080/13200968.2005.10854342.
Quinan, C. L. "Biometric Technologies, Gendered Subjectivities and Artistic Resistance." Rethinking Identities Across Boundaries: Genders/Genres/Genera, Springer International Publishing, 2023, pp. 21–41. https://doi.org/10.1007/978-3-031-40795-6_2.
Rauchberg, Jessica Sage. "#Shadowbanned: Queer, Trans, and Disabled Creator Responses to Algorithmic Oppression on TikTok." LGBTQ Digital Cultures, Routledge, 2022, pp. 196–209.
Richardson, Michael. Nonhuman Witnessing: War, Data, and Ecology After the End of the World. Duke University Press, 2024.
Russell, Legacy. Glitch Feminism: A Manifesto. Verso Books, 2020.
Scheuerman, Morgan Klaus, et al. "Auto-Essentialization: Gender in Automated Facial Analysis as Extended Colonial Project." Big Data & Society, vol. 8, no. 2, 2021. https://doi.org/10.1177/20539517211053712.
Shabbar, Andie. "Queer-Alt-Delete: Glitch Art as Protest Against the Surveillance Cis-Tem." WSQ: Women's Studies Quarterly, vol. 46, no. 3, 2018, pp. 195–211. https://dx.doi.org/10.1353/wsq.2018.0039.
Shachar, Ayelet, and Aaqib Mahmood. "The Body as the Border." Historical Social Research / Historische Sozialforschung, vol. 46, no. 3, 2021, pp. 124–150.
Shah, Nishant. "I Spy, With My Little AI: How Queer Bodies Are Made Dirty for Digital Technologies to Claim Cleanness." Queer Reflections on AI, Routledge, 2023, pp. 57–72.
Shelton, Jama, et al. "Digital Technologies and the Violent Surveillance of Nonbinary Gender." Journal of Gender-Based Violence, vol. 5, no. 3, 2021, pp. 517–529. https://doi.org/10.1332/239868021X16153783053180.
Thieme, Katja, et al. "From Language to Algorithm: Trans and Non-Binary Identities in Research on Facial and Gender Recognition." AI and Ethics, vol. 5, no. 2, 2025, pp. 991–1008. https://doi.org/10.1007/s43681-023-00375-5.
Wagner, A. “AI Facial Recognition Surveillance in the UK.” Tech Policy Press, 22 Oct. 2024, www.techpolicy.press/ai-facial-recognition-surveillance-in-the-uk.
Waldman, Ari Ezra. "Gender Data in the Automated Administrative State." Columbia Law Review, vol. 123, no. 8, 2023, pp. 2249–2320. https://columbialawreview.org/content/gender-data-in-the-automated-administrative-state.
Weheliye, Alexander Ghedi. Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human. Duke University Press, 2014. https://doi.org/10.1515/9780822376491.
Wilcox, Lauren B. Bodies of Violence: Theorizing Embodied Subjects in International Relations. Oxford University Press, 2015.
———. "Embodying Algorithmic War: Gender, Race, and the Posthuman in Drone Warfare." Security Dialogue, vol. 48, no. 1, 2017, pp. 11–28. https://doi.org/10.1177/0967010616657947.
Wilson, N. “Plan to Slash Lengthy Passport Queues With New Facial Recognition Scans.” The Independent, 18 Mar. 2025, www.independent.co.uk/travel/news-and-advice/facial-recognition-scans-passports-ports-b2717154.html.
Wynter, Sylvia. "Human Being as Noun? Or Being Human as Praxis? Towards the Autopoetic Turn/Overturn: A Manifesto." 2007, https://bcrw.barnard.edu/wp-content/uploads/2015/10/Wynter_TheAutopoeticTurn.pdf.
Yoon, Hyungjoo. "Digital Flesh: A Feminist Approach to the Body in Cyberspace." Gender and Education, vol. 33, no. 5, 2021, pp. 578–593. https://doi.org/10.1080/09540253.2020.1802408.
Biography
Christoffer Koch Andersen is a PhD student in Multi-disciplinary Gender Studies at the Centre for Gender Studies, Department for Politics and International Studies, University of Cambridge. Located within Queer/Trans Studies, Feminist STS, Critical Algorithm Studies and International Relations, his research critically explores the intersections between the (im)possibility of trans lives, algorithmic assemblages, the coloniality of binary gender, global politics, and the category of the human. ORCID: https://orcid.org/0000-0002-5795-3641