Christoffer Koch Andersen
Embodying Liminal Data Lives: Encoding the Aesthetics of Trans Bodies as Algorithmic Distance
Abstract
The idea that algorithms are infinitely improving our lives is presented as an undeniable truth, but for trans people, algorithms have violent, far-reaching implications. Behind the veil of neoliberal techno-optimism, algorithms perpetuate colonial and cisnormative legacies that anchor a binary idea of life, wherein the possible ‘human’ becomes the white, cisgender human, which in turn violates trans people and denounces them for not fitting the binary codes embedded into and making up algorithmic systems. Instead of complying with neoliberal beliefs in algorithms or falling short of critique, this article theorises the aesthetics of trans lives as embodied liminal data lives: a strategy of sensing distance to algorithms grounded in the tactical uncodeability of transness in opposition to the binary confinements of algorithmic technologies. Taking this stance, this article asks: How can we create spaces of distance to algorithms in a world inherently entangled with them, and how can the liminality of trans data lives allow us to consider (im)possible ways of living and distancing as forms of resistance to the reality of algorithmic violence in which we exist?
Keywords
algorithms, colonial gender binary, liminal data lives, trans lives, algorithmic distance
Introduction: Algorithmic Ordering of Trans Lives
The claim that algorithms are objectively beneficial to our lives stands as an axiomatic truth presented by Big Tech companies and global governments. Popularly, algorithms are sold as tools to fix, tweak, improve, and exponentially advance our lives, but for trans people, this promise is neither a given nor a truth. For trans people, algorithms and the spaces they enable are violent, and at worst, deadly. Behind the neoliberal veil of techno-optimism fuelled by international politics, nation states and Big Tech companies, algorithms learn from, reinvigorate and perpetuate colonial and cisnormative legacies of violence that anchor a binary default (Amaro 2022; Hoffmann 2021). Within this default in algorithmic systems, the only possible ‘human’ becomes the white, cisgender human, forcing transness out of existence for not fitting the encoded template and binary codes making up the valorisation of human life. Contrary to the belief that ‘AI’ technologies are inherently novel, progressive and revolutionary, recent scholarship on trans experiences of algorithms has critically taken up how they reinscribe binary colonial markers of gender essentialism. For example, facial recognition algorithms reject and re-essentialise trans faces (Keyes 2018; Scheuerman et al. 2021), encode trans bodies as dangerous, deviant threats in airport security scanners (Costanza-Chock 2018; Wilcox 2015), deny trans people access to crucial state welfare services, delete trans health data and create messy bureaucratic problems (Amnesty International UK 2024; Hicks 2019; Waldman 2023), intimately surveil trans identities (Keyes and Austin 2022; Shelton et al. 2021), erase trans existence through binary digital identification systems and platforms (Andersen 2025a; Dixon-Román 2016; Raj and Juned 2022; Shah 2023), and enact transphobic feedback loops on social media (Rauchberg 2022; Thach et al. 2024). Together, these algorithmic technologies reiterate the colonial classification of humans along binary lines of life: they essentialise physio-phrenological traits of the body as corresponding to the gender binary, treat that binary as the singular comprehensible unit of algorithmic recognition, reinforce the systemic marginalisation of trans people and locate trans bodies as territories for surveillance. Meanwhile, scholarship also attends to how queer and trans bodies enact “small but playful forms of disruption such as the error or glitch” (Gaboury 2018, 485), glitch out algorithmic technologies, resist systems of surveillance and establish epistemologies of ‘glitching’ (Leszczynski and Elwood 2022; Russell 2020; Shabbar 2018), productively ‘fail’ in algorithmic technologies to unsettle the categories of ‘naming’ (Bridges 2021), and embrace the inherent subversive potential of embodying failure (Campanioni 2020). This article extends both strands of scholarship: that which 1) unveils and criticises the embodied and sociopolitical impacts of algorithmic violence on trans lives, and that which 2) analyses the productivity of ‘glitchy’ encounters of troubling, messing with or failing through algorithmic codes. Rather than focusing solely on the glitches, errors or failures of trans bodies in their encounters with algorithms, this article conceptualises this relationship along a different axis of analysis.
This article is interested in theorising the aesthetic promise and potential of trans lives and their data reality of liminality—the state of existing in-between spaces simultaneously as equally invisible/visible, visual/invisual, codeable/uncodeable and liveable/unliveable—to investigate what the digital fleshiness of trans lives entails as an embodied practice of distance to algorithmic technologies. This foregrounds trans techniques of refusal, misuse, and disruption that work with, through and against contemporary algorithmic technologies, and as such, establishes a trans critique of algorithms.
Inventing the Data Body
Instead of conceptualising algorithms as neutrally coded artefacts able to present objective truths about the world, this article theorises algorithms as sociopolitically contingent artefacts that fundamentally “engage in fabulation – they invent a people, write them into being as a curious body of correlated attributes, grouped into clusters derived from data that are themselves fabulatory devices” (Amoore 2020, 158), constructing fabricated, hierarchised imaginaries of the world and, as a result, of the subjects entangled with them. Within this algorithmic age, these “flows of personal data—abstracted information—are sifted and channelled in the process of risk assessment, to privilege some and disadvantage others, to accept some as legitimately present and to reject others” (Lyon 2003, 674). Highlighting the algorithmically curated imaginaries of both people and the world reveals the contingent nature of trans people locked into their data shadows, where algorithms invent specific fabulas about trans lives. Algorithms invent and present trans people as if they are inherently unreadable, or even impossible within the systems, claiming that they are incompatible with the idea of the human because they cannot be correctly rendered by the very systems that exclude trans lives from the possibility of being understood along the lines of humanness in the first place. Wilcox poignantly reminds us that bodies are not isolated from their political histories, so erasing “this process of materialization that makes it seem as if intelligible bodies are natural phenomena constitutes another moment of violence” (Wilcox 2015, 8). The act of constructing trans lives as unrecognisable entities in algorithmic systems is derived from data limited to telling a certain story, data that are thus fabulatory devices that dis/allow specific truths about the world. Essentially, the invention of this incomprehensibility of trans lives leveraged by algorithms likewise becomes a fabulatory device itself – a device that serves to legitimise a story about trans people as ‘uncodeable’ and to dismiss trans lives as a naturally given and inevitable reality that cannot be otherwise, despite the histories and lived realities of trans people. In addition to living in one’s own data shadow, not only have some bodies been historically ostracised through data and from their own data, but this data is ultimately part of larger sociopolitical relations and interconnected technological networks, where it is profiled, circulated and translated across several databases – administrative systems, digital databases, bureaucratic documents, biometric technologies and computational predictions that break the body down into coded digits.
These sociopolitical relations inhabit the differentiation and hierarchisation that have disciplined the humanity of bodies (Weheliye 2014), where this making and distinction of difference enacts the multiplicity and specific production, circulation, cementation and possibilities of human data (and datafied humans) across various algorithmic assemblages. By weaving together algorithms and trans lives, it becomes possible to consider how trans existence contributes novel spatiotemporal forms of thinking about the algorithmic augmentation of social order and human differentiation in our digital societies: not only is ‘trans’ a technology for mapping deviance in reference to binary life, but further a tool for ordering, classifying, and controlling the embodiment of the human solidified through the colonial imposition of gender binarity. In this sense, the question is: can this function of difference that trans encapsulates be made productive as a rupture to create embodied forms of distance to algorithmic violence? At a crucial moment in time, where algorithms are exponentially embedded into every facet of our everyday lives, and where they both prime global political imaginaries of human value and reinfuse colonial hierarchies of power, the disproportionate implications for trans lives must be investigated, but so must strategic techniques for curating distance to the algorithmic technologies themselves. This article is situated between two intersecting branches of scholarship – that on algorithmic violence and that on trans experiences of and resistance against it – with the aim of contributing a spatiotemporal digital orientation of trans bodies as liminal data lives in order to unveil the forces of algorithmic violence, as well as to provide a theory of the productive encounters that occur when the uncodeability of transness is inserted into algorithmic equations. Living with algorithms while trans presents an inescapable reality and a precarious unliveability only predicted to intensify the impossibility of trans lives. The question becomes: how do we carve out liminal spaces in proximity to, but away from, the algorithmic gaze of death? How can we create productive spaces of distance to algorithmic violence in a world inherently entangled with algorithms? I suggest an alternative coded rupture from transness itself to conceptualise the aesthetics of living as trans and trans lives as liminal data lives—lives that inherently inhabit a digital space in-between two states, of being targeted and of being dismissed—which operationalises a productive strategy of sensing distance to algorithms by keeping with the complex uncodeability of transness in opposition to the binary limits of algorithmic technologies. In doing so, how might this shift from mere ‘error’ and ‘failure’ to uncodeability allow us to consider alternative ways of living and creating distance as resistance against algorithmic technologies, towards encoding trans liveability?
Coded Flesh, Coded Death: Algorithmic Violence and Binary Valorisation of Life
Algorithms are maps of technical instructions that order and classify objects and humans into fixed categories; embodied by the humans that code them and through the humans implicated by them. Algorithms are immaterial infrastructures of predictions, yet they “need to be embodied in some combination of human and/or machine […] in relation to the systems of interpretation and to the bodies that do the interpreting and reacting to the information they provide” (Wilcox 2017, 16-17).
Crucially, in relation to bodies, transness—with its infiniteness, messiness and mutability—works against the operational principle of algorithms and their binary definiteness, fixedness, and immutability, which renders trans people either hypervisible as a deviance or invisible and erased. This imposes a violent gendering of the human in accordance with colonial rules of classification as the decision over life and the distinction of who should live and who must die by “performatively enacting themselves/ourselves as being human, in the genre specific terms of each such codes’ positive/negative system of meanings” (Wynter 2007, 30). Under the contemporary code of the algorithmic reality, the white cisgender human represents a positive symbolic meaning of living, while transness characterises a negative impossibility of life. Algorithms essentially represent a computational figuration of the politics of classification: the act of classifying and sorting bodies as objects into neatly defined categories, which inevitably entails an overwriting and exclusion of those who cannot be fitted into these strict categories. Trans people exist as neither-nor in a liminal space within the computational order of life. On one side, they exist as codeable by being hypervisible in deviating from binary code, which positions trans people as targets for violence through failure to conform to the necropolitical norms and logics underlying the algorithmic order of life and death. On the other side, they exist as uncodeable in their authentic and fleshy entirety, as algorithms cannot comprehend transness but neglect it and compute transness as not existing in the first place, a non-life left to die outside of the territory of life. In both instances of (in)visibility, transness is fundamentally uncodeable. In this sense, the algorithmic entails “identifying norm and multiple deviations from the norm [by deploying] an “architecture of enmity”, a drawing of the lines between self/other; us/them; safe/risky; inside/outside” (Amoore 2009, 51). These affective senses of ‘improper life’ stick to transness in its aberrations from normative binary structures; hence the trans body is subjected to coded operations of elimination that mark the flesh and strip the trans body of its human possibility as a coded death. If algorithms resemble a war-like architecture of enmity, then trans represents the compulsory fleshy reference for enabling the algorithmic distinction of value. In the current algorithmic reality, “if ‘war at a distance’ produced a subject position of a viewer, ‘war as big data’ produces the subject position of a user, that is, a subject that actively participate[s] in securing the system as a whole” (Hu 2015, 113). Trans thus functions as a digital flesh to securitise the structures of algorithmic technologies and legitimise their war on certain ‘othered’ bodies as a whole, through interlinked assemblages of information, data and digits that do not correspond to trans existence, but rather render and interpret trans lives as a computational incomprehensibility. Here, I strategically utilise the term digital flesh to “reflect the structure of digital phenomena as a continuum of reality, instead of an empty space lacking reality” (Yoon 2021, 585) to emphasise how bodies are inscribed into the algorithmic systems that co-construct their embodiment.
If “algorithmic techniques are concerned with anticipating [and curating] an uncertain future, then the logic of algorithmic war is one of identifying norm and multiple deviations from the norm” (Amoore 2009, 55). The logic of war needs deviations to be identified in advance, and this is what underpins the encoding of the trans body as an existing difference and deviance from the norm: the assumed stable and secure cisnormative body template of life. Importantly, what I draw attention to here is an overarching differentiating order of embodiment predicated on the instrumentalised sequences of algorithmic necropolitical functions designed to configure trans subjects as ontologically killable flesh, imminently uncodeable to the system, where “identity and subjectivity are stripped away from bodies; persons are objectified as their fleshy, material bodies” (Wilcox 2015, 104). Transness, I argue, represents an epiphenomenon of algorithmic processes of classification, sorting and ordering through abstracted code, references and proximity that turn trans bodies into data formations that deviate from the norms installed within the systems. This process necessitates in advance the rendering of trans bodies as ‘threats’, which legitimises their co-existing attribute of being coded for exposure rather than being coded as human. Through the operations of algorithmic technologies, which revolve around the “logistics in massive technical systems that work through the ability to abstract and optimize” (Parikka 2023, 31), algorithms appropriate the binary order of code as the framework of readable life, hence abstracting trans lives as malfunctioning data formations incompatible with the system. As a technical object expressed through code, trans bodies are rendered as uncodeable in the symbolic order of the binary code, and thus alienated from themselves and their flesh by being impossible to read as trans, as life, as human. As Pugliese (2005) puts it, “Not to produce a template is equivalent to having no legal ontology, to being a non-being; you are equivalent to subjects who cannot be represented and whose presence can only be inferred by their very failure to be represented” (14). Instead, trans bodies come to represent coded signs of falling through the cracks, as something apart from what constitutes the human and what the human is supposed to be. In this framing, transness is rather—through its inherent computational uncodeability in its own right—read as an absence of the human that must be eliminated due to its lack of humanness. To the human witnessing algorithmic violence, this “radical absence [of humanness] is crucial to witnessing what is not there, or fails to materialise, or is destroyed, or has died” (Richardson 2024, 153). Within the algorithmic reality in which we are situated, one must be algorithmically readable in order to exist and live. Far from distancing the human partiality from that of the code, algorithmic technologies insist on executing a predetermined configuration of the human based on colonial legacies of binary gender, which “embed the discursive, affective, and fantastic logics of war in all their racializing and gendering dimensions into the algorithm at every stage of its design, training, and operation” (Richardson 2024, 103).
They are, in this way, inseparable from the violent production of gender that formalises and exercises the impossibility of certain lives as digital flesh coded for death, while employing an algorithmically augmented valorisation and systematically upholding the liveability of binary lives. With these facets of coded violence in mind, and by attending to the aesthetic-political potential of the liminality of trans lives rather than framing algorithmic technologies as simply failing to capture transness, how might we interpret this act of failure that trans flesh embodies—and the inherent partiality it reveals—as central to our unveiling and knowledge production of algorithms? I suggest that the coded trans flesh unveils a liminal data life whose liminality illuminates a unique property that the algorithmic system cannot expect, predetermine or fully calculate: a fluidity of life that runs between the codes. It is exactly at this liminality between the physical and the digital that the trans body arrives as digital flesh that is at once appropriated and used by algorithmic systems to claim unrecognisability, targeting and legitimising war on the trans body, since the logic of algorithmic war on deviant bodies relies on their presumed deviance to defend the war itself. At the same time, this also enables the trans body to remove itself from its physical flesh and into the digital cracks as a liminal data life, to speculate and simmer as a possibility of something different beyond the calculable range of algorithmic systems. This liminality encoded through data creates a rupture where the possibilities of identification and life exceed the binary limitations of embodiment in the system and the digitally mediated boundaries within which life can be lived.
Liminal Data Lives: Aestheticising the Digital Trans Flesh as Algorithmic Distance
No system can enforce a fixed, undisrupted narration and computation of truth without cracks. Algorithmic technologies—despite their glaring appearance as territories of unambiguous domination—are places of messiness, frictions, interference and disruption. This reality is often concealed behind the myriad efforts needed to make an artificial system of binary logics appear fortified as the truth, and is thus not articulated as a feature or productive fragility core to the systems themselves. Through the disruptive potential of trans data lives, a rupture and opening into said fragility of binary code can be located and exposed through the inherent uncodeability of transness, which creates a liminal distance to algorithmic code and binary life. The question is: how do we critically utilise this liminal data space that trans people embody to create distance and inscribe another possible sensing of algorithms? As Fuller and Weizman (2021) argue, aesthetic investigations—in this article, through the lived experiences of transness—have a twofold aim, as they “are at the same time investigations of the world [algorithmic violence] and enquiries into the means of knowing it [trans lives]” (15). Utilising the aesthetics of trans lives as a means of sensing the world of algorithms and critically questioning the harmful colonial politics underlying its expansion involves “sensing – the capacity to register or to be affected, and sense-making – the capacity for such sensing to become knowledge” (33).
This operationalisation of aesthetics enables us to attend to the affective facets of trans lived experiences with algorithms and to translate these experiences into productive knowledge for refusal against algorithmic systems. In this respect, trans existence is infinitely “wielded […] as an invaluable mapping tool, a means by which origins and boundaries are simultaneously traced and constructed and through which the visible traces of the body are tied to allegedly innate invisible characteristics” (Chun 2009, 10). By default, this both marks binarity as an ontological necessity and operationalises a spatiotemporal colonial reiteration of a hierarchised social order: ‘trans’ then is not only a tool for ordering, classifying, and controlling, solidified through the imposition of the gender binary that is mirrored by algorithmic code, but also inflicts disruption by existing as a mapping technology for locating destructible deviance and resistance in algorithmic technologies. Trans bodies embody and curate a crucial liminal data space—simmering simultaneously between two different places and states of being in and with data: visibility as targets of violence and invisibility from going under the coded radar. Firstly, this takes form in terms of codeability, from being rendered as visibly ‘deviant’, and uncodeability, from the computational inability to comprehend trans existence in holistic authenticity. The idea of codeability speaks to the fact that, despite the seeming algorithmic inability to read trans lives, data is still produced about the trans body – in this instance, as a deviance, where the data generated comes in the form of registered deviance from the systems’ norms. Meanwhile, this means there is an inherent uncodeability of trans lives in algorithmic systems, where they are not rendered and understood on their own terms in a holistic sense, due to the algorithmic inability to comprehensively represent and define them. While some data is always produced about trans people in their encounters with algorithms, they cannot be fully and holistically rendered in their total legitimacy without misrecognition, flaws, exposure to risks or being held to a cisgendered comparison. Secondly, this tension relates to the liveability of trans people in their data. Liveability refers to the “holistic quality of life located at the trans body as situated in an algorithmic world, and in which ways algorithms complicate the degrees of (un)liveability under which trans lives are subjugated (…) [and] concerns how trans liveability is affected and through which different systematic, sociopolitical and structural hierarchies of power encoded into algorithmic detection and decision-making” (Andersen 2025b, 3). Liveability exists as a mode of inhabiting data that is always rendered in its perpetual precariousness and in surveillant assemblages inscribed with precoded hierarchies of power, where trans people are not represented as liveable on their own terms in the code, despite living in their own right. In comparison to other lives, trans lives are especially targeted, which positions trans data lives in a state of programmed unliveability, as that same data manifests as coded procedures of exposure, exclusion and death.
In this way, trans lives—trapped within binary codes of life—inhabit a liminal yet powerful space of simmering and sensing the algorithmic world between the visible/invisible, codeable/uncodeable and liveable/unliveable as iterative modes of being, illustrating a significant and inescapable relationship between how trans bodies exist in the world and how algorithms interpret this existence as a constant coded negotiation between targeting and erasure. This relation between algorithms and trans bodies as a co-produced liminal distance begins at the point of dismissing, rejecting or omitting transness from the categories necessary for the binary logics that undergird the operationality of algorithms. Existentially and algorithmically, this is essentially the coded trap that trans subjects find themselves in, or, phrased differently, the space they inhabit and from which they sense, refuse, and distort algorithmic infrastructures. As exemplified by trans experiences, when trans lives interact with algorithmic systems—whether facial recognition software having trouble representing and verifying trans faces, body scanners at the border being stunted by the nonconformity of trans bodies, or state welfare systems glitching out on granting trans citizens access—the systems are inconvenienced by trans existence, as this form of existence does not correspond with the preprogrammed space that lives are expected to inhabit. Altogether, in their various technical operations and attempts at rendering a tangible subject, the algorithmic systems are troubled, delayed and stunned by interference from a trans embodiment that they cannot account for, which speaks to the aesthetic potential of the liminal distance enacted by trans lives. Critically, this space “require[s] ways of knowing and being that refuse to be reduced to the limits of normative digital-social orders (…) [where] queer life originates in desiring and doing that which normative social orders situate as impossible” (Elwood 2021, 213). The conditions of ‘error’ or ‘erasure’, in contrast to cisnormative data lives, encode a distance that encourages strategic fugitive tactics of refusal through which algorithmic infrastructures can be resisted and reimagined, despite seeming impossible under the current neocolonial techno-optimism; a space where algorithmic infrastructures are troubled, delayed, distorted, and glitched by how transness exists in/against the code. Transness embodies a particular kind of ‘in-betweenness’ that at once infiltrates the binary code, renders it futile as a universal truth and effectuates distance to the reductionist algorithmic readability of humanness, towards redefining what it means to be(come) human. By not fitting into binary code, transness strategically falls through the coded cracks of life. Despite the rigid boundaries of binary code, the ambivalent liminality of trans data lives allows transness, as digital flesh, to become fluid and fugitive between the algorithmic codes. In this way, transness activates a fugitive resistance against algorithmic violence through an embodied investment in failure, occupying a spatiotemporal position at both sides of the threshold of code utilised by algorithmic technologies: cutting over, falling through, going against and obscuring binary flows of code. At this dual threshold, a certain kind of productive and disruptive relationship, one that only trans bodies can catalyse, is generated, altering what we understand as distance to algorithms while remaining inevitably in proximity to them.
This points to a crucial technical inception between the lived experiences and capacities of trans bodies and the systemic conditions of algorithms; their interfaces, systems and infrastructures. As an embodied tactic of trans lives, this in-betweenness operates at the level of the trans body in its interference with the systemic conditions of algorithms. Through embodying difference, trans bodies fall through the coded line that cannot capture their lives, obscure the efficiency of code by not fitting into the system, work directly against and expose the absurdity of binary reductionism, and cut over binary code by embodying more than what the binary can encapsulate. Trans lives introduce a disruptive plasticity to algorithmic systems through “their very gaps and indefiniteness (avoiding over-prescriptive recommendations), adaptability (being able to reset, forget or stay still), and overlaps (preferring repetitions to reduce risk and increase security)” (Chevillon 2024, 5), which embrace the multifaceted and unpredictable connections of trans lives and their data traces. These tactical breakages occurring from this in-betweenness act as operations that contrast with what is otherwise considered legible lives in the infrastructures and outcomes of algorithms. Instead, this reveals how these operations conflict with the rigidity of algorithmic technologies, enabling a productive distance to the algorithms themselves from the ways in which trans people occupy a constant space in-between, as lives never fully rejected or accepted by the systems. By conflicting with binary code, what kind of algorithmic distance do trans lives produce, and what does this liminality generate for the relationship between bodies and algorithms? Regardless of how encounters between trans bodies and algorithms occur, they exemplify the aesthetic operations as tactics of difference that trans people employ: When facial recognition software fails on and dismisses trans faces as part of its authentication process, the unrecognisability attributed to trans faces disrupts facial detection programmed on binary metrics. When automated gender recognition algorithms singularly operationalise the ‘essence’ of gender by essentialising it as binary, trans people utilise the visual aesthetics of difference to reject the auto-encoded singular logic of binarity. When body scanners at the border immanently locate risky deviance on trans bodies for not fitting the binary gendered template they are engineered to execute, trans bodies appropriate the space between the generated visuality of the scanner and the sociopolitical gendered expectations inscribed into the system. When nation state administrative data systems lose trans data upon legal gender change, relying as they do on the fortification of computable binary gender to function, trans lives upset not only the digitalisation processes but also the rigid nation state conceptualisations of what the categories of gender and citizenship mean. Taken together, they make visible a fractionated relationship always in proximity: trans bodies can reach and sense algorithms but are only tentatively computed and never comprehensively understood in their own right; the promise of life is rendered at a distance without constituting a fully liveable life; and yet this relationship works to decode and expose the inherent limitations and coded violence of algorithms.
By design, collection, translation, operations and gaze, algorithms mould certain bodies not only for exposure, but also as never possible as human in the first place (Wilcox 2023), as always-already incompatible, deathable within and incomputable to the systems that propagate, disseminate, and commodify global political imaginaries of hierarchised human value and liveability.
Trans Data Lives and Facial Recognition Algorithms
In the case of facial recognition, where the algorithm persistently fails not only to recognise trans faces, but through this computational inability also forwards the absence of humanness, a looming uncoded presence is created that can only be inferred by the very failure to be represented. As Trinh Minh-ha writes, “invisibility is built into each instance of visibility, and the very forms of invisibility generated within the visible are often what is at stake in a struggle” (Minh-ha 2016, 132), forcing an acknowledgement of the constitutive outside of the binary gaze and rendering a distance to algorithms. Similar to documentary practices and the recording gaze of ‘seeing’, this idea of ‘making visible’ accelerates exponentially with contemporary algorithmic technologies for “seeing faster, all at once, and always more” (Minh-ha 2016, 131). This is translatable to the all-encompassing surveillant gaze of algorithmic systems, where there has to be an exclusion for there to be an inclusion in the system, as they are inseparable conditions enabling each other. As a prime trans example from Denmark: when I had to verify my identity through the Danish verification process linked to one’s personal digital identity (MitID), I had to take a picture of my passport and use their facial recognition algorithm to scan my face so it could cross-reference and match my passport to my face. Instead of—as algorithmic solutions are advertised—effortlessly verifying my identity by matching my passport to my facial scan, I consistently received error messages, after countless attempts, stating that the photo in my passport did not match the scan of my face. At the time, I had been on testosterone for years, but my passport picture was taken pre-testosterone, which made the facial recognition algorithm unable to recognise my face and thus to authenticate my identity to the state. Far from being an innocent system producing a simple technical error, this marks the recurring phrenological idea of pointing to physical facial structures as cornerstones of truth and as a tool for verifying someone’s real identity, installed and packaged in a novel, automated format. Facial recognition technologies assume seamless and accurate detectability, while presuming and maintaining an immutable conception of binary gender (Danielsson et al. 2023; Keyes 2018; Thieme et al. 2025). Globally and across Europe, facial recognition software has largely been seen by governments and agencies as an effective tool to ensure security, direct war, protect borders, and make identification easier (Guo 2024; Opiah 2024; Wagner 2024; Wilson 2025). This optimism persists despite several international organisations (see e.g. Buolamwini 2023; Amnesty International 2023; DIHR 2020; Harding 2023) consistently warning against the embedded injustices and underlying forces of harm that facial recognition algorithms reinscribe when utilised for border surveillance, state welfare access, military warfare, crime detection, and immigration policies that fortify racial and gendered violence. In terms of risks for trans faces, facial recognition technologies cause problems across everyday life as they are implemented at access points between state infrastructures, international borders and spaces of movement. Between these points of access, trans people experience infringements of their human rights through how facial recognition technologies misgender, target and directly fail on trans faces, deny their personhood, limit equal access and excessively profile trans faces as a problem of unsolvable illegibility, making facial recognition technologies “dangerous when they fail and harmful when they work” (Crawford 2019). Yet, trans lives reveal further configurations than the mere split between visibility/invisibility. Between the lives seen, registered and recorded by algorithms, either as legitimate or as targets for violence, and those not seen, whether through invisibility or erasure, there is also the power of the in-betweenness, the art of living between the coded lines, the illegible absences, and the digital silences that make up the space between each code. This intersection of trans bodies, data and colonial relations of binarity, reiterated through facial recognition algorithms, persuasively alters what it means to essentialise and secure ‘truth’ through the presumed essence of binary gender. In doing so, this further establishes a productive distance to the algorithms themselves, and in this case, to the aim of recognition through image generation. In these encounters, trans lives redefine the spatial dynamics of recognition, confuse traditional claims of material visibility, and expose the profound dissonances that determine the relationship between trans identities and algorithmic perception of humans. The spatiotemporal dynamics originally intended by this algorithmic governance are disrupted by trans faces in ways that the infrastructures of transnational Big Tech companies never suspected and that national legislative agendas cannot accomplish, essentially reconfiguring the spatiotemporal dynamics of recognition by turning them into something unrecognisable. In this way, to what extent can trans lives disruptively reconfigure the spatiotemporal dynamics and orientations of binary algorithmic recognition technologies at large?
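To make the mechanics of such a verification loop concrete, the following is a deliberately minimal sketch, in Python, of the kind of threshold-based matching that identity verification services of this sort are commonly assumed to perform. It is emphatically not MitID’s actual implementation: the function names, the embedding dimension and the threshold value are illustrative assumptions. What the sketch shows is how the decision collapses identity into a single scalar comparison, so that a face that has changed over years of transition simply falls below the cutoff and is returned as ‘not the same person’.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face-embedding vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_identity(passport_embedding: np.ndarray,
                    live_scan_embedding: np.ndarray,
                    threshold: float = 0.75) -> bool:
    # Hypothetical decision rule: 'same person' only if the similarity
    # clears a fixed cutoff; everything a life has become since the
    # passport photo is reduced to distance from a stored vector.
    return cosine_similarity(passport_embedding, live_scan_embedding) >= threshold

# Toy illustration: a face that has drifted from its passport embedding
# (for instance, after years on testosterone) falls below the cutoff.
rng = np.random.default_rng(seed=1)
passport = rng.normal(size=128)
live_scan = passport + rng.normal(scale=1.5, size=128)  # accumulated change
print(verify_identity(passport, live_scan))  # likely False: identity 'not verified'

Read against the argument above, the liminality of trans data lives sits precisely in this gap: data is produced about the face at every attempt, yet the life behind it remains uncodeable to the comparison.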
Trans Data Lives and Algorithmic Body Scanners
Regarding trans encounters with body scanners implemented at international borders and airport security checkpoints, as Shachar and Mahmood (2021) highlight, “Treating the body as the site of regulation and control of mobility is no longer a matter of science fiction. It is the reality of the here and now” (126). In this way, the body scanners put forward a move into the coded tactility of the flesh. These algorithmic body scanners work on an essentialising coded template of binary gender: when encountering transness, as trans people stand in and walk through them, they render a visual imagery of the body silhouette in comparison to the outline of how the default cisgender body is expected to look. As articulated by Beauchamp (2019), “The generic “OK” body (…) is one with four limbs and a legible gender presentation, and it is absent any additional materials or objects” (74). Held against this visual of the binary body, if any additional body parts that do not fit this template are present, or if an absence of normatively expected parts is detected, an internal mechanism is catalysed that flags the body as ‘suspicious’ and as a potential security threat needing further inspection, in order to neutralise said threat potential to national security. Upon walking into the scanner, trans bodies become dematerialised as flesh and reassembled into misrepresenting code that the algorithm reads and flags as deceptive, based on an encoded template corresponding strictly to that of a normative cisgender body as the location from which everything else is rendered a dangerous deficit to (inter)national security (Beauchamp 2019; Clarkson 2019; Currah and Mulqueen 2011; Hall and Clapton 2021; Quinan 2023). This encounter remediates the relationship between bodies and algorithms, where the physical positioning of the trans body in the scanner triggers its rendering as a ‘risky threat’ for not correlating to the programmed binarity of the system. As Drage and Frabetti (2024) note, this threat “is often rendered analogous to the concealed sex/gender of a trans person in airport security who must be “outed” and surveilled to maintain public safety” (90), making the trans body the deliberate target object through which political and affective senses of proximity to national security are mediated and maintained. As Wilcox (2017) poignantly argues in relation to the attachment of ‘threat’ to the trans body, “The construction of certain bodies as threatening is thus less a matter of what is known about them than a desire to make bodies into what we already know they must be” (22). Trans bodies must be deviant threats for regimes of security to be maintained. However, despite this technical rendering of the trans body as a threat in the automatic comparison to the constructed safety of the cisgender body, accentuated by strands of trans and queer scholarship, the trans body catalyses an alternative form of embodiment that challenges the system. To the system, the material tactility of the trans body forwards a liminal distance that halts the body in proximity to the algorithmic operations of locating (in)security. Within this operation, transness persists as an embodiment that demands an impossible comprehension, dissolving the system’s appearance of perfectibility by exposing its insufficient comprehension of human bodies. This redefinition reveals the algorithmic fragility and proneness to cracks of technologies that fail precisely “at the task which they have been set: to read the body perfectly” (Magnet 2011, 50), suggesting that the ways in which trans bodies are perceived as illegible are framed as endangering not only public and national security, but also the very reliability and accuracy of algorithmic surveillance technologies themselves, as with this example of the body scanner. The embodiment emerging from the trans body complicates this encoded binary body template and reorients the algorithmic imaginaries of the body itself. This liminality of trans data lives means that they are simultaneously misrecognised, while also exceeding the computational bounds of algorithms and the codified idea of the human body.
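As a schematic illustration of the flagging mechanism described above, the sketch below models the operator-selected binary template and the per-region deviation check that scholarship on millimetre-wave scanners describes (Beauchamp 2019; Costanza-Chock 2018). It is an assumption about the general logic, not any vendor’s code; the region names, template values and tolerance are hypothetical.

from dataclasses import dataclass

# Exactly two stored 'expected body' outlines, selected by the operator's
# binary gender button; the per-region values are illustrative, not real data.
TEMPLATES = {
    "M": {"chest": 0.2, "groin": 0.6, "hips": 0.3},
    "F": {"chest": 0.6, "groin": 0.2, "hips": 0.6},
}
TOLERANCE = 0.15  # hypothetical deviation allowed before a region is flagged

@dataclass
class ScreeningResult:
    flagged_regions: list

    @property
    def cleared(self) -> bool:
        # The only 'OK' body is one with no deviation from the template.
        return not self.flagged_regions

def screen(scan: dict, operator_button: str) -> ScreeningResult:
    # Flag every body region whose measurement deviates from the selected
    # binary template by more than the tolerance; no third template exists.
    template = TEMPLATES[operator_button]
    flagged = [region for region, expected in template.items()
               if abs(scan.get(region, 0.0) - expected) > TOLERANCE]
    return ScreeningResult(flagged)

# A body read against either single binary template is rendered as a
# cluster of anomalies rather than as a person.
result = screen({"chest": 0.6, "groin": 0.6, "hips": 0.6}, operator_button="M")
print(result.flagged_regions)  # ['chest', 'hips'], flagged as 'suspicious'

Because only two templates exist, any body is legible to the system solely as conformity to, or anomalous deviation from, one of them; the design admits no outcome in-between, which is exactly the liminal space this article locates.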
Similar to the ways in which trans faces reveal further configurations of in/visibility and spatiotemporal dynamics within facial recognition technologies, trans bodies disrupt and unveil the artificiality of the cisgender body as the default body template programmed into these scanners, and aesthetically stretch the boundaries of what it means to algorithmically ‘know a body’. This trans-aesthetic expansion of what it means to produce knowledge about bodies and render them knowable challenges the normatively embedded procedures through which bodies are visually and politically valued, alongside the processes through which certain bodies are—without exception or much tactical questioning—synthetically constructed as permissible. The pressing question then is: how might this aesthetic of trans bodies be made productive in altering the desire to make certain bodies known, and what it means to algorithmically ‘know a body’? Crucially, as argued by Os Keyes, “if these systems cannot conceptualize that you exist, then you are essentially living in a space that is constantly misgendering you and informing you that you’re not real” (cited in Cockerell 2021). Together, the algorithmic technologies brought forward by this article highlight the shared, systematic algorithmic violation of trans bodies and showcase the inherent tension of the liminality embodied by trans data lives through their entanglement with and refusal of binary code. Resistance to these encoded modes of unliveability begins at the exact point of exposing the instability of the categories from which trans lives—through the flesh and through data—are dismissed and rejected due to the binary logics that undergird and effectuate the functionality and operations of algorithmic technologies. The liminality of trans data lives allows for an ‘aesthetic trick’ of—within the very acts of being positioned as targets for erasure and exclusion—confronting the gaze of the code and slipping through the systems. By attending to this simultaneous reality of trans data lives, it becomes clear how trans lives are at once positioned for violent exposure by algorithmic code, yet defy these bounds through a distance to algorithms as a way of living anyway in-between the coded lines.
Conclusion
Despite the global claim, delivered by Big Tech companies, nation states and far-right lobbying efforts, that algorithms are revolutionary and possess an unprecedented perfectibility to improve human lives, trans lives effectively locate unexpected, oppositional and unsolvable flaws in the binary code; flaws that embrace the fluidity, instability and messiness of gender beyond the colonial binary encoded into the fabric of algorithms, and that expose the limitations of algorithms for computing and comprehending life beyond the default white cisgender human that cements hierarchies of humanness. Fundamentally, algorithmic technologies “echo the imperialist ideologies that underpinned the development of physiognomy and other scientific projects of classification, meaning that these contemporary technologies have the potential to reify racist, sexist, and cisnormative beliefs and practices” (Scheuerman et al. 2021, 2), which vindicate and reinforce global political imaginaries of colonial power intended to strengthen prior practices of exclusion through algorithmic force.
Theorising the aesthetics of trans lives as liminal data lives directs critical attention to the ways in which the apparent uncodeability of transness in binary algorithmic technologies interacts, interferes and simmers distractingly in-between the coded lines of algorithmic assemblages, at once producing performative effects of violence and disruption located at the very trans bodies that algorithms cannot comprehend. This redirected attention not only disrupts popular narratives of algorithms as hegemonic and neutral, but also advances queer and trans scholarship on glitches and errors to consider the liminality of trans data lives, as they reveal crucial cracks, faults and flaws in the systems that can be utilised strategically to resist modes of algorithmic violence by establishing distance, while in proximity, from the lived experiences of uncodeability by design. It is this trap that trans people find themselves in and inhabit as a liminal space, where they refuse, trouble, and distort algorithmic infrastructures. By doing so, transness, as digital flesh, embodies a lived contrast and differentiating relationship to the algorithmic rendering of life by occupying a spatiotemporal position at both sides of the threshold of algorithmic code: cutting over, falling through, and obscuring the binary flows of code and confusing their anticipated technical outcomes. These errors generate an intricate relationship between trans bodies and algorithms, one perpetually in proximity, but always at a distance.
Situated at this contemporary inception, the questions for future research become: which imaginaries, thresholds, distances and embodied forms of resistance can the digital fleshiness of trans bodies and their lives, as inherently situated between the (im)possible, between (in)visibility, (un)codeability and (un)liveability, unveil and produce for curating fugitive procedures and operations against algorithmic violence and subverting the binary gaze of life? How can the potential of trans data lives be utilised to envision and engineer trans and gender affirming algorithmic technologies and imaginaries that do not limit, but rather multiply, the lived realities outside of the binary restrictions and technical confinements of current sociopolitical systems? Looking into the digital future, how can exploring and speculating with the aesthetic facets of the sociotechnical uncodeability and liminality of trans data lives work as a critical practice towards building and achieving algorithmic justice? As this article grapples with the possibility of creating distance to algorithmic technologies while simultaneously always already being entangled with and existing in proximity to them, it calls for future interventions looking at how this tensional space embedded in the liminality of trans data lives can be made productive from the situated and embodied perspectives of trans lives themselves against algorithmic technologies.
Acknowledgements
This work was undertaken as part of my PhD at the University of Cambridge, following the transmediale x DARC Proximity/Distance Research Workshop in January 2025; thanks in part to Pembroke College Cambridge for funding my conference trip to Berlin and to the Cambridge Trust for generously funding my PhD through the Cambridge International Scholarship. I also want to thank Pablo Velasco, Magdalena Tyżlik-Carver, Christian Ulrik Andersen, Jussi Parikka and Søren Pold for organising the transmediale workshop, along with directing a special gratitude to Pablo, Magda and my reviewer for supporting the editorial work of this article.
Bibliography
Amaro, Ramon. The Black Technical Object: On Machine Learning and the Aspiration of Black Being. Sternberg Press, 2022.
Amnesty International UK. "Denmark: New Report - Mass Surveillance and Discrimination in Automated Welfare State." Amnesty International, 13 Nov. 2024, www.amnesty.org.uk/press-releases/denmark-new-report-mass-surveillance-and-discrimination-automated-welfare-state.
Amnesty International. “EU: AI Act at risk as European Parliament may legitimize abusive technologies.” European Institutions Office, 12 June 2023, www.amnesty.eu/news/eu-ai-act-at-risk-as-european-parliament-may-legitimize-abusive-technologies.
Amoore, Louise. "Algorithmic war: Everyday geographies of the war on terror." Antipode 41.1 (2009): 49-69. https://doi.org/10.1111/j.1467-8330.2008.00655.x
Amoore, Louise. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press, 2020.
Andersen, Christoffer Koch. "Wrapped Up in the Cis-Tem: Trans Liveability in the Age of Algorithmic Violence." Atlantis: Critical Studies in Gender, Culture & Social Justice 46.1 (2025a): 24-41. https://atlantisjournal.ca/index.php/atlantis/article/view/5790
Andersen, Christoffer Koch. "Beyond Fairness: Trans Unliveability in European Algorithmic Assemblages." European Workshop on Algorithmic Fairness. PMLR 294 (2025b): 295-302. https://proceedings.mlr.press/v294/andersen25a
Beauchamp, Toby. Going Stealth: Transgender Politics and US Surveillance Practices. Duke University Press, 2019.
Bridges, Lauren E. "Digital failure: Unbecoming the 'good' data subject through entropic, fugitive, and queer data." Big Data & Society 8.1 (2021). https://doi.org/10.1177/2053951720977882
Buolamwini, Joy. "Civil Rights Implications of the Federal Use of Facial Recognition Technology." Algorithmic Justice League, 8 March 2024, https://www.ajl.org/civil-rights-commission-written-testimony.
Campanioni, Chris. "The glitch of biometrics and the error as evasion: The subversive potential of self-effacement." Diacritics 48.4 (2020): 28-51. https://dx.doi.org/10.1353/dia.2020.0028
Chevillon, Guillaume. "The Queer Algorithm." Available at SSRN 4742138 (2024). http://dx.doi.org/10.2139/ssrn.4742138
Chun, Wendy Hui Kyong. "Introduction: Race and/as technology; or, how to do things to race." Camera Obscura 24.1 (70) (2009): 7-35. https://doi.org/10.1215/02705346-2008-013
Clarkson, Nicholas L. "Incoherent assemblages: Transgender conflicts in US security." Surveillance & Society 17.5 (2019): 618-630. https://doi.org/10.24908/ss.v17i5.12946
Cockerell, Isobel. "Facial recognition systems decide your gender for you. Activists say it needs to stop." Rappler, 2021. https://www.rappler.com/technology/features/facial-recognition-automated-gender-coda-story
Costanza-Chock, Sasha. "Design justice, AI, and escape from the matrix of domination." Journal of Design and Science 3.5 (2018): 1-14. https://doi.org/10.21428/96c8d426
Crawford, Kate. "Halt the use of facial-recognition technology until it is regulated." Nature 572.7771 (2019): 565-566. https://www.nature.com/articles/d41586-019-02514-7
Currah, Paisley, and Tara Mulqueen. "Securitizing gender: Identity, biometrics, and transgender bodies at the airport." Social Research: An International Quarterly 78.2 (2011): 557-582. https://dx.doi.org/10.1353/sor.2011.0030
Danielsson, Karin, et al. "Queer Eye on AI: binary systems versus fluid identities." Handbook of Critical Studies of Artificial Intelligence. Edward Elgar Publishing, 2023. 595-606. https://doi.org/10.4337/9781803928562.00061
Danish Institute for Human Rights (DIHR). “Facial Recognition to Combat Crime.” The Danish Institute for Human Rights, 20 Feb. 2020, www.humanrights.dk/publications/facial-recognition-combat-crime.
Dixon-Román, Ezekiel. "Algo-ritmo: More-than-human performative acts and the racializing assemblages of algorithmic architectures." Cultural Studies ↔ Critical Methodologies 16.5 (2016): 482-490. https://doi.org/10.1177/1532708616655769
Drage, Eleanor, and Federica Frabetti. "Copies without an original: the performativity of biometric bordering technologies." Communication and Critical/Cultural Studies 21.1 (2024): 79-97. https://doi.org/10.1080/14791420.2023.2292493
Elwood, Sarah. "Digital geographies, feminist relationality, Black and queer code studies: Thriving otherwise." Progress in Human Geography 45.2 (2021): 209-228. https://doi.org/10.1177/030913251989973
Fuller, Matthew, and Eyal Weizman. Investigative Aesthetics: Conflicts and Commons in the Politics of Truth. Verso Books, 2021.
Gaboury, Jacob. "Critical unmaking: Toward a queer computation." The Routledge Companion to Media Studies and Digital Humanities. Routledge, 2018. 483-491.
Guo, E. "The US Wants to Use Facial Recognition to Identify Migrant Children as They Age." MIT Technology Review, 19 Aug. 2024, www.technologyreview.com/2024/08/14/1096534/homeland-security-facial-recognition-immigration-border.
Hall, Lucy B., and William Clapton. "Programming the machine: gender, race, sexuality, AI, and the construction of credibility and deceit at the border." Internet Policy Review 10.4 (2021): 1-23. https://doi.org/10.14763/2021.4.1601
Harding, Xavier. “Facial Recognition Bias: Why Racism Appears in Face Detection Tech.” Mozilla Foundation, 7 Aug. 2023, www.mozillafoundation.org/en/blog/facial-recognition-bias.
Hicks, Mar. "Hacking the Cis-tem." IEEE Annals of the History of Computing 41.1 (2019): 20-33. https://doi.org/10.1109/MAHC.2019.2897667
Hoffmann, Anna Lauren. "Terms of inclusion: Data, discourse, violence." New Media & Society 23.12 (2021): 3539-3556. https://doi.org/10.1177/1461444820958725
Hu, Tung-Hui. A Prehistory of the Cloud. MIT Press, 2015.
Keyes, Os, and Jeanie Austin. "Feeling fixes: Mess and emotion in algorithmic audits." Big Data & Society 9.2 (2022): 1-12. https://doi.org/10.1177/20539517221113772
Keyes, Os. "The misgendering machines: Trans/HCI implications of automatic gender recognition." Proceedings of the ACM on Human-Computer Interaction 2.CSCW (2018): 1-22. https://doi.org/10.1145/3274357
Leszczynski, Agnieszka, and Sarah Elwood. "Glitch epistemologies for computational cities." Dialogues in Human Geography 12.3 (2022): 361-378. https://doi.org/10.1177/20438206221075714
Lyon, David. "Technology vs 'terrorism': circuits of city surveillance since September 11th." International Journal of Urban and Regional Research 27.3 (2003): 666-678. https://doi.org/10.1111/1468-2427.00473
Magnet, Shoshana Amielle. When Biometrics Fail: Gender, Race, and the Technology of Identity. Duke University Press, 2011.
Minh-ha, Trinh T. "The image and the void." Journal of Visual Culture 15.1 (2016): 131-140. https://doi.org/10.1177/1470412915619458
Opiah, A. "Biometric Surveillance Testing Data Sparks Ethics Concerns in Germany." Biometric Update, 13 May 2024, www.biometricupdate.com/202405/biometric-surveillance-testing-data-sparks-ethics-concerns-in-germany.
Parikka, Jussi. Operational Images: From the Visual to the Invisual. University of Minnesota Press, 2023.
Pugliese, Joseph. "In silico race and the heteronomy of biometric proxies: Biometrics in the context of civilian life, border security and counter-terrorism laws." Australian Feminist Law Journal 23.1 (2005): 1-32. https://doi.org/10.1080/13200968.2005.10854342
Quinan, C. L. "Biometric Technologies, Gendered Subjectivities and Artistic Resistance." Rethinking Identities Across Boundaries: Genders/Genres/Genera. Cham: Springer International Publishing, 2023. 21-41. https://doi.org/10.1007/978-3-031-40795-6_2
Raj, Arushi, and Fatima Juned. "Gendered identities and digital inequalities: An exploration of the lived realities of the transgender community in the Indian digital welfare state." Gender & Development 30.3 (2022): 531-549. https://doi.org/10.1080/13552074.2022.2131250
Rauchberg, Jessica Sage. "#Shadowbanned: Queer, Trans, and Disabled creator responses to algorithmic oppression on TikTok." LGBTQ Digital Cultures. Routledge, 2022. 196-209.
Richardson, Michael. Nonhuman Witnessing: War, Data, and Ecology After the End of the World. Duke University Press, 2024.
Russell, Legacy. Glitch Feminism: A Manifesto. Verso Books, 2020.
Scheuerman, Morgan Klaus, Madeleine Pape, and Alex Hanna. "Auto-essentialization: Gender in automated facial analysis as extended colonial project." Big Data & Society 8.2 (2021): 1-15. https://doi.org/10.1177/20539517211053712
Shabbar, Andie. "Queer-Alt-Delete: glitch art as protest against the surveillance cis-tem." WSQ: Women's Studies Quarterly 46.3 (2018): 195-211. https://dx.doi.org/10.1353/wsq.2018.0039
Shachar, Ayelet, and Aaqib Mahmood. "The body as the border." Historical Social Research/Historische Sozialforschung 46.3 (2021): 124-150.
Shah, Nishant. "I spy, with my little AI: How queer bodies are made dirty for digital technologies to claim cleanness." Queer Reflections on AI. Routledge, 2023. 57-72.
Shelton, Jama, et al. "Digital technologies and the violent surveillance of nonbinary gender." Journal of Gender-Based Violence 5.3 (2021): 517-529. https://doi.org/10.1332/239868021X16153783053180
Thach, Hibby, Samuel Mayworm, Michaelanne Thomas, and Oliver L. Haimson. "Trans-centered moderation: Trans technology creators and centering transness in platform and community governance." Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency (2024): 326-336.
Thieme, Katja, Mary Ann S. Saunders, and Laila Ferreira. "From language to algorithm: trans and non-binary identities in research on facial and gender recognition." AI and Ethics 5.2 (2025): 991-1008. https://doi.org/10.1007/s43681-023-00375-5
Wagner, A. "AI Facial Recognition Surveillance in the UK." Tech Policy Press, 22 Oct. 2024, www.techpolicy.press/ai-facial-recognition-surveillance-in-the-uk.
Waldman, Ari Ezra. "Gender data in the automated administrative state." Columbia Law Review 123.8 (2023): 2249-2320. https://columbialawreview.org/content/gender-data-in-the-automated-administrative-state
Weheliye, Alexander Ghedi. Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human. Duke University Press, 2014. https://doi.org/10.1515/9780822376491
Wilcox, Lauren B. Bodies of Violence: Theorizing Embodied Subjects in International Relations. Oxford University Press, 2015.
Wilcox, Lauren B. "Embodying algorithmic war: Gender, race, and the posthuman in drone warfare." Security Dialogue 48.1 (2017): 11-28. https://doi.org/10.1177/0967010616657947
Wilson, N. "Plan to Slash Lengthy Passport Queues With New Facial Recognition Scans." The Independent, 18 Mar. 2025, www.independent.co.uk/travel/news-and-advice/facial-recognition-scans-passports-ports-b2717154.html.
Wynter, Sylvia. "Human being as noun? Or being human as praxis? Towards the autopoetic turn/overturn: A Manifesto." 2007. https://bcrw.barnard.edu/wp-content/uploads/2015/10/Wynter_TheAutopoeticTurn.pdf
Yoon, Hyungjoo. "Digital flesh: A feminist approach to the body in cyberspace." Gender and Education 33.5 (2021): 578-593. https://doi.org/10.1080/09540253.2020.1802408