Ruben - Perplexity. Surveilling Through Indifference

From CTPwiki

Camera surveillance has become ubiquitous in public space. Its effects are often understood through Foucault’s description of the panopticon[1]: regardless of whether an observer is actually present to monitor its subjects, the mere idea of being observed keeps people in check. Applied to surveillance in public space, the self-discipline implied by the panopticon suggests that people would bow their heads and walk the line. In practice, however, most people seem not to care. Even I, a researcher of algorithmic security, shrug at the cameras when I routinely cross the train station, worrying only about catching the next train home. Where, then, does this indifference leave the critique of algorithmic surveillance?

The past years have seen the introduction of algorithmic techniques in observer rooms, intended to guide the operator’s eyes by singling out particular behaviours. By examining how these contemporary technologies negotiate deviancy and normality, I propose to rethink the subject under surveillance.

[Pablo] "… I particularly like how your approach questions the role of the "average citizen" in this power relation, beyond the duality of victim and guilt (or indifference, but I would question that this is the right word for our attitude)."

Algorithmic anticipation

In surveillance, “deviancy” and “anomaly” serve as catch-all categories for any unexpected behaviour. Security practitioners consider noticing such behaviour an art[2][3]. Working through anticipation, practitioners relate “almost in a bodily, physical manner with ‘risky’ and ‘at risk’ groups”[4] to mark people as ‘out of place’.

[Maya] "What made me contemplate is, how should we comprehend this ambiguity of gaze? For example, in the feminist theory, gaze is often discussed as a form of objectification (referred to as male gaze) therefore as a harmful one. […] However, "gaze" also entails care, affection, curiosity, or even desire. I wonder what transforms the same gaze from fatal one to care-full one. […] Indifference (the lack of attention or care) could be a powerful weapon to enforce the larger sociopolitical control."

With the introduction of algorithmic deviancy scoring, however, the construction of anticipation needs to be reconsidered. A traditional machine learning detector, trained by example, struggles with an open category like deviancy, which collapses heterogeneous behaviours — robbery, traffic accidents, etc. — and includes all that is unknown. Moreover, far more footage is available of people going about their business than of “deviant” behaviour. To overcome these limitations, a logical reversal is invoked.

Instead of detecting deviancy, normality is predicted. Trained on “normal” data, a generative model uses past measurements to simulate the present. These nowcasts are then used to assess the likelihood of present movements. The resulting unpredictability score resembles a metric known as perplexity, which has become a prominent error metric for assessing the futures brought forth by generative algorithms — large language models in particular. Applied to surveillance, however, it is no longer the algorithm that errs but the human who is deemed unpredictable. As the present is governed through perplexity, the anomalous is no longer defined by proximity to a predefined ‘risky’ other, but by a measured distance from a simulated normality.
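The logical reversal can be sketched in miniature. The following toy example is my own illustration, not a description of any deployed system: movement through space is discretised into zones, a simple first-order Markov model stands in for the generative model of normality, and a trajectory's unpredictability is scored as its perplexity, i.e. the exponential of its mean negative log-likelihood under that model. All function names and zone labels are hypothetical.

```python
import math
from collections import Counter, defaultdict

def fit_transitions(trajectories, smoothing=1.0):
    """Fit a first-order Markov model of movement: estimate
    P(next_zone | current_zone) from trajectories of "normal" behaviour,
    with add-one smoothing so unseen moves keep a small, nonzero probability."""
    counts = defaultdict(Counter)
    zones = set()
    for traj in trajectories:
        zones.update(traj)
        for a, b in zip(traj, traj[1:]):
            counts[a][b] += 1
    zones = sorted(zones)
    probs = {}
    for a in zones:
        total = sum(counts[a].values()) + smoothing * len(zones)
        probs[a] = {b: (counts[a][b] + smoothing) / total for b in zones}
    return probs

def perplexity(traj, probs):
    """Score a trajectory by its perplexity under the model of normality:
    exp of the mean negative log-likelihood of each observed step.
    Routine movement scores near 1; unpredictable movement scores high."""
    nll = [-math.log(probs[a][b]) for a, b in zip(traj, traj[1:])]
    return math.exp(sum(nll) / len(nll))

# Twenty commuters looping through the same three zones constitute "normality".
normal = [list("ABCABC") for _ in range(20)]
model = fit_transitions(normal)

routine = perplexity(list("ABCAB"), model)   # follows the usual path
deviant = perplexity(list("ACBAC"), model)   # walks the loop backwards
```

High scores flag not resemblance to a known "deviant" class but distance from the learned routine: the backwards walk is anomalous only because twenty commuters made the forward loop normal.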

Routines

Perplexity makes apparent how surveillance capitalises on our day-to-day routines. As we travel set paths through streets, train stations and parks, we co-produce the backdrop of normality against which anomalous movement stands out[5] [6]. No longer is there a clear demarcation between those in the panopticon's tower and those in prison cells. While everyone is watched by surveillance, the majority is not targeted. Rather, they are complicit in constituting normality.

Rethinking the relation between normalcy and deviancy opens up new avenues for resistance. As Michel de Certeau[7] reminds us, walking not only affirms and respects, it can also try out and transgress. In the reciprocal relationship between individual and population, “every action … necessarily destroys the whole pattern in whose frame the prediction moves and where it finds its evidence.” [8] Collectively, we can make normality more unpredictable.

[Daria] "Your last paragraph inspired me to imagine forms of unpredictability. This brought to mind Trisha Brown’s experiments [which seem to be inspired by de Certeau]. However, I wonder: do we by 'making normality more unpredictable' allow AI systems to absorb deviance into the framework of the 'normal'?"

  1. Foucault, Michel. 1977. Discipline and punish: the birth of the prison. 1st American ed. New York: Pantheon Books.
  2. Amicelle, Anthony, and David Grondin. 2021. “Algorithms as Suspecting Machines: Financial Surveillance for Security Intelligence.” In Big Data Surveillance and Security Intelligence: The Canadian Case, edited by David Lyon and David Murakami Wood. Vancouver: UBC Press.
  3. Norris, Clive, and Gary Armstrong. 2010. The Maximum Surveillance Society: The Rise of CCTV. Repr. Oxford: Berg.
  4. Bonelli, Laurent, and Francesco Ragazzi. 2014. “Low-Tech Security: Files, Notes, and Memos as Technologies of Anticipation.” Security Dialogue 45 (5): 476–93. https://doi.org/10.1177/0967010614545200.
  5. Pasquinelli, Matteo. 2015. “Anomaly Detection: The Mathematization of the Abnormal in the Metadata Society.” Panel presentation presented at the Transmediale festival, Berlin, Germany.
  6. Canguilhem, Georges. 1978. On the Normal and the Pathological. Dordrecht: Springer Netherlands. https://doi.org/10.1007/978-94-009-9853-7.
  7. Certeau, Michel de. 1984. The Practice of Everyday Life. Translated by Steven Rendall. Berkeley: University of California Press.
  8. Arendt, Hannah. 1970. On Violence. New York: Harcourt, Brace & World.