<div class="metadata">
== Perplexity — surveilling through surprise ==
'''Ruben van de Ven'''
</div>


Camera surveillance has become ubiquitous in public space. Its effects are often understood through Foucault’s description of the ''panopticon''<ref><small class="references csl-bib-body hanging-indent">Foucault, Michel. 1977. ''Discipline and Punish: The Birth of the Prison''. 1st American ed. New York: Pantheon Books.</small></ref>. Regardless of whether an observer is actually present to monitor its subjects, the mere idea of being observed keeps people in check. When discussing surveillance in public space, the self-disciplining implied by the panopticon suggests people would bow their heads and walk the line. In practice, however, most people seem not to care. Even I, a researcher of algorithmic security, shrug at the cameras when I routinely cross the train station, worrying only about catching the next train home. Where, then, does indifference leave the critique of algorithmic surveillance?


Recent years have seen the introduction of algorithmic techniques in observation rooms that are meant to guide the operator’s eyes by singling out particular behaviours.
By examining how these contemporary technologies negotiate deviancy and normality, I propose to rethink the subject under surveillance.


'''[Pablo]''' "… I particularly like how your approach questions the role of the "average citizen" in this power relation, beyond the duality of victim and guilt (or indifference, but I would question that this is the right word for our attitude)."


<span id="algorithmic-anticipation"></span>
== Algorithmic anticipation ==


In surveillance, “deviancy” and “anomaly” serve as catch-all categories for any unexpected behaviour. Security practitioners consider noticing such behaviour an art<ref><small class="references csl-bib-body hanging-indent">Amicelle, Anthony, and David Grondin. 2021. “Algorithms as Suspecting Machines: Financial Surveillance for Security Intelligence.” In ''Big Data Surveillance and Security Intelligence: The Canadian Case'', edited by David Lyon and David Murakami Wood. Vancouver; Toronto: UBC Press.</small></ref><ref><small class="references csl-bib-body hanging-indent">Norris, Clive, and Gary Armstrong. 2010. ''The Maximum Surveillance Society: The Rise of CCTV''. Repr. Oxford: Berg.</small></ref>.
Working through ''anticipation'', practitioners relate “almost in a bodily, physical manner with ‘risky’ and ‘at risk’ groups”<ref><small class="references csl-bib-body hanging-indent">Bonelli, Laurent, and Francesco Ragazzi. 2014. “Low-Tech Security: Files, Notes, and Memos as Technologies of Anticipation.” ''Security Dialogue'' 45 (5): 476–93. https://doi.org/10.1177/0967010614545200.</small></ref> to mark people as ‘out of place’.


'''[Maya]''' "What made me contemplate is, how should we comprehend this ambiguity of gaze? For example, in the feminist theory, gaze is often discussed as a form of objectification (referred to as male gaze) therefore as a harmful one. […] However, "gaze" also entails care, affection, curiosity, or even desire. I wonder what transforms the same gaze from fatal one to care-full one. […] Indifference (the lack of attention or care) could be a powerful weapon to enforce the larger sociopolitical control."


With the introduction of algorithmic deviancy scoring, however, the construction of anticipation needs to be reconsidered. A traditional machine learning detector, trained by example, struggles with an open category like deviancy, which collapses heterogeneous behaviours — robbery, traffic accidents, etc. — and includes all that is unknown. Moreover, much more footage is available of people going about their business than of “deviant” behaviour. To overcome these limitations, a logical reversal is invoked.


Instead of detecting deviancy, normality is predicted. Trained on “normal” data, a generative model uses past measurements to simulate the present. These nowcasts are then used to assess the likelihood of present movements.
This unpredictability score resembles a metric known as ''perplexity'', which has become a prominent error metric for assessing the futures brought forth by generative algorithms — large language models in particular. Applied to surveillance, however, it is no longer the algorithm that errs but the human that is deemed unpredictable.
As the present is governed through perplexity, the anomalous is no longer considered as proximity to a predefined ‘risky’ other, but as a measured distance from a simulated normality.
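
To make the metric concrete: for each ‘token’ in a series — a word in a sentence, or a step in a trajectory — the model assigns a probability to that token given what came before. Perplexity is the exponentiated average negative log-probability of the whole series:

<math>\mathrm{PPL}(x_{1:N}) = \exp\left(-\frac{1}{N}\sum_{i=1}^{N}\log p\left(x_i \mid x_{<i}\right)\right)</math>

A series the model finds entirely predictable scores close to 1; the less likely the movements, the higher the score. As an illustration only, here is a minimal sketch of such a scoring loop — assuming a hypothetical generative model that exposes a <code>step_probability(history, step)</code> method; no specific vendor system is implied:

<syntaxhighlight lang="python">
import math

def trajectory_perplexity(model, trajectory):
    """Score a walked trajectory against a model of 'normal' movement.

    `model.step_probability(history, step)` is assumed (hypothetically)
    to return p(step | history): the probability that the generative
    model, trained on routine footage, assigns to the next position
    given the positions observed so far.
    """
    if not trajectory:
        raise ValueError("empty trajectory")
    log_likelihood = 0.0
    for i, step in enumerate(trajectory):
        log_likelihood += math.log(model.step_probability(trajectory[:i], step))
    # Exponentiated average negative log-likelihood: the harder the
    # movement is to predict, the higher the perplexity.
    return math.exp(-log_likelihood / len(trajectory))

# A trajectory whose perplexity exceeds some chosen threshold is what
# would be flagged to the operator as "anomalous".
</syntaxhighlight>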


<span id="routines"></span>
== Routines ==


Perplexity makes apparent how surveillance capitalises on our day-to-day routines. As we travel set paths through streets, train stations and parks, we co-produce the backdrop of normality against which anomalous movement stands out<ref><small class="references csl-bib-body hanging-indent">Pasquinelli, Matteo. 2015. “Anomaly Detection: The Mathematization of the Abnormal in the Metadata Society.” Panel presentation at the Transmediale festival, Berlin, Germany.</small></ref><ref><small class="references csl-bib-body hanging-indent">Canguilhem, Georges. 1978. ''On the Normal and the Pathological''. Dordrecht: Springer Netherlands. https://doi.org/10.1007/978-94-009-9853-7.</small></ref>.
No longer is there a clear demarcation between those in the panopticon’s tower and those in its prison cells. While everyone is watched by surveillance, most are not targeted. Rather, they are complicit in constituting normality.


Rethinking the relation between normalcy and deviancy opens up new avenues for resistance. As Michel de Certeau<ref><small>Certeau, Michel de. (1984) 1988. ''The Practice of Everyday Life''. Translated by Steven Rendall. 1st paperback pr. Univ. of California Press.</small></ref> reminds us, walking not only affirms and respects; it can also try out and transgress. In the reciprocal relationship between individual and population, “every action … necessarily destroys the whole pattern in whose frame the prediction moves and where it finds its evidence.”<ref><small>Arendt, Hannah. 1970. ''On Violence''. New York: Harcourt, Brace &amp; World.</small></ref> Collectively, we can make normality more unpredictable.


'''[Daria]''' "Your last paragraph inspired me to imagine forms of unpredictability. This brought to mind Trisha Brown’s experiments [which seem to be inspired by de Certeau]. However, I wonder: do we by 'making normality more unpredictable' allow AI systems to absorb deviance into the framework of the 'normal'?"
 


<references/>




[[Category:emd]]