Perplexity. Surveilling Through Indifference

From CTPwiki. Written by Ruben.
Revision as of 13:26, 30 January 2025

Camera surveillance has become ubiquitous in public space. Its effects are often understood through Foucault’s description of the panopticon[1]. Regardless of whether an observer is present to monitor its subjects, the mere idea of being observed keeps people in check. When discussing surveillance in public space, the self-disciplining implied by the panopticon suggests people would bow their heads and walk the line. In practice, however, most people seem not to care. Even I, a researcher of algorithmic security, shrug at the cameras when I routinely cross the train station, worrying only about catching the next train home. Where, then, does indifference leave the critique of algorithmic surveillance?

The past years have seen the introduction of algorithmic techniques into observation rooms, meant to guide the operator’s eyes by singling out particular behaviours. By examining how these contemporary technologies negotiate deviancy and normality, I propose to rethink the subject under surveillance.

"... I particularly like how your approach questions the role of the "average citizen" in this power relation, beyond the duality of victim and guilt (or indifference, but I would question that this is the right word for our attitude)." - Pablo

Algorithmic anticipation

In surveillance, “deviancy” and “anomaly” serve as catch-all categories for any unexpected behaviour. Security practitioners consider noticing such behaviour an art[2][3]. Working through anticipation, practitioners relate “almost in a bodily, physical manner with ‘risky’ and ‘at risk’ groups”[4] to mark people as ‘out of place’.

With the introduction of algorithmic deviancy scoring, however, the construction of anticipation needs to be reconsidered. A traditional machine-learning detector, trained by example, struggles with an open category like deviancy, which collapses heterogeneous behaviours — robbery, traffic accidents, etc. — and includes all that is unknown. Moreover, far more footage is available of people going about their business than of “deviant” behaviour. To overcome these limitations, a logical reversal is invoked.

Instead of detecting deviancy, normality is predicted. Trained on “normal” data, a generative model uses past measurements to simulate the present. These nowcasts are then used to assess the likelihood of present movements. This unpredictability score resembles a metric known as perplexity, which has become a prominent error metric for assessing the futures brought forth by generative algorithms — large language models in particular. However, it is no longer the algorithm that errs but the human who is deemed unpredictable. As the present is governed through perplexity, the anomalous is no longer considered as proximity to a predefined ‘risky’ other, but as a measured distance from a simulated normality.
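The reversal can be illustrated with a toy calculation. Perplexity is the exponential of the average negative log-likelihood a model assigns to a sequence of observations; high perplexity means the model failed to predict what it saw. The step probabilities, the threshold, and the framing of trajectories as step sequences below are invented for illustration — a minimal sketch, not the workings of any deployed system.

```python
import math

def perplexity(step_probabilities):
    """Perplexity = exp of the average negative log-likelihood.
    High values mean the model found the sequence unpredictable."""
    n = len(step_probabilities)
    nll = -sum(math.log(p) for p in step_probabilities) / n
    return math.exp(nll)

# Toy nowcasts: the probability a "normality" model assigns to each
# observed step of two walks through a train station (invented values).
routine_walk = [0.9, 0.8, 0.85, 0.9]   # movement the model anticipated
unusual_walk = [0.9, 0.05, 0.1, 0.02]  # movement the model did not predict

score_routine = perplexity(routine_walk)   # ~1.16
score_unusual = perplexity(unusual_walk)   # ~10.3

# The logical reversal: deviancy is never detected directly. Whoever the
# model fails to predict is the one flagged as anomalous.
THRESHOLD = 3.0  # illustrative cut-off
assert score_routine < THRESHOLD < score_unusual
```

Note the asymmetry this buys: the model only ever needs footage of routine behaviour to train on, which is exactly the footage that is abundant; the open-ended category of deviancy is left as whatever falls outside the simulation.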

"What made me contemplate is, how should we comprehend this ambiguity of gaze? For example, in the feminist theory, gaze is often discussed as a form of objectification (referred to as male gaze) therefore as a harmful one. And as you mentioned, the surveillance system certainly marginalises and eliminates people deemed "unpredictable" from the society through their gaze, thus I believe it could do a significant harm to the minority communities. However, essentially "gaze" also entails care, affection, curiosity, or even desire. I wonder what transforms the same gaze from fatal one to care-full one. Or, how could we reimagine the mechanical gaze (or algorithmic gaze) to uphold such attitudes? I believe this discussion revisits your discourse on indifference as well - indifference (the lack of attention or care) could be a powerful weapon to enforce the larger sociopolitical control." - Daria

Routines

Perplexity makes apparent how surveillance capitalises on our day-to-day routines. As we travel set paths through streets, train stations and parks, we co-produce the backdrop of normality against which anomalous movement stands out[5][6]. No longer is there a clear demarcation between those in the panopticon's tower and those in prison cells. While everyone is watched by surveillance, the majority is not targeted. Rather, they are complicit in constituting normality.

Rethinking the relation between normality and deviancy opens up new avenues for resistance. As Michel de Certeau[7] also reminds us, walking not only affirms and respects; it can also try out and transgress. In the reciprocal relationship between individual and population, “every action … necessarily destroys the whole pattern in whose frame the prediction moves and where it finds its evidence”[8]. Collectively, we can make normality more unpredictable.

"Your last paragraph inspired me to imagine forms of unpredictability. This brought to mind Trisha Brown’s experiments [which seem to be inspired by de Certeau]. However, I wonder: do we by 'making normality more unpredictable' allow AI systems to absorb deviance into the framework of the 'normal'?" - Daria

  1. Foucault, Michel. 1977. Discipline and Punish: The Birth of the Prison. 1st American ed. New York: Pantheon Books.

  2. Amicelle, Anthony, and David Grondin. 2021. “Algorithms as Suspecting Machines: Financial Surveillance for Security Intelligence.” In Big Data Surveillance and Security Intelligence: The Canadian Case, edited by David Lyon and David Murakami Wood. Vancouver: UBC Press.

  3. Norris, Clive, and Gary Armstrong. 2010. The Maximum Surveillance Society: The Rise of CCTV. Repr. Oxford: Berg.

  4. Bonelli, Laurent, and Francesco Ragazzi. 2014. “Low-Tech Security: Files, Notes, and Memos as Technologies of Anticipation.” Security Dialogue 45 (5): 476–93.

  5. Pasquinelli, Matteo. 2015. “Anomaly Detection: The Mathematization of the Abnormal in the Metadata Society.” Panel presentation at the Transmediale festival, Berlin, Germany.

  6. Canguilhem, Georges. 1978. On the Normal and the Pathological. Dordrecht: Springer Netherlands. https://doi.org/10.1007/978-94-009-9853-7.

  7. Certeau, Michel de. 1984. The Practice of Everyday Life. Translated by Steven Rendall. Berkeley: University of California Press.

  8. Arendt, Hannah. 1970. On Violence. New York: Harcourt, Brace & World. 7.