Perplexity — surveilling through indifference

Ruben van de Ven
Cameras have become ubiquitous in public space. In city centres, shopping malls or train stations, camera surveillance sets out to spot “deviant” behaviours in order to detect or pre-empt unwanted events. However, the growing number of cameras produces so much footage that there are not enough eyes to constantly monitor all video feeds; it is not uncommon for one person to be responsible for more than a hundred simultaneous streams. The past years have seen the introduction of algorithmic techniques in observer rooms, meant to guide the operator’s eyes by singling out particular behaviours. By critically examining how such algorithms make public space legible to an operator, we can rethink the relationship between surveiller and surveilled.
The effects of both camera surveillance and algorithmic surveillance are often understood in light of Foucault’s description of the panopticon (Foucault 1977). Most crucially, Foucault made palpable that it is not relevant whether an observer is actually present to monitor a subject; the mere idea of being observed is enough to keep people in check. The subject under surveillance internalizes the vision of the other. In the model of the panopticon, as subjects of an ever-expanding surveillance infrastructure we internalize its vision, bow our heads and walk the line. While the archetype of the panopticon provides a very useful description of the normalizing effects of many security practices, it is not without limitations (for example Lianos 2003; Haggerty 2006; Davidshofer, Jeandesboz, and Ragazzi 2017). When discussing surveillance in public space, the self-disciplining implied by the panopticon seems to tell only a partial story, for ultimately, most people simply do not care.
The vast majority cares as little about being watched by the state as about the data gathering of ad companies. If one were to ask a passer-by about camera surveillance, they might respond with surprise, or voice some obligatory concern, but it is seldom heartfelt. They go about their business. Already in his 1984 description of the “Wandersmänner” in public space, Michel de Certeau (1984) suggests that the “chorus of idle footsteps” traversing the city is largely indifferent to any top-down interference. Even I, a researcher of algorithmic security, shrug at cameras when I routinely cross the train station, worrying only about catching the next train home. Where, then, does indifference leave a critique of algorithmic surveillance?
Algorithmic anticipation
The indifference that the majority of the surveilled exhibit to being under surveillance implies that framing them as ‘victims’ tells only a partial story. Rather, by examining how contemporary technologies negotiate deviancy and normality, I propose a reconfiguration of the subject under surveillance.
In surveillance practices, the notions of “deviancy” and “anomaly” serve as a catch-all category for any unexpected behaviour. Spotting such behaviour is often considered an art: a “gut feeling” conditioned by experience, or a sharp eye that some have while others don’t (Norris and Armstrong 2010; Amicelle and Grondin 2021). While the threat models that warrant camera surveillance invoke the public’s safety from terrorism or ‘high-impact crimes’, everyday surveillance practice hardly mobilizes such possible future scenarios. Rather than relying on such risk technologies, security scholars Bonelli and Ragazzi (2014) argue, security practitioners work through anticipation, relating “almost in a bodily, physical manner with ‘risky’ and ‘at risk’ groups” to mark people as ‘out of place’.
With the introduction of algorithmic deviancy scoring, the construction of anticipation needs to be reconsidered. A traditional machine learning detector is trained by example, but such a setup struggles when it comes to deviancy. First, there is far more footage available of people going about their business than of the behaviours relevant to an operator, such as fighting. Second, as an open set, the anomaly collapses heterogeneous behaviours — robbery, traffic accidents, etc. — into a single category, making it difficult for a mathematical model to converge. To overcome these challenges, a logical reversal is invoked: instead of detecting deviancy, normality is measured. Trained on vast quantities of “normal” data, a generative model uses past measurements to simulate the present. These forecasts (or, nowcasts) are then used to assess the likelihood of present movements. The anomalous is thus no longer considered in terms of proximity to a predefined ‘risky’ other, but as a measured distance from a simulated normality.
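This logical reversal can be sketched in a few lines of code. The model, trajectories and threshold below are hypothetical illustrations, not the systems deployed in actual control rooms: a simple first-order model is fitted only on “normal” movement sequences, and whatever it fails to anticipate is flagged.

```python
from collections import Counter, defaultdict

class NormalityModel:
    """Toy stand-in for a generative model of 'normal' movement:
    a first-order model over discretized steps (e.g. compass moves)."""

    def __init__(self, smoothing=0.01):
        self.counts = defaultdict(Counter)  # previous step -> next-step counts
        self.smoothing = smoothing
        self.vocab = set()

    def fit(self, trajectories):
        # Trained only on footage of people "going about their business";
        # no examples of deviance are needed.
        for traj in trajectories:
            self.vocab.update(traj)
            for prev, step in zip(traj, traj[1:]):
                self.counts[prev][step] += 1
        return self

    def step_prob(self, prev, step):
        # Smoothed probability of a step given the previous one.
        c = self.counts[prev]
        total = sum(c.values())
        return (c[step] + self.smoothing) / (total + self.smoothing * len(self.vocab))

    def flag(self, traj, threshold=0.05):
        # The reversal: no model of deviance, only distance from normality.
        return [(prev, step) for prev, step in zip(traj, traj[1:])
                if self.step_prob(prev, step) < threshold]

normal = [["N", "N", "E", "E"], ["N", "N", "E", "E"], ["N", "E", "E", "E"]]
model = NormalityModel().fit(normal)
print(model.flag(["N", "N", "E", "E"]))  # routine path: []
print(model.flag(["N", "S", "E"]))       # the unforeseen turn: [('N', 'S')]
```

Note that the routine trajectory raises no flags not because it was labelled safe, but because the model has seen enough of it to predict it.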
This unpredictability score resembles a metric known as perplexity. Perplexity, a concept from information theory originally introduced in the context of speech recognition, has become a prominent error metric for assessing the futures brought forth by generative algorithms — large language models in particular. For each ‘token’ in a series — whether a word in a sentence or a step in a trajectory — the probability of that token given what came before is calculated. As a measure of surprise, perplexity is the logical inversion of algorithmic anticipation.
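As a rough illustration of the metric (the probabilities below are invented), perplexity is the exponential of the average negative log-probability a model assigned to each token: a perfectly predicted sequence scores the minimum of 1, and the score grows as the model is surprised.

```python
import math

def perplexity(token_probs):
    """Perplexity of a sequence, given the probability the model
    assigned to each token conditioned on what came before:
    exp of the average negative log-probability."""
    avg_neg_logprob = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_logprob)

# A fully anticipated sequence scores the minimum of 1.0 ...
print(perplexity([1.0, 1.0, 1.0]))
# ... while a single unforeseen step drives the 'surprise' up,
# whether the tokens are words in a sentence or steps in a trajectory.
print(perplexity([0.9, 0.8, 0.9]))   # routine movement: close to 1
print(perplexity([0.9, 0.01, 0.9]))  # one unexpected step: markedly higher
```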
Routines
With perplexity, the present is governed through simulation. This simulation forfeits any relationship with a predefined ‘risky’ other; deviance is instead defined through a degree of predictability — that is, against a representation of normality. The failure to predict, an erroneous forecast, is no longer a bug to be solved, but has become a feature. By subduing human steps to a model of their likelihood, it is no longer the algorithm that errs but the human that is deemed unpredictable.
For de Certeau (1984), the trajectories of the Wandersmänner elude legibility. With perplexity, it is precisely this lack of legibility that becomes an indicator of suspicion. In our day-to-day routines, we travel set paths through streets, train stations and parks. Through perplexity, surveillance capitalises on these movements, as we co-produce the backdrop of normality against which anomalous movement stands out (see also Pasquinelli 2015; Canguilhem 1978). There is no outside to surveillance. In the production of perplexity, everyone is implicated.
Here the limits of the panopticon as a model for surveillance in public space become visible. Bentham’s architecture, and subsequently Foucault’s analysis, exhibit a clear demarcation between those in the tower and those in the prison cells. These boundaries have blurred. Rethinking the relation between normalcy and deviancy makes apparent that while everyone is watched by surveillance, the majority is not targeted. A critique of surveillance should therefore not limit itself to convincing people that they are being harmed; rather, it should show how they are complicit in constituting normality.
This, then, opens up new avenues for resistance. As de Certeau also reminds us, walking not only affirms and respects; it can also try out and transgress. In the reciprocal relationship between individual and population, “standing out” — breaking with predictability — is only a momentary interruption that is ultimately enrolled in the next forecast of normality. In the words of Hannah Arendt: “every action … necessarily destroys the whole pattern in whose frame the prediction moves and where it finds its evidence” (Arendt 1970, 7). Collectively, we can make normality more unpredictable.
References

Amicelle, Anthony, and David Grondin. 2021. “Algorithms as Suspecting Machines: Financial Surveillance for Security Intelligence.” In Big Data Surveillance and Security Intelligence: The Canadian Case, edited by David Lyon and David Murakami Wood. Vancouver; Toronto: UBC Press.
Arendt, Hannah. 1970. On Violence. New York: Harcourt, Brace & World.
Bonelli, Laurent, and Francesco Ragazzi. 2014. “Low-Tech Security: Files, Notes, and Memos as Technologies of Anticipation.” Security Dialogue 45 (5): 476–93.
Canguilhem, Georges. 1978. On the Normal and the Pathological. Dordrecht: Springer Netherlands. https://doi.org/10.1007/978-94-009-9853-7.
Certeau, Michel de. 1984. The Practice of Everyday Life. Berkeley: University of California Press.
Davidshofer, Stephan, Julien Jeandesboz, and Francesco Ragazzi. 2017. “Technology and Security Practices: Situating the Technological Imperative.” In International Political Sociology: Transversal Lines, edited by Tugba Basaran, Didier Bigo, Emmanuel-Pierre Guittet, and Robert BJ Walker, 204–27. London ; New York: Routledge, Taylor & Francis Group.
Foucault, Michel. 1977. Discipline and punish: the birth of the prison. 1st American ed. New York: Pantheon Books.
Haggerty, Kevin D. 2006. “Tear down the Walls: On Demolishing the Panopticon.” In Theorizing Surveillance, edited by David Lyon, 23–45. Cullompton: Willan.
Lianos, Michalis. 2003. “Social Control after Foucault.” Surveillance & Society 1 (3): 412–30.
Norris, Clive, and Gary Armstrong. 2010. The Maximum Surveillance Society: The Rise of CCTV. Repr. Oxford: Berg.
Pasquinelli, Matteo. 2015. “Anomaly Detection: The Mathematization of the Abnormal in the Metadata Society.” Panel presentation at the Transmediale festival, Berlin, Germany.