{"id":1076,"date":"2016-04-11T21:08:50","date_gmt":"2016-04-11T21:08:50","guid":{"rendered":"http:\/\/designing.rutgers.edu\/?p=1076"},"modified":"2019-05-07T21:22:47","modified_gmt":"2019-05-07T21:22:47","slug":"a-sea-of-data-apophenia-and-pattern-mis-recognition","status":"publish","type":"post","link":"https:\/\/designing.rutgers.edu\/?p=1076","title":{"rendered":"A Sea of Data:  Apophenia and Pattern (Mis-) Recognition"},"content":{"rendered":"\n<div data-elementor-type=\"wp-post\" data-elementor-id=\"1076\" class=\"elementor elementor-1076\" data-elementor-settings=\"[]\">\n<div class=\"elementor-inner\">\n<div class=\"elementor-section-wrap\">\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-e1819f5 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"e1819f5\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-9ae257a\" data-id=\"9ae257a\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-c9a6916 elementor-widget elementor-widget-text-editor\" data-id=\"c9a6916\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<div>\n<div>\n<h3><a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/\">Hito Steyerl<\/a><br \/><a style=\"font-size: 16px;\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/\">Link to the e-flux 
page<\/a><\/h3>\n<\/div>\n<\/div><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section>\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-a8b6711 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"a8b6711\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-c0167d9\" data-id=\"c0167d9\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-5388fdf elementor-widget elementor-widget-image\" data-id=\"5388fdf\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/snowden-filesWEB1.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"snowden-filesWEB1\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"382\" height=\"415\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/snowden-filesWEB1.jpg\" class=\"attachment-full size-full\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/snowden-filesWEB1.jpg 382w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/snowden-filesWEB1-276x300.jpg 276w\" sizes=\"(max-width: 382px) 100vw, 382px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">This image from the Snowden files was captioned: \u201cA single frame of scrambled video 
imagery.\u201d<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-25afe2b elementor-widget elementor-widget-image\" data-id=\"25afe2b\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/Rosemary_4WEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"Rosemary_4WEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"349\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/Rosemary_4WEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/Rosemary_4WEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/Rosemary_4WEB-300x190.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">Rose Mary Woods, Nixon\u2019s lifelong secretary, demonstrates the \u201cRose Mary Stretch,\u201d a gesticulation that purportedly led to the erasure of a section of the Watergate tapes. The quality of noise in this section of the tapes has been thoroughly analyzed to determine whether the omission was intentional. 
Photo: Wikimedia commons.<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-ec5ec6a elementor-widget elementor-widget-image\" data-id=\"ec5ec6a\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/manning-controls-of-reaper-drone-in-ground-based-cockpit-WEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"manning-controls-of-reaper-drone-in-ground-based-cockpit-WEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"366\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/manning-controls-of-reaper-drone-in-ground-based-cockpit-WEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/manning-controls-of-reaper-drone-in-ground-based-cockpit-WEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/manning-controls-of-reaper-drone-in-ground-based-cockpit-WEB-300x200.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">This photograph from June 6, 2012 shows a student pilot and sensor operator manning the controls of a MQ-9 Reaper in a ground-based cockpit during a training mission flown from Hancock Field Air National Guard Base, Syracuse, New York. 
Photo: AP Photo.<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-f441ad0 elementor-widget elementor-widget-spacer\" data-id=\"f441ad0\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-e19ea12\" data-id=\"e19ea12\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-a49574c elementor-widget elementor-widget-text-editor\" data-id=\"a49574c\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>This is an image from the Snowden files. It is labeled \u201csecret.\u201d<a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn1\">1<\/a>Yet one cannot see anything on it.<\/p>\n<p>This is exactly why it is symptomatic.<\/p>\n<p>\u00a0<\/p>\n<p>Not seeing anything intelligible is the new normal. Information is passed on as a set of signals that cannot be picked up by human senses. Contemporary perception is machinic to large degrees. The spectrum of human vision only covers a tiny part of it. Electric charges, radio waves, light pulses encoded by machines for machines are zipping by at slightly subluminal speed. Seeing is superseded by calculating probabilities. Vision loses importance and is replaced by filtering, decrypting, and pattern recognition. 
Snowden\u2019s image of noise could stand in for a more general human inability to perceive technical signals unless they are processed and translated accordingly.<\/p>\n<p>But noise is not nothing. On the contrary, noise is a huge issue, not only for the NSA but for machinic modes of perception as a whole.<\/p>\n<p>Signal v. Noise\u00a0was the title of a column on the internal NSA website running from 2011 to 2012. It succinctly frames the NSA\u2019s main problem: how to extract \u201cinformation from the truckloads of data\u201d:<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-a1736c3 elementor-widget elementor-widget-text-editor\" data-id=\"a1736c3\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>It\u2019s not about the data or even access to the data. It\u2019s about getting information from the truckloads of data \u2026 Developers, please help! We\u2019re drowning (not waving) in a sea of data\u2014with data, data everywhere, but not a drop of information.<a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn2\">2<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-1b962c5 elementor-widget elementor-widget-text-editor\" data-id=\"1b962c5\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>Analysts are choking on intercepted communication. They need to unscramble, filter, decrypt, refine, and process \u201ctruckloads of data.\u201d The focus moves from acquisition to discerning, from scarcity to overabundance, from adding on to filtering, from research to pattern recognition. This problem is not restricted to secret services. 
Even WikiLeaks\u2019 Julian Assange states: \u201cWe are drowning in material.\u201d<a id=\"_ftnref3\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn3\">3<\/a><\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section>\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-cc6486d elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"cc6486d\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-17a35b9\" data-id=\"17a35b9\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-82973cb elementor-widget elementor-widget-text-editor\" data-id=\"82973cb\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<h3>Apophenia<\/h3>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-dc9f446 elementor-widget elementor-widget-text-editor\" data-id=\"dc9f446\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>But let\u2019s return to the initial image. The noise on it was actually decrypted by GCHQ technicians to reveal a picture of clouds in the sky. 
British analysts have been hacking video feeds from Israeli drones at least since 2008, a period which includes the recent IDF aerial campaigns against Gaza.<a id=\"_ftnref4\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn4\">4<\/a>\u00a0But no images of these attacks exist in Snowden\u2019s archive. Instead, there are all sorts of abstract renderings of intercepted broadcasts. Noise. Lines. Color patterns.<a id=\"_ftnref5\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn5\">5<\/a>\u00a0According to leaked training manuals, one needs to apply all sorts of massively secret operations to produce these kinds of images.<a id=\"_ftnref6\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn6\">6<\/a><\/p>\n<p>But let me tell you something. I will decrypt this image for you without any secret algorithm. I will use a secret ninja technique instead. And I will even teach you how to do it for free. Please focus very strongly on this image right now.<\/p>\n<p>Doesn\u2019t it look like a shimmering surface of water in the evening sun? Is this perhaps the \u201csea of data\u201d itself? An overwhelming body of water, which one could drown in? Can you see the waves moving ever so slightly?<\/p>\n<p>I am using a good old method called apophenia.<\/p>\n<p>Apophenia is defined as the perception of patterns within random data.<a id=\"_ftnref7\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn7\">7<\/a>\u00a0The most common examples are people seeing faces in clouds or on the moon. 
Apophenia is about \u201cdrawing connections and conclusions from sources with no direct connection other than their indissoluble perceptual simultaneity,\u201d as Benjamin Bratton recently argued.<a id=\"_ftnref8\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn8\">8<\/a><\/p>\n<p>One has to assume that sometimes, analysts also use apophenia.<\/p>\n<p>Someone must have seen the face of Amani al-Nasasra in a cloud. The forty-three-year-old was blinded by an aerial strike in Gaza in 2012 in front of her TV:<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-e8ffb1e elementor-widget elementor-widget-text-editor\" data-id=\"e8ffb1e\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>\u201cWe were in the house watching the news on TV. My husband said he wanted to go to sleep, but I wanted to stay up and watch Al Jazeera to see if there was any news of a ceasefire. The last thing I remember, my husband asked if I changed the channel and I said yes. I didn\u2019t feel anything when the bomb hit\u2014I was unconscious. 
I didn\u2019t wake up again until I was in the ambulance.\u201d Amani suffered second degree burns and was largely blinded.<a id=\"_ftnref9\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn9\">9<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-d033a74 elementor-widget elementor-widget-text-editor\" data-id=\"d033a74\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>What kind of \u201csignal\u201d was extracted from what kind of \u201cnoise\u201d to suggest that al-Nasasra was a legitimate target? Which faces appear on which screens, and why? Or to put it differently: Who is \u201csignal,\u201d and who disposable \u201cnoise\u201d?<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-53a78bf elementor-widget elementor-widget-image\" data-id=\"53a78bf\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/PutincloudWEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"PutincloudWEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"369\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/PutincloudWEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/PutincloudWEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/PutincloudWEB-300x201.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">The Russian TV station 
Zvezda claimed this flock of birds over New York City appeared to form the shape of President Vladimir Putin\u2019s face. YouTube video screenshot.<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-365760e elementor-widget elementor-widget-spacer\" data-id=\"365760e\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section>\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-af7112b elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"af7112b\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-25d8a0d\" data-id=\"25d8a0d\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-97cc123 elementor-widget elementor-widget-text-editor\" data-id=\"97cc123\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<h3>Pattern Recognition<\/h3>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-dc2468d elementor-widget elementor-widget-text-editor\" data-id=\"dc2468d\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>Jacques Ranci\u00e8re tells a mythical story about how the 
separation of signal and noise might have been accomplished in Ancient Greece. Sounds produced by affluent male locals were defined as speech, whereas women, children, slaves, and foreigners were assumed to produce garbled noise.<a id=\"_ftnref10\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn10\">10<\/a>\u00a0The distinction between speech and noise served as a kind of political spam filter. Those identified as speaking were labeled citizens and the rest as irrelevant, irrational, and potentially dangerous nuisances. Similarly, today, the question of separating signal and noise has a fundamental political dimension. Pattern recognition resonates with the wider question of political recognition. Who is recognized on a political level and as what? As a subject? A person? A legitimate category of the population? Or perhaps as \u201cdirty data\u201d?<\/p>\n<p>What are dirty data? Here is one example:<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-ee457d4 elementor-widget elementor-widget-text-editor\" data-id=\"ee457d4\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>Sullivan, from Booz Allen, gave the example of the time his team was analyzing demographic information about customers for a luxury hotel chain and came across data showing that teens from a wealthy Middle Eastern country were frequent guests.<\/p>\n<blockquote>\n<p>\u201cThere were a whole group of 17 year-olds staying at the properties worldwide,\u201d Sullivan said. 
\u201cWe thought, \u2018That can\u2019t be true.\u2019\u201d<a id=\"_ftnref11\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn11\">11<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-9368a54 elementor-widget elementor-widget-text-editor\" data-id=\"9368a54\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>The demographic finding was\u00a0dismissed as dirty data\u2014a messed up and worthless set of information\u2014before someone found out that, actually, it was true.<\/p>\n<p>Brown teenagers, in this worldview, are likely to exist. Dead brown teenagers? Why not? But rich brown teenagers? This is so improbable that they must be dirty data and cleansed from your system! The pattern emerging from this operation to separate noise and signal is not very different from Ranci\u00e8re\u2019s political noise filter for allocating citizenship, rationality, and privilege. Affluent brown teenagers seem just as unlikely as speaking slaves and women in the Greek polis.<\/p>\n<p>On the other hand, dirty data are also something like a cache of surreptitious refusal; they express a refusal to be counted and measured:<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-3302b6c elementor-widget elementor-widget-text-editor\" data-id=\"3302b6c\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>A study of more than 2,400 UK consumers by research company Verve found that 60%\u00a0intentionally provided wrong information\u00a0when submitting personal details online. 
Almost one quarter (23 percent) said they sometimes gave out incorrect dates of birth, for example, while 9 percent said they did this most of the time and 5 percent always did it.<a id=\"_ftnref12\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn12\">12<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-7e0c871 elementor-widget elementor-widget-text-editor\" data-id=\"7e0c871\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>Dirty data is where all of our refusals to fill out the constant onslaught of online forms accumulate. Everyone is lying all the time, whenever possible, or at least cutting corners. Not surprisingly, the \u201cdirtiest\u201d area of data collection is consistently pointed out to be the health sector, especially in the US. Doctors and nurses are singled out for filling out forms incorrectly. 
It seems that health professionals are just as unenthusiastic about filling out forms for systems designed to replace them as consumers are about performing clerical work for corporations that will spam them in return.<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-c0c4177 elementor-widget elementor-widget-image\" data-id=\"c0c4177\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/ezgif-4-b1905172bfa3.gif\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"ezgif-4-b1905172bfa3\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"319\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/ezgif-4-b1905172bfa3.gif\" class=\"attachment-large size-large\" alt=\"\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">An animated gif shows a dirty data sandstorm.<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-da37454 elementor-widget elementor-widget-text-editor\" data-id=\"da37454\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>In his book\u00a0<i>The Utopia of Rules<\/i>, David Graeber gives a profoundly moving example of the forced extraction of data. 
After his mom suffered a stroke, he went through the ordeal of having to apply for Medicaid on her behalf:<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-a6237f9 elementor-widget elementor-widget-text-editor\" data-id=\"a6237f9\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>I had to spend over a month \u2026 dealing with the ramifying consequences of the act of whatever anonymous functionary in the New York Department of Motor Vehicles had inscribed my given name as \u201cDaid,\u201d not to mention the Verizon clerk who spelled my surname \u201cGrueber.\u201d Bureaucracies public and private appear\u2014for whatever historical reasons\u2014to be organized in such a way as to guarantee that a significant proportion of actors will not be able to perform their tasks as expected.<a id=\"_ftnref13\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn13\">13<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-4080c87 elementor-widget elementor-widget-text-editor\" data-id=\"4080c87\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>Graeber goes on to call this an example of utopian thinking. Bureaucracy is based on utopian thinking because it assumes people to be perfect from its own point of view. 
Graeber\u2019s mother died before she was accepted into the program.<\/p>\n<p>The endless labor of filling out completely meaningless forms is a new kind of domestic labor in the sense that it is not considered labor at all and assumed to be provided \u201cvoluntarily\u201d or performed by underpaid so-called data janitors.<a id=\"_ftnref14\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn14\">14<\/a>\u00a0Yet all the seemingly swift and invisible action of algorithms, their elegant optimization of everything, their recognition of patterns and anomalies\u2014this is based on the endless and utterly senseless labor of providing or fixing messy data.<\/p>\n<p>Dirty data is simply real data in the sense that it documents the struggle of real people with a bureaucracy that exploits the uneven distribution and implementation of digital technology.<a id=\"_ftnref15\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn15\">15<\/a>\u00a0Consider the situation at LaGeSo (the Health and Social Affairs Office) in Berlin, where refugees are risking their health on a daily basis by standing in line outdoors in severe winter weather for hours or even days just to have their data registered and get access to services to which they are entitled (for example, money to buy food).<a id=\"_ftnref16\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn16\">16<\/a>\u00a0These people are perceived as anomalies because, in addition to having the audacity to arrive in the first place, they ask that their rights be respected. There is a similar political algorithm at work: people are blanked out. They cannot even get to the stage to be recognized as claimants. They are not taken into account.<\/p>\n<p>On the other hand, technology also promises to separate different categories of refugees. 
IBM\u2019s Watson AI system was experimentally programmed to potentially identify terrorists posing as refugees:<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-3f21a1b elementor-widget elementor-widget-text-editor\" data-id=\"3f21a1b\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>IBM hoped to show that the i2 EIA could separate the sheep from the wolves: that is, the masses of harmless asylum-seekers from the few who might be connected to jihadism or who were simply lying about their\u00a0identities \u2026<\/p>\n<p>IBM created a hypothetical scenario, bringing together several data sources to match against a fictional list of passport-carrying refugees. Perhaps the most important dataset was a list of names of casualties from the conflict gleaned from open press reports and other sources. Some of the material came from the Dark Web, data related to the black market for passports; IBM says that they anonymized or obscured personally identifiable information in this set \u2026<\/p>\n<p>Borene said the system could provide a score to indicate the likelihood that a hypothetical asylum seeker was who they said they were, and do it fast enough to be useful to a border guard or policeman walking a\u00a0beat.<a id=\"_ftnref17\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn17\">17<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-0aaf8ee elementor-widget elementor-widget-text-editor\" data-id=\"0aaf8ee\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>The cross-referencing of unofficial databases, including dark web sources, is used to 
produce a \u201cscore,\u201d which calculates the probability that a refugee might be a terrorist. The hope is for a pattern to emerge across different datasets, without actually checking how or if they correspond to any empirical reality. This example is actually part of a much larger subset of \u201cscores\u201d: credit scores, academic ranking scores, scores ranking interaction on online forums etc., which classify people according to financial interactions, online behavior, market data, and other sources. A variety of inputs are boiled down to a single number\u2014a superpattern\u2014which may be a \u201cthreat\u201d score or a \u201csocial sincerity score,\u201d as planned by Chinese authorities for every single citizen within the next decade. But the input parameters are far from being transparent or verifiable. And while it may be seriously desirable to identify Daesh moles posing as refugees, a similar system seems to have worrying flaws.<\/p>\n<p>The NSA\u2019s SKYNET program was trained to find terrorists in Pakistan by sifting through cell phone customer metadata. But experts criticize the NSA\u2019s methodologies. \u201cThere are\u00a0very few\u00a0\u2018known terrorists\u2019 to use to train\u00a0and\u00a0test\u00a0the model,\u201d explained Patrick Ball, a data scientist and director of the\u00a0Human Rights Data Analysis Group, to\u00a0Ars Technica. 
\u201cIf they are using the same records to train the model as they are using to test the model, their assessment of the fit is completely\u00a0bullshit.\u201d<a id=\"_ftnref18\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn18\">18<\/a><\/p>\n<p>The Human Rights Data Analysis Group estimates that around 99,000 Pakistanis might have ended up wrongly classified as terrorists by SKYNET, a statistical margin of error that might have had deadly consequences given the fact that the US is waging a drone war on suspected militants in the country and between 2,500 and 4,000 people are estimated to have been killed since 2004: \u201cIn the years that have followed, thousands of innocent people in Pakistan may have been mislabelled as terrorists by that \u2018scientifically unsound\u2019 algorithm, possibly resulting in their untimely demise.\u201d<a id=\"_ftnref19\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn19\">19<\/a><\/p>\n<p>One needs to emphasize strongly that SKYNET\u2019s operations cannot be objectively assessed, since it is not known how its results were utilized. It was most certainly not the only factor in determining drone targets.<a id=\"_ftnref20\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn20\">20<\/a>\u00a0But the example of SKYNET demonstrates just as strongly that a \u201csignal\u201d extracted by assessing correlations and probabilities is not the same as an actual fact, but is determined by the inputs the software uses to learn and by the parameters for filtering, correlating, and \u201cidentifying.\u201d The old engineering wisdom \u201ccrap in\u2014crap out\u201d still seems to apply. 
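Patrick Ball\u2019s objection can be made concrete with a toy sketch. The snippet below is entirely hypothetical (fabricated records, nothing to do with the NSA\u2019s actual pipeline): a classifier that simply memorizes its training records looks perfect when \u201ctested\u201d on those same records, even though the labels are pure noise and carry no signal at all.

```python
# Toy illustration of train/test leakage (hypothetical data).
# A 1-nearest-neighbour classifier memorizes its training set, so evaluating
# it on that same set reports perfect accuracy regardless of predictive power.
import random

random.seed(0)

# Fabricated records: (feature, label) with labels assigned at random,
# i.e. the feature contains no real signal whatsoever.
records = [(random.random(), random.choice([0, 1])) for _ in range(200)]

def predict(train, x):
    """1-NN: return the label of the closest training record."""
    return min(train, key=lambda r: abs(r[0] - x))[1]

def accuracy(train, test):
    return sum(predict(train, x) == y for x, y in test) / len(test)

train, held_out = records[:100], records[100:]

# "Testing" on the training records: memorization masquerades as skill.
leaked = accuracy(train, train)
# Testing on held-out records: accuracy collapses toward chance,
# because there was never anything to learn.
honest = accuracy(train, held_out)
print(leaked, honest)
```

The leaked score is exactly 1.0 (each record is its own nearest neighbour), while the honest score hovers around chance; this gap, not either number alone, is what an evaluation is supposed to reveal.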
In all of these cases\u2014as completely different as they are technologically, geographically, and also ethically\u2014some version of pattern recognition was used to classify groups of people according to political and social parameters. Sometimes it is as simple as:\u00a0we try to avoid registering refugees. Sometimes there is more mathematical mumbo jumbo involved. But many of the methods used are opaque, partly biased, exclusive, and\u2014as one expert points out\u2014sometimes also \u201cridiculously optimistic.\u201d<a id=\"_ftnref21\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn21\">21<\/a><\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-9cdbddc elementor-widget elementor-widget-image\" data-id=\"9cdbddc\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/spaghetti-meatballs-become-really-frighteningWEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"spaghetti-meatballs-become-really-frighteningWEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"366\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/spaghetti-meatballs-become-really-frighteningWEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/spaghetti-meatballs-become-really-frighteningWEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/spaghetti-meatballs-become-really-frighteningWEB-300x200.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\"> A plate of spaghetti meatballs returns our gaze, courtesy of Google 
inceptionism. Source: Mary-Ann Russon, \u201cGoogle DeepDream robot: 10 weirdest images produced by AI \u2018inceptionism\u2019 and users online,\u201d International Business Times, July 6, 2015<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-53d470f elementor-widget elementor-widget-spacer\" data-id=\"53d470f\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section>\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-b74c737 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"b74c737\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-778f9de\" data-id=\"778f9de\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-511954c elementor-widget elementor-widget-text-editor\" data-id=\"511954c\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<h3>Corporate Animism<\/h3>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-8056b14 elementor-widget elementor-widget-text-editor\" data-id=\"8056b14\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>How to 
recognize something in sheer noise? A striking visual example of pure and conscious apophenia was recently demonstrated by research labs at Google:<a id=\"_ftnref22\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn22\">22<\/a><\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-aa3afef elementor-widget elementor-widget-text-editor\" data-id=\"aa3afef\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>We train an artificial neural network by showing it millions of training examples and gradually adjusting the network parameters until it gives the classifications we want. The network typically consists of 10\u201330 stacked layers of artificial neurons. Each image is fed into the input layer, which then talks to the next layer, until eventually the \u201coutput\u201d layer is reached. The network\u2019s \u201canswer\u201d comes from this final output layer.<a id=\"_ftnref23\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn23\">23<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-0e89744 elementor-widget elementor-widget-text-editor\" data-id=\"0e89744\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>Neural networks were trained to discern edges, shapes, and a number of objects and animals and then applied to pure noise. 
They ended up \u201crecognizing\u201d a rainbow-colored mess of disembodied fractal eyes, mostly without lids, incessantly surveilling their audience in a strident display of conscious pattern overidentification.<\/p>\n<p>Google researchers call the act of creating a pattern or an image from nothing but noise \u201cinceptionism\u201d or \u201cdeep dreaming.\u201d But these entities are far from mere hallucinations. If they are dreams, those dreams can be interpreted as condensations or displacements of the current technological disposition. They reveal the networked operations of computational image creation, certain presets of machinic vision, its hardwired ideologies and preferences.<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-b278522 elementor-widget elementor-widget-text-editor\" data-id=\"b278522\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<blockquote>\n<p>One way to visualize what goes on is to turn the network upside down and ask it to enhance an input image in such a way as to elicit a particular interpretation. Say you want to know what sort of image would result in \u201cBanana.\u201d Start with an image full of random noise, then gradually tweak the image towards what the neural net considers a banana. 
By itself, that doesn\u2019t work very well, but it does if we impose a prior constraint that the image should have similar statistics to natural images, such as neighboring pixels needing to be correlated.<a id=\"_ftnref24\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn24\">24<\/a><\/p>\n<\/blockquote><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-9a821a4 elementor-widget elementor-widget-text-editor\" data-id=\"9a821a4\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>In a feat of genius, inceptionism manages to visualize the unconscious of prosumer networks: images surveilling users, constantly registering their eye movements, behavior, preferences, aesthetically helplessly adrift between\u00a0Hundertwasser\u00a0mug knockoffs and Art Deco friezes gone ballistic. Walter Benjamin\u2019s \u201coptical unconscious\u201d has been upgraded to the unconscious of computational image divination.<a id=\"_ftnref25\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn25\">25<\/a><\/p>\n<p>By \u201crecognizing\u201d things and patterns that were not given, inceptionist neural networks eventually end up effectively identifying a new totality of aesthetic and social relations. Presets and stereotypes are applied, regardless of whether they \u201capply\u201d or not: \u201cThe results are intriguing\u2014even a relatively simple neural network can be used to over-interpret an image, just like as children we enjoyed watching clouds and interpreting the random shapes.\u201d<a id=\"_ftnref26\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn26\">26<\/a><\/p>\n<p>But inceptionism is not just a digital hallucination. 
It is a document of an era that trains smartphones to identify kittens, thus hardwiring truly terrifying jargons of cutesy into the means of production.<a id=\"_ftnref27\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn27\">27<\/a>\u00a0It demonstrates a version of corporate animism in which commodities are not only fetishes but morph into franchised chimeras.<\/p>\n<p>Yet these are deeply realist representations. According to Gy\u00f6rgy Luk\u00e1cs, \u201cclassical realism\u201d creates \u201ctypical characters,\u201d insofar as they represent the objective social (and in this case technological) forces of our times.<a id=\"_ftnref28\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn28\">28<\/a><\/p>\n<p>Inceptionism does that and more. It also gives those forces a face\u2014or more precisely, innumerable eyes. The creature that stares at you from your plate of spaghetti and meatballs is not an amphibian beagle. It is the ubiquitous surveillance of networked image production, a form of memetically modified intelligence that watches you in the shape of the lunch that you will Instagram in a second if it doesn\u2019t attack you first. Imagine a world of enslaved objects remorsefully scrutinizing you. Your car, your yacht, your art collection observe you with a gloomy and utterly desperate expression.\u00a0You may own us, they seem to say,\u00a0but we are going to inform on you. 
And guess what kind of creature we are going to recognize in you!<a id=\"_ftnref29\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn29\">29<\/a><\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-9492f97 elementor-widget elementor-widget-spacer\" data-id=\"9492f97\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-454b2da\" data-id=\"454b2da\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-c50174d elementor-widget elementor-widget-spacer\" data-id=\"c50174d\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-50fa8f9 elementor-widget elementor-widget-image\" data-id=\"50fa8f9\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/facial-recognition-markers-640x353WEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"facial-recognition-markers-640x353WEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"303\" 
src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/facial-recognition-markers-640x353WEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/facial-recognition-markers-640x353WEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/facial-recognition-markers-640x353WEB-300x165.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">CGI acupuncture: Face Robot, a general-purpose animation system, promises efficient motion capture of actors\u2019 faces through this 32-point system.<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-cc1a16e elementor-widget elementor-widget-spacer\" data-id=\"cc1a16e\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-231a1d8 elementor-widget elementor-widget-spacer\" data-id=\"231a1d8\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-c040685 elementor-widget elementor-widget-image\" data-id=\"c040685\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/3D_reconstruction_WEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"3D_reconstruction_WEB\"><br \/>\n\t\t\t\t\t\t\t<img 
loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"310\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/3D_reconstruction_WEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/3D_reconstruction_WEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/3D_reconstruction_WEB-300x169.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">Previously unknown archaeological monuments have been revealed as of September 2015 by the Stonehenge Hidden Landscapes Project. The findings include new information about the world\u2019s largest \u201csuper henge\u201d and include ritual monuments such as the mortuary building pictured above in a 3-D reconstruction. Copyright: LBI ArchPro, Joachim Brandtner<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section>\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-2e4c80d elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"2e4c80d\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-6bed681\" data-id=\"6bed681\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-268ee7f elementor-widget elementor-widget-text-editor\" data-id=\"268ee7f\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<h3>Data 
Neolithic<\/h3>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-cb0b5c3 elementor-widget elementor-widget-text-editor\" data-id=\"cb0b5c3\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<p>But what are we going to make of automated apophenia?<a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn30\">30<\/a>\u00a0Are we to assume that machinic perception has entered its own phase of magical thinking? Is this what commodity enchantment means nowadays: hallucinating products? It might be more accurate to assume that humanity has entered yet another phase of magical thinking. The vocabulary deployed for separating signal and noise is surprisingly pastoral: data \u201cfarming\u201d and \u201charvesting,\u201d \u201cmining\u201d and \u201cextraction\u201d are embraced as if we were living through another massive neolithic revolution<a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn31\">31<\/a>\u00a0with its own kind of magic formulas.<\/p>\n<p>All sorts of agricultural and mining technologies\u2014developed during the neolithic\u2014are reinvented to apply to data. The stones and ores of the past are replaced by silicon and rare earth minerals, while a Minecraft paradigm of extraction describes the processing of minerals into elements of information architecture.<a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn32\">32<\/a><\/p>\n<p>Pattern recognition was an important asset of neolithic technologies too. It marked the transition between magic and more empirical modes of thinking. The development of the calendar by observing patterns in time enabled more efficient irrigation and agricultural scheduling. 
Storage of cereals created the idea of property. This period also kick-started institutionalized religion and bureaucracy, as well as managerial techniques including laws and registers. All these innovations also impacted society: hunter-gatherer bands were replaced by farmer kings and slaveholders. The neolithic revolution was not only technological; it also had major social consequences.<\/p>\n<p>Today, expressions of life as reflected in data trails become a farmable, harvestable, minable resource managed by informational biopolitics.<a href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn33\">33<\/a><\/p>\n<p>And if you doubt that this is another age of magical thinking, just look at the NSA training manual for unscrambling hacked drone intercepts. As you can see, you need to bewitch the files with a magic wand. (ImageMagick is a free image converter):<\/p>\n<p>\u00a0<\/p>\n<p>The supposedly new forms of governance emerging from these technologies look partly archaic and partly superstitious. What kind of corporate\/state entities are based on data storage, image unscrambling, high-frequency trading, and Daesh Forex gaming? What are the contemporary equivalents of farmer kings and slaveholders, and how are existing social hierarchies radicalized through examples as vastly different as tech-related gentrification and jihadi online forum gamification? How does the world of pattern recognition and big-data divination relate to the contemporary jumble of oligocracies, troll farms, mercenary hackers, and data robber barons supporting and enabling bot governance, Khelifah clickbait, and polymorphous proxy warfare? Is the state in the age of Deep Mind, Deep Learning, and Deep Dreaming a Deep State\u2122? 
One in which there is neither appeal nor due process against algorithmic decrees and divination?<\/p>\n<p>\u00a0<\/p>\n<p>But there is another difference between the original and the current type of \u201cneolithic,\u201d and it harks back to pattern recognition. In ancient astronomy, star constellations were imagined by projecting animal shapes into the skies. After cosmic rhythms and trajectories had been recorded on clay tablets, patterns of movement started to emerge. As additional points of orientation, some star groups were likened to animals and heavenly beings. However, progress in astronomy and mathematics happened not because people kept believing there were animals or gods in space, but on the contrary, because they accepted that constellations were expressions of a physical logic. The patterns were projections, not reality. While today statisticians and other experts routinely acknowledge that their findings are mostly probabilistic projections, policymakers of all sorts conveniently ignore this message. In practice you become coextensive with the data-constellation you project. 
Social scores of all different kinds\u2014credit scores, academic scores, threat scores\u2014as well as commercial and military pattern-of-life observations impact the real lives of real people, both reformatting and radicalizing social hierarchies by ranking, filtering, and classifying.<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-af858c6 elementor-widget elementor-widget-spacer\" data-id=\"af858c6\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-2965e42\" data-id=\"2965e42\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-c79310c elementor-widget elementor-widget-spacer\" data-id=\"c79310c\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-spacer\">\n<div class=\"elementor-spacer-inner\"><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-dc44224 elementor-widget elementor-widget-image\" data-id=\"dc44224\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"343\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/ImageMagickWEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/ImageMagickWEB.jpg 550w, 
https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/ImageMagickWEB-300x187.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-26513a5 elementor-widget elementor-widget-image\" data-id=\"26513a5\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/nebula-photograph-captured-by-hubble-telescopeWEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"nebula-photograph-captured-by-hubble-telescopeWEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"400\" height=\"479\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/nebula-photograph-captured-by-hubble-telescopeWEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/nebula-photograph-captured-by-hubble-telescopeWEB.jpg 400w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/nebula-photograph-captured-by-hubble-telescopeWEB-251x300.jpg 251w\" sizes=\"(max-width: 400px) 100vw, 400px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">Could this image be a representation of the neo-neolithic? 
Source: Mary-Ann Russon, \u201cGoogle DeepDream robot: 10 weirdest images produced by AI \u2018inceptionism\u2019 and users online,\u201d International Business Times, July 6, 2015<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-element elementor-element-644480f elementor-widget elementor-widget-image\" data-id=\"644480f\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-image\">\n<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/distortion-pictureWEB.jpg\" data-elementor-open-lightbox=\"yes\" data-elementor-lightbox-title=\"distortion-pictureWEB\"><br \/>\n\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"550\" height=\"446\" src=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/distortion-pictureWEB.jpg\" class=\"attachment-large size-large\" alt=\"\" srcset=\"https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/distortion-pictureWEB.jpg 550w, https:\/\/designing.rutgers.edu\/wp-content\/uploads\/2016\/04\/distortion-pictureWEB-300x243.jpg 300w\" sizes=\"(max-width: 550px) 100vw, 550px\" \/>\t\t\t\t\t\t\t\t<\/a><figcaption class=\"widget-image-caption wp-caption-text\">Source: Anh Nguyen, Jason Yosinski, and Jeff Clune, \u201cDeep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images,\u201d cv-foundation.org, 2015<\/figcaption><\/figure>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section>\n<section class=\"elementor-section elementor-top-section elementor-element elementor-element-76ee39a elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"76ee39a\" data-element_type=\"section\">\n<div class=\"elementor-container elementor-column-gap-default\">\n<div 
class=\"elementor-row\">\n<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-0031234\" data-id=\"0031234\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap\">\n<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t\t<\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-1d6814f\" data-id=\"1d6814f\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap elementor-element-populated\">\n<div class=\"elementor-widget-wrap\">\n<div class=\"elementor-element elementor-element-f724658 elementor-widget elementor-widget-text-editor\" data-id=\"f724658\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n<div class=\"elementor-widget-container\">\n<div class=\"elementor-text-editor elementor-clearfix\">\n<div>\n<div>\n<h3>Gestalt Realism<\/h3>\n<\/div>\n<\/div>\n<p>But let\u2019s assume we are actually dealing with projections. Once one accepts that the patterns derived from machinic sensing are\u00a0not\u00a0the same as reality, information definitely becomes available with a certain degree of veracity.<\/p>\n<p>Let\u2019s come back to Amani al-Nasasra, the woman blinded by an aerial attack in Gaza. We know: the abstract images recorded as intercepts of IDF drones by British spies do\u00a0not\u00a0show the aerial strike in Gaza that blinded her in 2012. The dates\u00a0don\u2019t\u00a0match. There is\u00a0no\u00a0evidence in Snowden\u2019s archive. There are\u00a0no\u00a0images of this attack, at least as far as I know. All we know is what she told Human Rights Watch. This is what she said: \u201cI can\u2019t see\u2014ever since the bombing, I can only see shadows.\u201d<a id=\"_ftnref34\" href=\"http:\/\/www.e-flux.com\/journal\/a-sea-of-data-apophenia-and-pattern-mis-recognition\/#_ftn34\">34<\/a><\/p>\n<p>So there is one more way to decode this image. It\u2019s plain for everyone to see. 
We see what Amani\u00a0cannot\u00a0see.<\/p>\n<p>In this case, the noise must be a \u201cdocument\u201d of what she \u201csees\u201d now: \u201cthe shadows.\u201d<\/p>\n<p>Is this a document of the drone war\u2019s optical unconscious? Of its dubious and classified methods of \u201cpattern recognition\u201d? And if so, is there a way to ever \u201cunscramble\u201d the \u201cshadows\u201d Amani has been left with?<\/p>\n<p>\u00d7<\/p>\n<p><small>Acknowledgments: The initial version of this text was written at the request of Laura Poitras, who most generously allowed access to some unclassified documents from the Snowden archive, and a short version was presented during the opening of her show \u201cAstro Noise\u201d at the Whitney Museum. Further thanks to Henrik Moltke for facilitating access to the documents, to Brenda and other members of Laura\u2019s studio, to Linda Stupart for introducing me to the term \u201capophenia,\u201d and to Ben Bratton for fleshing it out for me.<\/small><\/p>\n<p>\u00a9 2016 e-flux and the author<\/p>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-23a1948\" data-id=\"23a1948\" data-element_type=\"column\">\n<div class=\"elementor-column-wrap\">\n<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t\t<\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/section><\/div>\n<\/p><\/div>\n<\/p><\/div>\n","protected":false},"excerpt":{"rendered":"<p>Hito SteyerlLink to the e-flux page This image from the Snowden files was captioned: \u201cA single frame of scrambled video imagery.\u201d Rose Mary Woods, Nixon\u2019s lifelong secretary, demonstrates the \u201cRose Mary Stretch,\u201d a gesticulation that purportedly led to the erasure of a section of the Watergate tapes. 
The quality of noise in this section of &#8230; <a title=\"A Sea of Data:  Apophenia and Pattern (Mis-) Recognition\" class=\"read-more\" href=\"https:\/\/designing.rutgers.edu\/?p=1076\">Read more<span class=\"screen-reader-text\">A Sea of Data:  Apophenia and Pattern (Mis-) Recognition<\/span><\/a><\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[135,6,52],"tags":[],"class_list":["post-1076","post","type-post","status-publish","format-standard","hentry","category-apophenia","category-art","category-data-visualization"],"_links":{"self":[{"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=\/wp\/v2\/posts\/1076"}],"collection":[{"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1076"}],"version-history":[{"count":11,"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=\/wp\/v2\/posts\/1076\/revisions"}],"predecessor-version":[{"id":1906,"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=\/wp\/v2\/posts\/1076\/revisions\/1906"}],"wp:attachment":[{"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1076"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1076"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/designing.rutgers.edu\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1076"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}