Biosensing: Tracking Persons
Abstract and Keywords
This chapter argues that tracking involves an increasingly significant and diverse set of techniques in relation to the ongoing transformation of relations between observer and observed, and between observers. These developments include not only the proliferation of individual sensing devices associated with a growing variety of platforms, but also the emergence of new data infrastructures that pool, scale, and link data in ways that promote their repurposing. By means of examples ranging from genes and currencies to social media and the disappearance of an airplane, it is suggested that practices of tracking are creating new public-private distinctions in the dynamic problem space resulting from the analytics that pattern these data. These new distinctions are linked to changing forms of personhood and changing relations between market and state, economy and society.
Biosensing is proliferating: sensors can be embedded in glaciers to monitor global warming, act as “labs on a chip,” or serve as components in monitoring devices that function as a form of care. So-called self-sensing lifestyle devices attract excitement among consumers and industry alike and promise to enable us to bypass medical, financial, and other forms of authoritative knowledge production. Appeal is made to idioms of inclusion, DIY experimentation, and the redistribution of expertise, but, in parallel, contentious questions about privacy, ownership, and surveillance have emerged. How do such competing interpretations of the potential of biosensing coexist? In this chapter we introduce a range of examples to suggest that they are a response to, and can best be understood in terms of, the increasing importance of a particular organization of observation that supports practices of tracking in biosensing and in a wide range of other observational practices. We argue further that these practices recognize forms of personhood that challenge the political and legal construct of the person as individual.
In developing this argument, we make use of a very general sociological understanding of observation as “indicating something by means of distinction.”1 With this understanding, the collection of data as data can itself be seen as observation. If data are necessarily an observation, we could propose an understanding of tracking in terms of practices in which observation, or the collection of data, is folded—looped—into their analysis repeatedly and thus serially or recursively at defined intervals in time.2 The recursive looping turns observations or traces made through data collection into tracks: it makes data meaningful insofar as the traces are related, linked, or connected to each other.
Our argument is that tracking is becoming an increasingly significant and diverse set of techniques in relation to the ongoing reconfiguration of relations between the observer and observed, and between observers. This reconfiguration includes not only the proliferation of individual sensing devices associated with a growing diversity of platforms, but also the emergence of new data infrastructures that pool, scale, and link data in ways that promote their repurposing (Kitchin 2014), opening them up to the use of analytics that produce a patterning of data as a dynamic problem space. It is in this space, so we suggest, that practices of tracking are coming to be implicated in new ways of making public-private distinctions that are themselves linked to changing forms of personhood and the reconfiguring of relations between market and state, economy and society.
We develop this argument in what follows through notes on contemporary examples of data use, a consideration of the public-private distinction, and a discussion of forms of tracking.
“Personal” Data and the Private-Public Distinction
Our first example is the case of the incorporated woman. In what started as an art project, a person named Jennifer Lyn Morone has become a corporation: Jennifer Lyn Morone™ Inc (JLM). This project to establish the value of a person in a data-driven economy was initially a response to a brief that was part of a master’s degree at the Royal College of Art, London, to “design a protest.” The project highlights the appropriation of personal data by corporations through establishing provocatively a specific person as a corporation that “derives value from three sources, and legally protects and bestows rights upon the total output” of that person. The sources, reports3 inform us, are: the accumulation, categorization, and evaluation of data generated as a result of Jennifer Lyn Morone’s life; her experience and capabilities offered as biological, physical, and mental services; and the sale of her future potential in the form of shares. As a protest, this project thus draws attention to, and caricatures, that extraordinary legal fiction whereby the corporation is seen as a person and accorded the legal rights of natural citizens.4
As a development of the project, data from Jennifer Lyn Morone’s life are to be captured and stored on the corporation’s own servers, via a software application that will be known as Database of Me, or DOME. The goal is to create a software “platform” for personal-data management; companies and other entities will be able to purchase data from DOME via the platform, but their use of the data would be limited by encryption or data tagging. The software is thus to act as an automated data broker in a way that configures the person as an individual, that is, as a person whose agency is recognized in a relation to property established through the exercise of (self-) possession.5 Indeed, having calculated that a substantial sum had been “invested” in her person over her life, Morone, as CEO, has also decided that JLM should offer biological services, ranging from blood plasma at £30 ($50) and bone marrow donations ($5,100) to eggs at $170,000 apiece, as well as mental services such as problem solving (discounted if JLM gains something in return, such as knowledge); she is also considering how to value services she currently gives away freely, such as compassion.
This project of incorporation is described as an experiment as well as a protest and an art project. Another experiment that gained publicity at the time we were writing was that conducted by Facebook during the week of January 11–18, 2012.6 This study, published as “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks” (Kramer, Guillory, and Hancock 2014), sought to test whether “emotional contagion” occurs independently of in-person interaction between individuals by reducing the “amount of emotional content” in both positive and negative posts in the News Feed of Facebook. The authors described the conduct and reporting of the experiment as “consistent with Facebook’s Data Use Policy,” but the research was widely condemned as unethical. The ensuing debate led to an editorial entitled “Expression of Concern” by Inder M. Verma (2014), the editor in chief of Proceedings of the National Academy of Sciences, who described the best practice for obtaining informed consent and allowing participants to opt out; he notes that private companies, unlike public institutions, are not obliged to conform to this “Common Rule.”
These two examples might be seen as inversions of each other. In the first, an individual tries to acquire for herself the economic value of personal activities by becoming a corporation and, in the second, it is a corporation that attempts to acquire and maximize value by collecting the traces of interpersonal activities. As such, they might be taken to indicate a deep fault line between private and public in practices of biosensing, where manipulating private emotions or bodily states for corporate profit causes ethical or moral concern. However, as our next pair of examples suggests, this line is not fixed or given but continually (re-)produced. In thinking about this continuous re-making, we have been inspired by Susan Gal (2002), who conceives the public-private distinction as the outcome of subdivisions, recalibrations, and fractal recursions. She writes:
[W]hatever the local, historically specific content of the dichotomy, the distinction between public and private can be reproduced repeatedly by projecting it onto narrower contexts or broader ones. Or, it can be projected onto different social “objects”—activities, identities, institutions, spaces and interactions—that can be further categorized into private and public parts. Then, through recursivity, and recalibration, each of these parts can be recategorized again, by the same public/private distinction.
(Gal 2002, 80)
Importantly, these distinctions do not all line up: as Gal notes, the definitions of public and private “are partially transformed with each indexical recalibration, while deceptively retaining the same label and the same co-constituting contrast.” Following Gal (2002), we also suggest that the intertwined public-private pair is not a single division, but a recursive fractal.7 Significantly, Gal further argues that we often fail to see the public-private as a recursive fractal because “[o]nce named and thus semanticized, the fleeting distinctions of different roles, spaces and categories indexically invoked in interaction turn into ‘reified objects’ of the social world that seem solid and distinct” (2002, 85). Indeed, Gal uses the notion of ideology to describe the categorization of public-private as a single division or dichotomy, since it erases the dynamic recalibration of distinctions performed through the recursive use of indexical signs. The construct of the individual, we suggest, may also be seen in such ideological terms. However, while Gal suggests that indexical signs are typically ephemeral, we propose that they are increasingly widely supported in assorted data infrastructures, each of which has its own memory system. Indeed, we suggest, such developments may destabilize the ideological construct of the individual linked to the public-private distinction that has historically been such a dominant form of the person in many societies.
Consider, in this respect, our next pair of examples. An article in Science (Gymrek et al. 2013), “Identifying Personal Genomes by Surname Inference,” suggests that genetic privacy is most likely impossible in the context of our cultural investment in ancestry. Melissa Gymrek et al. show that publicly available data, accessed through freely available Internet resources, allow one to infer the identity of “anonymous” genomes or sequencing datasets that are shared among scientists without identifiers. By profiling “short tandem repeats on the Y chromosome (Y-STRs) and querying recreational genetic genealogy databases,” the authors recovered surname data that, in combination with other types of metadata such as age and state, enabled them to identify the “target.”8 Although the authors of this article do not themselves make public these links between the names of persons and genetic sequences, they offer the example of Craig Venter, who was widely known to have sequenced his own genome. The authors show that the use of shared (public) family data or pedigrees enables an unidentified genome to be characterized as Venter’s with reasonable certainty: “A search with Craig Venter’s haplotype returned a clear match to a ‘Venter’ record that was concordant at all 33 comparable markers and with an estimated TMRCA of less than eight generations. … We further tested whether it would be feasible to trace back Craig Venter by combining the inferred surname with demographic profiling. A query for ‘Surname: Venter; Year of Birth: 1946; State: California’ in online public record search engines retrieved two matching records of males, one of whom was Craig Venter himself” (ibid., 323). In the same issue of Science, bioethicist Amy Gutmann notes, “A ‘deidentified’ genome has become a spectrum of possibilities for re-identification, rather than an absolute protection against privacy invasion” (2013, 1032).
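The re-identification strategy that Gymrek et al. describe, matching a Y-STR haplotype against a recreational genealogy database to recover a candidate surname and then narrowing the field with demographic metadata, can be sketched in miniature. The following Python fragment is an illustrative toy, not the authors' pipeline: the marker names and values, the database entries, and the public records are all invented, and real Y-STR profiles involve dozens of markers and probabilistic surname inference rather than exact matching.

```python
# Toy sketch of surname-inference re-identification (after Gymrek et al. 2013).
# All data below are invented for illustration only.

GENEALOGY_DB = [  # hypothetical recreational genealogy database: surname -> Y-STR profile
    {"surname": "Venter", "ystr": {"DYS19": 14, "DYS390": 24, "DYS391": 11}},
    {"surname": "Smith",  "ystr": {"DYS19": 15, "DYS390": 21, "DYS391": 10}},
]

PUBLIC_RECORDS = [  # hypothetical public-record search engine entries
    {"name": "J. Craig Venter", "surname": "Venter", "born": 1946, "state": "CA"},
    {"name": "A. Venter",       "surname": "Venter", "born": 1982, "state": "NY"},
]

def infer_surname(anonymous_ystr, db, threshold=1.0):
    """Return the surname whose Y-STR profile best matches the anonymous haplotype."""
    best = None
    for entry in db:
        shared = [m for m in anonymous_ystr if m in entry["ystr"]]
        if not shared:
            continue
        score = sum(anonymous_ystr[m] == entry["ystr"][m] for m in shared) / len(shared)
        if score >= threshold and (best is None or score > best[1]):
            best = (entry["surname"], score)
    return best[0] if best else None

def narrow_by_metadata(surname, born, state, records):
    """Combine the inferred surname with demographic metadata to shortlist identities."""
    return [r for r in records
            if r["surname"] == surname and r["born"] == born and r["state"] == state]

# An "anonymous" haplotype plus year of birth and state suffice to shortlist a person.
surname = infer_surname({"DYS19": 14, "DYS390": 24, "DYS391": 11}, GENEALOGY_DB)
matches = narrow_by_metadata(surname, 1946, "CA", PUBLIC_RECORDS)
```

The point of the sketch is structural: neither dataset is identifying on its own, and it is the recursive linking of traces across infrastructures, genealogy database plus public records, that produces the track back to a named person.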
The confirmation of these possibilities led a group of medical research bodies in the UK to accept the recommendations “that sanctions, such as the withdrawal of funding for or access to data resources, should be applied if researchers deliberately attempted to reidentify individuals from anonymized data. The group also set out the existing legal sanctions under the [UK] Data Protection Act, which include a maximum penalty of £500,000 for deliberately reidentifying participants from anonymised research data” (Wise 2014). Fundamental to such policies is a belief that trust will secure medical research for the public good. The report concludes with a citation from John Savill, chief executive of the Medical Research Council: “It’s important that we protect the interests and anonymity of individuals while enabling research that benefits all of society. As funders we are committed to working together to reduce the risk of reidentification in a way that does not block valuable research to advance social and medical science and improve health” (ibid.).
The issue of trust—or its apparent obverse of trustlessness—has also been raised in relation to our fourth example of crypto-currencies, notably bitcoin (Mallard, Méadel, and Musiani 2014), for which the safety, integrity, and balance of its ledgers were, at least in its early days, maintained by a community of mutually distrustful parties referred to as miners.9 These “private” currencies are seen by some to have the potential to provide an alternative to the national currencies of sovereign states, to undermine the ability of central banks to influence the price of credit for national economies, and to make it more difficult for national statistical agencies to gather data on economic activity for the benefit of the public.10 At the same time, the “private” currency bitcoin also depends on an ideal type of public, in which trust is either irrelevant or taken for granted as part of the constitution of such a public. What is described as trustlessness derives from the transparency of the ledger, whose distributed, anonymized recursive operation on top of a P2P (peer-to-peer) network protocol11 makes it difficult to fake transactions and generate anything resembling counterfeit currency.12 As Mallard et al. put it, “every user that accepts to mine contributes a brick to the collective building of a trust that would, then, no longer need to be incarnated in specific institutional authorities.” This second pair of examples thus suggests that the organization of trust(-less)ness—alongside (de-)individuation—also contributes to the making of the recursive fractals of the public-private.13 There is no single distinction between public and private but multiple distinctions that do not all align: a private individual becomes a public entity through incorporation, but the resulting corporation addresses the specifically personal; a private currency has to be distinguished from a public one but, in its mining or making, requires its own public.
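The ledger transparency on which this “trustlessness” rests has a simple cryptographic core: each block in the chain commits to the hash of its predecessor, so altering any past transaction invalidates every later block, and the forgery is detectable by anyone who replays the chain. The following Python sketch shows only this hash-chaining idea; it deliberately omits mining difficulty, digital signatures, and the P2P network over which the real ledger is distributed.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, which include the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Append a block that commits to the hash of the current chain tip."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "tx": transactions})
    return chain

def verify(chain):
    """Replay the chain: every block must reference its predecessor's true hash."""
    for i in range(1, len(chain)):
        if chain[i]["prev"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, ["alice -> bob: 5"])
append_block(chain, ["bob -> carol: 2"])
assert verify(chain)

# Tampering with an earlier transaction breaks every subsequent link.
chain[0]["tx"] = ["alice -> mallory: 500"]
assert not verify(chain)
```

Because any member of the mining public can run `verify` for themselves, no institutional authority needs to vouch for the ledger: the recursive chaining of traces into a track is itself what stands in for trust.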
To consider these fractals further we turn to our fifth case, which provides another instance of how the distinction between public/private is drawn, this time in relation to matters of security and (state) secrecy. We discuss this case in more detail to show something of the complexity of the production of the recursive fractal through forms of tracking that make use of repeated categorizations and calibrations. We then discuss tracking explicitly since we believe it is a set of observational practices that is coming to profoundly reconfigure the nature of the public-private distinction and to challenge existing forms of personhood.
Tracking the Disappearing Plane
Our final example is the case of Malaysia Airlines (MA) Flight 370, the scheduled passenger flight from Kuala Lumpur to Beijing that lost contact with air traffic control on March 8, 2014 at 01:20 MYT (Malaysia Time), less than an hour after takeoff. The aircraft, a Boeing 777-200ER, was carrying twelve Malaysian crew and 227 passengers from fourteen nations. No crash site has yet been found and there has been no confirmation of any flight debris.14 But how can a plane “disappear”? We suggest that “disappearance” is a consequence of a combination of “double blind” and “double bind” operating across what Luhmann (1998) calls first- and second-order observations. The former (first order) observes things; the latter (second order) observes things, including other observers, in their environment of observation. Any interpretation resulting from this second-order observation is, Luhmann says, likely to be both paradoxical and contingent.
To start our consideration of this case, we introduce some images. The first (figure 3.1) shows a ship falling off the edge of the (flat) world; this edge, or boundary, is redefined in the second (figure 3.2) as a horizon by hypothesizing a line of sight—of observation—between observer and observed in relation to a globe.
Our next image (figure 3.3) shows a diagram devised to track the possible movements of MA Flight 370 once it lost contact with air traffic control, that is, not only after the “seer” could no longer observe the observed but also after the observed stopped observing the observer, at least by the usual methods. In this image, we suggest, boundaries are drawn in the locally flat surface of a sphere.
To organize our analysis of these images we want to make use of a claim made by the artist Mel Bochner: namely, that boundary making is “a hypothesis of sight,” a process of composition (2008, 128). In relation to this last image, we suggest that the hypothesis takes place in relation to a surface of visualization, the organization of which produces a patterning of a problem space that configures what is visible and what is invisible, and indeed determines whether that invisibility or non-visibility is configured as secrecy, privacy, or, as in the case of MA Flight 370, disappearance. At a very general level, this organization is supported by a gamut of technical supports or platforms placed in specific relations to each other. Weber, for example, has shown how television emerges across the relations among three sites: the place of recording, the place of reception, and the place of transmission (Weber 1996). Increasingly, surfaces also emerge in the relations established among GPS, the military, governments, and everyday users of location-based devices and applications through algorithmic rules.
The initial hypothesis of sight that operates to produce boundaries in this case is, we suggest, what one of the founders of cybernetics, Heinz von Foerster (1995), describes as “double blind.” A blind spot, or scotoma, is an obscuration of the visual field. A particular blind spot known as the physiological blind spot, or punctum caecum, is the place in the visual field that corresponds to the lack of light-detecting photoreceptor cells on the optic disc of the retina where the optic nerve passes. Since there are no cells to detect light on the optic disc at this point, the brain interpolates surrounding detail and information from the other eye, with the result that the blind spot is not normally perceived. Interestingly, von Foerster does not discuss the blind spot but the double blind; indeed, he sums up the long-established finding that each of our two eyes has a blind spot with the aphorism: “We do not see what we do not see” (or as he put it elsewhere, “We do not see that we do not see”). And certainly, we live in an era defined by a cultural imaginary of total planetary observation made possible by the belief that, because of the multiplication of “eyes” to observe, “we” can see always and everywhere.
Laura Kurgan provides an example in her discussion of a sequence of views of the planet Earth that begins with The Blue Marble, a photographic view of the Earth as seen by the Apollo 17 crew traveling toward the moon in 1972, and ends with analogous digital 2012 versions of this image. She describes these later versions as being
… assembled from data collected by the Visible/Infrared Imager Radiometer Suite (VIIRS) on the Suomi NPP satellite in six orbits over eight hours. These versions are not simply photographs taken by a person traveling in space with a camera. They are composites of massive quantities of remotely sensed data collected by satellite-borne sensors. …This is not the integrating vision of a particular person standing in a particular place or even floating in space. It’s an image of something no human could see with his or her own eye … because it’s a full 360-degree composite, made of data collected and assembled over time, wrapped around a wireframe sphere to produce a view of the Earth at a resolution of at least half a kilometer per pixel—and any continent can be chosen to be at the center of the image. [Moreover] … it can always be updated with new data.
(Kurgan 2013, 11–12; our emphasis)
As a composite, this image is especially interesting because it has no (visible) edges; that is, it appears to be a hypothesis of sight that does not make boundaries or, perhaps phrased better, is boundless. But, of course, the plane that disappeared has shown that “we” cannot see always and everywhere, even with the aid of the huge technical complex of GPS, civilian and military satellites, drone cameras, Google Maps, and so on. The disappearance thus reveals that there are edges (which might appear as gaps, corridors, out-of-focus patches, and the like) in the apparently boundless surface of visualization, even if we do not always know where they are or how they operate, where or when we might fall over these “edges.”
This case of double blind is further complicated, however, by the operation of a double bind. The double bind is a phenomenon described by the anthropologist, fellow contributor to cybernetics, analyst of schizophrenia, and advocate of Alcoholics Anonymous15 Gregory Bateson. Bateson suggests that the double bind—which he describes as “an experienced breach in the weave of contextual structure”—is a characteristic of all adaptive change. Such change, he says, depends upon feedback loops, “be it those provided by natural selection or those of individual reinforcement.” He continues, “In all cases, then, there must be a process of trial and error and a mechanism of comparison.” But, he notes, “trial and error must always involve error, and error is always biologically and/or psychically expensive. It follows therefore that adaptive change must always be hierarchic.” The introduction of hierarchy is necessary since “[t]here is needed not only that first-order change which suits the immediate environmental (or physiological) demand but also second-order changes which will reduce the amount of trial and error needed to achieve the first-order change. And so on” (Bateson 1987, 201). Our suggestion, then, is that the disappearance of MA Flight 370 is a consequence of the emergence of a hierarchy of first- and second-order observation in the surface of visualization via feedback loops.
Let us explore this in a bit more detail. In the case of the disappearing plane, the double bind in operation for some, especially state actors, is the coexistence of the injunction to “see everything” (or at least the injunction, “do not see what you do not see”) with the contradictory injunction to “see nothing” (which is to say, “do not see what you see”). The relatives of flight passengers in China and elsewhere most likely occupy a distinct position, since they want to know what has happened to the plane and its passengers. They personify the imperative to see everything; they focus the minds of state actors who do not want to admit the failure of the fantasy of total planetary observation that is the double blind hypothesis. However, in order for this double blind hypothesis to be preserved, observers who are differently situated in relation to the context(s) of observation would need to reveal to each other what they have (not) seen, and with this comes the challenging possibility of comparison of the capacity to observe. Comparison, in turn, carries the “cost” of very significant political tension for such observers, perhaps especially for those acting on behalf of the state, intensifying the situation in which they simultaneously feel compelled to observe the injunction to not see what they do not see and to not see what they do see. These twin injunctions inform a slow dance of observe and (don’t) tell, which both creates and results from unstable hierarchies of first and second orders of observation, in which a plane can disappear, and in which mechanisms of comparison can only precariously establish what can and what cannot be seen.
And, not surprisingly, the disappearance of the plane was followed by a huge proliferation of hypotheses of sight, each of which reconfigured the distinction between security and secrecy. The actors or “eyes” who had an interest in, and the capacity to observe, the movement of the plane were many and various, including not only the Malaysian civil aviation department, air force, navy, and Maritime Enforcement Agency, but also numerous national and international bodies, resulting in the deployment of ships and aircraft from Australia, China, Japan, Malaysia, New Zealand, South Korea, the United Kingdom, and the United States. Significantly, for our interest in the hierarchy of first and second orders of observation, Wikipedia (a kind of composite eye itself) posted the following about “information sharing” among such actors/eyes:
Although Malaysia’s acting Transport Minister Hishammuddin Hussein, who is also the country’s Defence Minister, denied the existence of problems between the participating countries, academics said that because of regional conflicts, there were genuine trust issues involved in co-operation and sharing intelligence, and that these were hampering the search. International relations experts said entrenched rivalries over sovereignty, security, intelligence, and national interests made meaningful multilateral co-operation very difficult. A Chinese academic made the observation that the parties were searching independently, thus it was not a multilateral search effort. …
Defence experts suggested that giving others access to radar information could be sensitive on a military level. As an example: “The rate at which they can take the picture can also reveal how good the radar system is.” One suggested that some countries could already have had radar data on the aircraft but were reluctant to share any information that could potentially reveal their defence capabilities and compromise their own security. Similarly, submarines patrolling the South China Sea might have information in the event of a water impact, and sharing such information could reveal their locations and listening capabilities.
(https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_370, accessed September 2014)
What unites the resulting multiple, competing, and variously credible hypotheses of sight that emerged to account for the problem of the disappearance of MA Flight 370 is, we suggest, the operation of various feedback loops between first and second orders of observation. Once double bind is in operation alongside double blind, not only may you not see what you do not see (the double blind), you may not see what you see, and you may also see what you do not see (the double bind), and so on. In the process, the feedback loops between first- and second-order observations produce “tracks” from “traces,” which are, as Bateson says, “superposing and interconnecting.” That is, the traces do not “make clear” or “lay bare,” as in common usage, but result in a whole range of affective disorders of visualization—paranoia, narcissistic exceptionalism, and despair as well as the hope that the injunction “we do not see what we do not see” does in fact mean that we can see everything, and so can learn the fate of the disappearing plane and the people within it.
Orders of Observation and Lines of Sight
So, what can “we” (the public?) observe in relation to this multiplication of “eyes” with which to see? Perhaps “we” can observe that the invisibility of the plane brings into visibility at least some of the technical supports necessary for observation, providing the possibility of piecing together knowledge about processes of composition in relation to, among other matters of concern, international security, secrecy, and the current technological capacity to observe in different media, including water.
We can begin to observe these processes from accounts that suggest the plane was first described as having disappeared when it crossed the edge of Malaysian and Vietnamese airspace, that civilian and military radar provided different information, and that tracking via satellite could apparently provide information (technically speaking, “pings”) about a change in direction but not whether this was to a northern or a southern corridor. Much less clearly visible, but still observable, or rather made observable by being rendered unobservable in specific ways, are the political dimensions that mean the information gained by both first- and second-order observation is made available only in particular ways by different actors. In short, what “we” can observe is that the plane’s (possible) movements become variously visible (or invisible) in relation to different technologies of observation, mobility, and political and military “reach” (Allen and Cochrane 2010). What appears as a seamless, boundary-less 360-degree composite is shown instead to be a military-political-technical patchwork (figure 3.4), a ceaselessly ongoing collage with carefully stitched-together edges, corridors, and targets, international treaties, submerged alliances and territorial conflicts, satellites and submarines.
To investigate the dynamics of this composition further, we can turn to discussion of other kinds of nonobservability, such as those of national state secrecy. Peter Galison, for example, points out that a targetable infrastructure has both come into view in recent years and been rendered a matter of secrecy in the United States: “With the terror wars raging across the globe, sites previously invisible to the targeters—passenger trains, shopping centers, sports stadiums, monuments—suddenly came all too clearly into view” (Galison 2010, 962). And he suggests that it is “this targetable infrastructure” that informs the contemporary organization of state secrecy:
After September 11, 2001, the whole system of national security began to shift, first with the Patriot Act (October 26, 2001) and then in a host of other alterations to the law. One key change in the secret universe was President George W. Bush’s Executive Order 13292 of March 25, 2003. … For Bush, the goal was to provide a system of information control that would continue the older “scientific, technological, or economic matters relating to the national security,” but added to it the clause that this should “include defense against transnational terrorism” … The government would seek as well protection to cover … “vulnerabilities or capabilities of systems, installations, infrastructure, projects, plans, or protection services related to the national security, which includes defense against transnational terrorism.” With this new vocabulary, especially in the inclusion of infrastructure, lay a sea change in the ontology of secrecy. (ibid.) … Bush framed the national strategy with a picture of the opposition: “The terrorist enemy that we face is highly determined, patient, and adaptive” (ibid., 964).
Galison concludes, “Our new security fence is everywhere, not delimited by time or space. … Critical Unclassified Information fits our age exactly: a form of secrecy with no end date, no limit of scope, and little access through the Freedom of Information Act. In short we have a new ontology of hidden knowledge: multiply infinite secrets for a boundless conflict” (Galison 2010, 969–970). He suggests that the composite form identified by Kurgan allows for the multiplication of secrets, which in turn provide the grounds for conflict without end: not a composite of total planetary observation, but of infinite secrets for a total war (see also Chow 2006).
Tracking: A Stitch in Time
How, then, are we to understand what is at stake for biosensing from this variety of examples? In what we have presented so far, we have added the notion of composition to Gal’s conceptualization of the public-private distinction as a recursive fractal so as to draw attention to the specificity (p.57) and variety of ways in which this fractal is now being produced. On the basis of our examples, we suggest that it is not simply the capacity to observe or sense that is being extended (that is, it is not simply the case that observers can observe more than before), but also the patterns of observation that are being complicated by these compositions. New practices of composition multiply the relations between observers, and in particular, the methods and kinds of feedback loops. And while we further suggest that many of these relations now commonly take the form of practices of tracking, we also note that their outcome is not pre-given.16
In a previous discussion of numbering practices (Day, Lury, and Wakeford 2014), we identified several processes of composition including zooming, folding, pausing, accreting, knotting, and diffracting. For example, we considered how fractions, including percentages, might be considered as practices of folding in which a denominator or whole is (recursively) folded (and unfolded) into (enumerated) parts. Here, in the discussion of the disappearing plane, we have suggested that such processes of composition can also be produced—via the use of feedback loops—as tracks. While the notion of a feedback loop was central to second-wave cybernetics, our examples indicate that it is now implicated in a variety of forms of tracking beyond what Hookway describes as the original “model space of a predator-prey relation” (2014, 107).17 We suggest that this increasing diversity of forms of tracking (that is, not only but including hunting) depends on which hypotheses of sight are in operation, and whether and how levels of observation are introduced and how they operate.
To elaborate on both the importance and heterogeneity of tracking, we introduce an analogy with practices of stitching. The needle entering into the fabric being stitched indicates a moment of observation or data collection, the leaving of a trace, but this trace only becomes part of a track when a line is drawn, when a relation is established, between such points—in the case of stitching, by the thread that the movement of the needle leaves behind. It is this thread that is the track, and it is the effect of a dynamic process of composition, the stitching of the collection of data18 across and between fabrics or spaces, running, edging, joining, hiding folds, superposing and interconnecting, all within the weave of the fabric or, as Bateson (1987) put it, the contextual structure that it comes to constitute.
One kind of tracking might be like a running stitch—the basic stitch in hand sewing and embroidery, on which all other forms of sewing are based (figure 3.5). The stitch is worked by passing the needle in and out (p.58) of the fabric: a relation of observing the self by the self in a recursive observer-observed relationship, as with Jennifer Lyn Morone Inc. In this case, the relation of separation between observer and observed that is essential for a trace to be recorded is effected by the incorporation of the person, a relation that, as we noted, acquires significance in terms of both the legal recognition of the corporation as a person and the history of the possessive individual. In this example, not only is the person as individual divided in herself by being incorporated, but her activities are also rendered as commodities.
Other kinds of tracking are more like blanket stitching (figure 3.6) in which an edge is secured: defining a territory or referent in relation to whose coordinates the object’s movements are mapped. An example might be found in the mid-twentieth-century form of broadcasting where the “audience,” in some sense observing television or other media, is constituted as a commodity to be bought and sold; the audience members’ activities of watching in turn observed—or tracked—by broadcasters, who sell analyses of the traces of the individual or household activity of watching to advertisers. This is the count plus loop.
A third kind of tracking might be understood as invisible (or stealth) stitching, in which the detail if not the existence of a feedback loop between first- and second-order observation is hidden. EdgeRank,19 the algorithm that Facebook developed to govern what is displayed—and how high—on an individual Facebook user’s News Feed, provides an example. The EdgeRank algorithm20 calculated the sum of edges—where an edge was: “basically everything that ‘happens’ in Facebook. Examples of Edges would be status updates, comments, likes, and shares. There were many more Edges than the examples above—any action that happens within Facebook is an Edge” (emphasis in original). Each edge itself
was a composite of Affinity (a one-way relationship between a user and an edge), Weight (a valuing system created by Facebook to give greater weight to some actions rather than others), and Time Decay (Facebook’s equation for this is 1/(time since action)).
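Pieced together from the cited description, this composite might be sketched as follows. This is an illustrative reconstruction only, not Facebook’s implementation: the function names and example weights are our own assumptions, as is the clamp on elapsed time, and the published “formulas” for EdgeRank vary across sources.

```python
import time

def edge_score(affinity, weight, created_at, now=None):
    """Illustrative EdgeRank-style score for a single edge.

    affinity   -- hypothetical one-way user-to-edge relationship score
    weight     -- hypothetical per-action weight (e.g., a comment
                  counting for more than a like)
    created_at -- Unix timestamp of the action
    """
    now = time.time() if now is None else now
    # Time Decay per the cited description: 1/(time since action).
    # The clamp to at least one second is our addition, to avoid
    # division by zero for a just-created edge.
    time_decay = 1.0 / max(now - created_at, 1.0)
    return affinity * weight * time_decay

def edgerank(edges, now=None):
    """Sum the scores of an object's edges, per the cited description."""
    return sum(edge_score(a, w, t, now) for (a, w, t) in edges)
```

On this sketch, an old but heavily weighted action can still be outranked by a recent, lightly weighted one—which is precisely the sense in which the loop between first- and second-order observation stays hidden from the user whose Feed it orders.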
In order for this stitching analogy to be useful, however, it is important to appreciate that the fabric in which the track is made visible is itself an ongoing contextual weave; as a consequence, tracking is constitutive and may produce knotting, fraying, or loose ends.21 Indeed, to continue with the analogy, it was the existing operation of EdgeRank that made the Facebook experiment discussed earlier in this chapter appear uncontroversial to those conducting it: “Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find the most relevant and engaging. One such test is reported (p.60) in this study: A test of whether posts with emotional content are more engaging” (Kramer, Guillory, and Hancock 2014).
In the light of other initiatives such as bitcoin, we suggest that likes, status updates, comments, and so forth can be understood as tokens in Facebook’s own crypto-currency: they—and tokens in other proprietary social media—constitute transferable property rights in social connections, and specifically in dividuality (discussion follows). Indeed, Facebook might be seen to hold a monopoly in its own currencies, the value of which it can manipulate through continual recalibration of the rate of exchange.
Conclusion: Tracking Persons
To conclude, we suggest that our examples do not simply support the claim that biosensing takes place in relation to a public-private distinction that is a recursive fractal, but indicate that this fractal is increasingly organized through a variety of processes of tracking, the recursive looping of data collection into data analysis. Not all loops are the same—tracks can be laid to make or seal edges, or to render invisible. Indeed, we have sought to show that tracking is central to the reconfiguration of economy and society, with implications for how we consider processes of securitization, financialization, and reterritorialization and deterritorialization, whether performed as scoring, ranking, or edging.
In one further—more speculative—step, we return to the relation between the recursive fractal of the public-private and the longstanding legal and political fiction of the person. The developments in observing and tracking we have described here trouble this category of the person, which currently comprises both people and corporations. These ways of organizing observation track not an individual person that is undivided or persistently the same but, rather, a dividual, continually remade in relations that divide and connect.22 Morone is parceled out, while Venter is reassembled through distributed datasets. We are not yet in a position to judge the outcome of this troubling of the category of person, but the issues at stake are suggested in some responses to the cases we have outlined.
For example, as we have indicated, crypto-currencies trouble the distinction of public-private. Indeed, we further observe that some currencies have adopted a playful approach to the project of personification by adopting a persona as a way of constituting—or masking—themselves as brands. But this strategy has met some resistance. For example, the (p.61) currency Coinye initially used an image of rapper Kanye West as its logo. Upon hearing of this use, attorneys for Kanye West sent a cease and desist letter to the developers of Coinye, stating that the use of “Coinye” was willful trademark infringement, unfair competition, cyberpiracy and dilution, and instructing Coinye to stop using the likeness and name of Kanye West (Wikipedia, http://en.wikipedia.org/wiki/Coinye, accessed September 2014). In another example, the U.S. Internal Revenue Service has recently reasserted the construct of the person as individual by ruling that bitcoin would be treated as property for tax purposes and not as a currency. This move simultaneously made specific individuals identifiable for the purpose of charging capital gains tax (only individuals, not dividuals, hold property rights) and denied the legitimacy of the apparently im-personal guarantee provided by the distributed trustless platform for bitcoin as a currency.
Responses to the finding that it is possible to blur the distinction between a nonidentifiable and an anonymous person in relation to publicly available genetic data provide a further vantage point from which to view these developments. As we have indicated, commentaries on genomic sequencing variously query inconsistent state privacy protection, posit a continuum between public and private, and describe alternative approaches supporting open-access datasets and shifting norms of privacy. Sanjay Mehta et al., for example, advocate a “differential privacy” policy for related genomics problems, based on quantifiable thresholds to determine the level of aggregation of data open for analysis. As they note, “A new and different approach shifts the focus of privacy from being regarded as a property of the data, as reflected in the HIPAA de-identification standard, to account for the process by which information is disclosed independently of the data. The emerging standard in privacy protection and disclosure control, termed differential privacy, (…) is based on this approach” (Mehta, Vinterbo, and Little 2014). In the policy forum of Science, Rodriguez et al. refer to “an increasing number of ‘citizen science’ initiatives [such as Sage Bionetworks Commons and Genomera], which use informatics tools and social-media strategies to build research models for integrating participant preferences about privacy protection and future research use in an iterative and dynamic way” (2013, 276). In these proposals, it is implied that a person may now usefully be recognized—and given rights—not as an individual, but as a dividual, a form of personhood whose value(s) emerge in relations of division and connection, relations not of sameness but of self-similarity.
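The “differential privacy” standard that Mehta et al. invoke—a guarantee attached to the process of disclosure rather than to the data—can be made concrete with a minimal sketch. The Laplace-noise counting query below is a textbook instance of the approach, not the authors’ own proposal; all names and parameters are illustrative.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    # The max() guards against log(0) at the distribution's edge.
    return -scale * sign * math.log(max(1.0 - 2.0 * abs(u), 1e-300))

def private_count(values, predicate, epsilon):
    """Release a count under epsilon-differential privacy.

    Adding or removing one record changes a counting query by at
    most 1 (sensitivity 1), so Laplace noise of scale 1/epsilon
    suffices: the released number is statistically almost the same
    whether or not any one individual's record is in the dataset.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

The threshold-like quality the authors describe is visible in the parameter epsilon: a smaller epsilon means more noise and stronger protection for any one contributor, at the cost of less precise aggregate analysis.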
(p.62) In short, the operation of recursion in all these examples, so integral to the expansion of forms of tracking we identify in this chapter, means that the edges of a person cannot easily be stabilized or secured. In consequence, relations to property as relations between persons constituted as individuals are in flux. It is too early to see clearly whether and how the legal fiction of the person (and corporation) as individual can withstand the challenge of experiments to capture the value of this emerging dynamic dividuality. But, in our view, it is this developing fiction that will (or will not) stabilize the recursive fractal of the public-private and provide the context for biosensing in the future.
We would like to acknowledge Intel University Research Office, which supported our project on contemporary numbering practices (2011–2014) as part of a wider program on biosensors. This project supported our work with Nina Wakeford, whom we also want to thank. In addition, Sophie Day acknowledges research support from the School of Public Health, Imperial College London and the National Institute for Health Research Imperial BioMedical Research Centre, and Celia Lury acknowledges the support of the Economic and Social Research Council, grant no: ES/K010689/1.
Ali, Robleh, John Barrdear, Roger Clews, and James Southgate. 2014. “The Economics of Digital Currencies.” Bank of England Quarterly Bulletin, Q3. http://www.bankofengland.co.uk/publications/Documents/quarterlybulletin/2014/qb14q3digitalcurrenciesbitcoin2.pdf. Accessed September 2014.
Allen, John, and Allan Cochrane. 2010. “Assemblages of State Power: Topological Shifts in the Organization of Government and Politics.” Antipode 42 (5): 1071–1089.
Bateson, Gregory. 1987. Steps to an Ecology of Mind. San Francisco: Chandler.
Bochner, Mel. 2008. Solar Systems & Rest Rooms: Writings and Interviews, 1965–2007. Cambridge, MA: The MIT Press.
Chow, Rey. 2006. The Age of the World Target: Self-referentiality in War, Theory and Comparative Work. Durham, NC: Duke University Press.
Gal, Susan. 2002. “A Semiotics of the Public/Private Distinction.” differences: A Journal of Feminist Cultural Studies 13 (1): 77–95.
Galison, Peter. 2010. “Secrecy in Three Acts.” Social Research: An International Quarterly 77 (3): 941–974.
Gross, Ana. 2015. “The Political Ontology and Economy of Data.” PhD thesis. Centre for Interdisciplinary Methodologies, University of Warwick, Coventry, UK.
Gutmann, Amy. 2013. “Data Re-Identification: Prioritize Privacy.” Science 339: 1032.
Gymrek, M., A. McGuire, D. Golan, E. Halperin, and Y. Erlich. 2013. “Identifying Personal Genomes by Surname Inference.” Science 339: 321–324.
Hart, Keith. 2005. The Hit Man’s Dilemma: Or Business, Personal and Impersonal. Chicago: Prickly Paradigm.
Hookway, Branden. 2014. Interface. Cambridge, MA: The MIT Press.
Ingold, Tim. 2007. Lines: A Brief History. London: Routledge.
Kelty, Christopher M. 2005. “Geeks, Social Imaginaries and Recursive Publics.” Cultural Anthropology 20 (2): 185–214.
Kelty, Christopher. 2008. Two Bits: The Cultural Significance of Free Software. Durham, NC: Duke University Press.
Kitchin, Rob. 2014. The Data Revolution. London: Sage.
Kramer, Adam D. I., Jamie E. Guillory, and Jeffrey T. Hancock. 2014. “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.” Proceedings of the National Academy of Sciences of the United States of America 111 (24): 8788–8790.
Kurgan, Laura. 2013. Close Up at a Distance: Mapping, Technology and Politics. Cambridge, MA, and London: The MIT Press.
Luhmann, Niklas. 1998. Observations on Modernity. Stanford, CA: Stanford University Press.
Macpherson, C. B. 1962. The Political Theory of Possessive Individualism: Hobbes to Locke. Oxford: Clarendon Press.
Mallard, A., C. Méadel, and F. Musiani. 2014. “The Paradoxes of Distributed Trust: Peer-to-Peer Architecture and User Confidence in Bitcoin.” Journal of Peer Production 4. http://peerproduction.net/issues/issue-4-value-and-currency/peer-reviewed-articles/the-paradoxes-of-distributed-trust/. Accessed September 2014.
Marriott, McKim. 1976. “Hindu Transactions: Diversity Without Dualism.” In Transaction and Meaning: Directions in the Anthropology of Exchange and Symbolic Behavior, ed. Bruce Kapferer, 109–142. Philadelphia: Institute for the Study of Human Issues.
Morris, Brian. 1991. Western Conceptions of the Individual. London: Berg.
P. H. 2014. “Who Owns Your Personal Data? The Incorporated Woman.” The Economist, June 27. http://www.economist.com/node/21606113?fsrc=scn/tw/te/bl/theincorporatedwoman. Accessed July 2014.
Rodriguez, Laura L., Lisa D. Brooks, Judith H. Greenberg, and Eric D. Green. 2013. “The Complexities of Genomic Identifiability.” Science 339: 275–276.
Simmel, Georg. 1950. The Sociology of Georg Simmel, trans., ed., and introduction by K. H. Wolff. Glencoe, IL: Free Press.
Strathern, Marilyn. 1988. The Gender of the Gift: Problems with Women and Problems with Society in Melanesia. Berkeley: University of California Press.
Verma, Inder M. 2014. “Editorial Expression of Concern.” Proceedings of the National Academy of Sciences of the United States of America 111 (29). doi:10.1073/pnas.1412469111.
von Foerster, Heinz. 1995. “Heinz von Forster [sic].” The Cybernetics Society. http://www.cybsoc.org/heinz.htm. Accessed September 2014.
Weber, Samuel. 1996. Mass Mediauras: Form, Technics, Media. Stanford, CA: Stanford University Press.
Wise, Jacqui. 2014. “UK Research Bodies Agree Steps to Protect Anonymity of Study Participants.” British Medical Journal 348: g2372.
(1.) Luhmann defines observation as “any kind of operation that makes a distinction so as to designate one (but not the other) side. Such a definition is itself contingent, since what is defined would have another meaning given another distinction” (1998, 47).
(2.) This definition can be contrasted with practices of counting, one-off additive techniques in which analysis is separated analytically from the collection of data (see Day, Lury, and Wakeford 2014).
(5.) The classic account of such “possessive individualism” is Macpherson (1962). While responses to the online article in which we first encountered this project suggest that the value of an individual’s personal data might amount to no more than £10 a year, the magazine includes in its online report the observation that there are “13 trackers and beacons on this page alone.”
(p.63) (6.) As biosensing relies on social media platforms such as these, we do not make a hard distinction between such sensing and other ways of participating and knowing.
(7.) Gal’s approach differs from Kelty’s recursive publics (2005, 2008) insofar as she argues that the public is a derivative of the public-private coupling. Whereas Patterson and Nissenbaum (chapter 5, this volume) suggest that privacy is context dependent, our argument is that context dependence is actively produced as the specific process of patterning we describe here.
(9.) The more people mine bitcoin, the harder it becomes to generate. Mining now consumes more power than a typical household uses. Consequently KnCMiner (the company that used to sell bitcoin mining equipment) has recently built an industrial data center in northern Sweden, inside the Arctic Circle. From there, it generates bitcoin for itself as revenue and leases power to customers; see http://www.coindesk.com/kncminer-cloud-mining-service-arctic-bitcoin-mine/, accessed September 2014.
(11.) Mallard, Méadel, and Musiani (2014) provide a detailed description: “As a service operating on top of a peer-to-peer network, bitcoin allows users to execute payments by digitally signing their transactions. It prevents the possible problem of double-spending ‘digital’ coins through a distributed time-stamping service.”
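The distributed time-stamping that Mallard, Méadel, and Musiani describe can be illustrated, very schematically, by hash-chaining batches of transactions. This toy sketch (our own, with hypothetical names) omits the peer-to-peer consensus that makes bitcoin’s version trustless; it shows only why a chained time-stamp resists the retroactive rewriting on which double-spending depends.

```python
import hashlib
import json

def stamp(prev_hash, transactions, timestamp):
    """Time-stamp a batch of transactions against its predecessor.

    Each block commits to the previous block's hash, so altering or
    reordering any past transaction—e.g., to spend the same coin
    twice—would change every subsequent digest in the chain.
    """
    block = {
        "time": timestamp,
        "prev": prev_hash,
        "txs": transactions,
    }
    # sort_keys gives a canonical serialization, so the same block
    # always yields the same digest.
    payload = json.dumps(block, sort_keys=True).encode()
    return block, hashlib.sha256(payload).hexdigest()
```

In bitcoin proper, of course, the chain is extended by competing miners rather than a single caller, and it is that distributed agreement—not the hashing alone—that removes the need for a trusted time-stamping authority.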
(12.) In the way in which it calls into question the legitimacy of the state’s monopoly of currency, bitcoin evokes Simmel’s (1950) discussion of the sociology of secrecy, where secrets index degrees of publicness in specific ways.
(13.) Mallard, Méadel, and Musiani (2014) write: “Bitcoin addresses—the virtual pseudonyms that identify users vis-à-vis the system—are the arrangement to which privacy protection for participants in the system is delegated. Each user possesses one or more bitcoin addresses that are stored and managed by its P2P client, its digital wallet. Each address is linked to a unique public/private key pair: … Bitcoin uses public-key cryptography, in which each user has a pair of cryptographic keys: a public encryption key and a private decryption key. The publicly available encrypting-key is widely distributed, while the private decrypting key is known only to its proprietor. The keys are related mathematically, but the parameters are chosen so that calculating the private key from the public key is either impossible or prohibitively expensive.”
(14.) At the time of going to press, seventeen months after the plane disappeared, debris has been found washed up on Réunion that may yield evidence of a crash (BBC News, July 30, 2015).
(15.) Bateson (1987) claimed that his cybernetic epistemology coincided closely with the epistemology of Alcoholics Anonymous. Dawn Nafus pointed out to us that a parallel is commonly drawn between AA and the Quantified Self movement that has emerged in recent years.
(p.64) (16.) As Bochner says, what is key to composition is not simply “the adjustment of the parts, i.e., their size, shape, color or placement to arrive at the finished work, but that the exact nature of that finished work ‘is not known beforehand’” (2008, 37). See also Ingold (2007).
(17.) Hookway writes, “the teleology of cybernetics is an equilibration of behavior in the pursuit of a target, as a predator pursues prey. … In this way, intelligent technologies evolve within the model space of a predator-prey relation” (2014, 107).
(18.) As Dawn Nafus observes (pers. comm.), engineers “stitch” together data that are qualitatively different—such as data referring to blood pressure and data referring to sleep. Combined, the stitches align the time stamps so that it is possible to see the joint trajectory of each data stream.
(19.) In 2011, Facebook stopped using the EdgeRank name internally to refer to its News Feed ranking algorithm.
(20.) This description of EdgeRank was taken from a webpage (December 2014) produced by Applum: http://www.whatisedgerank.com. Applum has produced a device called EdgeRank Checker that measures the average impact of EdgeRank; see http://applum.com. Other descriptions give alternative “formulas.”
(21.) In discussion of these analogies, Dawn Nafus suggested that Facebook’s poke feature might constitute an example of “fraying,” used until so worn that it is cast aside.
(22.) We should note that we appreciate that persons relate “dividually” every day despite and alongside these norms (see, for example, Morris 1991). The concept of dividuality developed in Indian (Marriott 1976) and Melanesian (Strathern 1988) ethnographies has been greatly expanded since Marilyn Strathern’s work deploying this perspective in Euro-American contexts.