Digital technologies and data as sociomaterial objects

An excerpt from Chapter 2: ‘Theorising digital society’ from my book Digital Sociology (forthcoming, Routledge).

As sociologists and other social theorists have begun to argue, digital data are neither immaterial nor only minuscule components of a larger material entity. This perspective adopts a sociomaterial approach drawn from science and technology studies, an interdisciplinary field that has provided a critical stance on media technologies in general, and computerised technologies more specifically … In this literature, the digital data objects that are brought together through digital technologies (including ‘like’ or ‘share’ buttons, individuals’ browser histories, personalised recommendations and comments on social media posts, as well as the hardware and software that structure the choices available to users) are assemblages of complex interactions of economic, technological, social and cultural logics (Mackenzie, 2005; Mackenzie and Vurdubakis, 2011; Caplan, 2013; Langlois and Elmer, 2013). Representing digital phenomena as objects serves the purpose of acknowledging their existence, effects and power (Marres, 2012; Caplan, 2013; Hands, 2013; Langlois and Elmer, 2013).

The cultural and political analysis of computer software is sometimes referred to as software studies. Rather than emphasising the transmission or reception of messages, as in older models of communication, writers in software studies have developed a sociomaterial interest in the ways in which acts of computation produce and shape knowledges. Computer code is positioned as an agent in configurations and assemblages (Fuller, 2008), producing what Kitchin and Dodge (2011) refer to as ‘coded assemblages’. Indeed, the pervasive nature of software in everyday life is such that Manovich (2013: no page number given) argues that it has become ‘a universal language, the interface to our imagination and the world’. He contends, therefore, that social researchers should conceptualise people’s interactions with digital technologies as ‘software performances’ that are constructed and reconstructed in real time, with the software constantly reacting to the user’s actions.
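
One way to picture a ‘software performance’ is as an interface whose output is rebuilt on every user action rather than delivered once as a finished message. The hypothetical Python sketch below simulates a live search box that reconstructs its result list after each keystroke; the catalogue and the typed input are invented purely for illustration.

```python
# Hypothetical sketch of a 'software performance': the visible output is
# rebuilt in real time in response to each user action, rather than being
# transmitted once as a fixed message. Catalogue and keystrokes are invented.

CATALOGUE = ["running shoes", "road bike", "rowing machine", "yoga mat"]

def render(query):
    """Reconstruct the visible result list for the current query."""
    matches = [item for item in CATALOGUE if query.lower() in item]
    return matches or ["(no results)"]

# Simulate a user typing 'ro' one character at a time; the interface is
# re-performed after every keystroke.
typed = ""
for keystroke in "ro":
    typed += keystroke
    print(f"query={typed!r} -> {render(typed)}")
```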

… Digital data are also positioned as sociomaterial objects in this literature. Whereas many commentators in the popular media, government and the business world view digital data as the ultimate form of truth and accurate knowledge, sociologists and other social theorists have emphasised that these forms of information, like any other, are socially created and have a social life, a vitality, of their own. Digital data objects structure our concepts of identity, embodiment and relationships, our choices and preferences, and even our access to services or spaces.

There are many material aspects to digital data. They are the product of complex decisions, creative ideas, the solving and management of technical problems and marketing efforts on the part of those workers who are involved in producing the materials that create, manage and store these data. They are also the product of the labour of the prosumers who create the data. These are the ‘invisible’ material aspects of digital data (Aslinger and Huntemann, 2013).

Algorithms play an important role in configuring digital data objects. Algorithms measure and sort the users of digital technologies, deciding what choices they may be offered. Digital data objects aggregated together, often from a variety of sources, configure ‘metric assemblages’ (Burrows, 2012) or ‘surveillant assemblages’ (Haggerty and Ericson, 2000) that produce a virtual doppelganger of the user. Algorithms and other elements of software, therefore, are generative, a productive form of power (Mackenzie, 2005; Beer, 2009; Cheney-Lippold, 2011; Mackenzie and Vurdubakis, 2011; boyd and Crawford, 2012; Beer, 2013; Ruppert et al., 2013).
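
To give a concrete sense of the sorting work described above, the following is a minimal, hypothetical sketch (in Python) of how data points drawn from several sources might be assembled into a single profile (a ‘virtual doppelganger’) and used to decide which offers a user is shown. The data fields, thresholds and categories are invented for illustration and do not represent any actual platform’s algorithm.

```python
# Hypothetical illustration: assembling a user profile ('virtual doppelganger')
# from data gathered across several sources, then sorting the user into a
# category that determines what they are offered. All fields and rules are
# invented for the sake of the example.

def build_profile(browsing_history, purchases, social_likes):
    """Aggregate data points from different sources into one profile."""
    return {
        "pages_viewed": len(browsing_history),
        "total_spend": sum(p["price"] for p in purchases),
        "interests": {like["topic"] for like in social_likes},
    }

def sort_user(profile):
    """Assign the user to a marketing category based on the profile."""
    if profile["total_spend"] > 500:
        return "premium offers"
    if "fitness" in profile["interests"]:
        return "wellness promotions"
    return "generic advertising"

profile = build_profile(
    browsing_history=["/shoes", "/running-watches"],
    purchases=[{"item": "trainers", "price": 120.0}],
    social_likes=[{"topic": "fitness"}],
)
print(sort_user(profile))  # the category shapes which choices the user is offered
```

Even in this toy form, the categories and cut-off points are human decisions embedded in code, which is part of what makes algorithms a generative, productive form of power.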

Scholars who have adopted a sociomaterial perspective have also highlighted the tangible physicality of aspects of digital technology manufacture and use. Despite the rhetoric of seamless, proficient operation that is so commonly employed to discuss the internet and ubiquitous computing, the maintenance that supports this operation is messy and contingent, often involving pragmatic compromises, negotiations and just-in-time interventions to keep the system working. Geographical, economic, social, political and cultural factors – including such basic requirements as a stable electricity supply and access to a computer network – combine to promote or undermine the workings of digital technologies (Bell, 2006; Bell and Dourish, 2007; Dourish and Bell, 2007; Bell and Dourish, 2011). The materiality of digital hardware becomes very apparent when devices that are no longer required must be disposed of, creating the problem of digital waste (or ‘e-waste’) that often contains toxic materials (Gabrys, 2011; Miller and Horst, 2012).

Given the high turnover of digital devices, their tendency towards rapid obsolescence and the fact that people in wealthy countries often replace them every few years in search of the newest technologies and upgrades, vast quantities of digital waste are constantly generated. The vast majority of discarded digital devices end up in landfill. Only a small minority are recycled or reused, and those that are tend to be sent from wealthy to poor countries for scrap and the salvaging of components. Once outmoded and discarded, the shiny digital devices that were so full of promise when they were purchased simply become another form of rubbish: dirty, unsightly and potentially contaminating pollutants (Gabrys, 2011). The electricity supplies that power digital technologies and digital data storage units have environmental effects of their own on humans and other living things, such as the release of smoke and particles from coal-fired electricity generating plants. ‘The digital is a regime of energies: human energy and the energy needed for technological machines’ (Parikka, 2013: no page given).

The materiality of digital objects is also apparent in debates over how and where digital data should be stored, as they require ever-larger physical structures (servers) for archiving purposes. Despite the metaphor of the computing ‘cloud’, digital data do not hover in the ether but must be contained within hardware. Furthermore, digital data are very difficult to erase or remove, and thus can be very stubbornly material. At the same time, however, if stored too long without use, they may quickly become obsolete and therefore useless once contemporary technologies can no longer access and make use of them. Digital data, therefore, may be said to ‘decay’, becoming lost and forgotten, if they are not migrated to new technological formats. Digital memory is volatile because the technologies used to store and access data change so quickly. Analogue materials that are rendered into digital form for archival purposes and then destroyed may therefore be lost if their digital forms can no longer be used (Gabrys, 2011).

References

Aslinger B and Huntemann N. (2013) Digital media studies futures. Media, Culture & Society 35(1): 9-12.

Beer D. (2009) Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society 11(6): 985-1002.

Beer D. (2013) Popular Culture and New Media: the Politics of Circulation, Houndmills: Palgrave Macmillan.

Bell G. (2006) ‘Satu keluarga, satu komputer’ (one home, one computer): cultural accounts of ICTs in South and Southeast Asia. Design Issues 22(2): 35-55.

Bell G and Dourish P. (2007) Yesterday’s tomorrows: notes on ubiquitous computing’s dominant vision. Personal and Ubiquitous Computing 11(2): 133-143.

Bell G and Dourish P. (2011) Divining a Digital Future: Mess and Mythology in Ubiquitous Computing, Cambridge, MA: MIT Press.

boyd d and Crawford K. (2012) Critical questions for Big Data: provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society 15(5): 662-679.

Burrows R. (2012) Living with the h-index? Metric assemblages in the contemporary academy. The Sociological Review 60(2): 355-372.

Caplan P. (2013) Software tunnels through the rags ‘n refuse: object oriented software studies and platform politics. Culture Machine, 14. (accessed 8 August 2013).

Cheney-Lippold J. (2011) A new algorithmic identity: soft biopolitics and the modulation of control. Theory, Culture & Society 28(6): 164-181.

Dourish P and Bell G. (2007) The infrastructure of experience and the experience of infrastructure: meaning and structure in everyday encounters with space. Environment and Planning B: Planning & Design 34(3): 414-430.

Fuller M. (2008) Introduction, the stuff of software. In: Fuller M (ed) Software Studies: A Lexicon. Cambridge, MA: MIT Press, 1-13.

Gabrys J. (2011) Digital Rubbish: A Natural History of Electronics, Ann Arbor, MI: University of Michigan Press.

Haggerty K and Ericson R. (2000) The surveillant assemblage. British Journal of Sociology 51(4): 605-622.

Hands J. (2013) Introduction: politics, power and ‘platformativity’. Culture Machine, 14. (accessed 5 February 2014).

Kitchin R and Dodge M. (2011) Code/Space: Software and Everyday Life, Cambridge, MA: MIT Press.

Langlois G and Elmer G. (2013) The research politics of social media platforms. Culture Machine, 14. (accessed 8 August 2013).

Mackenzie A. (2005) The performativity of code: software and cultures of circulation. Theory, Culture & Society 22(1): 71-92.

Mackenzie A and Vurdubakis T. (2011) Codes and codings in crisis: signification, performativity and excess. Theory, Culture & Society 28(6): 3-23.

Manovich L. (2013) The algorithms of our lives. The Chronicle of Higher Education. (accessed 17 December 2013).

Marres N. (2012) Material Participation: Technology, the Environment and Everyday Publics, New York: Palgrave Macmillan.

Miller D and Horst H. (2012) The digital and the human: a prospectus for digital anthropology. In: Horst H and Miller D (eds) Digital Anthropology. London: Berg, 3-35.

Parikka J. (2013) Dust and exhaustion: the labor of media materialism. CTheory. (accessed 2 November 2013).

Ruppert E, Law J and Savage M. (2013) Reassembling social science methods: the challenge of digital devices. Theory, Culture & Society 30(4): 22-46.

 

The five modes of self-tracking

Recently I have been working on a conference paper that outlines the five different modes of self-tracking that I have identified as currently in existence. (Update – the full paper can now be downloaded here).

I argue that there is evidence that the personal data derived from individuals’ reflexive self-monitoring are now beginning to be used by agencies and organisations beyond the personal and privatised realm. Self-tracking rationales and sites are proliferating as part of a ‘function creep’ of the technology and ethos of self-tracking. The detail these data offer about individuals, and the growing commodification and commercial value of digital data, have led government, managerial and commercial enterprises to explore ways of appropriating self-tracking for their own purposes. In some contexts people are encouraged, ‘nudged’, obliged or coerced into using digital devices to produce personal data which are then used by others.

The paper examines these issues, outlining five modes of self-tracking that have emerged: private, pushed, communal, imposed and exploited. There are intersections and recursive relationships among these self-tracking modes. However, there are also observable differences relating to the extent to which self-tracking is taken up voluntarily and the purposes to which the data thus created are put.

Here are definitions of the typology of self-tracking that I have developed:

  • Private self-tracking relates to self-tracking practices that are taken up voluntarily as part of the quest for self-knowledge and self-optimisation and as an often pleasurable and playful mode of selfhood. Private self-tracking, as espoused in the Quantified Self’s goal of ‘self knowledge through numbers’, is undertaken for purely personal reasons and the data are kept private or shared only with limited and selected others. This is perhaps the most public and well-known face of self-tracking.
  • Pushed self-tracking represents a mode that departs from the private self-tracking mode in that the initial incentive for engaging in self-tracking comes from another actor or agency. Self-monitoring may be taken up voluntarily, but in response to external encouragement or advocacy rather than as a personal and wholly private initiative. Examples include the move in preventive medicine, health promotion and patient self-care to encourage people to monitor their biometrics to achieve targeted health goals. The workplace has become a key site of pushed self-tracking, particularly in relation to corporate wellness programs in which workers are encouraged to take up self-tracking and share their data with their employer.
  • Communal self-tracking involves the voluntary sharing of a tracker’s personal data with other people. Trackers may use social media, platforms designed for comparing and sharing personal data, and sites such as the Quantified Self website to engage with and learn from other self-trackers. Some attend meetups or conferences to meet face-to-face with other self-trackers and share their data and their evaluations of the value of different techniques and devices for self-tracking. This mode is also evident in citizen science, citizen sensing and community development initiatives that use data collected by individuals on their local environs, such as air quality, traffic conditions and crime rates, which are then aggregated with data from other participants for use in improving local conditions and services or for political action (a brief sketch of this kind of aggregation follows this list).
  • Imposed self-tracking involves the imposition of self-tracking practices upon individuals by others, primarily for these others’ benefit. Examples include the use of tracking devices as part of worker productivity monitoring and efficiency programs. There is a fine line between pushed self-tracking and imposed self-tracking. While some elements of self-interest may still operate, people may not always have full choice over whether or not they engage in self-tracking. In the case of self-tracking in corporate wellness programs, employees must give their consent to wearing the devices and allowing employers to view their activity data. However, failure to comply may lead to higher health insurance premiums enforced by an employer, as is happening in some workplaces in the United States. At its most coercive, imposed self-tracking is used in programs involving the monitoring of location and drug use for probation and parole surveillance, drug addiction programs and family law and child custody monitoring.
  • Exploited self-tracking refers to the ways in which individuals’ personal data are repurposed for the (often commercial) benefit of others. Exploited self-tracking is often marketed to consumers as a way for them to benefit personally, whether by sharing their information with others as a form of communal self-tracking or by earning points or rewards. However, their data are then used by second parties for their own purposes and in some cases are sold to or used by third parties. Customer loyalty programs, in which consumers voluntarily sign up to have their individual purchasing habits logged by retailers in return for points or rewards, are one example. Some retailers (for example a large pharmacy chain in the US) are beginning to use wearable devices as part of their customer rewards schemes, encouraging customers to upload their personal fitness data to the retailers’ platforms. These data can then be used by the retailers for their marketing, advertising and product offers, as well as being onsold to third parties.
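
To make the communal mode’s citizen-sensing example more concrete, here is a minimal, hypothetical sketch (in Python) of how air-quality readings contributed by individual trackers might be pooled by neighbourhood so that the aggregated results can inform local action. The readings, place names and alert threshold are invented purely for illustration and do not come from any actual initiative.

```python
# Hypothetical sketch: pooling individually collected air-quality readings
# (e.g. PM2.5 values) by neighbourhood, as in citizen-sensing initiatives.
# All readings, place names and the threshold are invented for illustration.
from collections import defaultdict

readings = [
    {"neighbourhood": "Riverside", "pm25": 38.0},
    {"neighbourhood": "Riverside", "pm25": 42.5},
    {"neighbourhood": "Hillcrest", "pm25": 12.3},
    {"neighbourhood": "Hillcrest", "pm25": 15.1},
]

def aggregate(readings):
    """Average each neighbourhood's contributed readings."""
    totals = defaultdict(list)
    for r in readings:
        totals[r["neighbourhood"]].append(r["pm25"])
    return {place: sum(vals) / len(vals) for place, vals in totals.items()}

ALERT_THRESHOLD = 35.0  # illustrative cut-off, not a regulatory standard

for place, average in aggregate(readings).items():
    flag = "above threshold" if average > ALERT_THRESHOLD else "ok"
    print(f"{place}: average PM2.5 {average:.1f} ({flag})")
```

The same pooling logic applies to other individually collected measures, such as traffic counts or noise levels.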

In the rest of the paper I draw upon theoretical perspectives on concepts of selfhood, citizenship, biopolitics and data practices and assemblages in discussing the wider sociocultural implications of the emergence and development of these modes of self-tracking. I argue that there are many important issues that require further exploration in relation to the appropriation of self-tracking. As humans increasingly become nodes in the Internet of Things, generating and exchanging digital data with other smart, sensor-equipped objects, self-tracking practices will most probably become unavoidable for many people, whether they are taken up voluntarily or pushed or imposed upon them. The evidence outlined in this paper suggests a gradually widening scope for the use of self-tracking that is likely to expand as a growing number of agencies and organisations realise the potential of the data that are produced from these practices.

Edit (12 December 2015): More on this topic can be found in my book The Quantified Self: A Sociology of Self-Tracking Cultures.