Managing and materialising data as part of self-tracking

Like many other forms of digital data, self-tracking data have a vitality and social life of their own, circulating across and between a multitude of sites. In a context in which digital data are culturally represented as liquid entities that require management and containment, part of the project of managing the contemporary body is the containment of the data that one’s body produces. As discursive representations of self-tracking and the quantified self frequently contend, personal data are profligate: it is only right that one should seek, first, to collect these data and, second, to manage and discipline them by aggregating them, representing them visually and making sense of them.

Shifting forms of selfhood are configured via these digital data assemblages, depending on the context in and purpose for which they are assembled. As the digital data produced by self-tracking are constantly generated and the combinations of data sets that may be brought together on individuals are numerous, personal data assemblages are never stable or contained. They represent a ‘snap-shot’ of a particular moment in time and a particular rationale of data practice. The data assemblages are always mutable, dynamic, responsive to new inputs and interpretations. They thus represent a type of selfhood that is distributed between different and constantly changing data sets. Self-tracking assemblages are constantly created and recreated when information about individuals is derived via digital technologies and then reassembled for various purposes.

Bodies and selves are always multiple, in whatever context they find themselves. However, for self-trackers this multiplicity is foregrounded in ways that may not have occurred in previous eras. If they are reviewing their personal data regularly, they cannot fail to be confronted with the shifting data assemblages that serve to represent their bodies and their selves. Part of the data practices in which they are invited to engage as part of self-tracking culture, therefore, is the negotiation and sense-making around the hybridity and vitality of their data assemblages.

To gain meaning from these data sets, self-trackers or third parties who seek to use their data must engage in sense-making that can interpret these data and gain some purchase on their mutating forms. An important element of self-tracking practices for many people is the visualisation or presentation of their personal data. The notion that data can be beautiful and aesthetically pleasing when presented in appropriate formats pervades data science in general: Gregg (2015) refers to this phenomenon as the ‘data spectacle’. The generation of digital data visualisations can be, variously, acts of work, creative expression and the presentation or performance of selfhood, with the latter an element in particular of self-tracking practices.

In the ‘show-and-tell’ ethos of the Quantified Self movement, finding compelling visual modes to demonstrate the patterns in one’s data is a central feature. The Quantified Self website is full of demonstrations by members of their data, including videos of their ‘show-and-tell’ presentations and still images of their visualisations. Collecting and aggregating personal data, therefore, are part of a range of practices involving self-knowledge and self-expression. By showing one’s data to others in a visually interesting and explanatory graphic, a self-tracker is achieving both self-knowledge and self-expression. Self-tracking becomes performative, both for the insights that a self-tracker may achieve about her or his life but also in terms of the aesthetics of the data that she or he may be able to curate.

The aesthetic elements of data visualisations involve affective responses that may include both pleasure and anxiety (McCosker and Wilken 2014). Indeed, McCosker and Wilken (2014) refer to the tendency in data visualisation circles to fetishise and render sublime ‘beautiful data’ as part of exerting mastery over the seemingly unlimited, and thus overwhelming, volumes of big digital data sets. Extending this logic, the physical materialising of digital data as a 2D or 3D object may offer a solution to the anxieties of big data. When it is one’s personal data, drawn from one’s own flesh, that are manifested in a material digital data object, this may provoke a sense of mastery over what may be experienced as a continually data-emitting subjectivity. The liquidity, flows and force of personal digital data become frozen in time and space, offering an opportunity to make sense of one’s data.

References

Gregg, M. (2015) Inside the data spectacle. Television & New Media, 16 (1), 37-51.

McCosker, A. and Wilken, R. (2014) Rethinking ‘big data’ as visual knowledge: the sublime and the diagrammatic in data visualisation. Visual Studies, 29 (2), 155-164.

Changing representations of self-tracking

I recently completed a chapter for a book on lifelogging that discussed the concepts and uses of data as they are expressed in representations of self-tracking (see here for the full paper, available open access). In part of the chapter I looked at the ways in which people writing about the quantified self and other interpretations of self-tracking represent data and data practices, including in articles published in Wired magazine and other media outlets and blogs.

From the beginning of discussions of the quantified self, the representation of data in quantified self-tracking discourses (at least as expressed by its progenitors) has included several factors. These include the following:

  • quantified data are powerful entities;
  • it is important not only to collect quantified data on oneself, but to analyse these data for the patterns and insights they reveal;
  • data (and particularly quantified or quantifiable data) are an avenue to self-knowledge;
  • the emergence of new digital and mobile devices for gathering information about oneself has facilitated self-tracking and the generation of quantified personal data;
  • quantifiable data are more neutral, reliable, intellectual and objective than qualitative data, which are intuitive, emotional and subjective;
  • self-tracked data can provide greater insights than the information a person receives from their senses, revealing previously hidden patterns or correlations;
  • self-tracked data can be motivational, inspiring action by entering into a feedback loop;
  • everything can be rendered as data; and
  • data about individuals are emblematic of their true selves.

In more recent times, however, it is evident that a further set of concepts about self-tracked data has emerged since the original euphoria of the early accounts of quantified self-tracking. They include the following:

  • the meaning of self-tracked data can be difficult to interpret;
  • personal data can be disempowering as well as empowering;
  • the conditions in which data are gathered can influence their validity;
  • the contexts in which data are generated are vital to understanding their meaning;
  • individuals’ personal data are not necessarily secure or private;
  • quantified personal data can be reductive; and
  • personal data can be used to discriminate against individuals.

As yet we know very little about how people conceptualise and engage with digital data about themselves. Given the recent scandals about how people’s personal data may be hacked, used or manipulated without their knowledge (the Snowden revelations about security agencies’ use of metadata, the Facebook emotional manipulation experiment, and the celebrity nude photo and Sony Pictures hackings, for example), as well as growing coverage of the potentially negative implications of self-tracking as described above, these are pressing issues.

The cultural specificity of digital health technologies

Digital health technologies configure a certain way of practising medicine and public health, a certain type of patient or lay person and a specific perspective on the human body. The techno-utopian approach to using digital health technologies tends to assume that these tacit norms and assumptions are shared and accepted by all the actors involved, and that they are acting on a universal human body. Yet even a cursory examination of surveys of digital health technology use demonstrates that social structural factors such as age, gender, education level, occupation and race/ethnicity, as well as people’s state of health and their geographical location, play a major role in influencing how such technologies are taken up by lay people and the extent to which they are able to access the technologies.

An American study of the use of some digital health technologies, using representative data collected by the National Cancer Institute in 2012, for example, found no evidence of differences by race or ethnicity, but significant differences by gender, age and socioeconomic status (Kontos et al. 2014). Female respondents were more likely to use online technologies for health-related information, as were younger people (aged under 65) and those of higher socioeconomic status. People of low socioeconomic status were less likely to go online to look for a healthcare provider, use email or the internet to connect with a doctor, track their personal health information online, use a website to help track diet, weight or physical activity, or download health information to a mobile device. However, they were more likely to use social media sites to access or share health information. Women were more likely than men to engage in all of these activities.

While there is little academic research on how different social groups use apps, market research reports have generated some insights. One report showed that women install 40 per cent more apps than men and buy 17 per cent more paid apps, while men use health and fitness apps slightly more (by 10 per cent) than women (Koetsier 2013). A Nielsen market report on the use of wearable devices found that while men and women used fitness activity bands in equal numbers, women were more likely to use diet and calorie counter apps (Nielsen 2014).

As these findings suggest, gender is one important characteristic that structures the use of digital health technologies. The digital technology culture is generally male-dominated: most technology designers, developers and entrepreneurs are male. As a result, a certain blindness to the needs of women can be evident. For example, when the Apple Health app was announced in 2014, destined to be included as part of a suite of apps on the Apple Watch, it did not include a function for tracking menstrual cycles (Eveleth 2014). Gender stereotypes are routinely reproduced in devices such as health and medical apps. As I noted in my study of sexuality and reproduction self-tracking apps, the sexuality apps tend to focus on documenting and celebrating male sexual performance, with little acknowledgement of women’s sexuality, while reproduction apps emphasise women’s fertility over men’s.

App designers and those who develop many other digital technologies for medical and health-related purposes often fail to recognise the social and cultural differences that may influence how people interact with them. Just as cultural beliefs about health and illness vary from culture to culture, so too do responses to the cultural artefacts that are digital health technologies. Aboriginal people living in a remote region of Australia, for example, have very different notions of embodiment, health and disease from those that tend to feature in the health literacy apps that have been developed for mainstream white Australian culture (Christie and Verran 2014). It is therefore not surprising that a review of the efficacy of a number of social media and apps developed for health promotion interventions targeted at Aboriginal Australians found no evidence of their effectiveness or benefit to this population (Brusse et al. 2014).

Few other analyses have sought to highlight the cultural differences in how people respond to and use digital health technologies. This kind of research is surely imperative if we are to challenge existing assumptions about ‘the user’ of these technologies and gain greater insight into their benefits and limitations.

My publications for 2014

This is the list of my publications that came out in 2014. If you would like a copy of any of the articles, please contact me on deborah.lupton@canberra.edu.au.

Books

Lupton, D. (2015) Digital Sociology (Routledge: this has a 2015 publication date, but actually was published in November 2014).

Special Journal Issue

Editor of special issue on ‘Beyond techno-utopia: critical approaches to digital health technologies’, Societies (volume 4, number 2), 2014.

Book Chapters

Lupton, D. (2014) The reproductive citizen: motherhood and health education. In Fitzpatrick, K. and Tinning, R. (eds), Health Education: Critical Perspectives. London: Routledge, pp. 48—60.

Lupton, D. (2014) Unborn assemblages: shifting configurations of embryonic and foetal embodiment. In Nash, M. (ed), Reframing Reproduction: Conceiving Gendered Experiences. Houndmills: Palgrave Macmillan.

Peer-reviewed Journal Articles

Lupton, D. (2014) ‘How do you measure up?’ Assumptions about ‘obesity’ and health-related behaviors in ‘obesity’ prevention campaigns. Fat Studies, 3(1), 32—44.

Lupton, D. (2014) The commodification of patient opinion: the digital patient experience economy in the age of big data. Sociology of Health & Illness, 36(6), 856—69.

Lupton, D. (2014) Precious, pure, uncivilised, vulnerable: infant embodiment in the Australian popular media. Children & Society, 28(5), 341—51.

Lupton, D. (2014) Quantified sex: a critical analysis of sexual and reproductive self-tracking apps. Culture, Health & Sexuality, online first, doi: 10.1080/13691058.2014.920528.

Lupton, D. (2014) Data assemblages, sentient schools and digitised HPE (response to Gard). Sport, Education and Society, online first, doi: 10.1080/13573322.2014.962496.

Lupton, D. (2014) Health promotion in the digital era: a critical commentary. Health Promotion International, online first, doi: 10.1093/heapro/dau091.

Lupton, D. (2014) Apps as artefacts: towards a critical sociological perspective on health and medical apps. Societies, 4, 606—22.

Lupton, D. (2014) Critical perspectives on digital health technologies. Sociology Compass, 8(12), 1344—59.

Editorials

Lupton, D. (2014) Beyond techno-utopia: critical approaches to digital health technologies. Societies, 4(4), 706—11.

Other Academic Publications

Lupton, D. (2014) Risk. In Cockerham, W., Dingwall, R. and Quah, S. (eds), The Wiley-Blackwell Encyclopedia of Health, Illness, Behavior and Society. New York: Blackwell, pp. 2067—71.

Lupton, D. (2014) ‘Feeling Better Connected’: Academics’ Use of Social Media. Canberra: News & Media Research Centre.

Itinerary for my trip to England in January 2015

Next month I will be visiting England to give talks and meet colleagues. It’s a whirlwind visit, with eight talks at seven universities in five days. The itinerary and further details are provided below for those who might be interested in coming along to any of the talks.

Monday 12 January

  • 10.30 am—3.00 pm: NSMNSS Knowledge Exchange Event, London: Speaking on ‘Using social media for academic work – possibilities, benefits and risks’. Further details here.
  • 5.00 pm—6.30 pm: Seminar at UCL, London. Speaking on ‘Fabricated data bodies: reflections on 3D printed digital body objects in medical and health domains’. Venue: Daryll Forde room, Department of Anthropology, UCL.

Tuesday 13 January

  • 2.00 pm—4.00 pm: Sociological Perspectives on Digital Health event, Warwick University. Speaking on ‘Critical digital health studies: a research agenda’. Further details here.
  • 5.00 pm—7.00 pm: What is Digital Sociology? event, Warwick University. Speaking on ‘What is digital sociology?’. Further details here.

Wednesday 14 January

  • 9.30 am—12.00 pm: Workshop at the Department of Primary Care Health Sciences, Green Templeton College, Oxford University. Workshop topic is ‘Theorising and researching medical and health apps and wearable self-tracking devices’.
  • 5.00 pm—7.00 pm: Digital Sociology event, Goldsmiths, University of London. Speaking on a panel on ‘Digital sociology, digital cultures, web science, data science … what’s the difference?’. Further details here.

Thursday 15 January

  • 10.00 am—4.00 pm: ‘Biosensors in Everyday Life’ workshop at Lancaster University. Speaking on ‘Self-tracking cultures: thinking sociologically about the quantified self’. Further details here.

Friday 16 January

  • 12.00 pm—4.00 pm: Yorkshire BSA Medsoc group event, University of York. Speaking on ‘Digital data, big and small: some critical sociological reflections’. Further details here.


Towards a sociology of 3D printing

As a digital sociologist, I have become fascinated by the social and cultural implications of 3D printing technologies. Few sociologists or other critical academic commentators have yet begun to investigate how 3D printing is beginning to affect society. Yet as 3D printing technologies move into an expanding range of contexts, there is much opportunity to analyse their effects. Not only are these technologies having an impact on industrial manufacturing and the distribution of goods, but makers, artists and designers are also taking them up in intriguing ways. 3D printing is being used in medicine and dentistry, public relations and marketing, and in fan cultures. These technologies are being introduced into schools and incorporated into the curriculum. As the price of 3D printers falls, they will find their way into more households. There are significant environmental and legal issues in relation to how they are used, including questions about intellectual property.

As part of my initial explorations into the sociology of 3D printing, last week I published two pieces on these technologies. One was an article for The Conversation, in which I discussed the phenomenon of the 3D self replica: a figurine of a person made using the digital data derived from 3D scanning software. The technologies used to generate these artefacts are rapidly moving into a range of leisure domains, including sporting events, shopping centres, airports, concerts and amusement parks, as well as fan cultures and marketing programs. 3D printed self replicas can even be made at home using a software package developed for the Xbox Kinect and a home 3D printer. Some commentators have referred to these replicas as ‘3D selfies’ because they involve the production of a personal likeness. In the article I speculated about the ways in which people may start to use these figures as markers or mementos of their bodies and social relationships.

The second piece was an academic article that discusses the use of 3D printing to produce what I term ‘digital body objects’ for medical and health-related purposes. The article explores the use of non-organic materialisations of people’s body parts for medical purposes, as well as the fabrication of self-tracked bodily data into objects. Here is the abstract (the full paper can be accessed here):

The advent of 3D printing technologies has generated new ways of representing and conceptualising health and illness, medical practice and the body. There are many social, cultural and political implications of 3D printing, but a critical sociology of 3D printing is only beginning to emerge. In this article I seek to contribute to this nascent literature by addressing some of the ways in which 3D printing technologies are being used to convert digital data collected on human bodies into tangible forms that can be touched and held. I focus in particular on the use of 3D printing to manufacture non-organic replicas of individuals’ bodies, body parts or bodily functions and activities. The article is also a reflection on a specific set of digital data practices and the meaning of such data to individuals. In analysing these new forms of human bodies, I draw on sociomaterialist perspectives as well as the recent work of scholars who have sought to reflect on selfhood, embodiment, place and space in digital society and the nature of people’s interactions with digital data. I argue that these objects incite intriguing ways of thinking about the ways in which digital data on embodiment, health and illness are interpreted and used across a range of contexts. The article ends with some speculations about where these technologies may be headed and an outline of future research directions.

These initial forays into a sociology of 3D printing represent merely a small component of possible avenues for theorising and research into the social impact of this technology. What I am particularly interested in at the moment is the implications for people’s data practices, or how the material objects that are generated from 3D printing technologies act as ‘solidified’ personal data. Future writings will investigate these issues in greater depth.

Digital Sociology now out

Digital Sociology has now been published (click here for the Amazon link and here for the publisher’s link).


The publisher’s blurb is below:

Digital Sociology

We now live in a digital society. New digital technologies have had a profound influence on everyday life, social relations, government, commerce, the economy and the production and dissemination of knowledge. People’s movements in space, their purchasing habits and their online communication with others are now monitored in detail by digital technologies. We are increasingly becoming digital data subjects, whether we like it or not, and whether we choose this or not.

The sub-discipline of digital sociology provides a means by which the impact, development and use of these technologies and their incorporation into social worlds, social institutions and concepts of selfhood and embodiment may be investigated, analysed and understood. This book introduces a range of interesting social, cultural and political dimensions of digital society and discusses some of the important debates occurring in research and scholarship on these aspects. It covers the new knowledge economy and big data, reconceptualising research in the digital era, the digitisation of higher education, the diversity of digital use, digital politics and citizen digital engagement, the politics of surveillance, privacy issues, the contribution of digital devices to embodiment and concepts of selfhood and many other topics.

Digital Sociology is essential reading not only for students and academics in sociology, anthropology, media and communication, digital cultures, digital humanities, internet studies, science and technology studies, cultural geography and social computing, but for other readers interested in the social impact of digital technologies.

The politics of privacy in the digital age

The latest excerpt from my forthcoming book Digital Sociology (due to be released by Routledge on 12 November 2014). This one is from Chapter 7: ‘Digital Politics and Citizen Digital Public Engagement’.

The distinction between public and private has become challenged and transformed via digital media practices. Indeed it has been contended that via the use of online confessional practices, as well as the accumulating masses of data that are generated about digital technology users’ everyday habits, activities and preferences, the concept of privacy has changed. Increasingly, as data from many other users are aggregated and interpreted using algorithms, one’s own data has an impact on others by predicting their tastes and preferences (boyd, 2012). The concept of ‘networked privacy’ developed by danah boyd (2012) acknowledges this complexity. As she points out, it is difficult to make a case for privacy as an individual issue in the age of social media networks and sousveillance. Many people who upload images or comments to social media sites include other people in the material, either deliberately or inadvertently. As boyd (2012: 348) observes, ‘I can’t even count the number of photos that were taken by strangers with me in the background at the Taj Mahal’.

Many users have come to realise that the information about themselves and their friends and family members that they choose to share on social media platforms may be accessible to others, depending on the privacy policy of the platform and the ways in which users have operated privacy settings. Information that is shared on Facebook, for example, is far easier to limit to Facebook friends if privacy settings restrict access than are data that users upload to platforms such as Twitter, YouTube or Instagram, which have few, if any, settings that can be used to limit access to personal content. Even within Facebook, however, users must accept that their data may be accessed by those that they have chosen as friends. They may be included in photos that are uploaded by their friends even if they do not wish others to view the photo, for example.

Open source data harvesting tools are now available that allow people to search their friends’ data. Using a tool such as Facebook Graph Search, people who have joined that social media platform can mine the data uploaded by their friends and search for patterns. Such elements as ‘photos of my friends in New York’ or ‘restaurants my friends like’ can be identified using this tool. In certain professions, such as academia, others can use search engines to find out many details about one’s employment and accomplishments (just one example is Google Scholar, which lists academics’ publications as well as how often and where they have been cited by others). Such personal data as online photographs or videos of people, their social media profiles and online comments can easily be accessed by others using search engines.

Furthermore, not only are individuals’ personal data shared in social networks, they may now be used to make predictions about others’ actions, interests, preferences or even health states (Andrejevic, 2013; boyd, 2012). When people’s small data are aggregated with others’ to produce big data, the resultant datasets are used for predictive analytics (Chapter 5). As part of algorithmic veillance and the production of algorithmic identities, people come to be represented as configurations of the others in the social media networks with which they engage and of the websites visited by people characterised as ‘like them’. There is little, if any, opportunity to opt out of participation in these data assemblages that are configured about oneself.

A significant tension exists in discourses about online privacy. Research suggests that people hold ambivalent and sometimes paradoxical ideas about privacy in digital society. Many people value the use of dataveillance for security purposes and for improving economic and social wellbeing. It is common for digital media users to state that they are not concerned about being monitored by others online because they have nothing to hide (Best, 2010). On the other hand, however, there is evidence of unease about the continuous, ubiquitous and pervasive nature of digital surveillance. It has become recognised that there are limits to the extent to which privacy can be protected, at least in terms of individuals being able to exert control over access to digital data about themselves or enjoy the right to be forgotten (Rosen, 2012; Rosenzweig, 2012). Some commentators have contended that notions of privacy, indeed, need to be rethought in the digital era. Rosenzweig (2012) has described previous concepts of privacy as ‘antique privacy’, which require challenge and reassessment in the contemporary world of ubiquitous dataveillance. He asserts that in weighing up rights and freedoms, the means, ends and consequences of any dataveillance program should be individually assessed.

Recent surveys of Americans by the Pew Research Center (Rainie and Madden, 2013) have found that the majority still value the notion of personal privacy but also value the protection against criminals or terrorists that breaches of their own privacy may offer. Digital technology users for the most part are aware of the trade-off between protecting their personal data from others’ scrutiny or commercial use, and gaining benefits from using digital media platforms that collect these data as a condition of use. This research demonstrates that the context in which personal data are collected is important to people’s assessments of whether their privacy should be intruded upon. The Americans surveyed were more concerned about others knowing the content of their emails than their internet searches, and were more likely to experience or witness breaches of privacy in their own social media networks than to be aware of government surveillance of their personal data.

Another study using qualitative interviews with Britons (The Wellcome Trust, 2013) investigated public attitudes to personal data and the linking of these data. The research found that many interviewees demonstrated a positive perspective on the use of big data for national security and the prevention and detection of crime, improving government services, the allocation of resources and planning, identifying social and population trends, convenience and time-saving when doing shopping and other online transactions, identifying dishonest practices and making vital medical information available in an emergency. However the interviewees also expressed a number of concerns about the use of their data, including the potential for the data to be lost, stolen, hacked or leaked and shared without consent, the invasion of privacy when used for surveillance, unsolicited marketing and advertising, the difficulty of correcting inaccurate data on oneself and the use of the data to discriminate against people. Those interviewees of low socioeconomic status were more likely to feel powerless about dealing with potential personal data breaches, identity theft or the use of their data to discriminate against them.

References

Andrejevic, M. (2013) Infoglut: How Too Much Information is Changing the Way We Think and Know. New York: Routledge.

Best, K. (2010) Living in the control society: surveillance, users and digital screen technologies. International Journal of Cultural Studies, 13, 5-24.

boyd, d. (2012) Networked privacy. Surveillance & Society, 10, 348-50.

Rainie, L. & Madden, M. (2013) 5 findings about privacy. http://networked.pewinternet.org/2013/12/23/5-findings-about-privacy, accessed 24 December 2013.

Rosen, J. (2012) The right to be forgotten. Stanford Law Review Online, 64 (88). http://www.stanfordlawreview.org/online/privacy-paradox/right-to-be-forgotten/, accessed 21 November 2013.

Rosenzweig, P. (2012) Whither privacy? Surveillance & Society, 10, 344-47.

The Wellcome Trust (2013) Summary Report of Qualitative Research into Public Attitudes to Personal Data and Linking Personal Data [online text]. The Wellcome Trust. http://www.wellcome.ac.uk/stellent/groups/corporatesite/@msh_grants/documents/web_document/wtp053205.pdf


Seams in the cyborg

Another excerpt from my forthcoming book Digital Sociology (due to be released on 12 November 2014). From chapter 8: ‘The Digitised Body/Self’.

Such is the extent of our intimate relations with digital technologies that we often respond emotionally to the devices themselves and to the content contained within or created by these devices. The design of digital devices and software interfaces is highly important to users’ responses to them. Devices such as iPhones are often described in highly affective and aestheticised terms: as beautiful playthings, glossy and shiny objects of desire, even as edible or delicious. Advertising for the iPhone and other Apple devices often focuses on inspiring child-like wonder at their beauty and magical capabilities (Cannon and Barker 2012).

Affective responses to material objects are integral to their biographical meaning for their owners and to their participation in intimate relationships. Writers on material culture and affect have noted the entangling of bodies/selves with physical objects and the ways in which artefacts act as extensions or prostheses of the body/self, becoming markers of personhood. Objects become invested with sentimental value by virtue of their association with specific people and places, and thus move from anonymous, mass-produced items to biographically inscribed artefacts that bear personal meanings. With use and over time, such initially anonymous objects become personalised prosthetics of the self, their purely functional status and monetary value replaced by more personal and sentimental value (Miller 2008, Turkle 2007).

… Bell and Dourish (2011) refer to the mythologies and the mess of ubiquitous computing technologies. By myths they mean the cultural stories, values and meanings that are drawn upon to make sense of and represent these technologies. The types of myths surrounding new digital technologies tend to focus on their very novelty, their apparent divergence from what has come before them and their ability to provide solutions to problems. The ‘mess’ of digital technologies inheres in the ‘practical reality’ of their everyday use (Bell & Dourish, 2011, p. 4), which challenges the myths that they are infallible and offer ideal solutions to problems. When digital technologies operate as we expect them to, they feel as if they are inextricably part of our bodies and selves. Inevitably, however, there are moments when we become aware of our dependence on technologies, or find them annoying or difficult to use, or lose interest in them. Technologies break down or fail to work as expected; infrastructure and government regulations may not support them adequately; users may become bored with using them or their bodies may rebel and develop over-use symptoms. There may be resistances, personal or organised, to their use, and contestations over their meanings and value (Lupton, 1995; Miller & Horst, 2012).

Freund (2004, p. 273) uses the term ‘technological habitus’ to describe the ‘internalised control’ and kinds of consciousness required of individuals to function in technological environments such as those currently offered in contemporary western societies. The human/machine entity, he argues, is not seamless: rather, there are disjunctions – or, as he puts it, ‘seams in the cyborg’ – where fleshly body and machine do not intermesh smoothly, and discomfort, stress or disempowerment may result. The use of technologies may, for example, disrupt sleep patterns, increase work and commuting time and decrease leisure time, causing illness, stress and fatigue. Our bodies may begin to alert us that these objects are material in the ways that they affect our embodiment: through eye-strain, hand, neck or back pain, or headaches from using the devices too much (Lupton, 1995).

People may feel overwhelmed by the sheer mass of data conveyed by their digital devices and the need to keep up with social network updates. Analyses of social media platforms such as Facebook are beginning to appear which suggest that users may recognise their dependence upon social media to maintain their social network, while also resenting this dependence and the time that is taken up in engaging with them, even fearing that they may be ‘addicted’ to their use (Davis, 2012). Users may also feel ‘invaded’ by the sheer overload of data that may be generated by membership of social networking sites and the difficulty of switching off mobile devices and taking time out from using them (boyd, 2008).

Technology developers are constantly working on ways to incorporate digital devices into embodiment and everyday life, to render them ever less obtrusive and ever more part of our bodies and selves. As the technical lead and manager of the Google Glass (a wearable device that is worn on the face like spectacles) project contends, ‘bringing technology and computing closer to the body can actually improve communication and attention – allowing technology to get further out of the way’ (Starner, 2013, no page number, emphasis in the original). He asserts that by rendering these devices smaller and more easily worn on the body, they recede further into the background rather than dominating users’ attention (as is so overtly the case with currently popular smartphones and tablet computers). Despite these efforts, Glass wearers have been subjected to constant attention from others that is often negative and based on the presumption that the device is too obvious, unstylish and unattractive, or that the people who wear it are wealthy computer nerds who do not respect the privacy of others. They have reported many incidents of angry responses from others when wearing Glass in public, even to the point of people ripping the device off their faces or asking them to leave a venue (Gross, 2014). The design of digital devices, therefore, may incite emotional responses not only in the users themselves but also in onlookers.

Some people find wearable self-tracking devices not fashionable enough, or not water-proof enough, or too clunky or heavy, or not comfortable enough to wear, or find that they get destroyed in the washing machine when the user forgets to remove them from their clothing. One designer (Darmour, 2013) has argued that if these technologies remain too obvious, ‘bolting’ these devices to our bodies will ‘distract, disrupt, and ultimately disengage us from others, ultimately degrading our human experience’. She asserts that instead these objects need to be designed more carefully so that they may be integrated into the ‘fabric of our lives’. Her suggested ways of doing this include making them look more beautiful, like jewellery (brooches, necklaces, bracelets, rings), incorporating them into fashionable garments, making them peripheral and making them meaningful: using colours or vibrations rather than numbers to display data readings from these devices.

References

Bell, G., & Dourish, P. (2011). Divining a Digital Future: Mess and Mythology in Ubiquitous Computing. Cambridge, Mass: MIT Press.

Cannon, K., & Barker, J. (2012). Hard candy. In P. Snickars & P. Vonderau (Eds.), Moving Data: The iPhone and the Future of Media (pp. 73-88). New York: Columbia University Press.

boyd, d. (2008). Facebook’s privacy trainwreck: exposure, invasion, and social convergence. Convergence, 14(1), 13-20.

Darmour, J. (2013). 3 ways to make wearable tech actually wearable. Co.Design. Retrieved from http://www.fastcodesign.com/1672107/3-ways-to-make-wearable-tech-actually-wearable

Davis, J. (2012). Social media and experiential ambivalence. Future Internet, 4(4), 955-970.

Freund, P. (2004). Civilised bodies redux: seams in the cyborg. Social Theory & Health, 2(3), 273-289.

Gross, A. (2014). What’s the problem with Google Glass? Retrieved from http://www.newyorker.com/online/blogs/currency/2014/03/whats-the-problem-with-google-glass.html

Lupton, D. (1995). The embodied computer/user. Body & Society, 1(3-4), 97-112.

Miller, D. (2008). The Comfort of Things. Cambridge: Polity Press.

Miller, D., & Horst, H. (2012). The digital and the human: a prospectus for digital anthropology. In H. Horst & D. Miller (Eds.), Digital Anthropology (pp. 3-35). London: Berg.

Starner, T. (2013). Google glass lead: how wearing tech on our bodies actually helps it get out of our way. Wired. Retrieved from http://www.wired.com/opinion/2013/12/the-paradox-of-wearables-close-to-your-body-but-keeping-tech-far-away/

Turkle, S. (2007). Evocative Objects: Things We Think With. Cambridge, Mass: MIT Press.

The digital tracking of school students in physical education classes: a critique

I have had a new article published in the journal Sport, Education and Society on the topic of how school health and physical education (HPE) is becoming digitised and how technologies of self-tracking are being introduced into classes. As its title suggests – ‘Data assemblages, sentient schools and digitised HPE (response to Gard)’ – the article outlines some thoughts in response to a piece published in the same journal by another Australian sociologist, Michael Gard. Gard contends that a new era of HPE seems to be emerging in the wake of the digitising of society in general and the commercialising of education, an era that incorporates the use of digital technologies.

Few commentators in education, health promotion or sports studies have begun to realise the extent to which digital data surveillance (‘dataveillance’) and analytics are now encroaching into many social institutions and settings, and the ways in which actors and agencies in the digital knowledge economy are appropriating these data. In my article I give some examples of the types of surveillance technologies that are being introduced into school HPE. Apps such as Coach’s Eye and Ubersense are beginning to be advocated in HPE circles, as are other health and fitness apps. Some self-tracking apps have been designed specifically for HPE teachers to use with their students. For example, the Polar GoFit app, used with a set of heart rate sensors, is expressly designed for HPE teachers as a tool for monitoring students’ physical activity during lessons. It allows teachers to distribute the heart rate sensors to students, set a target zone for heart rate levels and then monitor these online while the lesson takes place, either for individuals or for the class as a group.
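To make the kind of monitoring logic such tools imply a little more concrete, below is a minimal, hypothetical sketch in Python. It is not based on the Polar GoFit software or any real API: the class names, thresholds and student identifiers are invented, purely to illustrate how per-student heart rate readings might be compared against a teacher-set target zone.

```python
# Hypothetical sketch of class-wide target-zone monitoring (not the Polar GoFit API).
# Each student's heart rate readings are compared against a teacher-set zone and
# summarised as the share of readings falling inside that zone.

from dataclasses import dataclass


@dataclass
class TargetZone:
    low_bpm: int
    high_bpm: int

    def contains(self, bpm: int) -> bool:
        # A reading 'complies' if it falls within the teacher-set zone.
        return self.low_bpm <= bpm <= self.high_bpm


def monitor_lesson(readings: dict[str, list[int]], zone: TargetZone) -> dict[str, float]:
    """Return, for each student, the proportion of readings inside the target zone."""
    compliance = {}
    for student, bpm_series in readings.items():
        in_zone = sum(1 for bpm in bpm_series if zone.contains(bpm))
        compliance[student] = in_zone / len(bpm_series) if bpm_series else 0.0
    return compliance


if __name__ == "__main__":
    # Simulated readings for a short lesson segment (all values invented).
    lesson = {
        "student_a": [92, 110, 135, 142, 150],
        "student_b": [88, 95, 101, 99, 97],
    }
    zone = TargetZone(low_bpm=120, high_bpm=160)  # e.g. a 'moderate to vigorous' zone
    for student, share in monitor_lesson(lesson, zone).items():
        print(f"{student}: {share:.0%} of readings in target zone")
```

Even in this toy form, the design logic at issue is visible: data flow one way, from the student’s body to a teacher-facing summary that judges each individual against an externally set norm.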

I argue that there are significant political and ethical implications of the move towards mobilising digital devices to collect personal data on school students. I have elsewhere identified a typology of five modes of self-tracking that involve different levels of voluntary engagement and ways in which personal data are employed. ‘Private’ self-tracking is undertaken voluntarily and initiated by the participant for personal reasons, ‘communal’ self-tracking involves the voluntary sharing of one’s personal data with others, ‘pushed’ self-tracking involves ‘nudging’ or persuasion, ‘imposed’ self-tracking is forced upon people and ‘exploited’ self-tracking involves the use of personal data for the express purposes of others.

Digitised HPE potentially involves all five of these modes. In the context of the institution of the school, and the more specific site of HPE, the longstanding tendencies of HPE to exert paternalistic disciplinary control over the unruly bodies of children and young people, and to exercise authority over what the concepts of ‘health’, ‘the ideal body’ and ‘fitness’ should mean, can only be exacerbated. More enthusiastic students who enjoy sport and fitness activities may willingly and voluntarily adopt or consent to dataveillance of their bodies as part of achieving personal fitness or sporting performance goals. However, when students are forced to wear heart rate monitors to demonstrate that they are conforming to the exertions demanded of them by the HPE teacher, there is little room for resistance. When very specific targets for step counts, heart-rate levels, body fat or BMI measurements and the like are set, and students’ digitised data are compared against them, the capacity of the apparatus of HPE to constitute a normalising, surveilling and disciplinary gaze on children and young people, and to use these data for public shaming, is enhanced.

The abstract of the article is below. If you would like a copy, please email me on deborah.lupton@canberra.edu.au.

Michael Gard (2014) raises some important issues in his opinion piece on digitised health and physical education (HPE) in the school setting. His piece represents the beginning of a more critical approach to the instrumental and solutionist perspectives that are currently offered on digitised HPE. Few commentators in education, health promotion or sports studies have begun to realise the extent to which digital data surveillance and analytics are now encroaching into many social institutions and settings and the ways in which actors and agencies in the digital knowledge economy are appropriating these data. Identifying what is happening and the implications for concepts of selfhood, the body and social relations, not to mention the more specific issues of privacy and the commercialisation and exploitation of personal data, requires much greater attention than these issues have previously received in the critical social literature. While Gard has begun to do this in his article, there is much more to discuss. In this response, I present some discussion that seeks to provide a complementary commentary on the broader context in which digitised HPE is developing and manifesting. Whether or not one takes a position that is techno-utopian, dystopian or somewhere in between, I would argue that to fully understand the social, cultural and political resonances of digitised HPE, such contextualising is vital.