The politics of privacy in the digital age

The latest excerpt from my forthcoming book Digital Sociology (due to be released by Routledge on 12 November 2014). This one is from Chapter 7: Digital Politics and Citizen Digital Public Engagement.

The distinction between public and private has been challenged and transformed by digital media practices. Indeed, it has been contended that via the use of online confessional practices, as well as the accumulating masses of data that are generated about digital technology users’ everyday habits, activities and preferences, the concept of privacy has changed. Increasingly, as data from many other users are aggregated and interpreted using algorithms, one’s own data have an impact on others by predicting their tastes and preferences (boyd, 2012). The concept of ‘networked privacy’ developed by danah boyd (2012) acknowledges this complexity. As she points out, it is difficult to make a case for privacy as an individual issue in the age of social media networks and sousveillance. Many people who upload images or comments to social media sites include other people in the material, either deliberately or inadvertently. As boyd (2012: 348) observes, ‘I can’t even count the number of photos that were taken by strangers with me in the background at the Taj Mahal’.

Many users have come to realise that the information about themselves and their friends and family members that they choose to share on social media platforms may be accessible to others, depending on each platform’s privacy policy and the ways in which users have configured their privacy settings. Information shared on Facebook, for example, is far easier to restrict to Facebook friends than content uploaded to platforms such as Twitter, YouTube or Instagram, which offer few, if any, settings for limiting access to personal content. Even within Facebook, however, users must accept that their data may be accessed by those whom they have chosen as friends. They may be included in photos that are uploaded by their friends even if they do not wish others to view the photos, for example.

Data harvesting tools are now available that allow people to search their friends’ data. Using a tool such as Facebook Graph Search, people who have joined that social media platform can mine the data uploaded by their friends and search for patterns. Such queries as ‘photos of my friends in New York’ or ‘restaurants my friends like’ can be answered using this tool. In certain professions, such as academia, others can use search engines to find out many details about one’s employment history and accomplishments (just one example is Google Scholar, which lists academics’ publications as well as how often and where they have been cited by others). Such personal data as online photographs or videos of people, their social media profiles and online comments can easily be accessed by others using search engines.
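To make the kind of query these tools support concrete, here is a minimal Python sketch that mimics two Graph Search-style queries over a toy dataset. All of the names, field labels and values are hypothetical; this illustrates the pattern-searching logic, not Facebook’s actual software or API.

```python
from collections import Counter

# Illustrative sketch only: a toy dataset standing in for the kind of
# profile data a tool like Graph Search can query. All names, fields
# and values here are hypothetical.
friends = [
    {"name": "Alice", "city": "New York", "liked_restaurants": ["Joe's Diner"]},
    {"name": "Bao", "city": "Sydney", "liked_restaurants": ["Harbour Thai"]},
    {"name": "Carmen", "city": "New York",
     "liked_restaurants": ["Joe's Diner", "La Palma"]},
]

# 'Photos of my friends in New York' reduces to a simple filter on
# profile fields that friends have shared.
friends_in_new_york = [f["name"] for f in friends if f["city"] == "New York"]
print(friends_in_new_york)  # ['Alice', 'Carmen']

# 'Restaurants my friends like' is an aggregation across profiles.
restaurant_counts = Counter(
    restaurant for f in friends for restaurant in f["liked_restaurants"]
)
print(restaurant_counts.most_common(1))  # [("Joe's Diner", 2)]
```

The point of the sketch is that once friends’ data are pooled, such queries become trivial aggregations: no single user ‘published’ the list of popular restaurants, yet it is derivable from their combined disclosures.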

Furthermore, not only are individuals’ personal data shared in social networks, they may now be used to make predictions about others’ actions, interests, preferences or even health states (Andrejevic, 2013; boyd, 2012). When people’s small data are aggregated with those of others to produce big data, the resultant datasets are used for predictive analytics (Chapter 5). As part of algorithmic veillance and the production of algorithmic identities, people become represented as configurations of others in the social media networks with which they engage and of the websites visited by people characterised as ‘like them’. There is little, if any, opportunity to opt out of participation in these data assemblages that are configured about oneself.
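A deliberately tiny sketch of this predictive logic, in the spirit of collaborative filtering, may help. Everything here is hypothetical (the users, items, ratings and the crude weighting scheme), and real predictive-analytics pipelines are vastly more elaborate.

```python
# Hypothetical ratings data: each user's 'small data'.
ratings = {
    "user_a": {"film_1": 5, "film_2": 1, "film_3": 4},
    "user_b": {"film_1": 4, "film_2": 2, "film_3": 5},
    "user_c": {"film_1": 1, "film_2": 5},  # has never rated film_3
}

def similarity(u, v):
    """Crude similarity: inverse of the mean absolute rating difference
    on items both users have rated."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    mean_diff = sum(abs(ratings[u][i] - ratings[v][i]) for i in shared) / len(shared)
    return 1.0 / (1.0 + mean_diff)

def predict(user, item):
    """Predict a rating as the similarity-weighted mean of the ratings
    given by other users who have rated the item."""
    others = [(similarity(user, other), ratings[other][item])
              for other in ratings
              if other != user and item in ratings[other]]
    total_weight = sum(weight for weight, _ in others)
    if total_weight == 0:
        return None
    return sum(weight * rating for weight, rating in others) / total_weight

# user_c has never expressed a view on film_3, yet a score is generated
# for them purely from other users' aggregated data.
print(round(predict("user_c", "film_3"), 2))  # approx. 4.56
```

The prediction for user_c is derived entirely from data other people have supplied: their ‘algorithmic identity’ is a configuration of others’ disclosures, which is precisely why opting out of such assemblages is so difficult.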

A significant tension exists in discourses about online privacy. Research suggests that people hold ambivalent and sometimes paradoxical ideas about privacy in digital society. Many people value the use of dataveillance for security purposes and for improving economic and social wellbeing. It is common for digital media users to state that they are not concerned about being monitored by others online because they have nothing to hide (Best, 2010). On the other hand, there is evidence of unease about the continuous, ubiquitous and pervasive nature of digital surveillance. It has become recognised that there are limits to the extent to which privacy can be protected, at least in terms of individuals being able to exert control over access to digital data about themselves or to enjoy the right to be forgotten (Rosen, 2012; Rosenzweig, 2012). Some commentators have contended that notions of privacy indeed need to be rethought in the digital era. Rosenzweig (2012) has described earlier concepts as ‘antique privacy’, requiring challenge and reassessment in the contemporary world of ubiquitous dataveillance. He asserts that in weighing up rights and freedoms, the means, ends and consequences of any dataveillance program should be individually assessed.

Recent surveys of Americans by the Pew Research Center (Rainie and Madden, 2013) have found that the majority still value the notion of personal privacy but also value the protection against criminals or terrorists that breaches of their own privacy may offer. Digital technology users for the most part are aware of the trade-off between protecting their personal data from others’ scrutiny or commercial use, and gaining benefits from using digital media platforms that collect these data as a condition of use. This research demonstrates that the context in which personal data are collected is important to people’s assessments of whether their privacy should be intruded upon. The Americans surveyed were more concerned about others knowing the content of their emails than their internet searches, and were more likely to experience or witness breaches of privacy in their own social media networks than to be aware of government surveillance of their personal data.

Another study, using qualitative interviews with Britons (The Wellcome Trust, 2013), investigated public attitudes to personal data and the linking of these data. The research found that many interviewees took a positive view of the use of big data for national security, the prevention and detection of crime, improving government services, the allocation of resources and planning, identifying social and population trends, convenience and time-saving in shopping and other online transactions, identifying dishonest practices and making vital medical information available in an emergency. However, the interviewees also expressed a number of concerns about the use of their data, including the potential for data to be lost, stolen, hacked, leaked or shared without consent, the invasion of privacy when the data are used for surveillance, unsolicited marketing and advertising, the difficulty of correcting inaccurate data about oneself and the use of the data to discriminate against people. Interviewees of low socioeconomic status were more likely to feel powerless about dealing with potential personal data breaches, identity theft or the use of their data to discriminate against them.

References

Andrejevic, M. (2013) Infoglut: How Too Much Information is Changing the Way We Think and Know. New York: Routledge.

Best, K. (2010) Living in the control society: surveillance, users and digital screen technologies. International Journal of Cultural Studies, 13, 5-24.

boyd, d. (2012) Networked privacy. Surveillance & Society, 10, 348-50.

Rainie, L. & Madden, M. (2013) 5 findings about privacy. http://networked.pewinternet.org/2013/12/23/5-findings-about-privacy, accessed 24 December 2013.

Rosen, J. (2012) The right to be forgotten. Stanford Law Review Online, 64 (88). http://www.stanfordlawreview.org/online/privacy-paradox/right-to-be-forgotten/, accessed 21 November 2013.

Rosenzweig, P. (2012) Whither privacy? Surveillance & Society, 10, 344-47.

The Wellcome Trust (2013) Summary Report of Qualitative Research into Public Attitudes to Personal Data and Linking Personal Data [online text], The Wellcome Trust, http://www.wellcome.ac.uk/stellent/groups/corporatesite/@msh_grants/documents/web_document/wtp053205.pdf


Seams in the cyborg

Another excerpt from my forthcoming book Digital Sociology (due to be released on 12 November 2014). From Chapter 8: ‘The Digitised Body/Self’.

Such is the extent of our intimate relations with digital technologies that we often respond emotionally to the devices themselves and to the content contained within or created by these devices. The design of digital devices and software interfaces is highly important to users’ responses to them. Devices such as iPhones are often described in highly affective and aestheticised terms: as beautiful playthings, glossy and shiny objects of desire, even as edible or delicious. Advertising for the iPhone and other Apple devices often focuses on inspiring child-like wonder at their beauty and magical capabilities (Cannon & Barker, 2012).

Affective responses to material objects are integral to their biographical meaning to their owners and their participation in intimate relationships. Writers on material culture and affect have noted the entangling of bodies/selves with physical objects and how artefacts act as extensions or prostheses of the body/self, becoming markers of personhood. Objects become invested with sentimental value by virtue of their association with specific people and places, and thus move from anonymous, mass-produced items to biographically-inscribed artefacts that bear with them personal meanings. With use and over time, such initially anonymous objects become personalised prostheses of the self, their purely functional status and monetary value replaced by more personal and sentimental value (Miller, 2008; Turkle, 2007).

… Bell and Dourish (2011) refer to the mythologies and the mess of ubiquitous computing technologies. By myths they mean the cultural stories, values and meanings that are drawn upon to make sense of and represent these technologies. The myths surrounding new digital technologies tend to focus on their very novelty, their apparent divergence from what has come before them and their ability to provide solutions to problems. The ‘mess’ of digital technologies inheres in the ‘practical reality’ of their everyday use (Bell & Dourish, 2011, p. 4), which challenges the myths that they are infallible and offer ideal solutions to problems. When digital technologies operate as we expect them to, they feel as if they are inextricably part of our bodies and selves. Inevitably, however, there are moments when we become aware of our dependence on technologies, or find them annoying or difficult to use, or lose interest in them. Technologies break down or fail to work as expected; infrastructure and government regulations may not support them adequately; users may become bored with them or their bodies may rebel and develop over-use symptoms. There may be resistances, personal or organised, to their use, and contestations over their meanings and value (Lupton, 1995; Miller & Horst, 2012).

Freund (2004, p. 273) uses the term ‘technological habitus’ to describe the ‘internalised control’ and kinds of consciousness required of individuals to function in technological environments such as those of contemporary western societies. The human/machine entity, he argues, is not seamless: rather, there are disjunctions – or, as he puts it, ‘seams in the cyborg’ – where fleshly body and machine do not intermesh smoothly, and discomfort, stress or disempowerment may result. The use of technologies may disrupt sleep patterns, increase work and commuting time and decrease leisure time, for example, causing illness, stress and fatigue. Our bodies may begin to alert us that these objects are material in the ways that they affect our embodiment: through eye-strain, hand, neck or back pain, or headaches from using the devices too much (Lupton, 1995).

People may feel overwhelmed by the sheer mass of data conveyed by their digital devices and the need to keep up with social network updates. Analyses of social media platforms such as Facebook are beginning to suggest that users may recognise their dependence upon social media to maintain their social networks while also resenting this dependence and the time taken up in engaging with these platforms, even fearing that they may be ‘addicted’ to their use (Davis, 2012). Users may also feel ‘invaded’ by the sheer overload of data generated by membership of social networking sites and the difficulty of switching off mobile devices and taking time out from using them (boyd, 2008).

Technology developers are constantly working on ways to incorporate digital devices into embodiment and everyday life, to render them ever less obtrusive and ever more part of our bodies and selves. As the technical lead and manager of the Google Glass project (a wearable device that is worn on the face like spectacles) contends, ‘bringing technology and computing closer to the body can actually improve communication and attention – allowing technology to get further out of the way’ (Starner, 2013, emphasis in the original). He asserts that by rendering these devices smaller and more easily worn on the body, they recede further into the background rather than dominating users’ attention (as is so overtly the case with current popular smartphones and tablet computers). Despite these efforts, Glass wearers have been subjected to constant attention from others that is often negative and based on the presumption that the device is too obvious, unstylish and unattractive, or that the people who wear it are wealthy computer nerds who do not respect the privacy of others. They have reported many incidents of angry responses when wearing Glass in public, even to the point of people ripping the device off their faces or asking them to leave a venue (Gross, 2014). The design of digital devices, therefore, may incite emotional responses not only in users themselves but also in onlookers.

Some people find wearable self-tracking devices insufficiently fashionable or waterproof, too clunky or heavy, or not comfortable enough to wear, or find that they get destroyed in the washing machine when the user forgets to remove them from clothing. One designer (Darmour, 2013) has argued that if these technologies remain too obvious, ‘bolting’ these devices to our bodies will ‘distract, disrupt, and ultimately disengage us from others, ultimately degrading our human experience’. She asserts that these objects instead need to be designed more carefully so that they may be integrated into the ‘fabric of our lives’. Her suggested ways of doing this include making them look more beautiful, like jewellery (brooches, necklaces, bracelets, rings), incorporating them into fashionable garments, making them peripheral and making them meaningful: using colours or vibrations rather than numbers to display data readings from these devices.

References

Bell, G., & Dourish, P. (2011). Divining a Digital Future: Mess and Mythology in Ubiquitous Computing. Cambridge, Mass: MIT Press.

boyd, d. (2008). Facebook’s privacy trainwreck: exposure, invasion, and social convergence. Convergence, 14(1), 13-20.

Cannon, K., & Barker, J. (2012). Hard candy. In P. Snickars & P. Vonderau (Eds.), Moving Data: The iPhone and the Future of Media (pp. 73-88). New York: Columbia University Press.

Darmour, J. (2013). 3 ways to make wearable tech actually wearable. Co.Design. Retrieved from http://www.fastcodesign.com/1672107/3-ways-to-make-wearable-tech-actually-wearable

Davis, J. (2012). Social media and experiential ambivalence. Future Internet, 4(4), 955-970.

Freund, P. (2004). Civilised bodies redux: seams in the cyborg. Social Theory & Health, 2(3), 273-289.

Gross, A. (2014). What’s the problem with Google Glass? Retrieved from http://www.newyorker.com/online/blogs/currency/2014/03/whats-the-problem-with-google-glass.html

Lupton, D. (1995). The embodied computer/user. Body & Society, 1(3-4), 97-112.

Miller, D. (2008). The Comfort of Things. Cambridge: Polity Press.

Miller, D., & Horst, H. (2012). The digital and the human: a prospectus for digital anthropology. In H. Horst & D. Miller (Eds.), Digital Anthropology (pp. 3-35). London: Berg.

Starner, T. (2013). Google Glass lead: how wearing tech on our bodies actually helps it get out of our way. Wired. Retrieved from http://www.wired.com/opinion/2013/12/the-paradox-of-wearables-close-to-your-body-but-keeping-tech-far-away/

Turkle, S. (2007). Evocative Objects: Things We Think With. Cambridge, Mass: MIT Press.

The digital tracking of school students in physical education classes: a critique

I have had a new article published in the journal Sport, Education and Society on the topic of how school health and physical education (HPE) is becoming digitised and how technologies of self-tracking are being introduced into classes. As its title suggests – ‘Data assemblages, sentient schools and digitised HPE (response to Gard)’ – the article outlines some thoughts in response to a piece published in the same journal by another Australian sociologist, Michael Gard. Gard contends that a new era of HPE seems to be emerging in the wake of the digitising of society in general and the commercialising of education, one that incorporates the use of digital technologies.

Few commentators in education, health promotion or sports studies have begun to realise the extent to which digital data surveillance (‘dataveillance’) and analytics are now encroaching into many social institutions and settings, and the ways in which actors and agencies in the digital knowledge economy are appropriating these data. In my article I give some examples of the types of surveillance technologies that are being introduced into school HPE. Apps such as Coach’s Eye and Ubersense are beginning to be advocated in HPE circles, as are other health and fitness apps. Some self-tracking apps have been designed specifically for HPE teachers to use with their students. For example, the Polar GoFit app, used with a set of heart rate sensors, is expressly designed for HPE teachers as a tool for monitoring students’ physical activities during lessons. It allows teachers to distribute the heart rate sensors to students, set a target zone for heart rate levels and then monitor these online while the lesson takes place, either for individuals or for the class as a group.
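The monitoring logic just described is simple to picture in code. The following Python sketch is purely illustrative: the target zone, student identifiers and data format are invented, and it does not use Polar’s actual software or API.

```python
# Hypothetical sketch of the monitoring logic described above: a teacher
# sets a target heart-rate zone and incoming sensor readings are checked
# against it during the lesson. All values here are invented.
TARGET_ZONE = (120, 160)  # beats per minute: (lower bound, upper bound)

# One current reading per student, as might arrive from chest-strap sensors.
readings = {"student_01": 115, "student_02": 142, "student_03": 171}

def in_zone(bpm, zone=TARGET_ZONE):
    """Return True if a heart-rate reading falls within the target zone."""
    lower, upper = zone
    return lower <= bpm <= upper

# The teacher's dashboard view: flag any student outside the set zone.
for student, bpm in sorted(readings.items()):
    status = "in zone" if in_zone(bpm) else "OUTSIDE zone"
    print(f"{student}: {bpm} bpm - {status}")
```

Reducing each student’s exertion to an in-zone/out-of-zone flag against a teacher-set threshold is precisely the kind of normalising judgement at issue in the argument that follows.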

I argue that there are significant political and ethical implications of the move towards mobilising digital devices to collect personal data on school students. I have elsewhere identified a typology of five modes of self-tracking that involve different levels of voluntary engagement and different ways in which personal data are employed: ‘private’ self-tracking is undertaken voluntarily and initiated by the participant for personal reasons; ‘communal’ self-tracking involves the voluntary sharing of one’s personal data with others; ‘pushed’ self-tracking involves ‘nudging’ or persuasion; ‘imposed’ self-tracking is forced upon people; and ‘exploited’ self-tracking involves the use of personal data for the express purposes of others.

Digitised HPE potentially involves all five of these modes. In the context of the institution of the school, and the more specific site of HPE, digitisation can only exacerbate the longstanding tendencies of HPE to exert paternalistic disciplinary control over the unruly bodies of children and young people and to exercise authority over what the concepts of ‘health’, ‘the ideal body’ and ‘fitness’ should mean. More enthusiastic students who enjoy sport and fitness activities may willingly and voluntarily adopt or consent to dataveillance of their bodies as part of achieving personal fitness or sporting performance goals. However, when students are forced to wear heart rate monitors to demonstrate that they are conforming to the exertions demanded of them by the HPE teacher, there is little room for resistance. When very specific targets for appropriate numbers of steps, heart rate levels, body fat or BMI measurements and the like are set, and students’ digitised data are compared against them, the capacity for the apparatus of HPE to constitute a normalising, surveilling and disciplinary gaze on children and young people, and the capacity for these data to be used for public shaming, are enhanced.

The abstract of the article is below. If you would like a copy, please email me on deborah.lupton@canberra.edu.au.

Michael Gard (2014) raises some important issues in his opinion piece on digitised health and physical education (HPE) in the school setting. His piece represents the beginning of a more critical approach to the instrumental and solutionist perspectives that are currently offered on digitised HPE. Few commentators in education, health promotion or sports studies have begun to realise the extent to which digital data surveillance and analytics are now encroaching into many social institutions and settings and the ways in which actors and agencies in the digital knowledge economy are appropriating these data. Identifying what is happening and the implications for concepts of selfhood, the body and social relations, not to mention the more specific issues of privacy and the commercialisation and exploitation of personal data, requires much greater attention than these issues have previously received in the critical social literature. While Gard has begun to do this in his article, there is much more to discuss. In this response, I present some discussion that seeks to provide a complementary commentary on the broader context in which digitised HPE is developing and manifesting. Whether or not one takes a position that is techno-utopian, dystopian or somewhere in between, I would argue that to fully understand the social, cultural and political resonances of digitised HPE, such contextualising is vital.