Have large numbers of Australians left Facebook? It seems not

I am currently working on analysing interviews from my newest research project, ‘Facebook and Trust’. This project was designed in response to the huge publicity given to the Facebook/Cambridge Analytica scandal in March this year. I was interested in investigating how Australian Facebook users were using the platform in the wake of the scandal and how they felt about the ways Facebook makes use of the personal information they upload.

Following the scandal, numerous news reports claimed that large numbers of Australians were deleting their Facebook accounts as part of the #DeleteFacebook trend. As one report contended,

Many Australians are for the first time discovering just how much Facebook knows about them and many are shocked, leading them to quit the platform.

A Pew survey of US adults conducted soon after the Cambridge Analytica scandal broke found that around a quarter of respondents had deleted the Facebook app from their phone in the past 12 months, and more than half had adjusted their privacy settings. The survey did not ask directly why the respondents had taken these measures, and as the time-frame covered the past year, there may have been other reasons for these actions (for example, the various controversies over ‘fake news’ or poor content moderation on Facebook that have also received high levels of news media publicity).

Indeed, it is interesting to compare these findings with a previous Pew survey undertaken at the end of 2012, in which over two-thirds of the respondents who were current Facebook users said that they had sometimes voluntarily taken a break from using the platform and one-fifth who said they were not current Facebook users had used the platform at one time but had stopped using it. Those who had taken an extended break or had stopped using Facebook referred to reasons such as not wanting to expend too much time on the platform or finding the content overly personal, trivial or boring. As this survey suggests, some Facebook users have long had ambivalent feelings about using the platform.

There are no reliable statistics that I can find on how many Australians have deleted their Facebook account post-Cambridge Analytica. According to the Social Media Statistics Australia website, which provides a monthly report on Australians’ use of social media, in September 2018 approximately 60% of Australians (across the total population, including children) were active Facebook users, and 50% of Australians were logging on once a day. A similar proportion of Australians were regular YouTube users: both platforms had 15 million active monthly users. Next in order of popularity were Instagram (9 million users per month), Snapchat (6.4 million), WhatsApp (6 million), Twitter (4.7 million), LinkedIn (4.5 million) and Tumblr (3.7 million).

In terms of age breakdown, the site reports that in September 2018, Australians aged 25 to 39 years were the largest group of Facebook users (6.1 million), followed by those aged 40 to 55 (4.1 million), 18 to 25 (3.5 million), 55 to 64 (1.6 million) and 65 years and over (1.2 million). Fewer than a million Australians aged 13 to 17 years used Facebook.

I compared the report for February 2018 (the month before the Cambridge Analytica scandal was publicised) and May 2018 (soon after the scandal) with the figures for September 2018. The website reports that in both February and May 2018, there were 15 million monthly active Australian users, just as there were for September 2018. So if large numbers of Australians have deleted their accounts, this is not showing up in these data.

The interviews I am currently analysing should cast some light on how Australian Facebook users have responded (if at all) to the Cambridge Analytica scandal and other privacy-related issues concerning the personal information they upload to Facebook. I’ll provide an update on the findings once I finish working through the interviews.

The politics of privacy in the digital age

The latest excerpt from my forthcoming book Digital Sociology (due to be released by Routledge on 12 November 2014). This one is from Chapter 7: Digital Politics and Citizen Digital Public Engagement.

The distinction between public and private has been challenged and transformed by digital media practices. Indeed, it has been contended that via the use of online confessional practices, as well as the accumulating masses of data that are generated about digital technology users’ everyday habits, activities and preferences, the concept of privacy has changed. Increasingly, as data from many other users are aggregated and interpreted using algorithms, one’s own data has an impact on others by predicting their tastes and preferences (boyd, 2012). The concept of ‘networked privacy’ developed by danah boyd (2012) acknowledges this complexity. As she points out, it is difficult to make a case for privacy as an individual issue in the age of social media networks and sousveillance. Many people who upload images or comments to social media sites include other people in the material, either deliberately or inadvertently. As boyd (2012: 348) observes, ‘I can’t even count the number of photos that were taken by strangers with me in the background at the Taj Mahal’.

Many users have come to realise that the information about themselves and their friends and family members that they choose to share on social media platforms may be accessible to others, depending on the privacy policy of the platform and the ways in which users have operated privacy settings. Information shared on Facebook, for example, is far easier to limit to Facebook friends (if privacy settings restrict access) than data uploaded to platforms such as Twitter, YouTube or Instagram, which offer few, if any, settings that can be used to limit access to personal content. Even within Facebook, however, users must accept that their data may be accessed by those that they have chosen as friends. They may be included in photos that are uploaded by their friends even if they do not wish others to view the photo, for example.

Open source data harvesting tools are now available that allow people to search their friends’ data. Using a tool such as Facebook Graph Search, people who have joined that social media platform can mine the data uploaded by their friends and search for patterns. Such elements as ‘photos of my friends in New York’ or ‘restaurants my friends like’ can be identified using this tool. In certain professions, such as academia, others can use search engines to find out many details about one’s employment and accomplishments (just one example is Google Scholar, which lists academics’ publications as well as how often and where they have been cited by others). Such personal data as online photographs or videos of people, their social media profiles and online comments can easily be accessed by others by using search engines.

Furthermore, not only are individuals’ personal data shared in social networks, they may now be used to make predictions about others’ actions, interests, preferences or even health states (Andrejevic, 2013; boyd, 2012). When people’s small data are aggregated with others to produce big data, the resultant datasets are used for predictive analytics (Chapter 5). As part of algorithmic veillance and the production of algorithmic identities, people become represented as configurations of others in the social media networks with which they engage and of the websites visited by people characterised as ‘like them’. There is little, if any, opportunity to opt out of participation in these data assemblages that are configured about oneself.

A significant tension exists in discourses about online privacy. Research suggests that people hold ambivalent and sometimes paradoxical ideas about privacy in digital society. Many people value the use of dataveillance for security purposes and for improving economic and social wellbeing. It is common for digital media users to state that they are not concerned about being monitored by others online because they have nothing to hide (Best, 2010). On the other hand, however, there is evidence of unease about the continuous, ubiquitous and pervasive nature of digital surveillance. It has become recognised that there are limits to the extent to which privacy can be protected, at least in terms of individuals being able to exert control over access to digital data about themselves or enjoy the right to be forgotten (Rosen, 2012; Rosenzweig, 2012). Some commentators have contended that notions of privacy, indeed, need to be rethought in the digital era. Rosenzweig (2012) has described previous concepts as ‘antique privacy’, which require challenging and reassessment in the contemporary world of ubiquitous dataveillance. He asserts that in weighing up rights and freedoms, the means, ends and consequences of any dataveillance program should be individually assessed.

Recent surveys of Americans by the Pew Research Center (Rainie and Madden, 2013) have found that the majority still value the notion of personal privacy but also value the protection against criminals or terrorists that breaches of their own privacy may offer. Digital technology users for the most part are aware of the trade-off between protecting their personal data from others’ scrutiny or commercial use, and gaining benefits from using digital media platforms that collect these data as a condition of use. This research demonstrates that the context in which personal data are collected is important to people’s assessments of whether their privacy should be intruded upon. The Americans surveyed were more concerned about others knowing the content of their emails than their internet searches, and were more likely to experience or witness breaches of privacy in their own social media networks than to be aware of government surveillance of their personal data.

Another study using qualitative interviews with Britons (The Wellcome Trust, 2013) investigated public attitudes to personal data and the linking of these data. The research found that many interviewees demonstrated a positive perspective on the use of big data for national security, the prevention and detection of crime, improving government services, the allocation of resources and planning, identifying social and population trends, convenience and time-saving when doing shopping and other online transactions, identifying dishonest practices and making vital medical information available in an emergency. However, the interviewees also expressed a number of concerns about the use of their data, including the potential for the data to be lost, stolen, hacked or leaked and shared without consent, the invasion of privacy when used for surveillance, unsolicited marketing and advertising, the difficulty of correcting inaccurate data on oneself and the use of the data to discriminate against people. Those interviewees of low socioeconomic status were more likely to feel powerless about dealing with potential personal data breaches, identity theft or the use of their data to discriminate against them.

References

Andrejevic, M. (2013) Infoglut: How Too Much Information is Changing the Way We Think and Know. New York: Routledge.

Best, K. (2010) Living in the control society: surveillance, users and digital screen technologies. International Journal of Cultural Studies, 13, 5-24.

boyd, d. (2012) Networked privacy. Surveillance & Society, 10, 348-50.

Rainie, L. & Madden, M. (2013) 5 findings about privacy. http://networked.pewinternet.org/2013/12/23/5-findings-about-privacy, accessed 24 December 2013.

Rosen, J. (2012) The right to be forgotten. Stanford Law Review Online, 64 (88). http://www.stanfordlawreview.org/online/privacy-paradox/right-to-be-forgotten/, accessed 21 November 2013.

Rosenzweig, P. (2012) Whither privacy? Surveillance & Society, 10, 344-47.

The Wellcome Trust (2013) Summary Report of Qualitative Research into Public Attitudes to Personal Data and Linking Personal Data [online text]. http://www.wellcome.ac.uk/stellent/groups/corporatesite/@msh_grants/documents/web_document/wtp053205.pdf