Digital sociology and human-computer interaction research

I have been thinking for some time about some of the shared interests of digital sociology and human-computer interaction (HCI) research. In December 2014 I gave a paper at the major annual Australian HCI conference (known as ‘OzCHI’), offering a sociological perspective on self-tracking cultures. And I recently submitted a brief position paper for a workshop on everyday surveillance, part of the preeminent international HCI conference (referred to as ‘CHI’), to be held in May 2016. Here is what I have to say in this position paper.

Everyday Surveillance: What Digital Sociology Can Offer

In this position paper I outline how perspectives from digital sociology can contribute to researching and theorising everyday surveillance. I contend that sociologists and human-computer interaction (HCI) researchers have tended to conduct their research in relatively separate spheres that would benefit from collaboration and greater use of the literatures in each discipline.

Thus far there has been little interaction between sociologists and HCI researchers. Yet there is much potential for each field to draw insights from the other’s work. Sociologists can learn from innovative methods developed in HCI. For their part, HCI researchers could benefit from the sociocultural theory developed in sociology, which would lend greater depth to their investigations. While HCI researchers engage in approaches to researching user experience that offer interesting new methods for sociologists, their work tends to draw on psychological models of behaviour that fail to incorporate the broader social, cultural and political dimensions of everyday beliefs and practices, and is often paternalistic in its approach.

Digital sociology is a subdiscipline of sociology that is beginning to blossom. This work draws on a long interest on the part of sociologists in the social, cultural and political elements of the internet, cyberspace and personal computer use. In line with the traditional interests of sociologists, those scholars who have directed attention at digital technologies have emphasised the social determinants of technology use: structuring factors such as gender, age, social class, geographical location, race and ethnicity. As such, their perspective tends to be critical, interested in identifying the power relations and tacit assumptions that underpin social relationships and institutions. Sociologists have adopted a range of social theories, including Marxist-influenced structuralist conflict theory, feminist and poststructuralist Foucauldian theory as well as Latourian actor-network theory, to generate insights into people’s use of digital technologies and the social impact of these technologies.

More specifically, in relation to everyday surveillance, HCI researchers have yet to fully engage with the ground-breaking work of sociologists who have explored the social elements of digital surveillance technologies and the ways in which these technologies are used across a range of domains and for a multitude of purposes.

Several sociologists have sought to investigate how people within specific social groups engage in voluntary and participatory surveillance, typically using ethnographic, focus group or interview-based research to do so. Some survey-based research has sought to identify people’s attitudes to the ways in which their personal data are used by third parties and the accompanying data security and privacy issues, as well as how membership of social groups shapes these attitudes.

An important sociological literature has developed that takes a critical approach to covert or disciplinary surveillance and the spread of such monitoring into many nooks and crannies of everyday life, often without people’s knowledge or consent. Analysis of the social implications of algorithmic sorting on people’s life chances and opportunities (sometimes referred to as ‘algorithmic authority’) has also begun to develop. This literature is part of critical data studies, a developing multidisciplinary field of research incorporating not only sociology but also anthropology, cultural studies, internet studies, media and cultural studies and cultural geography.

As a digital sociologist who has researched digital data practices and data materialisations, particularly in relation to self-tracking cultures, big data politics and understandings, digitised academia, and parenting cultures, I am interested in learning more about user-experience methods in relation to surveillance technologies as they are employed in HCI, but also in contributing my sociological perspective to broadening HCI’s hitherto often individualistic, instrumental and uncritical approach. I argue that bringing greater awareness and more in-depth analysis of the social into HCI research on surveillance would enrich the field.


Public understanding of personal digital data

One of the research projects I am conducting, with Mike Michael from the University of Sydney, is investigating the public understanding of digital data. We are experimenting with using novel methods (for sociologists at least!) in our project, which combines focus group discussions with cultural probes.

In the discussion groups, we wanted to go beyond the usual approach of simply asking questions of people. We wanted to invite people to think and work together playfully and creatively. We therefore decided to employ cultural probes to stimulate thought, discussion and debate, involving asking people to work together as a group or in pairs to generate material artefacts. Cultural probes are objects or tasks that are designed to be playful and provocative so as to encourage people to think in new ways about technologies. They are particularly valued for their ability to address intimate or controversial issues, to act as ‘irritants’ to engage people’s responses.

We asked people in our focus groups (held in Sydney) to engage in three collaborative tasks that we devised for them.

  1. The Daily Big Data Task. This task asked participants to work together as a group to draw a timeline of a typical person’s day on a huge piece of paper and to add the ways in which data (digital or otherwise) may be collected on that person.
  2. The Digital Profile Card Game. This task asked small groups to use cards with socio-demographic details on them to construct a profile of an individual, speculate about their characteristics and discuss how this information could be used.
  3. The Personal Data Machine. In this activity, participants were asked to work in pairs to design two data-gathering devices: one that they would find useful to use to collect any kind of data about themselves, and one for collecting data on another person. They were asked to write notes or make drawings describing their devices.

After each task the group came together to talk about the artefacts they had created or handled and what their implications were.

Earlier this year, we published a short piece in Discover Society that outlined some of our initial findings. A new article in the journal Public Understanding of Science, entitled ‘Toward a manifesto for the public understanding of big data’, has just been published. We are currently working on another article that presents our empirical work in greater detail.

In the ‘manifesto’ article, we pointed out that there are many intersections between research on the public understanding of digital data and the literatures on the public understanding of science and public engagement with science and technology. These are bodies of work devoted to making sense of how citizens engage with scientific knowledge, including not only consuming but also producing this knowledge. Indeed, it may be contended that members of the public were ‘prosumers’ of scientific knowledge (that is, both consumers and producers of scientific information) long before the emergence of digital data, particularly when engaging in citizen activism or citizen science initiatives.

There are many examples of citizens participating in activities that either contribute to or challenge accepted scientific beliefs. This critical approach has led to the development of a public engagement with science and technology model, in which ‘the public’ and ‘scientific expertise’ are not contrasted with each other. Rather, it is acknowledged that each draws their definitions from the other, contributing to hybrid assemblages of knowledges. The public are both the subjects and objects of scientific research and data, just as they are of digital data.

Our research project findings suggest that we may be seeing a transformation in attitudes, potentially reshaping concepts of privacy, in response to the controversies and scandals concerning the use of people’s personal data that have received a high level of public attention over the past two and a half years. What emerged from our focus groups is a somewhat diffuse but quite extensive understanding on the part of the participants of the ways in which data may be gathered about them and the uses to which these data may be put.

It was evident that although many participants were aware of these issues, they were rather uncertain about the specific details of how their personal data became part of big data sets and what this information was used for. While the term ‘scary’ was employed by several people when describing the extent of data collection in which they are implicated and the knowledge that other people may gain about them from their online interactions and transactions, they struggled to articulate more specifically what the implications of such collection were.

On the other hand, when the participants were designing their ‘Personal Data Machines’, it was evident from their creations that they appreciated and enjoyed the opportunity to collect detailed information not only about themselves but also about other people: partners, children and other family members, and co-workers. Some imagined devices monitored other people’s dreams or snooped on partners’ phone call metadata to check whether they were being unfaithful. Others described a lie-detecting device, one that could track commercial competitors’ activities, another that revealed the salaries of their workmates (so that the user could know whether they were being fairly remunerated), and a dating device that could scan a prospective partner’s hand or face and reveal their financial assets and criminal record details.

In some cases, it seems, too much information is never enough. The seductions of data can be very appealing, not just for commercial enterprises or national security agencies.