
This PhD project was conceived in the years 2011-2012. Around this time, Facebook had just opened up its so-called Open Graph, enabling Facebook users to bring along their social networks and entire ‘social graph’ into services outside of Facebook. The promises were of personalization, higher user relevance, and increased user engagement.5 The attitude in 2012 towards sharing one’s own data with Facebook and third-party developers was no doubt very different from the one most espouse today. At the time of writing, general public mistrust towards Facebook and its data protection practices is at a high. Sparked by the recent Facebook-Cambridge Analytica scandal, in which data was illegally acquired from Facebook and used for political purposes, increasing numbers of users have lost confidence in the platform.

According to a recent survey by Business Insider, 81% of Facebook users have little to no confidence in Facebook protecting their data and privacy – a number much higher than for other social media platforms.6

This development, from a focus on personalization and its derived user benefits towards a problematization of privacy issues, has been ongoing since the beginning of this PhD project.

The empirical studies performed throughout my PhD have shown similar tendencies, raising data privacy-related issues. User privacy concerns were evident in both of the empirical studies performed during this research project. Both studies included elements of individual-specific, behavior-based information being disclosed to a selection of users’ Facebook friends. The concerns expressed by the study participants were similar to what a Director of Product Innovation at Netflix told me during a phone interview in November 2016 about Netflix’s use of social information:

You would like to know what your friends are watching - we [Netflix] hear that all the time. If you ask people “would you like to know what your friends are watching?” most people would say yes. But then you ask them “are you willing to share what you’re watching?” And most people would say no. And, so, you have that problem.

5 https://mashable.com/2010/04/21/facebook-open-graph/?europe=true#vgA1s.ySkkqD

6 https://nordic.businessinsider.com/consumers-dont-trust-facebook-at-all-new-survey-data-2018-4?r=US&IR=T

This paradox has been referred to in the academic literature as the “personalization privacy paradox” (Awad & Krishnan, 2006) and describes the inherent tradeoff between access to personalized services and the unwillingness to give away the personal data that enables such personalization. Similarly, in the qualitative study for Paper 5, some informants showed elements of constrained behavior, effectively exemplifying what has been referred to as the extended chilling effect of social media (Marder, Joinson, Shankar, & Houghton, 2016). This term describes the tendency for some individuals to constrain or modify their real-life behaviors out of fear that their behavior will be disclosed on social media and observed by others. Specifically, some informants described how they might change their behaviors if they felt observed by other users. Such a tendency to modify one’s behavior when cognizant of an online audience has also been empirically demonstrated in, for example, the context of political voting decision-making (Vatrapu & Robertson, 2010).

Other informants clearly expressed discomfort at the thought of their past music listening behavior being observed. What is notable in these findings is that the informants’ concerns mainly seemed to surface when imagining an audience they had no control over; that is, the feeling that their behaviors would be disclosed to an audience consisting of their entire Facebook friends list, as Spotify does not offer any segmentation controls. Social media platforms often collapse multiple audiences, such as one’s friends, colleagues, and family, a phenomenon known as ‘context collapse’ (Marwick & Boyd, 2010). Facebook is no exception, and informants mainly explained that music and music listening behaviors can reveal details about one’s personality and whereabouts, which they felt uneasy revealing to their entire Facebook friends list (through Spotify). Researchers have previously identified various strategies for navigating unclear audiences, such as self-censorship, selective sharing and grouping, and not posting at all (Semaan, Faucett, Robertson, Maruyama, & Douglas, 2015). If, on the contrary, audience controls were offered, users were much more positive about the presence of behavior-based information. Accordingly, one important design implication of this research for the future design of eWOB elements is to offer users control of the audience, particularly in cases where eWOB is disclosed at the individual-specific level.
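To make this design implication concrete, the following sketch shows one way audience segmentation for eWOB disclosure could be modeled. It is a minimal illustration in Python; the segment names, the DisclosurePolicy structure, and the may_disclose check are all hypothetical and not drawn from any existing platform API.

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical audience segments a user could choose among; current
# platforms such as Spotify expose no such controls.
class AudienceSegment(Enum):
    NOBODY = "nobody"
    CLOSE_FRIENDS = "close_friends"
    ALL_FRIENDS = "all_friends"
    PUBLIC = "public"

# Segments ordered from most to least restrictive.
SEGMENT_ORDER = [AudienceSegment.NOBODY, AudienceSegment.CLOSE_FRIENDS,
                 AudienceSegment.ALL_FRIENDS, AudienceSegment.PUBLIC]

@dataclass
class DisclosurePolicy:
    """Per-user, per-behavior-type audience settings for eWOB disclosure."""
    default_segment: AudienceSegment = AudienceSegment.NOBODY
    per_behavior: dict = field(default_factory=dict)

    def audience_for(self, behavior_type: str) -> AudienceSegment:
        return self.per_behavior.get(behavior_type, self.default_segment)

def may_disclose(policy: DisclosurePolicy, behavior_type: str,
                 observer_segment: AudienceSegment) -> bool:
    """Disclose individual-specific eWOB only if the observer falls within
    the audience the user has explicitly selected for this behavior type."""
    allowed = policy.audience_for(behavior_type)
    return (allowed != AudienceSegment.NOBODY
            and SEGMENT_ORDER.index(observer_segment) <= SEGMENT_ORDER.index(allowed))

# Example: a user shares listening behavior with close friends only.
policy = DisclosurePolicy(per_behavior={"music_listening": AudienceSegment.CLOSE_FRIENDS})
print(may_disclose(policy, "music_listening", AudienceSegment.CLOSE_FRIENDS))  # True
print(may_disclose(policy, "music_listening", AudienceSegment.ALL_FRIENDS))    # False
```

Note that the policy defaults to disclosing nothing, reflecting the privacy-by-default preference the informants expressed when no segmentation controls were available.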

Another important issue of eWOB is its potential for deceptive use. As I have argued, eWOB can be seen as a strategy for designing persuasive systems. Persuasive systems are defined as “computerized software or information systems designed to reinforce, change or shape attitudes or behaviors or both without using coercion or deception” (Oinas-Kukkonen & Harjumaa, 2008, p. 202), and the literature thus emphasizes the importance of not using coercion or deception.

However, such misuse remains a longstanding concern in this literature, and great efforts have been made to outline ethical guidelines for persuasive systems design (Berdichevsky & Neuenschwander, 1999; Verbeek, 2006). The challenge remains that even if we, as researchers, provide and adhere to ethical guidelines, such guidelines are of little help in the hands of ill-intentioned systems designers with a narrow focus on short-term profit. The area is currently unregulated and lacks mechanisms that can assist users of digital products and services in assessing the validity of the eWOB they face. This is especially pertinent when data is presented at an aggregated and/or anonymous level. How much truth is there behind Airbnb informing you that ‘8 users are currently looking at this accommodation for the same dates as you’?

Such a claim also obscures further contextual details, because no information is provided about whether 8 users is many or few. Such practices are common in e-commerce, especially in the travel industry, and leverage the persuasive tactic of scarcity (Cialdini, 2001). In a similar vein, the music streaming service Tidal has recently been accused of inflating the number of ‘listens’ for the artists Beyoncé and Kanye West.7 This not only misleads users; it also has profound financial consequences, given that the number of listens determines how the artists are compensated financially by Tidal. Whereas most countries have laws that regulate how companies can carry out marketing communications activities (the use of competitor comparisons, marketing towards children, and use of threatening language are examples of regulated areas in some countries), the line becomes blurry in cases where product design has merged with marketing-type mechanics, such as eWOB.
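Returning to the Airbnb example, a bare count could in principle be contextualized against a listing’s own history, letting users judge for themselves whether 8 concurrent viewers is many or few. The sketch below is a thought experiment with invented data, not a description of any platform’s actual practice.

```python
from statistics import quantiles

def contextualize_count(current_viewers: int, historical_counts: list) -> str:
    """Report whether a concurrent-viewer count is high or low relative
    to the listing's own history, rather than as a bare number."""
    q1, q2, q3 = quantiles(historical_counts, n=4)  # quartile cut points
    if current_viewers > q3:
        level = "unusually high"
    elif current_viewers < q1:
        level = "unusually low"
    else:
        level = "typical"
    return (f"{current_viewers} users are currently viewing this listing, "
            f"which is {level} for this listing (median: {q2:.0f}).")

# '8 users are currently looking at this accommodation' means little
# without a baseline; with one, the figure becomes interpretable.
history = [2, 3, 3, 4, 5, 5, 6, 7, 9, 12]  # invented historical counts
print(contextualize_count(8, history))
```

The point of the sketch is not the particular statistic but that eWOB presented at the aggregated, anonymous level could be made verifiable or at least interpretable, which current practice does not support.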

Related to the topic of potential deception carried out by digital services, consumers can potentially control eWOB in ways that might harm the trustworthiness of behaviors. Behaviors have traditionally been viewed as the holy grail in terms of trustworthiness: It is what you do that matters; actions speak louder than words; walk the talk. These are popular phrases that seek to capture the presumably unfiltered and trustworthy nature of behaviors. However, in a digital world consumers can carefully curate the disclosure of their behaviors. This happens, for example, when a user enters ‘private session’ listening in Spotify. When doing this, the music listened to, and thus the behavioral information, will not be disclosed to peers or other users. In essence, this makes it less reliable to take someone’s listening behavior as a full picture of that person and their tastes, as it is but a curated representation. This is also the issue with the more manually controlled uses of behavior-based information, as illustrated in the study of an online beauty forum by Cheung et al. (2014). Here, users had the option to manually list and disclose to other users which beauty products they had previously purchased. It is likely that this option to disclose led to curated representations of actual purchases, akin to impression management (Leary & Kowalski, 1990), for example by selectively omitting more mundane or less ‘cool’ products to paint a certain image of self. In such cases, behaviors cannot be regarded as unfiltered and trustworthy accounts.

7 https://www.cbsnews.com/news/tidal-accused-of-falsifying-stream-numbers-for-beyonce-and-kanye-west/
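A minimal sketch of the kind of filtering this implies is given below. The ListeningEvent fields and the private_session flag are modeled on the behavior described above, not on Spotify’s actual internals.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ListeningEvent:
    user_id: str
    track: str
    played_at: datetime
    private_session: bool  # True while the user has 'private session' enabled

def peer_visible_history(events: list) -> list:
    """What peers can observe: only events outside private sessions.
    The resulting behavioral record is therefore a curated subset,
    not a full picture of what the user actually listened to."""
    return [e for e in events if not e.private_session]

events = [
    ListeningEvent("u1", "Track A", datetime(2018, 5, 1, 9, 0), private_session=False),
    ListeningEvent("u1", "Guilty Pleasure", datetime(2018, 5, 1, 21, 0), private_session=True),
]
print([e.track for e in peer_visible_history(events)])  # ['Track A']
```

Even this small example shows why observed behavior streams cannot simply be equated with actual behavior: the private play leaves no trace for peers.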

Broadening the scope, the ‘dark side’ of eWOB reaches beyond misuse related to the marketing of products and services. As demonstrated by Bond et al. (2012), the observable voting behaviors of friends can affect one’s own voting behavior. Whereas Bond et al.’s (2012) study was devoted to increasing voter turnout, not to favoring a specific candidate, one might very well imagine that eWOB could also be used to influence which candidates are elected. As such, in the hands of ill-intentioned systems designers, or politicians, it might be used in deceptive manners that have profound implications for democracy. This ability of behavior-based information to “lead the herd astray” was demonstrated by Salganik and Watts (2008), where users of a music community were shown to blindly follow the lead of other users. Specifically, songs that had a low number of downloads, and were thus interpreted as less popular, were manipulated to look like they had a high number of downloads, making them ‘falsely popular’. As a result, the falsely popular songs increased in downloads, effectively showing that users were drawn towards making the same choices as others. Luckily, I should note, the Salganik and Watts (2008) study also found that over time the ‘truly popular’ songs regained their standing. As such, this study demonstrated that although eWOB can be powerful in influencing behaviors, the human ability to critically assess a situation and make decisions accordingly still prevails, even in a digital world.
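The dynamic can be illustrated with a toy simulation: each arriving user chooses a song based partly on its intrinsic appeal and partly on its current download count, i.e., the eWOB signal. The model and all parameters below are invented for illustration and are far simpler than the original experiment.

```python
import random

def simulate_downloads(true_appeal: dict, initial_counts: dict,
                       n_users: int = 5000, social_weight: float = 0.5,
                       seed: int = 42) -> dict:
    """Toy model of social influence on song choice: each arriving user
    picks a song with probability proportional to a mix of its intrinsic
    appeal and its current share of downloads (the eWOB signal)."""
    rng = random.Random(seed)
    counts = dict(initial_counts)
    songs = list(true_appeal)
    for _ in range(n_users):
        total = sum(counts.values())
        weights = [(1 - social_weight) * true_appeal[s]
                   + social_weight * counts[s] / total for s in songs]
        chosen = rng.choices(songs, weights=weights)[0]
        counts[chosen] += 1
    return counts

# Song B is intrinsically more appealing, but song A starts 'falsely
# popular' with manipulated download counts.
appeal = {"A": 0.2, "B": 0.8}
inflated_start = {"A": 200, "B": 10}
print(simulate_downloads(appeal, inflated_start))
```

Under these invented parameters, the manipulation gives the falsely popular song an early advantage, but the intrinsically better song gradually overtakes it, echoing the recovery of the ‘truly popular’ songs that the study reports.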
