2.3.3 How to Measure Privacy-Related Consumer Harm?

In the previous sections, three possible theories of privacy-related consumer harm have been put forward. The findings suggest that all three could imply a substantial loss in consumer welfare, which would give competition authorities the economic support to find that alleged practices of data harvesting violate Article 102 TFEU. This section addresses how a degradation in privacy protection – and the harm related to it – could be measured.

In the Facebook case, the BKartA acknowledges the high damage potential that the platform’s data accumulation poses for its users, but also notes that ‘it is difficult to quantify the effects of the damage, since it is not clear whether, when and how potential [...] consumer damage will occur and which users will be affected.’167 Instead, the competition authority suggests using data

162 Valletti, 2019, p. 4.

163 Kemp, 2019, p. 11.

164 Valletti, 2019, p. 5.

165 Deutscher, 2019, p. 197. See also section 2.3.1.

166 The BKartA expressed that ‘the collection of data itself can lead to behavioural changes among users, for instance, to avoid adverse reaction among friends or public bodies’. Facebook, case B6-22/16, para 909.

167 Ibid., para 911-912.

protection regulation as a qualitative benchmark for determining the exploitative abuse. Accordingly, non-compliance with privacy regulations would indicate that a platform has violated Art. 102 TFEU.168 This approach will be subject to more discussion in the integrated analysis.169 Further, the potential deterioration of privacy protection could be measured in qualitative terms by comparing privacy policies prior to a platform’s dominance with those in place once dominance has been established, under the concept of a privacy price.170 In the case of Facebook, for example, the platform evidently decreased privacy protection for users as its market dominance increased.171 Such a change in privacy policies, which requires users to disclose more of their personal data, could then be interpreted as equivalent to a price increase.172 The disadvantage of this approach is that an increase in data harvesting is not necessarily perceived as harmful by every user. As discussed in section 2.3.2.1, some less privacy-sensitive users may favour the change in privacy policies if it increases the quality of the service. Moreover, the exact amount of the price increase could be difficult to quantify, as the importance of changes in privacy policies can differ depending, for example, on the type of information that users have to give away. This is where a conjoint analysis could be of help.
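To illustrate the privacy-price idea in the simplest possible terms, the sketch below expresses the difference between two policy versions as the additional personal data users must disclose, weighted by per-category valuations. The data categories and all monetary figures are hypothetical placeholders, not estimates from this thesis or the cited literature; in practice, such valuations would have to come from an empirical exercise such as the conjoint analysis discussed next.

```python
# Purely illustrative sketch of the "privacy price" idea: the change between
# two privacy-policy versions is expressed as the additional personal data
# users must disclose, weighted by assumed per-category valuations.
# All categories and figures below are invented placeholders.

# Assumed monetary valuation per data category, per user per month (hypothetical).
CATEGORY_VALUE_EUR = {
    "basic_profile": 1.0,           # name, email address, phone number
    "full_profile": 2.5,            # photos, posts, demographics
    "contacts": 1.5,                # friends / address book
    "off_platform_tracking": 3.0,   # browsing data from third-party sites
}

def privacy_price(policy: set[str]) -> float:
    """Sum of the assumed valuations of all data categories the policy requires."""
    return sum(CATEGORY_VALUE_EUR[c] for c in policy)

# Hypothetical policies before and after dominance has been established.
policy_before = {"basic_profile"}
policy_after = {"basic_profile", "full_profile", "contacts", "off_platform_tracking"}

increase = privacy_price(policy_after) - privacy_price(policy_before)
print(f"Implied privacy-price increase: {increase:.2f} EUR per user per month")
```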

A conjoint analysis could economically corroborate the aforementioned proposition that personal data may constitute an actual price paid by users and could be used as a quantitative tool for measuring the consumer harm related to privacy degradations.173 It may enable the identification of the respective value of privacy protection for users by measuring how changes in its attributes influence their preferences and decision making.174 Similar models have been used by other economists to investigate users’ privacy valuations, which validates the approach.175 Competition authorities could carry out such an analysis in three steps.

168 Facebook, case B6-22/16, para 912.

169 See section 4.4.1.

170 This approach is inspired by Deutscher, who argues that in the Facebook/WhatsApp merger, the Commission could have measured the degradation in privacy protection by comparing the privacy policies prior to the merger with the potential privacy policies post-merger. Deutscher, 2019, pp. 194 and 198. For methods of calculating a privacy price, see also Malgieri & Custers, 2018; OECD, 2013A.

171 See section 2.3.2.1.

172 Deutscher, 2019, p. 199.

173 Ibid., p. 201.

174 Ibid., p. 200.

175 See Acquisti et al., 2013; Krasnova et al., 2009; Phelps et al., 2001; OECD, 2013A, p. 29 f. On the economics of conjoint analysis see Baker & Rubinfeld, 1999; Green & Srinivasan, 1978.

The conjoint analysis should start with the formulation of a consumer survey that helps to identify the relevant price and non-price attributes of the platform’s service by breaking it into its constituent parts (called attributes and levels).176 With regard to privacy as an attribute, an analysis of social networks could include the following attribute levels: (1) no disclosure of personal information; (2) disclosure of a basic profile (name, email address or phone number); (3) full profile disclosure; and (4) full profile disclosure plus disclosure of contacts or friends.177 Thereafter, all attributes and attribute levels should be bundled in different combinations (so-called stimuli).178 This second step is followed by asking a specific number of representative users179 to choose their preferred choice sets,180 which they are then asked to rank according to their individual preferences.181 Employing statistical methods, the relative importance of each attribute and attribute level for users’ valuation of the platform’s service can be estimated.182 To measure the value users ascribe to privacy and to changes in privacy protection, the utility changes in response to variations in the attribute levels are weighed against the utility changes in response to changes in monetary prices.183 This approach would not only give competition authorities the possibility to measure the aforementioned consumer harm – it would also provide a tool to balance this possible harm against pro-competitive efficiencies.184 Compared to traditional survey methods it might be a more reliable tool, as it allows the privacy paradox to be accounted for: users reveal their behaviour rather than only state their preferences.185
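To make these three steps more concrete, the following Python sketch walks through a simplified, ratings-based conjoint exercise. The privacy attribute levels mirror the social-network example above; the price attribute, the simulated respondents, the ‘true’ utilities and all numerical values are assumptions introduced purely for illustration. A real investigation would collect preference data from the survey described above and would typically employ a choice-based model (such as a conditional logit) rather than the simple least-squares estimation used here.

```python
# Simplified, ratings-based conjoint sketch following the three steps above.
# All attributes, levels, "true" utilities and sample sizes are assumptions.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Step 1: attributes and levels (privacy levels follow the example in the text;
# the price attribute is a hypothetical monthly fee used to monetise utilities).
PRIVACY_LEVELS = [
    "no_disclosure",
    "basic_profile",
    "full_profile",
    "full_profile_plus_contacts",
]
PRICE_LEVELS = [0.0, 2.0, 4.0, 6.0]   # EUR per month (hypothetical)

# Step 2: bundle attribute levels into stimuli (full factorial design).
stimuli = list(itertools.product(range(len(PRIVACY_LEVELS)), PRICE_LEVELS))

# Step 3: obtain preference data. Here respondents are simulated with assumed
# "true" part-worths; a real study would instead collect rankings or choices
# from roughly 1,000 actual and potential users (cf. footnote 179).
TRUE_PRIVACY_UTILITY = np.array([0.0, -1.0, -2.5, -4.0])  # disutility of disclosure
TRUE_PRICE_COEF = -0.8                                    # disutility per EUR

def simulated_rating(privacy_idx: int, price: float) -> float:
    noise = rng.normal(scale=0.5)
    return TRUE_PRIVACY_UTILITY[privacy_idx] + TRUE_PRICE_COEF * price + noise

# Build a design matrix with dummy-coded privacy levels (baseline: no disclosure)
# and a linear price term, then estimate part-worths by least squares.
rows, ratings = [], []
for _ in range(1000):                      # respondents
    for privacy_idx, price in stimuli:     # each rates every stimulus
        dummies = [1.0 if privacy_idx == k else 0.0 for k in (1, 2, 3)]
        rows.append([1.0] + dummies + [price])
        ratings.append(simulated_rating(privacy_idx, price))

X, y = np.array(rows), np.array(ratings)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, pw_basic, pw_full, pw_full_plus, price_coef = coef

# Monetising privacy: the utility change of a disclosure level is divided by
# the marginal utility of money (the estimated price coefficient).
wta_full_disclosure = pw_full / price_coef
print(f"Estimated part-worth, full profile disclosure: {pw_full:.2f}")
print(f"Implied monetary value of avoiding full disclosure: "
      f"{abs(wta_full_disclosure):.2f} EUR per user per month")
```

The final division by the estimated price coefficient corresponds to the step described above: utility changes from variations in the privacy levels are weighed against the utility change from a one-euro change in price, yielding a monetary equivalent of the privacy degradation.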

Regrettably, the quantitative approach is subject to disadvantages as well. The conjoint analysis is very complex in nature and requires substantial administrative resources.186 Moreover, the results are often dependent on the way the choice sets are framed, which can lead to assessment errors and cause legal uncertainty.187 It should, therefore, not be relied upon as the sole element on which an assessment is based.

176 Deutscher, 2019, p. 202. It can be argued that this exercise is already applied when defining the relevant market. See Hildebrand, 2002, p. 13 f. on the application of conjoint analysis in the Hypothetical Monopolist Test.

177 Inspired by Deutscher, 2019, p. 203. For a full example of possible attributes and attribute levels in conjoint analysis of social networks see appendix B.

178 Green & Srinivasan, 1978, p. 105.

179 The representative sample should ideally include around 1,000 users and represent different age and population groups. Moreover, it is of great importance that it is composed of both actual and potential users. Deutscher, 2019, p. 202.

180 The users should only be confronted with a limited number of choice sets, ideally 12-20. Ibid.

181 Baker & Rubinfeld, 1999, p. 425.

182 Green & Srinivasan, 1978, p. 107.

183 Deutscher, 2019, p. 202.

184 Ibid., p. 204.

185 Ibid., p. 207.

186 Ibid., p. 205.

187 Ibid. The assumption of consumer rationality should thus be applied in a balanced manner, considering the findings in section 2.3.1.

However, in combination with the privacy-price concept, it could constitute an appropriate framework for competition authorities to measure consumer harm related to potential data privacy abuses.