
3.3 Methodological approach

3.3.2 Methodological choices: Assessing the impact of eWOB (RQ2)


includes examples that no longer technically exist, it would be faulty to draw tendencies from the database. At best, those tendencies would represent my own process of defining the concept of eWOB, moving from broad terms that included reviews of consumption experiences, to manually and automatically shared behaviors, and finally to the disclosure of behaviors based on digital traces. Further, building this database was never meant to be a core part of my research. It surfaced out of curiosity and remained that way throughout. Consequently, there have probably been dozens of instances where I stumbled upon a piece of behavior being disclosed but simply did not have time to record it in the database.


adoption among those exposed to Facebook messages about their friends' in-game achievements (i.e., game-related behaviors). Building on this, I was curious about more detailed variables that could potentially guide companies in the design of eWOB, and specifically about how eWOB is presented most efficiently. I wanted to directly compare whether the number of friends exhibiting a behavior mattered, whether those friends' perceived knowledge of music was influential, and the relative effectiveness of opinions versus behaviors. This overall approach of applying social network analysis to the TDC Play case was unfortunately abandoned because of Facebook's decision to gradually decrease the visibility – and thus the potential impact – of such auto-generated stories.2 In addition, the comparisons I wanted to make were impossible to perform in the natural environment of Facebook. Instead, I turned to a randomized controlled field experiment within the empirical context of the TDC-owned movie streaming service Blockbuster, involving an action design research-inspired (Sein, Henfridsson, Purao, & Rossi, 2011) development of a tool, the SOCIALIIT, to assist the integration of Facebook information into a website.

Movie streaming, and Blockbuster in particular, was chosen for several reasons. First, it had come to my attention that Blockbuster had already implemented Facebook login but remained uncertain whether to proceed with making use of social information in its interface, and if so, how. This confluence of a real-world industry need and a bona fide research problem provided a solid research rationale. Second, movie streaming is an experience good, where inspiration from friends and peers is likely to play a role because of the difficulty of assessing quality before purchase. Third, since Blockbuster is an online service, we could establish a seamless integration with Facebook, which would serve as the source of social information. Finally, my affiliation with TDC Group entailed certain privileges, such as access to the official Blockbuster Facebook page, from which we were allowed to recruit participants for the experiment who represented real, potential users of Blockbuster. Using the Blockbuster Facebook page added credibility to the experiment. Additionally, my internal status allowed us to draw on the team's technical competences in Facebook ad management during recruiting, and to actively engage the Blockbuster team in the development and testing of the SOCIALIIT.

2 https://www.adweek.com/digital/latest-news-feed-algorithm-change-third-party-implicit-posts-punished/?red=if

3.3.2.1 Development phase

A number of elements were developed for the experiment: a) a mock-up of the real Blockbuster website, b) a survey module, and c) the SOCIALIIT tool, which enabled us to infuse Facebook information into both the mock-up website and the survey module. These were built following guidelines from action design research methodology (Sein et al., 2011). Action design research brings together the individuals implicated by a project, including organizational stakeholders, researchers, developers, and end users, who co-build an ensemble IT artefact in cycles of feedback, improving the final product and generating research knowledge at the same time (Sein et al., 2011). In total, we employed three iterative cycles of design, development, and evaluation to arrive at the final version of the SOCIALIIT and the connected elements (website and survey module). Specifically, stakeholders from both the Blockbuster team and the Strategy team, where I was situated, were involved in testing the survey and the mock-up website of Blockbuster.

3.3.2.2 Experiment procedure & analysis

Facebook posts from the official Blockbuster Facebook page were targeted at current non-users of the service who fit the target market profile. These posts directed participants to an online survey (see Appendix 8.7 for screenshots of the survey), where they were randomly assigned to one of eight theory-derived treatment groups or a control group. The randomization was handled by SOCIALIIT and determined by order of visits (see the sketch below). Specifically, the first person who clicked on the Facebook post was assigned to Group 1 (the control group) and thus led to the experiment flow associated with Group 1. The second visitor was assigned to Group 2 and led to the experiment flow associated with Group 2. This process continued through all nine groups and then repeated, starting again with Group 1, then Group 2, then Group 3, and so on, until the data collection was stopped (in effect by ending the promotion of the Facebook post).

When starting the survey, participants were informed that it was run by Blockbuster and a group of researchers from Copenhagen Business School, and that its purpose was to obtain their opinion on a new version of Blockbuster's website. All participants, except those in the control group, were informed that the survey included a request to connect with Facebook. It was explicitly stated that this was necessary to run the survey, that their social information would only be used for this particular study, and that we would not post to Facebook on participants' behalf.
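A minimal sketch of this round-robin assignment logic is shown below, assuming a single shared visit counter; the function and constant names are illustrative and not taken from the actual SOCIALIIT implementation.

```python
# Minimal sketch of round-robin treatment assignment, assuming a single
# shared visit counter. Names are illustrative, not from SOCIALIIT.
from itertools import count

NUM_GROUPS = 9        # Group 1 (control) plus eight treatment groups
_visits = count()     # increments once per arriving participant

def assign_group() -> int:
    """Return the group (1-9) for the next visitor, cycling in arrival order."""
    return (next(_visits) % NUM_GROUPS) + 1

# The first nine visitors receive Groups 1-9; the tenth starts over at Group 1.
print([assign_group() for _ in range(10)])  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 1]
```

Note that this rotation guarantees near-equal group sizes, although, unlike fully random assignment, the resulting sequence is deterministic in arrival order.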


Participants were then asked to evaluate a version of the movie streaming service, which included fictional information about their real Facebook friends' behaviors and opinions regarding Blockbuster. Following this exposure, participants answered questions about their attitude towards the service and their intention to use it. Upon completion, we carefully informed participants that the opinions and behaviors of friends shown on the website were fictional and thus did not represent friends' actual opinions or behaviors. Figure 11 provides an overview of the treatments in the experiment.

Figure 11. Overview of treatments. Darker blue indicates the hypothesized strongest effect.

In total, we gathered 473 complete responses, which after data quality and manipulation checks resulted in a final sample of 398. These data were then analyzed in the SAS software JMP using analysis of variance (ANOVA). ANOVA is a statistical procedure for testing whether the means of three or more groups are equal, and it is commonly used when analyzing factorial design experiments (Jung, Schneider, & Valacich, 2010; Tsao, 2014).
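For illustration, a comparable one-way ANOVA can be run in Python with scipy.stats.f_oneway; the actual analysis was performed in JMP, and the score vectors below are fabricated placeholders rather than study data.

```python
# Illustrative one-way ANOVA; the study's analysis was run in SAS JMP.
# The score vectors below are fabricated placeholders, not study data.
from scipy import stats

control     = [3.1, 2.8, 3.4, 3.0, 3.2]   # e.g., intention-to-use ratings
treatment_a = [4.2, 3.9, 4.5, 4.1, 4.0]
treatment_b = [3.6, 3.8, 3.5, 3.9, 3.7]

f_stat, p_value = stats.f_oneway(control, treatment_a, treatment_b)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value indicates that at least one group mean differs.
```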

3.3.2.3 Methodological reflections: answering RQ2

When conducting experiments, one constantly weighs the pros and cons of many methodological choices. One such choice for me was whether to include a pre-treatment score of the two dependent variables, namely participants' attitude towards Blockbuster and intention to use Blockbuster. The advantage of a pretest approach is that it allows the researcher to compare the pre- and post-treatment scores, with any deviations being attributed to the treatment (Salkind, 2010). However, I was not able to solve the problem of how to practically conduct a pre-test without sacrificing the natural environment, which was staged so that participants had the impression that they were giving Blockbuster feedback on a new version of its website. Further, the necessary time lag between the pre- and post-test stages posed a challenge, as there was no way to hold other factors constant. Some participants might have started using Blockbuster after participating in a pre-test while others would not have done so, leaving some participants more knowledgeable about Blockbuster than others, which would conflict with our desired sample of potential users of Blockbuster. Finally, conducting a pre-test stage would have been much more expensive in terms of recruiting costs, as we would most likely have experienced a large drop-out between the first and second stages, and we simply did not have the budget for that. That said, there is no doubt that if the practical and financial issues laid out above could have been solved, some potential sources of bias could have been avoided; this is elaborated on in Chapter 4.

Furthermore, the use of a mock-up website came with advantages as well as disadvantages. Obviously, the mock-up solution required a lot of time and effort to develop. Although we were careful to develop a well-functioning mock-up, the experience was not identical to the real Blockbuster site, which arguably could have affected some participants despite our explaining to them that they should not expect a fully functioning site. On a positive note, the mock-up setting gave us much more freedom to test different design dimensions than would have been possible using the real Blockbuster website. Finally, out of the necessary evil of creating the mock-up website, a customizable survey module and a Facebook app also emerged; together, these comprise the SOCIALIIT. This enabled us to integrate Facebook friend information into both the mock-up website and the survey module.