
Chapter 3: A (Media) Archaeology of Citation

7.1 Advertising and Traffic

In a nutshell, then, PageRank organises relevance by deciding which links receive visibility and, in this way, directs the flow of attention on the ‘information highway’ of the web. Yet ‘this commodification of attention occurs in a largely invisible way’ (Halavais 2009:83).
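To make the mechanism concrete, the following sketch shows a minimal power-iteration PageRank in Python; the toy graph, the damping factor of 0.85 and the iteration count are illustrative choices of mine, not drawn from the text. It shows how the link structure alone produces a ranking, and therefore visibility: the page that accumulates inbound links accumulates rank.

```python
# Minimal PageRank sketch (power iteration) on a toy link graph.
# The graph, damping factor and iteration count are illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread its rank evenly
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:                                 # pass rank along each outgoing link
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],   # 'c' accumulates attention because more pages point to it
}
print(sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]))
```

Running the snippet ranks page ‘c’ highest, simply because more pages point to it; visibility follows from the link structure rather than from the page’s content.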

Economically speaking, search results are an expenditure similar to TV programming in that the service is given away for free in order to attract an audience, which can then be sold to advertisers by the provider. In 1981 Dallas Smythe first articulated this strategy of collating attention into the stream of information as the ‘audience commodity’, offering a ‘Marxist analysis of the role of communication for Fordist capitalism’ (Beverungen 2015). As Smythe pointed out, television entailed a network that financed the production and then packaged the content to sell to advertisers, an arrangement often described as the ‘eyeball effect’, where more viewers exponentially increase profits. According to Elizabeth Van Couvering, search engines are not just a case of technological development but are ‘navigational media’, which she defines as a ‘type of technically-based media actor that organises and directs audiences or users to various types of content’ (2010:225). Specifically in regard to the development of search engines in the 1990s, the browser was seen as ‘the crucial point for audience aggregation’ (ibid:101). It was not only about finding an audience to consume (and pay for it) (ibid:92), but also about converting attention into action, enticing users to click on links and interact with the algorithmic interface.

67 For example, Charles H. Hubbell showed that ‘receiving a negative endorsement from a member of negative status makes a positive contribution to the prestige of the endorsed person (if the same Mafioso opposes you then your reputation might rise)’ (Franceschet 2010:5.3).
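The arithmetic behind this observation can be made explicit. In a Hubbell-style status index (the notation below is a generic reconstruction, not Hubbell’s own formulation), each node’s standing is a weighted sum of the standings of its endorsers:

```latex
% Generic Hubbell-style status index; e_j is an exogenous boost,
% w_{ij} the signed endorsement weight from node i to node j.
\[
  s_j \;=\; e_j + \sum_i w_{ij}\, s_i .
\]
% A negative endorsement (w_{ij} < 0) from a negative-status endorser (s_i < 0)
% contributes w_{ij} s_i > 0, i.e. it raises s_j: the Mafioso's opposition helps you.
```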

Search engines extract value through the freely given labor of millions of people, and, by reconfiguring it, use it to draw attention to themselves. That attention is then sold to producers who wish to advertise their products (Halavais 2009:83).

In 1998, seven months before Google, one of the key developments in the search industry in response to spam was the search engine GoTo.com (later renamed Overture), which implemented the purchase of search terms (keywords) by advertisers. Similar to the Yellow Pages, websites wanted to be at the top of the GoTo.com SERP (Search Engine Results Page) for specific keywords. Instead of a broad television broadcast audience, GoTo targeted the niche customers in the long tail, matching advertisers to specific visitors to the site. These ‘bid-placements’ operated as an auction, where advertisers competed with each other and only paid when users clicked on their ad listed in search results. Van Couvering structures this advertising through three key characteristics:

1) it is priced on a cost-per-click basis; 2) it is contextual, linked either to page content or to the users’ search term; 3) it is syndicated to other websites on a revenue-sharing basis (i.e. the fee is split between the owner of the website and the provider of the paid search service) (2010:115).
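These three characteristics can be illustrated with a small sketch. The Python below models a single paid-search click under invented figures (the bids and the 35% syndication split are mine, not Van Couvering’s): the ad is matched contextually to the search term, the advertiser is charged only on a click, and the fee is shared with the syndicating website.

```python
# Hypothetical paid-search click illustrating Van Couvering's three characteristics.
# All figures (bids, the revenue split) are invented for the example.

ads = {                      # keyword -> (advertiser, cost-per-click bid in $)
    "flowers": ("FloristCo", 0.40),
    "hotels":  ("TravelCo",  1.20),
}

def charge_for_click(query_keyword, syndication_partner_share=0.35):
    """Return the money flows triggered when a user clicks the ad shown for this query."""
    if query_keyword not in ads:              # (2) contextual: the ad must match the search term
        return None
    advertiser, cpc_bid = ads[query_keyword]
    charge = cpc_bid                          # (1) cost-per-click: a fee is incurred only on a click
    partner_cut = charge * syndication_partner_share   # (3) syndication: fee split with the host site
    provider_cut = charge - partner_cut
    return {"advertiser": advertiser, "charge": charge,
            "partner_revenue": round(partner_cut, 2),
            "provider_revenue": round(provider_cut, 2)}

print(charge_for_click("flowers"))   # unclicked impressions cost nothing in this model
```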

GoTo’s model was not to make advertisers pay with CPM or ‘per impression’ pricing, but rather the ‘advertiser was only liable for the fee when someone actually clicked the ad ––unclicked impressions were given away for free’ (ibid:113). ‘Paid search’ was GoTo’s innovation with its cost-per-click (CPC) advertisements, and these were connected to user traffic, syndicated or otherwise. Therefore, by the end of the 1990s, if not earlier, commercial interests had already been woven into the very fibre of modern media networks through legislation, market mechanisms and the like (McChesney 1999 cited by Halavais 2009:169).

Referencing the section Academic Search Engine Research (1.3.2) in their text, Brin and Page explicitly spelled out their vision of design goals, one that could be considered a small critique of the commercialisation of web search development.

Aside from tremendous growth, the Web has also become increasingly commercial over time. In 1993, 1.5% of web servers were on .com domains. This number grew to over 60% in 1997. At the same time, search engines have migrated from the academic domain to the commercial (1998:108).

Moreover, in their Appendix A: Advertising and Mixed Motives they explicitly speak of ‘bias’ and note that income derived from ads can ‘provide an incentive to provide poor quality search results’ (ibid). Further on, they assert that a ‘better’ search engine is one in which fewer advertisements are needed for the consumer to find what they want, whereas advertising always wants customers to acquire new products (ibid:107). With the understanding of their search engine ‘as free of the “mixed motives” that coloured other search engines whose business model depended on advertising’, they even cite the critique of the ‘concentration of media ownership’ via Ben Bagdikian’s Media Monopoly, ‘a book that in retrospect seems to have served them more as a how-to guide than as a warning’ (Peters 2015:326). In spite of these statements, their resolve in solving ‘the problem’ at hand is noteworthy: making PageRank economically viable.

Although Brin and Page did not originally wish to mix organic search with paid ads (Levy 2011), by 2000 Google were ‘cold-calling people, trying to get them to buy keywords’ and even ‘rolled out a new, self-service advertising product called AdWords that allowed businesses to purchase text ads on search-results pages’ (Oremus 2013). In 2002 Google launched an auction-based search-advertising product, AdWords Select, which replaced the old AdWords pricing with a CPC (cost-per-click) model, in which the advertiser only pays when a user clicks on an ad, and later a CPA (cost-per-acquisition, or ‘cost-per-action’) model.
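As a rough illustration of auction-based, cost-per-click keyword advertising, the sketch below runs a simplified generalized second-price auction. This is a textbook model rather than a description of Google’s actual AdWords Select mechanism; the bids, slot count and one-cent increment are invented.

```python
# Simplified keyword auction in the spirit of auction-based CPC advertising.
# A textbook-style generalized second-price sketch, not Google's actual mechanism.

def run_keyword_auction(bids, n_slots=2):
    """bids: {advertiser: cost-per-click bid}. Each winning slot pays just above
    the next-highest bid, and only when the ad is actually clicked."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for slot in range(min(n_slots, len(ranked))):
        advertiser, bid = ranked[slot]
        next_bid = ranked[slot + 1][1] if slot + 1 < len(ranked) else 0.0
        price_per_click = round(min(bid, next_bid + 0.01), 2)
        results.append((slot + 1, advertiser, price_per_click))
    return results

print(run_keyword_auction({"A": 1.50, "B": 0.90, "C": 0.40}))
# -> [(1, 'A', 0.91), (2, 'B', 0.41)]  (fees are charged only on clicks)
```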

[W]hen people went to a search engine, they were often in search of something, and that might make them more prepared to have their interest piqued by a banner ad, or, better yet, an appropriate advertisement right there next to the search results (Halavais 2009:78).

Revenue and viewer clicks increased, since Google had a search engine that was not dominated by spam and could control what the eyeballs saw, as not all advertisements were equally attractive.68 On August 19, 2004, Google had its IPO, valued at 27 billion dollars.

The success of Google’s highly original business model is the story of two algorithms. The first––pioneering a new way of associating web pages to queries based on keywords––has made Google popular. The second––assigning a commercial value to those keywords—has made Google rich (Kaplan 2014:57).

By 2005, ‘Cyberspace’ (Gibson 1982), as the internet was affectionately termed back in the early 1990s, had morphed into ‘Cybercapitalism’ (DeLillo 2003), where search results were shaped by a highly intricate series of communication networks and commercial platforms that incorporated advertising. Once considered portals, as explained in Chapter 2, search engines were now ‘connecting users to advertisers, within a burgeoning media economy’, and this shift in the search engine industry meant they were able to combine ‘disparate technical infrastructures’ (Zimmer 2008 cited by Van Couvering 2010:124) and diverse types of media, including traditional media and communications. ‘If there is an emerging attention economy, the search engine is its trading floor’ (Halavais 2009:71), and obtaining attention, in the form of hyperlinks, is a kind of enduring wealth, as long as the links are not broken and remain ‘alive’.

Whereas, in the 1990s, the search engine industry was built upon the ‘supply chain for audiences rather than for content’ (Doyle 2002:18), by the 2000s the keyword attention economy of PageRank directed ‘the creation and exploitation of a new commodity for media: traffic’ (Van Couvering 2010:92). User ‘traffic’ was made possible first through mainstream distribution models and later, in the mid 2000s, through search platforms and hyperlinking (ibid). As Halavais points out, this battle for attention enables the ‘ideology of the marketplace to be granted access to new areas’, as ‘search engines were becoming one of the most visited kinds of sites on the web; traffic alone made them attractive’ (2009:78). Yet if search engines were to provide access to information for all users, it was not only ‘backlinking’ that structured information; the ranking of search results determines which links the user will click on, thereby creating more traffic.

68 This is also the innovation of personalisation, where ads are customised for particular users, which I will address in Chapter 5.

In Googlearchy, Matthew Hindman elucidated the inequalities of PageRank through the lens of traffic, which measured the visibility of any site based on its search result ranking and the number of links pointing to it:

Links do not just provide paths for surfers…If links help determine online visibility, how links are distributed tells us much about who gets heard on the Web…The importance of links challenges notions that online equality is easy or inevitable (ibid:132).

By the mid 2000s, political discourse was already filtered thanks to Googlearchy; ‘deliberative democracy’ was thus prohibited by the infrastructure itself––‘the social, economic, political and even cognitive processes that enable it’ (ibid:130). The Googlearchy thesis held that ‘niche dominance’, where only a small portion of websites receive most of the traffic, is self-perpetuating: the sites with more links receive greater traffic, whereas those with few links are harder to find and require better searching skills (2009:55).69 With hyperlinks continuously being added and Google collecting data ad infinitum, the bias in search engine results simultaneously became more noticeable––‘bias that invites users to click on links to large websites, commercial websites, websites based in certain countries, and websites written in certain languages’ (Van Couvering 2010:3).
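The self-perpetuating character of niche dominance can be illustrated with a toy simulation. The assumptions below are mine rather than Hindman’s (a click probability that decays with rank, and every click modelled as producing a new inbound link), but they are enough to show how an initial link advantage compounds: the top-ranked site attracts the bulk of the clicks, which in turn secures its rank.

```python
# Toy simulation of 'niche dominance': rank-ordered attention plus link
# accumulation makes the most-linked sites pull further ahead.
# All parameters (sites, decay, user and round counts) are illustrative assumptions.
import random

random.seed(1)
links = {f"site{i}": 5 - i for i in range(5)}   # site0 starts with a small head start

def click_round(links, users=1000):
    ranking = sorted(links, key=links.get, reverse=True)  # more links -> higher rank
    for _ in range(users):
        for rank, site in enumerate(ranking):
            if random.random() < 0.3 * 0.6 ** rank:  # attention decays sharply with rank
                links[site] += 1                     # each click modelled as a new inbound link
                break

for _ in range(20):
    click_round(links)
print(sorted(links.items(), key=lambda kv: -kv[1]))
```

After twenty rounds the initially best-linked site has accumulated several times the links of the runner-up, and the bottom-ranked sites have barely moved, which is the rich-get-richer pattern the passage describes.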