III. Evidence-based – what does it mean for researchers?

evidence-based or evidence-informed social pedagogy. In many ways, even the best available studies seem far from reliable when we want to assess the effect of interventions. This explains many kinds of mismatches, for example, between professional beliefs and practical realities, between institutional aims and requirements in the interaction with clients/users and between policies and implementation (Messmer & Hitzler, 2008).

The idea of having a 1:1 implementation of research in practice is in other words misleading and refutable. Here we would further point to the ‘tacit dimension’ (Polanyi, 1966; Hess & Mullen, 1995; Neuweg, 2004).

So far the article has dealt with some basic discussions. Now we will move to the question of how to understand evidence-based practice.

In brief, evidence-based practice does not begin with practice; rather, it begins with research. However, an important point of departure is whether or not EBP is a sanctum, simply to be taken for granted:

There is a tendency for the notion of evidence-based practice to take on the character of an ideology in some quarters. In other words, it is treated as beyond question, so that anyone who raises doubt about it is regarded as either mad or bad: as incapable of recognizing the obvious (who, after all, would want policymaking or practice not to be based on evidence?) or as acting on the basis of ulterior motives (such as ‘supply-side-prejudice’, which is often treated as a synonym for old-style ‘professionalism’) (Hammersley, 2009, p. 139).

Hammersley concentrates his further argumentation on what counts as evidence by discussing some assumptions, for example,

that research can provide sound evidence about what should be done that is more reliable than that from any other source; and that, if practice is based on scientific evidence, the outcomes will be dramatically improved (ibid., p. 148).

One point of his analysis should be emphasized here. The “missing link” points to the fact that much research is carried out in the spirit of positivism, and the criticism of that is “an exaggerated respect for quantitative method stemming from a neglect, or underplaying, of the methodological problems surrounding it” (ibid., p. 142). In addition, one could mention that readers of such research lack the tacit knowledge that is embodied in the activities of research and reviewing (ibid., p. 145).

While Hammersley is no enemy of quantitative research and rather tries to find a balance between methods, other discussants are. The criticism has been grouped as addressing short circuits, one-sidedness, biases, limitations or misperceptions. Mullen, Bellamy & Bledsoe refer to this criticism under the heading “The Evidence Base in EBP” (Mullen, Bellamy & Bledsoe, 2008, p. 134-136). They mention, among other things, a critique on philosophical grounds that

“an evidence-based, rational model of decision making does not fit with the realities of individualized, contextualized practice” and express “concern about whether evidence-based policy is feasible when so many competing factors enter into policy-making decisions” (ibid., p. 135).

On political grounds, Sommerfeld (2005) and Ziegler (2005)

raised important political questions about evidence-based practice… as threatening professional autonomy and potentially undercutting the fundamental integrity of the social work profession (ibid., p. 135).

Mullen et al. find a paradox or contradiction here, since their definition of EBP is grounded in the idea that practitioners should and can base their interventions “on the best available evidence rather than on expert opinion, intuition, authority, tradition or common sense” (ibid., p. 131).

Their article concludes

that because social workers engage in complex and diverse forms of practice it is necessary that a wide range of evidence be considered admissible. Nevertheless, this should not mean that the profession should avoid setting clear standards and criteria regarding what will guide judgments about the quality and strength of various types of evidence (ibid., p. 150).

To conclude this brief description of the state of the art we draw on an argumentation developed at Bielefeld University. In “Evidence-based Practice – Modernising the Knowledge Base of Social Work?” (2009) Otto et al. note in their introduction that

experimental research may be superior to any other designs in providing empirically robust evidence about whether a specific program “works” in the sense that specific events are attributable to deliberately varying the respective treatment (ibid., p. 12),

and they do not hesitate to recall what authors who support the aim of establishing evidence-based Social Work practice emphasize:

Particular doubt is cast on the idea that Social Work should execute the instructions of manualized guidelines in order to be effective. In this respect, it is also questionable whether a Social Work practice fashioned directly on the basis of evidence gained from experimental research is intrinsically more ethical, more rational, and less authoritarian (ibid., p. 13).

This is a well-known reflection, because issues are ambiguous and demand interpretative spaces, and when interpretative spaces exist, strict measurement cycles do not work because the required conditions and assumptions are not met (ibid.). Another problem raised by Otto et al. in relation to social work as evidence-based practice concerns cases in which the intervention is a replication of a practice that has proven to be effective elsewhere. Replication is only possible at the cost of working according to manuals, due to the logic of the methods and the assumptions about causality (cf. Otto et al., 2010, p. 15).

We will continue by discussing research in social work/social pedagogy. Shaw & Norton (2007) have developed an approach to social research by pointing to two dimensions: content and perspective. By asking what social research needs to consider, the authors point to five fundamental issues: purpose, contexts, researchers, methods of inquiry and domains. We will not go into further detail about these issues here (cf. Bryderup, 2008, p. 12 ff.), but will focus briefly on content and perspective. By content is understood the primary research focus (target groups, communities (professional and policy) etc.). By perspective is meant the primary issue of research, for example, understanding/explaining risk, vulnerability, abuse, resilience, and other issues. Further, Shaw and Bryderup pinpoint five characteristics of good social research, which should: (1) be methodologically robust, (2) be theoretically robust, (3) add value to practice, (4) represent value to people and (5) have economic value (Bryderup 2008, p. 18). It seems obvious that the authors are responding to a critique of qualitative social research, but are similarly trying to build bridges between quantitative and qualitative research in social work/social pedagogy. In spite of their efforts, there are still tendencies to restart the old war between the methods mentioned.

These tendencies reflect the consequences of modernization, which could be summarized with Bauman’s term of “ambivalence and uncertainty where contradictory knowledge and contradictory approaches have to live together” (ibid., p. 24).

Moreover, they quote Peter Sommerfeld, who states that “we have to face more complexity and learn to cope with it” (Sommerfeld, 2005, p. 18).

Among the complexities, there is a need to define what ‘evidence-based’ means and why we need to expand the concept of ‘evidence’ to a broader concept of knowledge that includes research knowledge, professional knowledge and practice knowledge (cf. Rasmussen, Kruse, & Holm, 2007). The question is to what extent research measuring outcomes, effectiveness and effects provides evidence for social pedagogical practice, and vice versa: what kind of research is needed to contribute in a broader way to improving the quality of social pedagogy?