
Danish University Colleges

Review of literature on social workers’ use of research evidence in care planning and support with children & young people

Ebsen, Frank Cloyd

Publication date:

2015

Document Version

Also known as: Publisher’s PDF

Link to publication

Citation for published version (APA):

Ebsen, F. C. (2015). Review of literature on social workers’ use of research evidence in care planning and support with children & young people. Professionshøjskolen Metropol.

General rights

Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights.

• Users may download and print one copy of any publication from the public portal for the purpose of private study or research.

• You may not further distribute the material or use it for any profit-making activity or commercial gain

• You may freely distribute the URL identifying the publication in the public portal

Download policy

If you believe that this document breaches copyright please contact us providing details, and we will remove access to the work immediately and investigate your claim.

Review of literature on social workers’ use of research evidence in care planning and support with children and young people

The Social Care Institute for Excellence (SCIE) was established by Government in 2001 to improve social care services for adults and children in the United Kingdom. We achieve this by identifying good practice and helping to embed it in everyday social care provision.

SCIE works to:

• disseminate knowledge-based good practice guidance

• involve people who use services, carers, practitioners, providers and policy makers in advancing and promoting good practice in social care

• enhance the skills and professionalism of social care workers through our tailored, targeted and user-friendly resources.


Acknowledgements

This work was commissioned by Metropolitan University College, Denmark and funded by Tryg Fond. All rights to this report are the property of Metropolitan University College, Denmark.

Social Care Institute for Excellence
206 Marylebone Road
London NW1 6AQ
tel 020 7535 0900
fax 020 7535 0901
www.scie.org.uk

Contents

Introduction
Decision making in social work
  Introduction
  The significance of research evidence
  Other sources and types of evidence
  What else affects decisions?
Evidence-based practice
  Introduction
  Defining evidence-based practice
  Barriers to evidence-based practice
  Facilitators of evidence-based practice
Implementing effective interventions
  Introduction
  Empirically supported treatments
  Integrative approaches
  Common factors approach
  Practice or core elements approach
  Implementing empirically supported interventions
  A case study in implementing ESTs
Recommendations for future research and development work
  Recommendations for future research
  Recommendations for development work
Summary and conclusion
  Summary
  Conclusion
References
Appendix: Searching, screening and assessment
  Summary of the overall process
  Sources
  Search protocol and keyword formulation
  Search strategy
  Inclusion and exclusion criteria


Key messages

• Social workers use research in decision making but also draw on a wide range of evidence when they make a decision.

• The influence of research in decision making is mainly indirect.

• There are several barriers to social workers’ use of research, such as lack of time and access to resources.

• Social workers lack knowledge of and confidence in handling research evidence.

• Research methods should be taught to social work students so that they can acquire the confidence and skills to use research.

• Social workers should be offered ongoing training, supervision and support for the use of research.

• Working relationships between researchers and social workers need to be developed to improve the use of research evidence in decision-making processes.

• The use of research needs to be incorporated into organisational culture.


Introduction

This is one in a series of four literature reviews conducted by the Social Care Institute for Excellence (SCIE) on behalf of Metropolitan University College in Copenhagen, Denmark. This narrative review focuses on social workers’ use of research evidence in care planning and delivery for children and young people. It provides a concise summary of a sample of relevant research on the topic and signposts routes to further information, rather than offering a definitive or comprehensive statement of the research.

The research followed a modified version of SCIE’s published research briefing methodology (www.scie.org.uk/publications/briefings/methodology.asp). This approach involved:

• agreeing key search terms and themes related to the high-level concepts of interest – namely, decision making, evidence-based practice and the role of research in social work practice

• systematic searching of relevant databases and other sources

• screening records retrieved against agreed inclusion and exclusion criteria

• extracting and synthesising data related to evaluations of evidence-based practice in youth social work, barriers to evidence-based practice, facilitators of evidence-based practice and the outcomes of evidence-based practice for children and young people.

The Appendix gives further information on the stages in the research process, sources of data, the search protocol and keyword formulation, the search strategy and inclusion and exclusion criteria.

The searches were conducted between May and June 2014.
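
To make the screening stage concrete, the sketch below shows how records retrieved from a search might be filtered against inclusion and exclusion criteria. It is purely illustrative: the record fields, keywords, year cut-off and target population are invented assumptions for this example, not the criteria actually used in this review (those are set out in the Appendix).

# Illustrative sketch only: a minimal record-screening step of the kind described
# above. The Record fields and the criteria below are hypothetical examples,
# not the actual SCIE inclusion/exclusion criteria.
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    year: int
    abstract: str
    population: str  # e.g. "children and young people", "adults"

# Hypothetical criteria standing in for the agreed protocol.
INCLUDE_KEYWORDS = {"evidence-based practice", "research use", "decision making"}
MIN_YEAR = 2000
TARGET_POPULATION = "children and young people"

def passes_screening(record: Record) -> bool:
    """Return True if the record meets the (illustrative) inclusion criteria."""
    text = f"{record.title} {record.abstract}".lower()
    topical = any(keyword in text for keyword in INCLUDE_KEYWORDS)
    recent = record.year >= MIN_YEAR
    relevant_population = record.population == TARGET_POPULATION
    return topical and recent and relevant_population

def screen(records: list[Record]) -> list[Record]:
    """Keep only records that pass screening; the rest are excluded."""
    return [r for r in records if passes_screening(r)]

example = Record(
    title="Research use in decision making by child welfare social workers",
    year=2011,
    abstract="A qualitative study of evidence-based practice in care planning.",
    population="children and young people",
)
print(screen([example]))  # the example record passes the illustrative criteria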


Decision making in social work

Introduction

The quest to improve social work practice has become increasingly focused on the need to improve both the nature and transparency of social work decision-making. Being explicit about how social workers make judgements and decisions is seen as critical to improving the defensibility of decisions as well as improving social work outcomes. The questions for this review are whether and to what extent research evidence contributes to the decision-making process in youth social work, with what effect and how it could be improved. The review then moves on to examine whether evidence about effective interventions can be used to determine what care and support is delivered.

This chapter starts by examining the way social workers reach decisions in assessment and care planning. The particular focus is on whether they draw on research evidence in the decision-making process, to what extent and what else affects the conclusions they reach. Few studies have focused explicitly on the decision-making process and only two reported findings from empirical research. This chapter focuses on these two particular studies:

• Collins and Daly (2011), which presents results from a qualitative study of decision making around risk in social work

• Gordon et al (2009), which investigated how social work practitioners make use of research, inquiry and other forms of knowledge and evidence to inform their practice.

In the next chapter, we examine how the use of evidence in practice could be improved.

The significance of research evidence

In research on decision-making among social workers (Collins and Daly, 2011), interviewees were asked what ‘evidence’ meant in social work and, overwhelmingly, they indicated that it was information gathered from numerous sources relating to a particular case. Among other things, this included case histories and notes, their own observations and reports from other relevant professionals. This might therefore include evidence where the research origins of that knowledge have been lost, often referred to as ‘tacit knowledge’ (see, for example, Osmond, 2004; Osmond and O’Connor, 2006).

Very few social workers who were interviewed spontaneously mentioned research as evidence but those who did tended to be from the Children and Families Team:

‘[A] lot of our child protection cases have got an element of domestic violence. And it’s important to reiterate to everybody you know that we take domestic violence very, very seriously because we know from research the impact on children that have grown up in those situations. It’s not just the fact that we’re worried that they will get harmed, they’ll get physically harmed during a confrontation, it is the emotional impact on children and we know that from research.’ (Collins and Daly, 2011, p 8)

The concept of ‘research as evidence’ was stronger for workers whose experience of formal education was more recent, such as newly qualified staff and practice teachers, compared with more experienced workers:

‘[C]oming out of uni you think about evidence as being your research and your knowledge and things but now I think I would first look at evidence in terms of the case and what we already know and what we’ve already done with the family, so we are looking at our reports and evidence and notes of evidence from other agencies.’ (Collins and Daly, 2011, p 9)

Although the study participants rarely cited research as a type of evidence, some did refer to research when they were asked to describe their typical decision-making process. Collins and Daly suggest that this indicates that while research is not the first type of evidence that social workers consciously seek in their day-to-day work, it does play a part in decisions.

The fact that research does not make a bigger, explicit contribution to decision making is explained in Collinscamargo’s (2007) article by social workers’ poor access to resources. The author partly attributes the successful promotion of evidence-based practice in medicine to the use of computer technologies (Gira et al, 2004, cited in Collinscamargo, 2007). He contrasts this with state child welfare staff who are unable to use internet-based technologies in their work as the decision to restrict such use has been made by their agencies.

Compounding potential lack of adequate hardware or internet access is the fact that child welfare staff may not have time to peruse the professional journals to stay up to date on research findings impacting their work. The crisis-driven nature of their work certainly impacts this (Collinscamargo, 2007).

It is worth bearing in mind that Collinscamargo’s appraisal was made seven years ago and the issue of internet access is likely to have improved since then. However, the time pressures experienced within child welfare services have not, so the author’s observation that staff have no time to read published research still has relevance.

Other sources and types of evidence

If research is only one among a range of evidence types that social workers use, what else influences their decision making? The social workers in the Collins and Daly (2011) study were in no doubt about the critical importance of evidence for all decision making but it is clear that they were referring to a broad range of evidence types from different sources, which in their totality help them to make judgements about risk and interventions. In some circumstances, evidence from other professionals is important; in others, evidence about the preferences of people who use services is crucial; and in others, evidence about the mental capacity of the individual is important.

Although the social workers reported that different types of evidence gain prominence in different decision-making situations, evidence of risk is consistently the most important: ‘this evidence of harm or risk takes precedence over all other types of evidence, including the views of the person supported by services and their carers’ (Collins and Daly, 2011, p 12).

Findings from Gordon et al’s (2009) study support that account of using a complex and interacting mix of practice experience, social work and other theory, plus knowledge of legislation, methods of intervention and local and national policies, procedures and resources. The social workers in the study were able to describe multiple sources of knowledge, from early personal experiences through to social work and other training, and practice experience. The most frequently mentioned source of knowledge was the social workers’ past and current experience of working with users and carers.

However, the following were also regularly cited as significant to the social workers’ use and development of knowledge (Gordon et al, 2009):

• in-service training

• supervision with managers

• social work qualifying training

• practice discussions with colleagues

• reading.

The task for social workers is to integrate the complex and interacting mix of evidence. In order to reach a decision they need to make sense of all the information through a process that relies on reflection. A crucial part of decision-making in social work, reflection was described by Collins and Daly (2011) as being an individualised process and one that is closely tied to supervision. Supervision in this context was described as being particularly useful for identifying gaps in evidence or providing fresh thinking on complex matters.

Gaps or contradictions in the assembled evidence

In Collins and Daly’s study, the issue of gaps or contradictions in the evidence that social workers were able to gather seemed a significant threat to the decision-making process. In such cases, social workers felt less confident in the decisions they were making, and took steps to gather further information and do ‘detective work’ to bolster the evidence they already had. It was notable that research evidence featured more when observational evidence was lacking; it could be said to be plugging the gaps:

‘[You] try and gather it from as many places as you can, your previous experience does come into it. Your knowledge, you know, whatever sort of research you have been looking at recently. There can be a variety of ways even though you have got limited information then, you can say well research shows that and you’ve got research findings to fall back on.’ (Children and Families Team social worker; Collins and Daly, 2011, p 22)


What else affects decisions?

Collins and Daly (2011) usefully highlight that, in addition to the types and sources of evidence combined for decision-making, ‘gut feelings’ or instinct, team or individual bias and organisational norms affect social workers’ decisions. Some of what is referred to as a ‘gut feeling’ may of course be research-based, but has been absorbed into professional practice and lost its research label.

Although the social workers acknowledged the existence of their gut feelings, they were keen to reframe the notion, especially those in the Children and Families Team. One respondent pointed out that their own feelings were actually indications based on evidence:

‘I remember research that was done a long time ago ... and one of the things that came back was don’t ignore gut feeling, because it’s often not gut feelings. It’s often based on something ... And if you tease it out it is evidential, it’s about not ignoring it but I think it’s about not calling it gut feelings because when you think about it there are indications. Even if it’s not hard evidence there are indications there that something’s not right.’ (Children and Families Team social worker; Collins and Daly, 2011, p 31)

Intuition and personal experience were evidently valued by the participants in Gordon et al’s (2009) study but were regarded with some caution, and had to be set against other forms of knowledge to confirm their utility. This ‘balancing act’, as practitioners assessed the relevance and validity of different kinds of evidence, was observed in all the interviews carried out for the study, with practitioners appearing to use a kind of continuous triangulation to achieve ‘best fit’ between practice and knowledge. This cautionary note was also found in Collins and Daly’s (2011) study.

Respondents emphasised that although a gut feeling can be useful, alone it is not enough and should not be trusted without corroborating evidence: ‘You never go on a gut feeling about something because you have to evidence it’ (Collins and Daly, 2011, p 31).

This comment would cause concern among some authors in the field. They warn that the process of finding evidence to corroborate a gut feeling can lead practitioners to seek out only the information that will support their intuition (known as a ‘confirmation bias’ or ‘verificationism’). This tendency to persist in initial judgements and reframe, minimise or dismiss discordant new evidence is seen in the literature (Burton, 2009, cited in Collins and Daly, 2011) as an important issue in social work. To counteract this confirmation bias, it is argued that decision makers need to be reflexive about the way the decision situation is framed and should not only seek to continuously question their assumptions, but also actively seek information that sheds doubt on those assumptions (O’Sullivan, 2011, cited in Collins and Daly, 2011).


Evidence-based practice

Introduction

So far we have seen that social workers incorporate a range of evidence types in assessment and decision-making; the process is complex and difficult to articulate. In the two empirical studies introduced in the previous chapter, research evidence did not routinely play a significant part in the process (Gordon et al, 2009; Collins and Daly, 2011). As we will see, there are many commentators who subscribe to the belief that social workers have an ethical obligation to base their decisions and interventions on the best evidence of effectiveness. The notion of using research findings to inform social work (and indeed other human services) is known as ‘evidence-based practice’.

Evidence-based practice (EBP) was traditionally based on the notion of a linear model of knowledge production and transfer, whereby research findings produced in one location are transferred to the context of use through various mechanisms, such as the development of intervention guidelines or treatment protocols (Eccles and Mittman, 2006; Proctor et al, 2009 – both cited in Mitchell, 2011). However, there is increasing recognition that EBP does not, or should not, imply a one-way transfer of knowledge from academic research to the practice setting. Indeed, some commentators have argued that if research is to be relevant to practice, we need ‘practice-literate researchers’ as much as research-literate practitioners (Fisher, 2013). Therefore in recent years, this field has been a source of much debate across the human services, including social work, community services, child protection and mental health.

In this chapter we draw on the included studies to shed light on possible reasons why the social workers in the Collins and Daly (2011) study did not report a greater use of research evidence. We will examine suggestions for overcoming the barriers and finish by describing one attempt to improve the use of research evidence in decision making and practice.

Defining evidence-based practice

A broad definition of EBP is adopted in this chapter. However, in the next chapter we will see that others use a far narrower concept of EBP.

Sackett et al (2000) define EBP as:

‘a process of clinical decision making that entails “the integration of best research evidence with clinical expertise and patient values” involving five steps:

1. Convert one’s need for information into an answerable question.

2. Locate the best clinical evidence to answer that question.

3. Critically appraise that evidence in terms of its validity, clinical significance, and usefulness.


4. Integrate this critical appraisal of research evidence with one’s clinical expertise and the patient’s values and circumstances.

5. Evaluate one’s effectiveness and efficiency in undertaking the four previous steps and strive for self-improvement.’
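
Read as a workflow, this definition describes an iterative loop: ask, search, appraise, integrate, evaluate. The sketch below renders those five steps as skeleton code purely for illustration; the function names, placeholder logic and example data are assumptions of this example, not part of Sackett et al’s definition or of any real EBP system.

# Illustrative only: the five Sackett et al steps rendered as a simple sequence.
# All function bodies are hypothetical placeholders, not a real EBP tool.

def formulate_question(information_need: str) -> str:
    """Step 1: convert an information need into an answerable question."""
    return f"For this client group, what does the evidence say about: {information_need}?"

def locate_evidence(question: str) -> list[str]:
    """Step 2: locate the best available evidence (placeholder search)."""
    return [f"study relevant to '{question}'"]

def appraise(evidence: list[str]) -> list[str]:
    """Step 3: critically appraise for validity, significance and usefulness."""
    return [e for e in evidence if "relevant" in e]  # stand-in for real appraisal

def integrate(appraised: list[str], expertise: str, client_values: str) -> str:
    """Step 4: integrate appraised evidence with expertise and client values."""
    return f"decision informed by {len(appraised)} studies, {expertise}, {client_values}"

def evaluate_and_improve(decision: str) -> None:
    """Step 5: evaluate one's own effectiveness in steps 1-4."""
    print(f"Review outcome of: {decision}")

decision = integrate(
    appraise(locate_evidence(formulate_question("anxiety in looked-after children"))),
    expertise="practitioner judgement",
    client_values="family preferences",
)
evaluate_and_improve(decision)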

Barriers to evidence-based practice

A solid body of research was identified during the review of the literature, which helps to shed light on the difficulties in using evidence to inform social work practice. The barriers are discussed below, and then we look at how they could be addressed.

Inadequate agency resources

In Gray et al’s (2012) evidence review, a lack of agency resources to support EBP was reported in all of their included studies. Two aspects of agency resources identified as barriers were:

• staff time

• infrastructure providing access to research evidence.

Time was the most frequently identified barrier, found in 10 of the 11 studies in Gray et al’s review. The most significant barrier to EBP implementation was noted when EBP was regarded as something in which practitioners and managers must engage on top of their normal full workload, without recognition in terms of additional staff time allocations. This finding was corroborated in Gray et al’s later survey of Australian social workers (Gray et al, 2013), which explored their perceptions of EBP. ‘Time’ and ‘time compression’ were identified most frequently of all the barriers. Social workers said that it was predictable that if they were expected to find and appraise evidence on top of their normal workload then the lack of time ‘naturally emerges as a barrier’ (Gray et al, 2013). Social workers have also pointed out that using any ‘spare’ time for reading books and reflecting on practice would be viewed by colleagues as a luxury in the face of pressing demands from caseloads (Gordon et al, 2009).

The second aspect of agency resources found to be lacking is information technology and library facilities for accessing web-based databases. Investment in these resources is, in Gray et al’s (2012) view, a prerequisite to making EBP a reality.

Skills and knowledge needs

Survey respondents also reported that they lacked clarity about what exactly constitutes evidence, how to find it and how to critically appraise it (Gray et al, 2013). This is in line with the wider literature, which suggests that one of the reasons for this is that social workers are not trained to apply research to the practice context in the way that professionals in other disciplines are. The skills identified by the literature as lacking include:


• critical appraisal

• handling data

• transferring research findings into practice settings.

Literacy in information technology was also found to be lacking and, finally, supervisors seem to be ill equipped to supervise social workers in the implementation of research knowledge in practice.

Agency culture

The aspects of agency culture identified in the literature as barriers were

‘blame cultures’ resulting in practitioners being unable to work flexibly outside of guidelines or agency norms and:

‘… a lack of critical questioning (Booth et al, 2003), no prior experience in utilizing research to inform practice (Burke & Early, 2003; Stevens et al, 2005), punitive, constraining, or overly bureaucratic management or administrative procedures (Collinscamargo, 2007; Jones et al, 2007;

Straussner et al, 2006), and reactive approaches to practice, where evidence that is directly relevant was expected immediately.’ (Gray et al, 2012, p163)

Survey respondents were in agreement and suggested that EBP must therefore be tackled at the organisational level as well as the individual practitioner level.

Research environment

Interestingly, Gray et al’s review identified barriers on the part of research, not just on the part of practice. The authors identified a mismatch between the focus of research evidence and the requirements of the social work practice context, including the complexities of client circumstances. In the face of out-of-touch or simply unavailable research, it is difficult to stimulate a culture of research utilisation (Stevens et al, cited in Gray et al, 2012).

Practitioner attitudes

Although ‘practitioner attitudes’ was cited by fewer papers in Gray et al’s (2012) review than the other issues, it nevertheless featured and would arguably be addressed by removing all the other barriers. In the face of all the other barriers discussed here, it is unsurprising that social workers have been found to be suspicious ‘about the trustworthiness of research and the applicability of EBP to a human service context’ (Gray et al, 2012, p 164).

In contrast to the broader literature, Gray et al’s survey identified a far more positive perspective among social workers despite all the barriers. Most said that they were familiar with research and keen to use it to inform their work with clients. Equal proportions wanted to engage by producing research evidence themselves and by using research-based guidelines developed by others.


In addition to the factors discussed here, Aarons and Palinkas (2007) suggest that a gulf exists between usual practice and EBP because of problems in implementing a new service or approach. They believe that child welfare systems present particular challenges to implementing EBP and as a result effectiveness is bound to be compromised. They cite the highly bureaucratic nature of the child welfare system and couple this with evidence that bureaucratic systems often include many different decision makers in addition to the practitioner, and are often linked with negative practitioner attitudes about adopting EBP (Aarons, 2004, cited in Aarons and Palinkas, 2007). It is also suggested that there are particularly problematic mediating factors in the context of child welfare services. Although the child or young person must be the central focus, their care and support is mediated through carers or parents ‘who may not be amenable to receiving such services’ (Aarons and Palinkas, 2007). Although all aspects of social work are vulnerable, Aarons and Palinkas conclude that the nature of child welfare makes it particularly susceptible to the range of ‘system, structural, process, and person factors’ that compromise EBP implementation.

Costs

Although they are rarely identified in their own right, it follows from many of the barriers described in the literature that the costs of EBP are a major obstacle to implementation. In particular, the resource barriers (time and access to library facilities), shortcomings in training, limitations of supervision and availability of research evidence would in large part be improved if more funding were available. Weisz et al speculate that:

‘The high costs involved in providing adequate supervision, research support and overcoming organizational barriers may severely limit the wider feasibility of current state-of-the-art integrative approaches to EBP implementation.’ (Weisz et al, 2003, cited in Mitchell, 2011, p 212)

The question of who is going to pay for this development is rarely considered in the literature on EBP implementation. The required changes are likely to need investment so when funding is scattered around pilot projects it is hardly surprising that EBP has not been more widely implemented in the social work context (Mitchell, 2011). Nevertheless, many commentators have, on the basis of secondary reviews and empirical evidence, identified opportunities for improving the use of research in decision making and social work provision.

These ‘facilitators’ of EBP implementation are discussed in the next section.

Facilitators of evidence-based practice

Facilitators are often presented as the direct ‘answer’ or solution to the barriers. The implication is that by overcoming the barriers this will facilitate EBP uptake. For example, combining findings from their evidence review with their survey results, Gray et al (2012, 2013) made brief recommendations for dealing with inadequate agency resources, skills and knowledge needs, agency culture, the research environment and practitioner attitudes – the same themes under which they discussed barriers. They can also be characterised as organisational facilitators in the sense that the organisation or agency has the authority to enable the changes.

Organisational facilitators

Inadequate agency resources: organisations need to invest in dedicated staff time to devote to EBP as well as staffed library facilities and information technology support to enable easy access to web-based databases.

Skills and knowledge needs: Gray et al’s findings indicate a need for ongoing professional development and training to address gaps in knowledge, skills and understanding of research and EBP.

Agency culture: it is widely accepted that the culture of social work agencies must be changed to enable the implementation of EBP. Gray et al conclude that a ‘supportive culture’ would include management structures and procedures, protocols and guidelines designed to embed EBP plus supportive practitioner supervision that helps social workers handle and implement research evidence.

Practitioner attitudes: Gray et al place the responsibility for improving practitioner attitudes to EBP in the hands of social work organisations. They maintain that organisations need to develop strategies and provide resources to ensure an appropriate balance between research and development teams and frontline practitioners. The objective is to strengthen the organisation’s skill base and so improve its EBP capabilities (Gray et al, 2013). The wider literature validated their survey conclusions:

‘To facilitate the uptake of EBP in social work and human services practice, strategically driven, adequately resourced, multifaceted approaches to EBP capacity building in organizations are needed.’ (Gray et al, 2012, p 157)

Individual facilitators

Focusing on the individual rather than the organisation, Collinscamargo (2007) has looked at the attributes or beliefs that practitioners must possess as a prerequisite for using research in their decision-making processes. The first is that they need an understanding of what EBP means within the context of their own agency.

Second, they need to believe that EBP is essential for ensuring that best practice is delivered and that EBP is an ethical imperative. Others have also written about the ethics of EBP: ‘the ethical obligations of social workers is to “fully use evaluation and research evidence in their professional practice” (National Association of Social Workers 1999, cited in Salloum et al, 2009) to maximize their clients’ well-being’ (Salloum et al, 2009, p 263).


Next, Collinscamargo (2007) emphasises that practitioners must believe that EBP is valued and will be supported by the organisation, which obviously links with the need for organisational change discussed above.

Finally, Collinscamargo (2007) cites ‘expectancy valence’, which means that practitioners must believe that using research evidence to inform their decision making and practice will ultimately improve outcomes for the families they support.

Some of the literature expands the concept of EBP beyond the ‘narrow application’ of research findings in practice. For example, Collinscamargo’s (2007) study of implementing EBP in child protection demonstrated the importance of practitioners also having an outcomes-based orientation to their work with clients. This requires the practitioner to work in ways that make it possible to identify whether an intervention (rather than any other factor) has achieved change for the individual or family. Using agency data systems, social workers should record initial goal setting and evidence subsequent progress. Finally, the author concludes that practitioners should be supported in an ongoing process of self-reflection.

A case study in facilitating evidence-based practice

The evidence is unanimous in identifying practitioners’ lack of time as having to be addressed or accounted for in trying to implement EBP. The ‘What Works for Children’ project attempted to address the ‘time barrier’. An implementation officer was funded to work specifically with busy social workers, helping them to identify research questions that had arisen as part of their practice. The implementation officer then searched for relevant evidence, appraised and synthesised it and provided evidence summaries to the practitioners (Stevens et al, 2005, cited in Gray et al, 2012).

Almost all the social workers reported that the summaries were useful although a disappointingly small proportion (less than half) said that they thought there would be any change to agency practice as a result of what they had learnt. This finding underlines the importance of a multifaceted approach to ensuring that research evidence can be implemented in social work decision-making and support. There is a limit to the research evidence that practitioners alone can implement; a strategic approach is required with research application driven at the management level (Barratt, 2003 and Bellamy et al, 2008, cited in Collinscamargo, 2007; Jones et al, 2007 and La Mendola et al, 2009, cited in Gray et al, 2012).


Implementing effective interventions

Introduction

The last two chapters were based on a broad concept of ‘evidence-based practice’; namely, drawing on research evidence in the decision-making process. This review found that there is a wealth of literature focused on how effective interventions can be transferred directly into the social work setting.

This is a very narrowly defined concept, which refers to implementing a standard, manualised social work intervention for which a body of supporting effectiveness evidence exists. This type of intervention is typically referred to as an ‘evidence-based intervention’ or ‘empirically supported treatment’ (EST). It is still an approach to EBP but in this chapter we will see how it differs from the notion of using evidence to inform decision-making. We will also see that there are different views about how strictly it is reasonable to transfer an EST into practice. At one end of the spectrum there are strict advocates of replicating effective ESTs in social work, while at the other end of the spectrum there are those who acknowledge the importance of mediating factors such as practice wisdom and the complex realities of social work. These positions are discussed below. The chapter then examines barriers and facilitators to implementing ESTs and finishes with a case study of one attempt to implement ESTs.

Empirically supported treatments

Definitions of ESTs vary in their specificity. Many studies assume a narrow definition, where ESTs refer to standardised treatment protocols that have been demonstrated to be clinically effective in randomised controlled trials. Some writers use the term more loosely, referring to ‘treatments supported by empirical evidence’ or ‘interventions showing beneficial effects in outcome research’ (Weisz et al, 2006, cited in Mitchell, 2011, p 209).

A distinguishing feature of the EST literature is that it focuses on treatments for defined disorders or problems. This position relies heavily on the notion that specifiable disorders exist and that they can be meaningfully distinguished from one another. This arguably naive assumption is probably the greatest flaw in the strict EST position and leads us to the next: ‘integrative approaches’.

Integrative approaches

Critics of the EST model from a practitioner perspective have argued that it is too narrow in the range of evidence it accepts and that it places too little value on the expertise of clinicians or the experience and preferences of clients (Walker, 2003; Larner, 2004; Epstein, 2009 – all cited in Mitchell, 2011). These critics have stressed the need to combine science and practice wisdom in an understanding of how social workers implement empirically based interventions. Consistent with this, some implementation researchers have begun working with the Institute of Medicine’s definition of EBP, which centres on the integration of best research evidence with clinical expertise and patient values. Researchers taking this position have examined the ways in which and reasons why ESTs are adapted by practitioners when implemented in practice settings (Aarons and Palinkas, 2007; Stirman et al, 2004; Hogue et al, 2008; McHugh et al, 2009 – all cited in Mitchell, 2011). They conclude that modification of new treatments is inevitable, because client characteristics, needs and contexts vary from those in which the treatments were originally developed and tested.

Common factors approach

The third significant position rejects the EST model altogether and is influenced to some extent by the ‘integrative approach’ (above).

The crux of the ‘common factors’ approach is the claim that large-scale meta-reviews of psychological treatment effectiveness in fact show that therapeutic techniques specific to particular treatments may account for no more than 15 per cent of the variance in behaviour change and other outcomes. Instead, authors point out that several factors common across most treatments account for the bulk of the variance (Clark, 2001a, 2001b; Hubble et al, 1999; Miller and Duncan, 2000 – cited in Mitchell, 2011). Key among these ‘common factors’ are client factors (e.g. beliefs, understandings, optimism, support networks) and the quality of the client–therapist relationship. In other words, when practitioners implement ESTs, it is not the empirically supported techniques that account for outcome changes but other factors or influences brought to the treatment setting from outside. These are said to account for 40 per cent (client factors) and 30 per cent (client–therapist relationship) of the variance in behaviour change or other outcomes.

Practice or core elements approach

It is evident that the strict EST approach is limited in its applicability to agencies serving children and young people with multiple and complex needs. Mitchell (2011, p 214) points out that:

‘… a new model is needed which reconciles values from clinical science and practice wisdom, accommodates a diverse array of interventions for a wide variety of psychosocial issues, facilitates collaboration across sectors, and minimizes the inevitable costs of change.’

A key starting point is to respect and build on existing practice, rather than attempting to replace it with a new empirically supported intervention. It is therefore necessary to gain a greater understanding about current practice when trying to build evidence into practice.

The ‘practice or core elements’ approach is based on the idea that effective treatment models are comprised of numerous elements that can be identified, specified and employed in different ways. It rejects the assumption that these elements can only be organised and delivered in fixed arrangements specified in EST models. Taking an example from the mental health field, Chorpita and Daleiden have found that different frequencies and combinations of practice elements are used in evidence-based treatments for a variety of mental health problems, age groups and ethnic groups. They have developed a Distillation and Matching Model to help decision makers identify practice elements likely to be most effective for particular clients or groups, based on how frequently those elements are found in successful treatments with populations matching the client on specified variables (Chorpita and Daleiden, cited in Mitchell, 2011). In doing so, evidence-based yet tailored solutions can be introduced to complex practice situations.
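
To illustrate the matching idea, the sketch below shows one way frequency-based selection of practice elements could look in code. The data structures, element names and client variables are invented for this example; it is not an implementation of Chorpita and Daleiden’s actual Distillation and Matching Model.

# Hypothetical sketch of frequency-based matching of practice elements.
# The data below are invented for illustration; they are not Chorpita and
# Daleiden's model or data.
from collections import Counter

# Each successful treatment study is summarised as the practice elements it used
# plus the population variables it was tested with (all values are made up).
studies = [
    {"elements": ["exposure", "psychoeducation", "relaxation"],
     "population": {"age_group": "child", "problem": "anxiety"}},
    {"elements": ["exposure", "cognitive_restructuring"],
     "population": {"age_group": "child", "problem": "anxiety"}},
    {"elements": ["parent_training", "praise"],
     "population": {"age_group": "child", "problem": "conduct"}},
]

def rank_elements(client_profile: dict) -> list[tuple[str, int]]:
    """Rank practice elements by how often they appear in successful studies
    whose population matches the client on the specified variables."""
    matching = [s for s in studies
                if all(s["population"].get(k) == v for k, v in client_profile.items())]
    counts = Counter(element for s in matching for element in s["elements"])
    return counts.most_common()

# Example: elements most frequently used with anxious children in this toy data.
print(rank_elements({"age_group": "child", "problem": "anxiety"}))
# [('exposure', 2), ('psychoeducation', 1), ('relaxation', 1), ('cognitive_restructuring', 1)]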

The practice or core elements approach offers solutions to several of the key barriers to implementation. Most importantly, it is conducive to reciprocal adaptation between treatment and context. This means that the intervention can be adapted to fit the practice context while recognising that practice may need to change to accommodate the intervention.

By breaking treatments down into small elements, practitioners and clients are better able to choose therapeutic content that addresses psychotherapeutic needs and therapeutic techniques best suited to the skills and style of the therapist and the nature of the client–therapist relationship. Content and techniques can be more readily selected and organised according to the developmental stage of a young person. This high level of adaptability to context may neutralise key barriers to EBP currently manifest in provider attitudes, the characteristics of client populations and the characteristics of usual practice (Mitchell, 2011).

Although the practice or core elements approach appears to offer a pragmatic, flexible solution, its proponents concede that deconstructing treatment protocols into their essential elements may compromise their effectiveness. Protocols are, after all, more than the sum of their parts, and breaking them up in this way may eliminate implicit, un-definable or meta-level characteristics.

Implementing empirically supported interventions

A key question underpinning this debate is whether the implementation of manual-based ESTs in practice settings achieves positive outcomes compared with those in controlled trials. We have seen why many commentators would say this is not possible. However, some research has demonstrated that manual-based therapies can be successfully employed in community settings (Stirman et al, 2004, cited in Mitchell, 2011). On the other hand, process evaluation has also demonstrated that success is difficult and costly to achieve (Weisz et al, 2003; Liddle et al, 2006, cited in Mitchell, 2011).

In their overview of ESTs for childhood anxiety disorders, Salloum et al (2009) highlight reasons why implementation is difficult and usefully provide options for overcoming the most common barriers to implementing ESTs. Crucially, for the purposes of this report, Salloum et al’s barriers and means of overcoming them specifically relate to the implementation of ESTs in the context of social work with children and young people. They are as follows.

A belief that an EST will not work in the ‘real world’ of social work practice

Scepticism about the generalisability of treatments tested in controlled environments is a potential barrier to implementing empirically supported childhood anxiety interventions. Given the complexity of the practice context and client factors, this scepticism is hardly surprising. If treatments for childhood anxiety disorders have only been tested with homogeneous groups in controlled settings, the concerns of social workers, who treat children with co-morbid conditions from diverse populations and in diverse settings, are quite valid.

Options to address this barrier include:

• using an EST as a first line of treatment

• testing ESTs in real-world settings

• adapting ESTs to meet the unique needs of clients

• evaluating results.

If an EST for childhood anxiety has only been tested in a controlled setting, social workers have a unique opportunity to evaluate the treatment in a real-world social work practice setting – this way the assumption that a specific treatment developed in tightly controlled clinical trials is not effective in ‘real-world’ practice can be tested (Salloum et al, 2009).

A belief that therapeutic relationships or rapport could be damaged by using manualised treatments

In terms of the effects on practitioners, some of the perceived advantages of manualised treatment are the belief that manuals provide motivation, structure and resources for practitioners, and improve practitioners’ skills. Perceived disadvantages include the belief that manualised treatment is overly mechanical, not useful for attending to individual needs and may limit therapeutic creativity and authenticity. There is also a commonly held belief that manualised treatments are ‘cookie cutter’ approaches that devalue the therapeutic relationship between practitioners and clients (Salloum et al, 2009).

There are in fact a number of examples of manualised treatments that do recognise the importance of the clinician–client relationship and therapeutic processes. Practitioners are encouraged to develop a healthy therapeutic relationship and strong rapport while using manualised treatments for children with anxiety disorders. For example, in the Coping Cat: Cognitive-Behavioral Therapy for Anxious Children manual (Kendall and Hedtke, 2006, cited in Salloum, 2009), the first task is to build rapport: ‘Rapport between the anxious child and the therapist is critical to the success of therapy, and it is certainly worthwhile to devote ample time to the establishment of a trusting relationship between the child and therapist.’ Also, prior to the implementation of a manual-based treatment, a trusting relationship can be facilitated by discussing with the parent and child (when appropriate) ways in which ESTs are known to be effective in treating childhood anxiety (Howard et al, 2003, cited in Salloum et al, 2009). It does therefore appear that implementing a manualised treatment does not negate the need for building a trusting relationship between practitioner and client; in fact, it could be said to rely on that rapport.

A belief that caseloads are too heavy to implement ESTs

ESTs are often cited as impractical in managed care settings. Practitioners note that substantial caseloads often require over 40 hours a week of client contact and documentation, making it challenging to find additional time to become proficient in the implementation of ESTs (Nelson, 2006, cited in Salloum et al, 2009). Citing fatigue from high caseloads, practitioners also express a lack of energy or desire to read evidence-based research and treatment manuals or to complete required training on ESTs.

On the other hand, Salloum et al point out that once ESTs are learned and practitioners are competent in implementing them, these systematic and efficacious approaches may actually assist social workers in managing heavy caseloads. For example, in a study of service provider perspectives after implementing a new EST in the child welfare system, many caseworkers perceived the structure of the EST as a positive treatment aspect and the manualised format as being helpful. Therefore, rather than being viewed as a hindrance or aggravation when providing services, ESTs should be seen by social workers as a cost-effective and essential tool to navigate the care system (Aarons and Palinkas, 2007).

A belief that it is too difficult to adapt practice to implement ESTs with fidelity

Probably one of the biggest objections to implementing manualised ESTs is the notion of following a prescribed intervention within real practice settings. Many social workers share the view that integrating ESTs with a high degree of fidelity is unfulfilling and limits their creativity and innovation in the therapeutic process (Addis, 2002, cited in Salloum et al, 2009). Even when the will was there, project staff in the Collinscamargo (2007) study found it difficult to measure fidelity to the practice model being tested. Supervisors noted their inability to implement all aspects of the clinical supervision model they had been taught. Consistent with this finding, Nelson et al (cited in Salloum et al, 2009) found that flexibility when employing manualised treatment is an essential component for practitioners due to the complexity of individual client situations and needs.

In response to concerns about maintaining fidelity, Salloum et al (2009) point out that even in clinical trials for children with anxiety disorders, treatment manuals are used in a flexible manner where fidelity to treatment principles is maintained, while individualised care is provided. For example, in an open trial of cognitive-behavioural therapy with children with obsessive compulsive disorder conducted in a community clinic, the treatment manual included the statement:

‘It is emphasized that the manual should be used as a guide and in a flexible way, where consideration is paid to the child’s age and developmental level and to individual variations in problems and themes that need to be addressed.’ (Valderhaug et al, 2007, p 582, cited in Salloum et al, 2009, p 268)

Along these lines, most ESTs appear to allow for and encourage flexibility in implementation.

Ensuring fidelity to a prescribed treatment can in any case be complex. In fact, even if a treatment is delivered as it was prescribed, the issue of level of skill, or degree of competence, with which it is delivered may be a factor that influences fidelity. Methods to ensure fidelity can range from simple checklists for completing tasks outlined in treatment manuals, to intensive clinical supervision and training, to more labour-intensive strategies such as having independent evaluators review taped sessions (Tucker and Blythe, 2008, cited in Salloum et al, 2009).
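
As a purely illustrative example of the simplest of these methods, the sketch below scores a single session against a checklist of manual-specified tasks. The task names and the 80 per cent threshold are invented assumptions for this example, not drawn from any cited manual.

# Hypothetical illustration of a simple fidelity checklist for one session.
# Task names and the 80% threshold are invented, not from any cited manual.
SESSION_TASKS = ["build rapport", "review homework", "psychoeducation", "set new homework"]

def fidelity_score(completed_tasks: set[str]) -> float:
    """Proportion of manual-specified tasks completed in the session."""
    return len(completed_tasks & set(SESSION_TASKS)) / len(SESSION_TASKS)

session = {"build rapport", "psychoeducation", "set new homework"}
score = fidelity_score(session)
print(f"Fidelity: {score:.0%}")  # Fidelity: 75%
print("Meets threshold" if score >= 0.8 else "Flag for supervision review")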

A lack of training opportunities for learning ESTs

Implementation of ESTs generally demands extensive training of practitioners and ongoing support such as clinical supervision, secondary consultation and case reviews (Mitchell, 2011). Therefore, a lack of training opportunities (i.e. workshops, conferences, web-based information and ongoing supervision) is a potential barrier to implementation. It is important to recognise that training barriers may be confounded by factors within the organisational and community contexts, such as resources, constraints, mandates and availability of trained supervisors for ongoing supervision. Indeed, the forms of capacity building required to support implementation demand long-term investment of funds and other resources.

To address the barrier of social workers needing more training opportunities to learn ESTs, Salloum et al (2009) suggest that graduate schools of social work need to increase the instructional opportunities and practical experience of using ESTs to treat mental health conditions such as anxiety disorders. In addition, research on implementation strategies to advance ESTs in real-world social work practice settings is needed.

A case study in implementing ESTs

A useful illustration of how to roll out an EST is given in a study of an intervention to reduce child neglect (Aarons and Palinkas, 2007). Case managers were initially trained over five days, involving trainers modelling the required skills through role play and finishing with assessment and checklists for measuring skills acquisition. The intervention ‘Safe Care’ was then rolled out in the actual practice setting by case managers who were observed and coached by consultants (trainers).

To evaluate the roll-out, the 15 participating case managers and two of the consultants/trainers were interviewed. The objective was to identify their views about factors influencing implementation of an EST in a child welfare system.

Six themes emerged as key determinants of whether the intervention could be implemented, some of which validate the barriers and facilitators identified in this review.

Acceptability to the case worker

The more experienced case managers resisted the Safe Care model because it was seen as being too structured in covering areas they already routinely address in usual practice.

Suitability of the EST to the needs of the family

The main problem here was that Safe Care was obviously suited to families with young children. The model contained elements to be delivered during ‘play’, during ‘bath time’ and during ‘dress time’. Therefore it was virtually impossible to see how it could be implemented with older children. Case managers reported a real quandary when families resisted an element of the model because it was deemed ‘ridiculous’ and as a result the practitioner’s own credibility was undermined. This is in line with the wider literature that describes difficulties in maintaining fidelity when implementing ESTs in practice. Case managers also noted that ‘the complex nature of family problems and situations limited the appropriateness and effectiveness of the EBP’ (Aarons and Palinkas, 2007, p 414). Ultimately, families had so many competing demands that case managers had to stray from the Safe Care protocol and address urgent needs as a priority.

Case manager motivation for the EST

The implementation of Safe Care was affected by various practitioner motivations, including: the fit between the intervention and their parenting experiences, the perceived impact on their own professional competence, the fit with their current practice and the fit between the intervention and the organisation’s mission. Successful implementation also depended on whether the practitioner thought it was having a beneficial effect: ‘when you see the results of how it works, it makes you want to continue to do it’ (Aarons and Palinkas, 2007, p 414).

Experience with being trained in Safe Care

The process of training and the practitioners’ perception of the trainers also shaped their attitudes to implementation. Practitioners were especially positive when trainers demonstrated flexibility and responsiveness to the realities of practice: ‘the trainers have been really good at listening to us and relaxing a little bit – changing the format so it’s not too rigid’ (Aarons and Palinkas, 2007, p 415). There were deep concerns when the trainers showed that they did not understand the realities of rolling out the EST in practice.

Extent of organisational support for the EST

Organisational support was seen as crucial to implementation, and it was apparent that this EST had full agency support at all levels of the organisation. Line manager support was especially important for working out how to implement the EST in the context of real practice examples.

Impact of the EST on processes and outcomes

In the case of Safe Care, views were mixed about how well it fitted the realities of practice and how well it could respond to clients’ complex circumstances; case managers clearly felt that it could not be applied across the entire client population. This is in line with the wider literature, which cites perceptions of the difficulty of adapting ESTs to real practice settings as a potential barrier.

***

In a later study comparing implementation of Safe Care with another EST (Palinkas et al, 2009), the same authors along with others also identified qualities in the relationship between case managers (end users of the EST) and trainers (propagators of the EST) that are required for successful EST implementation.

The first was for both parties to be accessible to each other during the implementation process. So, for example, if a case manager had a query about an aspect of the model implementation, trainers felt that they should drop everything to respond, otherwise the implementation might stall. End user and propagator also had to have mutual respect for each other and share a common understanding and language about the EST. Finally, compromise was seen to be key. The propagators of both ESTs made it clear to case managers that some adaptation of the model was permissible. In fact, the process started with supervisory sessions where the propagators worked out how the model could be implemented within practitioners’ usual way of working with clients. This served to alleviate practitioners’ worries about the rigidity of the model and the extent to which it could fit the practice context:

‘Even though they [the propagators] have a very strict focus on the research and making sure everything stays, you know, tight with respect to the research, they also want to make sure that it’s delivered in a way that is implementable and really useful to the family…’ (Palinkas et al, 2009, p 608)

The only drawback to allowing this flexibility was that, occasionally, practitioners were unclear about the boundaries and some propagators were seen as being more or less ‘loyal’ to the model than others. Practitioners welcomed flexibility but for implementation to be successful, that flexibility had to be consistent.


Recommendations for future research and development work

Recommendations for future research

In terms of informing an understanding of how social workers use research evidence, with what effect and how this could be increased, the literature reported here identified many recurring themes. We know the range of barriers that limit successful implementation of research into decision-making and practice. However, an important contribution would be made by research that documents what changes to social work practice occur when social workers do use research in decision-making.

Another area that would benefit from research investigation is the extent to which the tacit knowledge used in practice is informed by research.

Recommendations for development work

There is a pressing need for development work to overcome the barriers to the use of research in decision-making and practice and to ensure that evidence and practice are effectively combined to improve the impact of social work.

Evidence suggests that there are two main areas in which development should take place.

Improving collaborative working between practitioners and researchers

Better collaboration between social workers and the so-called ‘producers’ of research evidence would improve not only the implementation of ESTs but also EBP in the broader sense of ‘decision-making processes’. Salloum et al (2009) focused on improving the implementation of ESTs:

‘Overcoming barriers to using ESTs to treat childhood anxiety disorders in social work practice will require … interdisciplinary university–agency partnerships to advance the field by conducting effectiveness studies that use randomized clinical trials to compare treatment as usual versus ESTs.’ (2009, p271)

The authors also believe that these partnerships would help practitioners and researchers, working together, to ensure that intervention manuals are adapted to the complexities of social work practice. Manuals would be made easier to use and more flexible, increasing adoption and improving fidelity. Salloum et al (2009) also propose that companion manuals should be jointly developed to help practitioners implement ESTs in diverse settings, including with diverse populations. Through this process, researchers and practitioners could learn from each other about what interventions are effective with different populations and what strategies are best for implementing ESTs in real practice settings (Salloum et al, 2009).


The development of working relationships between researchers and practitioners would similarly help to improve the use of research evidence in decision-making processes. It would challenge the notion of EBP as a one-way linear process and improve the practice relevance and applicability of research. This is reflected in the international literature, where evidence-based policy and practice have begun to recognise the need to build on the concerns of practice and to improve the collaborative working relationship (see Salisbury Forum Group, 2011; Julkunen et al, 2012). Gray et al (2012, p165) conclude that ‘ongoing innovation within organisations to discover new ways to bridge the research–practice divide in the human services is also indicated’.

Curriculum development

Another major area for development is in (pre- and post-qualifying) training and support for social workers since the literature raises some issues about practitioners’ knowledge and confidence in handling research evidence.

Gordon et al (2009) also suggest that there might be opportunities to develop a greater range of ways to help student and qualified social workers become more confident at articulating how they have incorporated research evidence in reaching decisions. Specific skills training should teach students the processes involved in using research, including finding, appraising and synthesising evidence, and course content and tools should be designed accordingly.

Finally, Salloum et al (2009) believe that graduate training should be developed to train and support social workers in implementing ESTs.


Summary and conclusion

Summary

This review set out to establish the extent to which social workers use research in their decision making.

Social workers understand the importance of evidence for decision making, and they use research in decision making. However, they tend to use evidence from a broad range of other sources more frequently, such as case histories and notes, their own observations and reports from other professionals, which combined help them to make judgements about risk and interventions. Some of this evidence may include knowledge where the research origins have been lost – tacit knowledge – and so research can be thought of as having only an indirect influence on decision making in such cases.

Other things that have been reported to influence social workers’ decision making are ‘gut feelings’, intuition, team or individual bias and organisational norms. With regard to gut feelings, again it is important to note that these may be research based but, having been absorbed into professional practice, have lost their research label.

The review therefore found that research tends to have an indirect influence on social workers’ decision-making processes and that many social workers use other types of ‘evidence’. Those who do use research in decision making are more likely to have had recent formal education than to be among the more experienced practitioners.

When there are gaps or contradictions in the evidence that a social worker has gathered for a particular decision, this poses a significant threat to the decision-making process: they feel less confident in the decisions they are making, and they seek further information to bolster the evidence they already have. It is interesting to note that the use of research evidence features more when there is a lack of observational evidence.

Several barriers have been put forward to explain why research does not make a bigger contribution to decision making:

• inadequate agency resources
• skills and knowledge needs
• agency culture
• the research environment
• practitioner attitudes.

However, this review has highlighted some ways of improving the use of research in decision making and social work provision, drawn from the literature, which can be divided into organisational and individual facilitators.


Organisational facilitators include:

• improved information technology access and support
• ongoing professional development to address gaps in skills, knowledge and understanding of research
• a culture that is supportive of the use of research
• research and development teams.

Individual facilitators include the following:

• Social workers need an understanding of what EBP means within the context of their work.
• They need to believe that EBP is essential for ensuring that good practice is delivered.
• They must believe that EBP is valued and will be supported by the organisation.
• They must believe that using research evidence to inform their decision making and practice will ultimately improve outcomes for the families they support.
• They must have an outcomes-based approach to their work.
• They must go through an ongoing process of self-reflection.

(Collinscamargo, 2007)

Conclusion

In conclusion, this review has shown that social workers use research evidence to inform decision making, albeit to a lesser extent than other knowledge types such as user and carer views, legislation, agency norms and their own practice wisdom. The review also found that, even in the complex social work context, interventions proven to be effective can be transferred, albeit with difficulty, to practice settings.

Drawing on research evidence to inform decision-making and social work practice is largely viewed in a positive light: if it is done successfully, social work outcomes can be improved. Gambrill states that social work has to adopt an evidence-based approach to ‘move forward’ (Gambrill, 2001, cited in Collinscamargo, 2007). However, Webb rejects the idea that social workers will change their practice, even in the face of convincing research evidence – and in this report we have seen the reasons that might explain this. Even if all those barriers could be addressed, some believe that child welfare is just too complex to benefit from an emphasis on empirically proven programmes (Angel, 2003, cited in Collinscamargo, 2007).

On balance, the literature included in this review suggests that an integrated approach to using evidence is desirable. To maximise the positive impact of social work, practitioners must be enabled to draw on the best available evidence of what works and what is acceptable to children and families. At the same time, we should accept the role of intuitive processes in social work decision making, as Collinscamargo (2007, p29) concludes: ‘One cannot remove the importance of individual judgment in assessment and decision-making …’.


References

Aarons, G.A. and Palinkas, L.A. (2007) ‘Implementation of evidence-based practice in child welfare: service provider perspectives’, Administration and Policy in Mental Health & Mental Health Services Research, vol 34, no 4, pp 411–419.

Collins, E. and Daly, E. (2011) Decision making and social work in Scotland, Glasgow: Institute for Research and Innovation in Scotland.

Collinscamargo, C. (2007) ‘Administering research and demonstration projects aimed at promoting evidence-based practice in child welfare’, Journal of Evidence-Based Social Work, vol 4, no 3–4, pp 21–38.

Fisher, M. (2013) ‘Beyond evidence-based policy and practice: reshaping the relationship between research and practice’, Social Work & Social Sciences Review, vol 16, no 2, pp 20–36.

Gordon, J., Cooper, B. and Dumbleton, S. (2009) How do social workers use evidence in practice?, Milton Keynes: Practice-based Professional Learning Centre for Excellence in Teaching and Learning, The Open University.

Gray, M., Joy, E., Plath, D. and Webb, S.A. (2012) ‘Implementing evidence-based practice: a review of the empirical research literature’, Research on Social Work Practice, originally published online 15 November 2012, doi: 10.1177/1049731512467072.

Gray, M., Joy, E., Plath, D. and Webb, S.A. (2013) ‘What supports and impedes evidence-based practice implementation? A survey of Australian social workers’, British Journal of Social Work, first published online 18 October 2013, doi: 10.1093/bjsw/bct123.

Julkunen, I., Austin, M.J., Fisher, M. and Uggerhøj, L. (2012) ‘Helsinki statement on social work practice research’ (available at http://blogs.helsinki.fi/practice-research-conference-2012/files/2013/06/Helsinki-Statement-Final-June-2013.doc-pdf.pdf, accessed 9 July 2014).

Mitchell, P. (2011) ‘Evidence-based practice in real-world services for young people with complex needs: new opportunities suggested by recent implementation science’, Children and Youth Services Review, vol 33, no 2, pp 207–216.

Osmond, J. (2004) ‘Formalizing the unformalized: practitioners’ communication of knowledge in practice’, British Journal of Social Work, vol 34, no 5, pp 677–692.

Osmond, J. and O’Connor, I. (2006) ‘Use of theory and research in social work practice: implications for knowledge-based practice’, Australian Social Work, vol 59, no 1, pp 5–19.


Palinkas, L.A., Aarons, G.A., Chorpita, B.F., Hoagwood, K., Landsverk, J. and Weisz, J.R. (2009) ‘Cultural exchange and the implementation of evidence-based practices: two case studies’, Research on Social Work Practice, vol 19, no 5, pp 602–612.

Salisbury Forum Group (2011) ‘The Salisbury statement’, Social Work & Society, vol 9, no 1, pp 4–9.

Salloum, A., Sulkowski, M.L., Sirrine, E. and Storch, A.E. (2009) ‘Overcoming barriers to using empirically supported therapies to treat childhood anxiety disorders in social work practice’, Child and Adolescent Social Work Journal, vol 26, no 3, pp 259–273.


Appendix: Searching, screening and assessment

The SCIE project team worked with the research commissioners of the Metropolitan University College in Denmark to agree questions for this review, based on previous research and knowledge. These were then tested within Social Care Online and supplemented with additional searches.

Using these preliminary data, the SCIE information specialist developed a search protocol, which was agreed with the commissioning team. This informed the development of appropriate free-text and thesaurus terms for bibliographic database searching. Figure 1 illustrates the overall development process.

Figure 1: Overall development process

Summary of the overall process

The stages in the research process can be summarised as follows:

1. Agree the overall proposal for the review with Metropolitan University College.

2. Meet with the team at Metropolitan University College (pre and post search).
