
SWK 640 Week 10

Use headings in APA style, with APA citations. PLEASE CITE THE TEXTBOOK.


Okay, folks. We are moving forward: we know what EBP is, and now we turn to how we can use it in practice.
Please take some time to read the article hyperlinked in the assignment directly below (EBP Ethics) (I WILL ATTACH THIS DOCUMENT) and post, focusing on one of the items in the article.
Please write your paper related to Point 5:
Now we all know (or had better know!) that ethics is crucial in all social work practice. How do we conduct ethical practice using EBP tools?

We may start here:
https://www.socwork.net/sws/article/view/76/335

Because some of you had trouble opening it (it links to a page that has the link on it, and if that’s too confusing, I’m sorry), here it is. But keep reading below the article for the specifics of how to do the assignment. DON’T STOP WITH READING THE ARTICLE:
The Challenges of Implementing Evidence-Based Practice: Ethical Considerations in Practice, Education, Policy, and Research
Amanda J. Farley, Dennis Feaster, Tara J. Schapmire, Joseph G. D'Ambrosio, LeAnn E. Bruce, C. Shawn Oak and Bibhuti K. Sar, University of Louisville, USA
1 Introduction
Over the past century, the field of social work has evolved from grass-roots community-based movements to an intricate network of formally trained professionals promoting social research, education and practice (Klein and Bloom 1994). While social work professionals vary widely in their roles, skills, and attitudes toward the nature and future of the profession, they are united through the shared embrace of underlying ethical principles of beneficence, non-maleficence, autonomy, and justice (Freeman 2006; NASW 2008) that guide their interactions with clients. Social workers hold in highest regard the intention to provide ethical and competent services to their clients. Nevertheless, the questions remain: How do social workers know the services they offer are ethical and competent? How do they know that they are providing the best available treatment or intervention, or that services are offered in a way that benefits clients? Is evidence-based practice (EBP) the answer to these questions?
The concepts underlying EBP in health and social services as it is known today, collectively termed from individual disciplines' treatment of the concept (evidence-based medicine, evidence-based nursing, evidence-based policy, evidence-based social work, evidence-based education, etc.), have evolved over centuries. The use of knowledge as evidence dates as far back as 280 B.C. (e.g., by the Greeks in the Medical Empirics): "One learns from chance observation, from colleagues past and present, and to reason by analogy: 'this patient looks like one I saw before'" (Best and Neuhauser 2005, 462). The philosophical, ethical, intellectual, socio-political, technical, and practical elements that make up the concept of EBP have ebbed and surged over time, blending in various ways as they were assimilated and subsequently accommodated by those entities possessing and exerting the most power within and over the environment at the time.
EBP at its core is about curiosity and knowledge. Where did this knowledge come from? Who imparted this knowledge? When did it become knowledge? How does one know if it is good knowledge? Who decides if it is good knowledge? Why is this knowledge better than what one already knows? When considered within the context of professional or expert activity, the concept of duty (specifically moral duty to those patients and clients for whom all this effort is expended) then enters the mix.
In order to fully understand EBP as it relates to knowledge and the moral duties built into the professional pursuit and application of knowledge, one must first be prepared to acknowledge that knowledge (and its varying appropriateness as evidence in practice) is, has been, and always will be a moving target, evolving over time as efforts to prove, disprove, or simply inform our professional activities occur. There may be debate as to when EBP as we know it today actually came into existence (Goodman 2003; Hayes 2005) as well as what its true form should be in the practice of social work (Gray and McDonald 2006); but the general consensus holds that the consideration of evidence on which to base the practice of one's (or another's) profession has been a topic of great interest and debate for as long as a moral or ethical duty has been deemed publicly to be a part of that practice.
There are similar views within social work that add to one's conceptual understanding when searching for a common or unifying definition of EBP. Sackett et al. (1996, 71) defined EBP as "the conscientious, explicit, and judicious use of current evidence in making decisions about the care of individual patients." Gambrill (2007, 449) stated, "EBP describes a philosophy and process designed to forward effective use of professional judgment in integrating information regarding each client's unique characteristics, circumstances, preferences, actions, and external research findings." NASW (n.d., ¶ 3) defined EBP as "a process in which the practitioner combines well-researched interventions with clinical experience, ethics, client preferences, and culture to guide and inform the delivery of treatments and services." Petr and Walter (2005, 254) referred to EBP as a "broadened notion [that] recognizes the importance of the professional and the consumer in determining the relevance of the evidence to the situation at hand." Gilgun (2005, 52) offered the following four cornerstones of EBP in social work:
(1) research and theory; (2) practice wisdom, or what we and other professionals have learned from our clients, which also includes professional values; (3) the person of the practitioner, or our personal assumptions, values, biases, and world views; and (4) what clients bring to practice situations.
A procedural definition, by Rubin and Parrish (2007a, 407), offers a more detailed explanation:
EBP is a process in which practitioners attempt to maximize the likelihood that their clients will receive the most effective interventions possible by engaging in the following five steps:
Formulating an answerable question regarding practice needs
Tracking down the best evidence available to answer that question
Critically appraising the scientific validity and usefulness of the evidence
Integrating the appraisal with one’s clinical expertise and client values and circumstances and then applying it to practice decisions
Evaluating outcome (with the use of single-case designs if feasible).
Conceptually, EBP promotes the value of utilizing knowledge from many sources and, through critical evaluation of the data from these sources, making an informed decision (in conjunction with the client) on the most effective course of treatment/intervention (Gambrill 2007); this can be applied individually or on a broader agency/program level. "Evidence" is described by Gambrill (2006b, 99) as "ground for belief, testimony, or facts regarding a claim for conclusions," and comes in the form of empirical data, practice wisdom gained through dissemination of experience between colleagues/practitioners, and client feedback (the client being a collaborative partner in this process, whose knowledge, values, and goals are also considered).
While EBP provides a comprehensive philosophy, structure, and process for providing evidence-based ethical and competent direct practice (Gambrill 2006a), and as such has become the gold standard for many disciplines, its adoption by social workers often appears uneven, at best. While EBP has noted support among social work academicians, there seems to be considerable difficulty in its implementation by social workers (and students) in the field with respect to practice, education, policy, and research. The difficulty arises in attempting to actualize EBP in a manner that maintains fidelity to the process, or doing EBP and doing it right. This difficulty invites perturbations within social work practice, policy, education, and research that have far-reaching ethical implications around the implementation of EBP in the field. In this paper, using the NASW Code of Ethics as the primary lens for understanding, we discuss the challenges associated with implementing EBP in social work. We hope that by identifying and parsing concerns related to the process of fully implementing the EBP model, social workers may gain further insight into how to embrace EBP as a best-practices framework, while negotiating the barriers that so often prevent this from fully and successfully occurring.
2 Challenges Associated with Implementing EBP in Social Work
The challenges associated with implementing EBP in social work are best illustrated by examining specific arenas within social work: practice, education, policy, and research.
Practice
To illustrate the challenges faced in practice, let us consider a practitioner who is working with an underserved population: female juvenile sexual offenders in a residential treatment program.
On assuming her position, the practitioner found that most of the materials currently used in the program had male-oriented themes and testimonies featuring male sexual offenders. Far from being gender-sensitive, these materials were biased and at times counterproductive to effective treatment with female offenders. Intuitively, the practitioner believed that relevant materials based on competent research would be available. Wishing to provide best-practice interventions for her clients, the clinician embarked on a search to find the most effective and appropriate treatment for her clients. Despite carrying a full caseload of 18 residential clients and their families, while maintaining the agency standard of at least 80% direct-client services, the practitioner began her research.
Turning to the literature available to her, the practitioner found empirical evidence related to female juvenile sexual offenders to be scarce. A number of Internet searches (the practitioner did not have access to academic databases) resulted in hundreds of sources relating to juvenile sexual offenders, but only 4 or 5 of those articles were found to be empirical, rigorous, and related to the population in question. Studies in this area were significantly more likely to be conducted with male offenders and then generalized in discussion for use with females. In addition, the articles that dealt with this issue often lacked rigor in their choice of design and were often based on retrospective data.
The usefulness of research evidence in direct practice or the development of programs in organizations is influenced, and sometimes limited, by a number of factors, as suggested in the example above. Small (2005) discussed examples of these relating to the immaturity of the social sciences and issues of generalizability. In the first, he noted that many questions relevant to the work of practitioners have not yet been addressed by research, making information relevant to a particular problem or issue difficult to locate and possibly leading practitioners to categorize or conceptualize problems into existing categories that may be only partially appropriate, if at all. A more general issue he offered was the "primitive nature of many of our research methods," suggesting that "theories of human behavior and family and social systems tend to be far more sophisticated than the methods available to study them . . . [and that] many of our measurement tools have a limited ability to capture psychological and social phenomenon with the level of precision often needed in practice" (324). Small (2005) did not state that this imprecision means we should disregard the information obtained with these tools, but he did caution that we should remain aware of the approximate nature of the reality studied as we interpret and apply the findings.
As noted above, generalizing research findings to practice situations, especially those involving populations other than the study sample, can manifest a number of problems (Small 2005). This sentiment is echoed by Rubin and Parrish (2007b, 412), who note that "although progress is being made regarding generalizability, we may be far from the day when social work practitioners or social work practice instructors can be confident about generalizing RCT [randomized control trial] findings to typical social work clients." Doing so may actually place the practitioner in conflict with the intent of EBP: not only to search for and consider the most relevant research, but also to consider its appropriateness for the specific clients with whom one is working.
Education
The practitioner herself attained her MSW before EBP was encouraged as an integral element of master's-level curricula by NASW and the Council on Social Work Education (CSWE), but she has become increasingly cognizant of the reported value and prevalence of EBP noted both in current professional literature and via the students she has supervised.
After a review of published research failed to provide evidence relevant to her population, the clinician turned to the practice wisdom of colleagues advanced in the field and who possessed experience specific to the population and topic of interest. To gain access to this information, the clinician continued her search of the Internet, obtained and read books on the topic, and attended professional seminars. Many hours were devoted to the pursuit of this information (necessary for the conduct of best practice); but unfortunately, due to limited agency resources and workload requirements, the vast majority of those hours had to occur on the clinician's own time and at her own expense.
A challenge faced by many practitioners is that of having the time and skills necessary to obtain and analyze available data. It is safe to say that, given the recent inclusion of EBP in social work curricula, the majority of licensed social work practitioners have not had formal instruction in the requisite skills and process of using EBP. Small (2005, 323) reminded us that "many . . . do not have a strong grounding in research methods and data analysis and may misinterpret or overstate what research findings actually indicate."
Practitioners who have been in the field for a number of years prior to the formal and systematic inclusion of EBP in the curriculum may have received a basic familiarization with research design and statistics, but probably not instruction in the skills that would enable them to engage in critical consideration and evaluation of research and the specific steps listed by many authors as fundamental to EBP (Gambrill 2007; Rubin and Parrish 2007b). For the practitioner who is urged to obtain these skills post-graduation (often something that must be done on the worker's own time, while working sufficient hours to maintain an income), these skill lists can be daunting, if not downright overwhelming. And in the absence of expert feedback, they can also be difficult to operationalize. Examples of competencies noted by Gambrill (2007, 448) include the abilities to:
Efficiently and effectively track down research findings related to information needs, critically appraise different kinds of research reports . . ., accurately describe the evidentiary status of recommended services . . ., select service providers who use evidence-informed services . . ., use valid assessment methods that maximize the likelihood of choosing services likely to result in hoped-for outcomes, avoid making inflated claims about the effectiveness of services, [and] identify human service propaganda.
While Croxton and Jayaratne (1999) noted that NASW's Code of Ethics did not specifically address research education at the time, it does provide the following guidance in Standard 5.02(c): "Social workers should critically examine and keep current with emerging knowledge relevant to social work and fully use evaluation and research evidence in their professional practice" (2008). CSWE (2008, 5), on the other hand, specifically addresses research education in Policy 2.1.6:
Engage in research-informed practice and practice-informed research
Social workers use practice experience to inform research, employ evidence-based interventions, evaluate their own practice, and use research findings to improve practice, policy, and social service delivery. Social workers comprehend quantitative and qualitative research and understand scientific and ethical approaches to building knowledge. Social workers use practice experience to inform scientific inquiry and use research evidence to inform practice.
As with the social work practitioner, social work educators also experience ethically based challenges in relation to EBP. NASW and CSWE clearly promote an evidence-based focus in both social work practice and education. This focus challenges social work educators to promote and teach evidence-based practice methods to comply with the ethical standards established by the profession. Carrying out this mandate in a manner sufficient to result in the level of comprehension and practice that the literature discusses as requisite to ethical practice, within the finite amount of time available in graduate programs, can be more than challenging. Research education can often be experienced by social work students as uninspired and negative (Hardcastle and Bisman 2003) on one end of the continuum, and as overwhelming, barely relevant, and logistically unsupported by workplaces in the practice arena (Anonymous [2006 MSW graduate], personal communication 10 Dec 2008) on the other.
With respect to classroom experiences, many educators are guided by department syllabi that prescribe a teaching schedule, including all content, assignments, and grading criteria. In some universities, detailed PowerPoint presentations and lectures on EBP are available to ensure that EBP is taught properly. Beginning with the question and ending with the full evidence-base search, students are encouraged to implement their newfound expertise at practicum sites or agencies where they are employed.
Educators are often perplexed when students return and question the applicability of evidence-based practices in the real world. The dilemma is often what to do when the evidence, applied to the specific problem, does not result in the expected client outcome. Does the teacher have the prerogative to suggest a non-evidence-based intervention without contradicting best practice?
Organizational Policy
Once the practitioner obtained the information she believed to comprise best practices for her clients, having taken her clients' needs and voices into consideration as to what they believed would be helpful to them, it was time to implement the chosen intervention in practice. The practitioner found sufficient evidence to suggest that female juvenile sexual offenders benefit from trauma-based models of intervention, but the agency's policy (and third-party payer source) did not support the use of a trauma-based model. The clinician found herself in an obviously difficult ethical dilemma: go against the agency or managed-care guidelines and provide the evidence-based intervention (assuming resources allowed), or ignore the findings and continue with business as usual?
Although agencies often report that they support evidence-based practice, when it comes to making even the best evidence-supported changes in policy or procedures, recommendations can and do still meet with resistance. Gambrill (2007, 458) cited Oxman and Flottorp's (1998) suggestion of three major kinds of barriers to the application of EBP:
Prevailing opinions (e.g., standards of practice, opinion leaders, professional training, and advocacy, e.g., by pharmaceutical companies), practice environment (e.g., financial disincentives, organizational constraints, perception of liability, and client expectations), and knowledge and attitudes (e.g., clinical uncertainty, sense of competence, compulsion to act, and information overload).
With respect to practitioner implementation of EBP, organizational policy issues factor into all three of the barrier types above. Obvious challenges are the lack of time and resources to perform EBP in the workplace. In reality, clinicians are overwhelmed with caseload and direct-service guidelines. At times, salaries, raises, and even continued employment are not only based on direct-service percentages, but can be forfeited for noncompliance with organizational standards and policies grounded in organizational tradition that may be inconsistent with enhancing client well-being. With only 20% of her day allowed as administrative time to complete paperwork, write reports, and attend meetings, the practitioner in our scenario (like so many others) does not have the resource support of her agency to conduct ongoing learning while on the job. Despite the emphasis in NASW's (2008) Code of Ethics on the value of lifelong learning, many agencies do not provide training or training benefits for their clinicians, who again must spend scarce personal time and money to continue this professional development. Specialized seminars are often time- and cost-prohibitive for many clinicians (and agencies) in the field, as can be access to the variety of databases that may (or may not) provide the information sought.
Again, as noted for the social work practitioner and social work educator, ethical issues also exist for the social work policy maker/advocate. The policy applications of EBP essentially exist in two forms. The first is the use of evidence from programs to denote outcomes and define best practices in order to secure funding for programs/organizations and to manage liability. This first manifestation of EBP and policy (i.e., evidence used to support programs and organizations) has been broadly used as a basis for constructing and promoting social policies and underwriting the implementing organizations. This expression of the EBP and policy connection is considered part of doing business in the social service arena and generally accepted by most social service practitioners and administrators (Gambrill 2007).
The second form of policy application of EBP is expressed in terms of the place of EBP itself within organizations (e.g., organizational and/or professional policies that indicate that practitioners should use EBP standards and practices as a part of their professional functioning). This second manifestation appears to be the point at which an ethical departure occurs within the social work discipline. The manner in which organizations define and promote the practice of EBP internally seems to be the crux of how EBP is embraced (or not) by social work practitioners (Geanellos and Wilson 2006).
The process of writing grants to fund programs is a well-established part of the social service culture. Funders will provide some or all the resources to implement a program while the providing organization collects data on the manner of implementation and the outcomes of the program for the funders. Evidence is gathered for the purpose of continuing or discontinuing policies, refining research agendas, and formulating/reformulating theoretical bases for policies and programs. In doing so, a common language for program efficacy and feasibility is produced that may serve to facilitate the EBP model at the policy level.
This is potentially not a purely scientific process, however, as the same organizational behaviors may be exhibited for very different purposes (Gibbs and Gambrill 2002). For instance, in the process of EBP, data are gathered and critically evaluated about the efficacy and accuracy of policies and programs to some stated end (Tanenbaum 2003). To the extent that the outcome data support or refute theories, policies, and programs, these are adjusted to reflect the newly emerging understanding of reality (assuming the research is rigorous) (Rubin and Parrish 2007a). In this way, science continues to advance, and policies that have firm grounding in empirically derived knowledge are established (this can be called evidence-based policy). In reality, however, this originally scientific process may be turned on its head in the partisan world of politics and policy (Brendtro et al. 2006). An administration or organization may begin with a particular policy that is defined and operationalized in terms of an ideological basis (Gambrill 2006a); this ideological policy stance then funds organizations that will generate data that support the stance. This has the appearance of evidence-driven policy but, in reality, turns the process around from its intent (i.e., policy-based evidence, as in the "Just Say No" drug campaign of the 1980s and in many abstinence-only sex education programs funded from the 1990s through the present).
While, in the end, the evidence produced in this upside-down process may be used to further scientific bases of policy, the data obtained must be unsnarled from the ideology before they can be used in this manner (Tanenbaum 2003). This also raises the question of how this evidence (both before and after the scientific vetting process occurs) is generalized and disseminated. Because this system of using organizational data as scientific evidence on the one hand, and as support for a particular ideological stance on the other, uses similar language and reporting procedures, social workers implementing the process may well be doing so from a more pragmatic stance and may be at risk of becoming jaded toward the process of evidence as a basis for practice and policy decisions. For example, a social worker may be required to submit data on X program, which may be reframed to support Y's political agenda, when, in reality, the social worker's experience tells her that Z is actually occurring.
Taking the case of abstinence-only sex education as an example: the top-down political push was for agencies to provide sex education to children and teens focused on delaying sexual activity until marriage. Curricula were developed and programs were funded to carry out these aims, stemming from the political ideology that created them. Given the benefit of a decade or more of outcomes available for evaluation, the effectiveness of these programs can now be determined. Despite millions of dollars in funding and the backing of government agencies, the results indicate that abstinence-only sex education is remarkably ineffective at delaying the sexual activity of teens, preventing pregnancies, and curbing the spread of STIs among the target population (Santelli 2006). The problem, however, lies in the intervening period between the ideological birth of the programs and the overwhelming evidence refuting their effectiveness. Presumably, at some point in between, organizations began to see for themselves that these programs did not work.
This creates a daunting ethical snarl. For social service agencies, and the workers they employ, to continue to function, they must have resources. Often these resources are controlled by political entities with particular ideological bents that may or may not align well with those of either the agency or the worker (Gambrill 2006a; Rubin and Parrish 2007a). Nonetheless, for survival, organizations may secure funding that requires reporting particular practice behaviors and their results. Do organizations accept the funding with strings attached? Do they refuse such funding prospects and risk being unable to serve clients or support workers? How do workers manage the demands placed on them by their employing agencies alongside the needs of their clients? In purely black-and-white hypothetical scenarios, these questions may be easily answered, but in the more nebulous manifestations that occur in reality, they may be more difficult to navigate. The result may be a feeling of disconnection on the part of administrators and workers who implement policies at the boots-on-the-ground level.
In the second evidence-driven policy consideration of EBP, the same process occurs, though with some differences. The NASW Code of Ethics can be interpreted as endorsing EBP for social work practitioners at all levels (Gambrill 2006a; Gambrill 2007). Specifically, Section 1.04, Competence, paragraphs (b) and (c), seems to encompass EBP. Given this, many proponents of EBP in social work think that it should be endorsed for all levels of practice, including the policy level (Gambrill 2006a; Gossett and Weinman 2007).
Indeed, given that EBP could represent best practices for the profession, the temptation may be to create rigid policies and procedures within organizations, leaving social work practitioners feeling squeezed by a policy or management system that is organization-centered rather than client-centered. To the extent that this occurs, the ability of organizations and practitioners to finish out the EBP cycle may be limited (i.e., while the EBP process may be performed effectively, identifying the most efficacious intervention, securing client and practitioner input, etc., the approach may well end at this point from an organizational perspective). In other words, a focus on using only previously existing models for establishing organizational procedures could even reduce organizational responsiveness to client and community concerns (Bloch et al. 2006). This is not how the EBP process was conceptualized by its originators, but it may well represent its implementation among organizations that employ social workers, leaving many with an unpleasant taste in their mouths about EBP.
Research
Given the paucity of evidence on the best treatment for female juvenile sexual offenders, the practitioner speculated about what it would take to conduct a formal study with this population. This is a vulnerable population protected by human-subjects regulations on a multitude of levels. These clients are minors who reside in mental health facilities and often have criminal backgrounds. They are often wards of the state as well, so gaining informed consent from guardians and navigating the myriad state and federal regulations is problematic, and data can be obstructed or censored by these entities as well as by the institutional review boards assigned to each. The immediacy of current residents' needs aside, all of the considerations above made the prospect of conducting on-site research to generate relevant knowledge an overwhelming one, at best. And thus, the only data available to the clinician are those that have been funded, approved, conducted, and published.
As the continued scenario above suggests, organizations and practitioners are not alone in their ethical challenges with EBP; difficulties also exist for the social work researcher in knowledge generation and dissemination, resulting from factors such as: Who is driving the focus of the research? What resources are available? And what are the concerns related to the population(s) with whom the research is conducted?
The following example is offered to underscore the importance of careful consideration of all steps by a researcher.
A social work researcher is asked by the leadership of a large church to design a study to help determine the effectiveness (measured as longevity) of their support group for women with breast cancer. The support group is operated by church leaders, who report that the main goal of the group is to help women heal by placing faith in God for deliverance from cancer through group prayer. The leaders have informally noticed that the women in their popular support group live longer than nonparticipating church members with breast cancer. The church wants to pay the researcher for this study because they believe that others should know about this successful group.
Before exploring design issues and establishing causality, the researcher knows from experience that she must decide whether to contribute to the knowledge base for this area, and she must consider what outcomes might arise for the larger population of breast cancer survivors and for the societal view of faith-based support groups. She must consider her own ethical and moral standards, and those of the profession, to decide whether doing this research lines up with those standards.
At the start, the researcher who is intimidated by the gold standard in EBP (experimental design) and aims for it without regard for other considerations faces an ethical challenge. While basing the design of a study on the hierarchy of research (true experiments, quasi-experiments, and non-experiments) is important, a researcher must ponder many other things. It is important to make clear that there should be no distinction in ethics between the social work researcher who seeks to add to the general knowledge base and the one who aims to produce evidence useful to evidence-based practitioners. One should proceed in generating research evidence in the manner most appropriate to answering the research question and make it translatable to practice. If done in such a manner, honoring the fundamental ethical principles mandated by the Belmont Report (National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research 1978) and the NASW (2008) Code of Ethics (respect for persons, i.e., autonomy and self-determination; beneficence; and justice), a researcher can reduce the likelihood of unethical research practices. Humphries (2003, 87) argued for this in her support of participatory-oriented research methods:
Asking the research questions that are defined by the people whose lives are affected most, in the ways that make sense to them and produce answers that are meaningful to them as primary stakeholders, will feed into decisions about which methods are appropriate.
Also, in her recent discussion of EBP as a philosophy, Gambrill (2007, 458) proposed that "EBP is as much about the ethics of and pressures on academics and researchers as it is about the ethics and pressures on practitioners and agency administrators." Antle and Regehr (2003) also posited that an ethical challenge exists for social work researchers in the dual relationships that arise when acting as both practitioner and researcher. They suggested that researchers ask the following questions, in addition to following professional and personal codes of ethics, when considering the ethics of their research:
Is this research consistent with social work principles of working toward improving the situation of vulnerable individuals or groups in society?
Will this research benefit the group being studied?
What are the broader risks associated with this research? Could a vulnerable group be disadvantaged by potential research findings?
Are subtle risks to self-determination and informed consent adequately addressed in this research?
Does this research involve clients? Has the influence of a therapeutic alliance been sufficiently addressed in the consent process?
Are there sufficient safeguards of ensuring the anonymity of research data, so if the research records are subpoenaed they will not violate participant confidentiality?
Are there research methods that might both rigorously answer the questions posed and serve to empower the individuals or population under study? (142)
While in practice social work practitioners thrive on multidisciplinary teams, social work researchers are challenged to do the same in research design. As Jenson (2006, 4) put it, "a clear demonstration of the ability to collaborate with investigators from other disciplines and to apply rigorous scientific principles from a public health framework will be a key component of proposal submissions," and this should be a consideration for all areas of social work.
As an increasing number of practitioners access and critically appraise the research disseminated, and in consideration of the importance of improving the quality of the research evidence, it is important to highlight the nature of scholarly productivity in social work education. In his recent national survey of deans and directors of 130 social work schools (both MSW programs and MSW/doctoral programs), Green (2008) concluded that the scholarship role has become more salient than ever in tenure and promotion decisions and that teaching and service roles have become less influential. Yet scholarship was tertiary to teaching and service in terms of workload for social work faculty, similar to previous findings (Seaberg 1998; Seipel 2003).
Finally, ethical challenges in research dissemination remain. Gray (2001) identified several: biased research and publication, poor-quality research, and the failure of researchers to present evidence in forms useful to practitioners. Solutions offered include increased transparency by funders and researchers, better training of researchers, stricter ethics committees, and tougher review action and acceptance criteria by journal editors (Gray 2001; Gambrill 2006a).
3 Strategies for Minimizing Challenges
In looking at the challenges that arise in the process of implementing EBP in social work, several ethical dilemmas common among the domain areas of practice, education, policy, and research become apparent. By addressing these dilemmas, challenges can be minimized, although not totally avoided or eliminated.
First and foremost among these dilemmas is the feasibility of utilizing the whole EBP model (or utilizing EBP in a manner consistent with the full EBP cycle). In both the research and education arenas discussed above, issues related to social work faculty work role vs. workload (Green 2008) become problematic. Fully engaging in the EBP process is time-consuming for faculty who may feel as though they haven't enough time for scholarship, service, and teaching as it is. Similarly, for social workers engaged in the practice and organizational policy arenas, the time and resources necessary to fully research relevant topics simply may not be available, thereby limiting practitioner autonomy and efficacy and preventing agencies from engaging in the complete EBP cycle. Potential solutions for each of these areas boil down to a major shift for agencies/organizations (including academic institutions) and practitioners, wherein EBP becomes a priority and resources are realigned and reallocated to reflect this new reality. This means balancing the needs of the relevant institution, the client populations, and social workers at all levels of practice. It means committing the resources necessary to implement the EBP approach in its entirety, including providing relevant and ongoing training for practitioners, teachers, and administrators.
The second major dilemma relates to the ideological conflicts encountered while attempting to implement EBP. Social workers must be aware of the ideologies behind funding and enact appropriate safeguards to ensure the integrity of the EBP process. This may boil down to adhering to strictly scientific frameworks and methodologies to determine the relevance of data and outcomes (Santelli 2006), rather than to ideologically based outcomes (e.g., the preceding examples of abstinence-only sex education and the "Just Say No" campaign). Social work practitioners who care for disadvantaged or vulnerable populations, which are often at risk of further marginalization by ideologically driven social work domains, must also be on their guard. Possible solutions for these ideological conflicts include unsnarling data from outcomes and subjecting them to subsequent review by additional researchers (Santelli 2006), as well as promoting the transparency and accountability of funders and funded research (Gambrill 2007).
The third major dilemma relates to responsiveness to, and inclusion of, client perspectives in the implementation of EBP. To fully and feasibly implement EBP, agency/organizational policies that direct or inform practitioner intervention need to be evaluated. Researchers need to be clear about whom research will actually benefit and who is not being benefited or is being left out. Social work educators and practitioners need to be fully aware of the role of client preference and practitioner wisdom in the EBP model, and must prevent these from becoming second best to research evidence. On a related note, organizational policies centered on the implementation of EBP should remain open to practitioner and client input, thereby allowing more flexible implementation of clinical/interventive policies. Efforts that explicitly seek to involve clients in the design and critique of research and that attend to outcomes of value to clients (Gambrill 2007), as well as assurance that each component of EBP is weighed equally and directed by client well-being first and foremost, would go a long way toward the successful implementation of EBP by social workers, regardless of whether they work in the practice, policy, research, or education arenas.
4 Conclusion
The fundamental philosophy of EBP is founded on ethical principles and designed to help establish an empirically validated ethical and competent practice. However, there appears to be considerable difficulty in its implementation by social work practitioners, educators, policy makers, and researchers. We have identified and discussed some of the more pressing challenges and associated ethical dilemmas of implementing EBP in social work and strategies to manage them, in the hopes of affirming that the process of EBP is both feasible and practicable.
To minimize the challenges of implementation, social workers must remain conscientiously aware of the seduction of the positivist-empirical element of the EBP model in deference to the practice-wisdom and client-preference elements. The former is an easy lure, since the scientific method is viewed and advocated as the most important means of evidence generation by some (Avis and Freshwater 2006), even as this power differential among the elements of the model continues to be questioned by others (Hall 2008). Similarly, with regard to social work policy concerns, the E in EBP needs to be well defined, especially where non-scientific ideological concerns come into play (Tanenbaum 2003). Where program outcomes are utilized for the generation of policy, these outcomes must be carefully vetted to ensure that purely ideological factors do not unduly influence either the generation or the dissemination of evidence. These concerns and others make it incumbent upon organizations and agencies that utilize EBP to allocate sufficient resources for practitioners to fully implement all phases of the EBP process. If agencies and the professionals they employ choose to do this, then EBP, in its breadth and complexity, stands to be more fully embraced (especially with respect to those elements that may require organizations to realign themselves given the emergence of new evidence), thus providing ethical agency responsiveness to client and community needs.
References
Antle, B. J. and Regehr, C. 2003: Beyond Individual Rights and Freedoms: Metaethics in Social Work Research, in: Social Work, 48, 135-144.
Avis, M. and Freshwater, D. 2006: Evidence for Practice, Epistemology, and Critical Reflection, in: Nursing Philosophy, 7, 216-224.
Best, M. and Neuhauser, D. 2005: Pierre Charles Alexandre Louis: Master of the Spirit of Mathematical Clinical Science, in: Quality and Safety in Health Care, 14, 462-464.
Bloch, R.M., Saeed, S.A., Rivard, J.C. and Rausch, C. 2006: Lessons Learned in Implementing Evidence-Based Practices: Implications for Psychiatric Administrators, in: Psychiatric Quarterly, 77, 309-318.
Brendtro, L.K., du Toit, L., Bath, H. and Van Bockern, S. 2006: Developmental Audits with Challenging Youth, in: Reclaiming Children and Youth, 15, 138-146.
Council on Social Work Education (CSWE) 2008: Educational Policy and Accreditation Standards [PDF document, pp. 1-16]. Available from: <https://www.cswe.org/NR/rdonlyres/2A81732E-1776-4175-AC42-65974E96BE66/0/2008EducationalPolicyandAccreditationStandards.pdf> [Accessed 10 November 2008].
Croxton, T. and Jayaratne, S. 1999: The Code of Ethics and the Future, in: Journal of Social Work Education, 35, 2-6.
Freeman, S. J. and Francis, P. C. 2006: Casuistry: A Complement to Principle Ethics and a Foundation for Ethical Decisions, in: Counseling and Values, 50, 142-153.
Gambrill, E. 2006a: Evidence-Based Practice and Policy: Choices Ahead, in: Research on Social Work Practice, 16, 338-357.
Gambrill, E. 2006b: Social Work Practice: A Critical Thinker’s Guide (2nd ed.). New York: Oxford University Press.
Gambrill, E. 2007: Views of Evidence-Based Practice: Social Workers' Code of Ethics and Accreditation Standards as Guides for Choice [Special Section: Promoting and Sustaining Evidence-Based Practice], in: Journal of Social Work Education, 43, 447-462.
Geanellos, R. and Wilson, C. W. 2006: Building Bridges: Knowledge, Production, Publication, and Use [Commentary on Tonelli (2006), Integrating Evidence into Clinical Practice: An Alternative to Evidence-Based Approaches], in: Journal of Evaluation in Clinical Practice, 12, 248-256.
Gibbs, L. and Gambrill, E. 2002: Evidence-Based Practice: Counterarguments to Objections, in: Research on Social Work Practice, 12, 452-476.
Gilgun, J. F. 2005: Four Cornerstones of Evidence-Based Practice, in: Research on Social Work Practice, 15, 52-61.
Goodman, K. W. 2003: Ethics and Evidence-Based Medicine: Fallibility and Responsibility in Clinical Science. New York: Cambridge University Press.
Gossett, M. and Weinman, M. L. 2007: Evidence-Based Practice and Social Work: An Illustration of the Steps Involved, in: Health and Social Work, 32, 147-150.
Gray, J. A. M. 2001: Evidence-Based Healthcare: How to Make Health Policy and Management Decisions (2nd ed.). New York: Churchill Livingstone.
Gray, M. and McDonald, C. 2006: Pursuing Good Practice: The Limits of Evidence-Based Practice, in: Journal of Social Work, 6, 7-20.
Green, R. G. 2008: Tenure and Promotion Decisions: The Relative Importance of Teaching, Scholarship, and Service, in: Journal of Social Work Education, 44, 117-127.
Hall, J. C. 2008: A Practitioner's Application and Deconstruction of Evidence-Based Practice, in: Families in Society, 89, 6.
Hardcastle, D. A. and Bisman, C. D. 2003: Innovations in Teaching Social Work Research, in: Social Work Education, 22, 31-43.
Hayes, R. A. 2005: Introduction to Evidence-Based Practices, in: Stout, C. E. and Hayes, R. A. (eds.) The Evidence-Based Practice: Methods, Models, and Tools for Mental Health Professionals. Hoboken, NJ: John Wiley & Sons, Inc., 1-9.
Humphries, B. 2003: What Else Counts as Evidence in Evidence-Based Social Work? In: Social Work Education, 22, 81-91.
Jenson, J. M. 2006: A Call for Social Work Research from the National Institutes of Health, in: Social Work Research, 30, 3-5.
Klein, W. and Bloom, M. 1994: Social Work as Applied Social Science: A Historical Analysis, in: Social Work, 39, 421-431.
National Association of Social Workers (NASW) (n.d.) Evidence-Based Practice. Available from: <http://www.socialworkers.org/research/naswResearch/0108EvidenceBased/default.asp> [Accessed 21 November 2008].
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research 1978: The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research [DHEW Publication No. (OS) 78-0012]. Washington, D.C.: Author.
Oxman, A. D. and Flottorp, S. 1998: An Overview of Strategies to Promote Implementation of Evidence-Based Health Care, in: Silagy, C. and Haines, A. (eds.) Evidence-Based Practice in Primary Care. London: BMJ Books, 91-109.
Petr, C. G. and Walter, U. M. 2005: Best Practices Inquiry: A Multidimensional, Value Critical Framework, in: Journal of Social Work Education, 41, 251-267.
Rubin, A. and Parrish, D. 2007a: Challenges to the Future of Evidence-Based Practice in Social Work Education [Special Section: Promoting and Sustaining Evidence-Based Practice], in: Journal of Social Work Education, 43, 405-428.
Rubin, A. and Parrish, D. 2007b: Views of Evidence-Based Practice among Faculty in Master of Social Work Programs: A National Survey, in: Research on Social Work Practice, 17, 110-122.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B. and Richardson, W. S. 1996: Evidence Based Medicine: What It Is and What It Isn't, in: British Medical Journal, 312, 71-72.
Santelli, J. S. 2006: Abstinence-Only Education: Politics, Science, and Ethics, in: Social Research, 73, 835-858.
Seaberg, J. R. 1998: Faculty Reports of Workload: Results of a National Survey, in: Journal of Social Work Education, 34, 7-19.
Seipel, M. M. O. 2003: Assessing Publication for Tenure, in: Journal of Social Work Education, 39, 79-88.
Small, S. 2005: Bridging Research and Practice in the Family and Human Services, in: Family Relations, 54, 320-334.
Tanenbaum, S. 2003: Evidence-Based Practice in Mental Health: Practical Weaknesses Meet Political Strengths, in: Journal of Evaluation in Clinical Practice, 9, 287-301.
Author's Address:
Amanda J. Farley / Dennis Feaster / Tara J. Schapmire / Joseph G. D'Ambrosio / LeAnn E. Bruce / C. Shawn Oak / Bibhuti K. Sar
University of Louisville
Kent School of Social Work
US – Louisville, KY 40292
United States
Tel: +1 502 852 3931
Fax: +1 502 852 5887
Email: amanda.farley@louisville.edu / dwfeas01@louisville.edu / tara.schapmire@louisville.edu / jgdamb01@louisville.edu / lebruce01@louisville.edu / csoak001@louisville.edu / b.k.sar@louisville.edu
urn:nbn:de:0009-11-24668
