1. Reviewing the evidence about what works means our work may get easier. Instead of working hard but only spinning our wheels (like a hamster on a wheel, going nowhere, which, by the way, I have done; have you?), we can start with an intervention that has already been shown to work with a similar population and setting.
2. If professional resiliency (that is, bouncing back despite all the stressors, challenges, demands, and painful stories we hear, and...you can fill in the rest) is maintained through optimism, collaboration, and mastery, then doing what works and monitoring our clients' progress along the way can't help but make us feel more competent. It means getting consistently and predictably good progress and outcomes in treatment. Starting with interventions that have empirical evidence of effectiveness can't help but make us more optimistic at the outset. Then sharing our experiences of progress and learning from each other closes the loop and is one example of potential collaboration.
3. How confident will you be when you tell a client at the outset that there is solid research evidence that the intervention you are proposing in the treatment plan is effective and that the treatment goals are achievable? It's as if both you and the client are standing on the shoulders of giants instead of muddling along together in the dark, hopeful but uncertain. Our own confidence at the beginning of treatment is connected to client outcomes. Initially, clients may borrow our hope when they can't muster the imagination to see a better future. Our beliefs about whether things can or will get better may also influence the choices we make in treatment. Our beliefs send a message whether we are aware of them or not. Doesn't it make sense to be aware?
4. It lends our profession and our work some credibility. The stories of our experiences with clients are indeed compelling, and anecdotal evidence makes for great case examples. However, some people only give credence, or at least greater weight, to numbers, especially in a climate of reduced budgets and increasing demands for accountability. The good news is that numbers can tell a story that is just as compelling and descriptive of the path to growth, healing, and resilience. I love showing the graphs. The stories behind them make them beautiful, and looking at them makes me happy. I also share them with parents, teachers, colleagues, administrators, and anyone with a stake in the well-being of our students.
5. Journal articles about intervention research are like my dearest mentors. They summarize everything we know about a particular problem and population (the literature review) and then describe exactly what the researchers did, why, and what happened (the methods and results). It's a shortcut past the hundreds or thousands of trials and errors I would otherwise need to log to reach mastery. In his book Outliers, Malcolm Gladwell talks about having to put in 10,000 hours to become expert at anything (think the Beatles, Bill Gates, concert pianists). I don't see why learning from what others have done, and then doing it myself, can't get me there sooner.
6. Just as we are both thinking and feeling human beings, our practice can integrate both evidence and clinical judgment. There is no inherent either/or dilemma in this process. If we can think and feel, then we can use evidence and clinical judgment.
7. Some detractors of evidence-based practice imply that intervention research ignores minority populations, when in fact 27% of the children's mental health research has included Latino subjects. The real question becomes: have we read all the research, or do we just assume it doesn't exist?
8. Okay, I'm running out of steam here. Can you help me out with your ideas? And in order to balance this discussion, I will post the 10 Concerns about EBP soon.