Tuesday, February 21, 2017
Greetings! I am
Tania Rempert, Evaluation Coach at Planning, Implementation, Evaluation Org. This post is written together with my colleagues
Molly Baltman from the McCormick Foundation,
Mary Reynolds from Casa Central, and
Anne Wells from Children’s Research Triangle. We would like to share our experience speaking at the
Office of Social Innovation White House convening on Outcomes Focused Technical Assistance (OFTA).
The purpose of this convening was to advance an
outcomes mindset in government and across the public, private, and philanthropic sectors.
David Wilkinson shared the vision of OFTA to focus on building the capacity of social service providers using data to inform smarter service delivery and to implement evidence-based practices in local communities. Wilkinson began the convening by pointing out,
“Government pays for 90% of the funding for social services in this country, but typically pays for outputs and compliance rather than outcomes and impact. As a result, many social service providers do not have outcomes they are actively pursuing… and are less likely to have consistent outcomes useful for comparison with their peers.” The White House Office of Social Innovation and Civic Participation would like to change that. This convening was meant to draw attention to the technical assistance social service agencies need when tasked with measuring, reporting, and using outcomes.
Hot Tip: Principles of OFTA:
- Identify the most important measurable outcomes
- Implement evidence-based practices
- Use data to inform research-based service delivery
We were asked to speak based on our experience with the
Unified Outcomes Project. We shared our experiences focusing on increasing grantees’ capacity to report outcome measures and utilize this evidence for program improvement, while streamlining the number of tools being used to collect data across cohort members. Our model emphasizes communities of practice, evaluation coaching, and collaboration between the foundation and 29 grantees to affect evaluation outcomes across grantee contexts:
Lessons Learned:
- It takes at least 2 years to see measurable outcomes and to be able to model the use of this data at the cohort level of shared outcomes.
- Grantees are experts through lived experience. They use their community voice to determine specific strategies, and because they have the language and experience to take each other to the next level, a learning community develops organically when they are brought together.
- The beauty of having an evaluation coach visit organizations on-site to provide technical assistance is that each organization has different needs for making data-informed decisions.
We hope that this initial convening will encourage ongoing discussion and development of strategies in OFTA for evaluation practice and government policy making. Since it is not a thing unless it has an acronym, let all of us in the evaluation community commit to “OFTA often!”
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.