STEM TIG Week: Tips for evaluating equity-focused STEM education programs that have good intentions but mixed results by Elizabeth Pierson, Sophia Mansori, Jamie Kynn, and Sara Greller
From:
American Evaluation Association (AEA)
For Immediate Release:
Dateline: Washington, DC
Thursday, March 21, 2019

Hello, we are Elizabeth Pierson, Sophia Mansori, Jamie Kynn, and Sara Greller from Education Development Center. Our research and evaluation work in STEM education brought us together for a panel presentation at AEA 2018 in Cleveland where we discussed the potential disconnects between a program’s equity and access goals and its implementation approaches.
In response to the dearth of diversity across most lucrative and stable STEM fields, federal agencies, cultural institutions, universities, and charitable foundations have set ambitious goals for increasing access for underrepresented groups, and they hold high expectations for their programs' outcomes. Given these wide disparities in who is employed in lucrative STEM fields, what responsibility do evaluators have, if any, to ensure that the programs they evaluate are meeting these social justice goals?
One of the many roles of the evaluator is to figure out how to hold up the right mirror at the right angle to help funders and program staff see their work differently. But speaking truth to a funder or program partner can be particularly challenging when it involves questioning a thoughtful and well-intentioned implementation model. In this post, we share tips and lessons learned in two areas: 1) how evaluators can best measure and collect data around equity even when equity is not a program’s primary focus, and 2) how to communicate difficult truths when a program’s equity goals do not align with its equity outcomes.
Hot Tips: Measuring and collecting equity data
  • Evaluating a program in isolation can mean being less attuned to what is not immediately visible; a landscape review can situate a single program in a broader context and allow for cross-program comparisons.
  • Relying solely on participant self-reported data can leave blind spots; the use of validated tools (such as the Dimensions of Success) can help detect what a program is missing or not doing.
  • Access and participation are just two dimensions of equity, but these data can be used as a gateway to important conversations about program goals and design.
  • For large-scale implementation programs, measuring equity of access through publicly available data, such as NCES datasets, might be the most affordable approach, but it is important to recognize and communicate the limitations of using such proxy data.
Lessons Learned: Communicating equity disparities
  • As a first step, understand how equity is being defined, and who is being included in that definition.
  • Focus on the positive before the negative; report what IS possible and what is working before highlighting the areas for growth and improvement.
  • Before writing a final report, do a mini pre-presentation to describe the findings, giving the funder the opportunity to ask questions and engage with the data so there are no surprises when they receive the final report.
  • Listen carefully to what funders and partners say, and do not say, during the mini pre-presentation. Refocus the final report if necessary to incorporate relevant components of the conversation and highlight areas of expressed interest.
The American Evaluation Association is celebrating STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to aea365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

About AEA

The American Evaluation Association is an international professional association and the largest in its field. Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products and organizations to improve their effectiveness. AEA’s mission is to improve evaluation practices and methods worldwide, to increase evaluation use, promote evaluation as a profession and support the contribution of evaluation to the generation of theory and knowledge about effective human action. For more information about AEA, visit www.eval.org.

News Media Interview Contact
Name: Anisha Lewis
Title: Executive Director
Group: American Evaluation Association
Dateline: Washington, DC United States
Direct Phone: 202-367-1223
Main Phone: 202-367-1166