Co-authored with Dan Cardinali
To address complex social challenges, we need to figure out what works. Collecting evidence, however, is only part of the challenge. There needs to be a process of continuous learning over time of what works, for whom, and under what circumstances - and then using that information to deliver better results.
Reams of number-filled spreadsheets won't automatically lead to improved outcomes for young people, their families and communities. Rather, non-profit leaders and elected officials at all levels of government should use data, evidence and evaluation to continue building evidence of effectiveness. But evidence shouldn't be used only as an on/off switch - a way to decide whether to keep or end programs - it should be used to deliver better results.
Our organizations are striving to build evidence and invest in what works. A brand new video by Results for America highlights how we, and others, have used individualized continuous learning processes to improve outcomes.
The video shows how City Year, which has spent more than 25 years inspiring young people to dedicate a year of service to their communities, is using data, evidence and evaluation to improve. Through experience, self-reflection, and, importantly, third-party evaluations, City Year learned that it could make a bigger impact by focusing its efforts on helping kids in school. City Year changed its approach, and today it has the evidence to demonstrate its impact: according to its latest data, not only did student attendance increase by 14,600 hours compared to the previous year, but students in the City Year program also exhibited stronger academic performance. Students who worked with City Year AmeriCorps members demonstrated 1.4x expected growth in both literacy and math.
The video also describes how Communities in Schools, which surrounds students with a community of support empowering them to achieve, has evolved to achieve better outcomes. Over decades of work, we developed a Virtuous Learning Cycle to evaluate what was or wasn't working, with rigorous research at the heart of our efforts. The cycle's steps include: innovation; research; internal evaluation - or exposure to third-party evaluation; and then codifying that research into best practices.
As part of our work, we also discovered that many students suffer a significant dip in self-esteem after their freshman year when high expectations, sometimes self-imposed, collide with the real challenges of high school. We learned what needed fixing and took action: by helping students set achievable academic goals, we boosted self-esteem, lowered drop-out rates and dramatically increased grade point averages.
LIFT has spent 15 years helping people lift themselves out of poverty by pairing rigorously trained advocates with committed community members to build the strong personal, social and financial foundations they need to get ahead. To understand how those foundations reinforce one another, we focused our evaluation resources on listening directly to our members. By collecting structured feedback after every meeting with a member, we learned that LIFT members with strong social networks were twice as likely to take concrete steps toward their goals. So we adapted, strengthening the parts of our operations focused on increasing members' social connections: we expanded members' access to community resource referrals, encouraged group workshops and trainings, and created more peer-networking opportunities. LIFT's work is also proving that even without a large research budget, much can be done to measure success.
Our organizations, and so many others across the country, use data, evidence and evaluation every single day to learn how to improve outcomes. We think governments at all levels can do the same.
If they do, they will be better stewards of taxpayer dollars - not only making better decisions about how to spend them, but getting better results with those limited resources.
If they do, they'll be playing what we call Moneyball for Government. That means investing in programs that work, but also redirecting funds away from those that consistently fail to improve lives.
After all, if we play Moneyball we'll build the best possible evidence to improve results for all Americans.