In Adams County, Colorado, local government employees in the county’s employment assistance agency had a problem: about half of their customers—parents with low income—were coming to the county for help but missing the initial mandated orientation that determines whether they keep their benefits. Staff at the Adams Works Family Transition Services, or Adams Works, wanted more parents to attend the orientations, but they weren’t sure how to draw them in. They had tried a few different strategies with limited success. So to understand the root causes of the problem and brainstorm viable solutions, Adams Works partnered with a team from Mathematica led by Jonathan McCay. Together, they sought to leverage academic literature, county administrative data, and small, short-term experimentation to generate fresh ideas about how to better serve residents. As a result, Adams County saw a double-digit increase in the orientation engagement rate.
For this episode of On the Evidence, I spoke with Jonathan about the experience as well as an issue brief he coauthored with staff from Adams Works. Click here to listen to the full interview. You can also read an edited excerpt of the interview in the following transcript.
How did you end up working with Adams County?
As a part of our work with the Colorado Department of Human Services, we’ve had the opportunity to work with select counties for four years. Adams County is in the Denver metropolitan area in Colorado. It is one of the largest counties in terms of public assistance caseloads, particularly for the Temporary Assistance for Needy Families (TANF) program. It’s suburban for the most part, but it certainly has a high share of poverty. Many of those who can’t afford to live in Denver proper have been pushed into the outskirts of the metro area, which includes Adams County. They are dealing with some of the same housing and affordability issues that Denver is experiencing across the board.
One of the tools that you’re sharing with counties is LI2. In layman’s terms, tell me what that is.
LI2 stands for Learn, Innovate, Improve. The simplest way to put it is that it’s a change management process. It’s a way of thinking about how I—as a program leader or someone responsible for service delivery—undertake change in a systematic and analytic way. As the name suggests, there are three stages, and each of those stages has a series of collaborative activities in which researchers and practitioners partner to design, implement, and test changes.
What was Adams County trying to improve?
They were trying to improve upfront engagement in the TANF program. The way that the program works is that someone walks into a human services office, applies for cash assistance and—as a condition of receiving cash assistance—the parent from the household has to engage with the employment services provider, and then has to meet certain participation requirements to maintain eligibility. Because eligibility is handled separately from the employment services provider, there is a bit of a disconnect in the sense that a parent walks in, is approved for cash assistance, and leaves that office thinking that that’s all there was to it. But if parents do not engage in those employment services that are required as a condition of the cash assistance, then the human services office has the authority to close that case and revoke the benefits until the parent complies with those requirements.
In Adams County, approximately 50 percent of the parents who walked through the door and were approved for cash assistance ultimately did not engage with the employment services provider, and that resulted in a revocation of benefits for the families and a lot of administrative burden for the program.
What intervention did you end up identifying and how did you go about trying it out?
We spent some time with a select group of staff from the department’s employment assistance division, Adams Works, at a two-day workshop, considering what we already know: what does the research and evidence tell us about how to effectively engage people? Then we applied all of that in the context of Adams County and the problem that we were trying to address. We used a variety of design thinking strategies to tap into the creative process with the people in the room to generate fresh ideas and to concretely map outcomes. [We asked ourselves:] what would success look like in measurable terms, and what are the strategies that we’re proposing to get us there?
And then we articulated the causal link between those two things: based on those strategies, what is going to result in those measurable real-world outcomes? We call those “targets for change.” These are very proximal, very near-term shifts that happen within individuals and cause them to engage in a program, to show up to an orientation. We’re really trying to map out what needs to change about people’s attitudes, their behaviors, their skills, anything in that behavioral realm, that will lead to something measurable in terms of an engagement indicator.
The process led Adams County caseworkers to make reminder phone calls before the first orientation. Did it work?
We saw some pretty decent impacts on engagement rates, not only at the first group orientation but at the follow-up one-on-one meeting between the client and the case manager. When you think about the literature and the broader field, that’s surprising. A phone call seemed like a bit of an archaic way to get ahold of somebody and get them to walk through the door. I think there are a couple of things going on in Adams County, and they map back to the [LI2’s] Learn phase. One of the problems we uncovered was confusion. People simply do not understand when they leave that eligibility interview with their benefits approval in hand that there is…a requirement for sustained engagement with an employment service provider [and] these mailings about attending an orientation are directly linked to that.
Part of what made phone calls successful in Adams County was that clients were receiving a personal outreach call from a lead worker within the agency one or two days before they needed to show up. So it was addressing that core issue of confusion, but it was also giving people an opportunity to name any barriers or obstacles they had to attending and to quickly resolve them over the phone. These calls were taking less than five minutes, but, as the experiment showed, it really made a difference in moving from about 50 percent of people showing up to almost two-thirds of people showing up. [Editor’s note: The county reports that the engagement rate has continued to climb since the study period, which ended in May 2018. As of November 2018, following a scale-up to the entire caseload, the engagement rate was hovering at around 78 percent.]
It worked in Adams County. Would it work everywhere else in Colorado?
It really depends. The purpose of this was not to try something out and then to put out a findings brief that is generalizable to every context in Colorado or across the country. We had the opportunity to work with two other counties alongside Adams County to do their own innovation experiments, and I want to highlight that one of those counties, Arapahoe County, found no impacts on its strategy. We published a brief about that experience because we want to normalize the process of trying something out and seeing if it works.
If it doesn’t work, that is okay, that is part of the process of program quality improvement. I would encourage anybody who’s looking at the Adams County brief to also look at the Arapahoe County brief, because there’s an equally important lesson there in failure. Programs need to be comfortable with the experience that Arapahoe County had. We need to be comfortable with experimentation, trying things out, following where the data lead us and then iterating based on what [they] tell us.
Who’s your intended audience, and ideally, how would you like this to affect their work?
I would say the primary audience is probably the practice community and particularly those who are in leadership positions in human services agencies. This is a valuable process. It’s good for our staff and good for our clients to undertake change in this systematic and analytic way that also contributes more knowledge to the field about what’s working for whom and under what circumstances. I think it’s really important to continue to be more nuanced about what works so that people understand the context. If somebody in Michigan picks up what they’ve done in Adams County, they should be thinking, “Okay, that worked there, but why did it work? What can I learn from that and how can I adapt it to Detroit or Kalamazoo or Grand Rapids?” It’s always going to need some tailoring.
McCay, Jonathan, Rayna Jefferson, Hayley Ballinger, Ruby Nolasco, and Heather Lower. “Learn, Innovate, Improve (LI²): Lessons from Adams County’s Efforts to Increase Engagement in the Colorado Works Program.” Washington, DC: Mathematica Policy Research, June 2018.
Schimmel Hyde, Jody, Maureen Alexander, Celeste Roybal, and Dale Nussbaum. “Learn, Innovate, Improve (LI²): Lessons from Arapahoe County’s Efforts to Increase Engagement with Colorado Works.” Washington, DC: Mathematica Policy Research, September 2018.