Data-Driven Progress Requires Digital Trust

Jun 21, 2021

Mathematica President and Chief Executive Officer Paul Decker often says that there are two ages in the historical movement to create evidence-informed policies, programs, and practices. In the first age, which started around the 1960s, data were scarce and static. Primary data collection could be expensive and slow. But the technological constraints on sharing information meant that data were relatively secure, too.

The second age, which began only about a decade ago, is driven by the Internet and data science. With the rise of digital technology, data are all around us: by one estimate, more than 1 trillion megabytes of data were produced every day in 2020. This data science age holds great promise because it lets us move beyond using data simply to determine whether a program is effective. We can ask more questions of the same data, aggregate related data sets, and use data to continuously improve a program over time.

However, the potential benefits of big data and data science come with weighty considerations about how to be ethical stewards of information. Using evidence to improve public well-being requires that our clients, partners, and employees have confidence in our ability to protect and secure data, safeguard personal privacy rights, and reduce the effects of inherent bias. This is what we call digital trust.

Across our industry, it is widely acknowledged that bias hinders the pursuit of equity. Although the immense amount of data generated each day offers many opportunities to derive insight and drive change in novel ways, we also recognize that not all data are created equal: biased or incomplete data sets can lead well-intentioned efforts to inequitable outcomes. Consider an algorithm used to predict which patients would need extra medical care; researchers recently discovered that it had been producing biased results because it erroneously assumed that the amount historically spent on a patient's care reflected that patient's actual need. Mathematica anticipates and addresses these potential problems through its commitment to transparency concerning digital trust.
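To make the proxy problem concrete, here is a minimal synthetic sketch, not the actual algorithm from that study. All names and numbers are illustrative assumptions: two groups have the same distribution of true medical need, but one group's historical spending understates its need (for example, because of barriers to accessing care). Ranking patients by spending then selects far fewer high-need patients from that group than ranking by true need would.

```python
import random

random.seed(0)  # make the illustration reproducible

def simulate_patients(n=1000):
    """Generate synthetic patients with a true 'need' score and observed
    'spending' that systematically understates need for group B."""
    patients = []
    for i in range(n):
        group = "A" if i % 2 == 0 else "B"
        need = random.uniform(0, 100)          # true medical need (unobserved in practice)
        access = 1.0 if group == "A" else 0.6  # group B spends less per unit of need
        spending = need * access + random.uniform(-5, 5)
        patients.append({"group": group, "need": need, "spending": spending})
    return patients

def share_of_group(patients, key, group, top_frac=0.2):
    """Fraction of the top slice (ranked by `key`) belonging to `group`."""
    k = int(len(patients) * top_frac)
    top = sorted(patients, key=lambda p: p[key], reverse=True)[:k]
    return sum(p["group"] == group for p in top) / k

patients = simulate_patients()
by_need = share_of_group(patients, "need", "B")       # close to 0.5: equal true need
by_spend = share_of_group(patients, "spending", "B")  # far below 0.5: proxy bias
print(f"Group B share of top patients, ranked by true need: {by_need:.2f}")
print(f"Group B share of top patients, ranked by spending:  {by_spend:.2f}")
```

Even though both groups have identical need in this simulation, the spending-based ranking allocates most of the "extra care" slots to group A, which is the essence of the flawed assumption described above.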

In a world where big data, artificial intelligence, and machine learning play an increasingly significant role in how organizations do their work, our partners look to us to support them in using these innovations to drive impact. Mathematica Director of Health Technology Matt Gillingham recently wrote about infusing our work with technology as a logical next step in evidence-based decision making for our organization and the field at large. As technology advances, so does our ability to extract valuable insights. Fully realizing the value of these tools—while staying true to our mission—requires us to find ways to uncover flawed assumptions and patterns of bias in data sets. We must continue to build privacy and security into our products and solutions, ever mindful of the importance of ensuring digital trust.

To achieve this goal, we will constantly look for more effective ways to integrate digital trust into our work. First, we are updating our risk-management practices using trusted, established tools and frameworks. Our focus on digital trust will also require us to draw on the expertise of our data scientists to create a programmatic approach to managing bias and inequity, one that considers whether it is right and equitable to use data in certain ways even when we are legally permitted to do so. We know that establishing digital trust in this space, where data-driven insight can influence public policy, means venturing into somewhat uncharted territory, but our mission and values will guide us along the way. For those considering a similar journey, we will use communications tools such as our blog to share lessons as we learn them. We want to model ethical data use and support industry peers who share our data goals.

Our commitment to digital trust and equitable outcomes, despite the challenges of bias in data and machine learning, enables our clients to focus on operating equitable and sustainable programs to enhance our collective impact on public well-being. Taking the lead in digital trust aligns with our mission, vision, and values. As we innovate to drive equity and justice around the world, the way we do it is just as important as what we do.

About the Author

Hillary Lewis

Vice President; Chief Information Security Officer