Data Collection’s Evolution: Insights from AAPOR
Mathematica’s survey researchers were well represented at the American Association for Public Opinion Research (AAPOR) annual conference in Austin, Texas, last week. With a theme of “Reshaping the Research Landscape: Public Opinion and Data Science,” the conference yielded many insights into the rapidly evolving future of survey research, design, and data collection. Here, some of our experts reflect upon what they learned.
I’ve been attending AAPOR for more than 10 years, and this year’s conference was the best so far. The caliber and topics of the presentations and the tone of AAPOR were more forward-looking and innovative than in the past. In particular, it was gratifying to see that the AAPOR president’s address focused on diversity and inclusion in survey research and related professions.
This year’s conference highlighted the expansion of tools for analyzing data collection operations and for analyzing the information we collect. There are many new ways to think about sharing and collecting information. For example, a session on machine learning examined how this tool enables the collection of high-quality data with shorter questionnaires. It was also interesting to see how satellite imagery is being refined to assess the quality of address listings, and how GPS technologies are being applied to data collection in transportation and time-use research.
In sum, I think we’re seeing a closer marriage between data collection and data science. These two areas are complementing each other—leading to faster, more efficient, and ultimately higher-quality data that can help to inform our research.
I’ve been going to AAPOR for more than 10 years, and what stood out for me in Austin was the use of administrative data to improve long-running federal surveys and the continuing refinement around how survey researchers provide respondents with mode choices.
Recently, two large federal surveys, the National Survey of Children’s Health and the National Survey of Children with Special Health Care Needs, changed their sample design to use administrative data for substantial efficiency gains. Previously, these surveys piggybacked on the National Immunization Survey’s random digit dialing telephone frame to identify households with children. That method has become less effective and more expensive, so the National Center for Health Statistics at the Centers for Disease Control and Prevention has turned to an address-based sample supplemented with administrative data from several sources, including the U.S. Census Bureau and the Internal Revenue Service. This significant design change reduces the effort needed to reach households with children or children with special health care needs.
At past AAPOR conferences we’ve seen many papers showing that offering respondents a choice of modes often reduces response rates. The typical remedy has been to present a sequence of single-mode options, starting with the least expensive, and to offer mode choice only to early nonresponders. This year we saw some effective experiments that offered a choice of modes (which clients and respondents prefer) while varying financial incentives by mode, encouraging respondents to use the less expensive options.
Another hot topic at this year’s conference concerned changes to the regulatory environment, specifically the Telephone Consumer Protection Act (TCPA), which outlines new regulations regarding automated telephone dialing of cell phone numbers. AAPOR assembled a TCPA task force and released a white paper (AAPOR member access required) on this issue—both great resources to help stay informed on these changes.
Donsig Jang, Director of Data Sciences and Statistics
Before I started attending AAPOR about seven years ago, I didn’t quite appreciate the complexity and challenges of data collection. Since then, AAPOR has opened my eyes to the challenges we face. It’s helped me understand the current climate in the field: astronomically increasing data collection costs, rapidly changing technology, Big Data, and more.
AAPOR has helped me to see the full cycle of high-quality data collection and delivery. It starts with data collection, drawing on survey, administrative, commercial, social media, and many other unconventional data sources. It moves to data integration and processing, data quality measurement, and data visualization. And then it ends with data dissemination. Integrating all of these components and phases is critical for timely delivery of high-quality data. To achieve this, we increasingly need fully integrated teams in survey research. We see this in our work: Mathematica’s data analytics experts work in concert with our survey and research teams to ensure that high-quality data inform our research, which helps us achieve our mission.
In session after session at this year’s conference, I heard that we’re at a critical juncture in the survey data collection industry—a clear indication of fellow researchers’ awareness of the challenges ahead. In the wake of Big Data, primary data collection will continue and probably even grow—and will provide reliable benchmarks for using many other sources of unconventional data. We need to streamline the entire cycle of primary data collection and delivery in near real time. We already have the capability to collect data in all kinds of ways—such as through mobile device apps that upload information in real time, and through open data sources. We also are using nonprobability sampling, which was once frowned upon but is now seen as an increasingly important way to obtain data. We can’t be afraid to jump in and try these new, unconventional approaches to collecting data. But we also must realize that these data are like crude oil—we need to refine them with rigorous data analytic methods to ensure they are of high quality.
Kathleen Feeney, Survey Specialist
This was my first year at AAPOR. I presented a poster that examined parent survey mode choice on Head Start’s American Indian and Alaska Native Family and Child Experiences Survey.
As someone who is relatively new to the field, I found this conference really inspiring. AAPOR has opened my eyes to all the opportunities that exist in survey research, particularly in the area of cultural competencies and surveying diverse populations. For example, one of the sessions I attended looked at perceptions of how New Orleans was doing after Hurricane Katrina—directly after the storm hit and at several follow-up points over the following decade. It was interesting to hear about the evolving methods used to reach residents as the city recovered, and to see the differences and similarities in perceptions of the recovery by race and geographic location. A related presentation surveyed clinical practitioners on how race, culture, and language affect their efforts to communicate information to diverse patients. The conference made me think of new ways to approach my work to make it more culturally relevant.