When Survey Respondents Don’t Pay Attention

As I discussed in a previous blog post, socio-emotional skills and personality traits, such as conscientiousness and self-control, have been found to be important for an individual's success in life. But we often struggle to find reliable measures of these skills, which limits research into how they could be developed.

My research group, Charassein, at the Department of Education Reform at the University of Arkansas has been working on the idea that measures of survey effort can serve as meaningful measures of socio-emotional skills and personality traits. Surveys take effort to complete, and respondents reveal something about their character and personality through the effort they exhibit on these tasks.

In our recent paper, published in the Journal of Behavioral and Experimental Economics, we validated measures of survey effort as proxy measures of relevant character skills using the USC Understanding America Study (UAS), a nationally representative online panel of American adults. We studied survey effort through measures of item nonresponse, i.e., the proportion of questions a respondent leaves blank, and through measures of the extent to which a respondent appears to be answering questions carelessly, that is, giving responses that seem unpredictable given their prior responses to related questions and the patterns of other respondents' answers to those questions ("careless answering"). The structure of the panel allows us to observe these behaviors over multiple surveys, taken at different points in time and covering different topics.
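The paper's exact construction of these measures is more involved, but a minimal sketch of the two ideas might look like the Python snippet below. The column names are made up for illustration, and the "predicted" column stands in for the actual prediction step, which in the paper draws on related items and other respondents' response patterns.

```python
import numpy as np
import pandas as pd

# Hypothetical long-format data: one row per respondent-item, with the
# recorded response (NaN if left blank) and a placeholder predicted response.
responses = pd.DataFrame({
    "respondent_id": [1, 1, 1, 2, 2, 2],
    "response":      [4.0, np.nan, 3.0, 5.0, 1.0, 2.0],
    "predicted":     [3.5, 2.0, 3.0, 2.0, 4.5, 2.5],
})

# Item nonresponse: proportion of questions a respondent leaves blank.
item_nonresponse = (
    responses.groupby("respondent_id")["response"]
    .apply(lambda r: r.isna().mean())
)

# Careless answering (rough proxy): how far a respondent's answers deviate
# from what would be expected; larger average deviations suggest responses
# that are less predictable given related questions. Blank items are skipped.
deviation = (responses["response"] - responses["predicted"]).abs()
careless = deviation.groupby(responses["respondent_id"]).mean()

# Standardize so the careless-answering scores are comparable across surveys.
careless_z = (careless - careless.mean()) / careless.std()

print(pd.DataFrame({"item_nonresponse": item_nonresponse,
                    "careless_z": careless_z}))
```

In the actual study these behaviors are observed across many surveys per respondent, so the measures are averaged over a much larger set of items than this toy example shows.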

Measuring survey attention and effort is not new, however: survey researchers have often used attention checks, or trap questions, to identify respondents who do not read instructions or who exert little effort during a survey. The main difference in my work at Charassein is that those data have typically been used to delete problematic observations, whereas we suggest they might provide proxy measures of respondents' socio-emotional competencies and personality traits.

I wondered whether my proposed measures of survey effort would relate to responses to this type of trap question. Are low-effort respondents easily tricked? Or would they notice we are trying to check their attention? Last December I had the opportunity to study these questions: I was able to add the following question to the end-of-year UAS survey:

People vary in the amount of attention they pay to these kinds of surveys. Some take them seriously and read each question, whereas others go very quickly and barely read the questions at all. If you have read this question carefully, please type the word “yes” in the blank box below labeled Other. There is no need for you to respond to the scale below.

1 Strongly support
2 Somewhat support
3 Somewhat oppose
4 Strongly oppose
5 Neither support nor oppose
6 Other:

The question was placed in the middle of the survey and made to look like the questions before and after it. The good news is that a large majority of respondents read the question and answered correctly, about 92 percent to be exact. Still, more than 7 percent (318 respondents) did not answer this trap question correctly.

I then looked at whether answering correctly was related to respondents' self-reported personality traits, measured in the very first survey they take, and to my proposed measures of survey effort. As shown in the figure below, those who read the question and answered correctly reported, on average, higher levels of conscientiousness, agreeableness, and openness to experience in the initial UAS survey than those who missed this question. The differences are small but statistically significant, and they run in the direction one would expect if survey effort captures some of these personality traits. In addition, those who answered correctly also showed item nonresponse levels almost 1 percent lower across multiple other surveys, and careless answering levels 0.4 standard deviations lower. In other words, the extreme cases who missed this simple check question also appear to show lower survey effort in other surveys. The relationships found for conscientiousness, agreeableness, openness to experience, and careless answering remained statistically significant even after controlling for respondents' cognitive ability and demographic characteristics.
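To give a concrete sense of this kind of check and comparison, here is a rough sketch, not our actual analysis code, of how one might score the trap question and compare groups while adjusting for controls. The file name and every variable name (trap_choice, trap_other_text, cognitive_ability, and so on) are hypothetical stand-ins, not the UAS variable names.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical respondent-level dataset; columns are assumed for illustration.
df = pd.read_csv("uas_respondents.csv")

# Score the trap question: a respondent "passes" if they chose the Other box
# and typed "yes" (ignoring case and whitespace) instead of picking a scale option.
df["passed_check"] = (
    (df["trap_choice"] == "Other")
    & (df["trap_other_text"].str.strip().str.lower() == "yes")
).astype(int)

print("Share passing the check:", df["passed_check"].mean())

# Compare self-reported conscientiousness across the two groups while
# controlling for cognitive ability and demographics, in the spirit of the
# analysis described above.
model = smf.ols(
    "conscientiousness ~ passed_check + cognitive_ability"
    " + age + C(gender) + C(education)",
    data=df,
).fit()
print(model.summary().tables[1])
```

The same regression could be repeated with agreeableness, openness to experience, item nonresponse, or careless answering as the outcome to mirror the full set of comparisons reported here.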

So, going back to my questions: respondents in the UAS do not seem to be easily tricked, and most do not fall for this type of trap question. However, those who do fall for it, on average, report lower levels of conscientiousness in an initial survey and tend to show lower levels of effort across multiple other surveys. Overall, we find that measures of survey effort can be valuable proxy measures for important personality traits, but it is better to be sneaky and measure them in a way that respondents are not aware they are being observed.