Prof. Yuanyuan Chen of the Institute for Advanced Research (IAR) had a paper accepted for publication by the Proceedings of the National Academy of Sciences of the United States of America (PNAS); it was published online on Dec. 30, 2019. The paper, titled “Sensitivity of Self-Reported Non-Cognitive Skills to Survey Administration Conditions,” is coauthored with Shuaizhang Feng of Jinan University, James J. Heckman of the University of Chicago, and Tim Kautz of Mathematica. PNAS enjoys a reputation on par with Nature and Science and is one of the most cited multidisciplinary scientific journals, with extensive coverage and influence among researchers worldwide.
Prof. Yuanyuan Chen received her Ph.D. from Boston College and was recruited by SUFE through the Economics Innovation Platform as an outstanding overseas talent. She currently serves as a Ph.D. supervisor, Assistant to the Dean, and Director of the Center for Migration and Labor Market. Her research areas include labor economics and applied microeconometrics. In 2019, Prof. Chen also had several other papers published or accepted by China Economic Review, the Journal of Comparative Economics, the Journal of Labor Research, and Frontiers of Economics in China.
In 2019, researchers at IAR had a total of 18 papers published or accepted by leading international and domestic journals or the theory sections of major periodicals, including Economic Research Journal, China Economic Quarterly, Management World, PNAS, and the Journal of Comparative Economics. This reflects IAR's rapid development in both the quantity and quality of its publications, which will strengthen SUFE's voice in international academia and contribute to the building of a world-class university.
Recent evidence has shown that noncognitive skills matter for success in life and can be shaped through interventions. Because of this evidence, policy makers and researchers have increasingly become interested in measuring noncognitive skills and typically rely on self-reported measures in which respondents rate their own skills. Such self-reports have been applied in program evaluations, as well as school accountability and improvement systems. We demonstrate that self-reports are sensitive to survey administration conditions, including whether a survey administrator describes the skills being assessed and whether respondents receive incentives tied to performance on other tasks. These findings have implications for the interpretation of self-reported measures. Social policies or interventions might affect responses on self-reported noncognitive skills without affecting the skills themselves.
Noncognitive skills (e.g., persistence and self-control) are typically measured using self-reported questionnaires in which respondents rate their own skills. In many applications—including program evaluation and school accountability systems—such reports are assumed to measure only the skill of interest. However, self-reports might also capture other dimensions aside from the skill, such as aspects of a respondent’s situation, which could include incentives and the conditions in which they complete the questionnaire. To explore this possibility, this study conducted 2 experiments to estimate the extent to which survey administration conditions can affect student responses on noncognitive skill questionnaires. The first experiment tested whether providing information about the importance of noncognitive skills to students directly affects their responses, and the second experiment tested whether incentives tied to performance on another task indirectly affect responses. Both experiments suggest that self-reports of noncognitive skills are sensitive to survey conditions. The effects of the conditions are relatively large compared with those found in the program evaluation literature, ranging from 0.05 to 0.11 SDs. These findings suggest that the effects of interventions or other social policies on self-reported noncognitive skills should be interpreted with caution.
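To make the "0.05 to 0.11 SDs" scale concrete, the sketch below illustrates one common way an effect is expressed in standard-deviation units: the treatment–control difference in mean self-reported scores divided by the control group's standard deviation. The data are synthetic and the variable names are illustrative assumptions; the paper's own estimation procedure is more involved than this.

```python
import random
import statistics

# Synthetic illustration only: simulated self-reported skill scores
# (e.g., averaged 1-5 Likert items) for a control group and a group
# surveyed under a different administration condition.
random.seed(0)
control = [random.gauss(3.00, 0.8) for _ in range(500)]
treatment = [random.gauss(3.08, 0.8) for _ in range(500)]  # small built-in shift

# Standardized effect: mean difference scaled by the control-group SD,
# so "0.1" means responses moved by a tenth of a standard deviation.
effect_sd = (statistics.mean(treatment) - statistics.mean(control)) / statistics.stdev(control)
print(f"Standardized effect: {effect_sd:.3f} SDs")
```

On this scale, a shift of 0.05 to 0.11 SDs from survey conditions alone is comparable to the effects many evaluated programs report for the skills themselves, which is why the authors urge caution in interpretation.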