Nicole B Kersting
- Associate Professor, Educational Psychology
- Associate Professor, Statistics-GIDP
- Member of the Graduate Faculty
- (520) 621-8737
- EDUCATION, Rm. 602
- TUCSON, AZ 85721-0069
- nickik@arizona.edu
Biography
My research interests are shaped by my interdisciplinary background, which combines my training and expertise in advanced quantitative methods with my substantive interests and expertise in mathematics teaching and learning. My work has focused on understanding the complex relationships between teacher knowledge, teaching, and student learning, because understanding these relationships provides the foundation for helping teachers improve their teaching in deliberate and systematic ways. Because I am interested in understanding these basic relationships at scale, my work has focused on the development of innovative, authentic, reliable, and valid measures that can be used at scale. My current research interests focus on the applicability of machine learning approaches to issues in measurement and educational research.
Degrees
- Ph.D., Advanced Quantitative Research Methodology
- University of California, Los Angeles, Los Angeles, California, United States
- Dissertation: Assessing Teachers’ Knowledge of Teaching Mathematics: Instrument Development and Validation
Work Experience
- University of Arizona, Tucson, Arizona (2015 - Ongoing)
- University of Arizona, Tucson, Arizona (2009 - 2015)
Interests
Teaching
Consistent with my training, my teaching interests are concentrated on methodology, statistics, and measurement.
Research
My research interests are shaped by my interdisciplinary background, which combines my training and expertise in advanced quantitative methods with my substantive interests and expertise in mathematics teaching and learning. My work has focused on understanding the complex relationships between teacher knowledge, teaching, and student learning, because understanding these relationships provides the foundation for helping teachers improve their teaching in deliberate and systematic ways. Because I am interested in understanding these basic relationships at scale, my work has focused on the development of innovative, authentic, reliable, and valid measures that can be used at scale. My current research interests focus on the applicability of machine learning approaches to issues in measurement and educational research.
Courses
2024-25 Courses
- Select Appl/Stats Method, EDP 641 (Spring 2025)
- Dissertation, STAT 920 (Fall 2024)
- Independent Study, EDP 699 (Fall 2024)
- Multi Meth Educ Rsrch, EDP 646A (Fall 2024)
- Research, EDP 900 (Fall 2024)
2023-24 Courses
- Dissertation, STAT 920 (Spring 2024)
- Independent Study, EDP 699 (Spring 2024)
- Select Appl/Stats Method, EDP 641 (Spring 2024)
- Dissertation, STAT 920 (Fall 2023)
- Dsgn Questionnaire+Scale, EDP 557 (Fall 2023)
- Thesis, STAT 910 (Fall 2023)
2022-23 Courses
- Dissertation, STAT 920 (Spring 2023)
- Dissertation, TLS 920 (Spring 2023)
- Select Appl/Stats Method, EDP 641 (Spring 2023)
- Thesis, STAT 910 (Spring 2023)
- Dissertation, STAT 920 (Fall 2022)
- Dissertation, TLS 920 (Fall 2022)
2021-22 Courses
- Dissertation, TLS 920 (Spring 2022)
- Independent Study, STAT 599 (Spring 2022)
- Select Appl/Stats Method, EDP 641 (Spring 2022)
- Dissertation, TLS 920 (Fall 2021)
2020-21 Courses
- Dissertation, TLS 920 (Spring 2021)
- Intro to IRT Mdlg/Applctions, TLS 571 (Spring 2021)
- Intro to Program Evaluation, TLS 311 (Spring 2021)
- Research, TLS 900 (Spring 2021)
- Dissertation, TLS 920 (Fall 2020)
- Independent Study, STAT 599 (Fall 2020)
2019-20 Courses
- Dissertation, TLS 920 (Spring 2020)
- Research, TLS 900 (Spring 2020)
- Dissertation, TLS 920 (Fall 2019)
- Intro to Program Evaluation, TLS 311 (Fall 2019)
- Research, STAT 900 (Fall 2019)
2018-19 Courses
- Dissertation, TLS 920 (Spring 2019)
- Intro to IRT Mdlg/Applctions, TLS 571 (Spring 2019)
- Research, TLS 900 (Spring 2019)
2017-18 Courses
- Dissertation, STAT 920 (Spring 2018)
- Intro to Program Evaluation, TLS 311 (Spring 2018)
- Dissertation, STAT 920 (Fall 2017)
2016-17 Courses
- Dissertation, STAT 920 (Spring 2017)
- Independent Study, TTE 699 (Spring 2017)
- Intro to Program Evaluation, TLS 311 (Spring 2017)
- Dissertation, STAT 920 (Fall 2016)
- Intro to IRT Mdlg/Applctions, TTE 571 (Fall 2016)
- Thesis, STAT 910 (Fall 2016)
2015-16 Courses
- Thesis, STAT 910 (Summer I 2016)
- Dissertation, STAT 920 (Spring 2016)
- Thesis, STAT 910 (Spring 2016)
Scholarly Contributions
Chapters
- Kersting, N. B., Stevenson, P. A., & Chen, M. (2016). Exploring Issues of Dimensionality and Model Selection: Practical Considerations from the Classroom Video Analysis (CVA) Instrument Development Effort. In Psychometric methods in mathematics education: Opportunities, challenges, and interdisciplinary collaborations. Journal for Research in Mathematics Education monograph series (pp. 119-139). Reston, VA: National Council of Teachers of Mathematics. This chapter explores issues related to the dimensionality of assessment data and discusses implications for model selection.
Journals/Publications
- Heshmati, S., Kersting, N. B., & Sutton, T. (2017). Opportunities and Challenges of Implementing Instructional Games in Mathematics Classrooms: Examining the Quality of Teacher-Student Interactions During the Cover-up and Un-cover Games. International Journal of Science and Mathematics Education, online first. doi:10.1007/s10763-016-9789-8. This study explored the design and implementation of the Cover-up and Un-cover games, two manipulative-based fraction games, in fourteen fifth-grade classrooms. We examined how fraction concepts were integrated into the game design and explored the nature of teacher-student interactions during the games using lesson videos. Our examination showed that interactions focused on game progress, rules, and turn taking, with little exploration of strategies or the underlying mathematics. To compare the quality of teacher-student interactions during games with interaction quality during other instructional activities, we coded five videotaped lessons from each of the fourteen classrooms. Results from a dependent t-test indicated that teacher-student interactions were of significantly lower quality during games than during non-game segments with a similar instructional purpose. Teachers might benefit from additional curriculum support and training to implement games as rich mathematical learning opportunities.
- Kersting, N. B., Sutton, T., Kalinec-Craig, C., Heshmati, S., Lozano, G. I., & Stigler, J. (2015). Usable Knowledge for Teaching Mathematics: Further Exploration of the Classroom Video Analysis (CVA) Instrument. ZDM - The International Journal on Mathematics Education. In this study we further explore the nature of usable teaching knowledge in mathematics. First, we present a model of usable teaching knowledge. Second, we examine how the design of the Classroom Video Analysis (CVA) instrument as a measure of usable knowledge relates to the proposed model. The CVA is based on teachers' ability to analyze teaching events depicted in short video clips. Third, using data from three different CVA assessment scales (on fractions; ratio and proportions; and variables, expressions, and equations), we explore the structure underlying teachers' scored responses to the video clips, aggregated at the clip level, to uncover the latent processes that generate teachers' responses. For all three scales we found that a single dominant factor, possibly teachers' usable teaching knowledge, explained up to two thirds of the variation in teachers' clip-level scores. We consider the results in light of our model of usable teaching knowledge and generate testable hypotheses for further study.
- Kersting, N. B., Chen, M., & Stigler, J. W. (2013). Value-added teacher estimates as part of teacher evaluations: Exploring the effects of data and model specifications on the stability of teacher value-added scores. Education Policy Analysis Archives, 21. If teacher value-added estimates (VAEs) are to be used as indicators of individual teacher performance in teacher evaluation and accountability systems, it is important to understand how much VAEs are affected by the data and model specifications used to estimate them. In this study we explored the effects of three conditions on the stability of VAEs and evaluated their relative impact. We varied how we accounted for differences among students in their prior learning; whether we estimated VAEs from single or multiple cohorts of students; and the number of students contributing to the VAE for each teacher. Using data from one of the largest school districts in the nation, we created a single, complete data set and used it to estimate several sets of VAEs for each of the 3,651 fifth-grade mathematics teachers. We found that approximately two thirds of teachers were stable in that they remained in the same performance group across all conditions. We also found that differences in the number of students used for VAEs accounted for up to one third of teacher reclassifications into different performance groups; single versus multiple cohort models accounted for about one fifth; and different methods of controlling for student prior learning accounted for about one sixth of teacher reclassifications. We relate our findings to characteristics of our data and discuss implications for educational policy.
- Kersting, N. B., Givvin, K. B., Thompson, B. J., Santagata, R., & Stigler, J. W. (2012). Measuring Usable Knowledge: Teachers' Analyses of Mathematics Classroom Videos Predict Teaching Quality and Student Learning. American Educational Research Journal, 49(3), 568-589. This study explores the relationships between teacher knowledge, teaching practice, and student learning in mathematics. It extends previous work that developed and evaluated an innovative approach to assessing teacher knowledge based on teachers' analyses of classroom video clips. Teachers watched and commented on 13 fraction clips. These written analyses were coded using objective rubrics to yield a reliable and valid indicator of their usable teaching knowledge. Previous work showed this measure to correlate with another measure of teacher knowledge and to predict students' learning from the teachers' fraction instruction. In this study, the authors replicated those findings and further showed that the effect of teacher knowledge on student learning was mediated by instructional quality, measured using video observations of teachers' lessons.
- Santagata, R., Kersting, N., Givvin, K. B., & Stigler, J. W. (2011). Problem implementation as a lever for change: An experimental study of the effects of a professional development program on students' mathematics learning. Journal of Research on Educational Effectiveness, 4(1), 1-24. This study investigates, through an experimental design, the effectiveness of a professional development program on teacher knowledge and practices and on student learning. The program consisted of a series of video-based modules designed to respond to the needs of U.S. teachers, as highlighted by findings from the 1999 Third International Mathematics and Science Video Study. Sixty-four sixth-grade teachers from five low-performing inner-city schools participated in the study and were randomly assigned to treatment and control groups. Measures included fidelity of implementation, teacher knowledge and practice, and student mathematics learning. The program did not significantly impact teacher knowledge or practices as measured in the study. An effect was found on mathematics learning for students whose teachers reached a certain level of mathematics content knowledge. Discussion of findings includes lessons learned about conducting and studying professional development, particularly in low-performing schools.
- Kersting, N. B., Givvin, K. B., Sotelo, F. L., & Stigler, J. W. (2010). Teachers' analyses of classroom video predict student learning of mathematics: Further explorations of a novel measure of teacher knowledge. Journal of Teacher Education, 61(1-2), 172-181. This study explores the relationship between teacher knowledge and student learning in the area of mathematics by developing and evaluating an innovative approach to assessing teacher knowledge. This approach is based on teachers' analyses of classroom video clips. Teachers watched 13 video clips of classroom instruction and then provided written comments on the interactions of the teacher, students, and content. The quality of teachers' analyses, coded using an objective rubric, is shown to be reliable and valid, relating both to another widely used measure of teacher knowledge and to teachers' own students' learning (from pre- to posttest).
- Kersting, N. (2008). Using video clips of mathematics classroom instruction as item prompts to measure teachers' knowledge of teaching mathematics. Educational and Psychological Measurement, 68(5), 845-861. Responding to the scarcity of suitable measures of teacher knowledge, this article reports on a novel assessment approach to measuring teacher knowledge of teaching mathematics. The new approach uses teachers' ability to analyze teaching as a proxy for their teaching knowledge. Video clips of classroom instruction, which respondents were asked to analyze in writing, were used as item prompts. Teacher responses were scored along four dimensions: mathematical content, student thinking, alternative teaching strategies, and overall quality of interpretation. A prototype assessment was developed, and its reliability and validity were examined. Respondents' scores were found to be reliable. Positive, moderate correlations between teachers' scores on the video-analysis assessment, a criterion measure of mathematical content knowledge for teaching, and expert ratings provide initial evidence for the criterion-related validity of the video-analysis assessment. Results suggest that teachers' ability to analyze teaching might be reflective of their teaching knowledge.