Peer-Reviewed Manuscripts

Using Hierarchical Logistic Regression to Study DIF and DIF Variance in Multilevel Data
Posted: April 15, 2019
Categories: Peer-Reviewed Manuscripts
Tags: Large-scale Assessment

Link to Resource: Using Hierarchical Logistic Regression to Study DIF and DIF Variance in Multilevel Data

Authors: Benjamin R. Shear

Citation: Shear, B. R. (2018). Using hierarchical logistic regression to study DIF and DIF variance in multilevel data. Pre-print: Journal of Educational Measurement, 55(4), 513.

Abstract: 

When contextual features of test-taking environments differentially affect item responding for different test-takers and these features vary across test administrations, they may cause differential item functioning (DIF) that varies across test administrations. Because many common DIF detection methods ignore potential DIF variance, this paper proposes the use of random coefficient hierarchical logistic regression (RC-HLR) models to test for both uniform DIF and DIF variance simultaneously. A simulation study and real data analysis are used to demonstrate and evaluate the proposed RC-HLR model. Results show the RC-HLR model can detect uniform DIF and DIF variance more accurately than standard logistic regression DIF models in terms of bias and Type I error rates.
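
To make the modeling idea concrete, here is a minimal sketch of a random coefficient DIF model of the kind described above. The notation is illustrative rather than taken from the paper: $y_{ij}$ is the scored response of examinee $i$ in administration $j$ to the studied item, $\theta_{ij}$ is the matching variable (e.g., total test score), and $g_{ij}$ indicates focal-group membership.

$$\operatorname{logit}\,\Pr(y_{ij} = 1) = \beta_{0j} + \beta_{1}\,\theta_{ij} + \beta_{2j}\,g_{ij}$$

$$\beta_{0j} = \gamma_{00} + u_{0j}, \qquad \beta_{2j} = \gamma_{20} + u_{2j}, \qquad (u_{0j}, u_{2j})' \sim N(\mathbf{0}, \mathbf{T})$$

In this sketch, $\gamma_{20}$ is the average uniform DIF effect across administrations and the variance component $\tau_{22} = \operatorname{Var}(u_{2j})$ is the DIF variance; a standard logistic regression DIF model corresponds to constraining $u_{2j} = 0$ for all administrations.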

Learning Progressions and Embedded Assessment
Posted: April 3, 2019
Categories: Peer-Reviewed Manuscripts
Tags: Classroom Assessment, Learning Progressions

Link to Resource: Learning Progressions and Embedded Assessment 

Authors: Derek C. Briggs and Erin Marie Furtak

Citation: Briggs, D. C., & Furtak, E. M. (2018). Learning progressions and embedded assessment. Pre-print from S. Brookhart & J. McMillan (Eds.), Classroom Assessment and Educational Measurement, NCME Book Series.

Abstract: 

Learning progressions have great potential as an organizing framework for classroom instruction and assessment. However, successful implementation of this framework hinges upon developing a curriculum-embedded system of student assessment. In this chapter, an approach to meeting this challenge is illustrated in the context of a learning progression in science that crosses the disciplinary boundaries of physics, chemistry and biology in a high school setting. Four key ingredients of our approach include (1) mapping and aligning the scientific content of the learning progression to the curricula of the participating teachers, (2) making the case that assessment activities targeted to the learning progression can provide teachers with relevant insights about their students, (3) bringing teachers together to discuss student ideas that emerge from assessment activities, and (4) linking the assessments within and across the courses taught by participating teachers.

Making Inferences about Teacher Observation Scores over Time
Posted: April 2, 2019
Categories: Peer-Reviewed Manuscripts

Link to Resource: Making Inferences about Teacher Observation Scores over Time

Authors: Derek C. Briggs and Jessica L. Alzen

Citation: Pre-print of Briggs, D. C., & Alzen, J. L. (2019). Making inferences about teacher observation scores over time. Educational and Psychological Measurement. https://doi.org/10.1177/0013164419826237

Abstract: 

Observation protocol scores are commonly used as status measures to support inferences about teacher practices. When multiple observations are collected for the same teacher over the course of a year, some portion of a teacher's score on each occasion may be attributable to the rater, lesson, and time of year of the observation. All three of these are facets that can threaten the generalizability of teacher scores, but the role of time is easiest to overlook. A generalizability theory framework is used in this study to illustrate the concept of a hidden facet of measurement. When there are many temporally spaced observation occasions, it may be possible to support inferences about the growth in teaching practices over time as an alternative (or complement) to making inferences about status at a single point in time. This study uses longitudinal observation scores from the Measures of Effective Teaching project to estimate the reliability of teacher-level growth parameters for designs that vary in the number and spacing of observation occasions over a two-year span. On the basis of a subsample of teachers scored using the Danielson Framework for Teaching, we show that at least 8 observations over two years are needed before it would be possible to make distinctions in growth with a reliability coefficient of .38.
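
As a rough illustration of the kind of reliability coefficient reported above (the notation here is ours, not necessarily the authors' specification), the reliability of a teacher-level growth (slope) estimate under a generalizability theory framing can be sketched as

$$E\rho^{2}_{\text{growth}} = \frac{\sigma^{2}_{\text{slope}}}{\sigma^{2}_{\text{slope}} + \sigma^{2}_{\text{error}}}$$

where $\sigma^{2}_{\text{slope}}$ is the variance of true teacher-level growth parameters and $\sigma^{2}_{\text{error}}$ is the error variance of the estimated slopes, which shrinks as observation occasions are added and spread out over time. The reported value of .38 for eight observations over two years suggests how slowly that error variance decreases in practice.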

Examining the Dual Purpose Use of Student Learning Objectives for Classroom Assessment and Teacher Evaluation
Posted: April 1, 2019
Categories: Peer-Reviewed Manuscripts
Tags: Classroom Assessment

Link to Resource: Examining the Dual Purpose Use of Student Learning Objectives for Classroom Assessment and Teacher Evaluation

Authors: Derek C. Briggs, Rajendra Chattergoon, and Amy Burkhardt

Citation: Briggs, D. C., Chattergoon, R., & Burkhardt, A. (2018). Examining the dual purpose use of student learning objectives for classroom assessment and teacher evaluation. Journal of Educational Measurement (in press).

Abstract: 

The process of setting and evaluating Student Learning Objectives (SLOs) has become increasingly popular as an example where classroom assessment is intended to fulfill the dual purpose use of informing instruction and holding teachers accountable. A concern is that the high stakes purpose may lead to distortions in the inferences about students and teachers that SLOs can support. This concern is explored in the present study by contrasting student SLO scores in a large urban school district to performance on a common objective external criterion. This external criterion is used to evaluate the extent to which student growth scores appear to be inflated. Using two years of data, growth comparisons are also made at the teacher-level for teachers who submit SLOs and have students that take the state-administered large-scale assessment. Although they do show similar relationships with demographic covariates and have the same degree of stability across years, the two different measures of growth are weakly correlated.
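
The teacher-level comparison described above can be pictured with a short, hypothetical analysis sketch (the file name and column names below are invented for illustration; this is not the authors' analysis code):

import pandas as pd

# Hypothetical teacher-level file: one row per teacher per year, with a
# growth score from the SLO process and a growth score from the state test.
df = pd.read_csv("teacher_growth.csv")  # assumed columns: teacher_id, year, slo_growth, state_growth

# Correlation between the two growth measures within each year.
by_year_corr = df.groupby("year")[["slo_growth", "state_growth"]].corr()

# Year-to-year stability of each measure: correlate a teacher's
# year-1 value with the same teacher's year-2 value.
wide = df.pivot(index="teacher_id", columns="year", values=["slo_growth", "state_growth"])
slo_stability = wide["slo_growth"].iloc[:, 0].corr(wide["slo_growth"].iloc[:, 1])
state_stability = wide["state_growth"].iloc[:, 0].corr(wide["state_growth"].iloc[:, 1])

print(by_year_corr)
print("SLO growth stability:", round(slo_stability, 2))
print("State test growth stability:", round(state_stability, 2))

A pattern like the one reported in the paper would show the two stability estimates at similar levels while the within-year correlation between the two growth measures remains weak.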

Using a Learning Progression Framework to Assess and Evaluate Student Growth
Posted: August 25, 2017
Categories: Peer-Reviewed Manuscripts
Tags: Educator Effectiveness, Learning Progressions

Link to Resource: Using a learning progression framework to assess and evaluate student growth [Executive Summary also available]

Authors: Derek Briggs, Elena Diaz-Bilello, Fred Peck, Jessica Alzen, Raymond Johnson

Citation: Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & Johnson, R. (2015). Using a learning progression framework to assess and evaluate student growth. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE) and National Center for the Improvement of Educational Assessment.

The Prospects of Teacher Pay-for-Performance
Posted: August 25, 2017
Categories: Peer-Reviewed Manuscripts
Tags: Teacher Labor Markets

Link to Resource: The prospects of teacher pay-for-performance

Authors: Derek Briggs, Michael Turner, Charles Bibilos, Andy Maul

Citation: Briggs, D. C., Turner, M., Bibilos, C., & Maul, A. (2014). The prospects of teacher pay-for-performance. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE).
