The Effective Use of Data-Driven Instruction (DDI)

Learning is a cumulative process. Mastery of knowledge and skills in the early grades creates an essential foundation for more advanced critical thinking and for understanding complex content. As students advance, gaps in prior-grade skills can hinder their academic performance. This phenomenon has become especially significant since the implementation of the Common Core Learning Standards and their accompanying assessments. CASDA faculty members Bill Haltermann and Pam Roberge collaborated with researchers Katy Schiller (SUNY Albany, School of Education) and Francesca Durand (The Sage Colleges, School of Education) to study both skill gaps and how effective teachers use data-driven instruction (DDI).

 

The CASDA team developed a system of professional development centered on data-driven instruction to address prior-grade ELA skill deficits. The system focuses on data-informed skill targeting, which is critical to the DDI process because the transition to the Common Core and its annual assessments revealed that many students exhibit persistent skill gaps spanning multiple grade levels. To identify and target these deficits, CASDA faculty members created assessments that reflect how standards-based skills are tested on statewide assessments.

 

Over the past three years, six districts with a variety of student demographics have participated in the CASDA-developed DDI system, with over 1,900 students taking the CASDA assessments. The teacher professional development was tightly focused on the unique characteristics of the data for individual classrooms and students.

 

This approach to DDI has produced useful insights into which PD factors have the greatest impact on student performance on state tests. The findings detailed below draw directly on teacher feedback and on the researchers' observations.

 

One of the most significant findings was the impact of teacher engagement in the DDI process on student outcomes. Many teachers were highly engaged because, in their words, this was “some of the most useful and practical PD” they had ever experienced. They felt this way for several reasons. First, they liked that the assessments provided clear ways of identifying below-grade skill gaps for their current students. The assessments were completely transparent: teachers knew exactly which passages were on the tests and which skill each question was testing. Some teachers noted that this transparency was lacking in commercial adaptive testing.

 

Teachers also liked the immediate feedback on their students generated by scoring the tests. That feedback focused on skills, not scores, and targeting specific skills (as indicated by the assessment data) made teachers much more productive, informing not only whole-group instruction but also individual student remediation.

Based on teacher feedback, the single most important part of the DDI process was the professional development (PD) itself. Teachers felt the PD, which covered targeted strategies and scaffolding, could have an immediate impact in their classrooms; in many instances, teachers reported employing some of the strategies and scaffolds the very next day.

 

The researchers tracked student performance on the state tests in participating classrooms. Where teacher engagement was high, students were more successful on the state tests, both in proficiency rates and in scale-score gains.

 

Other findings the researchers noted that could also affect student performance include:

  • The ability of teachers to analyze and interpret data more deeply. Many teachers (and administrators) equate data analysis with looking at scores and performance levels; fewer are trained in, or comfortable with, a deeper analysis of skills performance. Benchmarking skills performance and incorporating item p-values (the proportion of students who answer an item correctly) in the analysis are unfamiliar techniques for many instructors. More sophisticated analysis leads to better targeting of skill deficits.

  • The ability to identify below-grade-level skill gaps is a central piece of the DDI system. Consequently, teachers need a comprehensive understanding of the progression of the standards and their related sub-skills. It is entirely understandable that teachers focus primarily on their own grade-level standards, but when a significant portion of their students have persistent below-grade skill gaps, that narrower focus impedes a teacher’s ability to remediate those gaps.

  • The Common Core standards and the resulting assessments have increased the level of rigor required of students. The focus on more complex texts (at all grade levels) and text-based evidence has caused a significant shift in instruction. Many assume the standards themselves define the level of rigor, but any individual standard can be addressed at multiple levels of rigor; in practice, the state assessments are the best source for defining the rigor expected of all public school classrooms. Many teachers, unfortunately, do not have enough time to analyze the released state assessment passages, questions, and distractor answers in order to develop an in-depth understanding of the appropriate level of instructional rigor (and how to scaffold instruction, if necessary).

  • The corollary is that teachers who do not spend time with released questions also miss a rich source of teaching tools: the pool of questions released over the last three to four years. Some might ask whether this is “teaching to the test.” It is not. Students need to learn thinking skills and how to apply them in many different formats, and released questions give instructors an opportunity to demonstrate different ways of applying critical thinking skills. Creating Common Core lessons can be very labor-intensive for teachers, so leveraging an existing resource saves valuable time and ensures appropriate levels of rigor.

  • Most teachers use graphic organizers, but some think of them solely as information collectors rather than as a learning strategy. Graphic organizers can also be used at a deeper level to model critical thinking, visually representing the thinking process needed to accomplish multi-step analyses, such as summarizing a text using its key details. This use of graphic organizers is essential, especially for struggling students with persistent Common Core skill gaps.

  • Most of the impacts above are centered on classrooms and teachers; others are, by definition, institutional in nature. Examples include vertical alignment and the consistent use of key-term definitions and graphic organizers across grade levels. Inconsistent use of terms and instructional tools from grade to grade can be a real impediment to improved student performance, particularly for struggling students.
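The item-level analysis described in the first bullet above can be sketched in a few lines. This is a hypothetical illustration, not CASDA's actual tooling: the item names, skill labels, and data layout are invented. In assessment analysis, an item's p-value is simply the proportion of students who answered it correctly, and averaging p-values across the items that target the same skill gives a rough skill-level benchmark.

```python
from collections import defaultdict

# Hypothetical scored responses: 1 = correct, 0 = incorrect.
# Each item is mapped to the skill it was written to assess.
item_skills = {"Q1": "main idea", "Q2": "key details", "Q3": "main idea"}
responses = {
    "Q1": [1, 1, 0, 1, 0],
    "Q2": [1, 0, 0, 0, 1],
    "Q3": [1, 1, 1, 0, 0],
}

def item_p_values(responses):
    """Proportion of students answering each item correctly."""
    return {item: sum(scores) / len(scores) for item, scores in responses.items()}

def skill_p_values(responses, item_skills):
    """Average item p-value across all items targeting the same skill."""
    by_skill = defaultdict(list)
    for item, p in item_p_values(responses).items():
        by_skill[item_skills[item]].append(p)
    return {skill: sum(ps) / len(ps) for skill, ps in by_skill.items()}

print(item_p_values(responses))                 # low p-values flag harder items
print(skill_p_values(responses, item_skills))   # low averages flag skill deficits
```

A skill whose average p-value sits well below the others is a candidate for the kind of targeted remediation the PD focused on.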

This is not meant to be a comprehensive list of the variables that affect student performance, but these findings represent significant results from the DDI work done by CASDA faculty members with several districts and teachers over the past several years.

 

Moving forward, the research team will focus on several areas. The first is to validate the importance of the DDI system built around the multi-grade assessments described above: How does the system change instructional practice? Is there a correlation between the DDI system and student performance on state assessments? Evidence accumulated to date shows that, in participating DDI classrooms, student performance on state assessments correlates with performance on the multi-grade assessments. Within that system, an effort will also be made to identify which of the many possible variables have the greatest impact on student performance.
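The correlation check described above can be illustrated with a minimal sketch. The paired scores below are invented for illustration only; they are not the study's data, and the variable names are assumptions. The computation itself is standard: Pearson's r between students' multi-grade assessment scores and their state assessment scale scores.

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical paired scores for five students (illustrative only).
casda_scores = [55, 62, 70, 78, 90]   # multi-grade assessment percent correct
state_scores = [288, 295, 301, 310, 322]  # state assessment scale scores

print(round(pearson_r(casda_scores, state_scores), 3))
```

An r close to 1.0 would indicate that the multi-grade assessments track state-test performance closely; establishing how strong that relationship actually is across classrooms is precisely the validation work described above.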

 

Next stop: San Antonio, Texas, for the American Educational Research Association’s annual convention, where the research team will present its findings to educators from around the country.
