Data Analysis

The breadth of assessments in MTSS can leave staff awash in data. MIBLSI helps participating districts and schools understand what the data mean and how to use them to positively impact students.

ISD and District Systems Problem Solving

Who: ISD and district implementation teams.

Data reviewed: Teams review their reach and capacity data. Teams also review the capacity, fidelity, and student assessment data from the districts and schools they serve.

How often: At least twice a year at data reviews.

Outcome: Data show the areas that need to be addressed, which helps teams prioritize items for improvement. The revised implementation plans are aligned with other district priorities and integrated into the district improvement plan.

Associated training: Teams learn about analysis and planning at data reviews (two to four times a year).

School-wide Problem Solving

Who: School leadership teams and tier 2/3 systems teams.

Data reviewed: Teams review their fidelity data and their student assessment data, aggregated school-wide.

How often: At least monthly, including at team meetings and data reviews.

School leadership teams also review universal screening data for students after each data collection period. DIBELS Next and the Student Risk Screening Scale (SRSS) are reviewed three times a year. Early warning indicators (EWIs) are reviewed in the first 20 days of the school year and then after each term.

Outcome: The fidelity data show if schools are setting up their systems to work well. Patterns in student data show possible areas of weakness in school programs. Teams revise their implementation plans to improve fidelity and revise school-wide plans and programs to target negative trends in student data. The revised plans are integrated into the school improvement plan.

When school leadership teams review universal screening data, they determine which students are failing to meet benchmarks or appear to be at risk and need additional supports.

Associated training: Teams learn how to set up a universal screening system in the training sessions that support tier 1. Teams learn about analysis and planning at data reviews (three times a year). 

Grade-Level Problem Solving

Who: Grade-level teams.

Data reviewed: Teams review grade-level student data from all student assessments.

How often: At least monthly at team meetings.

Outcome: Patterns in student data show possible areas of weakness in grade-level instructional practices. Teams revise their instructional plans to target negative trends in student data.

Associated training: Teams learn about analysis and planning at the grade-level problem solving training sessions.

Student Supports for Tiers 2 and 3

Who: School leadership teams, tier 2/3 systems teams, grade-level teams, student support teams, and teachers.

Data reviewed: Teams review individual student data from all student assessments.

How often: Students receiving additional supports have their data reviewed more frequently for progress monitoring. The frequency depends on the type of assessment and the intensity of the student's needs.

Outcome: Teams select or design programs for students that are more intensive than the tier 1 supports. Teams look at a student's entire portfolio of data to ensure all of his or her needs are considered. Reviewing data from behavior and academic assessments side by side can also provide insight into the relationship between reading and behavior for the student. Progress monitoring helps teams determine whether the new supports are working, whether they need to be adjusted, or whether the student can sustain progress with a less intensive level of support.

Associated training: Teams learn about analysis and problem solving for students in the training sessions that support tier 2 and 3 behavior and reading intervention systems.
