Improving Decision-Making for Impact: Important Considerations for Special Educators and Implementation Teams

Lynne Harper Mainzer, EdD
Deputy Director
Center for Technology in Education
Associate Professor, School of Education

Andrea Schanbacher, MEd
Program Administrator
Center for Technology in Education

Jennifer Dale, EdD
Program Administrator
Center for Technology in Education

Susan Stein, MEd
Program Quality Specialist
Center for Technology in Education

 

Key Principles and Questions

Today’s special education leaders interact with considerable amounts of student performance data on a routine basis.  At the district level, administrators generally focus on using data to make programmatic decisions and generate reports on progress across schools.  At the school level, attention centers on the progress of student groups or individual students with disabilities. Special educators who are skillful with data monitor student progress toward measurable content indicators and use the results to guide instruction, assign appropriate accommodations, and select instructional and behavioral interventions.  However, despite the potential of using data to track student progress and guide student learning, many educators still grapple with how to implement data-informed decision making effectively (Boudett & Steele, 2007; Mainzer & Stein, 2013; Stid, O’Neill, & Colby, 2009).  Whether at the district or school level, special educators frequently are puzzled as to which data are essential and how to use student progress information so that the greatest gains are realized across student populations and for individual students with disabilities. This white paper posits the following set of principles, with associated considerations and questions, to help special educators determine:

  • the quality of the data being reviewed;
  • the culture of the school in terms of collaboration;
  • the use of data to promote continuous improvement; and
  • the capacity among teachers and administrators to implement the recommended actions.

By adhering to these principles, special education leaders are empowered to make decisions that result in improved instruction for students with disabilities in inclusive settings.

 

Principle 1:  Ensure high quality data from multiple sources.

School leaders need accurate data that are compiled from multiple sources and presented in user-friendly, visual displays. In their fast-paced work environments, educators must have easily accessible, real-time data that are disaggregated and organized to simplify interpretation. Having reliable data “at their fingertips” improves not only efficiency in making decisions, but also the probability of determining actions that produce positive outcomes for programs and individual students. Consequently, special education leaders should work closely with information technology specialists at their districts and schools to make certain that quality data relevant to specified analyses are easily obtainable.

 

Principle 2: Utilize robust data systems and efficient tools.

Technology tools to support data-informed decision making vary in function and robustness.  To mention a few, state and local school leaders draw data from 1) data warehouses, which are large integrated databases that allow users to view data across multiple operational systems and variables; 2) diagnostic systems, which contain assessment tools to determine student strengths and weaknesses in specified academic and behavioral areas and to formulate target goals and actions; and 3) analysis and reporting systems, which include up-to-date applications to examine student progress data across levels (district, school, program, classroom, student group, or individual). When these types of tools are functioning well systemically, efficiencies in data retrieval and organization can be expected. Furthermore, when these tools are used routinely within districts and schools that have built collaborative, “tech-savvy” cultures, meaningful analyses and results-oriented decision making can be expected as well.

 

Principle 3: Build a school culture of collaboration and continuous improvement to implement targeted actions.

The quality of decision making improves as the school organizational structure incorporates a culture of collaboration, accountability, and continuous improvement.  In these settings, school leaders build commitment among stakeholders to use data within team-based structures in which goals and roles for executing ongoing data analysis of teacher performance and student progress are clearly defined.  Teachers are empowered with data analysis skills, technology tools, and instructional resources to make decisions that maximize learning opportunities for all students.  Moreover, special educators who are skillful with data and tools are able to assess student progress toward measurable content indicators, using the results to guide instruction, create IEP goals, assign appropriate accommodations, and monitor the impact of selected instructional and behavioral strategies.

Central to building a collaborative, data-oriented culture is for general and special education leaders to articulate a unified vision.  To steward that vision collaboratively, leaders take actions grounded in authentic analysis of student progress and an unrelenting commitment to continuous improvement of teaching performance and success for all students. Inspiring this type of shared vision often requires leaders to set aside time to foster understanding among stakeholders of its core elements and to put forth a specific protocol that prompts teamwork and data-informed decision making.

 

Principle 4: Conduct action-centered data analysis to select, implement, and monitor targeted actions.

Leaders of collaborative, data-oriented teams emphasize that recommended actions are generated through thorough examination of longitudinal and real-time data.  Essentially, the actions are targeted solutions that include performance goals, specific implementation steps, and measurable outcomes.  Team members’ efficiency and use of data in meetings can be improved by:

  1. An explicit agreement to look at individual student data and group data as appropriate in order to solve academic and/or behavioral problems;
  2. Use of a protocol(s) for asking essential questions related to examining progress and selecting strategies or interventions;
  3. Identification, in advance of the meeting, of longitudinal data relevant to the specific issue under discussion;
  4. Availability of multiple sources of data that are timely and relevant to the issue;
  5. Presentation of data in formats that are easy to understand and use;
  6. Selection of effective strategies and/or evidence-based interventions targeted to identified needs and goals; and
  7. Development of an action plan with clear steps for needed professional development, fidelity checks, and monitoring activities.

Special education leaders operate within a larger context: the general education district or school culture.  Their ability to incorporate the aforementioned principles may be compromised by factors such as absence of support from school administrators, minimal access to relevant and current formative assessment data, and/or lack of time allotted within the school day for student progress review meetings.  In many instances, special educators find themselves in situations where there is only time for a brief discussion of a student’s or students’ progress.  Consequently, these time-constrained meetings call for streamlined discussions that discern essential performance data quickly and recommend targeted solutions.

“Considerations for Effective Data-Informed Decision Making” is offered to expedite meetings in which general and special educators are examining data to guide instructional and behavioral decision making for students with disabilities. Guiding questions are derived from the principles to help the special educator remain aware of critical areas that often are overlooked when examining data and making instructional decisions.

Addressing these considerations and questions during student planning meetings, whether pre-referral, progress checks, or IEP meetings, should stimulate thinking and discussion regarding: 1) data quality; 2) use of the most efficient, robust data systems and tools available; 3) action-centered data analysis for planning; and 4) the extent of collaboration and data utilization within the school culture and the capacity to implement targeted actions. With specific attention given to the core principles for improving data-informed decision making, special educators can contribute to creating conditions in which authentic inquiry is based on high quality performance data (student and teacher). Meeting these fundamental conditions affords greater opportunity for teachers engaged in student planning to set learning goals and implement instructional and/or behavioral actions that are relevant to students’ needs and targeted toward increasing their achievement in the least restrictive learning environment.

 

Principles and Questions Are Not Enough

Principles and questions by themselves are not sufficient to promote the substantive change needed for quality questioning, data analysis, and decision making. Instead, they need to be infused within a straightforward continuous improvement approach to realize their greatest benefit.  Remember, the overall purpose of the solutions chosen by implementation teams is to advance student outcomes. This requires choosing the appropriate intervention and tracking the quality of its use by teachers (Slavin, 2020). One such approach, Dynamic Impact, was developed by the faculty and staff at the Johns Hopkins University Center for Technology in Education (JHU CTE). It was designed to enrich the quality of decisions and boost their impact. Dynamic Impact embeds the principles and questions mentioned earlier within its five-stage improvement cycle, TAP-IT (Teams, Analyze, Plan, Implement, and Track). Protocols outline simple, step-by-step actions to establish high performance implementation teams, conduct high quality root cause analysis, and generate practical action plans that promote fidelity of implementation of evidence-based practices and improvement plans. Questions, such as those outlined above, guide analysis and discussion during each stage.

 

Dynamic Impact in the Field: Improved Child Outcomes

Dynamic Impact was implemented across two school districts—Caroline County Public Schools and Worcester County Public Schools—to support implementation of the Maryland Early Learning Assessment (ELA). The ELA is a valid, formative assessment tool that is used in the natural environment multiple times throughout the school year. It is practical to use and folds easily into the natural flow of an early childhood setting. Teachers track individual children’s growth, individualize learning opportunities, plan for intervention, engage in real-time instructional planning, and ensure that all children are on the path for kindergarten readiness and beyond. The ELA follows a process to help teachers document, analyze, and make instructional decisions based on the information they collect.

School district leaders in CCPS and WCPS identified a goal: to improve implementation of the ELA and utilize its data to help increase the percentage of students with and without disabilities ready for kindergarten, as evidenced by the state Kindergarten Readiness Assessment (KRA). Both districts had determined that many early childhood teachers were not maintaining a high enough level of fidelity when implementing the ELA. They assumed that if the ELA were used as intended, more pre-K students would receive higher quality instruction tailored to their individual needs. Aware of JHU CTE’s work in this area, both districts decided to collaborate with JHU CTE to use Dynamic Impact as part of their strategic plans to promote higher quality implementation of the ELA. The following results reflect the impact of the implementation efforts and systems change in which Dynamic Impact played a critical role in CCPS and WCPS.

Results across three years (SY 17/18, SY 18/19, and SY 19/20) are shared. Table 1 illustrates the trend in Kindergarten Readiness data across the state of Maryland over three years of KRA administration. The average percentage of kindergartners demonstrating readiness on the KRA for children with and without disabilities has remained relatively flat over the past three years.  In 2017, 2018, and 2019, the gap between children with disabilities and children without disabilities who were demonstrating readiness was 30, 32, and 31 percentage points, respectively.

Table 2 shows that CCPS narrowed the gap between children with and without disabilities by 13 percentage points over the past three years. Not only did the gap narrow, but the percentage of kindergartners demonstrating readiness increased for both children with and without disabilities. The percentage of children demonstrating readiness on the KRA increased by nine percentage points for children without disabilities and 12 percentage points for children with disabilities.

Like CCPS, WCPS narrowed the gap between children with and without disabilities from school year 2017/18 to 2019/20 (see Table 3). The gap closed by 16 percentage points over the past three years. It is important to note that the state average for students with disabilities was only 19% in the 2019/20 school year, whereas 44% of children with disabilities in WCPS demonstrated readiness for kindergarten. In fact, WCPS ranks number one across the state for children with and without disabilities demonstrating readiness on the KRA.

 

Keep it Simple

Underlying Dynamic Impact is the perspective that improvement processes do not need to be complex to address complicated problems and promote improvement. Instead, school district leaders need to simplify the data-informed decision making process. They need methods for establishing high performing implementation teams capable of determining root causes of problems so viable solutions are within easy reach. Combining three chief elements, 1) key principles, 2) essential guiding questions, and 3) a protocol-driven continuous improvement process, equips implementation teams with the tools necessary for realizing powerful results. The key is to keep the processes straightforward and use them routinely so implementation team members are better equipped to promote high quality use of assessments, interventions, or action plans.

 

Helpful Hints for Special Educators and Implementation Teams

  • Important decisions are rarely based upon one data point.
  • Looking at predetermined data helps team members consider issues and solutions over time.
  • Examination of relationships between unlike data sets may help prioritize actions.

 

REFERENCES

Boudett, K. P., & Steele, J. L. (Eds.) (2007). Data wise in action: Stories of schools using data to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Mainzer, L. H., & Stein, S. (Eds.) (2013). Tapping Into Data. Retrieved from http://marylandlearninglinks.org/104727

Slavin, R. (2020, October 29). How to make evidence in education make a difference. Retrieved from https://robertslavinsblog.wordpress.com/2020/10/29/how-to-make-evidence-in-education-make-a-difference/

Stid, D., O’Neill, K., & Colby, S. (2009). Portland Public Schools: From data and decisions to implementation and results on dropout prevention. Boston Dropout Prevention. San Francisco, CA: The Bridgespan Group, Inc.