Completer Case Study Report

Program: Special Education, Academic and Behavioral Strategist (ABS)

Undergraduate Initial License 

Background Information

This document describes a pilot study by MSU, Mankato to demonstrate EPP completer impact on student learning and development, or completer effectiveness. Component R4.1 of the Council for the Accreditation of Educator Preparation (CAEP) presents a challenge for EPPs in Minnesota. Due to state negotiations with the teachers’ union, the state does not collect data linking student assessment results to individual teachers. Therefore, our EPP used a mixed methods study to assess the impact of one program’s completers on P-12 student learning and development. Challenges also arose in recruiting completers to participate in the study. Details are provided later in the report.

First, the EPP collected teacher-linked student assessment data from participants with permission from the school district. Student data consisted of reading curriculum-based measurements used for student screening and routine progress monitoring. Math curriculum-based measurement data was collected from one participant. Additionally, understanding the limitation of including data from a small group of participants, the EPP collected data from two other sources: observations of completers’ teaching and interviews with completer participants and their students.

Administration and Procedures

In this document, the EPP reports how one program’s completers in their first year of teaching impact student learning and development, using data obtained from student screening and progress monitoring. The completers were graduates of the special education ABS undergraduate licensure program. It is important to keep in mind that the completers in this study are in the first year of their new profession and experiencing the overwhelming feelings of transitioning from being students of teaching to teachers of students.

Method

A case study was designed to understand the impact of the EPP’s completers’ teaching efforts on their students during their first year of teaching. Multiple data sources were collected throughout the study: observational data of participants’ instructional planning and delivery; scores from student curriculum-based measurements; and interviews with both completers and their students.

Participants

The PI spoke with teacher candidates prior to their graduation. The cohort of eligible participants consisted of 20 undergraduate student teachers. The study was explained, and nine consented to participate. The PI remained in contact with the nine throughout the summer to keep interest high. Once completers learned of their teaching assignments, the PI followed up to confirm their participation. Three asked to be removed from the study because they had accepted teaching positions outside the scope of their license. The undergraduate ABS license is a mild/moderate cross-categorical license designed to address the largest special education populations; these three completers accepted positions with more complex learners in more restrictive settings and did not feel comfortable adding another component to their first year. Two more withdrew due to family commitments. With four remaining participants, the PI contacted their school administrators to inform them of the study. Administrators for only three of the four consented to the study. The three participants worked in different districts, all within 65 miles of the EPP campus. All participants worked within the scope of their license and taught a range of students with disabilities in resource settings. Two completers taught students receiving the majority of their education in the general education setting, while one completer taught students who received less than 50% of their instruction in the general education classroom.

Table: Completer Participant Information

| Completer | Year Graduated | Grade and Content Area Observed | School Demographics |
|-----------|----------------|---------------------------------|---------------------|
| A | 2022 | Middle School Reading and Math Resource | 91% Caucasian, 16% free and reduced meals |
| B | 2022 | Middle School Co-taught English Language Arts and Resource Reading/Language Arts | 64% Caucasian, 10.4% free and reduced meals |
| C | 2022 | Elementary Reading and Math | 61% Caucasian, 22.3% free and reduced meals |

Completer A

Completer A taught in a rural community. He provided reading and math instruction using curricula or interventions he was trained on during his preparation program or through district professional development. He taught 12 students; due to their grade levels and identified special education needs, not all 12 students were taught together. Ten students had goals related to literacy, and six had goals related to mathematics. For consistency in scheduling observations, he requested they occur during his 6th grade English Language Arts resource time. During this block he taught two students with reading and writing needs.

Completer B

Completer B taught in a suburban district. She provided language arts instruction in both a general education collaborative co-taught classroom and a resource classroom. For consistency in scheduling observations, she requested they occur during her 6th grade English Language Arts resource block. She was responsible for the intervention instruction of six students in a resource setting: four 5th graders and two 6th graders. She did not feel as prepared for the curricula she was using, and professional development opportunities were not available. She taught herself the intervention through the instructor’s manual and often sought advice from the observer.

Completer C

Completer C taught in a different suburban district. He provided reading, math, and social skills instruction to the same nine students: four 3rd graders and five 4th graders. For consistency in scheduling observations, he requested they occur during his 3rd grade reading block. During this block he split teaching responsibilities with his paraprofessionals, since one student was performing significantly below the other three. Due to a school-wide assembly, instruction was shifted, and one observation occurred during a math lesson in which all four 3rd graders received instruction. He used district-mandated curricula; he was familiar with one prior to starting with the district and received professional development for the reading interventions he used.

Measures

One measure used for this study was the observation protocol used within the program’s field experiences and student teaching semester. The Special Education Observation & Advancement Rubric (SOAR; see Appendix A) was developed within the program and aligned with components of Explicit Instruction (Archer & Hughes, 2011) and the special education high-leverage practices (HLPs; McLeskey et al., 2017). Participants were already familiar with the measure, which allowed for more natural discussion and feedback from the PI. Four observations were completed per participant, at least two in person (three for two participants), with the remainder conducted through video recording.

The next data source was student performance scores. Curriculum-based measurement (CBM) is an approach for assessing student growth in basic skills that originated in special education; the practice is now used routinely in schools for universal screening and ongoing progress monitoring (Deno, 2003). All three completers taught reading and therefore collected progress monitoring data accordingly. Two participants used CBM data from the Dynamic Indicators of Basic Early Literacy Skills (DIBELS; University of Oregon, 2021), while one used the measures provided by the publisher of the reading intervention. Only two taught mathematics; both used publisher-provided progress monitoring tools.

The third data source was semi-structured interviews. The completers were interviewed about their experiences, curriculum choices, instructional decisions, and their overall perception of effectiveness. Students of the completers were interviewed about their perception of their classroom experiences, general regard of their teacher, and how they viewed their learning and growth throughout the year.

Data Analysis

The results of the analysis indicate that the completers have a positive impact on student learning and development. We will present evidence that our completers supported student growth appropriate for their level of development.

Results

We found that our completers were progressing as expected for first year teachers. The data indicated that completers demonstrated proficiency in most of the elements of the SOAR. The completers were reflective practitioners who used data and research to implement appropriate interventions.

Completer Observations

Improving learning for students with disabilities and their peers largely depends on teachers who can deliver effective instruction. The position of many teacher quality researchers is that effective instruction is a combination of many factors: selecting the right curriculum, adjusting instruction based on student responses, and how teachers deliver their instruction (McLeskey et al., 2017). Based on this research, the program selects high-quality evidence-based interventions to teach candidates during their program and ensures completers have mastered five of the 22 high-leverage practices for special education. The completers in this study were observed and offered feedback on their instructional planning and delivery based on the principles of data collection, explicit instruction, active student engagement and feedback, selection of evidence-based practices, and using data to make instructional adjustments, all indicators within the SOAR observation protocol.

High-Leverage Practices for Special Education of Focus in the Program:

  1. HLP 6: Use student assessment, analyze instructional practices, and make necessary adjustments that improve student outcomes.
  2. HLP 8/22: Provide positive and constructive feedback to guide students’ learning and behavior.
  3. HLP 12: Systematically design instruction toward a specific goal.
  4. HLP 16: Use explicit instruction.
  5. HLP 20: Provide intensive instruction.

The completers demonstrated growth in both their ability to plan and their ability to deliver effective instruction as measured by the SOAR. The most consistent areas of performance that emerged from observations of instructional planning were plans for learning segment goal and lesson objective, plans for evidence-based practices, and plans for model/input (see Table 1). The most consistent areas of performance during instructional delivery were delivers model/input, facilitates unguided practice, implements evidence-based practices, and delivers closing (see Table 2). These elements of lesson planning and delivery are crucial for novice special educators to master: evidence-based practices hold the most promise for student growth (Cook & Odom, 2013), while clear and concise demonstration of expectations, or modeling, allows students to see what proficient performance looks like and understand their teacher’s expectations (Archer & Hughes, 2011).

Plans for learning segment goal and lesson objective aligns with HLP 11, identify and prioritize long- and short-term learning goals, and HLP 12, systematically design instruction toward a specific goal. Completers were able to identify learning goals aligning student needs, identified in their individualized education programs (IEPs), with curriculum standards; break the standards down to “chunk” material; and create meaningful learning objectives in a systematic and logical progression. Plans for evidence-based practices and implements evidence-based practices are at the heart of special education and mandated in the Individuals with Disabilities Education Act (IDEA). Plans for model/input, delivers model/input, facilitates unguided practice, and delivers closing are all embedded in HLP 16, use explicit instruction. Within unguided practice opportunities, completers were assessed on their ability to fade their support and facilitate students taking ownership of their learning. This is not only a step of explicit instruction (HLP 16; Archer & Hughes, 2011) but also practice of HLP 15, provide scaffolded supports. As noted below, completers did not consistently evidence data collection on SOAR measures; however, they were able to reflect on and discuss student data to adjust and intensify their instruction during post-observation meetings and interviews. This is crucial to effective special education instruction and is identified as HLP 6, use student assessment, analyze instructional practices, and make necessary adjustments that improve student outcomes, which in turn allows for HLP 20, provide intensive instruction.

Student Data

The completers demonstrated positive outcomes on their students’ academic growth and development. Although the completers did not show consistency in data collection for planning and instructional delivery as evidenced on the SOAR, they each demonstrated the ability during post-observation conferences and check-in meetings to use student progress monitoring data to make meaningful decisions about changing the intensity of instruction (HLP 20) or even changing the intervention when needed (HLP 6). Completers discussed student data trends, remarked on student affect during instruction, and reflected on their instruction based on student responses. Completer A was more skilled than the other two, but all three were able to discuss their students’ progress and make appropriate and meaningful changes to interventions. Intensifying instruction (HLP 20) sometimes meant changing student groups or the pacing of instruction, and often required completers to add more practice opportunities (HLP 18, use strategies to promote active student engagement) to future lessons.

Development of reading skills.

All completers showed impact on student learning related to reading. Completer A used two well-known curricula for decoding and fluency. His students all demonstrated growth, as evidenced by progress monitoring data and student interviews. Progress monitoring was collected on students’ oral reading fluency (see Figures 1-10). When viewing these figures, note that the curriculum assesses students on reading a passage twice: once as a “cold” read, or pretest, and again as a “hot” read, or post-test. Completer impact is evidenced by all ten students experiencing growth, with eight achieving above the goal line and two at goal level. The “cold” read scores also increased, mirroring the growth in “hot” read scores.

Completer B also used a well-known reading and spelling curriculum for her reading instruction. Students demonstrated growth as determined by end-of-unit mastery tests. The curriculum requires students to score 17 of 20 prompts correct, or at least 85% proficiency, before moving on to the next level. According to publisher implementation guidelines, there was opportunity for ten levels to be completed during the case study; due to student progress and interruptions in the intervention block, only five were completed. During that time students were also receiving instruction alongside their grade-level peers in the general education classroom. Based on grades and teacher observation, students were able to engage in those learning activities because of the progress made through the intensive intervention provided by Completer B. For the intervention program provided in the resource room, both students completed five units to mastery: Student 1 met mastery on the first attempt for three units, while Student 2 met mastery on the first attempt only once. Through discussions with the observer, Completer B determined that Student 2 was at the correct instructional level for the program but that she needed to intensify instruction by separating the students to provide one-on-one instruction and give Student 2 more practice opportunities during the intervention. Similar results are reported for spelling progress (see Figures 11 & 12).

Completer C started with a school-provided curriculum and, through discussions with team members, the PI/observer, and district supervisors, used student progress data to advocate for a more structured, multicomponent intervention that better addressed his students’ needs. Due to parent consent and district research board requirements, data shared for this report was available only for students who received instruction during observations. Three students received the same intervention as Completer B’s students, at a lower level. This well-known curriculum is based on the science of reading, and its first level focuses heavily on phonemic awareness. As with Completer B, the publisher suggested that ten levels could be completed during the time frame of the study; however, due to interruptions to intervention time and student progress, only three were completed. No student was able to complete a unit to mastery on the first attempt (see Figure 13). Using this data and anecdotal notes on the intensifications he made during intervention sessions, Completer C was able to advocate for a different reading curriculum for his students. No data was collected because the new curriculum was implemented after the study; however, Completer C reports students are responding more favorably to the new intervention. Although the data does not speak directly to student growth, the skill this completer demonstrated in advocating for his students does evidence completer effectiveness and impact on student learning.

Development of math skills.

Only Completers A and C delivered math instruction. Both were able to evidence student growth as measured by CBM data or intervention-provided progress monitoring assessments. Completer A used a well-known evidence-based intervention developed through the University of Kansas Center for Research on Learning, where rigorous research is conducted on each product before it is released for publication; the center’s research team continues to study the products in the field to ensure practices and interventions are implemented with fidelity. Completer A was unable to share the data from this intervention due to privacy restrictions in the computer program used to collect it. The school had recently transferred to new CBM software, and neither Completer A nor the PI/observer was able to receive permission to view the data. Based on Completer A’s interview and students’ IEP goal mastery, student growth was positively impacted by the implementation of the intervention and Completer A’s instructional delivery.

Completer C also used a well-known math intervention designed using the concrete-representational-abstract (CRA) framework (Agrawal & Morin, 2016), in which students learn mathematical concepts by first developing understanding with manipulatives and real-life situations (concrete), moving to pictorial representations and drawings (representational), and then working through problems using a traditional algorithm (abstract). The CRA framework supports students in developing conceptual and procedural understanding at the same time. Three of the four students observed were able to meet intervention benchmarks evidencing growth. Student 1 struggled to master concepts, and Completer C supplemented instruction and eventually moved the student to another group. Data reported (see Figure 14) show scores on end-of-unit progress monitoring assessments. Three students were able to demonstrate mastery at 80% or higher. Student 2 struggled with one unit but still demonstrated an average of 78%, with four units receiving a mastery score of 80%. Student 3 mastered all unit assessments with 100%. Student 4 averaged 88%, scoring 100% on one unit, 90% on two, and 80% on one.

Using data to make instructional decisions.

Special education is different from general education in intensity, structure, collaboration, progress monitoring, and in some cases the curriculum itself. Special education is defined in IDEA as specially designed instruction, which means teachers have the responsibility to adapt the content, methodology, or delivery of instruction to meet a student’s needs (McLeskey et al., 2019). It is critical these adaptations be data driven (Deno, 2003). Effective special education teachers create specialized instruction that is intensive to meet individual students’ needs and spur their progress forward. However, to ensure the instruction remains appropriate for students with disabilities, special education teachers not only monitor student progress toward a goal but also track student response to instruction. They use the data gathered to evaluate instructional practices and then make needed changes to keep student growth as the focus of their work. Data is used for more than just tracking student performance. To increase instructional effectiveness, special education teachers also use data to evaluate their own teaching practices to adjust their actions and practice to improve student outcomes.

All participants were able to set target goals, identify trends, and plot both on a graph to compare and track student growth. Completer A was quicker to notice when a student appeared stagnant and had more ideas and strategies for adjusting his instruction. He initiated the conversation more readily and often stated his concerns in an effort to learn and improve his practice. Completers B and C responded well to observer questions and were able to discuss student performance, but they were more hesitant to make changes quickly. Both situations present positives and negatives for teaching practice. If a teacher makes changes too quickly, they may not have given the intervention or adaptation enough time to have an impact on learning. Conversely, waiting too long may stall student growth. Balancing the two by following sound practices of progress monitoring is key to improving student outcomes (Hosp et al., 2016). The standard practice is to collect seven data points to establish a trend, while at least three data points are needed before analyzing whether to change the intensity of an intervention (Wanzek et al., 2020). In the beginning of the year, Completer A was more likely to discuss changes after two data points indicating stable or regressed scores; Completer B would wait for five, while Completer C waited for seven, combining the two standards of instructional decision making. By mid-year and the end of the study, Completer A was analyzing data at three or four data points, Completer B remained steady at five, and Completer C would analyze data at four or five points.
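These decision rules (roughly seven data points to establish a trend, and at least three recent points falling below the goal line before intensifying an intervention) can be sketched in code. The sketch below is purely illustrative; it is not a tool the completers or the EPP used, and the function names, the linear goal line, and the "three recent points below the line" criterion are assumptions for illustration.

```python
# Illustrative sketch of CBM progress-monitoring decision rules.
# Assumptions (not from the report): a linear goal line from baseline to an
# end-of-year goal, and an "intensify" flag when the most recent three weekly
# scores all fall below that goal line.

def goal_line(baseline: float, goal: float, total_weeks: int, week: int) -> float:
    """Expected score at a given week, assuming linear growth toward the goal."""
    return baseline + (goal - baseline) * week / total_weeks

def trend_established(scores: list, min_points: int = 7) -> bool:
    """Standard practice: roughly seven data points before interpreting a trend."""
    return len(scores) >= min_points

def recommend_intensity_change(scores, baseline, goal, total_weeks, recent=3):
    """Flag an instructional change when the last `recent` weekly scores
    all fall below the goal line for their respective weeks."""
    if len(scores) < recent:
        return False
    last = list(enumerate(scores, start=1))[-recent:]
    return all(s < goal_line(baseline, goal, total_weeks, wk) for wk, s in last)

# Example: seven weekly oral reading fluency probes (words correct per minute),
# baseline of 40 and an end-of-year goal of 70 over 30 weeks.
scores = [42, 44, 43, 45, 44, 45, 46]
print(trend_established(scores))                                       # True
print(recommend_intensity_change(scores, baseline=40, goal=70, total_weeks=30))  # True
```

In this hypothetical example, enough probes exist to interpret a trend, and the last three scores sit just below the linear goal line, so the sketch would flag the student for intensified instruction, mirroring the kind of judgment the completers made in their data meetings.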

All participants also engaged in collaborative team planning and decision making. Completer A had biweekly data meetings with grade-level teams, including general education teachers. Completer B planned for co-teaching weekly and brought data into the planning meetings. Completer C engaged in monthly data meetings with special education teams and quarterly meetings with grade-level teams. Data discussed in these team planning meetings ranged from universal screening measures and class unit assessments to behavioral needs that may have arisen between team meetings.

Completer self-efficacy

When asked how confident they felt entering the classroom, all completers reported they were nervous but felt prepared at the beginning of the year. Following the first quarter of teaching, Completers A and C reported feeling more anxious and less confident, while Completer B remained confident in her own skill and less confident in the skill of her colleagues. At the end of the study, all could point to various successes that boosted their feelings of self-efficacy. Completer A reported being most pleased with his ability to deliver social skills lessons and help students self-regulate, reflect on their emotions, and isolate why they were feeling the way they did. His second area of confidence was literacy instruction. He attributed this to the program’s literacy content, which included two courses in reading methods, along with district professional development that reinforced what he learned at the EPP. He attributed the documented student growth to his ability to plan and deliver instruction targeted at student needs. He reported gratitude to the program for his knowledge and skill acquisition but gave a word of advice to future candidates: “Come in without any expectations, be open to learn, what you think you know; you don’t. No program can fully prepare you for everything, but you will learn where to find the tools you need.” This is important to note: no program can teach every strategy or evidence-based practice, but it can teach candidates how to be good consumers of research and how to locate more strategies as needed.

Completer B reported the most growth in her ability to identify students’ areas of need and develop meaningful goals for their IEPs. She reported that this was her weakest area at the beginning of the year, but she now feels confident and often feels empowered to correct her colleagues. This could be due to the extra time and effort she spent with the PI looking at student data and calculating measurable goals. She also reported high levels of confidence in her ability to design instruction, which is evidenced by growth in her mean scores on the SOAR lesson planning indicators: she increased from a mean of 0.727 on her first observation to 1.545 on her final observation. (The SOAR observation protocol uses a 3-point Likert-type scale of 0, 1, and 2 to measure performance.)

Completer C reported the most growth in his ability to orchestrate the work of the paraprofessionals in his classroom. When the year began, he felt anxious and even stated he was intimidated by the age difference between them: Completer C was in his early 20s and fresh out of his undergraduate experience, while his paraprofessionals, or instructional assistants, were all over 40. He reported it was like “telling my parents what to do.” As he became more comfortable in his role as teacher rather than student, he gained confidence in providing direction to his paraprofessionals. He also reported feeling more confident delivering tightly aligned lessons and adding practice opportunities for students. In the beginning of the year, Completer C was more likely to deliver a lesson straight from the lesson plan and not plan for, or demonstrate the ability to spontaneously add, more practice opportunities for students who needed to engage with the taught skill, strategy, or content. This was evident in his scores on the instructional delivery portion of the SOAR: he increased his ability to provide a sufficient variety of response strategies, to deliver instruction at a lively pace that maximized instructional time, and to increase opportunities to respond; he was also able to rephrase planned questions into more process-oriented prompts to assess student understanding.

Improving student confidence

Three students were interviewed. The interviews took place following observations. Students were selected from a convenience sample; all students were present for each instructional observation and were familiar with the observer. Student A received language arts and math instruction from Completer A, Student B received all language arts instruction from Completer B, and Student C received instruction from Completer C for language arts, math, and social skills.

Student A felt she had become a stronger reader but still did not like math. When asked if she felt she was doing better in math, she shrugged and said, “I guess so, I’m not failing.” Completer A had students complete graphs after progress monitoring to measure and view their own growth. She was most proud of her improving reading scores. Completer A used a well-known fluency intervention that times students after cold reads and again after reading with the teacher and independently; she often rushed through the reads to “beat” her last score. When asked what she liked best about learning and being in Completer A’s class, she reported that she was glad to be in a different class than her sister. Completer A provided instruction to both girls but knew this was distracting for Student A, so he arranged the schedule to teach the girls separately. Both girls were present for one observation, and the observer also noticed the change in engagement and overall attitude of Student A. She also shared that Completer A made learning fun. When asked for examples, she again referenced the reading intervention and the timer they used. She also shared that she still did not like to read in her general education classroom but would now read at home; she reported that she read to her pets. When the observer relayed this to Completer A, he said the parents had shared the same sentiment during the parent conference the night before, and that he was not entirely sure whether Student A felt that way or was just repeating what she had heard during the conference. Student A did report that she liked school this year more than last year, and again referenced the student self-monitoring graphs as evidence.

Student B was not very engaged in the interview; although he consented to talk with the observer, he admitted he was glad to miss a little bit of class to participate. When asked what he felt most confident about in school, he reported he felt happiest and most successful in physical education. He said he was fast and “can outrun everyone in my grade.” It is not uncommon for students with disabilities to select areas other than academics as areas of preference; they often seek opportunities where they outperform their peers in school and value these experiences most. The questioning was redirected to academic areas, primarily language arts, since this was the focus of the case study and Completer B’s area of teaching. He enjoyed that the class warm-up every day was to play Wordle, a computer-based game from the New York Times in which players manipulate letters to guess the five-letter daily mystery word. He did not enjoy reading or writing and wished he could skip those subjects. When asked about being in Completer B’s class, he reported she made being in the co-taught class “ok.” He went on to say that having Completer B in the class prevented “the other teacher from asking me questions I don’t know.” He reported he did not “mind” being in the resource class because he knew he needed the extra help; he just wished that Completer B would give him the answers like other teachers do. When asked how that helps him learn, he said, “I get the homework done faster.” The observer asked if completed homework was the same as learning; he shrugged and stated, “I guess so.” He did report that Completer B made him work harder than other teachers, but since she “protects” him in the general education class he can “deal with it.”

Student C was the easiest to interview, as he was often drawn to the observer and wanted to talk directly with the observer during every visit. He was excited to share about his school experience and talk about Completer C. When asked what he liked best about school, he reported the morning meeting, because he got to talk about hobbies and interests there. Completer C had redesigned the morning meeting to give more social opportunities and often built social skills lessons into this time. Student C always wanted to share events from home and talk about non-school-related things. Completer C shared that this was the case for almost half of the class, so instead of fighting for their attention he used the morning meeting as this outlet. Student C reported he liked coming to school and learning math. His favorite activities included “bears and blocks,” referring to the manipulatives Completer C used during math instruction. When asked why he liked the bears and blocks, he replied, “because it helps me see the numbers.”

Recommendations

The analysis not only investigated how our completers contribute to expected levels of student learning growth but also yielded recommendations regarding our educator preparation program. Through completer interviews and analysis of SOAR results, we found room for improvement. Given the data collected, we found evidence that completers were not as proficient with HLP 8/22: provide positive and constructive feedback to guide students' learning and behavior. This raises concerns, since this was identified as one of the five HLPs our completers should master in the program. The interviews also revealed that although we discussed this HLP and encouraged candidates in the program to use it, the candidates themselves did not receive enough feedback on their ability to apply this skill in real environments. They either were in field placements without faculty or had contrived experiences because the COVID-19 pandemic reduced their field experience opportunities. It should be noted that all three completers did improve by the end of the study, and all received a score of 2 on "provides feedback" for their final observation.

Overall, the completers demonstrated a positive impact on student growth. The data indicate that completers were effective in analyzing data to change their instructional practice as needed. Students all reported benefits their teachers provided them and, for the most part, enjoyed their teachers and could identify areas of their own growth that they attributed to the completer. Completers attributed their growth to the alignment of program coursework with the practices they experienced in their schools, completion of school-provided professional development, and the ability to collaboratively discuss data with at least one colleague.

As a result of this study, the program will continue to analyze completer feedback and the findings reported here to identify gaps in program coursework and field experiences and to make improvements. The program and other programs in the EPP have already begun to investigate areas where candidates can engage in more collaborative learning experiences to better simulate real-world practice. Although some content and knowledge is unique to individual programs, the EPP is investigating how to bring all licensure programs into alignment with the new state teacher standards. The introduction of these standards encourages more collaboration across programs, and the findings of this study will help further those discussions.

Based on the results of this study and the continuing requirements of CAEP component R4.1, it is also recommended that all programs identify a percentage of completer volunteers to follow into their first years of teaching. We believe the results of this study provide evidence of completer effectiveness, and that a continuation of these or similar actions, along with other completer data, would give the EPP and CAEP the necessary evidence for program evaluation and improvement. Without changes to state rule or CAEP requirements, case studies will remain one viable way to collect data that links student assessment data to teachers/completers.

References

Agrawal, J., & Morin, L. L. (2016). Evidence-based practices: Applications of concrete representational abstract framework across math concepts for students with mathematics disabilities. Learning Disabilities Research & Practice, 31(1), 34-44.

Archer, A. L., & Hughes, C. A. (2011). Explicit instruction: Effective and efficient teaching. Guilford Press.

Cook, B. G., & Odom, S. L. (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79(2), 135-144.

Deno, S. L. (2003). Developments in curriculum-based measurement. The Journal of Special Education, 37, 184-192.

Hosp, M. K., Hosp, J. L., & Howell, K. W. (2016). The ABCs of CBM: A practical guide to curriculum-based measurement. Guilford Publications.

McLeskey, J., Barringer, M-D., Billingsley, B., Brownell, M., Jackson, D., Kennedy, M., Lewis, T., Maheady, L., Rodriguez, J., Scheeler, M. C., Winn, J., & Ziegler, D. (2017, January). High-leverage practices in special education. Council for Exceptional Children & CEEDAR Center.

McLeskey, J., Billingsley, B., Brownell, M. T., Maheady, L., & Lewis, T. J. (2019). What are high-leverage practices for special education teachers and why are they important? Remedial and Special Education, 40(6), 331-337.

University of Oregon, Center on Teaching and Learning. (2019-2020). DIBELS 8th edition technical manual supplement. Eugene, OR: Author.

Wanzek, J., Al Otaiba, S., & McMaster, K. L. (2020). Intensive reading interventions for the elementary grades. New York: Guilford Press.