Revised CAEP Standards Alignment - Coastal Carolina University
In This Section

Revised CAEP Standards Alignment

Data Review by Revised CAEP Standards

Coastal Carolina University has aligned its current assessment instruments to the revised CAEP Standards. The following sections provide information regarding candidate performance as it relates to the four subsections of revised CAEP Standard 1: Content and Pedagogical Knowledge.

R1.1 The Learner and Learning

Subsection R1.1 of the revised CAEP Standards focuses on the learner and learning and is aligned with InTASC Standards 1-3. Indicators from the Conceptual Framework Rubric, the ADEPT/SCTS 4.0 Rubric, SCOE Lesson Plan Rubric, and Teacher Work Sample (TWS) Rubric were categorized based on their alignment with the InTASC Standards. It should be noted that the EPP identified the Conceptual Framework Rubric, SCOE Lesson Plan, and TWS indicators aligned with the InTASC Standards, and the South Carolina Department of Education provided the alignment between the InTASC Standards and ADEPT/SCTS 4.0 Rubric: https://ed.sc.gov/sites/scdoe/assets/File/educators/teacher-evaluations/NIET%20InTASC%20Crosswalk%20(2).pdf. Excluding the SCOE Lesson Plan Rubric, which is completed in the semester prior to internship, each of these key assessments is completed during the internship semester.

For InTASC Standard 1 (Learner Development), candidates performed well on all indicators across each instrument, although weaker mean scores were noted on the TWS Indicator 2.3: Learning Goals: Appropriateness for Students (Fall – 2.97, Spring – 2.98). Areas of strength for candidates included their knowledge of the different needs of students (Conceptual Framework Indicator 3.1) and their abilities to incorporate student interests and culture into their lessons (ADEPT/SCTS 4.0 Rubric Indicator 10.2). Additionally, candidates demonstrated strength in their abilities to manage behavior and create high expectations for their students.

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities in attending to learner development and understanding how learner development impacts both lesson planning and classroom management. The use of multiple data measures, as well as multiple indicators, provides evidence that candidates can do this consistently. Candidate scores were stable from one data collection cycle to the next, indicating that those evaluating the candidates are consistent in their scoring, and that program practices addressing InTASC Standard 1 are currently working.

View the Standard 1 (Learner Development) results here: Table 1.


InTASC Standard 2 (Learner Differences) is an area where candidates need further support, as candidate scores on indicators related to this standard were lower than those for the other nine InTASC Standards. However, while lower than other performance areas, candidates still met the required mean scores across each of the EPP’s assessment instruments.

For indicators related to InTASC Standard 2, candidates demonstrated strength in their abilities to Recognize Learner Misconceptions (SCOE Lesson Plan Indicator 8), their knowledge of the classroom, school, and community (TWS Indicator 1.1), and their knowledge of their students (TWS Indicator 1.2). This evidence supports the idea that candidates know the students that they are working with and can identify their strengths and weaknesses.  

One area where candidates need further support is in recognizing how to differentiate their instruction to address learner differences. While candidates know their students, as noted in the evidence provided above, they struggle to adjust their instruction to meet the learners’ needs. Indicators from the SCOE Lesson Plan (9. The teacher understands and identifies differences in approaches to learning and performance and knows how to design instruction that uses each learner's strengths to promote growth) and the TWS (Knowledge of Students’ Varied Approaches to Learning, Adaptations Based on the Individual Needs of Students, and Use of Contextual Information and Data to Select Appropriate and Relevant Activities, Assignments, and Resources) identify this as an area for further support, as these are the areas with the lowest mean scores.

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities in understanding learning differences, but weaker abilities in knowing how to manage those differences in their instruction. Indicator scores were consistent from Fall 2020 to Spring 2021, and faculty have identified this as an area of focus for the 2021-2022 academic year. Each program is working on methods to increase candidate performance as it relates to differentiating instruction; please see the Elementary Education program plan and Special Education Plan as examples.

View the results for Standard 2 (Learner Differences) here: Table 2.

 
For InTASC Standard 3 (Learning Environments), candidates performed well on all indicators across each instrument, although weaker mean scores were noted on the TWS Indicator 4.5: Use of Contextual Information and Data to Select Appropriate and Relevant Activities, Assignments, and Resources (Fall – 2.81, Spring – 2.76). This weakness was expected, given that the same indicator was identified as an area for growth under InTASC Standard 2. Areas of strength for candidates included their abilities to manage behavior (Conceptual Framework Indicator 1.4, SCTS 4.0 Rubric Indicators 17.3 and 17.4) and their abilities to create a respectful culture in their classroom (SCTS 4.0 Rubric Indicators 19.2 and 19.3).

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities in developing productive learning environments and managing the classroom. The use of multiple data measures, as well as multiple indicators, provides evidence that candidates can do this consistently.  Candidate scores were stable from one data collection cycle to the next, indicating that those evaluating the candidates are consistent in their scoring, and that program practices addressing InTASC Standard 3 are currently working.

View the results for Standard 3 (Learning Environments) here: Table 3.

R1.2 Content Knowledge 

Subsection R1.2 of the revised CAEP Standards focuses on content knowledge and is aligned with InTASC Standards 4-5. Indicators from the Conceptual Framework Rubric, the ADEPT/SCTS 4.0 Rubric, SCOE Lesson Plan Rubric, and Teacher Work Sample (TWS) Rubric were categorized based on their alignment with the InTASC Standards. It should be noted that the EPP identified the Conceptual Framework Rubric, SCOE Lesson Plan, and TWS indicators aligned with the InTASC Standards, and the South Carolina Department of Education provided the alignment between the InTASC Standards and ADEPT/SCTS 4.0 Rubric: https://ed.sc.gov/sites/scdoe/assets/File/educators/teacher-evaluations/NIET%20InTASC%20Crosswalk%20(2).pdf. Excluding the SCOE Lesson Plan Rubric, which is done in the semester prior to internship, each of these key assessments is completed during the internship semester. 

For InTASC Standard 4 (Content Knowledge), candidates performed well on all indicators across each instrument, earning mean scores of 3.0 or higher on nearly every indicator related to content knowledge across all four instruments. Areas of strength for candidates included their knowledge of the subject matter (Conceptual Framework Rubric Indicator 1.1; SCTS 4.0 Rubric Indicators 9.1, 9.3) and how to align their learning goals and standards with the subject (SCTS 4.0 Rubric Indicators 1.1, 1.2, 1.3; SCOE Lesson Plan Rubric Indicator 2; TWS Indicator 3.0). One area where candidates must continue to refine their skills is in demonstrating accurate representation of the content (TWS Indicator 4.2; Fall – 2.88 and Spring – 2.97). While candidates can demonstrate their content knowledge, they sometimes struggle with knowing the answers to questions that students may have that go beyond the textbook.  

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities in demonstrating their content knowledge and using that content knowledge to create lessons for their students. The use of multiple data measures, as well as multiple indicators, provides evidence that candidates can do this consistently.  Candidate scores were stable from one data collection cycle to the next, indicating that those evaluating the candidates are consistent in their scoring. 

View the results for Standard 4 (Content Knowledge) here: R1.2 Table 1.

InTASC Standard 5 (Application of Content) addresses candidates’ abilities to apply their content knowledge. Candidates performed well on all indicators across each instrument, earning mean scores of 3.0 or higher on nearly every indicator related to content knowledge across all four instruments. Areas of strength for candidates included their presentation of the content (ADEPT/SCTS 4.0 Rubric Indicator 3) and their understandings of how to use the content to create meaningful learning experiences for students (Conceptual Framework Indicators 1.1, 1.2, 1.3). One area where candidates must continue to refine their skills is in lesson structure, which is typically an area of challenge for new teachers as they learn how to allocate time for instruction and reflection (ADEPT/SCTS 4.0 Rubric Indicator 4.2). Additionally, candidates must continue to develop their skills in using interdisciplinary themes to create real-world experiences (SCOE Lesson Plan Indicator 13). This was an area where candidates saw a significant drop in mean score across the two data cycles (3.11 to 2.80). However, it should be noted that some of this change may be attributed to contextual circumstances surrounding COVID, which impacted the ways in which candidates were able to adjust instruction (or, in many instances, required them to follow detailed, scripted plans to ensure that students could catch up on basic skills lost due to school closures). The EPP will continue to monitor this area for further support.

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities in utilizing their content knowledge to create meaningful learning experiences for their students. The use of multiple data measures, as well as multiple indicators, provides evidence that candidates can do this consistently.   

View the results for Standard 5 here: R1.2 Table 2.

R1.3 Instructional Practice

Subsection R1.3 of the revised CAEP Standards focuses on instructional practices and is aligned with InTASC Standards 6-8. Indicators from the Conceptual Framework Rubric, the ADEPT/SCTS 4.0 Rubric, SCOE Lesson Plan Rubric, and Teacher Work Sample (TWS) Rubric were categorized based on their alignment with the InTASC Standards. It should be noted that the EPP identified the Conceptual Framework Rubric, SCOE Lesson Plan, and TWS indicators aligned with the InTASC Standards, and the South Carolina Department of Education provided the alignment between the InTASC Standards and ADEPT/SCTS 4.0 Rubric: https://ed.sc.gov/sites/scdoe/assets/File/educators/teacher-evaluations/NIET%20InTASC%20Crosswalk%20(2).pdf. Excluding the SCOE Lesson Plan Rubric, which is done in the semester prior to internship, each of these key assessments is completed during the internship semester.  

For InTASC Standard 6 (Assessment), candidates had varied performance ratings across each instrument. Candidates grew in their abilities to use technology to facilitate assessment practices; in the semester prior to internship, candidates earned a mean score of 3.04 (Fall 2020)/2.88 (Spring 2021) on the SCOE Lesson Plan Rubric for Indicator 12, and a mean score of 3.57 (Fall 2020)/3.38 (Spring 2021) on Conceptual Framework Rubric Indicator 2.2 during internship. Similarly, candidates demonstrated growth in their use of both formal and informal assessment strategies, earning lower mean scores (SCOE Lesson Plan Indicator 6) in the semester prior to internship, and higher mean scores (Conceptual Framework Indicator 1.5) at the end of internship. Candidates demonstrated strengths in their abilities to provide academic feedback to students (ADEPT/SCTS 4.0 Rubric Indicators 7.1, 7.3). An area where further support is needed is in candidates’ abilities to make adaptations based on the individual needs of students (TWS Indicator 3.5), which aligns with weaknesses identified under CAEP Standard R1.1. Finally, candidates need further support in ensuring that assessment items are technically sound (TWS Indicator 3.4, Mean Scores of 2.76 and 2.80).  

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities in their use of assessment practices in instruction. The use of multiple data measures, as well as multiple indicators, provides evidence that by the time candidates complete their internship semester, their scores have increased or remained consistently strong throughout the assessment period.  Candidate scores were stable from one data collection cycle to the next, indicating that those evaluating the candidates are consistent in their scoring.  

View the InTASC Standard 6 (Assessment) results here: R1.3 Table 1.

For InTASC Standard 7 (Planning Instruction), candidates were consistently strong in their abilities to plan instruction and activities that integrated technology, as well as their abilities to select appropriate student work. An area where further support is needed is in candidates’ abilities to make adaptations based on the individual needs of students (TWS Indicator 3.5) and identify resources to meet the needs of different students (SCOE Lesson Plan Indicators 9, 11), which aligns with weaknesses identified under CAEP Standard R1.1.  

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities to plan instruction appropriately. The use of multiple data measures, as well as multiple indicators, provides evidence that by the time candidates complete their internship semester, their scores have increased or remained consistently strong throughout the assessment period.

View the InTASC Standard 7 (Planning Instruction) results here: R1.3 Table 2.

For InTASC Standard 8 (Instructional Strategies), candidates performed similarly to InTASC Standard 7, with strengths in their abilities to utilize a variety of instructional strategies, to plan environments supported by technology, and to engage students in problem-solving and making connections.  

When examined as a whole, candidates from the EPP’s programs demonstrate clear abilities to use appropriate instructional strategies. The use of multiple data measures, as well as multiple indicators, provides evidence that by the time candidates complete their internship semester, their scores have increased or remained consistently strong throughout the assessment period.

The results for InTASC Standard 8 (Instructional Strategies) are located here: R1.3 Table 3.

R1.4 Professional Responsibility

Subsection R1.4 of the revised CAEP Standards focuses on professional responsibility and is aligned with InTASC Standards 9-10. Indicators from the Assessment of Candidate Dispositions, the Conceptual Framework Rubric, the SCTS 4.0 (ADEPT) Rubric, and Teacher Work Sample (TWS) Rubric were categorized based on their alignment with the InTASC Standards. It should be noted that the EPP identified the Conceptual Framework Rubric and TWS indicators aligned with the InTASC Standards, and the South Carolina Department of Education provided the alignment between the InTASC Standards and SCTS 4.0 Rubric: https://ed.sc.gov/sites/scdoe/assets/File/educators/teacher-evaluations/NIET%20InTASC%20Crosswalk%20(2).pdf. Each of these key assessments is completed during the internship semester; in addition, the Assessment of Candidate Dispositions is completed three times to measure growth: twice prior to internship and once during internship.

Overall, candidates performed strongly on most indicators associated with InTASC Standard 9, Professional Learning and Ethical Practice. Areas of strength included candidates’ promptness, preparation, and participation in professional development (ADEPT/SCTS 4.0 Professionalism Standard 1, mean scores of 3.57 (Fall 2020) and 3.71 (Spring 2021)). Additionally, candidates demonstrated strength in their abilities to assess the effectiveness of their lessons (ADEPT/SCTS 4.0 Professionalism Standard 5, mean scores of 3.59 (Fall 2020) and 3.61 (Spring 2021)) (See Table 1).

One area where candidates need further support is in their abilities to reflect on their students’ learning, their teaching, and the implications for professional development (TWS Indicator 7.5). Candidates earned their lowest mean score both semesters in this area (Fall 2020 – 2.59; Spring 2021 – 2.40) (See Table 1). Although candidates do a fine job of reflecting on the effectiveness of their lessons, noted as a strength, they struggle to determine what types of professional development to engage in to address their weaknesses. Part of this is due to a lack of exposure to professional development beyond what is provided by the institution and the partnering schools; the EPP must be more diligent about sharing alternative professional development resources and opportunities with candidates so that they know where to begin.

As noted in the Measure 3 narrative, candidates are expected to earn a mean score of 1 or higher by the end of the internship semester on the Assessment of Candidate Dispositions. The rubric, which is aligned to the InTASC Standards and focuses on personal and professional responsibility, is located here: Professional Dispositions Initial Licensure 21SP - Questionnaire.

Candidate data are provided based on the three assessment points during the program. Overall, candidates across all programs demonstrated growth in their enactment of the professional dispositions when looking across the three data collection points for Fall 2020 and Spring 2021 completers. This growth was to be expected, particularly as candidates spend more time in the classroom and take over instructional planning as they progress through the program.

View the Assessment of Candidate Dispositions results here: Fall 2020 and Spring 2021.

The results for Standards 9 and 10 are located here: Standards 9 and 10.