
AAQEP Accreditation 2022

Standard 4: Aspect D

The program is committed to and invests in strengthening and improving the education profession and the P-20 education system. Each program’s context (or multiple contexts) provides particular opportunities to engage the field’s shared challenges and to foster and support innovation. Engagement with critical issues is essential and must be contextualized. Sharing results of contextualized engagement and innovation supports the field’s collective effort to address education’s most pressing challenges through improvement and innovation.


The Multiple Subject Teacher Credential Program faculty meet monthly, and data are regularly prepared and presented so that faculty can discuss and reflect on the meaning and quality of the data and its application to practice and program decisions. Prior to 2017, program data were hard to come by: locked file cabinets held handwritten, paper copies of observations; Fresno Assessment of Student Teachers (FAST) data were accessible only through a single faculty member; and other program data were stored in a spreadsheet that no one could locate. Program faculty and leaders had limited access to any program-level, coach-level, mentor-teacher, or candidate data.

With the realization that data accessibility was key to any continuous improvement effort, Kremen leadership recognized the need for a comprehensive data management system to collect, store, and retrieve data regarding all aspects of the program. After meeting with sales representatives about multiple data management system products, we selected Tk20 based on its flexibility, comprehensiveness, and client services, which include an openness to tailored system builds based on specific program needs. Implementing a comprehensive data management system was difficult but worthwhile, and there were many lessons learned, especially regarding the emotional and psychological upheaval that change can produce. An area of support that is still needed is a full-time Tk20 administrator and data analyst to support not just the Multiple Subject Credential Program but the whole unit. Nonetheless, the result of our ongoing implementation of Tk20 is a flexible, tailored system that can continue to grow with the program over time. It enables the program to use real-time data to make meaningful improvements and stay at the cutting edge of the field.

Because data transparency and analysis had not been part of our program’s culture, it was a challenge to introduce the ideas of data collection, sharing, and transparency. A pivotal shift occurred when we moved from merely collecting data to examining that data together as a program and asking questions that could lead to program changes.

The incorporation of data into our conversations with district partners and internal program stakeholders has also increased over time. Having data points related to program content, outcomes, and stakeholder experiences has encouraged faculty to ask honest questions about the effectiveness of our program. One critical question, for example, is why so many of our candidates have a difficult time passing the Reading Instruction Competence Assessment (RICA). In the past, without data as evidence, it was easy for faculty to attribute low RICA pass rates to outside circumstances or to dismiss the issue as a rare occurrence. Faced with the data from our district partners and completers, however, faculty have pivoted to address this problem of practice through course revisions.

In addition to partner district data, we began to use available data from the CSU Educator Quality Center and to collect data internally to gain ongoing insights into our programs. In 2018 the Educator Quality Center provided tailored professional development for our program faculty on how to access data through the EdQ Data Dashboard. At first, faculty seemed unsure about the usefulness of the available data, but as they engaged in the professional development sessions they discovered that they could ask questions and access relevant data. The Dashboard provided a bird’s-eye view of our outcomes that many faculty had not previously considered.

Internal program-level data are now collected regularly, shared with faculty at program meetings, and used as a launching point for substantive discussions about the program. For example, we have collected data about candidate experiences through journey maps. These data provided the concrete evidence needed to persuade faculty and leadership that a more comprehensive, whole-person approach to educator preparation is needed: candidates reported struggling with stress and anxiety according to patterns that could be addressed programmatically. These findings prompted discussions of possible strategies for addressing candidate stress, including revising and aligning assignments and providing counseling services. Following our data discussions, faculty also felt more compelled to contribute to our program’s continuous improvement efforts. The first discussion of the journey map data created a palpable sense of energy among faculty members as they carried on with the rest of their day. One faculty member shared:

I have never seen information shared like this before. I often ask my students for their feedback. I can easily see some of them in this data. Yet, being able to see broad themes of perspectives across the whole program is powerful and is giving me much to reflect on further.    

Another faculty member commented on how they felt “troubled by seeing how our program changes are creating anxiety for our students. We know that many experience trauma due to out-of-program factors. We need to consider how we can better manage change so that we don’t cause more stress?” Using an improvement research framework gave us an opportunity to see the system from a broader range of student perspectives via inquiry cycles that are systematically designed and implemented across all credential pathways.

Rather than data being a “four-letter word,” it now pervades all aspects of the collective work of the university and district partners. This new collaborative approach is best demonstrated by the engagement of Fresno State faculty and Sanger Unified district leaders in an Improvement Science Fellowship from 2018 to 2020. In these rapid cycles of inquiry, university and district team members worked together to identify a problem of practice, collect relevant data, devise a plan of action, and assess the effectiveness of that action plan.

Central to the conversations between the improvement research fellowship team and our district partner was our willingness to be vulnerable. We shared that our program is part of both the problem and the solution; we are not looking to place blame on the district, nor do we expect all the change to happen on their end. We advocated looking inward to identify the areas of our processes and methods of teacher candidate preparation that require improvement. Through this collaboration, the Sanger residency has become stronger and more aligned to the district’s need for evidence-based instructional strategies for Emergent Bilingual students.

Engagement in Data Inquiry Cycles:
Rapid inquiry cycles were introduced gradually, reaching program faculty last. Inquiry was first incorporated into the program for teacher candidates. During the program redesign, a three-course inquiry series was introduced as a grounding point for synthesizing course content, evidence-based practices, theory-to-practice connections, and reflections on clinical work. Over the three inquiry courses, candidates engage in data literacy activities, scaffolded inquiries, team inquiries, individual inquiries, and facilitation of child-led inquiries. Inquiries are based on puzzles of practice that candidates experience in the field and provide opportunities for candidates to practice meaningful data collection, research, and action planning that can be applied immediately in their clinical placements. Candidates bring back artifacts and video recordings of their instruction to share, reflect, and receive feedback.

Next, the residency leadership team began engaging in rapid inquiry cycles as part of the grant-related continuous improvement work. This scaffolded series of learning sprints helped the EPP leadership establish routines of data collection, analysis, action planning, and assessment. Data and findings from these learning sprints were then gradually introduced to university coaches and faculty over the course of two semesters. District partner data and other internal data were layered onto these more formal learning sprints and presented for analysis and discussion at program meetings. From there, infrastructure in the form of faculty learning communities was developed to give faculty opportunities to work in teams on data-driven course development. Faculty were encouraged to formulate their own problems of practice, collect data, and revise coursework to address them.

Learning Sprint 1

Questions:
  • Who are Teacher Candidates (TCs) currently getting feedback from, and how often?
  • What is the nature and quality of the feedback TCs receive?
  • How does the feedback impact the TCs?

Findings:
  • 39.1% received written feedback once a month or less
  • 43.6% received fewer than 6 observations
  • 71.6% reported feedback aligned to TPEs
  • 56.9% reported consistent debriefing opportunities
  • 92.8% agreed feedback was connected to practice
  • 61.8% agreed that observations were high-stakes and summative

Action:
  • Title changed from university “supervisor” to “coach”
  • Data shared with coaches
  • Professional development provided to university coaches
  • Coaches engaged in discussion regarding formative coaching versus summative evaluation

Learning Sprint 2

Questions:
  • What is the nature and quality of the feedback TCs receive from the university coach?
  • How does the feedback impact the TCs?

Data Collected:
  • Document analysis of supervisor observation forms (N=1754)
  • TC focus groups (N=20)

Findings:
  • Extreme variance in the quality and quantity of feedback across coaches
  • 57% received fewer than 6 observations
  • 77.82% of feedback comments were specific and actionable, but a subset of coaches accounted for the majority of actionable comments
  • Disconnect between coursework and clinical practice
  • Immediate debriefing is a necessity
  • Differentiated instruction coursework needed earlier in the program

Action:
  • Data shared with coaches
  • Coaches engaged in professional development on taking scripted notes and providing actionable feedback
  • CREATe observation rubric introduced

Learning Sprint 3

Questions:
  • Is the Faculty Learning Community (FLC) format useful in sustaining continuous improvement efforts?

Data Collected:
  • Faculty survey (collaborative and individual)
  • Ongoing Reflective Feedback Tool

Findings:
  • Faculty goals included alignment of course content and assessments, enhancing fidelity across course sections, infusing culturally sustaining pedagogy, improving sequencing, and improving theory-to-practice connections
  • Co-construction with colleagues was important
  • Faculty reported increased collaboration

Action:
  • Provided infrastructure for 7 voluntary FLCs based on course topics
  • FLCs met as small groups (5x) and as a whole group (3x)
  • Faculty participants included 4 tenured, 10 tenure-track, and 4 part-time faculty

Learning Sprint 4

Questions:
  • How can faculty use formative data for collaborative continuous improvement?

Findings:
  • Faculty developed inquiry mindsets
  • Faculty gained practice with concrete steps for engaging in meaningful inquiry
  • Faculty recognized the importance of triangulating multiple measures

Action:
  • Formative data shared with faculty in whole-group FLCs
  • Faculty reflected on how the data from TCs could inform continuous improvement of their courses
  • Data sharing and reflection routinized in EPP meetings

Learning Sprint 5 (Improvement Research Fellowship)

Questions:
  • How often were the university coaches assessing each Sanger resident on the CREATe Rubric?
  • Which CREATe items are assessed least, and why?

Findings:
  • 21 residents were assessed 265 times across all 14 items of the CREATe Rubric
  • Items 5, 9, and 13 were assessed least by the university coaches
  • Two dominant themes emerged from the empathy interviews on why supporting Emergent Bilingual students was discussed least with residents (CREATe Rubric Item 9): a Transference Gap and a University Coach Knowledge/Confidence Gap

Action:
  • In partnership with Sanger Unified, we decided to focus our improvement aim on supporting Emergent Bilingual students
  • Coaches invited to participate in district-led professional development alongside residents

Learning Sprint 6 (Improvement Research Fellowship)

Questions:
  • What is the quality and quantity of feedback related to instructional planning and practice for supporting Emergent Bilingual students (CREATe Rubric Item 9)?

Findings:
  • Coaches had the misperception that Sanger Unified residents already knew EB instructional strategies (89% of TCs reported they are unfamiliar with, or have only a general understanding of, the ELD Standards)
  • Initially, Item 9 on the CREATe rubric was formatively assessed 0 times
  • After the district-led professional development, coaches (N=3) assessed Item 9 46 times

Action:
  • Faculty-in-Residence offered follow-up professional development to coaches, focused specifically on Sanger Unified’s Top 10 Instructional Strategies for English Language Development

Learning Sprint 7 (Improvement Research Fellowship)

Questions:
  • Are the needs of Emergent Bilingual students reflected in the residents’ lesson plans?
  • Are the ELD standards present on the lesson plans?

Findings:
  • 52% of the residents’ lesson plans mentioned the use of the Top 10 ELD strategies
  • However, only 23.8% of the residents’ lesson plans included an ELD/ELA standard

Action:
  • Asked 3 residents to add a prompt of their choice to their lesson plans to remind themselves to include an ELD/ELA standard and to plan more intentionally for Emergent Bilingual students

Learning Sprint 8 (Improvement Research Fellowship)

Questions:
  • Are the ELD standards present on the lesson plans?
  • Are specific ELD strategies incorporated into the lesson plans?
  • How do the residents feel about the prompt? In what ways did they find the prompt helpful, or not, in their planning?

Findings:
  • 100% of the residents’ lesson plans included ELD/ELA standards
  • Only one resident included more than four of the Top 10 ELD strategies in their lesson plan

Action:
  • Added a specific prompt, developed by the residents, to the lesson plan to remind them to plan intentionally for Emergent Bilingual students

Data Source 1: CSU EdQ Program Completer Survey/One-Year-Out/Employer

Data Source:
The content of the CSU EdQ Survey is aligned to the Teaching Performance Expectations (TPEs). Multiple Subject Credential Program completers are routed directly to the CSU EdQ Survey when they apply for their preliminary credential. The survey is optional; a completer may elect not to respond once the survey is opened, as the survey then directs them back to pay for their credential.

Type of Data:
The EdQ Program Completer survey measures program performance in relation to the following Teaching Performance Expectations (TPEs): 

  • Engaging and supporting all students in learning
  • Creating and maintaining effective environments for student learning
  • Understanding and organizing subject matter for student learning
  • Planning instruction and designing learning experiences for all students
  • Assessing student learning
  • Developing as a professional educator

How is data used to inform program practices:
The data are used as a tool to provide an annual snapshot of progression in each of these areas as well as to highlight areas of strength and weakness.

Data Source 2: CTC Program Completer Survey

Data Source: 
The CTC Program Completer Survey is aligned to the California Standards for the Teaching Profession (CSTP) and has a 70% response rate. It measures program completers’ perceptions of program quality and preparedness to teach, as well as multiple unique items (e.g., the length of time it took to complete the program, perceptions of preparedness to teach diverse learners, and the use of technology for instruction). Multiple Subject Credential Program completers are routed directly to the CTC Survey when they apply for their preliminary credential. The survey is optional; a completer may elect not to respond once the survey is opened, as the survey then directs them back to pay for their credential.

Type of Data: Self-reporting Questionnaire

Of the 31 self-report survey questions, items 1-19 focus on the California Teaching Performance Expectations, items 20-22 focus on content, and items 23-31 focus on field work and clinical practice.

How is data used to inform program practices:  
The survey responses are used as a tool to provide an annual snapshot of progression in each of the areas as well as to highlight completers’ perceptions of program areas of strength and weakness. Data is triangulated with the CSU EdQ survey data and additional program-level data that is collected qualitatively.

Data Source 3: CCTC Employer Survey

Data Source: 
The California Commission on Teacher Credentialing (CCTC) administers the Employer Survey annually from October 1 to December 31. Employers who have hired at least 2 completers from the same institution over the past 3-5 years are asked to complete the survey to provide information about their perception of how well new teachers were prepared by their preparation programs. The survey items align with the California Teaching Performance Expectations (TPE) and the California Standards for the Teaching Profession (CSTP). Since both sets of standards inform the knowledge and skills embedded in the Multiple Subject Credential Program, this survey gathers valuable information our program can use to reflect on the strengths and weaknesses of the preparation program from employers’ perspectives. Moreover, the survey is intended to be completed by the individual who has seen the new teachers teach, which also gives our program insight into how our completers are doing 3-5 years out.

Type of Data: Self-reporting Questionnaire

The survey consists of 7 domains. Of the 14 self-report survey questions:

  • Items 1-4 focus on CSTP Engaging and Supporting All Students in Learning
  • Items 5-6 focus on CSTP Creating and Maintaining Effective Environments for Student Learning
  • Items 7-8 focus on CSTP Understanding and Organizing Subject Matter for Student Learning
  • Items 9-10 focus on CSTP Planning Instruction and Designing Learning Experiences for All Students
  • Items 11-12 focus on CSTP Assessing Students for Learning
  • Items 13-14 focus on CSTP Developing as a Professional Educator

How is data used to inform program practices:  
The survey responses are used as a tool to provide an annual snapshot of progression in each of the CSTP domains as well as to highlight employer perceptions that we can use to assess our program’s areas of strength and weakness. Additionally, these data have been reviewed with our largest district partner, which employs the majority of our graduates. Future plans include working with our district partners and MA in Educational Leadership colleagues to identify a time-efficient way to collect focus group interview data from the principals who hire our program completers.

Data Source 4: Formative & Summative New Teacher Evaluation

Data Source: 
Fresno Unified School District provides us with the Formative & Summative New Teacher Evaluation of their new hires. This data source is recognized as a desirable way to measure improvement of instruction, to identify skills and abilities that contribute to the success of the educational program, and to redirect skills and abilities that do not result in optimum student growth.

Type of Data: Observational Data

Each item in the rubric is aligned with the California Standards for the Teaching Profession (CSTP).

How is data used to inform program practices:  
This data was recently provided to our program after a year of planning with our largest local district partner. It has not yet been brought back to program faculty and university coaches for a data dialogue session; that session is planned for Spring 2022.
