AAQEP Accreditation 2022
Standard 4: Aspect D
The program is committed to and invests in strengthening and improving the education profession and the P-20 education system. Each program’s context (or multiple contexts) provides particular opportunities to engage the field’s shared challenges and to foster and support innovation. Engagement with critical issues is essential and must be contextualized. Sharing results of contextualized engagement and innovation supports the field’s collective effort to address education’s most pressing challenges through improvement and innovation.
The Multiple Subject Teacher Credential Program meets monthly, and data is regularly prepared and presented to program faculty members for discussion and reflection on its meaning, its quality, and its application to practice and program decisions. Prior to 2017, program data was hard to come by: locked file cabinets held handwritten paper copies of observations; Fresno Assessment of Student Teachers (FAST) data was accessible only through a single faculty member; and other program data was stored in a spreadsheet whose location no one knew. Program faculty and leaders had limited access to any program-level, coach-level, mentor-teacher, or candidate data.
With the realization that data accessibility was key to any continuous improvement effort, Kremen leadership recognized the need for a comprehensive data management system to collect, store, and retrieve data on all aspects of the program. After meeting with sales representatives about multiple data management system products, we selected Tk20 based on its flexibility, its comprehensiveness, and its client services, which include an openness to tailored system builds based on specific program needs. Implementing a comprehensive data management system was difficult but worthwhile, and there were many lessons learned, especially regarding the emotional and psychological upheaval that change can produce. One area of support still needed is a full-time Tk20 administrator and data analyst to serve not just the Multiple Subject Credential Program but the whole unit. Nonetheless, the result of our ongoing implementation of Tk20 is a flexible, tailored system that can continue to grow with the program over time. It enables the program to use real-time data to make meaningful improvements and stay at the cutting edge of the field.
Because data transparency and analysis had not been part of our program’s culture, introducing the ideas of data collection, sharing, and transparency was a challenge. A pivotal shift occurred when we moved from merely collecting data to examining that data together as a program and asking questions that might lead to data-informed program changes.
The incorporation of data into our conversations with district partners and internal program stakeholders has also increased over time. Having data points related to program content, outcomes, and stakeholder experiences has encouraged faculty to ask honest questions about the effectiveness of our program. One critical question, for example, is why our candidates overwhelmingly have a difficult time passing the Reading Instruction Competence Assessment (RICA). In the past, without data as evidence, it was easy for faculty to attribute low RICA success to outside circumstances or to dismiss the issue as a rare occurrence. However, faced with the data from our district partners and completers, faculty have pivoted to address this problem of practice through course revisions.
In addition to partner district data, we began to use available data from the CSU Educator Quality Center and to collect data internally to gain ongoing insights into our programs. In 2018 the Educator Quality Center provided tailored professional development for our program faculty on how to access data through the EdQ Data Dashboard. At first, faculty seemed unsure about the usefulness of the available data, but as they engaged in the professional development sessions, they discovered that they could ask questions and access relevant data. The Dashboard provided a bird’s-eye view of our outcomes that many faculty had not previously considered.
Internal program-level data is now collected regularly, shared with faculty at program meetings, and used as a launching point for substantive discussions about the program. For example, we have collected data about candidate experiences through journey maps. This data provided the concrete evidence needed to persuade faculty and leadership that a more comprehensive, whole-person approach to educator preparation is needed: candidates reported struggling with stress and anxiety in patterns that could be addressed programmatically. These findings prompted discussions of possible strategies for addressing candidate stress, including revising and aligning assignments and providing counseling services. Following our data discussions, faculty also felt more compelled to contribute to the program’s continuous improvement efforts. The first discussion of the journey map data created a palpable sense of energy among faculty members as they carried on with the rest of their day. One faculty member shared:
I have never seen information shared like this before. I often ask my students for their feedback. I can easily see some of them in this data. Yet, being able to see broad themes of perspectives across the whole program is powerful and is giving me much to reflect on further.
Another faculty member commented on feeling “troubled by seeing how our program changes are creating anxiety for our students. We know that many experience trauma due to out-of-program factors. We need to consider how we can better manage change so that we don’t cause more stress.” Using an improvement research framework gave us an opportunity to see the system from a broader range of student perspectives via inquiry cycles that are systematically designed and implemented across all credential pathways.
Rather than data being a “four-letter word,” it now pervades all aspects of the collective work by the university and district partners. This new collaborative approach is best demonstrated by the engagement of Fresno State faculty and Sanger Unified district leaders in an Improvement Science Fellowship from 2018 to 2020. In these rapid cycles of inquiry, university and district team members worked together to identify a problem of practice, collect relevant data, devise a plan of action, and assess the effectiveness of that plan.
Central to the conversations between the improvement research fellowship team and our district partner was our willingness to be vulnerable. We acknowledged that our program is part of both the problem and the solution; we were not looking to place blame on the district, nor did we expect all the change to happen on their end. We advocated looking inward to identify the areas of our own processes and methods of teacher candidate preparation that require improvement. Through this collaboration, the Sanger residency has become stronger and more aligned with the district’s need for evidence-based instructional strategies for Emergent Bilingual students.
Engagement in Data Inquiry Cycles:
Rapid inquiry cycles were introduced gradually, reaching program faculty last. Inquiry was first incorporated into the program for teacher candidates. During the program redesign, a three-course inquiry series was introduced as a grounding point for synthesizing course content, evidence-based practices, theory-to-practice connections, and reflections on clinical work. Over the three inquiry courses, candidates engage in data literacy activities, scaffolded inquiries, team inquiries, individual inquiries, and facilitation of child-led inquiries. Inquiries are based on puzzles of practice that candidates experience in the field and provide opportunities for candidates to practice meaningful data collection, research, and action planning that can be immediately applied in their clinical placements. Candidates bring back artifacts and video recordings of their instruction to share, reflect, and receive feedback.
Next, the residency leadership team began engaging in rapid inquiry cycles as part of the grant-related continuous improvement work. This scaffolded series of learning sprints helped establish routines of data collection, analysis, action planning, and assessing for the EPP leadership. Data and findings from these learning sprints were then gradually introduced to university coaches and faculty over the course of two semesters. District partner data and other internal data were layered onto these more formal learning sprints and presented for analysis and discussion at program meetings. From there, infrastructure in the form of faculty learning communities was developed to provide faculty opportunities to work in teams on data-driven course development. Faculty were encouraged to formulate their own problems of practice and collect data and revise coursework to address them.
- Learning Sprint 1
- Learning Sprint 2
- Learning Sprint 3
- Learning Sprint 4
- Learning Sprint 5 (Improvement Research Fellowship)
- Learning Sprint 6 (Improvement Research Fellowship)
- Learning Sprint 7 (Improvement Research Fellowship)
- Learning Sprint 8 (Improvement Research Fellowship)
Data Source 1: CSU EdQ Program Completer Survey/One-Year-Out/Employer
The content of the CSU EdQ Survey is aligned to the Teaching Performance Expectations (TPE). Multiple Subject Credential Program completers are routed directly to the CSU EdQ Survey when applying for their preliminary credential. The survey is optional; a completer may elect not to respond once the survey is opened, as it then directs them back to pay for their credential.
Type of Data:
The EdQ Program Completer survey measures program performance in relation to the following Teaching Performance Expectations (TPEs):
- Engaging and supporting all students in learning
- Creating and maintaining effective environments for student learning
- Understanding and organizing subject matter for student learning
- Planning instruction and designing learning experiences for all students
- Assessing student learning
- Developing as a professional educator
How is data used to inform program practices:
The data are used as a tool to provide an annual snapshot of progression in each of the areas as well as to highlight areas of strength and weakness.
Data Source 2: CTC Program Completer Survey
The CTC Program Completer Survey is aligned to the California Standards for the Teaching Profession (CSTP) and has a 70% response rate. It measures program completers’ perceptions of program quality and preparedness to teach, as well as several unique items (e.g., the length of time it took to complete the program, perceptions of preparedness to teach diverse learners, and the use of technology for instruction). Multiple Subject Credential Program completers are routed directly to the CTC Survey when applying for their preliminary credential. The survey is optional; a completer may elect not to respond once the survey is opened, as it then directs them back to pay for their credential.
Type of Data: Self-reporting Questionnaire
Of the 31 self-reporting survey questions, items 1-19 focus on the California Teaching Performance Expectations, items 20-22 focus on content, and items 23-31 focus on field work and clinical practice.
How is data used to inform program practices:
The survey responses are used as a tool to provide an annual snapshot of progression in each of the areas as well as to highlight completers’ perceptions of program areas of strength and weakness. Data is triangulated with the CSU EdQ survey data and additional program-level data that is collected qualitatively.
Data Source 3: CCTC Employer Survey
The California Commission on Teacher Credentialing (CCTC) administers the Employer Survey annually from October 1 to December 31. Employers who have hired at least 2 completers from the same institution over the past 3-5 years are asked to complete the survey to provide information about their perception of how well new teachers were prepared by their preparation programs. The survey items align with the California Teaching Performance Expectations (TPE) and the California Standards for the Teaching Profession (CSTP). Since both sets of standards inform the knowledge and skills embedded in the Multiple Subject Credential Program, this survey gathers valuable information for our program to reflect on regarding the strengths and weaknesses of the preparation program from employers’ perspectives. Moreover, the survey is intended to be completed by the individual who has seen the new teachers teach, which also gives our program insight into how our completers are doing 3-5 years out.
Type of Data: Self-reporting Questionnaire
The survey consists of 7 domains. Of the 14 self-reporting survey questions:
- Items 1-4 focus on CSTP Engaging and Supporting All Students in Learning
- Items 5-6 focus on CSTP Creating and Maintaining Effective Environments for Student Learning
- Items 7-8 focus on CSTP Understanding and Organizing Subject Matter for Student Learning
- Items 9-10 focus on CSTP Planning Instruction and Designing Learning Experiences for All Students
- Items 11-12 focus on CSTP Assessing Students for Learning
- Items 13-14 focus on CSTP Developing as a Professional Educator
How is data used to inform program practices:
The survey responses are used as a tool to provide an annual snapshot of progression in each of the CSTP as well as to highlight employer perceptions that we can use to assess our program’s areas of strength and weakness. Additionally, these data have been reviewed with our largest district partner, which employs the majority of our graduates. Future plans include working with our district partners and MA in Educational Leadership colleagues to identify a time-efficient way to collect focus group interview data from the principals who hire our program completers.
Data Source 4: Formative & Summative New Teacher Evaluation
Fresno Unified School District provides us with the Formative & Summative New Teacher Evaluation of its new hires. This data source is recognized as a desirable way to measure improvement of instruction, to identify skills and abilities that contribute to the success of the educational program, and to redirect skills and abilities that do not result in optimum student growth.
Type of Data: Observational Data
Each item in the rubric is aligned with the California Standards for the Teaching Profession (CSTP).
How is data used to inform program practices:
This data was recently provided to our program after a year of planning with our largest local district partner. It has not yet been brought to program faculty and university coaches for a data dialogue session; one is planned for Spring 2022.