CCTC Accreditation 2022
Standard 4
Continuous Improvement
The education unit develops and implements a comprehensive continuous improvement process at both the unit level and within each of its programs that identifies program and unit effectiveness and makes appropriate modifications based on findings.
- The education unit and its programs regularly assess their effectiveness in relation to the course of study offered, fieldwork and clinical practice, and support services for candidates.
- Both the unit and its programs regularly and systematically collect, analyze, and use candidate and program completer data as well as data reflecting the effectiveness of unit operations to improve programs and their services.
- The continuous improvement process includes multiple sources of data, including (1) the extent to which candidates are prepared to enter professional practice; and (2) feedback from key stakeholders such as employers and community partners about the quality of the preparation.
- Figure 4.1: KLASSI graphic (PDF)
- Table 4.2: Table of Data Sources Used in our Continuous Improvement Efforts (HTML below)
- Figure 4.3: Kremen Organization Chart (PDF)
- Table 4.4: Assessment Roles and Responsibilities of Personnel in the Unit and Programs (HTML below)
- Table 4.5: Assessment Cycle Schedule (HTML below)
- Table 4.6: Decision Points Matrix: Basic Credentials (PDF)
- Table 4.7: Decision Points Matrix: Advanced Credentials (PDF)
- Table 4.8: Sample Program Modifications (HTML below)
Figure 4.1: KLASSI Assessment System
Our unit-wide assessment system, the Kremen Learning Assessment System to Sustain Improvement (KLASSI), is an assessment and accountability system built upon a continuous improvement model. Our assessment is an ongoing, goal-oriented process, viewed as the vehicle for continuous improvement. Assessment system activities include not only gathering data but also turning those data into rich information through a feedback process used to guide individual candidates, faculty members, programs, and the unit in improving performance, quality, and effectiveness. We view assessment as an integral part of learning that fosters improvement and as the first step in a continual learning cycle (an assessment-learning-change cycle), which includes measurement, feedback, reflection, and change. Aimed at improving teaching and learning, our assessment is an iterative process of developing and organizing activities, signature assignments, courses, curricula, or programs; collecting and interpreting data; and using outcome information to guide decisions. These outcomes serve as determinants of program effectiveness and accountability.
Table 4.2: Table of Data Sources Used in our Continuous Improvement Efforts (HTML below)
Unit Data Sources
Annual reporting of key program details to CTC, including demographic data of candidates enrolled within each initial and advanced credential program and program specific details regarding enrollment.
The CSU Educator Quality Center administers surveys annually to completers of the Multiple Subject, Single Subject, and Education Specialist Programs to gather data about graduates’ perceptions of the program. Results can be disaggregated by various measures including campus, year of completion, respondent race/ethnicity, and type of credential.
The CTC administers surveys annually to candidates from the following programs: Multiple Subject, Single Subject, Education Specialist, Preliminary Administrative Services Credential, and Other Educator. Results provide aggregate data for CSU, UC, and Private Institutions as a whole about graduates' perceptions of program effectiveness.
The CTC administers surveys annually, beginning in Spring 2019, to individuals who served as Master Teachers for Program Candidates. Results provide aggregate data for CSU, UC, and Private Institutions as a whole about Master Teachers’ perceptions of program effectiveness and candidate preparation.
The CTC administers surveys annually, beginning in Fall 2019, to administrators who employ recent program graduates. Results provide aggregate data for CSU, UC, and Private Institutions as a whole about employers’ perceptions of program effectiveness and candidate preparation.
Accreditation Information Management System (AIMS), an integrated system that provides tools for institutions to collaborate with Council for the Accreditation of Educator Preparation (CAEP) and to facilitate the accreditation process.
CAEP, our current national accreditation body (through 2022), requests an annual report of the number of program completers for the previous year along with a narrative of how the Educator Preparation Program continues to meet accreditation requirements. Note: we are in the process of seeking accreditation through Association for Advancing Quality in Educator Preparation (AAQEP).
CTC annual reporting. Data provided include:
- demographics of initial credential completers and those currently enrolled;
- past, current, and future goals regarding special education, science, math, and English learners;
- admission requirements and school attributes related to technology use and teacher training.
Each program and department at the university develops a Student Outcomes Assessment Plan (SOAP), a 5-year plan to assess program effectiveness related to outcomes, and an annual report/analysis of collected data related to the SOAPs. These are examined at the department, school, and university levels.
Our Office of Institutional Effectiveness generates data visualizations that focus on student demographics, student DFW rates in courses, graduation/completion rates, and other information about student success.
The Fresno State Student Rating of Instruction (FSSRI) system allows instructors to choose from a bank of questions and is an opt-out system, meaning that there is a default evaluation distributed to students unless an instructor opts for a different set of questions or to opt out altogether. Campus leaders with expertise in assessment have tested the reliability and validity of each item. Results of these evaluations are placed in faculty personnel files and are part of our university-wide personnel review system.
Faculty at Fresno State are required to be observed by peers on a regular basis, as per our academic policy manual (APM 322). Observations should address course design and delivery. These observations may be narrative or they may use an adaptation of a form developed by the Academic Senate. Observations are considered as part of the personnel review process. Kremen has also developed its own form used for peer observations.
Our Information Technology colleagues built a system within PeopleSoft that allows us to pull data into reports that shape our understanding of admissions, student demographics, course grades, and other pertinent items related to our enrolled students.
We use TK20 to collect a variety of data including clinical practice hours, coaching observations, FAST scores, interim and final evaluations, lesson plans, class profiles, self-evaluations, and disposition surveys.
Prior to beginning the program, Preliminary Credential Candidates submit a signed 9-item Teacher Candidate Commitment Form, which is aligned with the dispositions articulated within the CSTPs.
Although most of the California State University campuses use the CalTPA (administered by Pearson) to assess student teachers' proficiency in California's Teacher Performance Expectations (TPEs), Fresno State developed its own in-house system, FAST II, which is now in its second iteration. Fresno State is the only campus within California to use this two-part system:
We collect, report, and analyze the data associated with this assessment.
Our Credential Analyst keeps the Individual Development Plans (IDPs) on file (in addition to each coach giving a copy to individual candidates). Coaches identify areas for growth, which helps us reflect on how we can build teacher candidate knowledge in specific areas.
As graduate students, candidates in the Advanced Credential Programs are required to complete Fresno State's Graduate Level Writing Requirement. We track the number and percent of students who pass annually by program, noting major problem areas. Programs then use the information to inform future instruction.
Individual Credential Program Data Sources
Although at present we use an observation form for Multiple Subject, Single Subject, Education Specialist, and Bilingual Authorization programs available in TK20, we are transitioning to the TNTP Core as adapted by Chico State to reflect the California TPEs. This rubric will allow us to collect data across our basic credential programs in key areas: culture of learning, essential content, academic ownership, and demonstration of learning.
Each Advanced Credential Program uses a Field Placement Evaluation tool that is aligned with its specific program foci. In this way, the tool allows the program to collect data specific to the program goals, data the programs then use to engage in continuous improvement.
Local language assessments in Spanish and Hmong are being created in Spring 2021 for administration in Fall 2021; they are recommended to all candidates to assist with proper course placement. For example, based on Candidate Exit Survey assessment results, it may be recommended that students take a preliminary language course prior to taking the advanced BAP courses.
A District Partner Survey is being developed in Spring 2021. The survey will collect the following data: how many Fresno State BAP graduates are employed, how many bilingual/dual immersion schools and classrooms partner districts have, how many languages are taught, and the anticipated growing need for teachers.
- Program SOAP
- Fieldwork Evaluation Rubric: The Master Teacher or University Supervisor uses a rubric which assesses candidates in four areas:
- Engaging and supporting all students in learning
- Creating and maintaining an effective environment for students
- Understanding and organizing subject matter for student learning
- Planning instruction and designing learning experiences for all students
- Program SOAP
- Candidate performance on CalAPA (California Administrator Performance Assessment): Aligned with the California Administrator Performance Expectations, the CalAPA includes three specific assessment cycles:
- Leadership Cycle 1: Analyzing Data to Inform School Improvement and Promote Equity
- Leadership Cycle 2: Facilitating Communities of Practice
- Leadership Cycle 3: Supporting Teacher Growth
- Survey of District Partners
In Fall 2020, the Education Specialist Program faculty developed a needs-assessment survey with items based on the Education Specialist Program Standards. The survey was distributed to both current students and recent alumni to inform program improvement.
- Program SOAP
- Candidate Exit Survey (Sample Data Summary):
- Program completers rank how prepared they are in various types of literacy instructional approaches, assessments, and site literacy needs.
- Program SOAP
- Field Placement Assessment: The On-Site Supervisor completes an evaluation form that gives an overall assessment and rates the student's performance in 15 areas on a four-point scale.
- PPS: School Counseling Alumni Survey: When students graduate from the program, they are asked to voluntarily complete the alumni survey.
- Program SOAP
- Pre- and Post-Knowledge-Based Questionnaire: candidates rate themselves in 26 areas and set goals for the practicum experience
- Preceptor Evaluation of Student Performance
- District School Nurse Supervisor Survey of Program Effectiveness
- Program SOAP
- Field Evaluations of practicum and internship field experiences by field supervisors
- PRAXIS: Data are collected each year on PRAXIS scores for the National Certification in School Psychology exam
- Program SOAP
- Social Work Education PPS Credential Candidate Dispositions Evaluation: Completed every semester by Field Instructor in consultation with the Social Work candidate; focuses on 7 different areas of preparation.
- Social Work Education Evaluation of Student Performance: Includes information about the number of hours completed in developmental, diversity, and area of specialization settings in addition to evaluating candidates in the following areas: Professional development; Professional values and identity; Multi systems practice; Evaluation of social work practice
- Program SOAP
- Graduate Exit Survey
- Speech Pathology Performance Evaluation: Using CALIPSO, the supervisor completes this evaluation, which includes demographic information about students in the clinical setting. It also breaks down candidates' skills in several areas: speech sound production, fluency, voice, language, hearing, swallowing, cognition, social aspects, and AAC.
- Knowledge and Skills Acquisition (KASA) used to provide feedback and evaluate students throughout the program
- Employer Survey sent to school districts, residential schools, private companies, and other entities known to employ graduates
- Alumni Survey
- PRAXIS Results of program graduates
- Cumulative Evaluation
Figure 4.3: Kremen Organization Chart (PDF)
Table 4.4: Assessment Roles and Responsibilities of Personnel in the Unit and Programs (HTML below)
Unit Head
Dr. Randy Yerrick, Dean
The Dean of the Kremen School, Dr. Randy Yerrick, functions as the acting Director of Teacher Education according to the KSOEHD Faculty Assembly Constitution (KSOEHD Faculty Assembly Constitution, p. 27 Article VIII Section 2 [3]). As such, the Dean oversees all continuous improvement efforts: leadership, research, planning, coordination, implementation, documentation, monitoring, analysis, and outreach. The Dean works with faculty and staff in programs across the university to assure quality educator preparation and continuous improvement in response to shifting contexts (e.g., COVID-19), as shown in Figure 4.3, the Kremen Organization Chart.
Continuous Improvement Team
Provides leadership and support for assessment and continuous improvement efforts.

Advisory Boards
Provide guidance, insight, and feedback on educator preparation programs.

Coaches
Teach, coach, and support credential candidates in both classroom and clinical practice settings; give feedback and evaluate candidates within the FAST II assessment and clinical practice.
- Laura Rabago
- Sherri Nakashima
- Renee Flores
- Renee Petch
- Brenna Barks
- Maria Vargas
- Ricci Ulrich
- Dr. Jessica Hannigan
- Dr. Felipe Mercado
- Dr. Ignacio Hernandez

Department Administrative Support Coordinators
- Dean's Office:
- Agriculture Specialist:
- Bilingual Authorization:
- Deaf Education:
- Early Childhood Education Authorization:
- Education Administration:
- Reading Specialist:
- School Counseling:
- School Nurse Services:
- School Psychologist:
- School Social Work:
- Single Subject: Dr. Imelda Basurto
Table 4.5: Assessment Cycle Schedule (HTML below)
| Activity | Participants | Implementation |
| --- | --- | --- |
| Review data from CSU/CTC surveys: Completers Survey, One-Year Teachers, Employer Surveys | Deans, Program Coordinators | Dean works with Program Coordinators, Faculty, and Staff to implement changes based on findings from analysis |
| Review data from CSU/CTC surveys: Completers Survey, One-Year Teachers, Employer Surveys | Program Coordinators, Faculty, Staff (in department/program meetings and during Data Summits) | Program Coordinators, Faculty, and Staff implement programmatic changes based on results of survey analysis |
| Annual Report to Provost (current 2019-2020 report) | Chairs and Dean | Feedback from Provost implemented by Chairs and Dean |
| Programs in Kremen: Clinical Practice and Teacher Internship Program Report | Coordinators and Faculty | Program Coordinators and Faculty use written feedback to inform planning for the subsequent academic year |
| FAST II psychometric evaluation to determine instrument validity and reliability | Faculty in the Department of Curriculum and Instruction, the Office of Clinical Practice, and the Dean's Office | When group pass-rate differences are found, we investigate the possible sources and address/revise as necessary |
| Faculty review process | Personnel Committees (department, school, university), deans, provost | Feedback during each review by each reviewing body |
| Accreditation reviews & report preparation | Kremen works with Coordinators of Programs that are already nationally accredited | Program Coordinators and Faculty implement changes based on findings from report preparation |
| Activity | Participants | Implementation |
| --- | --- | --- |
| Accreditation Unit Meeting to review unit organization and all activities within the education unit | Dean, Program Coordinators | Dean and Program Coordinators make changes based on findings |
| Data Summit to discuss unit-wide data collection activities and analysis for the purposes of continuous improvement. We divide into small breakout groups by department or program to discuss data collection, analyze data, and plan for improvements. We also encourage ongoing follow-up and data analysis within program/department meetings. | Kremen faculty and staff | Implementation by program faculty and staff |
| Analysis of Tableau visualizations (enrollment data) | Deans, Department Chairs, Departments, and Program faculty | Specific follow-up on recruitment, student support, and curriculum |
| Credential application analysis and review | Credential Analyst | Credential Analyst presents to Deans; Dean, Program Coordinators, and Staff work together to implement changes based on findings |
| Review of admissions data | Center for Advising and Student Services as well as the Office of Clinical Practice | Center for Advising and Student Services, Office of Clinical Practice, and Program Coordinators work together to make decisions about admissions, special considerations, and recruitment efforts |
| Course alignment discussion: key assignment expectations plus student performance on key assignments | Program Faculty | Faculty revise coursework based on analysis of student performance and expectations |
| Final assessment of candidates (coursework and clinical practice) | Faculty | Faculty make changes to courses based on analysis of candidate final assessment performance |
| FAST II calibration & scoring sessions | FAST II Coordinator, Coordinator of the Office of Clinical Practice, faculty | FAST II Coordinator works with faculty to implement changes based on results |
| Analysis of FAST II results | Program faculty (SS and MS) and the Department of Curriculum and Instruction | Planning related to student support, curriculum, and any needed revisions to the FAST II |
| Clinical practice placements review | Coordinator of the Office of Clinical Practice and program assistant, program coordinators, Associate Dean | Coordinator of the Office of Clinical Practice makes changes to placements based on findings |
| Induction Board meetings | County/district/school personnel, Coordinator of the Office of Clinical Practice | Coordinator of the Office of Clinical Practice makes necessary adjustments to placements |
| Review of policy and student concerns | Program faculty and coordinators | Program coordinators implement changes when necessary based on findings |
| Student evaluations review | Deans, department chairs | Deans and department chairs follow up with faculty as needed |
| Special Consideration Committee | Program coordinators, staff, and faculty | Program coordinators and staff |
| Activity | Participants | Implementation |
| --- | --- | --- |
| Clinical practice midterm and final evaluation meetings | Coaches, students, mentor teachers | Coaches, students, and mentor teachers adjust support based on discussions at evaluation meetings |
| Residency Meetings to discuss and review | District/school personnel, residency coordinator, faculty, program coordinators | Residency coordinators |
| Activity | Participants | Implementation |
| --- | --- | --- |
| Assessment Committee: review and implementation of policy changes | Unit Faculty representatives | Faculty and staff make changes to assessment plan based on feedback |
| Curriculum Committee: review and implementation of policy changes | Unit Faculty representatives | Faculty and staff make changes to curriculum based on feedback |
| Multiple Subject Program Meetings: review and implementation of policy changes; review of programmatic data (candidate performance, field placement evaluations, exit survey findings, etc.) | Program Coordinator, Faculty, and Coaches | Faculty and staff make changes to program based on feedback |
| Single Subject Program Meetings: review and implementation of policy changes; review of programmatic data (candidate performance, field placement evaluations, exit survey findings, etc.) | Program Coordinator, Faculty, and Coaches | Faculty and staff make changes to program based on feedback |
| Individual Program Meetings: review and implementation of policy changes; review of programmatic data (candidate performance, field placement evaluations, exit survey findings, etc.) | Program Coordinators plus Faculty | Faculty and staff make changes to program based on feedback |
| Activity | Participants | Implementation |
| --- | --- | --- |
| Credential program admissions and monitoring processes: each program monitors candidate admissions and progress toward the credential at four crucial points (see Decision Points Matrices for Initial and Advanced Credentials, Tables 4.6 and 4.7) | Program Coordinators and Advisors | Program Coordinators, Advisors, Faculty, and Coaches make changes in how candidates are supported based on findings at program checkpoints |
| Coaching and feedback sessions in clinical practice; monitoring of candidate development in TPEs (at least 6 times/semester) | Coaches, mentor teachers, students | Coaches and mentor teachers make changes to the support provided based on findings from discussion |
| Analysis of Student Outcomes Assessment Plans (assessment cycle happens throughout the year; specific documentation and analysis formalized in a document each fall). Each program and department at the university develops a SOAP, a 5-year plan to assess program effectiveness related to outcomes, and an annual report/analysis of collected data related to the SOAP. These are examined at the department, school, and university levels. | Assessment coordinators (University and Kremen), departments, program faculty | Based on feedback from the Assessment Committee and findings from analysis of student outcomes, Program Coordinators work with faculty to implement changes to program structure and coursework |
| Activity | Participants | Implementation |
| --- | --- | --- |
| Periodic program review. According to the university, "Periodic program reviews provide a mechanism for faculty to evaluate the effectiveness, progress, and status of their academic programs on a cyclical basis. It is an opportunity for the department (or program) to evaluate its strengths and weaknesses within the context of the mission of the university and of current and emerging directions in the discipline." | Coordinators, Faculty, Departments, Deans, Community Partners | Based on feedback from University and program evaluation of effectiveness and progress, Program Coordinators work with faculty to implement changes to program structure and coursework |
Table 4.6: Decision Points Matrix: Basic Credentials (PDF)
Table 4.7: Decision Points Matrix: Advanced Credentials (PDF)
Table 4.8: Sample Program Modifications (HTML below)
| Modification | Process Used to Identify and Implement Modification |
| --- | --- |
| Adopting a new rubric, a version of The New Teacher Project's (TNTP) Core rubric adapted by Chico State to foreground the California Teacher Performance Expectations. The decision to use this rubric was made with full participation and consultation of stakeholders. | After feedback from coaches that the rubric being used to evaluate candidates' field placement work was too unwieldy, Program Leadership identified existing rubrics that might suit our needs. We shared these with the Coaches who would be using the rubrics and evaluated the strengths and limitations of each. |
| Modification | Process Used to Identify and Implement Modification |
| --- | --- |
| In-progress modifications: more focused instruction in courses on (1) developing curriculum and modifying instruction for students with special needs and (2) developing IEPs and holding team meetings | In Fall 2020, the Education Specialist Program faculty developed a needs-assessment survey based on the Education Specialist TPEs that was distributed to 121 current students and recent alumni. A total of 33 individuals responded to the survey. |
| Modification | Process Used to Identify and Implement Modification |
| --- | --- |
| Specific changes to coursework: | Program faculty developed a survey with items based on the current training standards of the National Association of School Psychologists (NASP). The survey was emailed to field supervisors, local administrators/employers, and alumni of the program. Respondents were asked to indicate district or agency need on a scale from strong need to no need, and also to rate the CSU, Fresno training program as need met, partially met, not met, or unable to judge. There were 54 responses (54% response rate); three-fourths were CSU, Fresno graduates ranging from 1991 to 2017, with the majority graduating during this review period (2011-2017). |