
AAQEP Accreditation 2022

Standard 3: Aspect E

Preparation programs ensure that candidates, upon completion, are ready to engage in professional practice, to adapt to a variety of professional settings, and to grow throughout their careers. Effective program practices include: consistent offering of coherent curricula; high-quality, diverse clinical experiences; dynamic, mutually beneficial partnerships with stakeholders; and comprehensive and transparent quality assurance processes informed by trustworthy evidence. Each aspect of the program is appropriate to its context and to the credential or degree sought.


Internal Audit Process:
Three Education Specialist Credential candidates were selected for the internal 2020-2021 audit process. Two of the candidates are multilingual persons of color. All three faced different challenges in the program. The first candidate entered the internship program as soon as he was eligible to do so but was not able to pass the RICA exam by the time he completed his credential coursework, necessitating an extra year as an intern until he passed the RICA exam. The second candidate was a Dual (Multiple Subject and Education Specialist) candidate who struggled with peer relationships and with the two general education clinical experiences. In the third phase of the program, the candidate was counseled out of the Dual program and into the Education Specialist-only pathway and successfully completed the final clinical experience in special education. Because of a pending move out of the area, the candidate chose not to complete the remaining exam requirement needed to apply for the preliminary credential. The third candidate entered the program from a single subject content area, received a special consideration internship for a difficult-to-fill teaching position, and has successfully completed all program coursework but has not been able to pass one of the required CSET exams to apply for his preliminary credential. We traced their academic journeys from the time they were recruited to completion. The data we collected included information at the following points of the program:

  1. Recruitment information: where, when and how?
  2. Admissions process: what process and criteria were used?
  3. Advising: what kinds of advising did the candidate receive as they moved through the program?
  4. Monitoring: what criteria were used to monitor the candidate?
  5. Final Clinical Placement information: were criteria met? How?
  6. Completion: what process was followed? Were criteria met?
  7. Completer Follow-Up: who gets contacted post-completion? Has that happened yet?

See Appendix D table for details of our audit.

Outcome from Audit Process:

Analyzing the data:

  • What are the strengths of your process? How do students get advised and supported throughout their experience in your program? 
    A key strength of the program is the communication between students and the Program Coordinator. The Coordinator helps students take the proper coursework, problem-solves when challenges arise, and advises and supports students throughout the program.
  • Are there scholarships or other kinds of support given to students? 
    Yes, scholarships and financial support opportunities are readily communicated to candidates, and faculty and the Coordinator are responsive to requests for letters of support. Scholarships candidates received include the Aspiring Teacher Scholarship, the TEACH Grant, the Kremen School of Education and Human Development Alumni Scholarship, the LARCs Scholarship, and the Golden State Teacher Grant.
  • What seem to be the richest experiences for student progress towards the credential?
    Fieldwork experiences are the richest in that candidates work directly with students, implement what they have learned in coursework, collaborate with mentor teachers, and reflect on their instructional experiences.
  • Were there any problems the candidate encountered? 
    Problems include passing the CSET exams and the RICA exam, particularly for students who are multilingual persons of color. An additional problem is the stress candidates encounter due to the combination of coursework, field experiences, and, more recently, COVID-19.
  • If so, how were the problems resolved, how quickly, and who addressed the problems? Who should have been involved that wasn't? 
    For the two candidates having difficulty passing the required exams, advising and test preparation were addressed by the Program Coordinator during multiple meetings and counseling sessions. For the candidate having difficulty in the clinical experience, the University Clinical Coach, the district mentor, and the Coordinator addressed the candidate's issues, and all three provided additional support.
  • What does this info reveal about your processes? For example, are you admitting the best applicants? Do good potential students face any barriers? 
    This audit reveals that the Program Coordinator went beyond the current scope of the program design and worked very hard to support students. It also reveals that there may not be enough supports built into the program and its processes to adequately support students as they work toward an Education Specialist credential or Dual Certification. In addition, test preparation support would be helpful for candidates needing to pass required exams. This issue is being addressed in two ways. First, several years ago the Kremen School began offering an additional reading competency preparation course to support Multiple Subject and Education Specialist candidates in passing the RICA examination. Second, California recently passed new legislation (July 2021) allowing alternative pathways to meet the examination requirements.
  • Which aspects of your program need further study or redesign? 
    The role the Program Coordinator currently takes on in student support is not sustainable. Because of the complexity of the Education Specialist Credential Program and its multiple pathways, more supports need to be in place for our candidates as they make their way through the process. This is an important issue that needs to be addressed as soon as possible. One option would be to add an interest tab to the program website. This tab would send prospective candidates to an interest survey where essential contact information and how they learned about the program would be captured in a centralized database. After prospective candidates complete the interest form, the Coordinator and faculty could then contact them by phone or email to provide a program overview, answer questions, provide application information and support, and/or direct them to information sessions.

Process for Ensuring Data Quality through Validity, Reliability, Trustworthiness, & Fairness:

Key Data Measures can be found in Appendix E

Mid Semester & Final Semester Formative Performance Review (CREATe Rubric) 
A collaborative process between Fresno State and three school districts led to the development of a common rubric intended to provide action-oriented formative feedback for teachers; it is currently used by coaches and credential students during the mid-semester and final-semester formative performance reviews. Aligned to both district and state standards, the rubric focuses on specific action-oriented competencies. The rubric also provides multiple stakeholders a common language that helps prevent mixed messages or misinterpretations of jargon. Various constituents can use the rubric to provide specific feedback and next steps to strengthen practice for any teacher along the novice-to-expert continuum. Each competency is represented in the rubric as a separate continuum so that the rubric can help identify each teacher’s individual zone of proximal development (Vygotsky, 1978) and help teacher educators provide tailored and targeted scaffolding. Thus, the rubric provides a way to create practice profiles across multiple competencies that can help document teacher development over time.

Through synthesis of existing district observation tools, and by aligning this synthesis to the new Teacher Performance Expectations (2016 CCTC TPEs), the rubric moved from design into implementation. Only observable standards were included, selected by consensus among representatives of the partner districts. Based on partner district input, the rubric includes 14 items in four sections: 1) Positive Environment, 2) Instructional Design and Implementation, 3) Rigorous and Appropriate Content, and 4) Reflection-In-Action.

Coaches who supervise our teacher candidates' clinical experience, along with district partners, were offered three days of in-depth training on the use of the rubric when it was first introduced. The three two-hour training workshops utilized video clips, small group discussion, whole group discussion, and individual reflection to engage coaches in actively thinking about and trying out each rubric item. The trainings included two live 20-minute observations in which participants placed the teacher on the continuum, followed by immediate discussion and reflection using the rubric after visiting the classroom. As coaches and district partners were intentionally introduced to the rubric, teacher candidates were in turn introduced to the rubric by the coaches who had been trained. These opportunities for coaches and teacher candidates to develop knowledge, skills, and dispositions about rubric use are critical to implementing the rubric with fidelity. At one time, all of the Education Specialist coaches and the Coordinator had participated in the CREATe Rubric training. Due to attrition, half of the trained Education Specialist coaches left and have been replaced with new university coaches who have not yet been trained in the use of the CREATe Rubric, as training is no longer offered.

Although the CREATe Rubric has been well received by district leadership and candidates, coaches and faculty have been split in their reception of it. Some of their concerns center on the length and cognitive load of the tool, as well as disagreements around who is responsible for introducing and “teaching” the rubric to the credential students. It is plausible that, although the lead developers of the rubric engaged collaboratively and intentionally with our district partners to develop this tool, they ended up excluding the coaches from the development process. This misstep may not have set the rubric up for sustainability, as the coaches who would use the rubric most directly with students on a regular basis are arguably the most valuable voice to have at the table when developing tools that directly impact their practice. When faculty who were the lead developers of the CREATe moved on from the program and coaches continued to question the utility of the CREATe Rubric, an assessment of the available reliable rubrics was conducted based on a set of criteria. Additionally, presentations and surveys to gather coach, faculty, and district feedback were administered. Based on this input, a decision was made in December 2020 to transition to The New Teacher Project (TNTP) Core rubric, which Chico State adapted to align with the CTC Standards. A Rubric Advisory Board was formed, consisting of representatives from all three basic credential programs, and an implementation timeline and additional adaptations were made. However, by March 2020 California had moved into its first COVID-19 lockdown, putting the new formative rubric implementation timeline on hold.

From our investigations, we learned that all candidates responded well to a formative rubric that provided common language to make the skills of teaching more visible so they could receive actionable feedback from their coaches. This makes moving toward initial implementation of the new rubric all the more pressing and promising.

Fresno Assessment of Student Teachers (FAST)
The Fresno Assessment of Student Teachers (FAST II) is a state-approved Teacher Performance Assessment (TPA) system designed for use by the Multiple Subject Credential Program. FAST II assesses the pedagogical competence of teacher candidates, including Interns, with respect to the 13 Teaching Performance Expectations (TPEs). Faculty and coaches receive formal FAST II training and take a test in order to become reliable scorers. Because it is run in-house, the FAST II allows faculty and coaches to participate in the scoring, which gives us great insight into how our candidates are performing in ways that help us reflect on what is working and what needs more attention.

Education Specialist candidates participate in the first FAST II subtest, the Site Visitation Project (SVP), in their initial general education clinical placement in a classroom with an experienced Multiple Subject Mentor Teacher. The Site Visitation Project is administered by the Multiple Subject Coaches, who participate in the FAST training and calibration sessions each semester with a 100% calibration rate, becoming reliable scorers of the SVP and the Teaching Sample Project (TSP). On average, seven tenure-track Multiple Subject Program Faculty attend the FAST TSP training and calibration sessions each semester; 100% pass the calibration test and become reliable scorers for the TSP.

Education Specialist candidates will soon be required to engage in a formal Teacher Performance Assessment. Our program is proposing that, in place of the statewide examination costing $300 per candidate attempt, our candidates be permitted by the CCTC to complete both FAST II assessments. At that point, our Education Specialist coaches will be trained and calibrated on the scoring of both the SVP and TSP.

Key Assignments from Courses
Throughout our responses to Standards 1 and 2, we also relied on assignments from key courses within the program that we believe are critical to candidates developing the necessary knowledge and skills to be successful educators of students with special needs. These courses all align with the California Commission on Teacher Credentialing Program Standards for Education Specialist preparation programs. In turn, each of the assignments utilized was designed to measure candidates’ knowledge of core content and their ability to apply that knowledge. Consequently, we believe the assignments are valid measures. Moving forward, we will review course assignments on an ongoing basis to ensure that what is being assessed is, in fact, the intended content.

Similarly, because these courses are consistently taught by the same faculty members and because the rubrics used to assess the assignments remain stable, we believe the scores reported are reliable. Still, we know there is always room for improvement. Moving forward, as a faculty, we will work to review student samples using the rubrics to ensure that we have a common understanding of how to implement them. Additionally, we will collect student exemplars that can be shared with future cohorts to help ensure the fairness of the assessments.
