
QAR 1: Conclusion

Overall Findings & Analysis from Self-Study
As documented throughout this Quality Assurance Report, we viewed this first AAQEP self-study as an opportunity to establish a baseline portrait of our initial credential programs using the AAQEP Standards as a framework. Our belief is that establishing an accurate baseline is critical to our ability to engage in authentic continuous improvement work as we move forward. As can be expected, our findings from our analyses allowed us the opportunity to see areas where our programs--and we as an institution--excel, as well as to identify specific areas for growth.

Areas of Strength
In particular, our findings demonstrated that each of our initial credential programs--Multiple Subject, Bilingual Authorization, Single Subject, Agriculture Specialist, and Education Specialist--is providing the appropriate coursework and fieldwork experiences to prepare our program completers to excel in their future roles. We found our programs include the necessary coursework to ensure our completers develop the knowledge and skills to support all learners in their development and to create positive and productive learning environments. Notably, we saw that our programs excel in preparing our candidates to engage in culturally responsive practices, which affirms our School’s mission to prepare education leaders to serve in diverse communities. To reach these conclusions, program faculty analyzed data from a variety of sources, including candidate performance on key assignments, performance-based assessments, field experience supervisor evaluations, and surveys of completers, alumni, and employers.

In fact, a key finding for faculty was the richness of the data sources to which we have access. Throughout this QAR, faculty drew on the surveys administered annually by the CSU Educator Quality Center to recent completers and to completers who are one year out, as well as the surveys administered annually by the California Commission on Teacher Credentialing to completers and employers. We believe these data sources will be immensely helpful to other CSU institutions that choose to pursue accreditation through AAQEP. Although we discovered flaws with both sets of instruments--in particular, the inability to disaggregate data in all the ways we wanted--we also realized how fortunate we are to have access to these datasets and to be able to use them to gather insights about the work we do. Similarly, we appreciated the data we were able to analyze from our candidates’ performance on the two components of the FAST and their field-based evaluations, along with the psychometric report prepared on the FAST, which allowed us to evaluate its reliability. We also drew on the resources of our Office of Institutional Effectiveness, which helped us access our enrollment data. Looking across our findings from all datasets provided valuable insights into the work of our programs.

We believe there are multiple reasons why our completers are well prepared, and our findings confirmed these beliefs for us. To begin, our credential programs have a long history of being fully accredited by the California Commission on Teacher Credentialing, meaning programs’ practices align with specific program standards. These program standards are updated regularly to respond to shifts in the demographics of our state’s learners and new findings about best practices from educational research. In response, our programs engage in a regular cycle of revisions. Additionally, as demonstrated in our findings, our programs make an effort to maintain strong, collaborative partnerships with local school sites and districts that allow our candidates to have positive and meaningful field experiences in diverse contexts. These field experiences are critical to allowing our candidates to apply their learning from their coursework in local contexts, especially as the majority of our completers remain in the region. Finally, faculty members in all programs maintain active connections with educators throughout the region, which allow them to stay in touch with challenges faced by individuals in the field. They are then able to use this learning to inform their program practices, ensuring that our completers are prepared to meet these challenges.

Our findings also highlighted that candidates and completers of our programs come from a range of racial and cultural backgrounds that are largely representative of the learners in our context. As documented in responses to Standard 3, the majority of learners in our region identify as Hispanic/Latinx. Similarly, in our Multiple Subject, Bilingual Authorization, Single Subject, and Education Specialist programs, the majority of our candidates identify as Hispanic/Latinx. While this is not yet the case in the Agriculture Specialist program, the demographics of recent cohorts and the increase in the number of candidates who identify as Latinx highlight the targeted recruitment work the program has done and will continue to do. Once candidates enroll, our programs work to realize the notion of “high challenge, high support.” Our QAR responses highlight the fact that program faculty provide a number of supports to ensure candidates’ success, while still maintaining a high level of rigor. We take pride in the high percentages of diverse candidates who successfully complete our programs annually.

Another strength highlighted in our findings is the partnerships we already have in place with P12 educational stakeholders throughout the region. As we documented these partnerships in our responses to Standards 3 and 4, we realized the depth of these partnerships and the ongoing commitments of both program faculty and our P12 partners. This commitment extends all the way to our university president and to deans from across campus, who meet twice a year with regional P12 educational leaders at the President’s Commission on Teacher Education. The commitment is also well documented in the number of teacher residencies we have within our Multiple Subject, Bilingual Authorization, and Education Specialist programs, opportunities that can only come from close collaboration with district and site partners. Also worth highlighting are the strong partnerships our Agriculture Specialist faculty members have with agriculture educators both locally and throughout the state. These partnerships strengthen the preparation we are able to provide to our preliminary credential candidates.

Finally, as discussed in the responses to Standard 4, in the work they do, our programs absolutely reflect the mission of our university and our school. The university mission is “to boldly educate and empower students for success,” while the vision is that students will be prepared to become our next generation of leaders. Similarly, the Kremen School of Education and Human Development's mission is the recruitment and development of ethically informed leaders for classroom teaching, education administration, counseling, and higher education. As documented throughout this QAR, our programs carry out both missions and the university’s vision as we actively recruit and prepare our credential candidates to become engaged educators in the P20 education system of the region and state. 

Areas for Growth
In addition to confirming the quality of our initial credential programs, engaging in this self-study also revealed clear areas where our programs and our educational unit as a whole have room to grow and improve, areas we can build on in our ongoing efforts toward continuous improvement.

Ongoing Continuous Improvement Efforts. Overall, our programs very much valued the opportunity to engage in the self-study afforded by the AAQEP accreditation process. But we realize that this is just the first step. As we have articulated throughout, we saw the preparation of this first Quality Assurance Report as the opportunity to create a baseline understanding of where our programs are individually and where we are as an educational unit. One of our findings is that some of our programs were further along in their intentional use of data to inform program practices and to engage in a continuous improvement process. In particular, our Multiple Subject program has made great strides in this area in recent years. 

As we move forward, we intend to continue these efforts unit-wide, utilizing the Data Summits established in 2020-2021 and continuing to support the use of data in making programmatic decisions. In the spirit of continuous improvement, we will also reach out as a unit to collect data on how these efforts are going and how we can further support program leaders and program faculty in their work.

New Systems to Collect and Analyze Data. While we highlighted above the value of the existing data sources to which we have access, one of our biggest takeaways from this process is the need to create new systems for collecting and analyzing data, along with the need to change what data we actually collect. As documented in our responses to Standards 1 and 2, we realized that we do not have a unit-wide, systematic approach to collecting data from our key stakeholder groups--completers, P12 partners, and employers. While the CSU Educator Quality Center and the California Commission on Teacher Credentialing both administer surveys to program completers, employers, and year-out professionals, we discovered that their measures do not always align with the analyses we were trying to do in response to the aspects of Standards 1 and 2. Another challenge with the data collected from these large-scale surveys is that, in many cases, we were unable to disaggregate the data in a way that made the findings meaningful to us; for our smaller specialist credential programs, the number of completers was too small. As a result, for this QAR, we were not always able to capture the perspectives of each key stakeholder group.

Moving forward, we intend to develop unit-wide surveys that can be administered annually to each stakeholder group, including both general items about the work our institution does as a whole and program-specific items. Our hope is that this will allow us to collect data useful at both levels without creating survey fatigue from administering too many instruments, which is already a concern given the administration of both the CCTC and CSU Educator Quality Center surveys. During the 2021-2022 academic year, we intend to develop tools that we can pilot in Spring 2022, working with program faculty to craft items that are meaningful both unit-wide and programmatically and that will support our inquiries into program improvement. Once we have responses, we can begin analyzing the data we collect to ensure the reliability of the tools and then make the necessary revisions. Our hope is to create tools we can use on an ongoing basis to provide meaningful information about the work we do, information we can then use as we continue our ongoing cycles of continuous improvement.

Beyond that, we hope to also add in more qualitative data collection efforts, such as annual focus groups with each stakeholder group. Ideally, the protocols for these focus groups would come out of the responses to the surveys. In this way, the discussions would help to further illuminate the findings from the surveys.

More Meaningful Recruitment Efforts. Related to the ways in which we collect and analyze data, another area we realized we need to improve is tracking our candidates--from the point when they first express interest in a program, through their application, their eventual enrollment, and their ultimate completion of the program, and into their respective roles as educational leaders. While programs do hold a number of recruitment events, attendance at those events is rarely recorded, so we have no way of knowing who attended--and who ultimately applied. This lack of data collection at recruitment events means we are doing nothing to evaluate their effectiveness, and we have plans to change that. As an educational unit, we are currently working with our Communications Coordinator to create a system that will allow us to more strategically collect data at recruitment events and then track who from each event applies to, is admitted to, and enrolls in our programs. At the program level, we then want to analyze this dataset on an annual basis in order to be more strategic in our recruitment efforts. These efforts will include engaging in inquiry to learn more about why more Black and Southeast Asian individuals do not pursue careers in teaching and then engaging in targeted recruitment efforts with members of these communities.

Engaging in Data Analysis and Quality Assurance. Another key finding across all programs was the value of engaging in an internal audit as a way to evaluate the work of the program. Again, this was the first time programs had undertaken this process, and it led to authentic findings about where programs were excelling and where they might do more to better support candidates. Moving forward, programs intend to formalize their processes and engage in this audit on an annual basis, with a plan to review findings at program meetings. These findings can then be used to support ongoing continuous improvement of program practices.

As we begin to rethink the data sources we use to evaluate our program practices, we recognize that we also need to investigate the reliability, validity, trustworthiness, and fairness of the instruments we use across programs on an ongoing basis. As highlighted above, we have done some of this work in the past with our analyses of FAST scores, but the findings of those analyses have not been widely shared with program faculty. In future years, program faculty will continue to engage in investigations of data quality, working together to evaluate tools such as surveys and focus group protocols; to ensure the validity of course assignments, assessments, and field experience evaluations; and to analyze student work in response to key assignments across different sections of courses, much as we do in FAST calibration sessions. As a unit, we intend to support these efforts by making data source evaluation a focus of future Data Summits.

Supporting Transitions. Our findings also demonstrated that, while our work preparing candidates during their time in our programs is strong, we do little to inquire into and support their transition into their new roles or to provide ongoing professional development. Providing better support to our alumni would also allow us to have a broader reach within our service area. To begin to do this well, we need to extend our data collection beyond the point of program completion and into our completers’ time of employment. As we move forward, we intend to create a system for collecting data from our completers that allows us to learn where they are placed when they leave our programs and then to follow up with them to learn about the successes and challenges they experience in those placements. We envision this beginning with an internal survey, administered to completers as they leave our programs, that collects contact information along with details about their places of employment. Beyond that, we intend to follow up with them on an annual basis in order to continually update our database, learn about our programs’ impact on their practice, and learn how we can continue to support them in their professional roles.

Within the next year, we intend to pilot these instruments to see how effective they are at gathering the data we seek. We will then evaluate the results and make any necessary revisions to ensure that we gather the information that is most meaningful to us.

That said, our hope is that this process will be mutually beneficial for both us and our completers. As we learn about particular challenges our initial credential candidates are facing in their roles as educators, our intent is to provide professional development opportunities to address these challenges, while also making changes to our program practices so that future completers will not face the same challenges. Our hope, too, is that as these educators are ready to pursue the next steps in their educational journeys, they will return to the advanced credential and graduate programs we offer. 

Stronger P12 Collaborations. Finally, we want to continue to build on our existing strong partnerships with local sites and districts to make them even more collaborative and, hopefully, beneficial for all involved. Moving forward, as highlighted in Standard 4, our programs have plans to be more strategic in the creation and use of advisory boards to inform program practices. Currently, most programs do not have their own advisory boards to help shape the work of the program. Bringing individuals from the community together to discuss program practices and to analyze program data is a critical next step to improving the work we do to prepare qualified candidates. We envision that these partnerships will be mutually beneficial in that we can share data from our programs to get input and guidance, and our P12 partners can do the same. Our ultimate goal is to work together to ensure that we truly are empowering our candidates to become the next generation of ethically informed educators in our region and our state.
