Standard 3 Aspect E
Standard 3e: Preparation programs ensure that candidates, upon completion, are ready to engage in professional practice, to adapt to a variety of professional settings, and to grow throughout their careers. Effective program practices include: consistent offering of coherent curricula; high-quality, diverse clinical experiences; dynamic, mutually beneficial partnerships with stakeholders; and comprehensive and transparent quality assurance processes informed by trustworthy evidence. Each aspect of the program is appropriate to its context and to the credential or degree sought.
Internal Audit Process:
Six program completers (Appendix D) were contacted via email by the program coordinator to learn about their program experiences. All six candidates had successfully completed the program; three responded. The program completers were asked the following questions:
- How did you hear about our program? Were you recruited in any way? Did you attend any pre-admission meetings or events?
- Did you receive any scholarships, grants, or fellowships during your time in the program?
- Did you have any separate advising meetings or specific emails outside of the program Canvas site? If so, how often were these and what was the nature of these meetings/emails?
- Did you complete your fieldwork requirements (working with children or teachers for LEE 224, 230, 234, and 254) in your own classroom? Or did you meet these requirements in a different setting? (if so, please explain)
- Were there any specific challenges while completing the program? These can be related to courses, fieldwork, and/or the comprehensive exam or project.
- What would you say was the most challenging part of the program overall?
- Were there any stand-out positive experiences you had while completing the program?
- What would you say was the best part of the program overall?
Outcome from Audit Process:
The admissions criteria and process are very clear. None of the four students included in the audit attended or received any type of specific recruitment. The program website and admissions analyst, Renee Petch, are vital to this aspect of the program.
The development and maintenance of the program Canvas site is important to student advising. Students receive messages about important deadlines through the Canvas site and can easily view key program information.
Additionally, the program coordinator held a specific advising session for all students prior to taking their comprehensive exam. This meeting was helpful for students to know what to expect.
A number of scholarships are available for students in Kremen. Of the four students included, one was a Graduate Research Equity Fellow and another received a scholarship. One student indicated they did not apply for any scholarships, and another indicated they were not aware of any scholarships available. Where to locate and apply for scholarships is something that can be made clearer during new student orientation and through the program Canvas site.
Each student noted different experiences. One student who was a Graduate Equity Fellow noted it was this experience of working with program faculty on research projects that was most valuable. Another student mentioned the strength of having a cohort that took classes together during the program and how the program was organized with classroom teachers in mind. One student discussed the helpfulness of the group and collaborative projects in courses and a study group for the comprehensive exam.
One candidate did not enroll in a required course early in her program. This error was not discovered until the candidate submitted her Advancement to Candidacy paperwork. The program coordinator worked with the student to determine the best way for her to meet the remaining requirements and complete the program in a timely manner.
This error occurred prior to the development of the program Canvas site. Now all students receive clear instructions about which courses they need to take each semester.
Another student, who took the LEE 153 course in the summer, mentioned this was better financially but very stressful. The same student also raised issues with the comprehensive exam, noting that the practice of the comprehensive exam does not seem to align with the philosophy of the program. This is an important piece of feedback that should be discussed further in the Fall 2021 semester.
Based on this internal audit, students are well-supported at all stages of their program and errors are quickly corrected.
Process for Ensuring Data Quality through Validity, Reliability, Trustworthiness, and Fairness
The majority of the measures used to evaluate our program in this QAR are key assignments students complete as part of their coursework. Each assignment was created to align with course expectations, which align with the CCTC Program Standards for the Reading/Literacy Specialist. In this way, we believe the assignments do serve as valid measures to assess our candidates’ learning. (See Appendix E)
Moving forward, we intend to engage in a deeper analysis to ensure that what we believe about the assignments’ validity is, in fact, true. For example, as we were using the rubric scores for the LEE 213 Inquiry Project, we realized that the rubric provides more of a holistic analysis of student learning rather than breaking the assessment down to specific areas intended to be measured with the assignment. As a program faculty, we intend to begin analyzing both the assignment expectations and the tools used to measure those assignments to ensure that they are, in fact, valid.
Once we have the revised assignments and assessment tools in place, we will pilot them in our courses. In order to establish reliability, as a program faculty, we will engage in joint scoring of student samples using the revised rubrics to ensure that we are all in agreement on how the assignments should be scored. Where disagreements arise, we will work to come to a common understanding. We will repeat this process on a regular cycle to ensure the ongoing reliability of the measures.
As a program faculty, we do believe that the assignments we provide to candidates are fair in that we always provide specific instructions along with rubrics to provide details for how the assignment will be scored. Moving forward, we will also work to collect student samples from successful candidates that can serve as models for current candidates.
We intend to begin this process with the assessments included in our Student Outcome Assessment Plan, which is part of our ongoing university program review. From there, we will move on to other key measures included in our analysis. Additionally, once we have our Advisory Board established, we intend to share these assessments with the Board members to get their input. Doing so will help us to increase the validity of the assessments.
Key Data Measures can be found in Appendix E.