2.1 The IR indicates that programs in KSOEHD use Taskstream to collect and score student work. It is unclear at this time what other technological tools and resources are used to create and disseminate reports for faculty and other stakeholders.
Other technological tools and resources used to create and disseminate reports include:
• Tools and resources associated with data collection and analysis instruments, such as Qualtrics and SurveyMonkey
• Excel Spreadsheet Software
• SPSS Statistics Software
2.2a Based on the evidence, the unit appears to be at the emerging stage of moving toward Target. There is evidence that the unit is performing at target level on some aspects of the standard, but there are no plans or timelines for attaining consistent performance at target level for all programs.
We strategically selected Standard 2 for our Initial Programs (Multiple and Single Subject and Special Education) to move toward target because the Kremen faculty and administration felt we were ahead of many other institutions in the country with regard to assessment. We had over 10 years of program quality data from our graduates (completers) and their employers after one year of teaching. We had an established TPA system (FAST) with higher levels of reliability than any other TPA system in California, including PACT, the precursor to the edTPA currently being piloted nationally. Our degree programs have SOAPs (Student Outcome Assessment Plans) that are reviewed by the university assessment committee on an annual basis. We have a unit-wide assessment system, KLASSI, that moves us through inputs, processes, and outputs to “closing the loop” and making evidence-based decisions, all in the name of continuous improvement. While many other accreditation coordinators and education deans have identified Standard 2 as the most difficult standard on which to reach target level, we were, and remain, confident in our decision to move toward target in Standard 2.
Action Plan and Timelines. The Kremen School action plan and timeline for continued movement toward Target for this standard and all others is highlighted in the Kremen School Strategic Plan: 2012-2015. The plan includes goals in the following areas:
• Enhance the Student Learning Environment
• Commitment to Student Transformation and Success
• Transformational Scholarship
• Developing the Kremen School Community
• Engagement with the Region
These areas, and the goals subsumed under them, evidence our current trajectory toward Target. Next year, 2015, the university will be undertaking the development of a new campus strategic plan. As that plan unfolds, the Kremen School will follow with a new strategic plan that not only aligns with the university plan, but positions us to move forward with the emerging CAEP standards.
Within our NCATE unit, our plan continues to capitalize on the expertise and experience of our Program Coordinators to understand the standards, implement the assessment system, generate and analyze data, and close the loop by making well-informed decisions that lead to continuous improvement. To accomplish this, we have established a pattern of regular Program Coordinator meetings. The table below describes the meeting topics, time frames, outcomes, and next steps.
| Meeting Topics | Time Frame | Outcomes | Next Steps |
| --- | --- | --- | --- |
| Analyzing data as prescribed by SOAP or assessment plan | Fall semester: August/September | Data aggregation | Revise SOAPs if required; report to department/program |
| Data interpretation and reporting out | Fall semester: October/November | Identifying areas for improvement | Report to department/program |
| Closing the loop | Spring semester: February/March | Implement continuous improvement | Report to department/program and community partners |
| Gather data as prescribed by SOAP or assessment plan | Spring semester: April/May | Generate disaggregated data | Revise data-gathering instruments and report to department/program |
2.3 AFIs continued from last visit
Candidate performance data on dispositions outside the KSOEHD have not been systematically aggregated and summarized
Feedback: More information needed to make a decision. Link to disposition data was not active.
The link to disposition data has been restored. Initial programs’ pre- and post-assessment of dispositions is available at: http://kremen.fresnostate.edu/ncate/documents/standard1/DispositionAnalysis.pdf
For advanced programs, dispositions are assessed as a subset of the Exit Survey. Please see pages 7-9 in the Exit Survey results available at: http://fresnostate.edu/kremen/ncate/documents/standard1/ExitSurveyAnalysis.pdf
2.5 Evidence for the BOE Team to validate during the onsite visit
1. Aggregated dispositions data for advanced programs
Initial programs’ pre- and post-assessment of dispositions is available at: http://kremen.fresnostate.edu/ncate/documents/standard1/DispositionAnalysis.pdf
For advanced programs, dispositions are assessed as a subset of the Exit Survey. Please see pages 7-9 in the Exit Survey results available at: http://kremen.fresnostate.edu/ncate/documents/standard1/ExitSurveyAnalysis.pdf
In our discussions with Program Coordinators, we have become dissatisfied with the recent return rates on the Exit Survey. The Program Coordinators determined that we need better approaches to safeguard the system and ensure higher return rates. Consequently, each advanced program Coordinator has been charged with reviewing their program’s sequence of courses and completion procedures to determine a more efficient program-specific strategy for maximizing return rates. Strategies to consider include:
• Tying the survey to a culminating experience
• Adding the survey to a signature assignment in a capstone course
• Requiring proof of completion of the survey to apply for a credential
2. Evidence that assessment results data are shared with community committees such as the Kremen School Professional Advisory Committee on a regular basis and that these committees have the opportunity to provide input on programmatic changes.
During the on-site visit, NCATE/CCTC team members will have access to evidence such as meeting agendas, follow-up notes/minutes, artifacts from meetings, and conversations with committee members.
3. Evidence that faculty are provided opportunities to review assessment results and that they have opportunities to provide input on using results for program improvement, especially degree program faculty
All degree programs have a “Closing the Loop” process described in the program’s SOAP. Evidence can also be obtained through interview conversations with faculty during the on-site visit.
4. Evidence that a process is in place for the management and resolution of candidate complaints, including review of any formal complaints that have been made since the last NCATE accreditation visit
During the on-site visit, Associate Dean Marshall will provide evidence of information related to the management and resolution of candidate complaints maintained in his office.
5. Evidence that key assessments are managed and maintained in Taskstream
Recognizing that Taskstream is used only for initial programs, evidence of key assessments (TPA) will be provided during the on-site visit by the Taskstream administrator, Dr. Macy, and the TPA coordinator, Dr. Behrend.
6. Evidence that procedures for ensuring assessments are accurate, consistent, and free from bias are applied to online programs, as well as to assessment measures other than FAST
During the on-site visit, evidence will be provided to demonstrate that assessments, other than FAST, are accurate, consistent, and free from bias. For instance, the Center for Teacher Quality in the CSU system has conducted annual reliability and validity studies of the one-year out completer and employer survey administered to all initial credential candidates employed as teachers.
Additionally, the Kremen administrators, together with program faculty, have conducted validity and fairness studies of the one-year out surveys. These studies have been published in several widely recognized, peer-reviewed journals. Citations include:
Beare, P., Torgerson, C., Marshall, J., Tracz, S., & Chiero, R. (2014). Examination for bias in principal ratings of teachers’ preparation. The Teacher Educator, 49(1), 75-88.
Beare, P., Torgerson, C., Marshall, J., Tracz, S., & Chiero, R. (2013). Surveys of teacher education graduates and their principals: The value of the data for program improvement. Teacher Education Quarterly, 40(3), 143-161.
Beare, P., Torgerson, C., Marshall, J., Tracz, S., & Chiero, R. (2013). Examination of alternative programs of teacher preparation on a single campus. Teacher Education Quarterly, 39(4).
Chiero, R., Tracz, S., Marshall, J., Torgerson, C., & Beare, P. (2012). Learning to teach: Comparing the effectiveness of three pathways. Action in Teacher Education, 34(4).
Marshall, J., Beare, P., & Newell, P. (2012). U.S. Department of Education’s teacher education reform: How does your program rate? Educational Renaissance, 1(1).
Beare, P., Marshall, J., Torgerson, C., Tracz, S., & Chiero, R. (2012). Toward a culture of evidence: Factors affecting survey assessment of teacher preparation. Teacher Education Quarterly, 39(1).
Articles will be available for review during the on-site visit.