
AAQEP Accreditation 2022

Standard 1 Aspect D

Standard 1d: Evidence shows that, by the time of program completion, candidates exhibit knowledge, skills, and abilities of professional educators appropriate to their target credential or degree, including: Assessment of and for student learning, assessment and data literacy, and use of data to inform practice


Data Sources & Analysis

Data Source 1

Comprehensive Exam

Perspective Captured from Data Source: Faculty

Rationale for using Data Source:
Each year, as a culminating experience, students in the MS in School Counseling program may elect to take a comprehensive exam during the final stage of their program. The comprehensive exam consists of two sections: multiple-choice items and essay responses to a vignette.

Specific Elements of Data Source:
We use the comprehensive exam essay, specifically the five items in the rubric below, because each item gauges candidates’ ability to assess a K-12 student’s concerns, develop specific, measurable, achievable, realistic, and time-bound (SMART) goals to help that student, and implement and evaluate the effectiveness of the suggested interventions in promoting K-12 students’ academic, career, and socio-emotional wellbeing. These items also evaluate candidates’ ability to engage in data-driven practices and to use data to evaluate the effectiveness of their counseling practices. We use the following rubric to analyze students’ comprehensive exam essays:

K-12 Comprehensive Exam Essay Rubric

Each item is scored out of 10 points.

1. Describe the four themes of the ASCA National Model and how they all apply to this scenario.
2. Based on the data, what SMART goals would you establish?
3. List and describe systemic interventions you would implement as they relate to your SMART goals listed in #2.
4. Set timelines and responsibilities among you and your team to carry out your interventions in #3.
5. What data would you collect to measure the success of your interventions listed in #3?

Scoring for items 1-4: 10 = insightful and thorough; 8-9 = complete and specific; 6-7 = addressed the question but vague in answers; 1-5 = missed significant issues.
Scoring for item 5: 10 = covered four strategies thoroughly; 8-9 = covered three to four strategies in specific ways; 6-7 = covered two strategies in specific ways; 1-5 = covered fewer than two strategies.

Definition of Success for Each Element:
For our department, the definition of success for a student on the comprehensive exam is a score of 85 points or above (70%). We therefore define success for this data source as at least 85% of students passing the essay portion of the comprehensive exam each year.

Displays of Analyzed Data:
The following table shows the number of students who passed the comprehensive exam by earning 85 points or higher on the comprehensive exam essay in Fall 2019, Fall 2020, and Spring 2021.

Semester | Total Students Evaluated | Number (and %) Who Passed
Fall 2019 | 24 | 22 (91.7%)
Fall 2020 | 17 | 12 (70.6%)
Spring 2021 | 12 | 11 (91.7%)
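To make the arithmetic behind this table explicit, the following is a minimal sketch (in Python, which is not part of the program’s own analysis) of how each semester’s pass rate is computed from the counts above and compared against the 85% program-level success benchmark; the variable names and printed report are illustrative only.

```python
# Minimal sketch: recompute each semester's essay pass rate from the counts in
# the table above and check it against the 85% program-level success benchmark.
# (Illustrative only; counts come from the table, names are hypothetical.)

counts = {
    "Fall 2019":   {"evaluated": 24, "passed": 22},
    "Fall 2020":   {"evaluated": 17, "passed": 12},
    "Spring 2021": {"evaluated": 12, "passed": 11},
}

BENCHMARK = 85.0  # success: at least 85% of students pass the essay portion

for semester, c in counts.items():
    pass_rate = 100 * c["passed"] / c["evaluated"]
    status = "benchmark met" if pass_rate >= BENCHMARK else "benchmark not met"
    print(f"{semester}: {c['passed']}/{c['evaluated']} passed ({pass_rate:.1f}%) - {status}")
```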

Link to Full Dataset: Link to Google Folders (full dataset of essays and rubrics)

Interpretation of Data:
Based on quantitative analysis of the data, we can conclude that in most semesters more than 90% of students pass the essay portion of the comprehensive exam. In Fall 2020, however, only 71% of students passed. We believe one reason five students did not pass that semester was the sudden transition to virtual learning and examination due to the COVID-19 pandemic, as well as the mental health concerns students may have been dealing with during the pandemic. We give students up to three opportunities to take the exam so that they can continue to expand their pedagogical and professional knowledge and ultimately pass and graduate. So far, all students have passed the comprehensive exam within three attempts.

Apart from the quantitative analysis of how many students passed or did not pass, we also interpreted the data qualitatively by focusing on students’ responses to the following questions, which are directly related to this aspect of using data to inform practice: (1) Based on the data, what SMART goals would you establish? (2) List and describe systemic interventions you would implement as they relate to the SMART goals listed. (3) What data would you collect to measure the success of the interventions listed?

As is evident in the sample essays provided through the Google Drive link, students’ responses to these questions demonstrate their ability to set specific, measurable, achievable, results-focused, and time-bound goals based on the data provided in the vignette. For example, some of the SMART goals that students articulated were: (1) helping K-12 students raise their grades by recovering credit for missing and incomplete assignments; (2) providing a social-emotional curriculum for K-12 students by the end of the second quarter of the fall semester and assessing students’ stress levels through self-report surveys sent to students 1-2 times a month; and (3) increasing K-12 students’ sense of belonging, as assessed by school connectedness and belonging surveys and attendance rates.

Students’ responses also included plans for the specific data they could collect and analyze while engaging in individual- and systemic-level interventions to support their clients in meeting their goals. These plans included collecting data on students’ attendance, GPA, and teacher- or parent-reported behavior and academic updates, as well as self-report measures of career choices, academic identity, self-concept, self-esteem, and hope. For instance, one student noted that, to evaluate the effectiveness of their data-driven school counseling interventions, they would “measure progress by monitoring their assignment completion and overall grades percentages through the online database used by the school (example: powerschool).” Other students described using self-report measures of self-esteem, stress, self-concept, career identity, and academic motivation to evaluate the effectiveness of school counseling interventions.

Data Source 2

Site Supervisor Program Evaluation Survey

Perspective Captured from Data Source: Site Supervisor

Rationale for using Data Source:
Candidates’ assessment of and for student learning, assessment and data literacy, and use of data to inform practice are important aspects of the internship evaluation conducted by their site supervisor, who has closely observed and supervised their work as counselors-in-training. We used the COUN 249 Site Supervisor Evaluation, through which site supervisors evaluate candidates’ fieldwork placement, as a data source for this standard because specific items in the evaluation directly assess candidates’ ability to engage in evidence-based practices, use data to inform their practice, and use tests and assessments that are helpful for K-12 students.

Specific Elements of Data Source:
The following items from the Site Supervisor Evaluation were selected for analysis:

  • Understand and use data and information systems on student learning and achievement
  • Understand and use career development materials.
  • Understand and use information on colleges and universities.
  • Understand and use school technologies for information access, teaching and learning.
  • Understand and use tests and measures used in assessing student learning and achievement, development of school, family, and community partnership.

Definition of Success for Each Element:
Site supervisors evaluated students on a 4-point Likert scale (1 = Very Unsatisfactory, 2 = Moderately Unsatisfactory, 3 = Moderately Satisfactory, 4 = Very Satisfactory). Success for this data source is defined as a mean rating of 3.0 or higher on each item.
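As a concrete illustration of this criterion, here is a minimal sketch (in Python, using hypothetical ratings, since the raw evaluations are not reproduced in this section) of how one survey item’s mean is computed from site supervisors’ 4-point Likert ratings and compared to the 3.0 success threshold.

```python
# Minimal sketch with hypothetical ratings: average one survey item's 4-point
# Likert ratings (1 = Very Unsatisfactory ... 4 = Very Satisfactory) and compare
# the mean to the 3.0 success threshold described above.
from statistics import mean

item_ratings = [4, 3, 4, 4, 3, 4]  # hypothetical site supervisor ratings for one item

item_mean = mean(item_ratings)
print(f"Item mean: {item_mean:.2f} -> {'success' if item_mean >= 3.0 else 'below threshold'}")
```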

Displays of Analyzed Data:

A total of 28 site supervisor evaluations were analyzed. All evaluations came from a single academic year because evaluations from previous years were kept only as hard copies and could not be accessed under COVID-19 safety protocols while working remotely. Only the Spring 2020 (when the COVID-19 pandemic began) and Fall 2020 surveys were administered online, so only those could be analyzed.

The table below displays the mean rating for each item from the site supervisor evaluation that addresses this standard.

Survey Item | Sample Size | Mean
Understand and use data and information systems on student learning and achievement | 28 | 3.71
Understand and use career development materials. | 28 | 3.71
Understand and use information on colleges and universities. | 28 | 3.89
Understand and use school technologies for information access, teaching and learning. | 28 | 3.89
Understand and use tests and measures used in assessing student learning and achievement, development of school, family, and community partnership. | 27 | 3.44

 Link to Full Dataset:

Interpretation of Data:
Means for the five survey items from the site supervisor evaluation were calculated and examined. The means for these items ranged from 3.44 to 3.89. All means were above 3.0, indicating that site supervisors rated all students in the “moderately satisfactory” to “very satisfactory” range. This suggests that site supervisors generally perceived students to exhibit the knowledge, skills, and abilities related to assessment of and for student learning, assessment and data literacy, and use of data to inform practice.

Next Steps: 
Based on the data connected to this standard, our candidates exhibited the knowledge, skills, and abilities of professionals appropriate to their target credential. Students learned ways to use assessment tools to support student learning and to use data to inform practice. To address the comprehensive exam results, we will connect with students who plan to take the exam to check on their preparation. Program faculty will continue to maintain open communication with site supervisors about interns’ performance and improvement. To evaluate our efforts in this area, we will track our outreach to students about their preparation for the comprehensive exam and ask supervisors for suggestions to enhance our students’ preparation for assessment and data-informed practices.

Based on the results of this analysis, we also envision strengthening students’ knowledge and skills in using data to inform their practice, engaging in data-driven counseling interventions, and evaluating the effectiveness of their interventions through various quantitative and qualitative measures. To assess our progress in this area, we will also analyze the mean scores on the five essay questions that focus on assessment and the use of data to inform practice.
