
Social Work Program Assessment

Introduction

The BSW faculty and staff have been engaged in ongoing efforts to improve the program and its assessment and to enhance student performance based on the assessment results. In 2006, the program contracted with an Accreditation Specialist and an Assessment Coordinator to assist faculty with developing and refining course assignments to better align with course objectives, designing appropriate assessment instruments, articulating data collection procedures, and analyzing data. Group workshops and trainings, as well as individual consultations, continue to be provided to faculty.

The BSW Program has collected and analyzed data for six years to monitor the quality and progress of the program and to assess student performance. Each year, the Social Work department personnel have been deliberate in their consideration and use of the program assessment results to improve the quality of the program and its outcomes. This attention to the assessment outcomes continues to pay dividends in program improvement.

Although one must consider the caveats regarding the strength of the evidence, given our small student population and the relatively large variance in scores both within and across years, the assessment evidence suggests continuous improvement in student achievement. More precise comparison with previous years' data using the overall benchmark attainment is difficult, given the many changes that have occurred in the program's assessment over the last six years, including annual revisions of keystone assignments, scoring rubrics, and the Internship Learning Agreement evaluation tool. The 2008 EPAS, with its focus on core competencies and practice behaviors, resulted in yet another significant revision of the BSW program and assessment plan in 2010; all syllabi, assignments, and assessment tools were changed to align with the new standards. Moreover, the small sample sizes for the SKC BSW Program assessments mean that individual scores can easily skew the results, which can make it difficult to discern trends over the years when one considers assessment results type by type. In spite of these challenges, the BSW personnel and Assessment Specialist will continue to collaborate to maintain a high-quality assessment plan that aligns with the new guidelines and provides information-rich data that the program can use fruitfully.

All accredited social work programs are required to measure and report student learning outcomes. Students are assessed on their mastery of the competencies that comprise the accreditation standards of the CSWE. These competencies are dimensions of social work practice that all social workers are expected to master during their professional training. The following pages will provide information about the data indicators used in our assessment and the findings by academic year.


Data Indicators

2006-2010

In 2006-2010, the assessment plan was designed based on program objectives, learning outcomes, and the previous CSWE EPAS. During those years, data were collected from the following sources:

- Area of Concentration Achievement Test (ACAT)
- Internship Learning Agreement Outcomes Evaluation (ILAOE)
- Rubric scores and narrative feedback from keystone assignments, including the Cultural Competency Assignment, a capstone project completed over three quarters during the students' senior year that addressed all of the program objectives

Assessment Specialist Regina Sievert, an external evaluator for the program, thoroughly analyzed the data and submitted an objective assessment report annually by August, along with recommendations for improvement. Based on that assessment report, the Department Chair compiled an annual comprehensive program assessment report, titled the Learning Outcomes Assessment Report (LOAP), for submission to SKC's Office of Institutional Research. These reports thoroughly discuss the BSW program's assessment, evaluation, and improvement efforts each year.

2010-2011

In 2010-2011, the program continued its data collection efforts under a new assessment plan. With the new accreditation cycle, the program had to completely revise the previous assessment plan to align with the 2008 EPAS and the core competencies established by CSWE. As a result, the following changes were made to the assessment plan in 2010-2011:

- Faculty members thoroughly reviewed and revised each assignment and its accompanying rubrics to address and align with the 2008 EPAS.
- Outdated keystone assignments were removed and new ones were designed.
- The Internship Learning Agreement Evaluation tool was completely redesigned to address and align with the 2008 EPAS.
- Because of the challenges associated with data entry and analysis based on keystone rubric scores, the Practice Behavior Competency Evaluation Instrument was designed and implemented on Survey Monkey.

Consequently, with the exception of ACAT results, the assessment data from 2006-2010 and 2010-2012 are not compatible or suitable for comparison. The revised and improved assessment plan developed by the Social Work faculty is designed to strategically measure student competencies based on the 2008 EPAS standards developed by CSWE. The data are collected from the following sources: the Area of Concentration Achievement Test (ACAT), completed by students; the Internship Learning Agreement Evaluation (ILAE), completed by the Internship Site Supervisors; and the Practice Behavior Competency Evaluation Instrument, completed by instructors.

Area of Concentration Achievement Test (ACAT)


Internship Learning Agreement Evaluation (ILAE)

The Internship Learning Agreement (ILA), a contract collaboratively developed by the student and field supervisor on a quarterly basis to guide the student's internship experience, was fully implemented in the 2010-2011 academic year internships. The ILA is used in planning and assessing student performance in each of the three quarters of the internship field experience. It consists of various assignments, customized for each individual, that the student completes during the three quarters of internship. Students negotiate the Learning Agreement assignments collaboratively with their field supervisor and the field education director to fit their field placement and to promote their personal and professional growth. The ILA addresses all forty-two practice behaviors that operationalize the CSWE core competencies. At regular intervals throughout the quarter, students meet with their field supervisor and the field director to reflect on their progress and receive formative feedback. The Internship Site Supervisor completes the ILA Evaluation (ILAE) at the end of each quarter, providing a grade and a numerical score that is used for program assessment. In 2006-2011, the ILA Evaluations were completed in hard copy and the data were then entered into Survey Monkey; in 2011-2012, the Site Supervisors will complete and submit the evaluations electronically. The benchmark for success set by the SKC BSW Program is a minimum of 75% of students attaining an average score of at least 7 (70%) on all practice behaviors. An assessment score at or above this benchmark is considered by the program to represent mastery of that particular competency.

Practice Behavior Competency Evaluation Instrument

In 2010-2011, the SKC BSW Program developed and piloted the Practice Behavior Competency Evaluation instrument on Survey Monkey. This instrument replaces the previous system of collecting data from rubric scores and narrative feedback on keystone assignments, although the instructor's assessment of each student's level of competency in each practice behavior is still based on those assignments. Direct data entry by faculty significantly reduces the possibility of subjective interpretation of students' skills and knowledge in specific areas during data entry by the Assessment Coordinator. Each keystone assignment rubric is aligned with the specific practice behaviors addressed in the course. At the end of every quarter, faculty members use their rubrics to enter an evaluation of each student's competency in those practice behaviors into the Practice Behavior Competency Evaluation instrument on Survey Monkey. The tool measures students' competency in each practice behavior on a 10-point scale, with 1 indicating a lack of competency and 10 indicating the level of competency expected of a professional entering the field. The benchmark for success set by the SKC BSW Program is a minimum of 75% of students attaining an average score of 7 (70%) or higher on all practice behaviors. An assessment score at or above this benchmark is considered by the program to represent mastery of that particular competency.
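The benchmark rule above (at least 75% of students averaging 7 or higher on the 10-point scale for a competency to count as mastered) can be sketched as a small calculation. This is an illustrative sketch only, not the program's actual analysis procedure; the function name and the sample scores are hypothetical.

```python
def competency_mastered(student_averages, cutoff=7.0, required_share=0.75):
    """Apply the benchmark rule described above.

    student_averages: one average score per student (10-point scale)
    for a single practice behavior. Returns True when the share of
    students at or above the cutoff meets the required share.
    """
    if not student_averages:
        return False  # no data: benchmark cannot be demonstrated
    attaining = sum(1 for avg in student_averages if avg >= cutoff)
    return attaining / len(student_averages) >= required_share

# Hypothetical cohort: 6 of 8 students (exactly 75%) average 7 or higher,
# so this practice behavior would meet the benchmark.
scores = [8.2, 7.5, 6.9, 9.0, 7.1, 8.8, 7.4, 6.5]
print(competency_mastered(scores))  # True
```

In practice the small cohort sizes noted earlier mean a single student's score can move the attainment percentage substantially, which is why year-to-year comparisons are made with caution.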

Additional Funding

This website is funded in part under a contract (#20043STIP0002) with the Montana Department of Public Health and Human Services. The statements herein do not necessarily reflect the opinion of the department.