Program’s Self-Assessment Rubric

Meta Rubric: Major/Program Assessment

Department_____________________ Program:______________Assessor:_________________ Date completed:_____

To rate a program, assign points for each element at one of three levels (1, 2, or 3; half points acceptable), then total the points at the bottom.

The rubric covers six elements. For each element, an essential question is followed by descriptions of three performance levels:

Level 1: Awareness

Level 2: Development

Level 3: Proficiency


1. Purpose Statement and Goals of Major/Program

Essential Question: Are the program’s purpose statement, values, and goals clear? Do they tie to, and support, the college mission?

Level 1 (Awareness): Purpose statement and/or goals are incomplete, insufficient, and not published on the department web site.

Level 2 (Development): Purpose statement and goals are in progress in draft form, but may be too wordy, inconsistent in style, or too numerous.

Level 3 (Proficiency): Purpose statement and goals are complete, published on the web site, and the basis for all student learning outcomes.
2. Student Learning Outcomes

Essential Question: Does the major/program clearly communicate what it expects students to be able to do, in measurable and observable terms, after completing the program?

Level 1 (Awareness): Outcomes are incomplete, are not measurable or observable behaviors, and are not student-centered.

Level 2 (Development): Stated outcomes are inconsistent or do not encompass the knowledge, skills, or attitudes shaped by the program; observable and measurable behaviors are used, but inconsistently.

Level 3 (Proficiency): Outcomes are clear, well written, appropriate in number, and measurable, and they drive the program’s assessments. Outcomes for each major can be found on the departmental web site.
3. Curriculum Alignment of Coursework and Outcomes

Essential Question: Where in the curriculum are there opportunities for students to learn the expected content and skills? Are all outcomes covered?

Level 1 (Awareness): No completed alignment matrix is presented in written form; there is no written attempt to ensure coverage.

Level 2 (Development): An alignment matrix exists but lacks coverage of some outcomes in the curriculum; the matrix needs revisiting.

Level 3 (Proficiency): The alignment matrix shows linkages between coursework and all learning outcomes.
4. Assessment of Student Learning Outcomes

Essential Question: Is it clear how (and how often) the program measures each outcome against an established expected performance level, using direct and indirect assessment methods?

Level 1 (Awareness): The assessment plan is not well developed, is mismatched with outcomes, or is not implemented; outcomes are not assessed through responses from faculty, students, field supervisors, or alumni; no timetable is noted.

Level 2 (Development): Assessment is underway but not fully implemented; the plan may be missing one or more outcome assessments or criteria for good performance, or the timetable is somewhat unclear.

Level 3 (Proficiency): The assessment plan is fully developed and implemented through the senior capstone seminar. All outcomes are assessed against expected performance using multiple measures (at least two direct measures for each) on a clear timetable (not necessarily annual).
5. Results and Findings Are Summarized and Reported

Essential Question: Is it clear from a brief year-end report that findings are shaping changes to help students?

Level 1 (Awareness): Direct measures of student learning beyond classroom grades are not being collected or analyzed, and thus are not reported.

Level 2 (Development): Some summary data are reported but may not yet drive curricular improvement or departmental planning; the current data summary and analysis report may be incomplete (or complete but not submitted to the Assessment Office).

Level 3 (Proficiency): Summary data from the previous year(s) are collected and analyzed, and the report is submitted to the Assessment Office. Changes to help students, if any, are suggested.
6. Evidence Reported of Changes Made, with Follow-Up Re-assessment

Essential Question: Is it clear that the program monitors and reports the impact of changes made and uses these assessments to drive further improvement and planning?

Level 1 (Awareness): Either there is no report, or the report does not suggest changes based on assessments; there is no evidence that assessment data are used to drive change; evidence of pertinent department meeting discussions is not available.

Level 2 (Development): Analysis of recent changes prompted by some outcomes assessment has begun but may not yet be formally reported; impact may be limited; minutes documenting department meetings may not be present, although meetings may be occurring.

Level 3 (Proficiency): Analysis of changes made in recent year(s) and their impact is further assessed and reported. There is evidence (summaries of department meetings; web site posts) that assessments are used to drive planning and improvement.


Total Program Actual Score: _____

Program’s Mean Score (total divided by 6): _____
