Detailed Outline of Program Review
The purpose of Program Review is for instructional units to perform an internal review of the programs and services they offer to students, to make sure that course content and methodology are meeting the needs of both the students and the community. All faculty members within the unit are to be involved in the review process, along with at least one external advisory group and some students. Programs are selected for review each year by the Vice President for Instruction, and all programs eventually rotate through review on a five-year schedule.
The term "Program" is loosely defined for the purpose of review. We will define a "program" as:
"a group of courses, services or activities designed and implemented by a specific group of people with a common purpose or core set of outcomes."
A program can be a degree-granting entity, a group of courses that leads to "adequate training" in an area, or an instructional service delivery area of the college, such as some of the non-occupational courses offered through CCE. No matter what type of program you represent, your review must address the following SACS Core Requirements and Comprehensive Standards, where applicable.
The institution engages in ongoing, integrated and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement, and (b) demonstrates that the institution is effectively accomplishing its mission (institutional effectiveness).
The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.
The institution demonstrates that each educational program for which academic credit is awarded (a) is approved by the faculty and the administration, and (b) establishes and evaluates program and learning outcomes.
The institution’s use of technology enhances student learning, is appropriate for meeting the objectives of its programs, and ensures that students have access to and training in the use of technology.
The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies.
The institution regularly evaluates the effectiveness of each faculty member in accord with published criteria regardless of contractual or tenured status.
When evaluating success with respect to student achievement in relation to the institution’s mission, the institution includes, as appropriate, consideration of course completion, state licensing examinations, and job placement rates.
Sections required of all programs:
In order to address SACS criteria, sections III and IV of the Review outline are required of all programs. Other elements of the Review process allow for some freedom among programs. The overall review process has been refined to help each unit gain useful information and insight from the process.
Description of the Process:
Part I. The Program Profile
Each unit will have an opportunity to describe its program in some detail. All programs should fit within the Mission of CPCC. Most programs do not have a separate mission, except for areas that must be accredited, such as Nursing and Engineering; these programs are often required to have a "department" or "program" mission. If your program has a mission, you may use it here. If it does not have a mission separate from the College's, do not take the time and energy to create one; instead, simply address how your unit's goals align with the College's mission and goals. Goals must always lead back to the mission; if they do not, red flags go up in the minds of those reading your review. All units must link to the College's mission.
Because the college is converting to a merit system, "lifelong learning" is a goal of the college, and 20 hours of professional development are strongly recommended for all faculty and staff, this is a good opportunity to discuss the following:
- The credentials of all full- and part-time faculty: degrees, special training, certificates, etc.
- The accomplishments of the faculty: grants, recognition, awards, fellowships, community service, etc.
- Professional development activities of the faculty: courses taken, special training received, conferences attended, etc.
C. The Students
It is important to discuss the type of student your program serves. Many programs serve only one type of student; for example, Physical Therapy students must be admitted to the program, and the program serves no non-majors, so its students are all declared majors who met a certain GPA and set of prerequisites to enter the program. Other programs serve a diverse group of students. The following are suggestions (not an exhaustive list):
Planning and Research will provide the following for you in a table you can upload into your document (by program code or core set of course numbers):
- Headcount, assigned seats and FTE by term (since Summer 97)
- Demographic information on students (race, gender, age)
- Degrees/certificates/diplomas awarded
- Age, race, gender and credit hour loads of students
Other information you may want to include (from your records)
- Programs that require your courses
- Noticeable trends in enrollment
- Funds received to serve students previously not served
An explanation of any of the above is welcome where you can account for why certain trends have occurred. The important issue here is that programs understand whom they serve and how the characteristics of those students impact the program (offerings, outcomes, etc.).
Part II. Program Content
Here units can define their "program." Some will be degree-granting programs and others will be a set of courses taken for training purposes or as core courses toward general education. Programs have freedom to explain themselves in terms of "function."
A. Definition of the program
B. Curriculum or coursework—this has more to do with the department/program's offerings. Some items that might be included are:
1. Service courses for general education core
2. Stand-alone programs (set of courses that don't lead to a degree)
3. Degrees, certificates, diplomas
C. External accreditation - the accrediting body, the process, what is involved, and current status
D. Innovations, new programs, new courses, state-wide or national efforts, diversity applied to curriculum
The application of diversity to the curriculum
Curricular changes (innovations, new courses, recognized efforts)
E. Testing and remedial coursework
F. Distance education offerings and use of technology (include evaluation of efforts)
Expanded use of technology in the classroom
G. Funding for curricular changes or offerings
Funds received to meet curricular needs
Part III. Student Learning Outcomes
There are three types of outcomes that can be included in Program Review:
1. Student Learning Outcomes (required): The changes in knowledge, ability, skills, values, etc. that occur as the result of individual learning that takes place in the classroom. Some of those things might be the ability to write effectively (appropriate to discipline), ability to properly diagnose and treat a patient, content knowledge applied to real world settings, demonstrated skills, etc.
2. Program Outcomes (required): The benefits (changes in values, status, position, etc.) students receive as a result of completing the entire program of study (rather than a few courses). Some of those things might be employment, licensure pass rates, opportunities for advancement, improved conditions, lifelong learning issues or employer satisfaction with recently hired students.
3. Administrative Outcomes (optional): Benefits to the department, administrative unit, or college as a whole. Administrative outcomes are appropriate if you want to improve programs and services, attempt a new solution to an old problem, or help improve conditions for students, faculty, and staff. Some examples are: recruiting a new faculty member with expertise in a needed area, improving turn-around time for hiring new employees, and increasing student/faculty/staff perceptions of safety on campus by upgrading lighting.
Education has moved into the age of assessment. We can no longer evaluate the effectiveness of our programs by FTE and numbers served. Our system office, accrediting agencies and government officials are interested in the outcomes our students see as a result of attending CPCC. It is no longer good enough that we offer programs; we must show that we are continually seeking to improve the content and methodology of those programs to better meet the changing needs of our students and the community. Many states have gone to performance-based funding to force colleges and universities to assess how effectively they are meeting student outcome objectives.
In this section, you must do the following:
- All credential-granting programs must identify program outcomes (no more than 2-3)
- All programs (basic skills, CCE and curriculum) must identify 3 student learning outcomes based on coursework; for curriculum programs, one of these student learning outcomes should reflect a core competency (see Detailed Core Competency Information)
- Identify what assessment you will use to measure progress on the outcomes (e.g. surveys or the State Nursing Board exam results)
- Identify what constitutes success on that measure (e.g. 80% passing rate)
- Describe how results will be used to improve programs (once you receive the data from your assessment, what are you going to do with it?)
There are several steps involved in measuring outcomes:
A. Identifying outcomes: Ways to identify program and student learning outcomes are as follows:
- Utilize an advisory committee that understands the benefits students achieve as a result of your program.
- Use focus groups of former graduates or completers to get information as to the benefits students have received as a result of your program.
- Look at the syllabi of instructors to see what instructors as a whole expect students to achieve through coursework.
- Look at the literature in your field.
- Check with other schools with similar programs to see how they have assessed outcomes.
B. Decide which outcomes you want to measure. There are three categories of outcomes: student learning outcomes, program outcomes, and administrative outcomes.
Student learning outcomes are the benefits to students (changes in knowledge, skills, values, attitudes, etc.) as a result of learning that takes place in the classroom.
Program outcomes are benefits students receive once they complete an entire program or receive a credential. Sample program outcomes might be:
- Job placement rates
- Transfers to 4-year programs
- Knowledge/skills achieved that make one successful in the field
- Improved conditions (e.g., gaining employment and moving out of public housing)
- Values practiced
- Completion rates and retention rates
- End-of-course (post) test scores
- Critical success factors, licensure exam scores
- Lifelong learning indicators
- Professional service and participation
Administrative Outcomes: Some programs may also want to identify Administrative Outcomes which are outcomes set by the program faculty/staff but don't necessarily have to do with student learning. If you choose to set administrative outcomes, set no more than 2-3. Examples of administrative outcomes would be:
To apply for and receive accreditation from ?????
To retrain two faculty members in the area of (something needed for your program)
To increase the number of students completing courses by 10%
What outcomes "are not":
- Grades from courses
- A list of the 400 learning objectives off the syllabi (don't include copies of your syllabi as proof)
- Program outputs (Headcount, FTE, Assigned Seats, number of graduates, etc.)
C. Assessment of selected outcomes
Determine a method for regularly assessing outcomes. Some typical methods are as follows:
- Pre- and post-testing of students upon entry to and exit from a program
- Grading rubrics or assessments used with specific assignments in courses designed to demonstrate outcome skills, portfolios, eportfolios, graded capstone projects, culminating experiences
- Include a program survey in the general follow-up surveys of graduates and employers conducted by Planning and Research
- Use focus groups
- Use archival data (credentialing boards, state exams)
- Surveys and feedback from graduates
- Surveys and feedback from completers
- Surveys and feedback from current students
- External feedback - advisory committee members, employers of graduates, supervisors of internships, clinical and apprenticeship work
D. Results of Outcome Assessment
Summarize conclusions reached as a result of outcomes assessment.
Part IV. Need for Change
Based on Student and Employer Feedback
Using the results of outcome assessment and accountability measures to improve programs and services is the most important aspect of review. By assessing outcomes, programs often find that students are not doing well in certain areas or that changes need to be made to keep up with trends in the field. Finding program weaknesses or a need for change is a "good thing": it gives a program direction for making changes and the ability to document the effort taken to make program improvements (true institutional effectiveness). Results from measuring outcomes should be used in this section.
Most programs in higher education feel strongly that they offer a good program that is state-of-the-art in their field. Often this is not true, and programs would benefit from taking frequent inventory of program effectiveness, strengths, and weaknesses and from making regular feedback part of their planning process. Students and employers are excellent sources of perceived program strengths and weaknesses. Five sections must be included:
A. Strengths identified by students and employers
B. Weaknesses identified by students and employers
C. Recommendations and strategies for change
D. Suggested ways for improvement
E. Strategies for change - ways to better serve our students
During CPCC's 1992 SACS visit, we were told that we needed to "close the loop," that is, use feedback to improve programs. Programs often claim "on paper" to use student feedback to make programmatic changes, but evidence of those changes is rarely seen. This is why sections III and IV were added (starting in 1998) to the required portion of the review process. During the 2002 visit, we received no recommendations in institutional effectiveness.
Previously, programs that were reviewed in a given year were required to submit a brief document in the spring of the following year identifying all the programmatic changes made as a result of assessing outcomes during their review. In Spring 2012, the follow-up process was changed to a three-year interim review.
Part V. Future Issues
This is an opportunity for programs to discuss what they will need for future growth, where their program is going, or anticipated future changes. Resources needed for future efforts can be discussed here. Some other issues that can be discussed are:
A. Anticipated future curricular changes and needs - this may include the development of new courses or a new emphasis track.
B. Market trends within the program area
C. Equipment, space and faculty needs for future growth or continuation
D. Future plans