




Chapter 5. Criterion 4. Teaching and Learning: Evaluation and Improvement

The institution demonstrates responsibility for the quality of its educational programs, learning environments, and support services, and it evaluates their effectiveness for student learning through processes designed to promote continuous improvement.

Core Components

4.A. The institution demonstrates responsibility for the quality of its educational programs.

4.A.1. The institution maintains a practice of regular program reviews.

All academic programs undergo review every five years. For undergraduate programs, the review is conducted by the Academic Program Review Committee (APRC), composed of faculty from each of the colleges and the Vice Provost for Institutional Effectiveness. Each program must respond to a set of questions that focus on program viability and quality. The Office of Institutional Research (IR) provides each department with a Data Pack that contains program information over a five-year period in three areas: Academic Quality, Productivity, and Finances. Programs are evaluated based on their responses to the program review questions and the results presented in the Data Packs. Graduate programs are reviewed by the Graduate Council using a set of standards similar to those employed by the APRC. All programs, undergraduate and graduate, are on the same review cycle. The recommendations of the APRC and the Graduate Council are forwarded to the Provost and Chief Learning Officer, who then meets with the appropriate parties to discuss whether the program is to be revised, continued as is, or deleted. This review has led to the termination of several degree programs over time.

All new programs and courses must go through a uniform curriculum review process before the program or courses can be offered. All new courses must be approved first by the department's curriculum committee and then by the college curriculum committee before they are sent to the dean for final approval. This process was modified in 2013 to give faculty from other colleges the opportunity to speak for or against new proposals at the college curriculum meetings. The change was initiated in response to concerns that faculty only learned of new courses in other departments after they were approved and inserted into the undergraduate and graduate catalogs. There were instances where faculty in other departments believed that a particular new course was redundant or affected their majors, as it sometimes replaced an older course that was part of other disciplines' programs of study. Now, all new course proposals are posted and all departments are notified prior to approval by the college curriculum committee, so that concerned faculty from other disciplines may attend the committee meeting and state their position for or against the proposal.

This new process has significantly increased the dialogue between faculty from different disciplines and has led to changes in the structure of new courses and new programs. For example, in 2013, a new program in Sport Management was proposed. The faculty from the management program in the Harmon College of Business were concerned about the redundancy between several management courses to be taught in the proposed program and those already offered by the business faculty. These discussions led to the inclusion of several management courses from the business program in the Sport Management program of study.

As mentioned earlier, the primary criteria in the initial review of new programs are demand and resources. Historically, both the APRC and the Graduate Council were involved in the review of all new major program proposals. This was changed recently when the APRC concluded that the Provost and Chief Learning Officer was in a better position to determine whether the department proposing a new program had adequately demonstrated demand and the resources to support the proposed major. The faculty on the respective committees did not feel sufficiently informed on these matters to approve or deny new programs. This change has increased the pressure on departments to address need and costs in a more direct and concrete manner than in the past.

In addition to the internal program review process, the Missouri Department of Higher Education (MDHE) created its own program review process based on a single metric: number of graduates. Every public institution in the state was given a list of programs that failed to meet the minimum number of graduates established by the MDHE (i.e., an average of 10 graduates per year over the previous three years for undergraduate programs and 5 per year for graduate programs). UCM submitted a report as part of the Statewide Academic Program Review, and based on this graduation metric, UCM eliminated 11 undergraduate programs (a phased elimination over a 3-year period). The University used an appeal process that was part of the review methodology to seek extensions for 14 additional programs that did not meet the MDHE graduation criterion. In the fall of 2013, UCM submitted a follow-up report on these 14 programs under appeal for continuation. MDHE has not yet responded to this request.

4.A.2. The institution evaluates all the credit that it transcripts, including what it awards for experiential learning or other forms of prior learning.

UCM credits are hours earned via on-campus courses; online and hybrid courses; courses taught at UCM's Innovation Campus and Central Summit Center in Lee's Summit, Missouri; and courses taught at Whiteman Air Force Base in Knob Noster, Missouri. All credit is rigorously evaluated before it is transcripted. This review is conducted primarily by the staff in the admissions and registrar's offices in accordance with the Principles of Good Practice for Transfer and Articulation adopted by the Missouri Department of Higher Education (June 1998); the Credit Transfer Guidelines for Student Transfer and Articulation Among Missouri Colleges and Universities, revised October 13, 2005; and UCM's Transfer Policies.

Page 7 of our undergraduate catalog describes the various types of credit accepted by UCM, including transfer credit; credit by exam or special credit (nationally normed exams such as the College Level Examination Program, Advanced Placement Examinations, and the International Baccalaureate Program); validated credit (i.e., demonstration of competency by exam or performance); and other types of credit based on licenses, certifications, military service-related credit, and work experience as approved by the Vice Provost for Academic Affairs. These types of "other credit" do not count toward residency hours or upper-division hours at UCM and are denoted with a CR on the degree audit and transcript. Students may apply a maximum of 30 hours of credit by exam or special credit (SC) and a maximum of 30 CR credit hours toward their degree.

Review of courses, training, and experiences of military personnel and veterans is performed by a staff member in the Office of Extended Campus and Distance Learning who is specifically dedicated to, and trained in, the application of American Council on Education (ACE) standards.

Review of credit for our international students is performed by UCM's International Center. A more detailed description of the admission process for international students can be found on page 5 of the University's undergraduate catalog and on our website. Academic departments and/or the Vice Provost for Academic Affairs are consulted in all cases where there is any uncertainty about the awarding of credit.

4.A.3. The institution has policies that assure the quality of the credit it accepts in transfer.

To ensure quality and consistency in the application of transfer credit standards, UCM accepts credit only from regionally accredited institutions. Transfer credit is handled by four University offices: the Office of Admissions, the Graduate School, the International Center, and Extended Campus & Distance Learning. The Office of Admissions reviews and posts all transfer credit, while the other offices review and post work selectively. Extended Campus & Distance Learning posts military credit and military experience as transfer credit.

Students are able to access our website to determine course equivalencies, but the burden for review of transfer credit resides with our well-qualified professional academic advisors. These advisors do the initial evaluation of general education courses that students wish to transfer to UCM. In the vast majority of cases this is relatively straightforward due to our extensive articulation and course equivalency tables. In instances where there is no established equivalent course, the advisors frequently request additional materials from the student, including course descriptions, the syllabus, and the textbook used in the course. In some cases, the advisors contact the instructor at the sending institution to seek additional information to help in their deliberations. Final decisions on upper-division courses that apply to a major are made by the faculty from the appropriate department.

Office of Admissions

Review of transfer credit is guided by UCM's transfer credit policies and guidelines disclosed in the undergraduate catalog (page 7), which also describes the process for students who wish to appeal the way regionally accredited transfer work is accepted.

The Graduate School

The graduate catalog describes credit transfer policies on pages 14 (graduate certificates) and 16 (master's degree programs). A student may transfer a maximum of 50 percent of the required hours of graduate credit from another institution to a graduate certificate and a maximum of nine hours of graduate credit from another institution to a master's degree program. Graduate transfer credit must be approved by the student's advisor and the Office of Graduate Studies. For quality control, students may appeal credit transfer decisions by following the instructions on page 6 of the graduate catalog. The catalog also lists the graduate policies that are not subject to appeal by students because they help UCM ensure compliance with federal and state laws, codes, regulations, and accreditation requirements.

The International Center

The University of Central Missouri awards transfer credit for courses completed at an appropriately recognized university or post-secondary institution of higher learning. An appropriately recognized university is one that is officially recognized by the home country's Ministry of Education as an institution with full university degree-granting authority. Transfer credit is accepted for courses deemed appropriate to the University of Central Missouri's baccalaureate programs. Credit is generally not awarded for English courses or for coursework in which a minimum grade of "C" or its equivalent was not earned.

As part of our acceptance procedures, UCM does not accept credit for programs and/or courses completed at non-university post-secondary technical institutions. No direct, coordinated link exists between technical post-secondary education and university education. Thus, students in technical education programs may progress to higher levels of technical education but do not receive advanced standing, transfer credit, or exemptions at the University of Central Missouri. Students wishing to appeal acceptance of international transfer credit must meet with the Assistant Director of International Admissions. This policy is not currently shared with the public on the website or in the undergraduate catalog.

Office of Extended Campus & Distance Learning

The undergraduate catalog describes acceptance of military experience as credit on page 8 under "Credit for Official Certifications, Licenses, Diplomas and Work Experience" and "Military Service-Related Credit for Personal Interaction." Credit is given for experience in the Army, Marines, Navy, and Coast Guard as documented on the ACE transcript. Once a course has been reviewed and approved, it is added to our course equivalency list along with the courses of all other institutions from which we accept transfer credit. Currently, not all branches of the military are represented on this list, as UCM has only recently adopted this practice.

4.A.4. The institution maintains and exercises authority over the prerequisites for courses, rigor of courses, expectations for student learning, access to learning resources, and faculty qualifications for all its programs, including dual credit programs. It assures that its dual credit courses or programs for high school students are equivalent in learning outcomes and levels of achievement to its higher education curriculum.

All new courses, prerequisites, co-requisites, and programs are subject to the University's curricular review process. New courses are first vetted at the department level and, if approved, sent to the college curriculum committee for review and approval. To eliminate redundancy and to guarantee faculty from other disciplines the opportunity to comment on any new course proposal, all department chairs and program coordinators are notified of new course proposals prior to the college curriculum committee's meeting and may attend to comment on the proposals and any prerequisites listed for the new courses. This process has had a significant impact on new course development and accompanying prerequisites: faculty outside the college from which a course originates can inform the proposing faculty of its possible impact on their programs.

New programs are first reviewed by the department's curriculum committee. If the college dean then approves the proposed program, the Provost and Chief Learning Officer evaluates the proposal to determine if there is a demonstrated need for the major and what, if any, new resources will be needed to deliver the program. Following the Provost's approval, new program proposals are sent to the appropriate college curriculum committee and then to the Faculty Senate University Curriculum Committee (FSUCC). If appropriate, new programs are also reviewed by UCM's Teacher Education Council or the Graduate Council.

The Teacher Education Council (see Section III of the Faculty Guide, page 18) is the Executive Board of the Professional Education Faculty (PEF) and serves in an advisory capacity to review and make recommendations on policy, program, curricular, and programmatic assessment matters related to the Teacher Education Program. The Teacher Education Council (TEC) serves the PEF and is representative of it. The TEC reports to the Dean of the College of Education in his or her capacity as the University's official Unit Head.

The Graduate Council's primary responsibilities are to: develop and review University policies and procedures for graduate education; review and recommend graduate curriculum; monitor the quality of the University's graduate programs; and advise the graduate dean on university, college, and department policies and procedures as set forth in the Graduate Catalog. The Graduate Council is composed of eight faculty members who hold full graduate faculty status, two graduate students representing different colleges, the Dean of the Graduate School, and the Associate Dean of the Graduate School.

The FSUCC is composed of faculty from every college as well as a number of key administrators from the registrar's office, graduate school, international center, academic advising, and assessment. The FSUCC Handbook provides detailed descriptions of the processes to be used for curricular changes such as new courses, new programs, program revisions, and program deletions as well as general information for faculty with regard to the responsibilities of the FSUCC. As part of the curriculum review process, the FSUCC evaluates prerequisites, course outcomes, program rigor, assessment methodology, and credit hours required for the degree as well as the resources needed to support the course requirements such as library materials, labs, adequate supervisors, etc.

If the FSUCC endorses the program, the proposal is sent to the Provost for approval. Major curricular changes, such as new program proposals, substantial revisions of programs, title changes to programs, and program deletions, must be reviewed and approved by UCM's Board of Governors (BOG) before being sent to the Missouri Department of Higher Education (MDHE). The MDHE posts all new program proposals on its website for public comment. UCM, like all institutions in the state, is required to respond to all comments and concerns raised by other institutions. If there are significant concerns, the MDHE mediates the dispute between UCM and the objecting institution. Once the MDHE approves the program, it is sent to Missouri's Coordinating Board for Higher Education (CBHE) for final approval. The Higher Learning Commission is then informed of the new program via the Annual Institutional Data Update. If the new program is to be offered at a new site, or involves other institutions in a consortial or contractual arrangement, the University sends the proposal to the HLC for review and approval. For example, this past year UCM's Board of Governors approved the offering of our BS in Aviation at Whiteman Air Force Base (WAFB) in Knob Noster, which was not an approved site for UCM. The University sought and received approval from the Higher Learning Commission for this program to be taught at WAFB before we formally offered courses in the program.

The University strengthened the requirements for how course credit is awarded by requiring all departments to follow the recommendations for determining credit hours as defined by the United States Department of Education and endorsed by the HLC. The approved definition of the credit hour at the University of Central Missouri is presented in our Faculty Guide and reads as follows:

"When designing a course, it is particularly important that the amount of student engagement or work in the course's intended learning outcomes assigned by faculty and verified by evidence of student achievement, is appropriate for the credit hours awarded.

A credit hour is an amount of work represented in intended learning outcomes and verified by evidence of student achievement that is an institutionally-established equivalency that reasonably approximates not less than: (1) one hour of classroom or direct faculty instruction and a minimum of two hours of out-of-class student work each week for approximately fifteen weeks for one semester hour of credit or the equivalent amount of work over a different amount of time; or (2) at least an equivalent amount of work as required in (1) of this definition for other activities as established by the University of Central Missouri, including laboratory work, internships, practica, studio work, and other academic work leading toward the award of credit hours; or (3) institutionally established reasonable equivalencies for the amount of work described above in paragraph (1) of this definition for the credit hours awarded, as represented by verifiable student achievement of intended learning outcomes."

At the encouragement of UCM's Chief Learning Officer, many department websites, publications, and syllabi contain statements that clearly define student expectations with regard to engagement and corresponding credit hour allocation.
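Read concretely, clause (1) of the definition above implies the following baseline arithmetic (a worked illustration of the policy, not an addition to it):

\[ 1 \text{ credit hour} \approx \left(1 \text{ hour instruction} + 2 \text{ hours out-of-class work}\right) \times 15 \text{ weeks} = 45 \text{ hours of student work} \]

By the same equivalency, a typical three-credit course represents roughly 135 hours of combined instruction and out-of-class work per semester.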

Under the leadership of our president, Charles Ambrose, the University is moving toward demonstration of competency as a means of determining readiness for graduation. This performance-based assessment of learning has been a cornerstone of our assessment model, Central's Quality Improvement Program (CQIP), since 1991. However, the final component of our model, which asks faculty not to allow students to graduate until they have demonstrated competency in the program outcomes, has not been fully implemented. Our current definition of the credit hour does allow faculty to award credit for student demonstration of competency in the outcomes defined for a given course. Although not yet required of students, this addition to our definition of the credit hour will allow programs to move toward a competency-based model of delivery.

The University's Dual Credit policy ensures course quality by establishing minimum qualifications for students and faculty alike and by requiring faculty who teach in our dual-credit program to use the syllabus and instructional materials prepared by UCM faculty. The majority of our dual-credit courses are taught by high school teachers who meet UCM's specific educational requirements, under the supervision of University of Central Missouri faculty. University faculty teach additional dual-credit courses, using distance-learning technologies to reach schools outside the university. UCM faculty mentors help dual-credit faculty prepare and teach their classes and ensure that the standards established by UCM are met.

The credentials of faculty are reviewed at the time of hire and again as part of promotion and tenure evaluations. As mentioned previously, graduate faculty status is determined and assigned by the Graduate Council. To help ensure the quality of its online instructors, the University established a certification process for faculty who wish to teach online (AP&R 11). This process also includes a review of, and assistance in, the design of courses to be offered online, as described earlier in 3.C.

UCM works hard to ensure that all students (on campus, online, at the Missouri Innovation Campus and Central's Summit Center, and at Whiteman Air Force Base) have access to comparable learning and student support resources. UCM provides on-site advising, textbooks, library access, and even a courier service between our main campus in Warrensburg and Lee's Summit, Missouri, for students taking classes at both sites. Similarly, UCM's Office of Extended Studies surveys our online students every few years to determine student needs. As a result of feedback obtained from the survey administered in 2009, the University hired an additional online academic adviser. The Standards and Regulations outlined in the University's catalogs (pages 15-24 of the 2013 undergraduate catalog) apply equally to all students regardless of location or mode of delivery.

As part of Central's Quality Improvement Program (CQIP), all academic programs are asked to administer summative assessments to all of their graduating seniors. The results of these assessments are reviewed by the program faculty to determine the strengths and weaknesses of their program and their students. Although not all programs use nationally normed tests or licensure exams, many do include such an exam as part of their overall assessment program. A listing of the assessments administered by the Office of Testing Services shows that the nature and type of assessments used are quite varied. UCM's assessment model and program are discussed in more detail in 4.B.

4.A.5. The institution maintains specialized accreditation for its programs as appropriate to its educational purposes.

The University of Central Missouri has a history of seeking and maintaining appropriate specialized accreditation for its academic programs. UCM's Theatre programs received accreditation from the National Association of Schools of Theatre (NAST) in March 2013. In April 2013, the Art Department was notified of its successful application for accreditation by the National Association of Schools of Art and Design (NASAD). Our Athletic Training program was notified last spring that it had received accreditation from the Commission on Accreditation of Athletic Training Education, bringing the number of programs with specialized accreditation to 37 (29 undergraduate and 8 graduate). The University sees such accreditations as confirmation of the quality of our degree programs, and UCM strongly supports and encourages all programs for which accreditation is available to pursue such recognition. A complete list of specialized accreditations is maintained in the Provost's Office and posted on UCM's website, in our catalogs, and in our Fact Book. In addition, UCM's bachelor's degree program in Chemistry is approved by the American Chemical Society.

4.A.6. The institution evaluates the success of its graduates. The institution assures that the degree or certificate programs it represents as preparation for advanced study or employment accomplish these purposes. For all programs, the institution looks to indicators it deems appropriate to its mission, such as employment rates, admission rates to advanced degree programs, and participation rates in fellowships, internships, and special programs (e.g., Peace Corps and Americorps).

Securing highly reliable and valid data on the success of our students once they leave the institution continues to be a goal. The vast majority of our programs have advisory boards, many of which have existed since the early 1990s. These boards typically consist of employers, alumni, faculty from other institutions, members of professional organizations, and site reviewers from accrediting bodies. The advisory boards help validate the student learning outcomes of our programs and serve as the main mechanism for learning which outcomes need to be added to or deleted from a program's set of learning expectations. Some advisory boards provide us with feedback on the preparedness of our graduates as they enter the workforce. Some programs, especially those for which there is not a well-defined employment classification, track the success of their graduates in terms of admission into and completion of advanced degree programs. The programs that have been most successful in tracking their graduates are those with a well-defined employment niche, such as nursing, teacher education, business, and applied technology. For example, the Teacher Education Council annually surveys its currently enrolled students, alumni, and principals from schools that employ our graduates.

Our Office of Career Services tracks the employment history of our graduates and publishes an annual report that provides detailed information on the employment and salary trends of our students. For the past 19 consecutive years, more than 90% of our graduates have been employed within six months of graduation. Although this information has proven quite useful at the institutional level, it has not been very informative at the program level. For that reason, each department is asked to track its graduates as one measure of program quality. The most common methods used to obtain this type of information are alumni and employer surveys. The problems we have encountered with these methods center mostly on low response rates and a lack of qualified evaluators in the workforce: it has proven difficult to find evaluators who are both trained to provide appropriate feedback and positioned to have the necessary information about, and contact with, our graduates.

Indirect evidence of the quality of our graduates comes from the number of employers who attend our Career Expos each fall and spring. UCM expects more than 150 employers to attend our spring 2014 Career Expo, held on campus in our Multi-Purpose Building; in 2011, that number was 94. In March 2014, the Office of Career Services will host the annual Teacher Placement Day, with more than 100 school districts expected to attend. The number of potential employers attending these events has increased steadily over the past three years and serves as testimony to the quality of our graduates.

The enthusiastic willingness of our business partners to join the Missouri Innovation Campus (MIC) undertaking is another indirect measure of the success of our graduates. Representatives from these businesses have contributed hundreds of hours to the development of the program outcomes for the curriculum. In addition, as part of their commitment to the MIC initiative, these business partners have pledged to offer recurring internships to our students while enrolled in this unique program, with the promise of a job at degree completion provided the student demonstrates adequate proficiency in the designated outcomes in the company setting. These businesses, which have a history of employing our students, would not have made such a commitment had the UCM graduates they hired in the past not been competent.

The University does not collect employer satisfaction measures at the institutional level. The Faculty Senate University Assessment Council (FSUAC) recognized this lack of information on our graduates as a significant weakness in our student information system and approved the administration of an employer satisfaction survey beginning in the fall of 2014. The survey will measure, among other things, employers' likelihood of hiring additional graduates from UCM; their satisfaction with our graduates' knowledge in their major field; and our graduates' ability to apply that knowledge in the work setting, to work in teams, and to solve problems.

4.B. The institution demonstrates a commitment to educational achievement and improvement through ongoing assessment of student learning.

4.B.1. The institution has clearly stated goals for student learning and effective processes for assessment of student learning and achievement of learning goals.

The University introduced its current assessment model in 1991. The Continuous Process Improvement (CPI) model, as it was called then, has since undergone numerous revisions. The current version of our assessment model, CQIP (Central's Quality Improvement Program), has been in place since 2009. Our CQIP model is based on the assumptions that assessment's primary purpose is learning (both student and faculty) and that demonstration of such competency defines readiness to graduate. Therefore, in our model, every student is to be assessed. Sampling is NOT the preferred method and is not commonly used to measure student learning. Information derived from the assessment of all students provides the means to document student learning, the opportunity for valuable feedback to students, and excellent program evaluation information.

CQIP directs faculty responsible for a program to:

  • Identify and validate student-learning outcomes for its programs;
  • Create faculty structures and processes to assure student learning of complex outcomes;
  • Identify methods to assess student achievement and determine how the results of assessment will be used to improve learning;
  • Implement student assessment, documenting student progress and how the results are used to improve teaching and student learning.

The guidelines for implementation of Goal A.1 of the CQIP model read as follows: "Each department responsible for a program will define, validate, and communicate the student-learning outcomes for its major programs." Faculty have defined and published student-learning outcomes for all majors, and these outcomes serve as the organizing principles of their programs. The outcomes are presented in the University catalogs and in many departmental publications and websites. They are communicated to students through a variety of mechanisms, including orientation courses to the major, publications such as discipline-specific student handbooks, and most course syllabi (Goal A.3 of CQIP). Many departments have a formal orientation process for new faculty that includes a discussion not only of the student learning goals for the program, but also of the outcomes assigned to courses in the program. The vast majority of departments have developed curriculum matrices that identify the specific outcomes addressed in specific courses. The outcomes for the various programs can be found in the Virtual Resource Room under CQIP, Goal A.1. Examples of curricular matrices are presented under Goal A.4 of CQIP.

UCM's Faculty Senate University Assessment Council (FSUAC) is responsible for oversight of the University's assessment system and for establishing, monitoring, and reviewing assessment processes and data. It has been in existence since 1987. The FSUAC is responsible, along with the Director of Assessment who serves on the committee, for ensuring academic and student support services are engaged in effective assessment practices as prescribed by CQIP. The FSUAC is composed of the Academic Programs and Curriculum Group and the Administrative and Support Services Group. In addition to a university-level assessment committee, each college and academic department has an assessment committee that is responsible for implementation of CQIP to include the collection and review of assessment information.

The University does not require departments or offices/units to develop and submit a formal assessment plan each year. Instead, departments are asked to include their assessment plans or goals as part of their annual reports relative to implementation of CQIP. The Office of Institutional Research provides each program with a data pack that contains five years of data on measures of academic quality and productivity/cost. As part of the annual reporting/review process, department chairs meet with their dean to determine their department's progress on the previous year's goals and to develop action plans for the forthcoming year.

The self-study team determined that assessment processes and review of assessment information are not occurring at the depth and scope needed in all academic units. Too many departments have not developed adequate formative and summative assessment systems for their programs. The problem appears to lie neither with the model nor with a lack of commitment to assessment. The team found the culture of assessment to be well established at UCM, but the University lacks a system by which faculty and program coordinators can readily record and access student information, particularly as it pertains to student learning. Furthermore, faculty continue to express the desire for a system that allows them to connect assessment, faculty, and student information directly to accreditation standards as they enter the data into a University-level data collection system.

Approximately four years ago, the University purchased an accountability management system (Outcomes Suite) from Blackboard in an attempt to better coordinate data collection and assessment of student and faculty performance, and to help us prepare for this re-accreditation effort. For a variety of reasons, this system did not provide the accountability management outcomes we hoped for, and we discontinued our contract with Blackboard. Since then, the University has restructured the Office of Institutional Research (IR) to better deliver services to our faculty and staff with regard to documenting institutional effectiveness, student learning and development, and faculty productivity. The IR office now has a director and three research analysts who provide data, reports, and analyses for the campus. However, the need for an accountability management system continues.

As part of UCM's efforts to address this problem, the IR Office has started the process of developing a data warehouse, with initial training slated for April 2014. A data warehouse is a read-only database that allows users to retrieve and manipulate data for analysis. The warehouse will allow IR to begin collecting historical snapshots of data over time, which will allow for cleaner time-series analyses of important University metrics. Moreover, as these historical snapshots are archived, the IR staff will be able to construct usable trend analyses for projects we cannot currently anticipate. The warehouse can also be mined more effectively as we expand our use of business intelligence tools (e.g., forecasting, demand analysis) without concern for querying a "live" database. In addition, the IR staff has implemented a pilot project that allows faculty to enter student performance data from their classes directly into the data warehouse. Each of the eight pilot programs created a table based on its major's expected student learning outcomes. These tables include a short title for each outcome and a longer description of that outcome. The long description also describes how the data are recorded in the table. For example, faculty may assign a student a metric based on outcome performance or a qualitative description, such as "High," "Low," or "Non-Satisfactory."

With this system, faculty may submit individual student performances into the database. The data must be submitted as a CSV file that includes columns headed "Student ID," "Outcome," and "Value." The Student ID column must be populated with the student identification number. The Outcome column must be populated with the relevant outcome (either the short title defined by the program or an outcome number). Finally, the Value column must be populated with the assessed student performance. Programs and faculty may also include a column headed "Assessment ID," which may be used to track the date, instructor, assessment number, and/or course number for program tracking purposes. This system will allow deans, chairs, and faculty to query data directly. With a program's assessment data posted to the assessment database, faculty may develop reports such as a histogram of student attainment on the tracked outcomes. Summary statistics on student outcomes can also be constructed, and reports that explore correlations between outcomes can be developed and displayed.
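To make the submission format concrete, the following is a minimal sketch of the kind of validation and summary step such a pipeline implies. The column names ("Student ID," "Outcome," "Value," and the optional "Assessment ID") come from the description above; the outcome definitions, the file name, and the use of Python with pandas are illustrative assumptions, not a description of IR's actual loader.

import pandas as pd

# Hypothetical outcome table for one pilot program: short title mapped to a
# longer description, mirroring the structure described above.
PROGRAM_OUTCOMES = {
    "OUT1": "Hypothetical outcome: students communicate effectively in writing.",
    "OUT2": "Hypothetical outcome: students apply appropriate analytical methods.",
}

REQUIRED_COLUMNS = {"Student ID", "Outcome", "Value"}

def load_submission(path: str) -> pd.DataFrame:
    """Read a CSV submission and check it against the expected format."""
    df = pd.read_csv(path, dtype=str)
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {sorted(missing)}")
    unknown = set(df["Outcome"]) - set(PROGRAM_OUTCOMES)
    if unknown:
        raise ValueError(f"Unrecognized outcomes: {sorted(unknown)}")
    return df

def attainment_counts(df: pd.DataFrame) -> pd.DataFrame:
    """Count students per (Outcome, Value) pair: the raw material for a
    histogram of student attainment on each tracked outcome."""
    return (
        df.groupby(["Outcome", "Value"])
          .size()
          .rename("Students")
          .reset_index()
    )

if __name__ == "__main__":
    # "program_outcomes.csv" is an illustrative file name.
    submission = load_submission("program_outcomes.csv")
    print(attainment_counts(submission))

Because the Value column may hold either a numeric metric or a qualitative rating such as "High" or "Non-Satisfactory," the sketch simply counts students per outcome and value, which supports both the attainment histograms and the summary reports described above.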

Presented in the Virtual Resource Room are data reports developed by the Office of Institutional Research on selected programs participating in this pilot effort. The Provost and Chief Learning Officer, the Vice Provosts, the Director of IR, and deans continue to discuss the University's options, to include further development of the in-house accountability management system or the purchase of a product from an established vendor.

IR has also developed a draft Academic Affairs Dashboard that includes a set of key performance indicators developed by UCM's administration with the assistance of a consultant from the RPK Group. The data set that will comprise the Dashboard will be finalized in the spring of 2014. Included in the Dashboard are the metrics chosen by UCM as part of Missouri's Performance Funding Initiative and the key data elements from the data packs (described earlier in Section 3.A.1). A draft of the Academic Affairs Dashboard for selected departments can be found in the Virtual Resource Room. The Dashboard will consolidate and display data already being gathered in various information systems throughout the University, and the information it presents will be used in the academic planning and program review processes.

4.B.2. The institution assesses achievement of the learning outcomes that it claims for its curricular and co-curricular programs.

All students who graduate from UCM must take and pass a nationally normed test of general education knowledge and skills. The Faculty Senate University Assessment Council (FSUAC) established this requirement in an attempt to ensure minimal competency in the knowledge and skill areas that define our general education program. For teacher education majors, this test (until the fall of 2013) was the College Basic Academic Subjects Examination (CBASE), which measures a student's knowledge and skill in English, writing, mathematics, science, and social studies. The test used for all other undergraduate students at UCM is ETS's Proficiency Profile.

The passing score on the CBASE used for admission into teacher education was established by the Missouri Department of Elementary and Secondary Education (DESE). In order to have comparable standards for teacher education and non-teacher education majors, the FSUAC set a passing score on the Proficiency Profile at the same percentile as the passing score on the CBASE. DESE has implemented a new test for admission into teacher education, but at the present time no statewide passing score has been established. Through March 2013, the average first-attempt pass rates of UCM students on the CBASE were as high as or higher than the average pass rates for Missouri institutions overall. The difference is most pronounced for our students categorized as Asian-Pacific Islander and African American, whose first-attempt pass rates are substantially higher than the state averages. For example, our Asian-Pacific Islander students' performance in writing and social studies was 15% higher than the Missouri average, and UCM's African-American students scored 8% higher than the state average in English, 10% higher in mathematics, and 9% higher in social studies. The average pass rate on ETS's Proficiency Profile exam is 92.69%.

From its inception, the primary purpose of CQIP has been to implement a performance-based educational delivery system that defines readiness for graduation as the demonstration of competency in well-defined knowledge, skills, and attitudes or values (KSAs) rather than the accumulation of course credits with a minimum GPA. This goal, established more than 20 years ago, remains largely unfulfilled, with the exception of Teacher Education, Nursing, and a few other programs that have implemented the demonstration of minimum levels of competency in their programs' outcomes as a condition for graduation.

We intend for all programs offered at our Missouri Innovation Campus (MIC) in Lee's Summit, Missouri, to be competency-based. This goal has an excellent chance of being realized, since the curricula for the programs to be offered at the MIC are heavily skill-based and designed with competency-based learning in mind. The program outcomes and programs of study are co-designed by faculty from UCM, the Metropolitan Community Colleges of Kansas City, the Lee's Summit R-7 School District, and representatives from our business partners in the Kansas City area. We currently offer two programs at the MIC; the first program we developed was Systems Engineering Technology. All of the programs offered will include extensive internship experiences for students, beginning the summer after their junior year in high school. Our goal is to develop, by the end of the spring 2014 semester, structured learning experiences that address the outcomes of the major program and that will be practiced and assessed in industry settings. These exercises will include the application of faculty-developed rubrics by on-site personnel trained in their use. We hope to validate student learning in this manner beginning with the summer 2014 internships.

As part of CQIP, all academic departments are required to conduct both formative and summative assessments of their students. The majority of our programs assess two to three of their program outcomes as part of their formative assessment processes after the student has completed 12-15 hours in the major. The nature of these formative assessments varies considerably from program to program. Some, such as Music, Art, and Theatre, have rising-junior reviews for their majors that involve a thorough evaluation of a student's portfolio, including student products in addition to classroom work. A number of programs have specific criteria that must be met as a condition for admission into the program. These criteria typically include traditional measures like GPA in the major and scores on nationally normed exams (e.g., the CBASE and ETS's Proficiency Profile), but also student products collected in the context of major courses. Formative assessment remains a focus of our assessment efforts, as many programs do not have well-developed formative assessment systems.

The vast majority of our programs have implemented summative assessment, some more fully developed than others. As with formative assessment, most programs use multiple measures both to document student competency and to evaluate program effectiveness. Our teacher education program, for example, uses evaluations of its student teachers, PRAXIS scores, GPA, and exit interviews as its summative assessment. The Department of Psychological Science uses scores on a major field test, GPA, and student performance on projects and papers embedded in the major's capstone course, History and Systems of Psychology. Nursing students take a series of ATI exams beginning in their junior year following admission to the program. These exams serve both formative and summative assessment purposes, as they provide the students and the nursing faculty with valuable feedback on the knowledge and skill levels of their majors. The ATI exams have been shown to be excellent predictors of student pass rates on the nursing licensure exam, the NCLEX.

Students in the construction management (CM) program participate in a number of assessment activities toward the end of their major. In the third year of the program, students are interviewed by their program advisor; the purpose of this interview is for faculty advisors to gauge the student's prospects for success in an internship and possibilities for placement. Students also participate in a Bid-Day competition, an activity that allows students to contribute to a team approach to bid preparation and submittal. Bid-Days are conducted through the CMGT 1301 seminar, CMGT 4325 Advanced Estimating, and the Associated Schools of Construction competition. Construction management majors undergo a fourth-year exit interview conducted by CM faculty, which gives the student the opportunity to identify strengths and weaknesses of the program; professional career possibilities and future educational needs are also discussed. Lastly, as part of CM's post-program assessment, students complete a survey and test administered in the ICAP 4109 Construction Operations course.

Other examples of formative and summative assessment activities of our programs are presented in the CQIP section and the Virtual Resource Room. However, as mentioned above, improvement in the design, collection, and evaluation of student performance data remains a priority.

The University of Central Missouri provides a variety of co-curricular activities through which students may further hone professional skills, develop additional relevant content knowledge, and gain practical application and experience.

This past year we introduced a co-curricular transcript program that will eventually allow us to measure student competency, particularly in the skill areas of their programs, in settings outside the formal classroom. Currently we are logging student activities to provide a more detailed account of the range of high-impact experiences students have while at UCM. We anticipate adding a formal evaluative component within the next year, which will allow us to link feedback on student performance in co-curricular programs directly to the outcomes in their major.

Student Affairs Assessment provides support to student service departments whose primary purpose is to foster and promote student learning. In keeping with the University mission, continuous evaluation is provided to ensure fiscally sound, quality services and programs for students and other university constituents. Utilizing original and benchmarked research and assessment, Student Affairs Assessment develops strategies to address student needs and wants in a cost-efficient and effective manner.

Although not a direct measure of student knowledge and performance, the National Survey of Student Engagement (NSSE) has provided UCM with useful information regarding students' engagement with their studies, campus life, and our faculty. Results from the 2013 administration of the NSSE show that our seniors' averages on eight of the ten engagement indicators were significantly higher than those of our peer group, and not significantly different on the remaining two. The results show a particular strength of UCM to be the degree of interaction between our faculty and our students. Previous administrations of the NSSE had shown our students to be significantly less challenged than our peers or, at best, not significantly different. Those results were shared with chairs and faculty over the last five years, and UCM has since improved in all four categories, to the point where both our freshman and senior averages in higher-order thinking, reflective and integrative learning, and learning strategies are significantly higher than those of our peers, while there is no difference between UCM students and our peer group on quantitative reasoning.

The University also regularly administers the Noel-Levitz Student Satisfaction Inventory (SSI) to its students. One of the major reasons UCM chose this survey is that it provides not only specific information about student satisfaction with various aspects of student services but also measures of the perceived importance of those services. One area identified as needing improvement in earlier administrations was student advising, both in terms of overall satisfaction and gap score (i.e., the difference between a student's rating of an item's importance and his or her satisfaction with it). Since advising is routinely rated as extremely important by students, and since research both nationally and at UCM shows that quality advising correlates positively with student success and retention, UCM devotes significant time and resources to improving its academic advising services. The results from the 2013 administration show that our advisors (on the five items related directly to advising) were not significantly different from the national average for four-year public institutions on four of the items and were significantly higher than the national average on the item, "Major requirements are clear and reasonable." Furthermore, none of the gap scores exceeded one, although two of the items did approach one ("My academic advisor is concerned about my success as an individual" and "My academic advisor helps me set goals to work toward"). As part of our Learning to a Greater Degree initiative, we fully expect measures of satisfaction with advising to be even higher on the next administration of the SSI, scheduled for 2015.
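As a worked illustration of the gap score (using the definition above, with hypothetical ratings):

\[ \text{gap} = \text{importance rating} - \text{satisfaction rating} \]

so an item rated 6.5 in importance and 5.6 in satisfaction would have a gap of 0.9, which would "approach one" in the sense described above.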

Other notable results from the 2013 SSI were that UCM was significantly higher than the national average for four-year public institutions on the following three measures: the percentage of students who rated their college experience much better than expected (53% versus 45%); the percentage of students who were satisfied thus far with their college experience (81% versus 75%); and the percentage of students who responded that they definitely would enroll "here" again (78% versus 72%). Overall, UCM's satisfaction ratings were significantly higher than the national average on 31 of the 70 items, including "There is a commitment to academic excellence on this campus." The only item on which the University scored significantly lower on satisfaction than the national average was "The campus is safe and secure for all students." This result was surprising given that students rated UCM higher on "Parking lots are well-lighted and secure" and "Security staff respond quickly in emergencies."

Although UCM administers a variety of surveys such as CIRP, YFCY, NSSE, SSI, CORE, HERI, and others, there is a definite need for the institution to make better and more extensive use of the information gathered from these instruments.

4.B.3. The institution uses the information gained from assessment to improve student learning.

As stated earlier, the primary purpose of assessment is the improvement of learning for both students and faculty. Improvement in student learning is accomplished primarily in two ways. The first is feedback to the student: precise and timely feedback on performance is essential to learning. Faculty are educated on the importance of assessment beginning with New Faculty Orientation, conducted by a team of veteran faculty known for their excellence in instruction. The Network for Instructional Support and Enhancement (NICE) conducts a three-day in-service workshop for all new faculty each fall, with student engagement and feedback as major themes. This emphasis is also part of workshops offered by the University's Center for Teaching and Learning and of various department meetings and retreats. The principle was also championed by our Student Success Committee, which identified and distributed to faculty and staff a list of best practices designed to retain students in good academic standing; included in that list was early assessment of student learning. The Provost, deans, and chairs have made early assessment and feedback a priority for faculty.

CQIP directs faculty to assess students often, to provide feedback to students, and to review the results of those assessments to improve their teaching and assessment practices and, in turn, student performance. Our Office of Institutional Research distributes to deans, chairs, and academic advisors a list of high-risk general education courses: those with high rates of D and F grades and withdrawals (W). Faculty who teach these high-risk courses are encouraged to modify their instructional and/or assessment techniques in an effort to improve student learning and success. We have seen significant drops in the DFW rates in these courses as a result of these efforts. These reports are discussed in more detail in 4.C.

For example, the faculty of the mathematics department were concerned about the poor performance of students in intermediate algebra. They submitted a grant proposal and received support from the National Center for Academic Transformation (NCAT) for a course redesign. Faculty roles changed from dispensers of information to student mentors, and with the use of a new mathematics lab, instructional software, and support, student performance improved dramatically in this course. Previously, only 68% of students received grades of C or higher; following the course redesign, 85% of students did. This improvement in grades was not the result of lowered standards, as the average score on the final exam improved from 62.65 to 84.97. A similar effort is underway for College Algebra.

In response to low pass rates in our anatomy and physiology courses, our faculty submitted an application and were approved by NCAT for a second course redesign, in Human Anatomy. This redesign effort resulted in a 7% decline in the percentage of D and F grades. A third course redesign, not sponsored by NCAT, was undertaken by the faculty of the Department of English and Philosophy, who were able to significantly improve student writing through their redesign efforts.

The faculty of the Department of Psychological Science, after reviewing and discussing student performance in statistics and experimental psychology, realized that separating statistics and experimental design into two courses, which students often took with a semester intervening, made it difficult for students to make the connection between the two. The faculty revised their curriculum to offer a one-year Research Design and Analysis sequence that must be taken in consecutive semesters. As a result, students are better able not only to learn the statistics but also to apply them to the appropriate design.

Students who major in any of our business programs must take a course entitled Integrated Business Experience (IBE). This course was the result of faculty evaluation of student performance showing that students were having difficulty integrating theory and application. As part of IBE, students must design a product, develop a business plan, secure a loan to produce the product, market and sell the product, and write a summary report. All proceeds from sales are donated to charitable organizations. In past years, IBE students have raised more than $240,000 for community non-profit organizations and performed more than 17,000 hours of community service.

Our Student Success Center (SSC) was the direct result of faculty review of student performance data in key gateway courses. The SSC is the newest addition to UCM's commitment to student learning and success and is the central location for students to ask questions about a subject or concept, get individualized help with a class, talk about the best strategies for studying and learning, and find resources that can help them reach their academic goals. The Center offers free tutoring and assistance for more than 35 courses; help with mathematics, from Introductory Algebra through Calculus and Trigonometry, is available every day. The Center also works with students on an individual basis to improve their note-taking, test preparation, and time management, and serves more than 9,300 students per year.

The Nursing department's faculty regularly review student performance to determine optimal pedagogical and assessment methods. A careful review of student feedback and pass rates on the NCLEX led the Nursing faculty to implement a different series of formative assessments that allows them to predict, with considerable accuracy, their students' success on the NCLEX. Other examples of how faculty and staff have modified teaching, assessment, and services in response to assessment information can be found in the Virtual Resource Room.

Perhaps the best evidence that UCM's faculty have used assessment information to improve student learning comes, ironically, from our reports on student failures. A PSD and DFW Report prepared each year by IR for the Office of Enrollment Management lists the courses with 20 or more students and DFW rates of 30% or more. The report for the 2012-2013 academic year, which compares that year's DFW rates to the rates for the previous four years, shows that UCM is making progress in student performance. Although the number of courses on the DFW list for fall 2013 is essentially the same as in each of the previous four fall semesters, there were substantially more course offerings in 2013 than in fall 2010, indicating an overall reduction in the percentage of classes with high DFW rates. For example, despite an 11.9% increase in the number of general education courses since 2010, the number of high-risk courses remained the same.

Furthermore, the number of students placed on probation in fall 2013 declined by 6.8% compared to fall 2012. The number of students suspended in fall 2013 was essentially even with fall 2012; however, since fall 2010 the number of students suspended in the fall semester has decreased by 11.8%. Similarly, the number of students dismissed in fall 2013 was down only slightly compared to fall 2012, but the difference between fall 2013 and fall 2010 is substantial: since 2010, the number of students dismissed has declined by 26.5%. It is important to note that these data are aggregate probation/suspension/dismissal (P/S/D) totals and do not take into account increases in student enrollment. For example, between fall 2010 and fall 2013, UCM's undergraduate enrollment increased by 8.6%. These P/S/D trends are positive indicators of improved student success, as the short illustration below suggests.
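Because enrollment grew while dismissals fell, the enrollment-adjusted improvement is larger than the raw counts suggest. The short calculation below is a minimal illustration of this point, using only the percentages cited above; the variable names are ours, not IR's.

```python
# Illustrative arithmetic only: converting the aggregate decline in dismissals
# into an enrollment-adjusted rate change, using the figures cited above.
dismissal_change = -0.265    # change in dismissals, fall 2010 to fall 2013
enrollment_change = 0.086    # change in undergraduate enrollment, same period

# If dismissals fell 26.5% while enrollment grew 8.6%, the dismissal *rate*
# (dismissals per enrolled student) fell by roughly a third.
rate_change = (1 + dismissal_change) / (1 + enrollment_change) - 1
print(f"Approximate change in dismissal rate: {rate_change:.1%}")  # about -32.3%
```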

4.B.4. The institution's processes and methodologies to assess student learning reflect good practice, including the substantial participation of faculty and other instructional staff members.

Although we do not yet have the desired assessment systems in place in every academic department or at every level of the institution, we have established a culture of assessment in which faculty and staff generally see assessment as a significant aspect of their responsibilities. Assessment has been a major topic of discussion and activity on this campus since 1987, and the faculty have fully accepted responsibility for and ownership of assessment. The current assessment model, CQIP, was validated in 1991 with the awarding of an implementation grant from the Fund for the Improvement of Postsecondary Education (FIPSE). Three years later, UCM received a dissemination grant from FIPSE to assist eight other institutions in implementing assessment models on their respective campuses based on CQIP.

The Center for Teaching and Learning offers workshops and training sessions on teaching and assessment methods designed to improve the success of our students; these are open to all faculty at no cost. An examination of our CQIP model and the information presented in the Virtual Resource Room will show that faculty and other instructional staff play a key role in the selection and development of the assessments used to measure student competency in their programs' outcomes. The University has a long-standing assessment committee composed mostly of faculty, the Faculty Senate University Assessment Council (FSUAC), established in 1987. This group, however, does not decide which assessment methods departments should use. The only areas in which assessments are prescribed for all students are general education and the surveys that assess student learning experiences (e.g., NSSE, YFCY) and student satisfaction (e.g., SSI and the Senior Exit Survey). Through CQIP, faculty are directed to collectively review and evaluate student performance and to make changes to their curriculum should problems be identified.

The FSUAC's policy and procedure manual describes the FSUAC's primary responsibilities: oversight of the assessment of our general education program, our major programs, and student satisfaction. It was this committee that recommended, and secured approval from the Faculty Senate for, a campus-wide general education assessment and the creation of a minimum passing score. The FSUAC reviews, approves, and authorizes the funding of assessment activities carried out through our Office of Testing Services. Testing Services administers ETS's Proficiency Profile (PP) exam to all of our students after they complete 65 hours. All students, native or transfer, are required to take and pass this assessment as a condition of graduation; the test is administered to more than 2,000 students annually. The exception is teacher education majors, who must take and pass the College Basic Academic Subjects Exam (CBASE) as one of the conditions for admission into our Teacher Education programs.

The FSUAC also reviews the results of the general education assessments and surveys to identify points of concern in the curriculum and student support services. For a number of years, all students took the CBASE as their general education assessment. Review of student performance by the FSUAC revealed that many of our students were having difficulty passing the assessment because the test was content-heavy; students who had not taken certain content courses, such as geography or literature, were disadvantaged on the exam. The FSUAC reviewed a number of the extant standardized assessments of general education and chose ETS's PP exam, which emphasizes intellectual skills more than specific content knowledge. The committee believed this to be a better approach not only because students with different educational experiences would no longer be handicapped when taking the exam, but also because the development of basic intellectual skills is fundamental to success in any academic major. The FSUAC previously reviewed each department's progress annually with regard to implementation of our assessment model, CQIP. Now that the model is firmly entrenched in our institutional culture, the college deans and the Academic Program Review Committee are responsible for the annual and five-year reviews of each department's implementation efforts.

The faculty on the FSUAC serve as liaisons to their college assessment committees, which in turn include faculty representatives from each department within the college. This structural arrangement has not been as effective in recent years due to significant turnover in administrative leadership, with correspondingly different emphases on assessment; several committees have not met in a number of years, and this has had a negative impact on our overall assessment efforts. Under our current leadership, however, there has been a return to an emphasis on student learning and assessment, and we expect our department and college assessment committees to resume a more active role in developing and communicating assessment practices, policy, and data to their constituents.

The Office of Student Experience and Engagement (SEE) likewise conducts assessment activities to measure the effectiveness of programs and services provided by student support areas such as residential life, campus activities, student health services, and public safety. The direct reports to this office (the Director of Student Activities, the Director of Student Auxiliaries, the Associate Vice Provost of Student Services, and the Chief of Public Safety) comprise the student support services component of the FSUAC. The University recently adopted Maxient, a software program designed to document and validate problematic student behaviors. One distinct advantage of this system is that we can aggregate student information from across campus to determine whether a given student is exhibiting a pattern of behavior that suggests an intervention by our staff and/or faculty. The system will also be used to document and track student complaints from across campus, which currently are addressed and archived in a variety of offices.

4.C. The institution demonstrates a commitment to educational improvement through ongoing attention to retention, persistence, and completion rates in its degree and certificate programs.

4.C.1. The institution has defined goals for student retention, persistence, and completion that are ambitious but attainable and appropriate to its mission, student populations, and educational offerings.

The University of Central Missouri has a longstanding commitment to student success as measured and reflected in its persistence, retention, and completion rates. In its 2002-2007 Strategic Plan (Framework for the Future: Progress by Design), student retention was established as a strategic direction because of its importance in sustaining enrollment projections.

As a result of the 2002-2007 Strategic Plan and discussions that occurred at the 2004 Winter Planning Retreat, a university task force representing a cross-section of the campus was selected and charged with developing a retention plan. The task force's recommendations were developed and published in 2004. The plan was intended to guide institutional efforts over the subsequent three years to establish retention goals, identify strategies for goal attainment, and outline an action plan for implementation and evaluation.

In September 2010, under the leadership of President Charles Ambrose, the Strategic Resource Model was developed, and in 2011 the Strategic Positioning Platform was implemented. The Strategic Positioning Platform noted the need to develop key performance indicators, including student satisfaction, retention, graduation, and employment rates.

In summer 2012, the Student Success Committee was created, comprised of 40 faculty, staff, and students from across the university. The charge of the committee was to identify immediate and long-term strategies to improve the persistence, progression, retention, and completion of UCM students. Divided into five rapid-response teams, the committee first identified immediate actions for improving retention, which were implemented in the fall 2012 semester. Provost Curtis and the college deans subsequently identified three student success focal points for faculty beginning in fall 2012. They are as follows:

1) Start the semester with a formative assessment within the first four weeks

  • encouraging faculty to give feedback early and often
  • holding students accountable to use this feedback to guide their efforts

2) Provide syllabi that have clear objectives, expectations and policies

  • assuring that each departmental office has these syllabi at their disposal
  • asking that faculty members post and maintain regular office hours

3) Reinforce the importance of regular attendance

  • expecting that faculty members will participate in enrollment validation
  • counting on faculty members to report problems with attendance

Most of these points of emphasis are contained in the Faculty Guide and have been for some time. However, the Provost determined that a renewed focus on these issues and processes was needed in light of drops over the past several years in retention and freshmen success rates (the percentage of first-time, full-time freshmen who successfully complete 24 hours in their first year).

On March 14, 2013, the Student Success Committee produced a final report containing several recommendations for improving persistence, retention, progression, and completion (Student Success Committee Final Report). The committee's recommendations were numerous; the illustrative examples cited below have since been implemented or are in the process of being implemented:

  • Increase on-campus student employment opportunities.
  • Promote, facilitate and reward co-curricular engagement.
  • Develop a personal responsibility campaign.
  • Integrate the early alert system with the student conduct system by using the Maxient software system.
  • Lower the student-to-advisor ratio.

Under the leadership of Dr. Charles Ambrose, the Learning to a Greater Degree (LTAGD) Contract was developed in spring 2014 and thereafter implemented. At the broadest level, the objectives of the LTAGD Contract are to graduate from each incoming freshman class an additional 100 students in four years and an additional 150 students in six years. To accomplish this, the first-year retention target was set at 72%, the four-year graduation goal at 32%, and the six-year graduation goal at 58%.

As a means of reaching these goals, students are asked to take a minimum of 15 hours each semester, attend class, live on campus for the first two years, engage outside of the classroom, and consult with faculty and advisors. For its part, the university committed to employ additional advisors, provide scholarships for students averaging 15 hours each semester, deliver a seven-day-a-week campus experience, and build a new mixed-use facility.

Last year the University identified a list of Peer Institutions/Aspirational Models (see Virtual Resource Room) matched on a number of key variables such as selectivity, Carnegie classification, academic offerings, mission, and, to a lesser degree, size. The average retention rate of the upper third of these institutions is 77%. Our retention rate over the last five years has fluctuated between 67.7% and 72.8%. Our current retention rate of 68.1%, while an improvement over last year, is still substantially below our desired long-term stretch goal of 75%. The University is a moderately selective institution, and this stretch goal represents a sizeable increase, but we believe it is attainable through the implementation of recommendations from the Student Success Committee, the Faculty Senate University Assessment Council, the Department of Academic Enrichment, and the president's Strategic Leadership Team, along with local efforts within the academic departments. Other schools in our aspirational peer group of similar size, selectivity, student composition, and Carnegie classification have achieved this level of retention. We believe this retention goal, although ambitious, is attainable and appropriate to our mission, student population, and educational offerings, particularly as they apply to our full-admit population.

UNIVERSITY OF CENTRAL MISSOURI / RETENTION AND GRADUATION RATES

Cohort Year   Number of Freshmen   Fall-to-Fall Retention   Four-Year Graduation   Six-Year Graduation
2004          1,436                71.0%                    26.4%                  51.0%
2005          1,485                68.9%                    28.2%                  50.4%
2006          1,507                70.7%                    27.6%                  49.3%
2007          1,427                71.9%                    30.0%                  53.5%
2008          1,586                72.8%                    29.2%                  --
2009          1,504                72.1%                    30.5%                  --
2010          1,570                72.1%                    --                     --
2011          1,689                67.7%                    --                     --
2012          1,787                68.1%                    --                     --

(Graduation rates are shown as -- for cohorts whose four- or six-year windows had not yet elapsed at the time of this report.)

Similarly, UCM has established persistence and completion goals of 94% and 58%, respectively, for May 2019. These goals represent the levels of our aspirational peer group and appear attainable given our student population and the focused efforts (described below) of our staff and faculty to improve student success rates.

4.C.2. The institution collects and analyzes information on student retention, persistence, and completion of its programs.

Members of the president's Strategic Leadership Team (SLT), the Provost Council, the Academic Council (i.e., department chairs), and other key members of the University's leadership team receive regular reports on enrollment and retention. Enrollment updates, including reports on student retention, persistence, and completion, are made to the Board of Governors on a regular basis. Samples of these reports can be found in the Virtual Resource Room.

The University of Central Missouri has a long history of promoting student success. Most recently, in summer 2012, the president and provost created the Student Success Committee described in detail in 4.C.1. Recommendations have already been implemented, and the committee continues its work.

In February 2013, UCM's Board of Governors approved in principle a comprehensive plan to increase student success known as the "Learning to a Greater Degree Contract." The "contract" assures students that they will graduate in four years by adhering to curricular initiatives, including:

  • Completing an average of 15 hours each semester, enrolling in the "right" courses.
  • Using guided pathways to ensure students have clear paths to degree completion.
  • Emphasizing class attendance.
  • Ensuring students have easy and ready access to advising services.

To implement the Learning to a Greater Degree Contract, several key initiatives are underway at UCM. These include:

  • Five additional academic advisors were hired in summer 2013. This helped establish advising loads that enable academic advisors to proactively use the systems described in the following points to assist students with education planning, counsel and coach students, and target and intervene in cases where students are at risk of failure.
  • Co-curricular elements were added (a co-curricular transcript, a two-year residency requirement, and expansion of 24/7 programming and services), along with institutional commitments (construction of a new mixed-use facility, a focus on affordability, and scholarships for students completing an average of 30 hours each academic year).
  • UCM is in its second year of implementing a software system (Maxient) that integrates student conduct information, reported classroom absences, early alert reports, GPA, key demographic information, and mid-term grade reports. This information enables the identification of at-risk students and the development of support and intervention strategies to promote success. UCM hired one additional counseling psychologist for fall 2013 to ensure adequate counseling resources are available for students needing assistance.
  • In March 2013, UCM implemented a new degree audit system (Degree Works) that provides a comprehensive set of web-based academic advising, degree audit, and transfer articulation tools. This system enables academic advisors to provide "real time" advice and counsel to students. Degree Works also allows advisors and students to create interactive scenarios for degree completion.
  • UCM is a member of the Educational Advisory Board's Student Success Collaborative, an important retention vehicle at UCM. The collaborative allows the mining of university data to identify at-risk students and determine systemic obstacles to degree completion. It enables focused and proactive advising, matching students with the right majors, and fixing degree bottlenecks.

More specifically, the Student Success Collaborative uses data analytics to identify majors in which students are likely to enjoy the greatest probability of success. This provides advisors with information to help students declare majors that not only match their career aspirations and interests, but also their academic strengths.

In addition to the measures identified above, the IR office collects and analyzes a host of variables, including key processes, student characteristics and behaviors, faculty teaching and assessment methods, and even weather patterns, to help predict problems for our students. For example, IR has shown that assigning mid-term grades has a positive impact on students who are not doing well at mid-semester. Specifically, students who are performing at a D or F level when mid-term grades are assigned, and who in fact receive a mid-term grade, improve their course grade by an average of 0.86 of a letter grade. Analyses such as this one have led to changes in practice and policy; we are now in the process of making faculty submission of mid-term grades mandatory. A sketch of how such an analysis might be run appears below.
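The following Python snippet is a minimal sketch of how a comparison like the mid-term grade analysis above might be carried out: it compares average final grades for D/F-level students who did and did not receive a mid-term grade. The file name and column names are hypothetical illustrations, not UCM's actual IR schema.

```python
# Illustrative sketch: estimating the average final-grade difference between
# D/F-level students who did and did not receive a mid-term grade.
# The CSV file and column names below are hypothetical, not UCM's IR schema.
import pandas as pd

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

records = pd.read_csv("course_records.csv")  # hypothetical per-student extract

# Restrict to students performing at a D or F level at mid-semester.
at_risk = records[records["midterm_standing"].isin(["D", "F"])].copy()
at_risk["final_points"] = at_risk["final_grade"].map(GRADE_POINTS)

# Compare mean final grade points by whether a mid-term grade was reported.
means = at_risk.groupby("midterm_grade_reported")["final_points"].mean()
effect = means.get(True, float("nan")) - means.get(False, float("nan"))
print(f"Average letter-grade difference: {effect:.2f}")
```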

Chairs and designated faculty and staff can also access student information by means of Argos reports. We are in the process of developing a data warehouse, which we believe will significantly improve the ability of end users such as deans, chairs, program coordinators, and directors to access data easily and directly from BANNER. As part of the materials to be reviewed annually, each academic department receives program Data Packs that contain five years of information for each of its programs in the areas of Academic Quality and Productivity/Cost. The University also publishes an annual Fact Book that contains a host of information related to students, faculty, and academic and support services programs. Our IR and IT offices respond to many ad hoc requests for data from the academic departments and assist departments in setting up program-specific Argos reports.

As described in 4.B.3, the Office of Enrollment Management regularly distributes D, F, and W rates for each class to the deans and chairs in an effort to identify high-risk courses and initiate interventions to reduce these rates. The result has been a reduction in the percentage of high-risk classes over the past three years.

We initiated an Enrollment Validation Process (EVP) a number of years ago in response to data indicating that students were not attending classes early in the semester and were falling behind as a result. Faculty members validate their enrollments by submitting attendance records during the first week of class, and our Office of Student Experience and Engagement then contacts any enrolled students who missed two or more of their classes. Student attendance has improved dramatically as a result of this program. We also established an Early Alert System to make it easy for faculty and staff alike to report students in need of intervention for any reason. We believe UCM's students are benefiting from our efforts to improve success; the aforementioned decreases in students placed on probation, suspended, and dismissed from the University suggest they are.

Our Office of Institutional Research continually mines our student information system to identify students at risk at every stage of their time at UCM, from the time they apply to the time they graduate. The IR office evaluates information from surveys such as CIRP, YFCY, SSI, and NSSE to determine what combinations of student enrollment patterns, student attributes and habits, courses and course sequencing, and other factors contribute to student persistence and graduation. IR recently started examining data from Blackboard to see whether any faculty behaviors correlate with student success. One early finding is that students in classes in which faculty do not take attendance, do not utilize our Early Alert System, and do not turn in mid-semester grades are significantly less likely to succeed. A sketch of how such a relationship might be estimated follows.
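The snippet below is a minimal sketch, under assumed data, of how faculty-practice indicators could be related to a pass/fail outcome using a logistic regression. The file and column names are hypothetical, not UCM's actual Blackboard or Banner extracts.

```python
# Illustrative sketch: relating faculty practices to student success with a
# logistic regression. The data file and column names are hypothetical, not
# UCM's actual Blackboard or Banner extracts.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("section_outcomes.csv")  # hypothetical per-student extract

# Binary predictors: did the instructor take attendance, use the Early Alert
# System, and submit mid-semester grades? Outcome: student succeeded (1/0).
predictors = ["took_attendance", "used_early_alert", "submitted_midterms"]
X = sm.add_constant(df[predictors].astype(float))
y = df["succeeded"].astype(float)

model = sm.Logit(y, X).fit()
# Positive, significant coefficients would indicate practices associated
# with higher odds of student success.
print(model.summary())
```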

4.C.3. The institution uses information on student retention, persistence, and completion of programs to make improvements as warranted by the data.

Data and information are regularly used to make informed decisions to improve persistence, retention, and completion. As a programmatic illustration, several years ago it was noted that retention rates were substantially lower for first-time, full-time students admitted with borderline credentials. The Success Program for conditionally admitted students was subsequently developed. Students admitted into the Success Program receive developmental courses matched to their unique needs, including courses in reading, writing, mathematics, and study skills, coupled with one or two carefully selected gateway courses. Today, the program serves 140-170 students each year, and retention rates for participants, while slightly lower than those of our full-admit students, are substantially better than those of conditionally admitted students who do not participate: typically 17 to 30 percentage points higher.

As a second illustration, data revealed that in recent years an increasing number of freshmen have entered UCM as "open-option," or undecided as to their major. At the same time, data show that open-option students have lower retention rates than students entering with a decided major (Retention Graduation Rates For Open Option). A course designed for open-option students (AE 1401 Exploring Majors and Careers) has been in place for a few years, with data showing that students completing the course were retained at higher rates than students with decided majors. Based on these data, a policy requirement (page 50) pertaining to open-option students was implemented; the policy requires open-option students to complete the course within a designated time frame.

A rich variety of data are regularly disseminated across campus to a broad group of decision makers. These data are crucial in enabling managers and administrators to make informed decisions impacting student success.

4.C.4. The institution's processes and methodologies for collecting and analyzing information on student retention, persistence, and completion of programs reflect good practice.

As described in 4.C.2 and 4.C.3, the University produces a variety of reports for both internal and external constituents. UCM's processes and methodologies for collecting and analyzing information on student retention, persistence, and completion of programs reflect good practice. As part of UCM's federal reporting requirements, the University uses IPEDS definitions for all official state and internal reports, not just for the measures listed above. For example, as a public institution in Missouri, UCM submits data regularly to the Missouri Department of Higher Education (MDHE) as part of the Enhanced Missouri Student Achievement Study (EMSAS). UCM also submits other data to the MDHE dealing with performance indicators such as freshmen success rates, which are directly related to student persistence, retention, and completion. The definitions of these and other measures used by UCM are consistent with good practice.

Evaluative Summary

Areas of Growth and Success

  • UCM has a robust curriculum and program review process in place. The process encourages conversations across disciplinary and organizational boundaries and provides peer review and validation of program quality.
  • The University has a well-established system in place for evaluating and articulating credit from other institutions, including dual credit courses taken by high school students. UCM has worked closely with colleagues across the state to develop articulation tables that allow prospective students to determine how their coursework will transfer. Specialized assistance is available for International Students and the Military.
  • UCM adopted the CQIP model, designed to guide faculty in the development and implementation of student outcome-based assessment. Programs that have followed the model consistently since the last visit have identified measurable student learning outcomes, developed key formative and summative assessments, collected student performance data, reviewed their data regularly in collaboration with stakeholders, and used data to advise students and inform curriculum and program improvement efforts.
  • In recent years, UCM has implemented a number of initiatives focused on promoting student success (e.g., recommendations from the Student Success Committee, Learning to a Greater Degree Contract).

Areas for Improvement

  • UCM lacks a robust institutional process for collecting follow-up data concerning graduate employment and graduate/employer satisfaction with the quality of UCM's programs.
  • Implementation of the CQIP model has been inconsistent across the University. While some areas have collected numerous years of student outcome data and have documented program improvements, others are still in the process of defining student learning outcomes and developing assessments. It appears the two major obstacles have been (1) the lack of an ongoing administrative system for providing faculty professional development and monitoring CQIP implementation, and (2) the lack of a University-wide data management system that would allow program-specific formative and summative assessments to be archived and analyzed by faculty.
  • Faculty implementation of the Student Success Committee's recommendations has been inconsistent (e.g., taking attendance, using the Early Alert System, conducting formative assessments early in the semester, posting midterm grades). The data indicate that students in classes where the recommendations are not followed tend to be less successful.