2009 Annual Meeting Presenter Packet
Sample Papers: Examples from the 2008 Collection of Papers on Self-Study and Institutional Improvement

30 North LaSalle Street, Suite 2400 | Chicago, Illinois 60602
312-263-0456 | 800-621-7440 | Fax: 312-263-7462
www.ncahlc.org
Serving the common good by assuring and advancing the quality of higher learning

Sample Papers from A Collection of Papers on Self-Study and Institutional Improvement, 2008

Inviting the Public In: Performance Outcomes and Accountability
Stephane E. Booth and Laura L. Davis

The demands on institutions of higher education for accountability continue to grow.
However, those demands do not come with more time, resources, or personnel to address the requests. Therefore, it is imperative that processes developed to respond to accountability issues serve multiple purposes, such as promoting student learning, moving the institution through a continuous improvement process, and responding to the regulations of professional accrediting and national accrediting bodies while exhibiting openness to accountability to legislators and the public. Participation in the Higher Learning Commission's Academic Quality Improvement Program (AQIP) set
the stage for Kent State University to integrate these processes.

Embedding the values of accreditation in the strategic plan is the basis for the integration process. Positioning accreditation through AQIP as the means by which the university aligns its processes allows units to develop Action Projects that further university goals while responding to the demands of accreditation without engaging in reporting that may seem either repetitive or at odds with other initiatives. Further, pursuing a continuous cycle of assessment and using results in the continuous loop of quality improvement keeps meaningful data and concrete results readily available.

The university can respond productively and with agility to mandates of the state of Ohio that "invite the public in" to review performance and future plans that speak to statewide continuous improvement in public higher education. In June 2005, the Planning Committee for Higher Learning Accountability and Productivity was convened by the Ohio Board of Regents (OBOR) "to promote an atmosphere of responsiveness to issues concerning assessment and accountability among Ohio's colleges and universities" (http://regents.ohio.gov/colleges_universities.php). Kent State, because of its attention to assessment and accountability through AQIP and the strategic planning process, was able to take a leadership role in developing the student success plan that each Ohio higher education institution is "encouraged" to make available publicly.

To arrive at the issuance of the student success plan, a series of statewide meetings was held in June 2006, followed by five regional workshops during the fall of 2006, another series of five regional workshops in the winter of 2007, and a statewide Student Success Summit on June 24, 2007. The progress Kent State had made in these areas was highlighted at the state meetings and the relevant regional meetings.
Kent State was the first institution to be able to post its plan to the Ohio Board of Regents (OBOR) Web site (http://regents.ohio.gov/StudentSuccess).

This leadership role was possible as the information had already been compiled through the university's AQIP process and was available publicly on the university's Web site (http://kent.edu/aqip/). The student success plan has the following characteristics:

• It defines outcomes and assesses student achievement of those outcomes in general education.
• It defines learning outcomes and assesses student achievement of those outcomes in undergraduate majors.
• It identifies and measures the impact of special features of the undergraduate learning experience that occur in institution-wide programs such as first-year experience programs, residential learning communities, undergraduate research, study abroad, internships and co-ops, and service learning (http://regents.ohio.gov/StudentSuccess/accountability.html).

In addition to these elements of the student success plan, Kent State also makes public through its Web site the results of its annual survey of baccalaureate graduates (both university and college summaries), results from the National Survey of Student Engagement (NSSE), and the annual assessment report. With a change in the Ohio statehouse in January 2007, new parameters were established for higher education in Ohio.

The chancellor of the Ohio Board of Regents became an appointee of the governor and a member of the governor's cabinet. Governor Ted Strickland, in August 2007, announced a structural reorganization for public institutions of higher education in the state under a University System of Ohio. Chancellor Eric Fingerhut was charged to develop a ten-year master plan for higher education by March 2008.

Housed in this plan will be measures to assess productivity.
These measures are currently under discussion and may include the adoption of the Voluntary System of Accountability (VSA) authored by the American Association of State Colleges and Universities (AASCU) and the National Association of State Universities and Land-Grant Colleges (NASULGC). If the plan is adopted by the Ohio Board of Regents, Kent State is in a position to respond quickly.

While the change in the statehouse brought new requests for information for development of the governor's plan, previous mandates continued. Thus, January 2007 also saw the submission of Kent State's second biennial "Results Through Productivity" report to the Ohio Board of Regents, a report designed by the previous governor's Commission on Higher Education and the Economy. For its summary, the state chose to highlight Kent State's $3.5 million savings in facilities, construction, and public safety in the two-year period; temporary and permanent savings through management of an early retirement incentive program for faculty; downsizing of academic programs no longer meeting public need; and, as described below, course redesign in math and English (http://regents.ohio.gov/perfrpt/index.php).

While preparation of the productivity report was in progress, the university engaged in year two of a ten-year state-mandated doctoral review and reallocation process aimed at stimulating the state economy.
Chief executives of the university also participated in extensive meetings of the latest commission of the state legislature, the Northeast Ohio Universities Collaboration and Innovation Study Commission, formed in the waning days of the previous state General Assembly (the 126th).

Not surprisingly, the study commission requested frequent ad hoc reports during its cycle of administrative and academic meetings that contributed to the final commission report (http://www.neostudycommission.org/FinalReport.htm). The ad hoc reports had the added feature of being collaborative with the four other public higher education institutions in northeast Ohio. The pattern was clear, of course: the 127th state General Assembly came out of the starting gate with a new report and productivity mandate (http://www.legislature.state.oh.us/BillText127/127_HB_119_I_N.html).

This time, universities were asked to demonstrate efficiency in delivery of undergraduate programs, projecting at least 1 percent in savings for fiscal 2008 ($2.1 million for Kent State) and at least 3 percent for fiscal 2009 ($6 million for Kent State). And, once again, because of our commitment to quality improvement, which includes operating more efficiently and effectively, and our integrated processes, the university was readily able to show significant savings. Two months after the publication of the mandate, Kent State reported $6.8 million in continuing savings for fiscal 2008, with an additional $3.2 million in one-time savings from collaborative initiatives.

Administrative initiatives totaled $7.7 million in savings. Academic savings were about $1.3 million.
And salary savings from vacant positions were about $1.9 million, for total efficiency savings of over $17 million in continuing savings, or $21 million overall.

While letting the public in leads to reporting demands that can consume great amounts of time, no matter how agilely an institution may respond, stakeholder input and assessment results are producing true academic quality improvement changes at Kent State. Students and faculty members, along with the new university president, felt that the current university orientation course (required of all students) could be better geared to helping students know what it means to be successful academically at Kent State. Therefore, a redesign of the course is underway.

University orientation became first-year colloquium effective fall 2007. Some of the basic information from the "old" course is provided in electronic modules, and class time is spent exploring a particular theme. During fall 2007, 40 to 50 sections featuring the new format were offered, with all 180 sections at the Kent Campus eventually being converted.

College, department, and faculty ownership of this course is essential to its makeover, and discussions are ongoing regarding the objectives and content of the course during this transition period. Based on the feedback received from the pilot study, the university's initiation of a new budgeting process (RCM), and new executive leadership (president and provost), the university community is rethinking this course once again, showing the ability for strategic and academic planning, assessment, and accountability to interrelate and build on each other to further enhance the Kent State experience.
The department of English completed a course redesign project that extended through fiscal 2005 and 2006 for its first-year liberal education requirements (LER) composition courses, with that redesigned curriculum effective fall 2007.

The redesigned courses respond to the need for active and successful participants in today's economic, social, and democratic environments to sharpen communication and information skills that extend beyond the traditional modalities of reading, writing, and speaking. Today's graduates also need to be able to create, understand, and critically analyze a variety of visual, auditory, and textual forms, often via computer-mediated systems. In addition to making LER courses more effective in the way they prepare students, the English composition course redesign is more institutionally efficient.

The redesign has resulted in the projected offering of fewer three-hour sections on the Kent Campus for academic year 2006–2007, resulting in an approximate savings of $230,000 during fiscal 2007. Also during fiscal 2005 and 2006, the department of mathematics completed a redesign project for its two introductory math courses. Results during the pilot showed improved student success rates (students receiving grades of A, B, or C).

Academic year 2006–2007 was the first full implementation of the new curriculum. The student success rate on the Kent Campus during fall 2006 was 70 percent, compared with 50 percent under the previous curriculum. The Math Department is also completing an online textbook that will be available to students at a low cost and exactly fits the curriculum at Kent State.

A major continuous improvement initiative currently underway is engaging faculty members from all colleges in discussions that will succinctly define the philosophy of an undergraduate education for the twenty-first century at Kent State.
At the conclusion of these conversations, the faculty, based on the newly stated philosophy, will examine the university's liberal education requirements and curricula. At the same time, discussions continue relating to the content of a successful first-year experience.

Kent State's desire to continuously improve and its ability to match this desire with a mode of accreditation (AQIP) that integrates this approach continue to place it in a position to be proactive and transparent as accountability demands become more pressing.

Stephane E. Booth is Associate Provost for Academic Quality Improvement, and Laura L. Davis is Associate Provost for Planning and Academic Resource Management at Kent State University in Kent, Ohio.

Assessment Oversight: For Nonperfectionists Who Are Seeking Perfection
Marie Baehr and Jennifer J. Fager

In almost any research document or presentation, most of what is communicated ignores the fits, starts, and ineffective paths.

This can lead to the audience's belief that strategies can be effective, efficient, and meaningful from the very start of and throughout the process. This paper is written to help those engaged in assessing student learning understand that, based on our observations in mentoring dozens of college teams at the Higher Learning Commission Assessment Workshops, developing and sustaining useful assessment practices is rarely ever clean, efficient, and effective. With this knowledge, however, should come the freedom to try various strategies with the understanding that there is worth in finding out what does and does not work.

This is not a research document, but rather a reflection of what the authors have learned in mentoring dozens of college and university teams sent to the Assessment Workshops.
These observations are organized into three distinct, yet related categories: commitment to assessment processes and procedures, compliance with regional and specialized accreditation, and the search for the perfect assessment tool. The paper concludes with our advice (in no particular order), based on these observations and experiences.

Compliance Is Not Enough

Many colleges and universities use a lot of resources (time and money) trying to comply with perceived Higher Learning Commission mandates. However, without valuing and embracing the message of the mandates, it rarely works well. For instance, anyone can write expected student outcomes, but if the outcomes are not valued by those who teach, whether they are met or not will not be valued.

Anybody can ask students to respond to surveys, but unless the responses are valued by the person responsible for understanding them, they will rarely be analyzed in a timely fashion. If all assessment efforts are the responsibility of one or a very few individuals, keeping the process going if the individual leaves the institution will be very difficult. If all assessment efforts are the responsibility of one or a very few individuals who have other major responsibilities (such as institutional research, department chairing, or teaching), assessment will likely not be a priority of the institution as a whole.

If the impetus to assess occurs only a year or two before an accreditation visit, it is probable the process will stall after the visit, unless follow-up is required. If an assessment process is described in a self-study as a perfectly operating process, peer reviewers will be suspicious.
Commitment Is Not Enough

Many institutions understand the need to improve student learning and are committed to doing so, but they find the necessary documentation and ability to make sense of the patterns time-consuming, tedious, and not particularly useful.

In this case, the institution's commitment to improving student learning is not enough. No matter how committed the college faculty and staff members are to improving student learning, they must document what they find out and what they improved, not just for their accrediting institutions, but for parents and for current and prospective students, who are also interested in how you know that your students are learning what you say they are learning. Sometimes thinking you know and knowing you know are not the same thing.

By creating a way to discover actual student learning, a college will often get unexpected, and sometimes welcome, findings. Keeping records of actual student learning is good practice and is becoming more and more integral to good teaching strategies. The expectation of cogent evidence by accrediting agencies is becoming increasingly important, and it is not going to go away.

No Assessment Tool Is Perfect

No institution can find out what all students know all the time. To try to find out is expensive and time-consuming, and it rarely tells you more (and sometimes tells you less) than does looking at a sample of students on a sample of topics. It is difficult to find an assessment tool that measures all levels of knowledge equally well.

Very simple, fundamental questions can let you know how poorly students understand a topic, but they can tell you little about how well students understand the topic. Conversely, complex questions can tell you how well students understand a topic, but they can tell you little about how poorly they understand it.
Assessment tools rarely work the way you think they will the first few times of use.

Part of the assessment process is assessing the effectiveness of the tool used to collect evidence of student learning. Some tools, identified as assessment tools, either institutionally created or purchased for use, are not, in fact, measures of student learning. They are often useful to get a handle on issues such as campus climate, but they provide little useful information on actual student learning and typically do not aid in determining academic program improvement.

Advice

Based on these observations, we would like to provide some advice as you build or tweak your assessment processes:

• Collect a little information before building an entire process. This allows you to test all components of the process before collecting and analyzing evidence that might not get you the information you planned to collect. In this way, you can test your mode of collection, review types of responses you get that are both expected and unexpected, determine the value of the responses to you and others, and figure out how to make sense of the collected information. This will allow you to make changes in the data collection process before spending resources on analyzing specious data.

• Be willing to change plans if things don't work as expected. Many institutions created a plan endorsed by the Higher Learning Commission and feel compelled to stay with the plan even when it is not providing useful information to the institution. The Commission most probably approved the plan assuming it would supply useful information.

• Make sure that those who need to care about the results care about the stated outcomes and the methods for collecting the evidence.
If you are at an institution where much information is collected, but nobody takes the time to analyze it and nobody takes the time to urge the appropriate people to analyze it, there are indications that nobody cares very much about the results. In a good process, appropriate people are eager to discover the findings, for they care about and trust the findings enough to base decisions on them.

• Plan not only the tools you will use to collect evidence, but also who will take responsibility for analyzing and disseminating the information to the appropriate people, all of whom should also be defined ahead of time. Many institutions do a great deal of planning in all areas except the human power needed to make meaning out of the collected information. Part of the assessment plan should include who does what and when they do it, so the system doesn't break down for lack of time.

• Stop collecting information if you are not using it. Whatever time and money you spend on collecting information you do not use is wasted.

• See assessment like teaching: a process where you can improve with time, but you must start somewhere, even before it is a perfect system. As in teaching, the process gets better as you find more effective ways to traverse it. As in teaching, no matter what, the process is never perfect, and some things work better than others.

• Don't reinvent the wheel. Begin developing an assessment system with a needs analysis. Examine what you have already accomplished. Look for models within your institution that have worked or that have allowed the institution to learn something about the process.

• Remember that stated outcomes define a program. Make sure outcomes are written for each program so that all stakeholders can understand them and the institution is capable of determining the extent to which they are met.

• Develop a plan that includes all areas of the campus.
Write policies and procedures that call for articulation of planned assessment processes. Make sure each person who is essential to the success of assessment has clear responsibilities.

• Hire leaders committed to assessment. Include assessment of student learning in job postings, for hiring people who support the effort will help secure sustainability.

• Celebrate successes and share models that succeed in collecting meaningful information that leads to understanding of strengths and need for improvements.

Marie Baehr is Vice President for Academic Affairs and Dean of the Faculty at Coe College in Cedar Rapids, Iowa, and Jennifer J. Fager is Chair and Assistant Professor of Teacher Education: Middle and Secondary at Saginaw Valley State University in University Center, Michigan.

A Faster Track for Developmental Math
Brian D. Posler, Kathy V. Rodgers, and Amber Hughes

Purpose and Scope

This presentation focuses on two intense and inexpensive programs designed to decrease the amount of time students must spend strengthening their mathematical skills and basic knowledge prior to enrolling in their first college-level mathematics course. Both the Rapid Review initiative and the Summer Bridge program were created with the goal of eliminating one of the developmental math courses required for students who placed into the first course.

The Rapid Review took place during the first three weeks of the semester: students participated in individualized guided study in an open lab with a math instructor present to answer questions. The Summer Bridge program was an intensive five-week course prior to the beginning of the fall semester; the required meeting times for this course were expanded to include study time for additional practice and for individualized assistance.
Our presentation includes an analysis of the results of these two initiatives.

We compare success rates of these students to similarly prepared students enrolled in the same course, during the same semester, who did not participate in either of these initiatives; we also present data pertaining to the success rates of these students in subsequent math courses. Last, we look at retention rates of this cohort to ascertain whether limiting the number of developmental math courses impacts student retention rates. We are confident that our work in this area is starting to pay dividends to both our students and our university.

We would like to share the logistics of our programs as well as our findings with faculty and administrators at institutions where students are struggling through multiple semesters of developmental mathematics courses.

The National Problem

Developmental education has been a part of higher education since early colonial days; today, however, there is an increased awareness of the need for additional developmental education due to increasing numbers of underprepared students entering college. As early as 1995, over 75 percent of institutions of higher learning recognized the need for developmental courses, with approximately 30 percent of first-time freshmen enrolled in at least one developmental course, most often mathematics (Merisotis and Phipps 2000).

In 2000, the National Assessment of Educational Progress (NAEP) conducted a study of mathematical proficiency in grades four, eight, and twelve. This assessment revealed a decline in the average score of twelfth grade students during the years between 1996 and 2000 (Braswell et al. 2000).

These data, coupled with the fact that approximately 60 percent of high school graduates go on to college, indicate that the number of underprepared students requiring remediation in mathematics at the college level is an issue that must be addressed.
The evidence is compelling that remediation in colleges and universities is not an appendage with little connection to the mission of the institution but rather represents a core function of the higher education community that it has performed for hundreds of years. (Merisotis and Phipps 2000, 79)

Data from the National Center for Educational Statistics indicate that 22 percent of entering freshmen enroll in a remedial mathematics course (Greene et al. 2003; Braswell et al. 2000). Additionally, data from our institution indicate that when a student failed a developmental math course, the probability of success on the second attempt was less than 20 percent.

We recognized that continuing to enroll students in a developmental math course such as Intermediate Algebra without employing some intervention tactics was unacceptable; fundamental changes in the way these students approached the study of mathematics had to be made in order to increase their chances of success. Members of the Department of Mathematics developed a profile of students enrolled in Intermediate Algebra through an interview process. We asked students how they approached the study of mathematics; how they took class notes and what they did with the notes after taking them; if they read their math textbook; if they did homework on a regular basis; and what level of importance they placed on attendance.

Succinctly, we developed math autobiographies of these students. After collecting this information, we developed strategies to encourage behavior characteristics common to students who were successful in mathematics and strategies to address major behavior characteristics common to students who were not successful in mathematics.
Two major initiatives from this study were a mandatory attendance policy for all developmental mathematics courses and an expanded, four-credit offering of Intermediate Algebra for those who had been unsuccessful in their first attempts.

Having achieved some success with these initiatives, our university recognized that there was a second problem to address: the number of students needing two developmental math courses.

The Rapid Review Initiative

We initiated a pilot program called Rapid Review for students with placement scores within four points of the score required for Intermediate Algebra. Rather than enroll in an algebra review course, these students could elect to participate in a guided, individually paced review session for three weeks.

(There was no cost to the students and likewise no credit hours associated with this review; students were required to complete the guided study in order to retake the placement test. The guided self-study was conducted in an open lab with instructors from the Department of Mathematics present to answer questions.) At the completion of the review session, these students were permitted to retake the placement test; students scoring above the desired cut-off score were permitted to immediately enroll in an intermediate algebra class that began in the fourth week of the semester. Among the benefits to students participating in this fast-track program was the opportunity to start their study of mathematics at a higher level, saving them both time and tuition dollars.

The Summer Bridge

The purpose of the Math Summer Bridge program was to improve the success rates of entering freshman students in developmental math courses. Through the Bridge program, students enrolled in an enhanced developmental math course during the final session of the summer term.
Special features of this math course were: smaller class sizes (fifteen students), a required hour-long guided-study session at the end of each class meeting, and an optional weekly workshop covering college transition topics.

The extra hour of class time, along with the smaller class sizes, allowed instructors to better engage students and to individualize instruction. Instructors incorporated multiple instructional techniques utilizing whiteboards, class board work, group work, and computer programs. Also as part of the Math Summer Bridge program, college transition issues were covered in various ways.

Instructors discussed math study skills, note-taking skills, and test-taking strategies in the classroom, helping prepare students for later math courses. Additionally, a weekly workshop was offered during the second Math Summer Bridge program to reinforce study skills taught in the classroom as well as other issues such as time management, utilizing campus resources, and transitioning to college. The casual atmosphere of the workshops allowed students to discuss their experiences openly with advisers and fellow classmates.

Results

For our Rapid Review program, approximately seventy-eight letters were sent to students having placement scores within four points of the required score needed to enroll in Intermediate Algebra. The letters explained the Rapid Review pilot program and the procedures for enrolling. Twenty-five students responded to the letters, and twenty-three of them participated.

After three weeks, twenty-one of the twenty-three students improved their placement scores enough to enroll in a late-starting Intermediate Algebra, with the potential to save a semester of time as well as the tuition and textbook costs of the class. We credit much of the success of this initiative to the individual attention provided to each student.
A program coordinator conducted orientation sessions; each student was given a diagnostic test and an individualized plan for modules to review; each student's progress was tracked; each student was apprised of his/her progress throughout the program; and each student received weekly e-mail messages.

The coordinator worked at keeping the students motivated to complete their individualized programs. She learned their names even though it was an open lab, and she conversed with the students on a regular basis. Fifty-eight percent of the students completing the course earned grades of C or better; this is comparable to the overall success rate of all students completing the course.

Over the two fall semesters that we have implemented the Summer Bridge program, fifty-three incoming students have participated. Of these students, 86.79 percent passed the developmental math course taken as part of the Math Summer Bridge, 7.54 percent withdrew, and only 5.66 percent did not pass. These success rates are markedly higher than the typical rates for our algebra review mathematics course.

We have tracked the students through their subsequent coursework as well to see how they fared. Following participation in the Bridge program, thirty-nine of the fifty-three students went on to take the next math course in the sequence that fall semester. Of these students, 56.4 percent passed their next math course, 12.8 percent withdrew, and 30.7 percent did not pass.

These rates of success are almost identical to the overall population of students in the Intermediate Algebra course, so it is clear that these students were able to save a semester of developmental coursework without decreasing their opportunities for success in Intermediate Algebra. We are still tracking persistence to graduation for these students, but early signs seem to indicate an increased likelihood of persistence.
Lessons Learned<br><br> • For these programs to be successful, support must come from every level: the department, the college office, and the provost's office.<br><br> • For the Rapid Review, students need the freedom to choose the times they will work in the labs, but there need to be at least three scheduled hours per week in the lab.<br><br> • Discounted tuition was granted to students during the first year of the Summer Bridge program, but we found that lowering the tuition for students in this program did not increase participation.<br><br> • The weekly transition workshops should be mandatory rather than optional.<br><br> • In the Bridge, math instructors discussed math study skills, test-taking strategies, and general math tips at varying levels (depending on the instructor). In the future, the college transition workshops will be led in part by a math specialist and will focus more specifically on math-related skills in a better attempt to bridge students to the next level of math.<br><br> The experience of our university shows that it is possible to implement rigorous, intensive programs that can be effective in decreasing the length of time students spend in developmental mathematics courses.<br><br> The intensive practice that occurs in these two programs offers significant improvement in scores, for the content of these courses is generally content that the students have faced earlier in their academic careers. We find that accelerating the preparation for college-level work increases student success, satisfaction, and persistence.<br><br> References<br><br> Braswell, J., A. Lutkus, W. Grigg, B. Santapau, and M. Johnson. 2000. The nation's report card: Mathematics 2000. http://nces.ed.gov/nationsreportcard/pubs/main2000/2001517.asp#section2.<br><br> Greene, B. R., et al. 2003. Remedial education at degree-granting postsecondary institutions in fall 2000. 
NCES 2004010. http://nces.ed.gov/surveys/peqis/inc/displaytables_inc.asp.<br><br> Merisotis, J., and R. Phipps. 2000. Remedial education in colleges and universities: What's really going on? The Review of Higher Education 24: 67–85.<br><br> Brian D. Posler is Assistant Vice President for Academic Affairs, Kathy V. Rodgers is Chair of the Mathematics Department, and Amber Hughes is ACHIEVE Coordinator and an Academic Advisor at the University of Southern Indiana in Evansville.<br><br> Designing a Faculty-Centered Self-Study with Shared Leadership<br><br> Richard L. Brown, Melinda S. Kreisberg, and John Giesmann<br><br> West Liberty State College (WL), in the northern panhandle of West Virginia, is a small (enrollment approximately 2,400 students), state-assisted, four-year institution of higher education.<br><br> The college undertook most of its self-study process during 2006–2007, culminating in the team visit during the first three days after the Thanksgiving holiday in 2007. The following treatise summarizes the process utilized at WL to successfully provide substantiated evidence supporting the ability of the college to meet each Criterion established by the Higher Learning Commission.<br><br> What to Do First?<br><br> After the appointment of tri-chairs to lead the self-study process during the fall term of 2005, WL arranged for two of these intrepid individuals to attend the Workshop on Self-Study at the Annual Meeting in Chicago, an excellent idea, as not one of the individuals had participated in the last accreditation cycle. Two had been hired after that cycle, and the other was in his first year at WL during that cycle. 
The third chair was unable to attend this first workshop because of prior commitments in his professional field.<br><br> Upon returning from the workshop, the two attendees were in agreement on the two all-pervasive take-home messages: one voice, and absolute, brutal honesty.<br><br> One Voice?<br><br> The two chairs in attendance split the workshops to maximize their exposure to the information on how to do your self-study.<br><br> We were in complete agreement when we left that we really still had no idea what we were doing. We did learn, repeatedly, and we reported, repeatedly, that the self-study should be in one voice. In other words, the best self-studies are related by one ultimate writer.<br><br> We were warned (yes, repeatedly) that many self-study teams found themselves looking at a document that was not cohesive because each section was written by a different person. In all cases, the teams found that they needed to rewrite the document to tie each section together seamlessly, in one voice. Our decision was to forego this possible complication by appointing one person as the writer.<br><br> One might assume that this esteemed position would be bestowed upon an individual in the English program; ah, assume. In our case, the role fell to the tri-chair hailing from the sciences. A scary proposition, but, really, who better to write concisely, objectively, and without obfuscation?<br><br> Yes, I do write this tongue-in-cheek, because as the scientist in question I have read too many science-based publications that exhibit the opposite of each of the above qualities. Search high, search low, but find someone who will write to these goals. The alternative is to use editors who take the written findings and meld them into a final document.<br><br> The logistics of this endeavor fairly boggle the mind. My fellow tri-chairs and I were in no way interested in undertaking this avenue.<br><br> Absolute, Brutal Honesty?<br><br> Well, yes. 
The second take-home message from that first self-study workshop gave us the honest, brutal truth: if you do not lay it all in the open, the team will just find out anyway, and then you will be up the proverbial creek without a functional paddle. While brutal honesty seems dangerous, it is not.<br><br> This is a self-study: the good, the mediocre, and the not-so-good. It is an excellent time, not to mention one of the ultimate goals of the self-study process, for the institution to review its progress, or lack thereof, on the Criteria established by the Higher Learning Commission to define a successful establishment of higher education. The only way to improve is to establish where the institution excels and where more progress is needed, and also to explore ways in which that progress can be achieved.<br><br> Was That All We Learned?<br><br> No, but it was definitely the most important lesson from our first self-study workshop. We also returned with ideas on how to organize the process.<br><br> We established, in addition to the tri-chairs group, a coordinating committee and seven working groups. The tri-chairs group included the interim president and the provost in addition to the three appointed individuals. The coordinating committee included the above, as well as each academic dean; directors of various campus facilities (library, physical plant, registrar, student affairs, and so on); and faculty, staff, and student representatives.<br><br> Finally, the working groups were divided by Criterion, with Criterion Three separated into Core Components A/B and C/D. The seventh working group was established to prepare the exhibits room for the team visit. 
While WL does offer summer classes, the majority of the faculty members are not involved in these offerings, so we also determined that the best time to officially kick off the self-study process was fall term 2006.<br><br> Many faculty members, as well as staff members and students, were involved in the working group structure. Additionally, information from all campus constituencies would be necessary to prepare a comprehensive self-study. To begin this process over the summer months would provide more frustration than actual progress, hence the fall start.<br><br> Of course, we were very aware that our timeline was now under a year for the actual document preparation. No problem, we thought; human nature usually dictates that we save everything until the pressure is on.<br><br> Best Laid Plans?<br><br> Fall 2006 approached, meetings were scheduled, and information was mailed to each member of the working groups. We met before the start of the term and did the expected: gave an overview of the accreditation and self-study process, defined our roles as the tri-chairs, presented the Criteria for Accreditation, established deadlines for submission of evidence, and allowed time for work sessions in which each group could begin to structure itself with a group leader and assigned roles, determine possible types of information available that would speak to its Criterion, and ask questions of the tri-chairs if needed.<br><br> Oh my, that question part. Needless to say, we were inundated with questions. Most pressing, most understandable, but frustrating was "But what do you want exactly?" There is no "exactly" about it: it is incumbent upon each member of the campus community to determine how best WL meets each Criterion and where the problems are, and to produce the data to support these statements. 
In other words, for each institution, each working group, and even each member of the campus community, the interpretation of the Criteria may differ, as will the avenues to support those assertions. It is the job, first, of the working group to collect as much evidence and data as possible to illustrate the ability of the institution to meet the Criterion.<br><br> If the evidence or data are lacking, then the group might provide possible options to aid in addressing the lack in the future. A process with no scripted approach: it is amazing how much trouble that can cause in a community in which free thinking is encouraged and modeled. Be prepared; it takes time, but eventually the process begins in earnest.<br><br> Deadlines?<br><br> There is no such thing as a deadline except those imposed by the Commission. Internal deadlines are made to be broken and tend not to be just broken but smashed to smithereens.<br><br> Our initial deadline of November 2006 for all working group evidence and data was met by one group (bless them). We were able in their case to request more information as needed during the writing and upon review of the first draft of their Criterion, by both the tri-chairs team and the coordinating committee. Early in January 2007, one more group finished with its initial charge; the process was repeated with them.<br><br> The other groups? We were not so fortunate. The initial timeline for the self-study process targeted March 2007 as the date for the first full draft of the document to be available for open review by all constituencies. That did not happen.<br><br> Instead, as chapters were written and reviewed internally, they were released first to the working group and then campus-wide via a special Web page (http://faculty.westliberty.edu/HLC) for the self-study process. The bulk of the writing occurred during the summer months of 2007. Remember, our team visit was scheduled for the first week after the Thanksgiving holiday.<br><br> Nerve-Wracking? 
A resounding YES! In addition to deadlines that were not met, some constituents were not on board with the process, a situation that is probably true everywhere.<br><br> Sometimes these constituents merely sit back and allow the process to occur over and around them; in other cases, they actively work as blocks within the process. The first are workable; the second are unpleasant.<br><br> One of the best choices WL made was in the appointment of the tri-chair who dealt directly with the campus. His blend of humor and diplomacy was a much-needed attribute in working over, under, around, and/or through individuals who did not see eye to eye with the process (mind you, not the process established by WL to complete the self-study, but the whole accreditation process of providing proof that WL is committed to higher education, and the principles established to provide guidelines for the presentation of that evidence). As the writer, given my own delicate mental state (and the new gray hairs) by this point in the process, I would have been a bull in the proverbial china shop. Having the ability to merely call, or to say as he stopped by frequently to check on me, "Dick, I need help getting this" was like manna from heaven.<br><br> Dick would always say "no problem," and off he would go to solve the problem. The third tri-chair had his own nerve-wracking role as the liaison with the Commission. Every time we had an official question, off he went to find the answer.<br><br> For those last-minute surprises, like the section on federal compliance, he found whatever we needed and handled the logistics of sending the documents and requested materials to the Commission and each team member. He also handled all pre-visit interaction with the consultant-evaluator team. 
Bottom Line?<br><br> We finished on time, sent the document, managed the exhibit room, and successfully completed the team visit, with a favorable outcome.<br><br> What Did We Learn?<br><br> Quite a few things, and, hopefully, by writing this submission to the annual Workshop on Self-Study, we will have the written record for the next ten-year cycle.<br><br> First, honesty really is the best policy. Yes, it is trite, but, oh, so true. The point of the self-study (one that is also missed by the naysayers of the process) is to strengthen the institution using a no-holds-barred approach.<br><br> I mean, really: while the effort is valid and certainly should occur as a matter of course, it is rather too extensive an undertaking to rely on pure altruism. We are all caught up in the demands of our regular duties, including evaluating our achievements within our roles at the college. We tend to do these other types of review and analysis on smaller, individual scales within programs, departments, facilities, and offices, but not on the institution- and constituency-wide scale required by the Commission.<br><br> So, yes, this whole process is a good thing; the difficulty is explaining why.<br><br> Second, you can work within a very tight timeline. Do not get caught up in the individual deadlines or goals; it is the end product that is important.<br><br> You will get there simply because in those last hours you can accomplish the seemingly impossible through sheer force of will.<br><br> Third (to go along with the second), establish timed goals for the process. They establish a check system wherein you will be able to determine exactly where you are and how much remains.<br><br> Just remember not to get bent out of shape if the goals are not met. Again, it is the end product and the Commission deadlines that are of ultimate importance. 
Fourth, communication cannot be underrated.<br><br> We utilized electronic messages and a dedicated Web page for the self-study. Campus-wide e-mail messages were sent with updates; a quick link was placed on the WL home page that led directly to the dedicated Web page; and the Web page address was sent campus-wide through the electronic notices. Additionally, throughout the process, we met with the coordinating committee and working groups as needed.<br><br> The tri-chairs team met quite regularly for the first wave of review and analysis of each step. The entire campus was prepared through multiple sessions with each program, department, and/or school on the academic end; by meeting with the student government association; by advertising in the college news publication; and through third-party comment advertisements.<br><br> Fifth, the best approach is gentle humor. That old adage "you can catch more flies with honey" is certainly true. Our tri-chairs team was fortunate in that we worked incredibly well together and could laugh throughout the entire process. Above all else, the teams that accept the most responsibility for the self-study should be able to function seamlessly.<br><br> I cannot imagine undertaking this process without colleagues who are supportive and in all ways wonderful. We were fortunate in this, and I speak for all of us when I say we now share a special bond, one that our respective spouses just accept.<br><br> This approach also eased the preparation of the campus for the visit. If we could relax and laugh and joke during this process, the message conveyed to the campus was: relax, be yourself, be honest, and do your job. This seemed to soothe anxiety over the idea that anyone could be stopped on campus and questioned by the visiting team. 
Sixth, smaller to midsize groups work best.<br><br> Larger groups are impossible to get together during the busy academic year, and even when they do get together, there is safety in numbers and less is accomplished. Streamline the process, and you will be pleased with the result; we were. Of course, we learned this after we established a committee that was too large.<br><br> We were still able to work with that committee, but less effectively than first planned.<br><br> Seventh, I personally learned not to be offended when campus colleagues approached me, all in earnest, and asked "How are you doing?" in such a way that I thought they had heard that someone in my family had died or that I was terminally ill. None of those things had happened, of course; I was just the self-study writer.<br><br> Richard L. Brown is Professor of Music, Melinda S. Kreisberg is Associate Professor of Biology, and John Giesmann is Director, Institutional Research and Assessment, at West Liberty State College in West Liberty, West Virginia.<br><br>