
Building a Culture of Assessment: Ode to Musun, Baker, and Fulmer

By Jo-Ellen Asbury | August 18, 2010
The good news: my institution had a successful accreditation visit in the spring of 2008. The bad news: my institution had a successful accreditation visit in the spring of 2008. Why is that bad news? I call it the Whew! factor—as in “Whew, that’s over for another ten years; now, let’s get back to business as usual.” The search for ideas on how to combat the Whew! factor led me to an Assessment Update article by Linda Musun, Aaron Baker, and Jim Fulmer (2006), which planted the seed for what eventually became our event, Putting Your Best Foot Forward: Supporting the Mission Through Assessment. Our event morphed from one vision (poster sessions à la Musun, Baker, and Fulmer) into a completely different one (roundtables). Along the way, we learned a number of lessons that will likely resonate with other readers of this publication.

In the July–August 2006 issue of Assessment Update, Musun, Baker, and Fulmer described their experiences in organizing an assessment expo as an “avenue for communication about our assessment efforts and an opportunity to publicly celebrate our successes” (p. 1). This sounded like what we needed in the wake of our accreditation visit. I already knew—from various anecdotes that had reached my ears—that some members of our community saw assessment as something we did only to appease outside accreditors. Add to that a decline in the rate of timely submission of unit assessment plans (from 83 percent to 51 percent), and it was clear that we needed something to reenergize our assessment efforts. Armed with Musun, Baker, and Fulmer’s example, we started planning for an event to take place in spring 2009.

The first steps were to secure funding for the event and buy-in from the rest of my staff. In retrospect, I see that while these steps were the easiest to accomplish, they yielded our first lesson. We (the staff of the Office of Institutional Research and Assessment) were confident that we had enough assessment converts to pull off the poster sessions that Musun and her colleagues described. The advisory group we later formed had a different perspective, and, as events proved, we were right to accept their judgment.

Lesson 1: Don’t get too steeped in your own PR. Always remember to take a step outside the warm, comfortable pond of assessment advocates to get a sense of how you are really doing. Our advisory group (faculty and staff who “got it,” in our view) encouraged us to consider hosting roundtables rather than poster sessions for two reasons: (1) they weren’t sure that enough volunteer presenters would emerge; and (2) they wanted to be sure that the presentations contained information that we actually wanted to promulgate. In the end, the advisory group recommended a survey of faculty and staff to determine which topics would be of greatest interest. They helped us generate a list of topics. Then we asked a random sample of faculty and staff to rate how likely they would be to attend each session if time and location were not factors. We allowed the survey results to guide us in deciding which roundtables to offer. As it turned out, the topics that were rated most popular on the survey were generally the best attended on the day of our event.

We decided that the optimal format would be an hour of roundtable sessions (four or five running simultaneously), followed by a keynote speaker, and then another hour of roundtables. We were able to secure a popular local assessment expert, Virginia Anderson of Towson University, to give the keynote speech. Anderson’s reputation no doubt helped with our off-campus recruiting efforts, which were by no means sophisticated. The response from beyond our campus was greater than we anticipated: 60 percent of those who attended came from other campuses.

Lesson 2: Many of your colleagues want a relatively painless way to learn about assessment. We encouraged this type of participant by making the titles of our sessions as catchy as our sense of professionalism allowed. We wanted to send the message that all would feel comfortable, regardless of how much or how little assessment experience they brought to the table.

Lesson 3: Your assessment colleagues at neighboring institutions may be hungry for a forum or an outlet. The fact that three assessment colleagues from other campuses volunteered to present roundtables speaks volumes about the silos—perceived or otherwise—that many of us in assessment offices labor in. Connecting with assessment colleagues at institutions not far from ours was an unanticipated benefit of our event.

The evaluations of the day indicated that Anderson was a hit. Evaluations of the roundtables were also positive; attendees commented favorably on the amount of information covered in the time allotted, the usefulness of the information in relation to their work, and the level of the content presented (not too elementary or too complex). Based on attendance and the number of rating forms turned in for the respective roundtables, we found that sessions on in-house survey resources and general education assessment were most popular.

Lesson 4: Consider what question you should have asked. This lesson became clear only after some reflection on our evaluation results. In our attempt to keep the evaluation form short, we opted not to include demographic items on the forms. We should have asked participants to indicate how much assessment experience they brought to the event. It would have given us a clearer picture of our audience members and enabled us to sort out which group our format appealed to most. We did, however, get everyone’s title, which provided some insight.

Did we achieve our primary goal of continuing to build a culture of assessment? Given that we drew greater numbers from beyond our campus than from within, this remains a question. Those from our campus who did attend were consistently positive about their experience, which, obviously, is a good thing. Using the terminology of Peter Gray (1997), we are fairly confident that our event satisfied the early adopters. Did we bridge the chasm to reach the early majority or even the late majority? Based on the feedback we received—and, in some cases, whom we received it from—we think we made some progress.

Lesson 5: The quest continues.

Acknowledgment

I thank my colleagues at Stevenson University: Chris Arellano, Nicole Marano, and Eugenia Violante in the Office of Institutional Research and Assessment, as well as Jeff Elliott, Kelly Farmer, Jeff Kelly, and Carol Schmidhauser, the members of the Assessment Advisory Group.

References

Gray, P. J. “Viewing Assessment as an Innovation: Leadership and the Change Process.” In P. J. Gray and T. W. Banta (eds.), The Campus-Level Impact of Assessment: Progress, Problems, and Possibilities. New Directions for Higher Education, no. 100. San Francisco: Jossey-Bass, 1997.

Musun, L., Baker, A. D., and Fulmer, J. “Creating a Campus Community for Conversation About Assessing Student Learning.” Assessment Update, 2006, 18(4), 1–2, 12–13.

Jo-Ellen Asbury is assistant vice president for academic affairs at Stevenson University in Stevenson, Maryland.
