Assessment Program Framework Now Available

A framework outlining the University Libraries’ Assessment Program is now available for library faculty and staff in the document registry. This document includes the guiding vision for the program, identifies how assessment needs and priorities are determined, and lists:

  • individuals with assessment expertise for various areas and functions throughout the Libraries;
  • ongoing assessment activities; and
  • ad hoc assessment activities.

Surfacing Library Collections Using Tableau

A few weeks ago I attended a Tableau training session that addressed how to embed hidden worksheets within dashboards. At the time I was putting together a blog post to promote University Libraries’ Tuesdays @ Thompson diversity and inclusion program, and I was struggling to find a way to communicate the complexity of the Libraries’ collections related to human trafficking. So I decided to experiment with what I learned during the Tableau training session and what I know about University Libraries’ catalog records. The result was the interactive dashboard below.

 

[Image: HumanTraffickingTitlesWordCloud]

 

To view how the University Libraries’ collections on human trafficking and related subjects have evolved since 1894, click on the image above. Then click on a subject of interest to learn more about individual titles.

2014 ALAO Recap: Assessment

The Academic Library Association of Ohio’s annual conference, held last Friday at the Kalahari Resort & Convention Center in Sandusky, Ohio, featured a number of sessions related to assessment.

Leah Lehman and Jennifer Donley of Ohio Northern University presented Low Tech Space Study on a Budget. Using cell phones and simple materials, such as laminated floor plans and dry-erase markers, ONU librarians and staff were able to assess student use of their library spaces. Their analysis led to the reconfiguration of study spaces and the library entrance, as well as the opening of additional group study rooms. It will also inform future campus renovation plans.

Terese DeSimio and Ximena Chrisagis of Wright State University presented Rethinking Our LibGuides to Engage Our Students: Easy DIY LibGuide Usability Testing and Redesign that Works. This session was of particular interest to me, as the OSU Libraries plans to migrate to LibGuides in the near future. I liked the idea of a mobile usability cart: Wright State librarians and staff wandered around campus and tested guides outside of the library. More information about this project is available at http://guides.libraries.wright.edu/LGUX.

Later in the afternoon Maureen Barry, Mandy Shannon, and Piper Martin, also of Wright State University, presented Demonstrating Our Value: Using Assessment Data as Advocacy Tools. This session illustrated how local data, combined with data from national research initiatives such as Project Information Literacy, can be leveraged to increase student and faculty participation in library workshops, as well as the number of consultations with reference librarians.

I also attended the pre-conference workshop Who Gives? Advocacy & Outreach That Makes Things Matter led by Char Booth and came away with a number of ideas to follow up on for the OSU Libraries.

Char offered two simple definitions for advocacy and outreach:

  • Advocacy = making people care.
  • Outreach = making people aware.

She also graciously shared her workshop slides.


Let’s Blend Our Data

During his keynote address at the 2014 Library Assessment Conference, David Kay enthusiastically asserted that libraries are data rich, but our ability to participate in university student or learning analytics initiatives remains “constrained by application silos.” Data blending (or weaving, to use David’s term) facilitates the integration of data from multiple sources, either by combining two or more datasets that contain the same data elements, or by joining two or more datasets on at least one matching data item.
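
As a minimal sketch of these two operations in pandas (the column names and values here are hypothetical, not actual library fields):

  import pandas as pd

  # Combining: stack two datasets that share the same data elements.
  spring = pd.DataFrame({"patron_id": [1, 2], "requests": [3, 5]})
  autumn = pd.DataFrame({"patron_id": [3, 4], "requests": [2, 7]})
  combined = pd.concat([spring, autumn], ignore_index=True)

  # Joining: match two datasets on at least one shared data item.
  majors = pd.DataFrame({"patron_id": [1, 2, 3, 4],
                         "major": ["ENVNATR", "HIST", "ENVNATR", "CHEM"]})
  blended = combined.merge(majors, on="patron_id", how="left")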

For example, when a patron establishes an ILLIAD account with the OSU Libraries, he or she must identify a primary department, major, or college from a pre-populated list provided by our library. This list does not necessarily match the lists of departments, majors, and colleges maintained by various enterprise systems across the university.

If I want to create a profile of ILLIAD use by academic department or college, in relation to the number of undergraduate students who have declared a major within that department or college, I first need to find a way to blend this data. For example, the OSU Libraries lists Environment and Natural Resources under FAES (Agriculture, Food Science, Natural Resources, etc.) in its ILLIAD system, while the university uses the code ENR in its Master Schedule of Classes and ENVNATR in the system that identifies the number of students who have declared a major.

To create a meaningful report that can be filtered by the librarian assigned to engage an academic discipline, I create a reference file with a column for each of the fields listed below. I then blend this data with information about the librarian assigned to engage the academic discipline, such as the library division, library department, or library sub-department.

  • Subject: ENR
  • ILLIAD Department: FAES (Agriculture, Food Science, Natural Resources, etc.)
  • Major (Academic Plan): ENVNATR
  • Major Name: Environment and Natural Resources
  • Librarian Email: librarian@osu.edu

Once this is done, I join the reference file to the aggregate data I have gathered from various university systems, and voilà – subject librarians can now access an enhanced report that hopefully provides better, richer insights into the needs of their constituents.
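
As a rough pandas sketch of this workflow (toy numbers, and hypothetical column names rather than the actual OSU field names):

  import pandas as pd

  # Reference file: one row per subject, mapping library and university codes.
  reference = pd.DataFrame([{
      "subject": "ENR",
      "illiad_department": "FAES (Agriculture, Food Science, Natural Resources, etc.)",
      "major_plan": "ENVNATR",
      "major_name": "Environment and Natural Resources",
      "librarian_email": "librarian@osu.edu",
  }])

  # Aggregate data from university systems: declared majors per plan code.
  majors = pd.DataFrame([{"major_plan": "ENVNATR", "declared_students": 400}])

  # Aggregate ILLIAD requests per department label.
  illiad = pd.DataFrame([{
      "illiad_department": "FAES (Agriculture, Food Science, Natural Resources, etc.)",
      "requests": 250,
  }])

  # Join everything through the reference file; the result can then be
  # filtered by librarian_email for the librarian engaging each discipline.
  report = (reference
            .merge(majors, on="major_plan", how="left")
            .merge(illiad, on="illiad_department", how="left"))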

2014 Library Assessment Conference Recap

I just returned from the 2014 Library Assessment Conference, where colleagues from around the world shared a number of amazing strategies and ideas for demonstrating the academic library’s contribution to student success and the library’s value to the university’s research enterprise. With more than seven workshops and 27 concurrent sessions, I unfortunately couldn’t attend every program I wanted to see. Luckily, the presentation slides are already online at http://libraryassessment.org/schedule/index.shtml. Here are some of my biggest takeaways:

The Library Cube: Unlocking the Value from Your Library’s Data – Margie Jantti, University of Wollongong

There was much chatter within the library assessment community when Brian Cox and Margie Jantti published their summary of the University of Wollongong’s Library Cube project back in 2012. Margie provided additional project details, along with a live demonstration of the Cube, during her pre-conference workshop. I learned that the Wollongong Library viewed the Cube as a means to represent the library within the university’s enterprise system. Their project was also a response to the realization that libraries collect a lot of data; we’re just not very good at leveraging it. Further, library systems don’t typically talk to other university systems.

Wollongong has successfully blended data from a number of platforms, including Innovative, EZproxy logs, and other sources, and can now show correlations between e-resource use and student grades, segmented by gender, academic discipline, international vs. domestic students, and other demographic categories. They are also in the process of developing a marketing cube to inform collection development, librarian relationships with faculty, promotional initiatives, and more.
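
As a rough illustration of the kind of blending involved (this is not Wollongong’s actual pipeline; the student IDs, session counts, and grades below are toy values):

  import pandas as pd

  # Hypothetical per-student e-resource sessions, aggregated from EZproxy logs.
  usage = pd.DataFrame({"student_id": ["s1", "s2", "s3", "s4"],
                        "sessions": [42, 3, 17, 0]})

  # Hypothetical student records from the university's enterprise system.
  students = pd.DataFrame({"student_id": ["s1", "s2", "s3", "s4"],
                           "gpa": [3.8, 2.9, 3.4, 2.5],
                           "discipline": ["ARTS", "SCI", "ARTS", "SCI"]})

  blended = usage.merge(students, on="student_id")

  # Overall correlation between e-resource use and grades...
  print(blended["sessions"].corr(blended["gpa"]))

  # ...and the same correlation segmented by academic discipline.
  print(blended.groupby("discipline")
               .apply(lambda g: g["sessions"].corr(g["gpa"])))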

This project could definitely be implemented at The Ohio State University, as it appears our university is using the same enterprise and library systems.

Who’s Asking What? Modeling a Large Reference Interaction Dataset – Andrew Asher, Indiana University

The Indiana University Libraries is currently looking at unused and underutilized datasets for its assessment initiatives. This session focused on the stories that could be told from an aggregated dataset of approximately 500,000 reference transactions from 2006 to 2013. Email questions specifically were analyzed using topic modeling, a probabilistic approach to inferring themes by looking at the frequency of co-occurring words. We may be able to use this approach to better understand the nature of questions logged in our LibAnswers database.
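
A minimal topic modeling sketch using scikit-learn’s LDA implementation (the session didn’t specify Indiana’s tooling, and these sample questions are invented):

  from sklearn.feature_extraction.text import CountVectorizer
  from sklearn.decomposition import LatentDirichletAllocation

  # Toy email reference questions; a real run would use the full archive.
  questions = [
      "How do I renew a book that is overdue?",
      "Where can I find peer reviewed articles on education policy?",
      "Can you help me cite a government document in APA style?",
      "Is there a database of historical newspaper articles?",
  ]

  # LDA infers themes from the frequency of co-occurring words.
  vectorizer = CountVectorizer(stop_words="english")
  counts = vectorizer.fit_transform(questions)

  lda = LatentDirichletAllocation(n_components=2, random_state=0)
  lda.fit(counts)

  # Print the top words for each inferred topic.
  terms = vectorizer.get_feature_names_out()
  for i, topic in enumerate(lda.components_):
      top = [terms[j] for j in topic.argsort()[::-1][:5]]
      print(f"Topic {i}: {', '.join(top)}")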

Discovering the Pattern: Discerning the Potential: The Role of the Library in Unraveling the Cat’s Cradle of Activity Data – David Kay, SERO Consulting

I appreciated that this presentation noted that there are two competing philosophies for determining what data should be collected for library assessment. One camp advocates that we should collect whatever data we can and let the data tell its story; the other believes we should collect data specific to areas of interest or known to be useful. Both approaches are equally valid.

Data Management – It’s for Libraries Too! – Monena Hall, Virginia Tech

Virginia Tech deployed the Data Asset Framework methodology to determine what data the libraries collected and where and how this data was stored (essentially a data audit). The study revealed many data silos within their library system, little data oversight, and minimal data preservation planning. As libraries position themselves as data managers for the university research enterprise, we need to be able to implement the same policies and strategies that we advocate for our constituents.

Driving Partnerships for Assessment and Engagement – Katy Mathuews, Rebekah Kilzer, Shawnee State University

Serving a large population of first-generation college students, Shawnee State University has implemented an intrusive advising program on campus. The library is in preliminary discussions to participate in campus learning analytics interventions. Using current data, the library has been able to determine at an aggregate level which students are using the library and which are not, and can parse this data by a number of demographic attributes. They are also able to correlate student usage of library materials with GPA. As an Innovative/OhioLINK library, Shawnee State’s experience may inform our library’s efforts with these initiatives.

Show This, Not That: How to Communicate Assessment Results – Jen-Chien Yu, University of Illinois, Urbana-Champaign

Kudos to Jen-Chien Yu for this entertaining and enlightening presentation. I think the PowerPoint for this program speaks for itself! See http://libraryassessment.org/bm~doc/12YuLightningTalk.pdf.


Logic Models and Library Assessment

I will be presenting The Engaged Librarian: Crafting an Effective Assessment Plan to Determine the Impact of a Key Strategic Library Initiative at the Library Assessment Conference this August. This paper will focus on using logic models and corresponding assessment/data-gathering plans to evaluate the programmatic initiatives listed under focus area 1 of the OSU Libraries’ 2011-2016 Strategic Plan: “Advance transformative teaching and learning by engaging with OSU faculty and support units to integrate library resources and services throughout the educational curriculum.”

I wholeheartedly believe that logic models provide a flexible yet structured approach to library assessment planning – and that the participatory, inclusive process of creating logic models helps libraries to “organize and systematize program planning, management, and evaluation functions” (W.K. Kellogg Foundation, 2004, p. 5). While the logic model to be presented at this conference is still in development, we have completed a logic model to facilitate assessment of the Libraries’ contribution to STEP – the university’s Second-year Transformational Experience Program. This model is available on the Assessment Projects CarmenWiki page at https://carmenwiki.osu.edu/pages/viewpage.action?pageId=41878269.