Assessment Program Framework Now Available

A framework outlining the University Libraries’ Assessment Program is now available to library faculty and staff in the document registry. The document includes the guiding vision for the program, identifies how assessment needs and priorities are determined, and lists:

  • individuals with assessment expertise for various areas and functions throughout the Libraries;
  • ongoing assessment activities; and
  • ad hoc assessment activities.

Surfacing Library Collections Using Tableau

A few weeks ago I attended a Tableau training session that addressed how to embed hidden worksheets within dashboards. At the time I was putting together a blog post to promote University Libraries’ Tuesdays @ Thompson diversity and inclusion program and was struggling to find a way to communicate the complexity of the Libraries’ collections related to human trafficking. So I decided to experiment with what I learned during the training session and what I know about University Libraries’ catalog records. The result was this interactive dashboard.

[Image: Human Trafficking Titles word cloud]

To view how University Libraries collections on human trafficking and related subjects have evolved since 1894, click on the image above. Then click on a subject of interest to you to learn more about individual titles.

Visualization + Narrative

Lately, as I’ve been thinking about ways to provide more context for assessment reports, I’ve been experimenting with adding narrative to my data visualizations. This experimentation partly recognizes that the primary audience for the visualizations I create appreciates narrative more than numbers. It also recognizes that narrative paired with visuals is simply more engaging and fun to read.

For some visualizations, I include broad, philosophical questions with an explanation of the data, its origins, and why it may be useful for librarians to consider:

[Image: visualization with broad, philosophical narrative]

For others, the questions are much more specific:

[Image: visualization with more specific narrative questions]

Right now, this experimentation is quite preliminary. I have no idea whether the audience for these visualizations will find the added narrative useful. I do believe, however, that visualization + narrative is something worth exploring.

Assessment Reports

Three assessment reports exploring feedback the Libraries received in the 2011 and 2013 LibQUAL surveys are now available to OSU Libraries faculty and staff in CarmenWiki. Together, these reports create a picture of the Libraries’ online presence, surfacing the elements of library.osu.edu that work well, as well as opportunities for improvement. The reports include:

Institutional Research Consultants (IRC) Graduate Student & Faculty Search Task Study

To help improve the OSU Libraries’ understanding of graduate student and faculty search issues, IRC interviewed 24 individuals, evenly distributed among the Arts & Humanities, Social & Behavioral Sciences, and Science disciplines, between December 2013 and January 2014.

https://carmenwiki.osu.edu/download/attachments/37914585/irc-OSUlib-interview-report-2014-revised-041014.pdf?version=1&modificationDate=1403714858677

Old Wine in New Bottles: UX Demystified

Summary of the April 3, 2014 User Experience Program for The Ohio State University Libraries

The Libraries invited Gretchen McNeely, a user experience (UX) researcher and strategist, to campus to provide a daylong UX professional development program for the Libraries’ faculty and staff. Ms. McNeely provided several recommendations for advancing UX in the Libraries’ virtual and physical environments.

https://carmenwiki.osu.edu/download/attachments/42594876/McNeely05202014OSU_UXreport_Final.pdf?version=1&modificationDate=1411397183502

2013 Usage Profile of library.osu.edu

In late January 2014, an informal group used Google Analytics to create a usage profile for library.osu.edu based on the 2013 calendar year. The project focused only on library-generated content delivered via library.osu.edu, meaning usage data for domains such as liblearn.osu.edu, cartoons.osu.edu, and libanalytics.osu.edu was not considered in the analysis.

https://carmenwiki.osu.edu/download/attachments/42594876/2013GoogleAnalyticsReport%28Final%29.docx?version=1&modificationDate=1411398502089

Please note: Access to these reports is restricted to University Libraries faculty and staff. To view the reports, first log in to CarmenWiki in your web browser, then copy and paste the links above into the address bar.

OSU Tableau Users Group

Tableau Users Groups (TUGs) offer the perfect forum for enthusiasts and practitioners to exchange ideas, work through challenges, and learn from each other. I have participated in the Columbus Tableau Users Group since its inception and am now pleased to announce the formation of a TUG at Ohio State. The next meeting of the OSU TUG will be held on Thursday, November 20th, from 11:30am – 1:00pm in room 281 of the Student Academic Services Building at 281 W. Lane Avenue. Deb Burgess-Shaw will share how she used parameters in a large Tableau project she recently completed. We then plan to open the floor for sharing ideas, asking questions, and general discussion. Feel free to bring your lunch. I look forward to seeing you there.


2014 ALAO Recap: Assessment

The Academic Library Association of Ohio sponsored a number of sessions related to assessment for its annual conference last Friday at the Kalahari Resort & Convention Center in Sandusky, Ohio.

Leah Lehman and Jennifer Donley of Ohio Northern University presented Low Tech Space Study on a Budget. Using cell phones and simple materials, such as laminated floor plans and dry-erase markers, ONU librarians and staff were able to assess student use of their library spaces. Their analysis led to the reconfiguration of study spaces and the library entrance and to the opening of additional group study rooms. It will also inform future campus renovation plans.

Terese DeSimio and Ximena Chrisagis of Wright State University presented Rethinking Our LibGuides to Engage Our Students: Easy DIY LibGuide Usability Testing and Redesign that Works. This session was of particular interest to me, as the OSU Libraries plans to migrate to LibGuides in the near future. I liked the idea of a mobile usability cart: Wright State librarians and staff wandered around campus and tested guides outside of the library. More information about this project is available at http://guides.libraries.wright.edu/LGUX.

Later in the afternoon Maureen Barry, Mandy Shannon, and Piper Martin, also of Wright State University, presented Demonstrating Our Value: Using Assessment Data as Advocacy Tools. This session illustrated how local data, combined with data from national research initiatives such as Project Information Literacy, can be leveraged to increase student and faculty participation in library workshops, as well as the number of consultations with reference librarians.

I also attended the pre-conference workshop Who Gives? Advocacy & Outreach That Makes Things Matter, led by Char Booth, and came away with a number of ideas to follow up on for the OSU Libraries.

Char offered two simple definitions for advocacy and outreach:

  • Advocacy = making people care
  • Outreach = making people aware

She also graciously shared her workshop slides.


Assessment Interest Group @ ALAO 2014

On behalf of the Assessment Interest Group, I am pleased to introduce Traci Moritz and Leah Lehman at the 2014 Academic Library Association of Ohio Conference. Traci and Leah will be presenting “Low Tech Space Study on a Budget” from 10:00am – 10:50am in the Aloeswood Room. Their talk will focus on a usability study of library facilities at Ohio Northern University, conducted in alignment with a college-wide sustainability review. Please join us for an interesting and informative program.

Let’s Blend Our Data

During his keynote address at the 2014 Library Assessment Conference, David Kay enthusiastically asserted that libraries are data rich, but that our ability to participate in university student or learning analytics initiatives remains “constrained by application silos.” Data blending (or weaving, to use David’s term) facilitates the integration of data from multiple sources, either by combining two or more datasets that contain the same data elements or by joining two or more datasets on at least one matching data item.

For example, when a patron establishes an ILLIAD account with the OSU Libraries, he or she must identify a primary department, major, or college from a pre-populated list provided by our library. This list does not necessarily match the lists of departments, majors, or colleges maintained by various enterprise systems across the university.

If I want to create a profile of ILLIAD use by academic department or college, in relation to the number of undergraduate students who have declared a major within that department or college, I first need to find a way to blend this data. The OSU Libraries, for example, lists Environment and Natural Resources under FAES (Agriculture, Food Science, Natural Resources, etc.) in its ILLIAD system, while the university uses the code ENR in its Master Schedule of Classes and ENVNATR in the system which identifies the number of students who have declared a major.

To create a meaningful report which can be filtered by the librarian assigned to engage an academic discipline, I create a reference file with a column for each of the fields listed below. I then blend this data with information about that librarian, such as his or her library division, library department, or library sub-department.

  • Subject: ENR
  • ILLIAD Department: FAES (Agriculture, Food Science, Natural Resources, etc.)
  • Major (Academic Plan): ENVNATR
  • Major Name: Environment and Natural Resources
  • Librarian Email: librarian@osu.edu

Once this is done, I join the reference file to the aggregate data I have gathered from various university systems, and voilà – subject librarians can now access an enhanced report which hopefully provides better, richer insights into the needs of their constituents.
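To make the blending step concrete, here is a minimal sketch in Python with pandas. The file names and some column headings are hypothetical stand-ins for the reference file and system exports described above, not the actual OSU exports.

    # A minimal sketch of the blending step; file names and columns are
    # hypothetical stand-ins for the actual reference file and exports.
    import pandas as pd

    # Reference file mapping each ILLIAD department label to the
    # university's subject code, academic-plan code, and subject librarian.
    reference = pd.read_csv("reference_file.csv")
    # Columns: Subject, ILLIAD Department, Major (Academic Plan),
    #          Major Name, Librarian Email

    # Aggregate ILLIAD activity, keyed by the department label patrons
    # chose when they registered.
    illiad_use = pd.read_csv("illiad_requests_by_department.csv")
    # Columns: ILLIAD Department, Requests

    # Declared-major counts from the university enterprise system,
    # keyed by academic-plan code.
    majors = pd.read_csv("declared_majors.csv")
    # Columns: Major (Academic Plan), Declared Students

    # Blend: join each dataset to the reference file on its shared key.
    report = (
        reference
        .merge(illiad_use, on="ILLIAD Department", how="left")
        .merge(majors, on="Major (Academic Plan)", how="left")
    )

    # A subject librarian can now filter the enhanced report to her areas.
    print(report[report["Librarian Email"] == "librarian@osu.edu"])

Once the reference file exists, refreshing the report is simply a matter of re-running the joins against new exports.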

2014 Library Assessment Conference Recap

I just returned from the 2014 Library Assessment Conference, where colleagues from around the world shared a number of amazing strategies and ideas for demonstrating the academic library’s contribution to student success and the library’s value to the university’s research enterprise. With more than seven workshops and 27 concurrent sessions, I unfortunately couldn’t attend every program I wanted to see. Luckily, the presentation slides are already online at http://libraryassessment.org/schedule/index.shtml. Here are some of my biggest takeaways:

The Library Cube: Unlocking the Value from Your Library’s Data – Margie Jantti, University of Wollongong

There was much chatter within the library assessment community when Brian Cox and Margie Jantti published their summary of the University of Wollongong’s Library Cube project back in 2012. Margie provided additional project details, along with a live demonstration of the Cube, during her pre-conference workshop. I learned that the Wollongong Library viewed the Cube as a means to represent the library within the university’s enterprise system. The project was also a response to the realization that libraries collect a lot of data but are not very good at leveraging it, and that library systems don’t typically talk to other university systems.

Wollongong has successfully blended data from a number of platforms, including Innovative, EZproxy logs, and other sources, and can now show correlations between e-resource use and student grades, segmented by gender, academic discipline, international vs. domestic students, and other demographic categories. They are also in the process of developing a marketing cube to inform collection development, librarian relationships with faculty, promotional initiatives, and more.
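As a rough illustration of the kind of analysis this enables, the sketch below computes the correlation between e-resource use and grades on an already-blended, de-identified dataset, then segments it by demographic attributes. The file and column names are hypothetical; Wollongong’s actual pipeline is of course far more involved.

    # Hypothetical blended dataset: one row per student, already
    # de-identified, with usage and grade information joined in.
    import pandas as pd

    students = pd.read_csv("blended_student_data.csv")
    # Columns: eresource_sessions, grade_average, gender, discipline

    # Overall correlation between e-resource use and grades.
    print(students["eresource_sessions"].corr(students["grade_average"]))

    # The same correlation segmented by demographic attributes.
    for group, frame in students.groupby(["gender", "discipline"]):
        r = frame["eresource_sessions"].corr(frame["grade_average"])
        print(group, round(r, 2))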

This project could definitely be implemented at The Ohio State University, as it appears our university is using the same enterprise and library systems.

Who’s Asking What? Modeling a Large Reference Interaction Dataset – Andrew Asher, Indiana University

The Indiana University Libraries is currently looking at unused and underutilized datasets for its assessment initiatives. This session focused on the stories which could be told from an aggregated dataset of approximately 500,000 reference transactions from 2006-2013. Email questions in particular were analyzed using topic modeling, a probabilistic approach to inferring themes from the frequency of co-occurring words. We may be able to use this approach to better understand the nature of questions logged in our LibAnswers database.
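For a sense of what that could look like, here is a minimal topic-modeling sketch using scikit-learn’s latent Dirichlet allocation. The session did not specify a toolkit, and the questions file here is a hypothetical export with one question per line.

    # A minimal topic-modeling sketch; the input file is a hypothetical
    # export of email reference questions, one per line.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    with open("email_questions.txt") as f:
        questions = [line.strip() for line in f if line.strip()]

    # Build a word-frequency matrix, dropping very rare and very common words.
    vectorizer = CountVectorizer(stop_words="english", min_df=2, max_df=0.9)
    counts = vectorizer.fit_transform(questions)

    # Fit a 10-topic model; each topic is a distribution over
    # frequently co-occurring words.
    lda = LatentDirichletAllocation(n_components=10, random_state=0)
    lda.fit(counts)

    # Print the top words for each inferred topic.
    words = vectorizer.get_feature_names_out()
    for i, topic in enumerate(lda.components_):
        top = [words[j] for j in topic.argsort()[-8:][::-1]]
        print(f"Topic {i}: {', '.join(top)}")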

Discovering the Pattern: Discerning the Potential: The Role of the Library in Unraveling the Cat’s Cradle of Activity Data – David Kay, SERO Consulting

I appreciated that this presentation noted that there are two competing philosophies for determining what data should be collected for library assessment. One camp advocates collecting whatever data we can and letting the data tell its story; the other believes we should collect data specific to areas of interest or known to be useful. Both approaches are equally valid.

Data Management – It’s for Libraries Too! – Monena Hall, Virginia Tech

Virginia Tech deployed the Data Asset Framework methodology to determine what data the libraries collected and where and how this data was stored (essentially a data audit). The study revealed many data silos within their library system, along with little data oversight and minimal data preservation planning. As libraries position themselves as data managers for the university research enterprise, we need to be able to implement the same policies and strategies that we advocate for our constituents.

Driving Partnerships for Assessment and Engagement – Katy Mathuews, Rebekah Kilzer, Shawnee State University

Serving a large population of first-generation college students, Shawnee State University has implemented an intrusive advising program on campus. The library is in preliminary discussions to participate in campus learning analytics interventions. Using current data, the library has been able to determine at an aggregate level which students are using the library and which are not, and can parse this data by a number of demographic attributes. They are also able to correlate student usage of library materials with GPA. As an Innovative/OhioLINK library, Shawnee State’s experience may inform our library’s efforts with these initiatives.

Show This, Not That: How to Communicate Assessment Results – Jen-Chien Yu, University of Illinois, Urbana-Champaign

Kudos to Jen-Chien Yu for this entertaining and enlightening presentation. I think the PowerPoint for this program speaks for itself! See http://libraryassessment.org/bm~doc/12YuLightningTalk.pdf