Digital Scholarship and Evaluation

Beyond my design work, I serve as our team’s Digital Scholarship and Evaluation lead. In this role, I draw on the literature and evidence-based practices to support and refine our service portfolio (e.g., coaching our team to integrate evidence-based practices and contributing to our internal course quality initiative), assess the efficacy of our work, and share scholarly work both internally and externally.

ONLINE COURSE QUALITY INITIATIVE 

SoTL Foundational Literature for Course Quality

ODEE first constructed its online course quality rubric in 2015, incorporating findings from the literature and feedback from faculty partners and online students. In February 2019, I began developing public-facing annotated bibliographies that showcase key citations for best practices in online teaching. Below, I have provided the annotated bibliography, which can also be accessed at https://odee.osu.edu/instructors/distance-education/quality-assurance/teaching-strategies.

Online Course Quality Initiative Hand-off Survey Revision

As a part of my continued work on the course quality initiative at ODEE, I have contributed to substantial revisions of the end-of-course-development hand-off survey. Instructional designers on the ODEE Distance Education team complete this survey after finishing a 14-week development or revision cycle with faculty. In the survey, instructional designers choose from menus covering the academic integrity tools used, details about the multimedia production process, the kinds of teaching materials used, the kinds of assessments used, and the types of ancillary skills resources used in the course. These questions appear after IDs provide ratings for each category.

With these changes to the hand-off survey, ODEE can begin to answer new kinds of questions with new sources of data:

  • Some example internal process questions are: “Which academic integrity tools are used most? Do we need additional partnerships with the Library or Writing Center for more resources?”
  • Some example program director questions are: “Why are our students disengaged?” (answered by reviewing the types of learning materials and assessments used) or “How do we fix our problem with academic integrity?” (answered by reviewing whether any tools or strategies are in place to deter academic dishonesty).

Online Course Quality Hand-off Survey Calibration Activities

In order for the team to understand how we all use the OCQ survey and apply its contents, I have begun to develop calibration activities for the survey. By running these at least once per semester, we can ensure that our instructional designers are using the hand-off survey in the same way and that each instructional designer is reviewing courses for the elements our course design process enhances with faculty (e.g., the presence of the teaching strategies elements). This will also ensure that the data produced by our survey are useful both internally and to our program directors.
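To make the calibration idea concrete, the sketch below shows one simple way agreement could be checked after a calibration exercise, assuming each instructional designer rates the same sample course and the ratings are exported as numeric scores. The designer names, category names, scores, and the percent-agreement measure are all illustrative assumptions, not our actual survey export or procedure.

```python
from itertools import combinations

# Hypothetical ratings from a calibration exercise: each instructional
# designer reviews the same sample course and completes the hand-off survey.
# Names, categories, and scores are placeholders, not real survey data.
ratings = {
    "ID_A": {"teaching_strategies": 3, "assessments": 2, "academic_integrity": 3},
    "ID_B": {"teaching_strategies": 3, "assessments": 3, "academic_integrity": 3},
    "ID_C": {"teaching_strategies": 2, "assessments": 2, "academic_integrity": 3},
}

def percent_agreement(ratings):
    """Share of category ratings on which each pair of designers agrees exactly."""
    agreement = {}
    for a, b in combinations(ratings, 2):
        categories = ratings[a].keys() & ratings[b].keys()
        matches = sum(ratings[a][c] == ratings[b][c] for c in categories)
        agreement[(a, b)] = matches / len(categories)
    return agreement

for pair, score in percent_agreement(ratings).items():
    print(pair, f"{score:.0%}")
```

Pairs with low agreement on a given category would flag that category for discussion at the next calibration session.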


INFORMING COURSE DESIGN THROUGH LITERATURE

Another key component of this role entails leading the creation of short reports that curate and synthesize key findings around online course design considerations for various modalities (e.g., blended learning/hybrid courses and high-enrollment courses) and discipline-specific online pedagogical considerations (e.g., STEM). As a result of COVID-19 in Spring 2020, my work has focused heavily on the development of resources for the effective design of online STEM courses at Ohio State.

Online STEM Courses at Ohio State

Online Labs at Ohio State


EVALUATION & RESEARCH PROJECTS

Undergraduate Student Government Survey on Online Class Preferences

During Summer 2019, the Undergraduate Student Government (USG) partnered with the Office of Distance Education and eLearning to develop a survey to understand, generally, Ohio State undergraduate students’ experiences with their online courses and their preferences when choosing to enroll in an online course. We have been working with Institutional Research and Planning (IRP) to administer this survey under the umbrella of quality improvement for our online course quality initiative. The survey responses will help us improve the Ohio State online course experience overall and identify potential areas of improvement across various colleges and units at Ohio State.

In addition to using the data for this quality improvement project, we plan to share the findings with units comparable to ODEE and USG at other higher education institutions as part of a research project. The published literature does not provide many insights into the motivations for a “traditional” college student (i.e., a student aged 18-24) to enroll in online courses. Most published studies focus on students enrolled in community colleges or working-professional graduate programs. While these studies offer some reasons for taking online courses, they may not apply to the “traditional” college student enrolled at Ohio State.

I serve as the Principal Investigator on the IRB-exempt research protocol (#2020E0141).

Data Literacy Research

From educational practice, we know that teaching data literacy skills can be challenging. From the field of instructional design and educational technology, we know that using interactive modules, which contain multimedia learning objects, can deliver content effectively and improve student outcomes. However, the literature contains few studies that elaborate on using online, interactive modules to teach specific data literacy skills such as graph and table interpretation.

In this study, we investigate the impact of an online, interactive module developed to help graduate students in BMI 7810: Design and Methodological Approaches in Biomedical Informatics develop their data literacy skills.

In order to understand the instructional value added by this interactive, online module, we have also developed a non-interactive alternative, which follows the same basic pattern as our module of interest but lacks multiple rounds of immediate instructor feedback, an outline of the student journey through the activities, and linked, clickable spots containing instructor feedback.

We will compare the impact of the interactive and non-interactive versions of the module on students’ mastery of data literacy skills and their perceptions of the module’s efficacy in helping them improve those skills.
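As a rough illustration of the planned comparison, the sketch below contrasts mastery scores from the two module versions with an independent-samples (Welch’s) t-test. The scores are placeholder values and the choice of test is an assumption for illustration only; the actual analysis will depend on the data we collect and the measures we finalize.

```python
from scipy import stats

# Hypothetical mastery scores (e.g., percent correct on a graph/table
# interpretation assessment); placeholder values, not study data.
interactive = [82, 88, 75, 91, 84, 79, 86, 90]
non_interactive = [78, 81, 74, 85, 80, 77, 83, 76]

# Welch's t-test (does not assume equal variances) comparing group means.
t_stat, p_value = stats.ttest_ind(interactive, non_interactive, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```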

I serve as the Principal Investigator on the IRB-approved research protocol (#2018B0279).