So you want to build an LMS plug-in?


It seems every couple of semesters, the Office of Distance Education and eLearning (AKA the folks who run Carmen) gets a visit from someone who has a great idea for an add-on product.  Carmen is technically an LMS, or Learning Management System.  I had such a meeting this week with my wonderful colleague, Valerie Rake, and afterward I thought it might be wise to capture the suggestions we made to this up-and-coming developer for future generations of software developers and business people who might want to save time, money, and headaches on their LMS software development project.  If this applies to you, a few notes might be in order.

By way of background, I work in Distance Ed and eLearning at Ohio State University, and I spend a good chunk of my time in pilot projects determining how outside vendor products might be coaxed into connecting with our LMS (think Carmen), along with any number of other issues.  Valerie manages the support team for Carmen, Top Hat, Mediasite, Turnitin, and a slew of other ODEE services that students and faculty use on campus every day, and has worked with enough bad LMS add-ins to make one’s head spin like a scene from a horror film.  Here are a few of our thoughts from our meeting this week.

1) If you’re developing, learn the LTI (Learning Tools Interoperability) standards, which work with essentially every major LMS platform. If you decide a custom API integration will work better, you’re going to be dependent on the client’s team of developers caring enough about your particular product to make it work with their systems. If what you’re working on is the best thing since sliced bread, that developer team may spring into action.  If it’s one of fifteen products that do similar things, they probably won’t.  Using LTI standards makes integration with the LMS a matter of a quick security review and a few settings changes within the LMS, not months of troubleshooting.
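As a concrete (and simplified) sketch of what the LTI side of this looks like: an LTI 1.1 launch is an OAuth 1.0-signed form POST from the LMS to your tool, and your tool verifies the signature with the shared secret the LMS admin configured. Everything below is plain-stdlib Python; the URL and secret are placeholders, and real code would also check the OAuth nonce and timestamp.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def oauth_signature(method, url, params, consumer_secret):
    # Build the OAuth 1.0 signature base string (RFC 5849):
    # percent-encode each key/value, sort, then join.
    pairs = sorted((quote(k, safe=""), quote(str(v), safe="")) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base = "&".join(quote(s, safe="") for s in (method.upper(), url, param_str))
    # LTI 1.x launches have no token secret, so the signing key ends with "&".
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

def launch_is_valid(launch_params, url, consumer_secret):
    # Recompute the signature over every field except the signature itself,
    # then compare in constant time.
    params = dict(launch_params)
    claimed = params.pop("oauth_signature", "")
    expected = oauth_signature("POST", url, params, consumer_secret)
    return hmac.compare_digest(claimed, expected)
```

If the signature checks out, you know the launch really came from an LMS that holds the secret, which is most of what that “quick security review” wants to see.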

2) Security reviews and local accounts are another great reason to use LTI standards. If you build a product where users create an account on your system, you’re opening a can of worms that may prevent your product from passing a university security review. We have no way of knowing whether ‘BuckeyeFan2015@roadrunner’ is actually a student in your class. Even if we know that hooker.24@osu is somehow connected to The Ohio State University, we still don’t know whether he is currently in the class using your app, or whether he dropped your class for advanced basket weaving on day one. And, in case this argument isn’t strong enough, how do you feel about 500 password resets a day for the first week of every semester? Every semester! Forever! Pull account information for enrolled students through LTI and you won’t have to worry about any of these issues.  And you’ll never have to reset a password.
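To illustrate the point, here is a hypothetical sketch of pulling identity from the LTI launch instead of minting local accounts. The field names (`user_id`, `context_id`, `lis_person_*`) are standard LTI 1.1 launch parameters, but the profile shape is invented for the example, and the name/email fields are optional and privacy-controlled, so the code degrades gracefully when the LMS withholds them.

```python
def profile_from_launch(params):
    """Build a local user profile from the identity the LMS asserts
    in an LTI 1.1 launch, instead of creating a separate account."""
    return {
        "lms_user_id": params["user_id"],        # opaque, stable per-LMS id
        "course": params.get("context_id"),      # which class this launch came from
        "name": params.get("lis_person_name_full", "Anonymous"),
        "email": params.get("lis_person_contact_email_primary"),
    }
```

Because the LMS asserts identity on every launch, a dropped student simply stops appearing; there is no stale account, and nothing for you to reset.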

3) While we’re thinking about security, consider your data. Are you moving student information securely? Is it secure at rest on your server? Are enrollment and scoring data moving directly from your server to our LMS (in encrypted form), or are you asking faculty to download a copy of the student data to their desktop computer before uploading it to the LMS? If your solution is the latter, the hair on the back of our security team members’ necks will stand on end.  This is the point where many companies say ‘Hey, I can just use an API to move that stuff back and forth.’  Remember, that will only work for one specific LMS.  Use LTI standards and it should work for any LMS, not just one.
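For the grades themselves, LTI 1.1 includes a Basic Outcomes service: the tool POSTs a small ‘replaceResult’ XML body (OAuth-signed, over HTTPS) straight to the LMS grade book, so no spreadsheet ever touches a faculty desktop. A rough sketch of building that body follows; the sourcedid value in the test is a placeholder, as real ones arrive in the launch as `lis_result_sourcedid`.

```python
import uuid
import xml.etree.ElementTree as ET  # also handy for parsing the LMS response

def replace_result_xml(sourcedid, score):
    """Build the replaceResult body for the LTI 1.1 Basic Outcomes
    service. Scores are normalized: 0.0 through 1.0."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("LTI outcomes scores must be between 0.0 and 1.0")
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXHeader>
    <imsx_POXRequestHeaderInfo>
      <imsx_version>V1.0</imsx_version>
      <imsx_messageIdentifier>{uuid.uuid4()}</imsx_messageIdentifier>
    </imsx_POXRequestHeaderInfo>
  </imsx_POXHeader>
  <imsx_POXBody>
    <replaceResultRequest>
      <resultRecord>
        <sourcedGUID><sourcedId>{sourcedid}</sourcedId></sourcedGUID>
        <result><resultScore><language>en</language><textString>{score}</textString></resultScore></result>
      </resultRecord>
    </replaceResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>"""
```

The body gets POSTed to the `lis_outcome_service_url` the LMS supplied at launch, signed with the same OAuth machinery as the launch itself.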

4) Business tends to see participants in a class as either one faculty member or a set of students; if you or your developers see education this way, you’re going to have trouble. Faculty co-teach classes. Staff members assist faculty with course setup.  Grad students may teach one class while being a student in another. Retrofitting roles and permissions will give you grief later, so be flexible early in the process. Grab each user’s role in the specific class via LTI, but also design your software with provisions for two or more instructors, five TAs grading assignments, and so on.  You get bonus points if your software can combine class sections into one coherent list of people and break those sections back apart when different TAs need to grade only the students in their own section.  If you think about this sort of challenge early in your design process, the headache you save later will be your own. (Or your development team’s.)
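A sketch of what honoring that flexibility might look like: parse the LTI `roles` launch field as a set and derive permissions from it, rather than storing one hard-coded role per user. The permission names here are invented for the example; the role values are standard LTI ones.

```python
def permissions_from_roles(roles_param):
    """Derive per-course permissions from the comma-separated LTI `roles`
    field. Roles may arrive as bare names ("Instructor") or as full URNs
    ("urn:lti:role:ims/lis/TeachingAssistant"), and one person can hold
    several at once, so treat the field as a set, never a single value."""
    roles = {r.strip().rsplit("/", 1)[-1].lower()
             for r in roles_param.split(",") if r.strip()}
    return {
        "can_grade": bool(roles & {"instructor", "teachingassistant"}),
        "can_configure": "instructor" in roles,
        "is_student": "learner" in roles or "student" in roles,
    }
```

Because the role arrives with each launch, the grad student who teaches one class and takes another simply gets different permissions in each course, with no global account flag to untangle.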

5) Collaboration is the way things move forward in a university setting.  Time and time again we encounter software with oddities like only allowing one admin account access at a time, including solutions from huge software companies.  Is it a hassle for me to find my boss to ask if they remembered to log out after getting usage statistics from some admin site the day before? You bet it is.  That same huge company I alluded to also rolled out a media authoring tool that initially didn’t allow faculty to invite their peers to collaborate on a project.  Working alone is the exception in Higher Education, rather than the rule.  Develop with collaboration in mind.  On the other hand, don’t assume faculty want to share everything with everyone.

6) Support is on the minds of most of the vendors who come calling at OSU, but the old 9-to-5 Eastern support day is coming to an end. Today’s students are studying across the state and across time zones. Consider which calls can be handled by the university’s in-house support as the first line of defense, and build up your own support for the second, third, and fourth levels.

7) Consider what happens to student data when the semester ends and grades are filed.  Have you built an easy way for faculty to archive what they want to keep, and securely remove what they don’t?  Think about data storage policies, and how long you want to guarantee faculty can get back to that data.  Unless you have unlimited funding, you don’t want to store it forever.  Our security team probably doesn’t want you to have that exposure either.
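Even a trivial retention sketch forces the policy conversation early. Something like the following, where the two-year window is purely an assumption for illustration, not a recommendation:

```python
from datetime import datetime, timedelta, timezone

# Retention window is an assumption; set it to whatever your policy
# (and your security review) actually allows.
RETENTION = timedelta(days=365 * 2)

def purge_candidates(records, now=None):
    """Given {record_id: term-end datetime}, return the ids whose
    retention window has lapsed and are ready for secure removal."""
    now = now or datetime.now(timezone.utc)
    return sorted(rid for rid, term_end in records.items()
                  if now - term_end > RETENTION)
```

Run on a schedule, a routine like this keeps stored student data bounded instead of growing forever.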

This is just a quick list and there are undoubtedly many other things to consider, but if you begin thinking about these sorts of issues when you begin designing your amazing new LMS tool, I’m sure you’ll be ahead of most of your competitors.

iTunes U Bootcamp 2015

ODEE Learning Programs hosted a bootcamp on iTunes U course creation for OSU faculty and staff the last week of May.  Here are a few images I shot during the two day event which we held at Sparkspace in downtown Columbus.  I’m looking forward to seeing new courses go live this summer after the hard work our boot campers put in on their courses.

Innovate 2015 gallery

Here are some of the images I shot during Innovate 2015.

Online Proctoring for Distance Education

A growing wave of distance education options across academia has spawned a crop of companies offering online proctoring services to ensure students taking courses online are held to the same academic standard during testing as those who take tests on campus.  This is a broad overview of the current techniques available from these service providers.

Live Proctor services: 

These services require a student to schedule their exam time.  At the requested time the student connects and shows a photo ID to the remote proctor via their computer’s webcam.  The student is then asked to show their desk area and the room where they are taking the exam with their webcam, after which they are prompted by the proctor to begin their test.  If there is a second monitor in the room, students are instructed to turn it off so it can’t be used for crib sheets or notes.  In this model the proctor may hold a code to unlock the exam, which they can enter remotely so the student never sees the code.

Live proctors monitor the webcam view and computer screen of the student taking the test, watching for any issues that might be construed as cheating.  If they see suspicious behavior, the proctor may intervene or record the suspect behavior for later review.  A report is generated for the instructor, who may review the video and escalate issues further as needed.  Typical cost for a one-hour live-proctored exam is in the range of 15~20 dollars.

Test Recording and Review services:

In this solution, technology is used to record the student and their computer screen space without a live proctor present.  Typically a student takes a snapshot of their ID at the beginning of the exam, which is saved and compared against the ID used to set up the account.  The student is then asked to show their testing space with a 360 degree sweep of the video camera and a sweep of their desk space.  Both these sweeps are recorded and stored, along with the time spent actually taking the test.

The webcam and desktop recordings are evaluated either by human test reviewers who watch the recording at a faster speed, looking for patterns that reflect cheating (like glancing sideways every minute), or by an initial computer analysis, with suspect behavior flagged for review by a human evaluator.  When an incident looks suspicious, the reviewer can escalate it for faculty review.  The primary advantages of recording services are a slightly cheaper price per test and an ‘on-demand’ test environment where scheduling a time is basically unnecessary.

Unique alternatives:

One service, called voice proctor, offers an alternative to the video camera and screen capture review system.  In this system a student registers their phone number and records a passage through their computer microphone.  This creates a ‘voice print’ of the student that is stored for subsequent tests.  The instructor can trigger a call to a student when they complete either a specific question or one of a random set of questions.  During this call the student is asked to verbally explain why they answered one of these trigger questions the way they did.  This explanation is recorded and compared against the ‘voice print’ to confirm the identity of the student taking the test, and the instructor gets access to the verbal exam answers to confirm student mastery of the topic.  One benefit of this sort of solution is the lower bandwidth required: there is no video stream to operate alongside the online test, which matters in places where fast internet is unavailable.  The other potential benefit is the opportunity to pose oral exam questions in a space where written or multiple-choice exams are the norm.

Ultimately these services are quite new and evolving rapidly.  With the addition of fingerprint readers to smart phones we will soon be even better prepared to prove the identity of online students, which is great news considering the projected growth of online education.

The information for this post was collected in collaboration with John Muir of ODEE.

Thanks John!

Part 2: NMC Horizon Report: Significant Challenges and how OSU is addressing them

In my last post I discussed how The Ohio State University rates against the Key Trends section of the 2014 New Media Consortium Horizon Report. I am following up with a look at how Ohio State is addressing the Significant Challenges and Important Developments sections of the report. Part 1 of this post is available here and both posts were influenced by discussions with Digital First team members.

Horizon Report Section 2: Significant Challenges Impeding Ed Tech Adoption in Higher Education

This section of the report divides the challenges into defined categories: Urgent challenges, where we understand the problem and know how to solve it; Difficult challenges, where we understand the problem but solutions are elusive; and Wicked challenges, which are complex to even define, much less address.

Urgent Challenges:
Low Digital Fluency of Faculty: This challenge has been a constant since the first days of The Digital Union on the OSU campus.  My observation is that user interface design, the part of the product that users see, has gotten much better over the years, both graphically and in how consistently it works.  As more opportunities to use digital media arise, and as faculty begin utilizing materials produced by projects like OpenStax, the skills to utilize these tools will follow.  Another component of improving digital fluency for faculty is teaching the skills to produce their own materials, a process which has moved forward by leaps and bounds over the past five years.  Ten years ago a writer would need to learn Quark Xpress or Adobe Illustrator for an extended period in order to produce a decent-looking book; today a faculty member can publish an amazing book in iBooks Author or put up a course on iTunes U after a day or two at a Digital First boot camp.  In spite of the growing flood of digital materials coming out of our programs, there is another group of faculty who are unaware of this support at Ohio State, or simply have no interest in digital publishing.  This may be tied to tenure publishing requirements not being in tune with the rapidly evolving digital publishing boom.  Time will tell if the old requirements can be aligned with new techniques; it seems there is a movement on campus to recognize this work.

Relative Lack of Rewards for Teaching: The question of faculty compensation and the relative compensation for research versus teaching, along with the current attention media is giving to compensation for adjunct faculty, tenured faculty, and administrators is a broad conversation beyond the scope of this post.

Difficult Challenges:
Competition from New Models of Education: Online colleges and MOOCs were the first salvo in shifting the model of higher education, and more will come as schools sort out the challenges of granting credit and improving completion rates for online instruction.  Ultimately these systems must engage students to be effective; the New York Public Library MOOC support group plan is an early attempt to bring interaction and face-to-face support back to these very large classes where students may not feel connected.  My other thought on this trend is that students coming out of high school may have had technology in their hands for virtually their entire school career, so the jump to digital learning will get easier for new students.  Ohio State must be in a position to compete for the attention of these new college students, or they will go elsewhere.

Scaling Teaching Innovation: This is a problem wherever there is no focused effort to roll out technology that can be supported across campus; the greatest challenges lie with bleeding-edge and innovative solutions that may not be developed enough for general use by all faculty.  Mike Hofherr has scaled back a very broad set of services that were run for small numbers of faculty users, and has instead focused his department’s efforts on supporting technology that can be implemented across campus for all users.  The shift can be seen in the ODEE service catalog, which now offers 10 core services.  Another effort to scale up innovative teaching is the Innovate conference, begun five years ago with the goal of showcasing faculty who are using new technology well to better teach their students.

Wicked Challenges:
Expanding Access: The increasing numbers of students seeking education who may not be prepared for the demands of college-level course work are identified as one wicked challenge, a challenge that is complex to define and address.  I have seen glimmers of hope in the work of Dr. Matt Stoltzfus, whose initial efforts in online chemistry instruction were created to bring up the skills of high school students who were interested in chemistry.  This effort led to his iTunes U General Chemistry course, with 322 short lessons and hundreds of thousands of downloads.

Keeping Education Relevant: Fast innovation in technology fields continues to challenge universities.  The Horizon Report calls out the issue of staying relevant in a time of fast-paced change, but I would also suggest technology and support groups must become more agile.  The days of a committee taking a year or two to evaluate a solution are fading, since new solutions or pricing models may be available before that year is done.  This may force us to make ‘best option at this time’ decisions that have to be revisited over a shorter time frame than large universities traditionally consider; holding to the traditional pace of decision making will doom us to selecting outdated technology or outdated content for our customers.

Conclusion:  As I look over this post, agility and willingness to change quickly are the common themes. A mindset where we adopt the best solution available today with the understanding that a better solution may come along in the next year or two is key to success in a rapidly changing environment.  Being willing to review and change as new solutions, technologies, instructional techniques, and source materials become available will allow universities to remain leaders in the face of these challenges.

Student Response evaluation 2014

As Innovation Lead for the Office of Distance Education and eLearning, my job involves setting up tests of new technologies that may be adopted by the university.  At this time I am putting the final touches on a test of Student Response systems.

In the past OSU supported clickers from Turning Technologies.  Today, when virtually every student has a laptop, smartphone, tablet, or music player that can show web pages or run apps, we have a number of new student response options that allow for more complex answers than A/B/C/D, using devices students already have in their pockets and book bags.  Image heat maps, formulas, class discussions, and more can all be accomplished through modern student response tools.

Over the past three months a cross-university Student Response Committee evaluated potential vendors.  The committee felt two features were critical to adoption: painless export of student data to Carmen grade book and a single sign-on so students can use their university ID to sign in on the vendor system.

By choosing ResponseWare, TopHat, and ViaResponse for evaluation, we believe these priorities can be met.  With this in mind, I am interested in having a small group of OSU faculty test as many of the three systems as you feel comfortable evaluating, so we have as much data as possible as we consider the benefits and pitfalls of each system.  I would ask any participants to agree to test at least two.

The process of evaluation will be conducted in three phases; the first product to be evaluated is ResponseWare.

Calendar:
February 6 ~ 7:  Training with ResponseWare representatives
February 10 ~ 21:  Test ResponseWare in class
February 19 ~ 26: Release ResponseWare student and faculty polls

February 24 ~ 25: Training with TopHat
March 3 ~ 13: Test Top Hat in class
March 12 ~ 19: Release Top Hat student and faculty polls

March 17 ~ 18: Training on ViaResponse
March 19 ~ April 3: Test Via Response in class
April 2 ~ 9: Release Via Response student and faculty polls

Mid April: wrap up meeting with participating Faculty and Student Response Evaluation Committee members to discuss features, strengths, and weaknesses of each product.  We will test Carmen grade book integration and single sign-on capabilities in May and put forth a final recommendation before the start of the next fiscal year.

In addition to learning about student response systems, participating faculty and their students will be given the opportunity to test and offer feedback that will guide university decision making on this important classroom tool.  Access will be free for all students.

If this sounds like something you or a faculty member you know would be interested in trying, please contact me at hooker.24@osu.edu with your course name/number and the number of students in your class before February 7th and I’ll get you into the pilot.

Thank you!

Dave Hooker