October 24, 2007

Course Analytics - Prequel

Earlier this week, Stephen Downes picked up on an EduCause report on Academic Analytics, which "describes how data from sources such as a course management system (CMS) or a student information system (SIS) can identify at-risk students through analytics and predictive modeling". Academic analytics can also be used to predict student success, as well as retention.

In this post, I thought I'd set the scene for a series of course analytics posts using data that I "acquired" from an online course earlier this year. (I've posted previously about some of my early thinking on the topic.)

In contrast to academic analytics, one of the things I set out to explore was how an off-the-shelf web analytics tool (Google Analytics) could be used to help me learn more about what students were doing with our online course materials, and to help me identify what - if anything - a "learning site's" goals could be, and what the site might be optimised for.

Part of the difficulty lies with the site not being a "selling" site. Optimising a mini-site like Study at the OU is in the first instance all about securing registrations on courses. But for an online course, it's not so clear what the goal is, or what the important metrics along the way are.

(I asked a similar question at the Library Strategy workshop a couple of weeks ago: if the library website were Amazon, then success would be measured in taking someone's credit card number, shipping the goods, and not getting them returned. What's the success measure - or success measures - for a library website?)

Something I found useful in helping me think about this topic was this seminar on Google Analytics - Non-Ecommerce Sites: Beyond Averages from the Google Conversion University Playlist on YouTube (the Google Conversion University website also has some interesting articles...).

In particular, here are four distribution (rather than average) measures that are useful for analysing user behaviour on non-ecommerce websites (a rough sketch of how they might be computed from raw visit logs follows the list):

  • Visitor loyalty - how often each user has visited the site over a given period;

  • Visitor recency - of all the people who have visited the site, how many have visited in the last N days;

  • Length of visit - how long visitors stay on the site;

  • Depth of visit - how many pages on the site are viewed on each visit.
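
To make those four measures a bit more concrete, here's a rough sketch of how they might be computed from raw visit records. This is my own toy illustration, not anything Google Analytics exposes directly: it assumes a hypothetical CSV export, course_visits.csv, with one row per visit (visitor_id, an ISO-format timestamp, duration_secs, pages).

    import csv
    from collections import Counter
    from datetime import datetime

    # Load one-row-per-visit records from the hypothetical CSV export.
    visits = []
    with open("course_visits.csv") as f:
        for row in csv.DictReader(f):
            visits.append({
                "visitor": row["visitor_id"],
                "when": datetime.fromisoformat(row["timestamp"]),
                "duration": int(row["duration_secs"]),
                "pages": int(row["pages"]),
            })

    # Visitor loyalty: distribution of visits-per-visitor over the period.
    visits_per_visitor = Counter(v["visitor"] for v in visits)
    loyalty = Counter(visits_per_visitor.values())

    # Visitor recency: days since each visitor was last seen.
    latest = max(v["when"] for v in visits)
    last_seen = {}
    for v in visits:
        if v["visitor"] not in last_seen or v["when"] > last_seen[v["visitor"]]:
            last_seen[v["visitor"]] = v["when"]
    recency = Counter((latest - t).days for t in last_seen.values())

    # Length of visit: distribution of visit durations, in whole minutes.
    length = Counter(v["duration"] // 60 for v in visits)

    # Depth of visit: distribution of pages viewed per visit.
    depth = Counter(v["pages"] for v in visits)

    for name, dist in [("loyalty", loyalty), ("recency", recency),
                       ("length (mins)", length), ("depth", depth)]:
        print(name, dict(sorted(dist.items())))

The point of each of these being a distribution rather than an average is that you can see the shape of the behaviour: a site where half the visitors view one page and half view twenty looks very different from one where everybody views ten or so, even though the averages match.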

Over the next week or so, I'll do a series of posts looking at these measures as obtained from an online course, and go way beyond reports like this...

That said - there are a couple of interesting features in the above report - periodicity, for example (Tuesdays look busy...), and traffic peaks in the two assessment weeks...
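
(The weekday periodicity is easy enough to check if you have the raw visit records to hand - a quick sketch, carrying on from the hypothetical visits list built above:)

    # Tally visits by day of week to expose any weekly periodicity.
    weekday_counts = Counter(v["when"].strftime("%A") for v in visits)
    for day in ["Monday", "Tuesday", "Wednesday", "Thursday",
                "Friday", "Saturday", "Sunday"]:
        print(day, weekday_counts.get(day, 0))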

PS Just by the by, if you use consultants to do your reading for you (i.e. you read through reports or sit through presentations that talk up The Innovator's Dilemma/Solution, or Blue Ocean Strategy, and so on), I'm guessing that Competing on Analytics will be the next "bestselling" Harvard Business School book they quote at you! (I'm not convinced that the Black Swan will fly, though ;-)


Posted by ajh59 at October 24, 2007 12:46 PM
Comments

If I have guessed the course correctly, Tuesday activity levels may have been influenced by moderator messages on Mondays saying "Week x starts tomorrow".

I suspect a similar weekly pattern is less likely to be seen on the longer nine-month courses, where there would be a greater divergence in the timing of studying particular parts of the course.

Peaks around assessment deadlines would still be seen, though - similar to the patterns of activity seen in the student forums.

Guess who has just finished writing her annual moderator reports!

Posted by: Kate at October 26, 2007 01:11 AM