[UPDATED - with a whole load of qualifying statements and disclaimers!]
Google Analytics offers a couple of views of the same report - the Visitor Trending "Visits" report - that might be useful in identifying study patterns.
(Note that these reports were captured over slightly less than the 8 week period mentioned in the previous post.)
A more rigorous test would be to run this time series data through an autocorrelation function, or some form of spectral analysis, for example, to see whether it is "really" periodic. (But this is a quick blog post, not a formal paper, so here is not the place to run that sort of test... unless anyone knows of a quick'n'easy online tool I can paste my GA data into and get such a report out?)
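For what it's worth, here's the sort of quick'n'dirty test I have in mind - a minimal Python sketch, not anything I've actually run against this data, with a made-up file name and column layout standing in for however the GA data gets exported. A pronounced peak in the autocorrelation at a lag of 7 days would lend some weight to the weekly pattern:

```python
# Minimal sketch: autocorrelation of a daily visit-count series.
# "ga_daily_visits.csv" and its layout (a header row, then date,visits
# columns) are hypothetical stand-ins for a Google Analytics export.
import numpy as np

def autocorrelation(series):
    """Return the normalised autocorrelation of a 1-D series."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf / acf[0]

# Load just the visit-count column (column 1), skipping the header row.
visits = np.loadtxt("ga_daily_visits.csv", delimiter=",",
                    skiprows=1, usecols=1)
acf = autocorrelation(visits)

# A clear peak at lag 7 (days) would suggest a weekly study rhythm.
for lag in (1, 7, 14):
    print(f"lag {lag:2d} days: autocorrelation = {acf[lag]:.2f}")
```

(For the spectral version, a periodogram via numpy.fft would do a similar job.)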
That said - the report needs to be taken with a pinch of salt (also see [update] below...). I mentioned in the previous post that the number of unique visitors significantly outnumbers the number of students registered on the course (i.e. the number of people who have access to the course materials): that is, there is a many-to-one mapping from Google unique visitors to students. It's also worth remembering that the course in question requires at least as much study time again on book reading and desktop computer activity...
A consideration of data across the whole length of the course also suggests that the periodicity is not so clear cut. There is also a clear "end effect" around the time of the end-of-course assessment, as I will show in the next post on this topic.
Visits - By Time of Day gives just such a report. (My analytics settings are set to UK time, so the following should be the "real" time of access for our UK students; a map overlay (not shown) indicates that virtually all the students on the course were coming from UK IP addresses.)
The distribution suggests that we don't have so many night owls, nor a significant number of students studying early in the morning (before work or getting the kids ready for school, for example). Peaks around lunchtime might suggest online study at work (or there again, they might not...), with another bump in the evening between 7pm and 10pm.
A finer grained breakdown, e.g. giving time of day reports separately for weekdays, Saturday and Sunday, would help to unmask any differences between weekend (i.e. Sunday!) working and weekday working.
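Google Analytics doesn't give that report directly (as far as I know), but given an export with one row per date and hour, something along these lines would do the aggregation - again just a sketch, with the file name and column names ("date", "hour", "visits") made up for the purpose:

```python
# Sketch of the weekday/Saturday/Sunday time-of-day breakdown.
# "ga_visits_by_hour.csv" and its columns (date, hour, visits) are
# assumed here, not an actual Google Analytics export format.
import csv
from collections import defaultdict
from datetime import datetime

profiles = defaultdict(lambda: [0] * 24)  # day type -> visits per hour

with open("ga_visits_by_hour.csv") as f:
    for row in csv.DictReader(f):
        day = datetime.strptime(row["date"], "%Y-%m-%d").strftime("%A")
        day_type = day if day in ("Saturday", "Sunday") else "Weekday"
        profiles[day_type][int(row["hour"])] += int(row["visits"])

# Print the three hourly profiles side by side for comparison.
print("hour  Weekday  Saturday  Sunday")
for h in range(24):
    print(f"{h:4d}  {profiles['Weekday'][h]:7d}  "
          f"{profiles['Saturday'][h]:8d}  {profiles['Sunday'][h]:6d}")
```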
Two views of a single Google Analytics report - the Visitor Trending "Visitors" report - allow us to identify study patterns across time of day, and day of week.
Visual observation of reports collected over a two week period hints at patterns of behaviour across the week, with Sunday, Monday and Tuesday being popular study days and Saturday being unpopular; and throughout the day, with lunchtimes and early evening popular, and all-night and early-morning study unlikely.
[UPDATE: The "Saturday effect" may be completely misleading insofar as it describes how students study the whole of the course... I mentioned that the course includes book reading and use of a desktop application, which are being studied - somewhen; the analytics reported here therefore only relate to the online portion of the course, which correspond to maybe 40% of the time spent on the course... oops... I'll try to focus more on what is maybe safer ground in future posts, and concentrate on what the analytics appears to tell us about just the way the students appear to be interacting with the online materials, at least, without worrying too much about those uncounted students who didn't just print the materials off and study them that way!)]
"Confession": maybe not publishing in peer reviewed circles for a bit has led to me losing some of the rigour of academic analysis? A couple of times I've been tempted to write: "-maybe, this is only a blog post after all and not a proper academic publication....", as if that let's me off... Hmm... I wonder how this sits with Martin's thoughts on Academic Discourse and [Blogging]? Certainly, Grainne Conole's excellent new e4innovation blog is something I would say does tend towards "formal" academic credibility (references 'n' everything ;-)
Tags: analytics, courseanalytics
Posted by ajh59 at October 26, 2007 12:09 AM

Kate Sim posted the following comment to the "Course Analytics, Prequel" post, which is worth adding here too:
"If I have guessed the course correctly, Tuesday activity levels may have been influenced by moderator messages on Mondays saying 'Week x starts tomorrow'.
"I suspect a similar weekly pattern is less likely to be seen on the longer 9 month courses, where there would be a greater divergence in the timing of studying particular parts of the course.
"Although peaks around assessment deadlines would still be seen - similar to the patterns of activity seen in the student forums."
This point raises quite a few issues, I think:
- the population of the course in question is small for an OU course (around 100 or so) and there are potentially significant single point intervention effects from moderator postings. A complete set of analytics should include traffic and "lurking" monitoring from the course conferences/forums;
- the course is short compared to most OU courses (a 10 week, 10 CAT point short course as opposed to a 30 or 60 point, 30-week course);
- the course is presented/paced using online materials (although there is book reading and required use of a desktop application).
tony
Posted by: Tony Hirst at October 26, 2007 09:26 AM