Monday, April 9, 2012

OpenEd User Perspective - My survey results

Thanks to all of you who responded to my request to fill out a short questionnaire! In this blog post I will summarize and discuss the results.

Research question
In a recent post I vented my frustration with the limited interaction in the Intro to Openness in Education course. Realising that people participate in this course in different ways--e.g. on/off campus, grad students, professionals--I wondered whether there is a difference in perception of factors such as interaction.

Hypothesis
Given some of the comments I had read in blogs about class discussions, I would even go so far as to assume that on-campus participants would be more positive about the interaction than off-campus participants.

Method
Not wanting to reinvent the wheel (or better: knowing how difficult it is to develop a good instrument), I decided to see if one had been developed before. I selected one by Ward, Peters and Shelley (2010), who asked participants to rate dimensions of effective instruction for different course formats. These dimensions were reused here, and subjects were asked to rate, from low to high, how each related to IOE12.

Three questions were added about location (on/off campus), intentions (university credit, badges, etc.) and a field for comments on interaction in the course.

Subjects
Participants in the course were targeted for the questionnaire. This was announced through my personal course blog, through Twitter using the course hashtag, and, where possible, to individual participants through email, blog comments or direct tweets.

Results
Google Forms produces a summary that includes a link to all data. In the five days after the announcement of this short survey, eleven fellow participants filled out the questionnaire: two on-campus and nine off-campus participants, with a variety of intentions.

For further analysis, one submission was excluded. The participant had selected all 1's, stating "[..] I was very interested in the course but work overload made me difficult to follow it properly. [..]"

The extent to which the dimensions of effective instruction relate to the course is shown in Table 1 and Graphic 1.


Graphic 1. Dimension means and standard deviations.

Table 1. Dimensions of instructional effectiveness. 

The participants score Intro to Openness in Education high on the dimensions quality and amount of content, encouraging active learning, respecting diversity, ease of access, and minimising costs. The other dimensions score considerably lower. But there are big differences of opinion in the individual scores.

To examine differences between on- and off-campus participants, means for the respective groups are shown in Graphic 2. It appears that the on-campus participants scored all dimensions considerably higher than the off-campus participants. Because of the small number of subjects, it is impossible to establish any statistical significance at this point.
Graphic 2. Dimension means per group.
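For anyone curious how the group comparison behind Graphic 2 was computed, here is a minimal sketch in Python. The ratings below are made up for illustration--they are not the actual survey data--but the calculation (per-group, per-dimension mean and standard deviation) is the same:

```python
from statistics import mean, stdev

# Hypothetical ratings (low = 1 to high = 5) per group and dimension.
# These numbers are invented for illustration, not the survey results.
ratings = {
    "on campus":  {"interaction": [4, 5],       "content": [5, 4]},
    "off campus": {"interaction": [2, 3, 1, 2], "content": [4, 5, 4, 3]},
}

for group, dims in ratings.items():
    for dim, scores in dims.items():
        print(f"{group:<10} {dim:<12} "
              f"mean={mean(scores):.2f} sd={stdev(scores):.2f}")
```

With only two on-campus respondents, a standard deviation says very little, which is exactly why no significance test was attempted here.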

Comments about the interaction mainly focused on two areas: one technological and one about human interaction. The 'blog broadcasting' on the course page was mentioned a couple of times as a hindrance, for example: "the DS106 posts were overwhelming and diluted the actual content of the course." In general, participants commenting on the human contact had expected more interaction: "I would have loved to build more meaningful relationships with other participants, but it just didn't happen."

Discussion
The survey discussed in this blog post originated from a feeling of lack of interaction between all actors in the course Introduction to Openness in Education 2012, combined with an interest in the user perspective on the different topics in the course (anyone else considering completing the user perspective badge? Yes? ;-)). I assumed other participants might share this frustration, and suspected it might be different for on-campus students, who could talk about the topics at least once a week in class (I assume; correct me if I'm wrong).

To be honest, I feel the dimensions of effective instruction are quite a shaky part of this survey. Rating how they relate to the course ... low to high ... does anyone really know what that means? I assume you are all expert survey answerers, and went along nicely. Thanks for humouring me! One could question, though, whether people had the same ideas in mind when doing the rating. It would be good to develop a better instrument before repeating this with a larger audience.

Obviously the small sample size is the other reason why the results need to be taken with a pinch of salt. OK, and the fact that it was quite the convenience sample, perhaps. The results could perhaps be used in a more qualitative study, in which the survey would be complemented with interviews and/or other data.

I do think the difference in responses between on- and off-campus participants is striking and could warrant a more serious study!

Thanks again for your interest and support!

Cheers,
Jeroen


References
Ward, Peters, and Shelley (2010). Student and faculty perceptions of the quality of online learning experiences. The International Review of Research in Open and Distance Learning, 11(3).

3 comments:

  1. I'm pretty sure that the experience was quite different on campus. I think the biggest challenge was the small number of actual outside participants. I don't think we have reached the critical mass to take advantage of the crowdsourcing effect.

    To me, this class was not a MOOC, it was a mini-MOOC. I learned quite a bit anyway, but mostly the way I would have learned by reading a book and taking notes on the side.

  2. Agree, it is not a MOOC, and probably not set up to be one either. I like the somewhat smaller scale, but still would have liked to get to know the other students a little better.

  3. Your results describe really well how I felt about the course. I took it as part of my job with a group of others at my institution who were all on an OER taskforce. We incorporated a face-to-face component that was helpful. I am hoping to work on a Sloan-C presentation about this in the fall, qualitative in nature, if it is accepted. I may do something formal either way. I will let you know if I do.
