Thursday, August 07, 2008

Can Social Information save teachers' time when choosing interesting learning resources?

One of my research questions aims at understanding what so-called Social Information can do to help teachers choose the right learning resources from a seemingly overwhelming collection. By Social Information I mean information about previous users' interactions with a resource. I am mainly interested in explicit annotations like ratings and tags, and in more implicit ones like bookmarks.

As I'm interested in the use of resources that originate from countries other than the user's own, I think Social Information (SI) should display not only the annotations, but also information about where the annotating users come from.

One thing that I hypothesise is that Social Information, when displayed alongside conventional metadata about learning resources, can make the decision-making process faster for teachers, for example when looking at a list of search results. As a multilingual repository can contain metadata in different languages, it could be speculated that Social Information indicating the origin of the users who have previously annotated a resource could help other users make up their minds (see the image for an example).

We were interested in two different aspects:
  1. Does the appearance of Social Information make the decision-making process any faster?
  2. Does the appearance of Social Information make the users choose more resources?

Method

We had 25 users from five different European countries. These teachers teach science, language learning and ICT at primary and secondary level in Finland, Estonia, Hungary, Belgium and Italy. xx of them are female and xx male. xx participants are under 30 years old, xx under 40, xx under 50 and xx under 60.

They have been part of the MELT project since Summer 2007. In March 2008 they were invited to create a profile on the MELT portal, where they can access multilingual learning resources in different topical areas.

We designed an experiment where teachers were shown two different mock-ups of a search results list with learning resources and their associated metadata. One of the lists showed what we call conventional metadata: title, URL, language of the resource, a short description, subject area, type of content and its target audience. Here is an example.

The other list had the same metadata, but we also added Social Information from previous users: tags in their original language, the number of times the resource had been bookmarked (favourites) and its ratings. For bookmarks we also mentioned which country the users come from. An example of this was shown above, in the first image of this post.
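
To make the two display conditions concrete, the records behind the two lists could be imagined roughly as follows (a minimal sketch; the field names and values are my own illustrations, not the MELT portal's actual data model):

    # Conventional metadata for one resource (field names are illustrative).
    conventional = {
        "title": "Photosynthesis basics",
        "url": "http://example.org/resource/123",
        "language": "et",
        "description": "A short introduction to photosynthesis.",
        "subject": "science",
        "content_type": "exercise",
        "audience": "secondary school",
    }

    # The Social Information condition shows the same record plus
    # annotations left by previous users.
    with_social_info = dict(conventional, **{
        "tags": ["fotosüntees", "bioloogia"],  # tags in their original language
        "bookmarks": {"FI": 3, "EE": 2},       # bookmark counts by user country
        "rating": 4.2,
    })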

We had 48 learning resources that came from different countries and were in different languages. About half of them were in English and the other half in other languages, which also seems to reflect the division of the resources that users have bookmarked on the portal. The resources covered language learning, primary education, ICT and science, matching the teachers' subject areas. I'll prepare more detailed information about this later.

We had 12 learning resources on a page imitating a list of search results that a user could get from a repository. In total there were 4 such pages for each user; we call them sets. Every second set had conventional metadata only, and every other set additionally had the Social Information described above.

At the beginning of each set the participants were asked to write their name and the time when they started with the set of 12 resources. When they submitted their results at the end, the system recorded the time. To answer our first question, we were interested in how much time teachers spent evaluating the appropriateness of the 12 resources for them.
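
As a rough sketch of how the per-set times could be computed from these records (assuming a hypothetical format for the hand-written start times and the system-recorded submission times), something like this would do:

    from datetime import datetime

    # Hypothetical records: (participant, set number, start time written
    # by the participant, submission time recorded by the system).
    records = [
        ("user01", 1, "10:02", "10:11"),
        ("user01", 2, "10:12", "10:20"),
    ]

    def minutes_spent(start, end, fmt="%H:%M"):
        """Minutes between the written start time and the recorded submit time."""
        delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
        return delta.total_seconds() / 60

    for participant, set_no, start, end in records:
        print(participant, "set", set_no, minutes_spent(start, end), "minutes")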

The teachers were asked to look at the metadata of each resource, and at the resource itself if it seemed interesting, and to answer one single question: "Would you use this resource, or parts of it, in your teaching next Fall?" They answered on a scale of 1 to 5: 1 being "I don't teach the topic", 2 = No, 3 = Maybe not, 4 = Maybe and 5 = Yes. Looking at the number of resources that users chose in their topical areas would give us an indication of whether resources displayed with Social Information were chosen more often than the ones without.

Because of the low number of participants (n=25) we decided on a within-subject design for this experiment. This is the design in which the same group of subjects serves in both treatments, i.e. everyone received both the material with conventional metadata and the material with Social Information.

Moreover, we split the participants into two different "groups". Group 1 had 12 participants and Group 2 had 13. Group 1 started with a set with conventional metadata, and Group 2 with a set of resources that had Social Information added to it. When analysing the results, we found that one user in Group 2 had consistently entered incorrect times. We excluded these times from the counts for time spent per set, leaving 12 participants in each group. Moreover, in both groups there were a few cases where the start time had been forgotten.
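
For illustration, the counterbalanced ordering could be written down as a small sketch (the group numbers match the text; the function itself is just my illustration):

    # A minimal sketch of the counterbalanced design: Group 1 starts with
    # conventional metadata, Group 2 with Social Information, and the two
    # treatments then alternate over the four sets.
    def set_order(group):
        first = ["conventional", "social"] if group == 1 else ["social", "conventional"]
        return first * 2  # four sets per participant

    print(set_order(1))  # ['conventional', 'social', 'conventional', 'social']
    print(set_order(2))  # ['social', 'conventional', 'social', 'conventional']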

Results

Descriptive statistics

We had 1129 responses to our questions, which means 71 responses were left blank. In 53% of the cases users answered that they do not teach the topic, meaning that they deemed the resource unsuitable for the topical area that they teach. In 25.6% of the cases users found resources that they said they would use (Yes or Maybe), whereas in 21.6% of the cases the resources were not found useful for the upcoming school year (No, Maybe not). The mean of the responses was 2.23 (Min=1, Max=4) and the standard deviation was 1.138.
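
To make the counting reproducible, here is a minimal sketch of how these figures could be derived from the raw 1-to-5 responses (the list below is made up; the real data has 1129 values):

    from statistics import mean, stdev

    # Hypothetical non-blank responses on the 1-5 scale
    # (1 = "I don't teach the topic", 2 = No, 3 = Maybe not, 4 = Maybe, 5 = Yes).
    responses = [1, 4, 2, 1, 5, 3, 1, 4]

    n = len(responses)
    print("don't teach the topic: ", sum(r == 1 for r in responses) / n)
    print("would use (4 or 5):    ", sum(r >= 4 for r in responses) / n)
    print("would not use (2 or 3):", sum(r in (2, 3) for r in responses) / n)
    print("mean:", mean(responses), "sd:", stdev(responses))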

Q1: Does the appearance of Social Information make the decision-making process any faster? Time spent on 12 resources (i.e. one set)

On average, users spent a bit more time on the sets that did not contain Social Information: reviewing a set of 12 resources took on average 9 minutes with conventional metadata and 8 minutes with Social Information.

I do not know yet whether this is a significant difference (my SPSS license ran out), but one could assume it is at least a sign of good news for Social Information. We can imagine that users go through a lot of resources when browsing a learning resource repository (I currently do not have the logs about the number of resources that users review per session, but I will produce them). So if you think of these small cycles and multiply that one-minute saving by, say, 10, you could arrive at some significant time savings when Social Information is made available to speed up the decision-making process.
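
Since the design is within-subject, the natural check would be a paired-samples t-test on each participant's average time per treatment. A minimal sketch with SciPy, using made-up numbers in place of the real per-participant averages:

    from scipy import stats

    # Hypothetical mean minutes per set, paired by participant: one value
    # under conventional metadata, one with Social Information added.
    conventional = [9.5, 8.0, 10.2, 7.1, 9.8]  # one value per participant
    social = [8.2, 7.5, 9.0, 6.8, 8.9]

    t, p = stats.ttest_rel(conventional, social)
    print(f"t = {t:.2f}, p = {p:.3f}")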

Individual differences

Still looking at the average times spent, we can see that there were many individual differences. In the chart below the blue line shows the amount of time that participants spent with conventional metadata and the red one with Social Information added. You can see that for some users one metadata setting seems to be faster, but overall the lines follow one another pretty closely, apart from some odd-balls (like user 23). You can also see that there seems to be a wide variety of personal styles: some users scrutinise resources with great care (users 7 and 8), whereas some go through them very fast (users 16 and 17).

I'd like to mention that here it does not matter that some of the resources are outside the competence area of the participants. We focus purely on the time that they spent going through the pages and deciding whether some of the resources are useful for them in the upcoming school year or not. This does, however, become crucial for answering our second question:

Q2: Does the appearance of Social Information make the users choose more resources?

The table below presents the results when I looked at the number of resources chosen per set. There were 4 different sets and each contained 12 learning resources in different languages. The two treatments meant that teachers reviewed 2 sets with Social Information available and two sets without. As teachers came from different topical backgrounds, I excluded the responses of users who said that they do not teach the topic of the given resource. In the table below you can see the percentage of positive responses (Maybe, Yes).
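
For transparency, here is a sketch of how those percentages could be computed, assuming hypothetical per-response records (the row format and the values are mine):

    from collections import defaultdict

    # Hypothetical records: (set number, treatment, answer on the 1-5 scale).
    rows = [(1, "social", 4), (1, "social", 1), (2, "conventional", 5)]

    counts = defaultdict(lambda: [0, 0])  # (set, treatment) -> [positive, total]
    for set_no, treatment, answer in rows:
        if answer == 1:  # exclude "I don't teach the topic"
            continue
        counts[(set_no, treatment)][1] += 1
        if answer >= 4:  # Maybe (4) or Yes (5) counts as a positive choice
            counts[(set_no, treatment)][0] += 1

    for key, (positive, total) in sorted(counts.items()):
        print(key, f"{100 * positive / total:.1f}% positive")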

It appears that teachers consistently chose more resources when Social Information was not available. This is contrary to what I expected. I have not calculated the significance of these results, but the differences do look big; in some cases, like in the 2nd set, about 15% in favour of no Social Information.
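
One way to check the significance later would be a chi-square test on the 2x2 table of chosen vs. not-chosen counts per treatment (after excluding the "I don't teach the topic" answers). A hedged sketch with made-up counts:

    from scipy.stats import chi2_contingency

    # Hypothetical counts: rows = treatment, columns = (chosen, not chosen).
    table = [
        [120, 180],  # with Social Information
        [150, 150],  # conventional metadata only
    ]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")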

In a way, maybe the appearance of SI makes the teachers more careful or critical in choosing resources?

What is needed now is a follow-up study at the end of this school term to check whether these teachers actually used the resources in their teaching. Or I could check whether they have bookmarked these resources on the MELT portal; they know the resources are available there. MORE to follow...
