Monday, December 22, 2008

Share early: Paper on Evidence of cross-boundary use and reuse of digital educational resources

I finally sent off my paper to a journal. Exciting! The first comment was to cut it by about 2500 words, even before the reviewing started. Ouch. I think I managed to do it, and I have a copy of it here:

Vuorikari, R., Koper, R. (submitted). Evidence of cross-boundary use and reuse of digital educational resources. [pdf]
ABSTRACT: In this study we investigated the server-end log files of teachers' Collections of educational resources in a number of content platforms. Our goal was to find empirical evidence from the field that teachers use and reuse learning resources that are in a language other than their mother tongue and that originate from countries other than their own. We call these cross-boundary learning resources. We compared the cross-boundary reuse of educational resources to the general reuse figure of 20%, and found that it was either equal to or less than the general reuse. We further studied the coverage, the overlap and the pick-up rate of these resources, and propose steps that could improve the probability of discovery, use and reuse of cross-boundary resources.

I actually have a new academic homepage too; check it out: http://elgg.ou.nl/rvu

Friday, December 19, 2008

How does user behaviour on a portal differ between users who log in and those who do not?

I've recently done quite a few studies on users of learning resource portals; for example, I've looked at how they tag resources in a multilingual context and how much use and reuse there is across borders. In all cases the studies have concentrated on the small minority of users who actually log in and have created Collections of resources: in Calibrate that was about 30% of users and in LeMill about 10%.

Now in MELT we've revised the logging scheme to also collect the click-stream from users who don't log in. We also have Google Analytics, but I don't have those figures at hand right now. I looked at the data from the whole logging period, from Aug 18 to Dec 18, and then only from the last month (Table 1).
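For the curious, here is a sketch of what one record in such a click-stream log might look like. The field names and action types below are my own illustration, not the actual MELT logging schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ClickEvent:
    """One hypothetical click-stream record; anonymous visitors are
    tracked by a session cookie instead of a user id."""
    timestamp: datetime
    session_id: str
    user_id: Optional[str]      # None for users who don't log in
    action: str                 # e.g. "search", "play", "tag", "bookmark", "rate"
    resource_id: Optional[str]  # None for actions not tied to a resource

# An anonymous user running a search:
event = ClickEvent(datetime(2008, 12, 18, 9, 30), "sess-8f2", None, "search", None)
```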

What do users do on the portal?

The most popular activity on the portal is search: 64.29% of all actions on the portal are different types of searches. These result in "playing" a resource in 18.31% of all actions. Contributing actions make up 13.09% of all actions, which means adding a tag, bookmarking a resource or rating it. That figure is actually a bit inflated, because we count each tag, rating and bookmark there; as each bookmark has on average 4.3 tags attached to it, the count goes up. The number of actions that amount to "acting with an individual resource" (i.e. adding a rating or a bookmark) is around 4.3% of all actions. "Other" includes activities like viewing an evaluation, viewing other users who have bookmarked the resource, etc.
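As a side note, here is a minimal sketch of how such shares can be computed from a click-stream log, including counting a bookmark and its tags as one "acting with a resource" event. The toy log and action names are assumptions for illustration.

```python
from collections import Counter

# Toy click-stream: (session_id, action, resource_id)
log = [
    ("s1", "search", None),
    ("s1", "play", "r42"),
    ("s1", "bookmark", "r42"),
    ("s1", "tag", "r42"),   # tags attached to the bookmark above
    ("s1", "tag", "r42"),
    ("s2", "search", None),
]

counts = Counter(action for _, action, _ in log)
total = sum(counts.values())

# Raw shares count every tag, rating and bookmark separately,
# which inflates the "contributing" figure.
for action, n in counts.items():
    print(f"{action}: {n / total:.1%} of all actions")

# Counting only the bookmark (or rating) itself, not its tags,
# gives the smaller "acting with an individual resource" share.
acting = sum(1 for _, action, _ in log if action in ("bookmark", "rate"))
print(f"acting with a resource: {acting / total:.1%}")
```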

Table 1


The downside here is not having the stats from Google Analytics, so I cannot exclude our internal usage, which I know has been considerable since we've been testing the portal ourselves. So the figures might be somewhat distorted...


What about users who log in versus the ones who don't?

About a month ago I invited some 260 teachers to the portal, so I was intrigued to see what had happened. Two weeks ago I checked that 11% of these teachers had created their own account. But it seems that many more have come along and cruised around the portal.

Table 2 presents the data from the last 4.5 months (Aug-December), which I have divided into two slots: the first slot includes the pilot teachers and lots of testing; in the table it is erroneously called "First 2 months". The second slot covers the time from Nov 18 to Dec 18, when we invited the new teachers (in 4 different batches of invites on Nov 18/19); it is called "the 3rd month" in the tables (again, my mistake). Moreover, the top half of the table has data for users who log in and the bottom half for users who did not.

I have mostly the same attributes for both: how they searched (advanced search, browsing by category and browsing by tag cloud) and how many resources they clicked on (play). The table also contains the number of sessions and the number of actions. A session is one consecutive series of events: whenever the user does something, it's logged, and if left idle, the user is logged off after some time. An action is anything: a search, a click on a resource, on a tag, etc. Additionally, we have the contributions by logged-in users: tags, bookmarks and ratings.
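For illustration, here is a minimal sketch of grouping log events into sessions by an idle timeout. The 30-minute cutoff and the record format are my assumptions, not the portal's actual settings.

```python
from datetime import datetime, timedelta

IDLE_TIMEOUT = timedelta(minutes=30)  # assumed cutoff, not MELT's real setting

def sessionize(events):
    """Group (user, timestamp) events into sessions: a new session
    starts whenever the gap since the user's previous event
    exceeds the idle timeout."""
    sessions = []
    last_seen = {}  # user -> timestamp of their previous event
    current = {}    # user -> events in their currently open session
    for user, ts in sorted(events, key=lambda e: e[1]):
        if user in last_seen and ts - last_seen[user] > IDLE_TIMEOUT:
            sessions.append(current.pop(user))  # close the idle session
        current.setdefault(user, []).append((user, ts))
        last_seen[user] = ts
    sessions.extend(current.values())  # close whatever is still open
    return sessions

events = [
    ("anon-1", datetime(2008, 12, 18, 10, 0)),
    ("anon-1", datetime(2008, 12, 18, 10, 5)),
    ("anon-1", datetime(2008, 12, 18, 11, 30)),  # >30 min gap: new session
]
print(len(sessionize(events)))  # -> 2
```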

Table 2


As you can see, most of the sessions (above 86%, the second-to-last row) in both slots take place when users are not logged in; in fact, the percentage of sessions stays fairly constant across both slots. Moreover, regarding the actions, we can see that during the first months they mostly (70%) came from non-logged-in users. However, after we invited the new teachers, we see a 10-point increase in actions by logged-in users (from 30% to 40%). That's positive, as it shows that some of the invited teachers were motivated to contribute.

There are actually quite big differences in what these two groups of users do when they are on the portal. Where logged-in users spend about a third of their actions on searching, non-logged-in users spend about two thirds. Chart 1 shows this clearly; however, I must say that the disparity between the number of searches and plays by non-logged-in users in the first months is most likely due to our internal testing. If you compare that to the non-logged-in users in the 3rd month, you see that there are already fewer searches and more plays.

There has been a change since the newcomers arrived (3rd month): among logged-in users, the share of searches executed has gone down (by 10 points), whereas the share of plays has gone up (from 17% to 23%). Among non-logged-in users there is the same 10-point drop in searches, but plays have gone up by 10 points (from 16% to 27%)! That suggests the newcomers were interested in seeing what kind of resources were out there on the portal.

Chart 1 below illustrates this.

Chart 1


One difference can be observed in how these two groups search: for logged-in users, advanced search is the more popular way to search (more than 50% of their searches are advanced), whereas for users who are not logged in, browsing (both by category and by tag cloud) is more popular. During the first months, 54% of their searches were browsing, which went slightly up (to 56%) during the 3rd month. The tag cloud was the biggest winner in both groups (logged-in and not), at the cost of advanced search. I assume the difference is due to the fact that people who are not logged in are interested in seeing what is out there and browse around to discover learning material.

In any case, if we look at the figures for non-logged-in users within the 3rd month, it's intriguing how evenly the searches are distributed across these different ways of searching. We'll keep an eye on this in the future (for now, we know that many non-logged-in clicks come from us testing the portal).

Consumers and contributors

In Table 3, where I again have data for logged-in and non-logged-in users, split into the first-months period and the 3rd month, we see that when users are logged in, they do things differently. First of all, logged-in users spend a much smaller percentage of their actions on searching (on average 33% versus 75%); curiously, however, they still seem to "play" about the same share of resources (around 20%).

Table 3


Within the 3rd month we see the percentage of plays growing. We can assume that the logs from the first-months period are most likely influenced by our internal testing of the portal, which often involves making searches. We see the percentage of plays go up for the non-logged-in users within the last month (from 16% to 27%), which, I assume, shows a more normal user behaviour than what could be observed before.

This still indicates that there is a lot of inefficiency when non-logged-in users search: on average during the 3rd month, for those logged in, one "play" was the result of 1.2 searches, whereas for those not logged in, one "play" was the result of 2.6 searches - a lot of time lost in searching. From Table 2 we can observe that there was more browsing (non-logged-in users, last month); I wonder if that was the reason? Have to keep an eye on that one!
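For the record, this ratio is simply the search share divided by the play share. A tiny sketch; the search shares below are approximate values consistent with the ratios reported, since the exact table values aren't reproduced here.

```python
def searches_per_play(search_share, play_share):
    """How many searches, on average, lie behind one "play",
    given each action type's share of all actions."""
    return search_share / play_share

# Approximate 3rd-month shares of all actions (see the tables above)
print(round(searches_per_play(0.28, 0.23), 1))  # logged-in: ~1.2
print(round(searches_per_play(0.70, 0.27), 1))  # not logged-in: ~2.6
```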

Most interestingly, 40% of the actions by logged-in users are ones that contribute something to the portal: they rate, tag and bookmark. Folks who do not log in are consumers: they only search, click and leave (which is fine too).

So, all in all, if we look at all the actions on the portal, the contributing actions by logged-in users amount to about 17%. Too bad that this figure did not go up in the 3rd month like some others did. Anyhow, it seems to follow a power-law distribution (20-80), where a small number of people contribute a lot so that other people can take advantage of this work, also known as participation inequality (Nielsen, 2006).
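That overall figure is just the product of two shares: the contribution share within logged-in actions, and the logged-in share of all actions. A one-line sketch with the approximate 3rd-month values from above:

```python
# Contributing actions as a share of ALL portal actions:
contrib_within_logged_in = 0.40  # rate, tag, bookmark
logged_in_share = 0.40           # logged-in users' share of all actions (3rd month)
print(f"{contrib_within_logged_in * logged_in_share:.0%}")  # -> 16%, ~17% over the whole period
```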



Nielsen, J. (2006). Participation Inequality: Encouraging More Users to Contribute.

Learning resources landscape

Learning resources come in all colours and shapes, that is for sure. They also come from all kinds of different places: repositories, portals, the web... For a recent presentation and paper, I created this diagram to better depict the learning resources landscape. As I later had to remove this part from the paper to save space, I post it here.


Teachers use a plethora of ways to discover educational content online. Harley et al. (2006) report on the search strategies of 4500 US faculty members, where Google-like searches are by far the most prominent (81%), the second most important being personal Collections of resources as well as "portals" that provide links to disciplinary topics (55%). In our user group of 45 language and science teachers in K-12 education, a similar diversity of strategies was discovered: one third use national and regional educational repositories as their primary source of educational content, 28% use search engines, 21% said they create their own content, 7% use content from schoolbook publishers and 12% reported all of the above (Vuorikari, 2008a).

These search strategies also give an indication of the different types of resources that teachers use. Figure 1 illustrates a number of different sources of content that teachers use. First of all, on the horizontal axis we distinguish between platforms that have institutional support and the ones that are rather community-driven sources for teachers. On the vertical axis we distinguish between teacher-generated content and "other sources". The latter encompasses a large number of providers, from educational portals, repositories and schoolbook publishers to educational and non-educational sites created by a number of private and public stakeholders. This "other sources" category is essentially as large as teachers' pedagogical imagination in taking advantage of the resources on the Internet.

This diagram allows us to draw a landscape for educational resources. In the upper left corner of the diagram are examples of institutional Learning Object Repositories (LOR), such as the ones managed by educational authorities (e.g. the Learning Resource Exchange for schools and members of EdReNe) and other repositories that make educational content available. In the lower left corner we place initiatives like MIT OCW, an institutional repository that makes available teacher-generated content. The lower right corner represents teacher-generated content in a community-driven environment (e.g., LeMill), whereas the upper right corner represents content that is found on the Internet from various sources and saved in community-driven environments like delicious.com. None of these boundaries are fixed, and there are many in-between models (e.g., a LOR with both user-generated and institutional content). Our data sets for this study, which are presented in Table 1, cover a wide area of Figure 1. For learning resources we use Wiley's (2002) definition of a learning object as "any digital resource that can be reused to support learning", as the resources vary greatly in granularity and other qualities.

Our search for evidence focuses on teachers in K-12 education in a European multilingual context. In the European Union, where 497 million people (Eurostat) from diverse ethnic, cultural and linguistic backgrounds live, multilinguality has an important role (Council of Europe, 2007). There are 23 official EU languages and 3 alphabets, and some 60 other languages are part of the EU heritage, spoken in specific regions or by specific groups (COM, 2008). Multilinguality can be defined as a situation where several languages are spoken within a certain geographical area, as well as the ability of a person to master multiple languages. 56% of EU citizens say that they are able to hold a conversation in one language other than their mother tongue, and 28% in at least two. English remains the most widely spoken foreign language throughout Europe (38%); French (14%) and German (14%) take second and third place, whereas 6% report foreign-language skills in Spanish and in Russian respectively. Over two thirds say that language lessons at school were how they learned foreign languages (COM, 2006).

..................

Harley, D., Henke, J., Lawrence, S., Miller, I., Perciali, I., and Nasatir, D. (2006). Use and Users of Digital Resources: A Focus on Undergraduate Education in the Humanities and Social Sciences. Available from
http://cshe.berkeley.edu/research/digitalresourcestudy/report/digitalresourcestudy_final_report.pdf

Vuorikari, R. (2008a). A case study on teachers' use of social tagging tools to create collections of resources - and how to consolidate them. In Wild, F., Kalz, M., Palmer, M. (Eds.), Proceedings of the First International Workshop on Mashup Personal Learning Environments. Available from http://sunsite.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-388/vuorikari.pdf

Wiley, D. (2002). The Instructional Use of Learning Objects. Online at: http://reusability.org/read

COM(2006). Europeans and their languages. Special Eurobarometer, European Commission.

COM (2008) 566 final. Multilingualism: an asset for Europe and a shared commitment. European Commission.

Council of Europe (2007). Un cadre européen commun de référence pour les langues : apprendre, enseigner, évaluer [Common European Framework of Reference for Languages: Learning, Teaching, Assessment]. Division des Politiques Linguistiques, Strasbourg, France.