I've recently done quite a few studies on users of learning resource portals: for example, how they tag resources in a multilingual context, or how much use and reuse there is across borders. In all cases the studies have concentrated on the small minority of users who actually log in and have created Collections of resources: in Calibrate that was about 30% of users, in LeMill about 10%.
Now in MELT we've revised the logging scheme to collect the click-stream from users who don't log in. We also have Google Analytics, but I don't have those figures at hand right now. I looked at the data from the last 3 months, from Aug 18 to Dec 18, and then only from the last month (Table 1).
What do users do on the portal?
The most popular activity on the portal is search: 64.29% of all actions on the portal are different types of searches. These result in "playing" resources in 18.31% of all actions. Contributing actions (adding a tag, bookmarking or rating a resource) make up 13.09% of all actions. This figure is somewhat inflated, because we count each tag, rating and bookmark separately, and each bookmark has on average 4.3 tags attached to it, which pushes the figure up. The number of actions that initiate engagement with an individual resource (i.e. adding a rating or a bookmark) is around 4.3% of all actions. "Other" includes activities like viewing evaluations, viewing other users who have bookmarked a resource, etc.
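To make the tag-inflation point concrete, here is a small sketch of how these shares could be computed from the raw click-stream. The event names and counts are made up to roughly match the percentages above, not taken from our logs:

```python
from collections import Counter

# Hypothetical click-stream: one event name per logged action.
# Counts are illustrative, chosen to mirror the shares in Table 1.
actions = (["search"] * 643 + ["play"] * 183 + ["tag"] * 95
           + ["bookmark"] * 22 + ["rate"] * 21 + ["other"] * 36)

counts = Counter(actions)
total = sum(counts.values())
shares = {event: round(100 * n / total, 2) for event, n in counts.items()}

# Counting every tag inflates the "contributing" share, since each
# bookmark carries ~4.3 tags on average. Counting only the actions
# that initiate engagement with a resource (rate + bookmark) gives
# the smaller, undistorted figure.
contributing = shares["tag"] + shares["bookmark"] + shares["rate"]
engaging = shares["bookmark"] + shares["rate"]
```

With these illustrative counts, `contributing` lands near 14% while `engaging` is about 4.3%, which is the kind of gap the tag counting creates.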
Table 1
The downside here is not having the stats from Google Analytics, so I cannot exclude our internal usage, which I know has been substantial, since we've been testing the portal ourselves. So the figures might be somewhat distorted...
What about users who log in and the ones who don't?
About a month ago I invited some 260 teachers to the portal, so I was intrigued to see what had happened. Two weeks ago I checked that 11% of these teachers had created their own account. But it seems that many more have come by and cruised around the portal.
Table 2 presents the data from the last 4.5 months (August to December), divided into two slots. The first slot covers the early months, with pilot teachers and lots of internal testing; in the table it is erroneously labelled "First 2 months". The second slot covers Nov 18 to Dec 18, when we invited the new teachers (in four different batches around Nov 18/19); it is labelled "the 3rd month" in the tables (again, my mistake). The top half of the table shows data for users who log in, the bottom half for users who did not.
I have mostly the same attributes for both groups: how they searched (advanced search, browsing by category and browsing by the tag cloud) and how many resources they clicked on (play). The table also contains the number of sessions and the number of actions. A session is one consecutive stretch of activity that gets logged; if the user is left idle, they are logged off after some time. An action is anything: a search, a click on a resource, a click on a tag, etc. Additionally, we have the contributions by logged-in users: tags, bookmarks and ratings.
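The sessionisation described above can be sketched roughly like this. The 30-minute idle timeout is an assumption for illustration; I'm not stating the exact cut-off our logger uses:

```python
from datetime import datetime, timedelta

IDLE_TIMEOUT = timedelta(minutes=30)  # assumed value, not the real config

def split_sessions(timestamps):
    """Group one user's action timestamps into sessions: a new session
    starts whenever the gap since the previous action exceeds the
    idle timeout."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= IDLE_TIMEOUT:
            sessions[-1].append(ts)   # still within the same session
        else:
            sessions.append([ts])     # idle gap exceeded: new session
    return sessions

# Example: two actions five minutes apart, then one two hours later.
t0 = datetime(2007, 12, 18, 9, 0)
stamps = [t0, t0 + timedelta(minutes=5), t0 + timedelta(hours=2)]
sessions = split_sessions(stamps)  # the two-hour gap splits this in two
```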
Table 2
As you can see, most of the sessions (above 86%, the second-to-last row) in both slots take place when users are not logged in, and this percentage stays fairly stable across the slots. Regarding the actions, during the first months they mostly (70%) came from non-logged-in users. However, when we invited the new teachers, we see a 10-point increase in actions by logged-in users (from 30% to 40%). That's positive, as it shows that some of the invited teachers were motivated to contribute.
There are actually quite big differences in what these two groups of users do when they are on the portal. Whereas logged-in users spend about a third of their actions on searching, non-logged-in users spend about two thirds. Chart 1 shows this clearly; however, I must say that the disparity between the number of searches and plays by non-logged-in users in the first months is most likely due to our internal testing. If you compare that to non-logged-in users in the 3rd month, you see that there are already fewer searches and more plays.
There has been a difference since the newcomers arrived (3rd month): among logged-in users, the share of searches has gone down by 10 points, whereas the share of plays has gone up (from 17% to 23%). Among non-logged-in users there is the same 10-point drop in searches, but plays have gone up by 10 points (from 16% to 27%)! That shows the newcomers were interested in seeing what kinds of resources were out there on the portal.
Chart 1 can maybe be used to illustrate this.
Chart 1
One difference can be observed in how these two groups search: for logged-in users, advanced search is the more popular way to search (more than 50% of their searches are advanced), whereas for users who are not logged in, browsing (both by category and by tag cloud) is more popular. During the first months 54% of their searches were browsing, which went slightly up (to 56%) during the 3rd month. The tag cloud was the biggest winner in both groups (logged-in and not), at the cost of advanced search. I assume the difference is due to the fact that people who are not logged in are interested in seeing what is out there, and browse around to discover learning material.
In any case, if we look at the figures for non-logged-in users within the 3rd month, it's intriguing how evenly the searches are distributed across these different ways of searching. We'll keep an eye on this in the future (especially since we know that many non-logged-in clicks come from us testing the portal).
Consumers and contributors
In Table 3, where I again have data broken down by logged-in status and by period (first months vs. the 3rd month), we see that when users are logged in, they do things differently. First of all, logged-in users spend a much smaller share of their actions on searching (on average 33% versus 75%); however, curiously, they still seem to "play" about the same share of resources (around 20%).
Table 3
Within the 3rd month we see the percentage of plays growing. We can assume that the logs from the first-months period were influenced by our internal testing of the portal, which often involves making searches. The percentage of plays for non-logged-in users goes up within the last month (from 16% to 27%), which I assume reflects more normal user behaviour than what could be observed before.
This still indicates a lot of inefficiency when non-logged-in users search: on average during the 3rd month, for those logged in, one "play" was the result of 1.2 searches, whereas for those not logged in, one "play" was the result of 2.6 searches - a lot of time lost in searching. From Table 2 we can observe that there was more browsing among non-logged-in users in that month; I wonder if that is the reason? Have to keep an eye on that one!
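The searches-per-play ratio is just each group's search share divided by its play share. A minimal sketch; the input percentages here are illustrative values chosen to reproduce the ratios above, not figures read directly from the tables:

```python
def searches_per_play(search_pct, play_pct):
    """Average number of searches behind one 'play' of a resource."""
    return round(search_pct / play_pct, 1)

# Illustrative 3rd-month shares (assumed, back-solved from the ratios).
logged_in = searches_per_play(28, 23)      # ~1.2 searches per play
not_logged_in = searches_per_play(70, 27)  # ~2.6 searches per play
```

The higher the ratio, the more searching a group does before actually opening a resource.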
Most interestingly, 40% of the actions by logged-in users are ones that contribute something to the portal: they rate, tag and bookmark. Folks who do not log in are consumers: they only search, click and leave (which is fine too).
So, all in all, if we look at all the actions on the portal, the contributing actions by logged-in users amount to about 17%. Too bad that this figure did not go up in the 3rd month like some others did. Anyhow, it seems to follow a power-law distribution (20-80), where a small number of people contribute a lot so that other people can take advantage of their work, also known as participation inequality (Nielsen, 2006).
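The overall contributing share follows from combining two figures: the fraction of all actions made while logged in, and the fraction of those that contribute. The exact inputs below are illustrative, not copied from the tables:

```python
# Share of all portal actions made while logged in, over the whole
# period (illustrative value in line with Table 2's 30-40% range).
logged_in_share = 0.42

# Share of logged-in users' actions that contribute (rate, tag,
# bookmark), as stated in the text.
contributing_share = 0.40

# Contributing actions as a fraction of ALL actions on the portal.
overall_contributing = logged_in_share * contributing_share  # ~0.17
```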
Nielsen, J. (2006). Participation Inequality: Encouraging More Users to Contribute.
2 comments:
Why don't you allow anonymous users to rate content? Maybe we should have rating in LeMill, too, now that the amount of content is growing.
-Teemu
Moi, a good point! I guess there is no reason not to let them rate! Especially given the big number of people who cruise by without logging in. Yes, it would be fun to see something like that in LeMill; you could also use things other than ratings, like thumbs-up or smileys. I think you could also leverage Collections: mark that a resource has been included in one :)