Thursday, November 06, 2008

Mind those backups! - stolen laptops

Oh boy, my digital misery does not seem to be over yet! After having my home broken into and my laptop stolen, I was dead-happy that the evil-minded robber left my external, one-terabyte back-up hard-drive right where it was, about 50 cm away from the laptop that s/he stole.

Now, about 10 days later, I'm typing away on my new MacBook Pro. I must say that I was pretty happy to have restored my "life" using Time Machine; only about a week's worth of back-ups was missing. It felt great to log in to my own desktop, have my applications restored just like that, and start using my laptop like nothing had happened. I even pledged to always do my backups.

Then I started noticing little things. Oops, my address book is missing. My bookmarks and passwords are missing in Firefox. Then I wanted to upload a file, and I realised that ALL MY FILES ARE MISSING! WTF?

Those files are, for sure, on the back-up hard-drive; I can access them there. But for some reason, when importing "my life" from Time Machine, it failed to import my files. Hey, big deal, at least the file structure is there... duh.

Now I'm trying to discover an easy way to get them back. I am finding all kinds of not-so-nice things about Time Machine, like this.

Ok now, time to roll up my sleeves and start digging out those files. So far this post looks promising.
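In case it helps anyone in the same hole, here is a minimal sketch of what I'm after: walk the backed-up folder on the external drive and copy over any file that the restore didn't bring back. The paths in the usage comment are hypothetical, adjust them to your own machine and backup volume.

```python
import shutil
from pathlib import Path

def restore_missing(backup_root: Path, dest_root: Path) -> list:
    """Copy files from a backup tree into dest_root, skipping files
    that the Time Machine restore already brought back."""
    restored = []
    for src in backup_root.rglob("*"):
        if src.is_file():
            target = dest_root / src.relative_to(backup_root)
            if not target.exists():
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, target)  # copy2 preserves timestamps
                restored.append(target)
    return restored

# Hypothetical paths -- adjust to your own machine and backup volume:
# restore_missing(
#     Path("/Volumes/Backup/Backups.backupdb/MyMac/Latest/Macintosh HD/Users/me"),
#     Path.home(),
# )
```

Because existing files are skipped, running it twice does no harm; only the files missing from the restore get copied.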

As a lesson learned: mind your back-ups!

Wednesday, November 05, 2008

Power of persuasion/example/recommendation

We plan to invite about 130 teachers to the portal and I'd like to run a little experiment on this. The idea is to show two different examples of how to access and discover learning resources on the portal, and see whether that has an influence on
  • Uptake: users sign up
  • Retention: users come back
  • Different ways to discover resources (social cues vs. conventional search strategies)
  • Focus on the system (discovery e.g. clicking on resources vs. contributing, e.g. bookmark and tag, rate, etc)
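The split into the two invitation conditions could be done with a simple random assignment. A sketch, assuming the invitee list is just a list of email addresses (the addresses and the fixed seed here are illustrative):

```python
import random

def assign_groups(invitees, seed=42):
    """Randomly split invitees into equal-sized treatment and control groups."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = list(invitees)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

groups = assign_groups([f"teacher{i}@example.org" for i in range(130)])
# 65 invitees in each group: the treatment group gets the invitation with
# social-navigation examples, the control group the search-based one.
```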
The point would be to test the statement from the results reported in Harper et al (2007), where an email newsletter with manipulated social comparison made no difference to a member's interest in using the system, but it changed their focus within the system.
While subjects who received an email message with the comparison manipulation were no more likely to click on one of the links or log in to the system, they were more likely to rate movies.
There are differences, but grosso modo the idea has similarities: in Harper et al (2007) the manipulated social information in the email concerned the subject him/herself (treatment group), whereas in my case the information would be about some other teachers that the subjects would be able to relate to (e.g. see favourites from a science or language teacher). The biggest difference would be that there is no comparison of the subject's performance to that of other users in the system, which was one of the central features of the Harper et al study.

Study design

The randomly selected treatment group would receive an invitation that sets expectations about the use of the portal. Examples will be given of how to access and discover learning resources based on social navigation cues: how to browse other users' Favourites and how to access resources through a tag cloud.

The randomly selected control group would receive a similar invitation, but with examples of how to access and discover learning resources based on conventional free-text search and keyword browsing.
(to be reworked, just initial ideas based on tracking methods that I could use).
  • Immediate reaction: the number of people who access the portal through the direct links in the invitation, as opposed to through the main page; the time these people spend on the portal on their first visit; how they search; how many searches they execute; and how many resources they click on
  • Uptake: is there difference between the groups on signing up on the portal?
  • Retention: in the longer run, say within a month, do they still come back?
  • Different ways to discover resources (social cues vs. conventional search strategies)
  • Focus on the system (discovery vs. contributing in terms of rating, tagging, etc.): do people who see examples of social navigation use these methods more than the control group? Are there differences in how many resources the subjects in each group tag and rate?
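Most of these measures could be tallied from the portal's access log. A rough sketch, assuming each log entry records the user, their group, and the action taken (the field names and event types here are hypothetical, depending on what the portal actually tracks):

```python
from collections import Counter

def summarise(log):
    """Count key actions per group from a list of event dicts."""
    counts = Counter()
    for event in log:
        counts[(event["group"], event["action"])] += 1
    return counts

# Illustrative events; a real log would come from the portal's tracking.
log = [
    {"user": "t1", "group": "treatment", "action": "signup"},
    {"user": "t1", "group": "treatment", "action": "rate"},
    {"user": "t2", "group": "control", "action": "signup"},
    {"user": "t2", "group": "control", "action": "search"},
]
summary = summarise(log)
# e.g. summary[("treatment", "rate")] == 1
```

Comparing, say, `summary[("treatment", "signup")]` against `summary[("control", "signup")]` would give the uptake figures per group.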

Hypothesis to test
(to be reworked, just basic ideas)

My hypothesis would be that subjects in the treatment group will discover more resources than the controls, thanks to the social navigation cues made readily available to them. By discovering I mean that they click on these resources on the portal to view them. I would also hypothesise that the retention rate is better among the treatment group, as they get direct access to selected resources, whereas the control group has to look for interesting resources themselves and might get diverted along the way.
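Whether a difference in, say, sign-up rates between the two groups is more than noise could be checked with a two-proportion z-test. A sketch using only the standard library; the counts in the usage example are made up for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)  # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Made-up numbers: 30 of 65 treatment invitees sign up vs. 18 of 65 controls.
z, p = two_proportion_z(30, 65, 18, 65)
```

With around 65 invitees per group the test is not very powerful, so only fairly large differences would show up as significant.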

Relevant studies in this direction, to be completed

A study in this direction was reported by Harper et al (2007). They studied the effect of email newsletters that told community members whether their contribution was above or below average. They note that previous studies have shown that information about social norms can affect contributions, e.g. people recycled more material when they were provided with information about how much other people had recycled (Schultz, 1999).


Harper, F.; Li, X.; Chen, Y.; Konstan, J. (2007). Social Comparisons to Motivate Contributions to an Online Community. Persuasive Technology, 26/04/2007, Palo Alto, CA.

Ling, K.; Beenen, G.; Ludford, P.; Wang, X.; Chang, K.; Li, X.; Cosley, D.; Frankowski, D.; Terveen, L.; Rashid, A.M.; Resnick, P.; Kraut, R. (2005). Using Social Psychology to Motivate Contributions to Online Communities. Journal of Computer-Mediated Communication, Volume 10, Issue 4.

Schultz, P. (1999). Changing Behavior With Normative Feedback Interventions: A Field Experiment on Curbside Recycling.

Monday, November 03, 2008

Web 2.0 = Learning 2.0?

Last week I was part of an expert workshop on Learning 2.0. It was organised by the Institute for Prospective Technological Studies (IPTS), one of the European Commission's research institutes. They currently run a year-long study on the Impact of Web 2.0 Innovations on Education and Training in Europe. The objective is to assess the impact of Web 2.0 trends on the field of learning and education in Europe, and to propose avenues for further research and policy-making in Europe.

It was very cool to be part of this group: we were about 30 people with various backgrounds, and our main job was to look at the preliminary results of two studies. We first discussed the intermediate results of an exploratory study that seeks to identify and analyse existing practices related to Web 2.0 initiatives in the field of learning in Europe. For this purpose, a Learning 2.0 database had been set up where practitioners were able to report their cases. A presentation of this study is available.

The second part of the validation workshop focused on the case studies: a case study on 'Good Practices for Learning 2.0: Innovation' and one on 'Good Practices for Learning 2.0: Inclusion'.

A lot of the workshop time was spent in brainstorming mode, which is something that I truly enjoyed. The point was to try to identify what would be the NEW in what was called Learning 2.0. That's pretty tough, as we hardly know what the new thing in Learning 1.0 is! Here is one image of our brainstorming chart, thanks to Graham!



The biggest point of skepticism, if I can even call it that, as many of us were very enthusiastic about the potential of Web 2.0 for education, was this: how can Web 2.0 technologies and tools help learning, or can they help it at all?

Lots of things have been listed by the proponents of Web 2.0: personalisation, participation, collaboration, motivation, social skills, reflection and meta-cognition. Those, however, are not inherent to Web 2.0, but to any good learning!

So is there anything that makes learning with Web 2.0 so special? Contrary to encouraging reflective learning, Web 2.0 seems to promote sporadic grasshopper minds, as some current studies on multitasking suggest.

What the workshop could say, though, was that more well-coordinated research on Learning 2.0 is needed to better understand its potential. One such study in this direction is the new Becta study, which is very impressive.

Some links to other literature that folks in the workshop pointed out: http://delicious.com/tag/iptsl20

Also, check out the literature reviews and other studies that have already come out of Learning 2.0 or are about to come out. There are interesting things going on!

From the Learning 2.0 site:

The rapid growth of social computing or Web 2.0 applications and supporting technologies (e.g. blogs, podcasts, wikis, social networking sites, sharing of bookmarks, VoIP and P2P services), both in terms of number of users/subscribers and in terms of usage patterns, means that these phenomena are also increasingly being used in the educational field and for learning purposes. As it enables different types of learning and teaching settings (formal, non-formal and informal), it is an important driver of innovation in learning.

Description: The Learning 2.0 study will

1. Identify and analyse the existing practices and related success factors of major web 2.0 initiatives in the field of learning in Europe;
2. Look at the innovative dimension of using web 2.0 for learning;
3. Analyse the position of Europe vs. the rest of the world in terms of quantitative and qualitative use of innovative Learning 2.0 approaches;
4. Discuss the potential of social computing applications to (re)-connect groups at risk-of-exclusion;
5. Propose avenues for further research and policy-making.