A workflow to evaluate online tools for learning

Web 2.0 Expo Hall/TopRank Online Marketing © 2008/CC-BY 2.0

Web-based technologies are changing the way we live, work and learn at an unprecedented rate and in many unpredictable ways. YouTube, Facebook, Scoop.it, Pinterest and many other tools all seem to hold out quick, easy and inexpensive solutions – solutions that don’t appear to require an army of IT specialists to support, and which promise much in the way of improved and relevant interactions.

As an individual, trying out a new online tool is reasonably straightforward, but, as educators, what should we be looking for? How should we start to evaluate these tools to see if they will work for us and our students? There are many permutations to consider here. We all have our own approaches to teaching, and there are some areas of technology we may feel happier with than others. I think it’s fair to say that we are, each of us, unique in our approach to teaching and learning – and how we use these tools will reflect that uniqueness.

However, there are some key principles we can apply to evaluation that can help us begin to choose what’s best for us and our students, and in this blog post I propose a workflow as a guide to how we can go about this. I suggest there are two key questions to ask ourselves when exploring a platform or tool for use in teaching and learning. First and foremost: “Will it work?” – for our institution, for ourselves as educators and for our learners. Secondly: “Within what learning context can we place this tool?” I suggest three main considerations:

  • A technical test – including a pragmatic and a usability review.
  • A pedagogy test – based on Chickering and Gamson’s ‘7 Principles of Good Practice’.
  • A learning design review – based on a modified Dial-e framework.

The technical test has two aspects – a usability review (which looks at how well the interface works), and a pragmatic review.

Pragmatic review

The pragmatic review covers five interrelated areas:

1. Does the service work equally well in the different browsers and mobile devices that you and your students use?
This is important as we want the opportunities for learning to be available in a timely manner and this means supporting the variety of devices that we and our students use on a daily basis.

2. Are the outputs re-usable? This includes the ability to download videos, slideshows, essays, notes and other outputs – so that they can be used in other environments.

3. Many online tools can accommodate different ways of learning – for example, using video to record achievement instead of, or in addition to, reflective writing. However, not all tools are accessible to learners with disabilities. So you should consider:

  • Can you use it with a screen reader?
  • Can you easily add closed captions or transcripts to audio and video?

Consult your institution’s Disability Support Team or contact the experts at JISC TechDis when considering new tools to support learning.

4. How reliable is the service? Consider:

  • Robustness of the service. Does the tool have a record of going offline? There have been instances of cloud services losing data – something that could be disastrous if you’re at the end of a module and have no fallback position.
  • Some free third-party services have been known to change to costly subscription services with little notice to users.
  • Many tools are in a continual state of development and may change the way they work, to a lesser or greater degree, without notice. When you are running a busy module, this type of change will add to your and your learners’ work, and could have a demoralising effect.

5. What are the terms of service?

  • You and your institution need to be aware of your obligations under data protection legislation.
  • You need to make your learners aware of the implications of sharing private data online and the risks associated with it.

This is an important area that is best dealt with by experts. I recommend JISC Legal’s advice on this.

Usability review

Alongside these ‘pragmatic’ considerations, it’s important to look at how the tool actually works in practice. Although learning how to use online tools is important for developing digital literacy, some tools are easier to use than others, and, when confronted with a new interface, it’s worth spending some time exploring how easy it is to use.

The key questions you need to ask are:

  • Does the interface support all the tasks expected by the user? That’s in terms of help and support documentation, as well as the underlying functionality.
  • Are there conflicts in the functionality of the interface? Although most tool developers engage in beta testing, not all wrinkles are necessarily ironed out before a tool goes live. You should robustly test the tool to ensure that it does what you want it to do.
  • Does functionality change the nature of the underlying task? If your students have to spend a significant amount of time learning the interface, are they going to have enough time on task? What can you do to reduce the cognitive load of learning how to use the tool?
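The pragmatic and usability questions above are essentially a checklist, so one lightweight way to keep evaluations consistent across several candidate tools is to record the answers as structured data and flag any gaps automatically. The sketch below is purely illustrative – the criterion names and the `ToolReview` class are my own shorthand for the questions in this post, not part of any published instrument.

```python
from dataclasses import dataclass, field

# Hypothetical criteria distilled from the pragmatic and usability reviews above.
# The names are illustrative shorthand, not a standard instrument.
PRAGMATIC_CRITERIA = [
    "works_across_browsers_and_devices",
    "outputs_are_reusable",
    "accessible_screen_reader_and_captions",
    "service_is_reliable",
    "terms_of_service_acceptable",
]
USABILITY_CRITERIA = [
    "interface_supports_expected_tasks",
    "no_functional_conflicts",
    "low_cognitive_load_to_learn",
]

@dataclass
class ToolReview:
    """Yes/no answers for one tool under evaluation."""
    tool: str
    answers: dict = field(default_factory=dict)

    def failures(self):
        # Any criterion answered 'no' (or not yet answered) is flagged.
        return [c for c in PRAGMATIC_CRITERIA + USABILITY_CRITERIA
                if not self.answers.get(c, False)]

    def passes_technical_test(self):
        return not self.failures()

# Example: a (fictional) video tool that fails the accessibility check.
review = ToolReview("ExampleVideoTool", {
    "works_across_browsers_and_devices": True,
    "outputs_are_reusable": True,
    "accessible_screen_reader_and_captions": False,
    "service_is_reliable": True,
    "terms_of_service_acceptable": True,
    "interface_supports_expected_tasks": True,
    "no_functional_conflicts": True,
    "low_cognitive_load_to_learn": True,
})

print(review.failures())             # flags the accessibility gap
print(review.passes_technical_test())
```

Recording reviews this way makes it easy to compare tools side by side, and to revisit an evaluation when a service changes – which, as noted above, happens often.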

Pedagogy test

I think Chickering and Gamson’s ‘Seven Principles for Good Practice in Undergraduate Education’ provide a sound basis for evaluating the usefulness of a tool to support teaching and learning. The principles highlight the importance of:

  • Good communication channels between learners, and between learners and staff
  • Opportunities for cooperation among students.
  • Time on task – ensuring that technology is employed to focus on tasks not on wrangling difficult and poorly designed tools.
  • Supporting diverse methods and means of learning. Online tools present opportunities to use digital media, video, images, sound, mapping and reflective activities that can enable a more diverse and richer approach to learning than has hitherto been possible.
  • Setting high expectations. The facility of online tools and web 2.0 technologies to readily share practice and reflection has the potential to improve learners’ performance. The web affords access to the best the world has to offer online, and this can be used as a springboard for learning. But it’s equally true that digital technologies can be used for superficial activities that undermine academic standards. In Rethinking University Teaching, Diana Laurillard warns that: “…new technology easily supports a fragmented, informational view of knowledge…and is in danger of promulgating only that.” (Laurillard, 2002, p. 227).

We need to ensure that the elements that distinguish academic learning (the ability to analyse, evaluate, articulate and represent experience effectively) are made explicit when designing and delivering programmes of learning that incorporate these tools.

While evaluating an online tool to ensure that it will work the way you want (and demonstrate a real benefit to your learners) you may also assess what type of learning can take place and explore approaches to learning design.

Learning Design

In this area I would like to put forward the Dial-e Framework as a good starting point for modeling your approach. This framework was developed by Simon Atkinson and Kevin Burden to support the use of digitized archive films held by the Newsfilm Online collection – now part of JISC MediaHub. They identified ten discrete learning designs, which I have simplified to four main categories:

  • Stimulus
    The use of tools and content to stimulate interest and engagement – something that quickly engages learners to consider a new concept or approach.
  • Investigation
    Which would typically involve using digital technology to research, understand and apply processes or concepts – for example watching and engaging with an online ‘how-to’ video.
  • Analysis
    Exploring textual qualities in, for example, film or media studies, where learners analyse editing, framing, lighting, sound design etc – as well as alternative perspectives, where tools and content are used to understand and empathise with others.
  • Creation
    Which involves the evaluation and application of tools, content and methods to create a project – either using original content or from re-usable sources or both.

New technologies call for new approaches to pedagogy – and I think that this modified approach to the Dial-e framework provides a good starting point for considering the uses to which we can put both digital tools and content.

What’s your approach?

In this post I’ve attempted to provide a workflow which I hope you will find useful. This is an important and evolving subject, and I am very interested to hear how you approach evaluation.

References:

  • Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies, 2nd edition. London: Routledge.
  • Chickering, A. W. and Gamson, Z. F. (1987). ‘Seven principles for good practice in undergraduate education’. American Association for Higher Education Bulletin, vol. 39, no. 7, pp. 3–7.

Further reading:

The Centre for Learning and Performance Technologies: The Social Learning Handbook
Edudemic: Facebook Guidelines for Educators
JISC Legal: Facing up to Facebook
Terms of Service; Didn’t Read – A user rights initiative to rate and label website terms & privacy policies, from very good Class A to very bad Class E.
We-Share – an infrastructure that collects descriptions of ICT tools available at the Web of Data and adapts them to be used for educational purposes.
