3 stars and a wish for ALTC 2015

Talking Content Analysis #altc

Yesterday I attended my first Association for Learning Technology Conference (ALTC), which this year was held at the University of Manchester. As the object of my research is to develop a real and relevant approach to automatically measuring and visualising learning activity online, it is essential that, as well as being grounded in pedagogic theory, the approach makes sense at a practical level to its users. So it’s important that I get out of my research lab, share my findings and connect with users; that is: learners, teachers, administrators – and on this occasion, learning technologists. ALTC is arguably the biggest, most connected learning technology conference in Europe, if not the world, so having my proposal accepted and being invited to give an extended, 30-minute presentation with the possibility of being selected for publication in ALT’s journal (Research in Learning Technology) was a huge privilege.

One of the key features of ALT’s Extended Presentation format is that at least half of the time should be taken up with debate and interaction with the audience (no ‘death by PowerPoint’!), and as this year’s conference theme was ‘shaping the future together’ I set about producing a highly participative presentation. I shared my slides beforehand with the 20 or so delegates who had indicated they would be joining the presentation (using ALTC’s excellent website), and I set up and tested (on my long-suffering colleagues) a set of questions using the Socrative audience response system. ALTC’s ‘Guidelines for Presenters’ also called for a visual approach that eschewed bullet points, so I spent a considerable amount of time scouring Flickr Commons for usable, Creative Commons licensed images to illustrate my talk. Sadly my research fund could only stretch to attending one day of the conference, but I was determined to make the most of the few hours I had.

Three stars

As I mentioned, ALTC is a big conference, with what looked to me like over 2,000 delegates from UK educational institutions – but also with a global presence, both in person and online. This is a highly knowledgeable and engaged audience. Because they work at supporting teaching and learning through the use of technology day after day, they have an exceptional understanding of the practicalities of integrating advanced tools within the curriculum. They ask questions, challenge assumptions, and can back up their arguments with evidence. In short, it’s the best audience any learning technology researcher could stand in front of.

As well as talking the talk, ALTC walks the walk. They actually use technology to enhance the conference experience. In addition to providing each presentation with a dedicated web presence, they live stream all their keynotes and invited speakers, they add value through a number of applications (including this Flickr reader), and they encourage communication between delegates and the rest of the world with the #altc hashtag (which on day one was ‘trending on Twitter’). This Google spreadsheet set up by Martin Hawksey itemises tens of thousands of tweets generated by the event.

Between sessions, I got to have a very pleasant one-to-one chat with keynote speaker, learning technology guru, and all-round nice guy Steve Wheeler – primarily about how intense live blogging can be (I think I might give it a try one day).

A wish

I really wish I could have had more than 22 minutes to give my presentation and answer questions. I think the organisers were pushing their luck a bit by programming two 30-minute extended presentations into a 60-minute slot. Once you factor in a crashed PC and further delays, it proved impossible to give the presentation I had intended – which was a great shame. However, there was keen interest from the 40-plus audience in what I had to say. The Socrative audience response system worked well, provided some interesting feedback, and highlighted a key point of my talk – that visualised feedback affects behaviour. Despite having to take ‘an early bath’ I had the opportunity to discuss my work afterwards, was asked some constructive and challenging questions, and made some good connections. My hope now is that I get the go-ahead to publish in the RLT journal, but we’ll have to see.

ICALT 2015 paper accepted

Hualien city, by Luis Jou García ©2010, CC BY-NC-SA 2.0

I’ve just had the good news that a paper based on my summer project, Can you tell if they’re learning? (co-written with my supervisors), has been accepted as a short paper at ICALT 2015. The conference takes place at the start of July in Hualien City, Taiwan – about 10,000 km from home. Here’s the abstract:

The proliferation of Web-based objects designed for learning makes finding and evaluating online resources a considerable hurdle to overcome. While established Learning Analytics methods use Web interaction data as proxies for learner engagement, there is uncertainty regarding the appropriateness of these measures. In this paper we propose a method for evaluating pedagogical activity in Web-based comments using a pedagogical framework, and present a preliminary study using this approach that assigns a Pedagogical Value (PV) to each comment. This has value as it categorises discussion in terms of focus on pedagogical activity rather than interaction on the Web. Using the DiAL-e Framework we code a selection of comments associated with learning objects within a Massive Open Online Course and test the resulting Pedagogical Value against established Language Analysis and Learning Analytics methods. Results show that PV is distinct from typical interactional measures; there are negative or insignificant correlations with established Learning Analytics methods, but strong correlations with linguistic indicators of pedagogical activity. This suggests that adopting pedagogical frameworks may produce more accurate measures of pedagogical activity than interaction analysis, and that linguistic rather than interaction analysis has the potential to automatically identify learning behaviour.

Obviously I’m looking forward to interacting with key academics in the field of technology in education, but I also hear that the surfing’s quite good. What’s not to like?

Open Data: are local councils getting the message?

Liam Maxwell GaaP Seminar 5 February 2015/Tim O'Riordan ©2015/cc-by-sa 3.0

I attended a highly inspirational talk at the Ordnance Survey last night. The key speaker, Chief Technology Officer at the UK Government’s Cabinet Office, Liam Maxwell, spoke on “Government as a Platform” (GaaP) under the auspices of the Southern Policy Centre to a distinguished group including local and national politicians, academics, CEOs and researchers. Maxwell is in charge of streamlining the online provision of government services and has overseen the move from the old direct.gov.uk service to gov.uk – promoting their key message that they are providing “[d]igital services so good people prefer to use them”. How successfully this is happening can be observed by exploring gov.uk’s performance data.

So what is GaaP and should we mind it?

The driving force behind GaaP is the Web and how it enables governments, local and national, to have a better understanding of our needs, and enables us to oversee, interrogate, and participate in our government in new and potentially more effective ways. In addition to “building digital services that are simpler, clearer and faster to use”, at the heart of GaaP is shared information. Although managed by a different Cabinet Office team, open data plays a significant part in lifting the lid on the workings of government. Data that were once squirreled away in Whitehall filing cabinets and town hall basements are now being made available on the Web in an unprecedented move towards greater transparency and openness in government.

In this new arrangement, government, as a source of data, becomes the ‘guide on the side’ – an enabler rather than the leader of civic participation – and as active, Web-connected citizens we now have the tools to find solutions to problems that affect us. As public.resource.org assert in their ‘8 government open data principles’: “[o]pen data promotes increased civil discourse, improved public welfare, and a more efficient use of public resources.” At a time of increasing constraints on public spending, the benefits of open data, open standards and open source tools (like the Government Data Service open source platform) have the potential to effect positive change in how we use government services.

There are some substantial barriers to overcome. Real concerns exist about the effective and secure management of data, as have surfaced in the debate on the government’s care.data project. Can we be sure that those publishing data do so without inadvertently releasing our personal information? This requires a very clear understanding of the dangers of re-identifying anonymised public data, and effective controls on how data are released for publication.

In addition, there is a lack of public awareness of open data, and of the skills and knowledge needed to use it effectively. This will come, with the bedding-in of the new Computer Science curriculum and through interventions like those run by the Ordnance Survey, but there is still a great deal to do before we start to see tangible benefits in the delivery of government services.

Close to home, local government is starting to adopt more transparent practices, but progress is slow. My local authority, Southampton City Council, has released some financial data – some of which could be considered ‘3 star’ – and anyone with the time and motivation to find their way around MS Excel (with the NodeXL template) or Tableau software will find something of interest. Cambridge City Council has published a considerable amount of data (some of it 4 star), and across the country there’s a patchy but growing amount of local government data available for all of us to interrogate.

This is no small undertaking: council budgets are being squeezed at an unprecedented level, and doing something new with uncertain outcomes is a difficult sell at the best of times. Creating exemplars of good practice is important; to this end the Cabinet Office recently recognised Hampshire County Council and others as ‘Local Digital Champions’, and the Society of Information Technology Management (SOCITM) has called for the establishment of a ‘local GDS’ to help local councils translate policy into action.

The gap between our current local government services, and how they could be better designed and managed in future, is important to us all. There are already inspiring developments – as well as the SOCITM initiative, the Local Government Association’s open data repository, the work of the Open Data Institute, and the Government Digital Service are supporting the move to more open government. The key message is that open data, open standards and open tools provide us with opportunities to develop modern, responsive public services, and to participate in improving our local economies.

DAL MOOC – Weeks 3/4 Social Network Analysis

‘dalmooc’ Twitter Search NodeXL Graph/Tim O’Riordan ©2014/cc-by-sa 3.0

Sadly I couldn’t get Gephi (the recommended network visualisation tool for this course) to work, as it involved downgrading my version of Java, and the many comments on this issue in the DAL MOOC discussion forum didn’t fill me with confidence. So the social network graph shown above was created using the NodeXL template in Excel. It’s not as pretty as some graphs I’ve seen, but it works well enough.

The graph is built using the ‘import from Twitter search network’ function (I’ve listed the settings I used to create it at the foot of this post) and shows those mentioning ‘dalmooc’ in their tweets between 3 and 11 December 2014. It clearly demonstrates the centrality of two tutors (George Siemens and Dragan Gašević) to the discussion in the final weeks of the MOOC – which is probably not the preferred outcome for a course that aims to engage learners in collaboration and co-creation of knowledge. However, more in-depth analysis was carried out by more experienced hands than mine, which I report on later in this post.

In weeks 3 and 4 the course moved towards a more in-depth discussion of social network analysis and an indication of the common metrics used to analyse learning interactions. Before I get onto that topic, I’ll say something about my understanding of the importance of social networks.

Social Networks

Social network formation is a dynamic process in which individuals typically interact with others similar to themselves [1] and use their network connections as social endorsement [2]. People may choose new acquaintances who are friends of friends – a process known as triadic closure [3]. This type of ‘weak tie’ tends to be more useful than the stronger ties associated with close friends [4, 5]. However, there is some evidence that triadic closure does not support higher levels of trust, and that ‘real world’ interaction plays a significant role in developing trust relationships [6], which suggests that spatial and social proximity are significant in networks dependent on high levels of trust. Be that as it may, social interaction is seen as the single most important influence in business, in the workplace, and in job hunting, and online networks exhibit similar attributes.

The analysis of social networks is an important and emerging field of study in education. Vygotsky [8] discusses higher internal processes resulting from social interaction; Johnson and Johnson [7] report that social learning is effective; and Garrison et al. [9] assert that social learning has an efficacious effect on critical thinking. Social networks, social activities and social interaction are critical predictors of academic performance [10, 11], and Tinto [12] suggests that social interaction is good for student retention.

Social Network Analysis (SNA) for learning provides deep insights into the different social processes that unfold while learning takes place. So what metrics are useful in assessing and evaluating SNA for learning?

Essentially, SNA in this context interprets network interaction within learning activities and environments, looking at typical SNA metrics like network density, centrality, closeness, betweenness, degree, in-degree (who is connected to you), out-degree (who you are connected to), and modularity. As weak ties do most of the work in social networks, SNA for learning explores interaction within the log data of VLEs and the online social networking applications used to support learning, and identifies pedagogical practices that support the development of these types of connections.
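To make these metrics concrete, here is a minimal pure-Python sketch of three of them – density, in-/out-degree and (Brandes) betweenness centrality – run over a toy directed ‘reply’ network. The network and its values are invented for illustration; the course itself used dedicated SNA tooling (R’s igraph, NodeXL) rather than hand-rolled code.

```python
from collections import deque

def degrees(adj):
    """In-degree (who is connected to you) and out-degree (who you are connected to)."""
    out_deg = {v: len(ws) for v, ws in adj.items()}
    in_deg = {v: 0 for v in adj}
    for ws in adj.values():
        for w in ws:
            in_deg[w] += 1
    return in_deg, out_deg

def density(adj):
    """Directed density: edges present divided by the n*(n-1) edges possible."""
    n = len(adj)
    return sum(len(ws) for ws in adj.values()) / (n * (n - 1))

def betweenness(adj):
    """Brandes' betweenness centrality for an unweighted directed graph."""
    bc = {v: 0.0 for v in adj}
    for s in adj:                      # one BFS per source node
        stack, queue = [], deque([s])
        pred = {v: [] for v in adj}    # predecessors on shortest paths
        sigma = {v: 0 for v in adj}    # number of shortest paths from s
        dist = {v: -1 for v in adj}
        sigma[s], dist[s] = 1, 0
        while queue:
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}  # back-propagate path dependencies
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# Hypothetical reply chain: A replies to B, B to C, C to D.
net = {"A": ["B"], "B": ["C"], "C": ["D"], "D": []}
in_deg, out_deg = degrees(net)
print(density(net))      # 3 edges out of 12 possible = 0.25
print(betweenness(net))  # B and C broker all the longer paths
```

On this toy chain, B and C sit ‘between’ the others (betweenness 2.0 each) – exactly the brokerage property that SNA for learning looks for at scale.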

Week 3

In the week 3 Google Hangout, Shane Dawson, one of the key developers of the VLE discussion forum analysis and visualization tool SNAPP, discussed using SNA to look for learners with high ‘betweenness’ scores. These learners tend to communicate early and widely, and fill ‘structural holes’ between diverse networks – all possible indicators of creativity. When constructing arguments in social networks, learners benefit from having another learner to work with: thinking aloud and verbalising involves reflection on and regulation of the quality of learning, and activates deeper learning processes. To support this Dr Dawson referred to Maarten de Laat‘s [13] work, which suggests a strong connection between creativity, ‘dialogic’ skills and ‘higher order thinking’. Essentially, networked and agile learners who demonstrate the capacity to see multiple perspectives (e.g. those with high betweenness scores) show strong indications of engagement in higher-quality critical thinking and creativity.

In online learning environments learners’ social interaction requires observation and analysis, and may require some intervention (‘scaffolding’). Two kinds of activity may require support: learners who ‘bounce’ around a network, rapidly moving from one subject to another, may be indicating dissatisfaction with co-creating/collaborative environments; and ‘over-communicative’ learners who dominate interaction may require attention. In addition, while social capital may be accrued by developing high betweenness centrality, the linguistic content of the messages is critical to understanding their value. So, responding to others with useful and ‘on-task’ messages has a positive effect on a learner’s social capital, while thanks and compliments have a negative association.

Week 4

In week 4 we moved on to making sense of SNA in a learning context. In his introduction Dr Gašević was clear that the key role of learning analytics is not simply to gather online quiz scores, course grades (which can only provide a snapshot of achievement), or trivial measures (e.g. the number of VLE log-ins), but to analyse and interpret dynamic learning products (unstructured text in online comments, tags, or blogs) [14]. Learning analytics should be about learning, and the critical areas under examination need to include: learning design, community building, creativity, social capital, academic performance, and distributed pedagogy.

To demonstrate analytics in practice, Dr Gašević introduced the course’s two ‘data tzars’, Vitomir Kovanovic and Srecko Joksimovic. The ‘tzars’ had collected, cleaned, and analysed data generated by learners and course leaders within the edX discussion forum, Facebook and Twitter feeds, and constructed three directed, weighted social graphs from the first two weeks of the MOOC. They carried out four main activities:

  1. Calculated betweenness and degree centrality
  2. Calculated linguistic properties for each message and student (i.e. LIWC, coherence, Coh-Metrix) – number of words per sentence, average words per sentence, averaged per student
  3. Ran regression analyses between betweenness and linguistic properties – differences between what learners write (and write about) and their position in the graph
  4. Made visualisations
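Step 3 – relating betweenness to linguistic properties – boils down to correlation and regression over per-student pairs of numbers, which can be sketched in stdlib-only Python. The per-student figures below are invented purely for illustration; the ‘tzars’ used proper statistical tooling.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def ols_slope(x, y):
    """Least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Hypothetical per-student values: betweenness centrality vs message word count.
centrality = [0.0, 0.1, 0.4, 0.9, 1.2]
word_count = [40, 55, 120, 260, 310]
print(pearson_r(centrality, word_count))  # strongly positive for these invented data
print(ols_slope(centrality, word_count))  # extra words per unit of centrality
```

A strong positive r here would mirror the course finding that word count is among the better predictors of network position.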

The main outcomes were that:

  • Learners who post messages with deeper cohesion (e.g. continuing a thread, quoting others, asking questions, expressing appreciation or agreement) tend to be the central nodes in the network.
  • Cognitive processes and word count are the best predictors of network position – at this stage of the course.

In their recent conference paper Kovanovic, Joksimovic, Gašević, & Hatala [15] assert that messages containing affective (e.g. emotional, humorous, or self-disclosing), cohesive (e.g. quoting others, asking questions, complimenting, agreeing) and interactive (e.g. addressing named persons, using inclusive pronouns, greetings) facets of social presence “significantly predict the network centrality measures commonly used for measurement of social capital”.

This is fairly intuitive: after all, the more you say – and the friendlier you are – the more likely you are to be central in a network. However, it also appears that attending to different cognitive processes within the language has a positive impact on indicators of social capital.

The messages they analysed tended to have high deep cohesion and low referential cohesion, which indicates that learners in these networks are demonstrating good networked learning skills and employing deep levels of knowledge construction. They are all good at building knowledge, building connections, and sharing ideas within the environment.

Metrics, Graphs and Tools

The key metrics used in this study were:

  • Word count (WC – count of words in a message)
  • Causation (cause – because, effect, hence)
  • Cognitive processes (cogmech – cause, know, ought)
  • Text coherence (LSA – average similarity between sentences in a message)
  • Deep cohesion (the extent to which the ideas in the text are cohesively connected at a deeper conceptual level that signifies causality or intentionality)
  • Referential cohesion (the extent to which explicit words and ideas in the text are connected with each other as the text unfolds)
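As a rough illustration of the last metric only – Coh-Metrix computes referential cohesion from far richer linguistic features – content-word overlap between adjacent sentences can stand in as a toy proxy. Everything here (the stopword list, the example sentences) is my own invention, not part of the course pipeline.

```python
# A deliberately tiny stopword list; real tools use curated lexicons.
STOPWORDS = {"the", "a", "an", "is", "are", "and", "of", "to", "in"}

def content_words(sentence):
    """Lower-cased tokens with punctuation stripped and stopwords removed."""
    return {w.strip(".,!?;:") for w in sentence.lower().split()} - STOPWORDS

def referential_cohesion(sentences):
    """Mean Jaccard overlap of content words in adjacent sentences --
    a crude stand-in for Coh-Metrix's referential cohesion index."""
    scores = []
    for a, b in zip(sentences, sentences[1:]):
        ca, cb = content_words(a), content_words(b)
        union = ca | cb
        scores.append(len(ca & cb) / len(union) if union else 0.0)
    return sum(scores) / len(scores) if scores else 0.0

cohesive = ["The network has central nodes.",
            "Central nodes broker the network."]
disjoint = ["The network has central nodes.",
            "Surfing in Taiwan is fun."]
print(referential_cohesion(cohesive) > referential_cohesion(disjoint))  # True
```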

The following graphs were produced:

Twitter Big Picture/DAL MOOC edX ©2014

edX Big Picture/DAL MOOC edX ©2014

Facebook Big Picture/DAL MOOC edX ©2014

Key:

Shapes: students = circles; instructors = squares
Nodes: colour = community, size = betweenness centrality, label = out-degree (# replies)
Twitter edges: blue = retweet, red = mention, green = reply
Facebook edges: blue = comment, red = like

The ‘Tzars’ Toolkit:

  • R (igraph) and Python – graph extraction and analysis
  • LIWC – Linguistic Inquiry and Word Count (LIWC), text analysis software that calculates the degree to which people use different categories of words across a wide variety of texts
  • SEMILAR – the Semantic Similarity Toolkit, a “software environment [that] offers users, researchers and developers easy access to fully-implemented semantic similarity methods”
  • Coh-Metrix – a computational tool that produces linguistic and discourse representations of text

Additional analysis will be carried out, including:

  • Keyword extraction (e.g. Alchemy)
  • Topic modelling – co-occurrence graphs vs LDA (Latent Dirichlet Allocation, a generative model used to discover the topics that occur in a collection of documents)
  • Other centrality measures vs LIWC/Coh-Metrix (e.g. degree, closeness centrality)

Node XL Tweet Search Import:

  • Term: dalmooc, 85 tweets, between 3/12 and 11/12
  • Edge – colour = relationship, width = relationship, label = relationship
  • Vertices (nodes) – colour = followed [red=most, green=least], shape = betweenness centrality [square = >100], size = betweenness centrality, label = vertex name
  • Dynamic filters – Out-degree >1
  • Layout: Fruchterman-Reingold


References:

  1. McPherson, M, Smith-Lovin, L and Cook, J M (2001). Birds of a Feather: Homophily in Social Networks. In Annual Review of Sociology Vol. 27: 415-444.
  2. Karlan, D, Mobius, M, Rosenblat, T, and Szeidl, A (2009). Trust and social collateral. In The Quarterly Journal of Economics, 124(3), 1307-1361.
  3. Rapoport, A (1953). Spread of information through a population with socio-structural bias: I. Assumption of transitivity. In The Bulletin of Mathematical Biophysics, 15(4), 523-533.
  4. Granovetter, M S (1973). The Strength of Weak Ties. In American Journal of Sociology, 78(6), 1360-1380. The University of Chicago Press
  5. Watts, D J (2003). Six Degrees: the Science of a Connected Age. London: W W Norton and Company
  6. Bapna, R, Gupta, A, Rice, S, and Sundararajan, A (2011). Trust, Reciprocity and the Strength of Social Ties: An Online Social Network based Field Experiment. In Workshop on Information Systems and Economics.
  7. Johnson, D W, & Johnson, R T (2009). An Educational Psychology Success Story: Social Interdependence Theory and Cooperative Learning. In Educational Researcher, 38(5), 365-379.
  8. Vygotsky, L S (1978). Mind in society: The development of higher psychological processes. M. Cole, V. John-Steiner, S. Scribner, & E. E. Souberman, (Eds.) Cambridge, MA: Harvard University Press.
  9. Garrison, D R, Anderson, T, & Archer, W (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. In American Journal of Distance Education, 15(1), 7-23.
  10. Gašević, D, Zouaq, A, Jenzen, R (2013). ‘Choose your classmates, your GPA is at stake!’ The association of cross-class social ties and academic performance. In American Behavioral Scientist, 57(10), 1459-1478.
  11. Astin, A (1993). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.
  12. Tinto, V (2006). Research and Practice of Student Retention: What Next? In Journal of College Student Retention: Research, Theory and Practice, 8(1), 1-19.
  13. De Laat, M, Chamrada, M, & Wegerif, R (2008). Facilitate the facilitator: Awareness tools to support the moderator to facilitate online discussions for networked learning. In Proceedings of the 6th International Conference on Networked Learning, 80-86.
  14. Gašević, D, Dawson, S, Siemens, G (2015). Let’s not forget: Learning analytics are about learning. In TechTrends (in press)
  15. Kovanovic, V, Joksimovic, S, Gašević, D, & Hatala, M (2014). What is the Source of Social Capital? The Association between Social Network Position and Social Presence in Communities of Inquiry. In Proceedings of G-EDM 2014: Workshop on Graph-based Educational Data Mining at Educational Data Mining Conference (EDM 2014), July 4-7, 2014, London, UK.