
When I used to work for the TLRP TEL Programme, I had a bit of a side project going to use network analysis to explore the literature cited by the different projects, as a way of pulling out the classic papers and common ground across the programme.

If you’re not familiar with the TLRP TEL programme, it followed on from the main phase of the Teaching and Learning Research Programme. It was jointly funded by the ESRC and EPSRC, and the main phase ran from 2008 to 2011, during which eight major projects were funded, spanning a range of topics in Technology-Enhanced Learning. These included (thanks to the Internet Archive for links):

  • Echoes: developed a multimodal digital environment to help develop children’s social interactions and support exploratory learning
  • Ensemble: semantic web technologies for case-based learning in Higher Education
  • HapTEL: developed haptic technology for dentistry students
  • Interlife: used Second Life to help support young people to develop social skills and navigate transitions
  • LDSE: A learning design support environment for teachers and lecturers
  • Migen: Intelligent support for mathematical generalisation
  • Personal inquiry: Designing for Evidence-based Enquiry across Formal and Informal Settings of Learning
  • Synergynet: Multi-touch tabletops for collaborative learning

With such a diverse range of technologies and settings, a question naturally ran throughout the programme of what it was that brought everyone together; what was distinct about TEL. The programme addressed this by organising events and publications around a range of cross-programme themes – links to the main outputs below:

Another way of exploring the threads that ran through the programme would be to look for commonalities in the literature which the projects cite. This could potentially have formed the basis of a collection of classic works in TEL as a reference bank, a bit like the one that Chris Davies and Rebecca Eynon produced in 2015 (‘Education and Technology: Major Themes in Education’). I had a first go at this about ten years ago (!), using Touchgraph. It was a bit clunky though, and formatting the data took a while, so sadly it went unfinished. This was before I had discovered Gephi though, and what was a lot of work back then is now a lot easier. (I used this approach on the Openness and Education literature recently, for example.)

It has remained something I’d like to follow up on though. Being versed in Gephi, it is now a lot easier for me to do this, and the time delay at least means that all the papers which were being submitted, reviewed or in press are now available for inclusion.

Getting the data

I couldn’t find my original files, so set about getting the data from scratch. The TEL website is now gone, so I looked up the projects on the ESRC Research Catalogue website. The project pages in the catalogue list a variety of publication types associated with each of the projects. It had been my intention to include the full range of publications in the analysis; however, it quickly became apparent that many of the items were not available online, either because they had never been put online or because the links were no longer functional (particularly conference items). As a result, I decided to focus on just the journal papers associated with each project, as these were the most consistently available (even a couple of those had disappeared in the relatively short space of time since the end of the programme) (and readers, take note: please use your institutional repositories!).

This yielded a list of 66 target papers across the eight projects. Four could not be located, and for a further four I wasn’t able to access the full texts. A total of 58 papers were therefore included. From each, the reference list was copied into Excel and, where multiple records of the same work existed, formatted consistently so that they would match up as a single item. The data was exported as a two-column CSV file, of ‘source’ (cited article) and ‘target’ (the TEL project paper which cited it), and imported into Gephi.
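The edge list itself is straightforward to construct programmatically. As a minimal sketch (the paper titles and reference lists here are invented for illustration), the two-column CSV could be generated like this in Python:

```python
import csv

# Hypothetical sample data: each sampled TEL project paper mapped to
# its reference list. The real data came from 58 papers' reference
# lists, copied out of the published articles.
references = {
    "Project paper A": ["Vygotsky (1978)", "Wenger (1998)"],
    "Project paper B": ["Wenger (1998)", "Papert (1980)"],
}

# Write the two-column edge list Gephi expects: 'source' is the cited
# work, 'target' is the TEL project paper which cites it.
with open("edges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["source", "target"])
    for paper, refs in references.items():
        for ref in refs:
            writer.writerow([ref, paper])
```

The resulting edges.csv can then be pulled straight into Gephi via the import spreadsheet dialog, which treats each row as a directed edge.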

Exploring the network

The network contained a total of 1,978 nodes (each node being either one of the sampled papers, or any resource cited by them – including other papers, books, chapters, conference items, government sources or websites). When visualising the network, it wasn’t immediately obvious where the bounds of each project lay. If there was nothing in common at all, you would expect to see eight distinct clusters of papers, one for each project. However, in practice, there were varying degrees of overlap (click to view a larger version):

To see where the project boundaries do lie, I colour-coded the sample of journal papers according to projects (I’ve also adjusted the node size to reflect in-degree, to make them stand out a bit more) (again, click to open a bigger version):

(note that the black node was a joint publication, with authors from three of the projects.)

It is quite interesting here to see how closely related the projects were to each other; within projects, the structure also reflects the extent to which different strands were writing together or working in parallel.

The nodes that I am really interested in are the ones which represent articles which were cited by two or more of the projects. To narrow it down, all the items which were cited only once can be removed, which dramatically reduces the size of the network (now 315 nodes):

I examined the network to look for examples which met the criterion (cited by at least two of the projects – I didn’t count items associated with the black node though, as this had a disproportionate overlap with a paper in one of the three projects it was associated with) and labelled the results:
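In code, the same filtering step might look something like this – a minimal sketch in Python, in which the edges, paper names and project assignments are all invented for illustration:

```python
# Each edge is (cited_work, citing_paper); 'projects' maps each sampled
# paper to its TEL project. All names here are hypothetical.
edges = [
    ("Vygotsky (1978)", "paper1"),
    ("Vygotsky (1978)", "paper2"),
    ("Wenger (1998)", "paper3"),
]
projects = {"paper1": "Ensemble", "paper2": "MiGen", "paper3": "Ensemble"}

# For each cited work, collect the set of distinct projects citing it.
cited_by = {}
for source, target in edges:
    cited_by.setdefault(source, set()).add(projects[target])

# Keep only the works cited by two or more different projects.
shared = [work for work, projs in cited_by.items() if len(projs) >= 2]
```

Counting distinct *projects* rather than distinct *papers* is the important detail here: a work cited by three papers from the same project would still be excluded.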

The result of this was a list of 42 publications (in alphabetical order):


Anastopoulou, S., Sharples, M., Ainsworth, S. & Crook, C. (2009) Personal inquiry: linking the cultures of home and school with technology mediated science inquiry. In Mobile Learning Cultures across Education, Work and Leisure (eds N. Pachler & J. Seipold), pp. 55-58. WLE Centre, London. Proceedings of the 3rd WLE Mobile Learning Symposium, London, 27th March 2009. ISSN 1753-3385.

Anderson, T. & Whitelock, D. (2004) The educational semantic web: Visioning and practicing the future of education. Journal of Interactive Media in Education, 1: 1-15.

Berners-Lee, T., Hendler, J. & Lassila, O. (2001) The Semantic Web. Scientific American 284, 34-43.

Bruner, J.S. (1966) Toward a Theory of Instruction. The Belknap Press, Cambridge, MA.

Cohen, L., Manion, L., & Morrison, K. (2000) Research methods in education (5th ed.). Taylor & Francis Ltd.

Conlon, T. & Pain, H. (1996) Persistent collaboration: a methodology for applied AIED. International Journal of Artificial Intelligence in Education, 7, 219-252.

Dillenbourg, P., Baker, M., Blaye, A. & O’Malley, C. (1996) The evolution of research on collaborative learning. In E. Spada & P. Reiman (Eds), Learning in Humans and Machine: Towards an interdisciplinary learning science (pp. 189-211). Oxford: Elsevier.

Druin, A. (2002) The role of children in the design of new technology. Behaviour and Information Technology 21(1), 1-25.

Engeström, Y. (1999) Activity theory and individual and social transformation. In Perspectives on Activity Theory (eds Y. Engeström, R. Miettinen & R.-L. Punamäki), pp. 19-38. Cambridge University Press, Cambridge, UK.

Facer, K. & Sandford, R. (2010) The next 25 years? Future scenarios and future directions for education and technology. Journal of Computer Assisted Learning 26, 74-93.

Fullan, M. (1991) The new meaning of educational change. Cassell, London.

Good, J. & Robertson, J. (2006) CARSS: A framework for learner centred design with children. International Journal of Artificial Intelligence in Education, 16, 381-413.

Greeno, J. (1991) Number sense as situated knowing in a conceptual domain. Journal for Research in Mathematics Education, 22 (3), pp. 170-218.

Kirschner, P., Sweller, J., & Clark, R.E. (2006) Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.

Kobbe, L., Weinberger, A., Dillenbourg, P., Harrer, A., Hamalainen, R., Hakkinen, P. & Fisher, F. (2007) Specifying computer-supported collaboration scripts. International Journal of Computer-Supported Collaborative Learning, 2(2), 211-224.

Kolb, D. A. (1984) Experiential learning: Experience as a source of learning and development. Englewood Cliffs, NJ: Prentice-Hall, Inc.

Koper, R. & Olivier, B. (2004) Representing the learning design of units of learning. Educational Technology and Society 7, 97-111.

Kraemer, K., Dedrick, J. & Sharma, P. (2009) One laptop per child: vision vs. reality. Communications of the ACM, 52 (6), pp. 66-73.

Landauer, T. (1995) The trouble with computers: usefulness, usability, and productivity. MIT Press, Cambridge.

Laurillard, D. (2002) Rethinking university teaching: A conversational framework for the effective use of educational technology, 2nd ed., London: Routledge.

Lave, J. & Wenger, E. (1991) Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.

Lewins, A. & Silver, C. (2007). Using software in qualitative research: A step-by-step guide. London: Sage.

Merrill, M.D. (1994) Instructional design theory. Englewood Cliffs, NJ: Educational Technology Publication.

Moss, J., & Beatty, R. (2006). Knowledge building in mathematics: supporting collaborative learning in pattern problems. International Journal of Computer-Supported Collaborative Learning, 1, 441-465.

Mulholland P., Collins, T., Gaved, M., Wright, M., Sharples, M., Greenhalgh, C., Kerawalla, L., Scanlon, E., and Littleton, K. (2009). Activity guide: an approach to scripting inquiry learning. In Proceedings of 14th International Conference on Artificial Intelligence in Education, Brighton 6-10 July 2009.

Naismith, L., Lonsdale, P., Vavoula, G., & Sharples, M. (2004). Literature review in mobile technologies and learning, NESTA Futurelab Series.

Noss, R., Healy, L. & Hoyles, C. (1997) The construction of mathematical meanings: connecting the visual with the symbolic. Educational Studies in Mathematics 33, 203-233.

Papert, S. (1980) Mindstorms: Children, Computers and Powerful Ideas. Basic Books, New York.

Piper, A., O’Brien, E., Morris, M., & Winograd, T. (2006) SIDES: A cooperative tabletop computer game for social skills development. Proceedings of the 2006 20th Anniversary Conference on Computer Supported Cooperative Work, 10. ACM.

Roschelle, J. (2003). Unlocking the learning value of wireless mobile devices. Journal of Computer Assisted Learning, 19(3), pp. 260-272.

Roschelle, J., & Pea, R. (2002). A walk on the WILD side: How wireless handhelds may change computer-supported collaborative learning. International Journal of Cognition and Technology, 1(1), 145-168.

Schon, D. (1987) Educating the reflective practitioner: Toward a new design for teaching and learning in the professions, San Francisco: Jossey-Bass Publishers.

Schon, D.A. (1983) The Reflective Practitioner: how professionals think in action. Temple Smith, London.

Schwartz, D.L., Brophy, S., Lin, X. & Bransford, J.D. (1999) Software for managing complex learning: examples from an educational psychology course. Educational Technology Research and Development 47, 39-59.

Sfard, A. (1998) ‘On Two Metaphors for Learning and the Dangers of Choosing Just One’, Educational Researcher 27(2): 4-13.

Sharples, M., Taylor, J., & Vavoula, G. (2007). A theory of learning for the mobile age. In R. Andrews, & C. Haythornthwaite (Eds.), The Sage handbook of e-learning research (pp. 221-247). London: Sage Publications Ltd.

Slotta, J.D. (2010). Evolving the classrooms of the future: The interplay of pedagogy, technology and community. In K. Mäkitalo-Siegl, F. Kaplan, J. Zottmann, & F. Fischer (Eds.) Classroom of the future. Orchestrating collaborative spaces (pp. 215-242). Rotterdam: Sense.

Vygotsky, L.S. (1978) Mind in society. The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5-23.

Wenger, E. (1998) Communities of Practice: Learning, Meaning and Identity. Cambridge: Cambridge University Press.

Wingate, U. (2007), A framework for transition: supporting learning to learn in higher education. Higher Education Quarterly, 61 (3), pp. 391-405.

Yin, R.K. (2003) Case study research: Design and methods. London: Sage Publications.

There were a couple which I was slightly surprised did not make the cut (Larry Cuban’s Oversold and Underused, and Ton de Jong on computer simulations). The list includes a few methodological texts (e.g. Yin, Cohen et al.) and some which are perhaps slightly too specialised, but I think that overall any student or newer researcher in TEL would benefit from reading or being aware of most of the literature surfaced by this approach.

The other advantage of time passing between the TEL programme and now is that it would also be possible to look at onward citations of the projects’ papers, which could be interesting to see how the topics addressed by the TEL projects have changed over time and where they sit in relation to current research agendas in TEL.


One of the nice things about living near Cambridge is being able to go to some of the excellent, usually free, events that frequently happen there. At the end of April I headed to Wolfson College for a one-day event entitled The Digital Person: A symposium (2019).

The event was organised by the Hub of All Things (HAT), a multi-institution research project which has developed a data infrastructure for users to be in control of their personal data online. Users would maintain their data in a personal data account (or ‘HATDex’). If platforms wished to gain access to the data, they would send requests and pay a fee. So the idea has the advantages of users being able to control their own data, maintain its accuracy, and benefit from the monetisation of that data. (I wondered how that would work for educational data online, particularly interactions, where there are multiple actors. E.g. who owns a discussion thread in a MOOC? Individual authors’ posts would be meaningless out of context. Technically I am sure the T&Cs mean that the MOOC owns it, but that doesn’t sit well with the goal of giving users control of their data either. This type of data may be beyond the scope of a personal data account, though it would certainly be relevant to large-scale demographic/user-characteristics MOOC studies. Anyway, still thinking about that.)

I was glad to find that the day wasn’t just a sales pitch for why we should all be using a HATDex though (or rather, not directly: the presentations were all compelling discussions of broader critical issues in the monetisation of online data and privacy, all of which underscore why something like a HATDex is a good idea). The presentations are now available online and included:

The presentations are really helpful resources for anyone interested in these topics (and include a lot of links to other fascinating literature). The 2018 Reuters Institute Digital News Report was referred to several times during the day, and can be found here: http://media.digitalnewsreport.org/wp-content/uploads/2018/06/digital-news-report-2018.pdf A couple of emergent, interdisciplinary fields were referred to which were new to me – human-data interaction, and machine behaviour – which I will now be keeping an eye out for. A very stimulating day, and hopefully I’ll be able to get to the 2020 one too (I think registration is open on Eventbrite already!).

One of my goals for this year is to have a bit of a data analysis refresher. Sometimes I wonder if I may be in a bit of a SPSS-Gephi-NVivo rut! (A useful rut, nonetheless.)

My current to-do list includes general use of R and Tableau, and epistemic network analysis and structural equation modelling in particular. Although I have used them before, I’m also hoping to find some time to get properly reacquainted with the Digital Methods Initiative’s latest tools too.

I find it always helps to have data to play with (not just artificial problem-set-style data). I’m planning to have a good look online for open datasets in Education and Educational Technology-related topics. I’m aware that openly publishing datasets is still quite niche – but there is a growing body out there.

I’ll be looking first of course at Figshare, and UK HE institutional repositories. But is there anywhere else I should be looking? Is there a particular place where US institutions post open datasets, for example? Do you know of any particular projects which have released data openly, either through platforms or just on their websites? (No need to recommend the OER Hub, they are already on my list of course 😉 ). I’d be very interested in any recommendations, and will share what I find at a later date.

Recently I looked out some of my old external hard drives. I was looking specifically for old files related to a side project from years ago which never reached fruition but I’d like to revisit (as it would be a lot easier for me to do now than it was then – anyway, more on that later!).

The files on the hard drives go way back, through the TLRP-TEL Research Programme (~2008 – 2012), right back to my first research job on the Cambridge-MIT Institute Plant Sciences Pedagogy Project (2005-2008). This was my first academic research post, just a couple of months after finishing my masters in Plant Pathology. My role in the project involved working within Plant Sciences, to conduct educational research into undergraduate teaching and learning within the department, and develop e-learning resources to support it.

As it was my first experience of educational and technology-enhanced learning research, it was accompanied by a range of vocabulary which was new to me. This must be quite a common experience for people getting started in educational and pedagogic research; Education, as a research field within the Social Sciences, shows a high level of heterogeneity in terms of researchers’ subject backgrounds (as shown in this chart – figure 4.3 from David Mills’ chapter in McAlpine & Åkerlind’s ‘Becoming an Academic’).

To help with this transition, I kept a document where I noted the words which were new and unfamiliar, as a working glossary, and I came across this when I was excavating the old hard drive. It was interesting to look back at the words which seemed unusual then but are now all so familiar. It made me wonder to what extent the glossary reflected trends in the field at the time – some of the concepts I’m sure are not as prevalent now as then – and how the glossary would look if it was being constructed today.

The list included the following (minus my slightly cringey notes 😉 ):

Action research

Activity theory

Andragogy

Behaviourism

Belief (as in the work of Posner)

BERA

Bloom’s taxonomy

Boundary objects (Wenger / CoP)

Bourdieu (esp. ‘habitus’)

Brokerage (Wenger / CoP)

Case study

Cognitive

Communities of Practice (Wenger)

Concept map

Constructive alignment

Constructivism

CPD

Deep learning (converse surface learning)

Didactic

Discourse analysis

Dweck

EARLI

E-learning

Emic

Engeström (see ‘activity theory’)

Epistemology

EPSRC

ESRC

Ethnography

Ethnomethodology

Etic

Evidence based (Evidence informed)

Focus group

Gestalt

Habitus (see Bourdieu)

HEA

Hierarchy of needs

Humanistic

Intensional networks (Nardi et al., 2000)

Lexis

Liminal

Maslow – Hierarchy of needs

Meta-cognition

Neuro-linguistic programming (NLP)

Ontogenesis

Ontology

Pedagogy

Phenomenology

Piaget

Polanyi (Tacit knowledge)

Practitioner

Reification

Reflexivity

RLO (Reusable learning object)

Runaway object (Engeström)

Russellian

Self-efficacy

Self-regulation

Sfard (metaphors for learning)

Social capital

Social constructivism

Stenhouse

Structuralist

Surface learning (converse deep learning)

Tacit knowledge

TEL

Threshold concepts (Meyer and Land)

TLRP

Triangulation

VLE

VRE

Vygotsky

Wenger

Of course, fourteen years later, it’s not possible for me to view the field in the way that I did then. I’d be really interested to hear from others though – which were the words which stood out for you when you started out in educational research? What would be on the list now, that wasn’t then?

Over the summer, I’ve been busy working on my SRHE project (with my now nine-month-old research assistant). I presented initial results from the first part of the project, focusing on how academics’ identity and information sharing is refracted through different social networking platforms, at the Social Media & Society conference in Copenhagen in July. It was my first outing to Social Media & Society, and was a very interesting and enjoyable conference.

Since then, I’ve been working on qualitative analysis of the free-text questions from the survey, which asked academics to give examples of interactions through social media that they had experienced and perceived to be of high impact. The coding has taken me longer than expected, due to having more responses than I’d anticipated – very happy to have more data though, and thank you to all who took part!

I’ve been using an open coding approach to characterise the different types of high-impact interactions through social media as perceived by academics, but I’d like to also be able to compare the results to more ‘institutional’ definitions of ‘impact’. I note that the UKRI website seems to distinguish broadly between two types of impact – ‘academic impact’ and ‘economic & social impact’, while the REF Impact Case Studies lists eight ‘impact types’, including ‘Political, Health, Technological, Economic, Legal, Cultural, Societal, and Environmental’. The UKRI definitions (only two categories) aren’t very specific, while the REF list sounds more like a proxy for subject area. What am I missing? Are there any other ‘official’ typologies I should be aware of? Many thanks 🙂

I’ve been updating my CV recently, as I’ll be resuming the academic job hunting soon, looking out for posts (possibly part time) starting from the Autumn onwards.

However, I don’t like my CV as it stands; it doesn’t seem to give a clear picture of who I am, how I got to where I am today, or how the items on it make sense together. I suspect that some readers get to the undergraduate degree in Biological Sciences, and write me off straight away when I’m applying for Education/Social Sciences positions. (I also suspect that the increasing reliance on online forms for university HR may even filter me out before a human reads it – but that’s a different post for a different time 😉 ).

So, I decided to experiment with making a timeline-based version of my CV. I’m still tweaking it, but I think it does help to show how everything fits together. I’ve not included non-peer reviewed publications, posters, or presentations (yet) as there are a lot of them – but I might add them, as they do include a wider range of activities (e.g. not just formal conferences but also some teaching). I may also add another filter for skills, as well as research methods, and collaborators.

Although some would probably say I should not foreground it, I have included my two periods of maternity leave. There weren’t obvious gaps on the timeline which would have called for an explanation around mat leave, but it felt wrong not to include them – they are a big part of who I am, and keeping going with my PhD and staying research active through them was a major challenge.

Any feedback would be welcome – what else should I add? What is, or isn’t, coming across well in this format? The timeline can be viewed by clicking on the picture below, or the following link (opens in a new tab): http://www.katyjordan.com/cv.html

Last year, I was honoured to receive one of the Society for Research into Higher Education’s Newer Researcher Awards. With the funding from the award, I am carrying out a project this year to build upon and test a model of public-professional academic identity online which emerged from my PhD research. The main data collection phase is an online survey, to explore the types of information academics share online through different platforms, and academics’ perceptions about online audiences and high-impact interactions through social media.

I had to modify the timeline of my original project plan due to maternity leave, but I am very excited to now be able to launch my survey! The survey takes around 15 minutes to complete and can be accessed here:

https://openuniversity.onlinesurveys.ac.uk/sharing-social-media

All participants who complete the survey and leave a contact email address at the end will be entered into a draw to win a £50 Amazon voucher. The survey will remain open until 14th May. Your participation would be greatly appreciated and please feel free to share the link with others too – I will be promoting the survey online over the next few weeks but any help to get as wide a reach as possible would be great. Many thanks, and looking forward to being able to share the results in due course!