How do educators and educational technologists think about data as evidence to support their work?

Article in the ASCILITE Technology Enhanced Learning Blog



This question is central to work being undertaken by colleagues at UTS and internationally. Evidence and data are increasingly emphasised in educational contexts, with the spread of What Works centres such as the Education Endowment Foundation (UK), Evidence for Learning (Australia), and the What Works Clearinghouse (USA), alongside international (PISA), national (SATs, NAPLAN, etc.), and local (formative assessment) data.

Understanding evidence in education requires a grasp of the decisional contexts of evidence: how it is used, by whom, and what trade-offs there are in decisions about which evidence to collect or pay attention to. The discourse around evidence in education focuses on systems-level concerns, such as which pedagogic practices are supported by research and should receive funded professional development, and how countries compare on international scales. This contrasts with the moment-to-moment decisions made in classrooms all the time.

Learning Data: More about the Why than the How?

How educators think about data as a form of evidence is a central question in some work we’ve recently undertaken (Prestigiacomo et al., 2020; 2021). It’s also been a recurring theme in recent conversations with colleagues at UCL EDUCATE, which aims to support educational technology startups in using evidence in the design and evaluation of their products (Cukurova, Luckin, & Clark-Wilson, 2018). Initiatives such as UCL EDUCATE highlight a shared need to develop a greater understanding of evidence among educational decision makers and within educational technology enterprises, to encourage more discerning, data-informed decision-making when evaluating impact. This perspective also reframes data literacy away from statistical competence towards the ability to ask useful questions of data.

Data can tell you things. But how people think about data, what kinds of data they value, and the types of questions they ask provide insight into how they think about a problem, the outcomes they’re looking for, and their understanding of indicators of those outcomes. As Fives and colleagues note, teachers have different understandings of knowledge and knowing (epistemic cognition), and this influences the way they make decisions about assessment, both when they consider the learning outcomes they would like students to attain and when they ask questions such as “what will I learn about my students from the formative assessment event?” (Fives et al., 2017, p. 3). Or, as we put it in Knight et al. (2014), the ways we think about pedagogy, epistemology, and assessment (focusing on technology-derived assessment data) are fundamentally entwined, and provide insight into people’s perspectives and standpoints on those issues. Similarly, findings from the UCL EDUCATE project indicate that evidence-informed learning technology enterprises (ELTEs) have six “superpowers” (including Leadership Vision, Teamwork, and Research Know-How) that can be assessed using the ELTE framework’s diagnostic survey and subsequently developed through enterprise-based professional learning activities (Moeini, 2020).

The UCL EDUCATE program, which worked with 252 London-based educational technology enterprises between 2017 and 2019, evolved a pragmatic approach to identifying evidence. Companies came onto the program thinking that large-scale trials are the gold standard of evidence. What they found was that, in the context of a dynamic start-up environment where products are not yet stable (and may never be), it is more realistic and useful to adopt exploratory approaches to data collection and analysis that draw on a broader set of evidence involving student, teacher, and parent inputs, hence bridging the gap between edtech developers, researchers, and the learners, teachers, and parents who use educational technologies.

Similarly, in recent workshops with tertiary educators and pre-service teachers (Prestigiacomo et al., 2020; 2021), we researched the purposes educators had for using data as a form of evidence, and what it was evidence of. Teachers expressed a desire to move away from de-contextualised, low-level data analysis and instead to value data that make visible higher-order constructs and learning, relate more meaningfully to their immediate practice, and enable them to make better-informed decisions.

Understanding this “why” of data is crucial to understanding how we can support educators in their effective use of evidence and data, taking a broad view of their practice rather than focusing on single sources of assessment data.

References

Cukurova, M., Luckin, R., & Clark-Wilson, A. (2018). Creating the golden triangle of evidence-informed education technology with EDUCATE. British Journal of Educational Technology, 50, 490–504. https://doi.org/10.1111/bjet.12727

Fives, H., Barnes, N., Buehl, M. M., Mascadri, J., & Ziegler, N. (2017). Teachers’ Epistemic Cognition in Classroom Assessment. Educational Psychologist, 52(4), 270–283. https://doi.org/10.1080/00461520.2017.1323218

Knight, S., Buckingham Shum, S., & Littleton, K. (2014). Epistemology, assessment, pedagogy: Where learning meets analytics in the middle space. Journal of Learning Analytics, 1(2), 23–47. https://doi.org/10.18608/jla.2014.12.3

Moeini, A. (2020). Theorising evidence-informed learning technology enterprises: A participatory design-based research approach (Doctoral thesis). UCL Institute of Education, London.

Prestigiacomo, R., Hadgraft, R., Hunter, J., Lockyer, L., Knight, S., van den Hoven, E., & Martinez Maldonado, R. (2020). Learning-centred translucence: An approach to understand how teachers talk about classroom data. The 10th International ACM Learning Analytics & Knowledge Conference (LAK20), Frankfurt, Germany.

Prestigiacomo, R., Hunter, J., Knight, S., Martinez Maldonado, R., & Lockyer, L. (2021). Data in Practice: A Participatory Approach to Understanding Pre-service Teachers’ Perspectives. Australasian Journal of Educational Technology, 36(6), 107–119. https://doi.org/10.14742/ajet.6388

This article is republished (by the authors) from the ASCILITE Technology Enhanced Learning Blog. Read the original article there.


Simon Knight
Associate Professor

Dr Simon Knight is a senior lecturer in the Transdisciplinary School, co-editor-in-chief of the Journal of Learning Analytics, and Director of CREDS. Simon researches how people find, use, and evaluate evidence.