lnu.se Publications
Masiello, Italo, Professor. ORCID iD: orcid.org/0000-0002-3738-7945
Publications (10 of 54)
Masiello, I., Mohseni, Z., Palma, F., Nordmark, S., Augustsson, H. & Rundquist, R. (2024). A Current Overview of the Use of Learning Analytics Dashboards. Education Sciences, 14(1), Article ID 82.
2024 (English). In: Education Sciences, E-ISSN 2227-7102, Vol. 14, no 1, article id 82. Article in journal (Refereed). Published.
Abstract [en]

The promise of Learning Analytics Dashboards in education is to collect, analyze, and visualize data with the ultimate ambition of improving students’ learning. Our overview of the latest systematic reviews on the topic shows a number of research trends: learning analytics research is growing rapidly; it brings inequality and inclusiveness measures to the fore; it reveals an unclear path to data ownership and privacy; it provides predictions which are not clearly translated into pedagogical actions; and the potential of self-regulated learning and game-based learning is not capitalized upon. However, as learning analytics research progresses, greater opportunities lie ahead, and a better integration between information science and the learning sciences can bring added value to learning analytics dashboards in education.

Place, publisher, year, edition, pages
MDPI, 2024
Keywords
learning analytics dashboards, LAD, trends
National Category
Computer Systems; Learning
Research subject
Computer Science, Information and software visualization; Computer and Information Sciences Computer Science
Identifiers
urn:nbn:se:lnu:diva-126815 (URN), 10.3390/educsci14010082 (DOI), 001149183000001, 2-s2.0-85183134033 (Scopus ID)
Funder
Forte, Swedish Research Council for Health, Working Life and Welfare, 2020-01221
Available from: 2024-01-16. Created: 2024-01-16. Last updated: 2024-08-22. Bibliographically approved.
Mohseni, Z., Masiello, I. & Martins, R. M. (2024). A technical infrastructure for primary education data that contributes to data standardization. Education and Information Technologies: Official Journal of the IFIP technical committee on Education, 29, 21045-21061
2024 (English). In: Education and Information Technologies: Official Journal of the IFIP technical committee on Education, ISSN 1360-2357, E-ISSN 1573-7608, Vol. 29, p. 21045-21061. Article in journal (Refereed). Published.
Abstract [en]

There is a significant amount of data available about students and their learning activities in many educational systems today. However, these datasets are frequently spread across several different digital services, making it challenging to use them strategically. In addition, there are no established standards for collecting, processing, analyzing, and presenting such data. As a result, school leaders, teachers, and students do not capitalize on the possibility of making decisions based on data. This is a serious barrier to the improvement of work in schools, teacher and student progress, and the development of effective Educational Technology (EdTech) products and services. Data standards can be used as a protocol on how different IT systems communicate with each other. When working with data from different public and private institutions simultaneously (e.g., different municipalities and EdTech companies), having a trustworthy data pipeline for retrieving the data and storing it in a secure warehouse is critical. In this study, we propose a technical solution containing a data pipeline by employing a secure warehouse—the Swedish University Computer Network (SUNET), which is an interface for information exchange between operational processes in schools. We conducted a user study in collaboration with four municipalities and four EdTech companies based in Sweden. Our proposal involves introducing a data standard to facilitate the integration of educational data from diverse resources in our SUNET drive. To accomplish this, we created customized scripts for each stakeholder, tailored to their specific data formats, with the aim of merging the students’ data. The results of the first four steps show that our solution works. Once the results of the next three steps are in, we will contemplate scaling up our technical solution nationwide. With the implementation of the suggested data standard and the utilization of the proposed technical solution, diverse stakeholders can benefit from improved management, transportation, analysis, and visualization of educational data.
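The abstract describes customized, per-stakeholder scripts that map each data provider's export format onto a shared data standard before the merged data is stored in the secure SUNET-based warehouse. The scripts themselves are not published here; the following is only a minimal Python/pandas sketch of that kind of mapping step, in which the file names, column mappings, and the standardize helper are illustrative assumptions rather than the project's actual implementation.

```python
# Hypothetical sketch: harmonizing per-stakeholder exports into one student-data schema.
# File names, column names, and the mapping tables are illustrative assumptions,
# not the actual scripts described in the paper.
import pandas as pd

# Target (standard) schema that all sources are mapped onto.
STANDARD_COLUMNS = ["student_id", "school_id", "activity", "score", "timestamp"]

# Per-stakeholder column mappings (each EdTech company or municipality exports different headers).
SOURCE_MAPPINGS = {
    "edtech_a.csv": {"pupilId": "student_id", "org": "school_id",
                     "task": "activity", "result": "score", "time": "timestamp"},
    "municipality_b.csv": {"elev_id": "student_id", "skola": "school_id",
                           "aktivitet": "activity", "poang": "score", "datum": "timestamp"},
}

def standardize(path: str, mapping: dict) -> pd.DataFrame:
    """Read one source file and rename/trim it to the standard schema."""
    df = pd.read_csv(path)
    df = df.rename(columns=mapping)
    df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
    return df[STANDARD_COLUMNS]

# Merge all sources into one standardized dataset ready for secure storage.
merged = pd.concat(
    [standardize(path, mapping) for path, mapping in SOURCE_MAPPINGS.items()],
    ignore_index=True,
)
merged.to_csv("standardized_student_data.csv", index=False)
```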

Place, publisher, year, edition, pages
Springer, 2024
Keywords
Data standard, Data pipeline, Secure data pipeline, Educational data, Primary education, Technical infrastructure, SUNET drive
National Category
Information Systems, Social aspects
Research subject
Computer and Information Sciences Computer Science, Information Systems
Identifiers
urn:nbn:se:lnu:diva-129073 (URN), 10.1007/s10639-024-12683-2 (DOI), 001208963600002, 2-s2.0-85191712850 (Scopus ID)
Funder
Linnaeus University
Available from: 2024-04-28. Created: 2024-04-28. Last updated: 2024-12-10. Bibliographically approved.
Nilsson, T., Masiello, I., Broberger, E. & Lindström, V. (2024). Assessment during clinical education among nursing students using two different assessment instruments. BMC Medical Education, 24(1), Article ID 852.
2024 (English). In: BMC Medical Education, E-ISSN 1472-6920, Vol. 24, no 1, article id 852. Article in journal (Refereed). Published.
Abstract [en]

Background Assessment of undergraduate students using assessment instruments in the clinical setting is known to be complex. The aim of this study was therefore to examine whether two different assessment instruments, containing learning objectives (LOs) with similar content, result in similar assessments by the clinical supervisors, and to explore clinical supervisors’ experiences of assessment with the two different assessment instruments.

Method A mixed-methods approach was used. Four simulated care encounter scenarios were evaluated by 50 supervisors using two different assessment instruments, and 28 follow-up interviews were conducted. Descriptive statistics and binary logistic regression were used for quantitative data analysis, along with qualitative thematic analysis of the interview data.
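The Method mentions descriptive statistics and binary logistic regression on pass/fail assessments across the two instruments. As a hedged illustration only (the variable names passed, instrument, and scenario, and the data file, are assumptions rather than the study's actual variables), such a model could be fitted like this:

```python
# Illustrative sketch only: binary logistic regression of pass/fail ratings
# on assessment instrument and scenario. Variable and file names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

ratings = pd.read_csv("supervisor_ratings.csv")  # hypothetical dataset
# 'passed' is a 0/1 outcome; instrument and scenario are treated as categorical predictors.
model = smf.logit("passed ~ C(instrument) + C(scenario)", data=ratings).fit()
print(model.summary())
```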

Result While significant differences were observed within the assessment instruments, the differences were consistent between the two instruments, indicating that the quality of the assessment instruments was considered equivalent. Supervisors noted that the relationship between students and supervisors could introduce subjectivity into the assessments and that working in groups of supervisors could be advantageous. For formative assessments, the Likert scale was considered a useful tool for evaluating learning objectives. However, supervisors had different views on grading scales and the need for clear definitions. The supervisors concluded that a complicated assessment instrument led to limited everyday usage and did not facilitate formative feedback. Furthermore, supervisors discussed how their experiences influenced the use of the assessment instruments, which resulted in different descriptions of the experience. These differences led to a discussion of the need for supervisor teams to enhance the validity of assessments.

Conclusion The findings showed that there were no significant differences in pass/fail gradings using the two different assessment instruments. The quantitative data suggests that supervisors struggled with subjectivity, phrasing, and definitions of the LOs and the scales used in both instruments. This resulted in assessments that were perceived as arbitrary and time-consuming, leading to limited use in day-to-day assessment. To mitigate the subjectivity, supervisors suggested working in teams and conducting multiple assessments over time to increase assessment validity.

Place, publisher, year, edition, pages
Springer Nature, 2024
Keywords
Assessment, Clinical education, Feedback, Learning objectives
National Category
Educational Sciences
Research subject
Pedagogics and Educational Sciences, Education
Identifiers
urn:nbn:se:lnu:diva-131640 (URN), 10.1186/s12909-024-05771-x (DOI), 001285774000006, 2-s2.0-85200862577 (Scopus ID)
Funder
Karolinska Institute
Available from: 2024-08-08. Created: 2024-08-08. Last updated: 2024-08-21. Bibliographically approved.
Nordmark, S., Augustsson, H., Davidsson, M., Andersson-Gidlund, T., Holmberg, K., Mohseni, Z., . . . Masiello, I. (2024). Piloting Systematic Implementation of Educational Technology in Swedish K-12 Schools: Two-Years-In Report. Global Implementation Research and Applications, 4, 309-323
2024 (English). In: Global Implementation Research and Applications, E-ISSN 2662-9275, Vol. 4, p. 309-323. Article in journal (Refereed). Published.
Abstract [en]

Halfway through a four-year research project supported by implementation science and the Active Implementation Frameworks (AIF), this article reports on the status of the initial two implementation stages. Our research investigates the impact of systematically preparing educators and educational institutions to integrate digital learning materials and learning analytics dashboards to enrich teaching practices and improve student performance outcomes.

Furthermore, it seeks to establish a foundation for the use of innovative and validated educational technology (EdTech) through sustainable implementation strategies, evidence-based evaluation, and continuous redesign of digital learning materials. By adopting this comprehensive approach, we aim to enhance the knowledge base regarding effective digital innovation integration within educational environments.

We argue that applying implementation science in educational settings facilitates the adoption of effective innovations, promotes evidence-based decision-making, and helps identify and address obstacles to change. Our ongoing research underscores the transformative impact of implementation science in education. Thus far, we have highlighted the crucial role of teacher perspectives and the necessity of co-designing technology aligned with teaching and learning objectives.

This nuanced approach refutes the notion of a one-size-fits-all solution or a quick fix achievable in a single academic year. Instead, it advocates a dynamic, collaborative model that acknowledges the multifaceted nature of implementation. Our journey has reaffirmed the dedication of teachers, showcasing their readiness to invest time and effort when their professionalism is respected, and their input is genuinely valued and acted upon.

Place, publisher, year, edition, pages
Springer, 2024
Keywords
Implementation Science, Educational Technology, Digital Learning Materials, Learning Analytics Dashboards, Data Literacy, Active Implementation Frameworks
National Category
Other Computer and Information Science; Educational Sciences
Research subject
Computer and Information Sciences Computer Science; Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-131163 (URN), 10.1007/s43477-024-00130-w (DOI)
Projects
Utbildningsteknologi i grundskolan
Funder
Forte, Swedish Research Council for Health, Working Life and Welfare, 2020-01221
Available from: 2024-06-28. Created: 2024-06-28. Last updated: 2024-12-10. Bibliographically approved.
Rundquist, R., Holmberg, K., Rack, J., Mohseni, Z. & Masiello, I. (2024). Use of Learning Analytics in K-12 Mathematics Education: Systematic Scoping Review of Impact on Teaching and Learning. Paper presented at The European Conference on Educational Research (ECER), Nicosia, Cyprus, 27-30 August, 2024, Article ID 629.
2024 (English). Conference paper, Oral presentation with published abstract (Refereed).
Abstract [en]

Introduction

The generation and use of digital data and analyses in education comes with promises and opportunities, especially where digital materials allow the use of Learning Analytics (LA) as a tool in Data-Based Decision-Making (DBDM). LA means analysing educational data to understand and optimise learning and learning environments (Siemens & Baker, 2012). In this paper we discuss LA as “a sophisticated form of data driven decision making” (Mandinach & Abrams, 2022, p. 196) as we explore how LA is used to support mathematics teaching and learning with digital materials in classroom practice. Data-driven decision making, or DBDM, has been defined by Schildkamp and Kuiper (2010) as “systematically analyzing existing data sources within the school, applying outcomes of analyses to innovate teaching, curricula, and school performance, and, implementing (e.g., genuine improvement actions) and evaluating these innovations” (p. 482). DBDM is key to the interpretation of LA and can use any form of data, but in this review the term DBDM is restricted to digital data. Using LA as a tool for DBDM could streamline data, making it more readily interpretable. However, questions remain about how usage can translate into practice (Mandinach & Abrams, 2022).

The quality of technology integration is not merely about technology use, but also about pedagogical use (Ottestad & Guðmundsdottir, 2018), and about the transformation and amplification of teaching as well as learning through the use of technology (Consoli, Desiron & Cattaneo, 2023). LA within Digital Learning Material (DLM) can offer learners adaptive functions seamlessly embedded in DLMs or provide learners (and teachers) with compiled student assessments in relation to learning goals extracted from learning activities (Wise, Zhao & Hausknecht, 2014). The role of the teacher in student learning is clearly of central importance (Hattie & Yates, 2013; Yackel & Cobb, 1996), and teachers have a key responsibility to make digital technology a resource in teaching to support student learning (Scherer, Siddiq & Tondeur, 2019).

This paper presents findings from an exploratory systematic scoping review conducted on the use and impact of LA and DBDM in classroom practice, outlining aspects related to DLM, teacher usage, and student learning in the context of K-12 mathematics education.

A scoping review was deemed most appropriate since it can be performed even when there is a limited amount of published primary research (Gough, Oliver & Thomas, 2017), fitting new research areas such as LA, as it provides “a technique to ‘map’ relevant literature in the field of interest” (Arksey & O’Malley, 2005, p. 20) and can combine different kinds of evidence (Gough et al., 2017).

Method

The methodology followed the five-stage framework (Arksey & O’Malley, 2005): identifying the research question, identifying relevant studies, selecting studies, charting the data, and collating, summarizing, and reporting the results. The databases ACM Digital Library, ERIC, PsycINFO, Scopus and Web of Science were chosen as they cover a wide range of topics within both technology and educational science, in order to answer:

RQ1: How are analyses of digital data from DLM used in mathematics education?

RQ2: How do analyses of digital data from DLM impact teaching and learning?

The key elements of the research questions (Participants, Phenomena of Interest, Outcome, Context, and Type of Source of Evidence; Arksey & O’Malley, 2005) were used to create the eligibility criteria. Publications that were included reported qualitative and/or quantitative data and were connected to the use of DLM and LA based on digital data involving students (6–19 years old) and teachers in K-12 mathematics education. The search was limited to papers published from 2000 up to the search date (March 2023) in English, Swedish or Norwegian. Exclusion criteria were developed to ensure consistency in the selection process.

Each record was screened by two reviewers, and relevance was coded according to the inclusion criteria. An independent researcher outside the review group was consulted to design and validate the results of an inter-rater reliability test. The calculated inter-rater reliability score was 0.822, greater than 0.8, indicating a strong level of agreement (McHugh, 2012). After further screening, 57 records were assessed as eligible. At this stage the review pairs swapped batches and performed data extraction, recording authors, year, title, location, aim, population, digital technology, method, intervention, outcomes, and key findings for each record.
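The inter-rater reliability of 0.822 is interpreted against McHugh (2012), which concerns Cohen's kappa. Assuming the statistic is indeed Cohen's kappa (the abstract does not name it explicitly), a minimal sketch of how agreement between two screeners' include/exclude decisions can be computed is shown below; the decision vectors are invented for illustration.

```python
# Hedged sketch: agreement between two reviewers' screening decisions.
# Assumes the reported 0.822 is Cohen's kappa (per the cited McHugh, 2012);
# the decision vectors below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

reviewer_1 = ["include", "exclude", "exclude", "include", "exclude", "include"]
reviewer_2 = ["include", "exclude", "include", "include", "exclude", "include"]

kappa = cohen_kappa_score(reviewer_1, reviewer_2)
print(f"Cohen's kappa: {kappa:.3f}")  # values above 0.8 indicate strong agreement
```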

The final selection of 15 articles was made by group discussion and consensus. Discussions mainly centred on four components (use, analysis, learning and teaching). The heterogeneity in our sample demanded a configurative approach to the synthesis in order to combine different types of evidence (Gough et al., 2017). A thematic summary with a narrative approach was used to answer RQ1. To explore RQ2 more deeply, a thematic synthesis was performed (Gough et al., 2017). The analysis focused on LA usage based on digital data for student learning, for teaching, and for teachers’ DBDM. The PRISMA Extension for Scoping Reviews (PRISMA-ScR) (Tricco, Lillie, Zarin, O'Brien, Colquhoun, Levac et al., 2018) was used as the guideline for reporting the results.

Preliminary results

In total, 3653 records were identified, of which 15 studies were included. Results show that LA research is an emerging field, where LA applications are used across many content areas and curriculum standards of K-12 mathematics education. The LA were mainly based on continuously collected individual student log data concerning student activity in relation to mathematical content. Eight of the studies included embedded analytics and all 15 studies included extracted analytics, but accessibility varied for students and teachers. Overall, extracted analytics were mainly mentioned as a function for teacher usage, available as tools for formative assessment, where analytics need to be translated by teachers into some kind of pedagogical action (i.e., into teaching).

LA usage supports a wide variety of teachers’ data use, and while mathematics teachers seemed to have a positive attitude towards LA usage, some teachers were unsure of how to apply it in their practice. The thematic synthesis yielded two themes regarding teaching, showing that teaching by DBDM focused on supervision and guidance. Results indicate that extracted analytics are more commonly used for supervision than for guidance.

Results regarding learning suggest that LA usage has a positive effect on student learning, with high-performing students benefiting most. The included studies examine students’ digital learning behaviour by describing sequences of actions related to LA, learning outcomes, and student feelings. In this way, through the thematic synthesis, we capture parts of students’ learning process and how it can be affected by LA usage. Finally, we suggest a definition of an additional class of LA, which we introduce as guiding analytics for learners.

Going forward, research on using LA and DBDM is essential to support teachers and school leaders in meeting today’s demands of utilising data, being aware of possible unwanted consequences, and using technology to foster active learners and students’ ownership of learning.

Keywords
Learning Analytics, K-12 education, teaching, learning, data-based decision-making
National Category
Educational Sciences; Computer and Information Sciences
Research subject
Pedagogics and Educational Sciences; Computer and Information Sciences Computer Science
Identifiers
urn:nbn:se:lnu:diva-133161 (URN)
Conference
The European Conference on Educational Research (ECER), Nicosia, Cyprus, 27-30 August, 2024
Note

This is a shorter and preliminary paper based on an accepted paper soon to be published.

Available from: 2024-12-03. Created: 2024-12-03. Last updated: 2025-01-14. Bibliographically approved.
Rundquist, R., Holmberg, K., Rack, J., Mohseni, Z. & Masiello, I. (2024). Use of Learning Analytics in K–12 Mathematics Education: Systematic Scoping Review of the Impact on Teaching and Learning. Journal of Learning Analytics, 11(3), 174-191
2024 (English). In: Journal of Learning Analytics, E-ISSN 1929-7750, Vol. 11, no 3, p. 174-191. Article in journal (Refereed). Published.
Abstract [en]

The generation, use, and analysis of educational data come with many promises and opportunities, especially where digital materials allow the usage of learning analytics (LA) as a tool in data-based decision-making (DBDM). However, there are questions about the interplay between teachers, students, context, and technology. Therefore, this paper presents an exploratory systematic scoping review investigating findings regarding LA usage in digital materials, teaching, and learning in K–12 mathematics education. In all, 3,654 records were identified, of which 19 studies met all the inclusion criteria. Results show that LA research in mathematics education is an emerging field in which applications of LA are used in many contexts across curriculum content and standards of K–12 mathematics education, supporting a wide variety of teacher data use. Teaching with DBDM is mainly focused on supervision and guidance, and LA usage had a generally positive effect on student learning, with high-performing students benefiting most. We highlight a need for further research to develop knowledge of LA usage in classroom practice that considers both teacher and student perspectives in relation to the design and affordances of digital learning systems. Finally, we propose a new class of LA, which we define as guiding analytics for learners, which harnesses the potential of LA for promoting achievement and independent learning.

Keywords
K-12 education, learning analytics, data-based decision-making (DBDM), analytics for learners, teaching, learning, research paper
National Category
Educational Sciences
Research subject
Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-134328 (URN), 10.18608/jla.2024.8299 (DOI), 2-s2.0-85213544970 (Scopus ID)
Available from: 2025-01-08. Created: 2025-01-08. Last updated: 2025-01-14. Bibliographically approved.
Mohseni, Z., Masiello, I., Martins, R. M. & Nordmark, S. (2024). Visual Learning Analytics for Educational Interventions in Primary and Secondary Schools: A Scoping Review. Journal of Learning Analytics, 11(2), 91-111
2024 (English). In: Journal of Learning Analytics, ISSN 1929-7750, Vol. 11, no 2, p. 91-111. Article in journal (Refereed). Published.
Abstract [en]

Visual Learning Analytics (VLA) uses analytics to monitor and assess educational data by combining visual and automated analysis to provide educational explanations. Such tools could aid teachers in primary and secondary schools in making pedagogical decisions; however, the evidence of their effectiveness and benefits is still limited. With this scoping review, we provide a comprehensive overview of related research on proposed VLA methods and identify gaps in the literature that could point to new and helpful directions for the field. This review searched all relevant articles in five accessible databases (Scopus, Web of Science, ERIC, ACM, and IEEE Xplore) using 40 keywords. These studies were mapped, categorized, and summarized based on their objectives, the collected data, the intervention approaches employed, and the results obtained. The results determined what affordances the VLA tools allowed, what kinds of visualizations were used to inform teachers and students, and, more importantly, whether there was positive evidence of educational interventions. We conclude that there are moderate-to-clear learning improvements, within the limits of the studies’ interventions, to support the use of VLA tools. More systematic research is needed to determine whether any learning gains translate into long-term improvements.

Place, publisher, year, edition, pages
Society for Learning Analytics Research (SoLAR), 2024
Keywords
Visual learning analytics, Learning analytics dashboard, Educational interventions, Primary school, Secondary school, Scoping review, Systematic review
National Category
Computer and Information Sciences; Pedagogy
Research subject
Computer and Information Sciences Computer Science
Identifiers
urn:nbn:se:lnu:diva-131233 (URN), 10.18608/jla.2024.8309 (DOI), 001295934400006, 2-s2.0-85202576381 (Scopus ID)
Available from: 2024-07-01. Created: 2024-07-01. Last updated: 2024-09-12. Bibliographically approved.
Mohseni, Z., Masiello, I. & Martins, R. M. (2023). Co-Developing an Easy-to-Use Learning Analytics Dashboard for Teachers in Primary/Secondary Education: A Human-Centered Design Approach. Education Sciences, 13(12), Article ID 1190.
2023 (English). In: Education Sciences, E-ISSN 2227-7102, Vol. 13, no 12, article id 1190. Article in journal (Refereed). Published.
Abstract [en]

Learning Analytics Dashboards (LADs) can help provide insights and inform pedagogical decisions by supporting the analysis of large amounts of educational data obtained from sources such as Digital Learning Materials (DLMs). Extracting requirements is a crucial step in developing a LAD, as it helps identify the underlying design problem that needs to be addressed; in fact, determining the problem that requires a solution is one of the primary objectives of requirements extraction. Although there have been studies on the development of LADs for K12 education, these studies have not specifically emphasized the use of a Human-Centered Design (HCD) approach to better comprehend the teachers’ requirements and produce more stimulating insights. In this paper we apply prototyping, which is widely acknowledged as a successful way to rapidly implement cost-effective designs and efficiently gather stakeholder feedback, to elicit such requirements. We present a three-step HCD approach, involving a design cycle that employs paper and interactive prototypes, to guide the systematic and effective design of LADs that truly meet teacher requirements in primary/secondary education, actively engaging teachers in the design process. We then conducted interviews and usability testing to co-design and develop a LAD that can be used in everyday classroom learning activities. Our results show that the visualizations of the interactive prototype were easily interpreted by the participants, verifying our initial goal of co-developing an easy-to-use LAD.

Place, publisher, year, edition, pages
MDPI, 2023
Keywords
learning analytics dashboard; human-centered design; paper prototype; interactive prototype; usability test; K12; educational data
National Category
Computer Systems
Research subject
Computer Science, Information and software visualization
Identifiers
urn:nbn:se:lnu:diva-125814 (URN), 10.3390/educsci13121190 (DOI), 001130760800001, 2-s2.0-85180651196 (Scopus ID)
Available from: 2023-11-29. Created: 2023-11-29. Last updated: 2024-08-20. Bibliographically approved.
Nilsson, T., Masiello, I., Broberger, E. & Lindström, V. (2023). Digital feedback during clinical education in the emergency medical services: a qualitative study. BMC Medical Education, 23(1), Article ID 156.
2023 (English). In: BMC Medical Education, E-ISSN 1472-6920, Vol. 23, no 1, article id 156. Article in journal (Refereed). Published.
Abstract [en]

Background Clinical education is essential for students’ progress towards becoming registered nurses (RNs) in Sweden. Assessment of caring skills in the Emergency Medical Services (EMS) is complex due to the ever-changing scenarios and the fact that multiple supervisors are involved in a student’s education. Currently, assessments of students’ skills are summative and occur twice during the six weeks of clinical education. A digitalized assessment tool (DAT) with an adaptation for formative assessment is a new approach to the assessment of nursing skills in the EMS. Since new technologies and changes in procedures are likely to affect both students and supervisors, our aim in this study is to describe students’ and clinical supervisors’ experience of formative assessments using the DAT in the EMS.

Method This study is qualitative, using semi-structured group interviews (N = 2) with students and semi-structured individual telephone interviews (N = 13) with supervisors. The data was analysed according to Graneheim and Lundman’s method for content analysis. This analysis generated 221 codes organized into 10 categories, within which three themes were identified. The students in this study were nursing students in their last semester, and all supervisors were experienced RNs.

Results The results showed that students and supervisors had mainly positive views of the DAT and the formative assessment, stating that the information they provided while using the DAT offered opportunities for reflection. The DAT supported the students’ learning by visualizing strengths and areas of improvement, as well as displaying progress using a Likert scale. The application improved communication, but additional features linking the assessment tool with the university were requested. The application contributed to transparency in the assessments and was seen as preferable to the traditional ‘pen and paper’ method.

Conclusion The digital system was described in a positive manner, and assessment using the DAT facilitated reflection and formative assessment. The use of a Likert scale was considered positive for demonstrating progression, which could advantageously be shown visually.

Place, publisher, year, edition, pages
BioMed Central (BMC), 2023
National Category
Other Medical Sciences; Computer and Information Sciences
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-120355 (URN), 10.1186/s12909-023-04138-y (DOI), 000949354600001, 36918851 (PubMedID), 2-s2.0-85150239379 (Scopus ID)
Available from: 2023-04-20. Created: 2023-04-20. Last updated: 2023-05-26. Bibliographically approved.
Masiello, I., Fixsen, D. L., Nordmark, S., Mohseni, Z., Holmberg, K., Rack, J., . . . Augustsson, H. (2023). Digital transformation in schools of two southern regions of Sweden through implementation-informed approach: A mixed-methods study protocol. PLOS ONE, 18(12), Article ID e0296000.
2023 (English). In: PLOS ONE, E-ISSN 1932-6203, Vol. 18, no 12, article id e0296000. Article in journal (Refereed). Published.
Abstract [en]

Background: The enhancement of, or even a shift from, traditional teaching and learning processes to corresponding digital practices has been occurring rapidly during the last two decades. The evidence of this ongoing change is still modest or even weak. However, the adaptation of implementation science in educational settings, a research approach which arose in the healthcare field, offers promising results for systematic and sustained improvements in schools. The aim of this study is to understand how the systematic professional development of teachers and school principals (the intervention) to use digital learning materials and learning analytics dashboards (the innovations) could allow for innovative and lasting impacts in terms of a sustained implementation strategy, improved teaching practices and student outcomes, as well as evidence-based design of digital learning materials and learning analytics dashboards.

Methods: This longitudinal study uses a quasi-experimental cluster design with schools as the unit. The researchers will gradually enroll 145 experimental schools in the study. In the experimental schools, the research team will form a School Team, consisting of teachers/learning technologists, school principals, and researchers, to support teachers’ use of the innovations, with student achievement as the dependent variable. For the experimental schools, the intervention is based on the four longitudinal stages comprising the Active Implementation Frameworks. With an anticipated sample of about 13,000 students in grades 1–9, student outcome data will be analyzed using hierarchical linear models.
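The protocol specifies hierarchical linear models with schools as the clustering unit. The model below is only a hedged sketch of such a two-level analysis; the outcome, predictors, grouping variable, and data file (achievement, condition, grade, school_id, student_outcomes.csv) are assumptions rather than the protocol's specified model.

```python
# Hedged sketch of a two-level hierarchical (mixed-effects) model with students
# nested in schools. Variable and file names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_outcomes.csv")  # hypothetical dataset
model = smf.mixedlm(
    "achievement ~ condition + grade",    # fixed effects: intervention arm and grade level
    data=students,
    groups=students["school_id"],         # random intercept for each school
).fit()
print(model.summary())
```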

Discussion: The project seeks to address a pronounced need for favorable conditions for children’s learning, supported by a specific implementation framework targeting teachers, and to contribute knowledge about the promotion of improved teaching practices and student outcomes. The project will build capacity for the implementation of educational technology in Swedish educational settings.

Place, publisher, year, edition, pages
Public Library of Science (PLoS), 2023
National Category
Educational Sciences; Computer and Information Sciences; Pedagogy; Engineering and Technology
Research subject
Computer and Information Sciences Computer Science; Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-126167 (URN), 10.1371/journal.pone.0296000 (DOI), 001130378600009, 2-s2.0-85180364012 (Scopus ID)
Available from: 2024-01-01. Created: 2024-01-01. Last updated: 2024-09-05. Bibliographically approved.