Publications (10 of 13)
Kalmendal, A. (2025). Development of a Swedish classroom screening tool for grammatical comprehension (LegiLexi GRA) in young school-age children. Acta Logopaedica, 2, 36-57.
Development of a Swedish classroom screening tool for grammatical comprehension (LegiLexi GRA) in young school-age children
2025 (English) In: Acta Logopaedica, E-ISSN 2004-9048, Vol. 2, p. 36-57. Article in journal (Refereed) Published
Abstract [en]

Background: Early identification of grammatical comprehension difficulties is critical for supporting language and reading development and academic achievement in young learners. There is a lack of robust whole-class screening tools for grammatical listening comprehension, particularly in linguistically diverse contexts such as Swedish primary education.

Objective/Aim: This study aimed to develop and evaluate the psychometric properties of the LegiLexi GRA-10, a brief classroom screening tool designed to detect challenges in grammatical comprehension among first-year students in Swedish compulsory school, including monolingual and multilingual students with Swedish as their first language (L1) and learners of Swedish as a second language (L2).

Material and Methods: An initial 16-item version of the test was administered to a large sample of first-year students (N=8245). Item Response Theory (IRT) analyses were employed to select the ten best performing items to form the GRA-10. The tool’s discriminatory power and reliability were evaluated across both L1 and L2 student populations.
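
The item-selection step can be illustrated with a simplified sketch. Note the hedges: the data below are simulated, and item quality is approximated by the item-rest (point-biserial) correlation rather than a fitted IRT model, so this is only a rough stand-in for the study's actual procedure; all variable names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated dichotomous responses: 200 students x 16 items
# (an illustrative stand-in for the real N=8245 sample).
ability = rng.normal(size=(200, 1))
difficulty = rng.normal(size=(1, 16))
p_correct = 1 / (1 + np.exp(-(ability - difficulty)))   # Rasch-like response model
responses = (rng.random((200, 16)) < p_correct).astype(int)

def item_rest_correlations(X):
    """Correlate each item with the summed score of the remaining items."""
    rest = X.sum(axis=1, keepdims=True) - X   # exclude the item from its own criterion
    return np.array([np.corrcoef(X[:, j], rest[:, j])[0, 1] for j in range(X.shape[1])])

disc = item_rest_correlations(responses)
best10 = np.argsort(disc)[::-1][:10]          # the ten most discriminating items
print(sorted(best10.tolist()))
```

In the study itself, IRT modelling additionally yields item difficulty and discrimination parameters on a common latent scale, which a simple correlation cannot provide.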

Results: The psychometric evaluation demonstrated that the GRA-10 reliably detects students with grammatical comprehension challenges, using a cut-off score of eight points or lower. The tool exhibited strong internal consistency and high discriminatory power across both L1 and L2 groups. These results indicate that GRA-10 is effective in identifying students who may benefit from early targeted language support.

Conclusion: The LegiLexi GRA-10 offers a valid, reliable, and efficient method for classroom-based screening of grammatical comprehension in young Swedish learners. Its strong psychometric properties across diverse language backgrounds support its use as an early identification tool, facilitating timely and targeted educational support. The GRA-10 represents a valuable addition to the resources available for promoting language development and academic success in the early years of schooling.  

Abstract [sv]

Background: Early identification of difficulties with grammatical comprehension is important for supporting language and reading development as well as academic success early in compulsory school. There is a lack of robust tools for whole-class screening of grammatical comprehension, not least in linguistically heterogeneous contexts such as the Swedish compulsory school.

Aim: This study aimed to develop and evaluate the psychometric properties of LegiLexi GRA-10, a brief screening tool administered to whole classes. GRA-10 is designed to identify students with challenges in grammatical comprehension in the first year of Swedish compulsory school, both among monolingual and multilingual students following the curriculum for Swedish (L1) and students following the curriculum for Swedish as a second language (L2).

Material and Methods: An initial version of the test with 16 items was administered to a large sample of students in the preschool class (N=8245). Item Response Theory (IRT) analyses were used to select the ten best-performing items for GRA-10. The tool's discriminatory power and reliability were evaluated for both L1 and L2 students.

Results: The psychometric evaluation showed that GRA-10 reliably identifies students with difficulties in grammatical comprehension, using a score of eight points or lower. The tool exhibited strong internal consistency and high discriminatory power in both the L1 and L2 groups. The results indicate that GRA-10 is effective at identifying students who may benefit from early, targeted language support.

Conclusion: LegiLexi GRA-10 offers a reliable and efficient method for classroom-based screening of grammatical comprehension among students in the first years of Swedish compulsory school. Its strong psychometric properties across students with different language backgrounds support its use as a tool for early identification, enabling timely and targeted educational support. GRA-10 is a valuable addition to the resources available for promoting language and reading development as well as academic success in the early school years.

Place, publisher, year, edition, pages
Stockholm: Karolinska Institutet, Institutionen för Klinisk vetenskap, intervention och teknik, 2025
Keywords
Grammatical comprehension, test development, early literacy screening
National Category
Educational Sciences
Research subject
Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-138468 (URN), 10.58986/al.2025.34243 (DOI)
Available from: 2025-05-12 Created: 2025-05-12 Last updated: 2025-05-12
Kalmendal, A. (2025). Evidence in education: How metascience can improve the quality of evidence syntheses in educational psychology. (Doctoral dissertation). Växjö: Linnaeus University Press.
Evidence in education: How metascience can improve the quality of evidence syntheses in educational psychology
2025 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

This dissertation investigates how metascientific approaches can enhance the quality and reliability of evidence syntheses in educational psychology. Prompted by the replication crisis, widespread questionable research practices, and the growing dependence on systematic reviews and meta-analyses in education, this work critically examines current research standards and advances innovative solutions rooted in open science.

Study I evaluates the methodological validity and reproducibility of the influential research synthesis Visible Learning by John Hattie. The study reveals several methodological flaws that call the findings into question, and the failure to reproduce the reported statistics serves as a cautionary example of the replication crisis.

Study II evaluates the risk of bias and transparency of systematic reviews conducted in educational psychology. Alarmingly, most of the included systematic reviews were judged to be at high risk of bias, and across the entire sample there was a lack of data sharing, preregistered protocols, and reproducible primary research data.

Study III is a proof of concept of a registered report in educational psychology; it investigates the evidence for a writing intervention by conducting a systematic review. By adhering to state-of-the-art standards for conducting systematic reviews, the protocol covers all aspects needed to produce reliable evidence while remaining reproducible.

In Study IV, an innovative open-source Community-Augmented Meta-Analysis platform combined with a database is developed. The study presents solutions to several well-known problems in systematic reviews by allowing the research community to update, store, calculate, and share educational intervention data in a convenient way.

The findings of the included studies highlight significant gaps in research rigor and transparency, underscoring the necessity of fundamental change to adhere to current standards and modern research practices. 

By incorporating methodological tools such as preregistration, open science, risk of bias assessments and FAIR data principles, this dissertation calls for a paradigm shift in the synthesis and application of evidence in educational psychology. Ultimately, it seeks to promote more trustworthy, transparent, and impactful research to better inform educational policy and practice.

Place, publisher, year, edition, pages
Växjö: Linnaeus University Press, 2025
Series
Linnaeus University Dissertations ; 574/2025
Keywords
Metascience, Educational psychology, Open science, Methods, Statistics
National Category
Educational Sciences Psychology
Research subject
Social Sciences, Psychology; Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-138467 (URN), 10.15626/LUD.574.2025 (DOI), 978-91-8082-306-7 (ISBN), 978-91-8082-307-4 (ISBN)
Public defence
2025-06-13, Newton, Vejdes plats 6, 352 52 Växjö, 10:00 (English)
Funder
Swedish Research Council, 2020-03430
Available from: 2025-05-13 Created: 2025-05-12 Last updated: 2025-05-13. Bibliographically approved
Carlsson, R., Batinovic, L., Hyltse, N., Kalmendal, A., Nordström, T. & Topor, M. (2024). A Beginner's Guide to Open and Reproducible Systematic Reviews in Psychology. Collabra: Psychology, 10(1), Article ID 126218.
A Beginner's Guide to Open and Reproducible Systematic Reviews in Psychology
2024 (English) In: Collabra: Psychology, E-ISSN 2474-7394, Vol. 10, no 1, article id 126218. Article in journal (Refereed) Published
Abstract [en]

This paper provides guidance and tools for conducting open and reproducible systematic reviews in psychology. It emphasizes the importance of systematic reviews for evidence-based decision-making and the growing adoption of open science practices. Open science enhances transparency and reproducibility and minimizes bias in systematic reviews through the sharing of data, materials, and code. It also fosters collaboration and enables the involvement of non-academic stakeholders. The paper is designed for beginners, offering accessible guidance for navigating the many standards and resources that may not obviously align with specific areas of psychology. It covers systematic review conduct standards, preregistration, registered reports, reporting standards, and open data, materials, and code. The paper concludes with a glimpse of recent innovations such as Community-Augmented Meta-Analysis and independent reproducibility checks.

Place, publisher, year, edition, pages
University of California Press, 2024
Keywords
systematic review, open science, reproducibility, guide
National Category
Psychology
Research subject
Social Sciences, Psychology
Identifiers
urn:nbn:se:lnu:diva-134351 (URN), 10.1525/collabra.126218 (DOI), 001379350300001 (), 2-s2.0-85213028550 (Scopus ID)
Available from: 2025-01-09 Created: 2025-01-09 Last updated: 2025-04-30. Bibliographically approved
Batinovic, L. & Kalmendal, A. (2024). Community Augmented Meta-Analysis of Evidence in Learning and Didactics. Poster presented at META-REP 2024, October 28-31, München.
Community Augmented Meta-Analysis of Evidence in Learning and Didactics
2024 (English) In: Poster presented at META-REP 2024, October 28-31, München, 2024. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

Community-augmented meta-analysis (CAMA) platforms pioneer a new standard for promoting FAIR (findable, accessible, interoperable, reusable) data sharing and allow dynamic, interactive meta-analyses that ensure the reproducibility of results (Tsuji et al., 2014). As the area of (special) education research moves towards open science practices, our CAMA platform sets out to facilitate data sharing for meta-analyses and make evidence-based practice accessible to practitioners. Furthermore, we aim to promote high-quality standards for conducting evidence synthesis, which are still not readily implemented in education research (Nordstrom et al., 2023). Our platform provides Bayesian and frequentist meta-analytic methods, an interactive interface for conducting and visualizing the analyses, easy-to-understand results, and a large database of extracted effects that can be downloaded and reused by researchers. Users can conduct various moderator analyses based on demographic information, risk of bias assessment, or study characteristics, and contribute to the database by submitting new extracted effects.
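
The frequentist core of such a platform can be sketched in a few lines. The DerSimonian-Laird random-effects estimator below is a standard textbook choice; the effect sizes and variances are invented for illustration, and the function is not the platform's actual code.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1 / v                                   # inverse-variance (fixed-effect) weights
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)            # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)     # between-study variance estimate
    w_star = 1 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical extracted effects (standardized mean differences) and variances
g = [0.10, 0.65, 0.05, 0.80]
v = [0.02, 0.04, 0.03, 0.05]
pooled, se, tau2 = random_effects_meta(g, v)
print(f"pooled effect = {pooled:.2f} (SE {se:.2f}), tau^2 = {tau2:.3f}")
```

A Bayesian counterpart would place priors on the pooled effect and the between-study variance; moderator analyses extend the same machinery with study-level covariates (meta-regression).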

Keywords
Meta-Analysis, FAIR, CAMA, Open Science, Education
National Category
Psychology
Research subject
Social Sciences, Psychology; Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-133241 (URN)
Conference
META-REP 2024, October 28-31, München
Available from: 2024-11-06 Created: 2024-11-06 Last updated: 2024-12-20. Bibliographically approved
Bratt, A. S. & Kalmendal, A. (2024). eHealth interventions can potentially provide cost-effective and accessible support for stress-related problems in healthcare professionals. Evidence-Based Nursing, 27(1), Article ID e103718.
eHealth interventions can potentially provide cost-effective and accessible support for stress-related problems in healthcare professionals
2024 (English) In: Evidence-Based Nursing, ISSN 1367-6539, E-ISSN 1468-9618, Vol. 27, no 1, article id e103718. Article in journal, Editorial material (Other academic) Published
Abstract [en]

Commentary on: López-Del-Hoyo Y, Fernández-Martínez S, Pérez-Aranda A, Barceló-Soler A, Bani M, Russo S, Urcola-Pardo F, Strepparava MG, García-Campayo J. Effects of eHealth interventions on stress reduction and mental health promotion in healthcare professionals: A systematic review. J Clin Nurs. 2023 Jan 26. doi: 10.1111/jocn.16634. Epub ahead of print.

Place, publisher, year, edition, pages
BMJ Publishing Group Ltd, 2024
National Category
Nursing
Research subject
Health and Caring Sciences, Caring Science
Identifiers
urn:nbn:se:lnu:diva-123400 (URN), 10.1136/ebnurs-2023-103718 (DOI), 2-s2.0-85166429233 (Scopus ID)
Available from: 2023-07-31 Created: 2023-07-31 Last updated: 2025-02-03. Bibliographically approved
Nordström, T., Kalmendal, A., Lucija, B., Däldborg, P., Dahl, H., Hyltse, N. & Carlsson, R. (2024). FAIR-data adherence in systematic reviews: A closer (meta) view of educational science. Paper presented at Open Science Community Sweden Conference 2024, Växjö, Sweden, 24/9-26/9 2024.
FAIR-data adherence in systematic reviews: A closer (meta) view of educational science
2024 (English) In: Presented at Open Science Community Sweden Conference 2024, Växjö, Sweden, 24/9-26/9 2024, 2024. Conference paper, Oral presentation only (Refereed)
Abstract [en]

This systematic living meta-review investigated the presence of open science practices such as FAIR data adherence (findability, accessibility, interoperability, and reusability), preregistration of protocols, and reproducibility (e.g., of searches) in educational systematic reviews published during the years 2019-2023. Additionally, we assessed the systematic reviews for adherence to methodological standards (Methodological Expectations of Campbell Collaboration Intervention Reviews; Wang et al., 2021), using the tool ROBIS, Risk of Bias in Systematic Reviews (Whiting et al., 2016).

Results show that very few systematic reviews were assessed as low risk of bias, had preregistered a protocol, could be reproduced (e.g., the searches and exactly where the primary data were collected), or organized their data according to FAIR, although some reviews did adhere to the (much needed) rigorous standards and shared data that can be reused. This presentation will address how the community can improve reproducibility practices.

Keywords
FAIR-data, Open science, Reproducibility, Systematic reviews, Risk of bias
National Category
Educational Sciences
Research subject
Pedagogics and Educational Sciences, Education
Identifiers
urn:nbn:se:lnu:diva-132772 (URN)
Conference
Open Science Community Sweden Conference 2024, Växjö, Sweden, 24/9-26/9 2024
Available from: 2024-09-25 Created: 2024-09-25 Last updated: 2025-04-30. Bibliographically approved
Kalmendal, A., Henriksson, I., Nordström, T. & Carlsson, R. (2024). Protocol: Strategy instruction for improving short‐ and long‐term writing performance on secondary and upper‐secondary students: A systematic review. Campbell Systematic Reviews, 20(2), Article ID e1389.
Protocol: Strategy instruction for improving short‐ and long‐term writing performance on secondary and upper‐secondary students: A systematic review
2024 (English) In: Campbell Systematic Reviews, E-ISSN 1891-1803, Vol. 20, no 2, article id e1389. Article in journal (Refereed) Published
Abstract [en]

This is the protocol for a Campbell systematic review. The objectives are as follows. This review aims to investigate the effectiveness of all types of teacher-delivered, classroom-based strategy instruction for students aged 12–19 in the general population (all students), including struggling students (with or at risk of academic difficulties), in increasing writing performance. The majority of previous reviews scoped all outcomes presented in the primary studies; this review will focus solely on the three most common outcomes: story quality, story elements, and word count/length.

Place, publisher, year, edition, pages
John Wiley & Sons, 2024
Keywords
Systematic review, Strategy instruction, Writing, Education
National Category
Educational Sciences Applied Psychology
Research subject
Social Sciences, Psychology; Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-128162 (URN), 10.1002/cl2.1389 (DOI), 001178099300001 (), 2-s2.0-85186906708 (Scopus ID)
Available from: 2024-03-07 Created: 2024-03-07 Last updated: 2025-05-12. Bibliographically approved
Nordström, T., Kalmendal, A., Batinovic, L., Däldborg, P., Dahl, H., Hyltse, N. & Carlsson, R. (2024). Reproducibility, Open Science, FAIR and Risk Of Bias in Educational Systematic Reviews: A systematic living meta-review of research standards. Poster presented at META-REP 2024, the conference on meta-science and replicability in the social, behavioural, and cognitive sciences, Munich, Germany, October 28-31, 2024.
Reproducibility, Open Science, FAIR and Risk Of Bias in Educational Systematic Reviews: A systematic living meta-review of research standards
2024 (English) In: META-REP 2024, the conference on meta-science and replicability in the social, behavioural, and cognitive sciences, October 28 to 31, 2024 in Munich, Germany, 2024. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

This systematic living meta-review investigated the presence of open science practices (e.g., preregistration of protocols), reproducibility (e.g., describing searches and where exactly primary data is collected), and FAIR data adherence (findability, accessibility, interoperability, and reusability) in educational systematic reviews through the years 2019-2023. Additionally, we assessed the systematic reviews for adherence to MECCIR standards (Methodological Expectations of Campbell Collaboration Intervention Reviews; Wang et al., 2021), using the tool ROBIS (Risk of Bias in Systematic Reviews; Whiting et al., 2016).

Results show that very few systematic reviews could be assessed as low risk of bias, had preregistered a protocol, or were possible to reproduce, and very few organized their data according to FAIR. There were, however, some reviews that adhered to the (much needed) rigorous standards and shared data that can be reused. This poster will address these issues further and discuss how the community can improve reproducibility practices.

Keywords
Meta-review, systematic review, risk of bias, FAIR, open science
National Category
Psychology Educational Sciences
Research subject
Social Sciences, Psychology
Identifiers
urn:nbn:se:lnu:diva-133228 (URN)
Conference
META-REP 2024, the conference on meta-science and replicability in the social, behavioural, and cognitive sciences. Munich, Germany, October 28 - 31, 2024
Available from: 2024-11-04 Created: 2024-11-04 Last updated: 2025-04-30. Bibliographically approved
Nordström, T., Kalmendal, A. & Batinovic, L. (2023). Risk of bias and open science practices in systematic reviews of educational effectiveness: A meta-review. Review of Education, 11(3), Article ID e3443.
Risk of bias and open science practices in systematic reviews of educational effectiveness: A meta-review
2023 (English) In: Review of Education, E-ISSN 2049-6613, Vol. 11, no 3, article id e3443. Article in journal (Refereed) Published
Abstract [en]

To produce the most reliable syntheses of the effectiveness of educational interventions, systematic reviews need to adhere to rigorous methodological standards. This meta-review investigated the risk of bias arising while conducting a systematic review, and the presence of open science practices such as data sharing and reproducibility of the review procedure, in recently published reviews in education. We included all systematic reviews of educational interventions, instructions, and methods for all K-12 student populations in any school form, with experimental or quasi-experimental designs (an active manipulation of the intervention) with comparisons, and where the outcome variables were academic performance of any kind. We searched the database Education Resources Information Center (ERIC) for the years 2019–2021. In parallel, we hand-searched four major educational review journals for systematic reviews: Educational Research Review (Elsevier), Educational Review (Taylor & Francis), Review of Education (Wiley), and Review of Educational Research (AERA). Systematic reviews were assessed with the risk of bias tool ROBIS, and for whether the studies had preregistered protocols, had shared primary research data, and whether a third party could reproduce the search strings and the details of where exactly primary research data were extracted. A total of 88 studies matching our PICOS were included in this review; of these, 10 educational systematic reviews (approximately 11%) were judged as low risk of bias. The rest were classified as high risk of bias during a shortened ROBIS assessment, or as high or unclear risk of bias following a full ROBIS assessment. Of the 10 low risk of bias reviews, 6 had detailed their search sufficiently for a third party to reproduce it, and 3 shared the data from primary studies; however, none had specified how and from where exactly data from primary studies were extracted.
The study shows that at least a small proportion of systematic reviews in education has a low risk of bias, but most systematic reviews in our sample have a high risk of bias in their methodological procedure. Improvements in this field are still to be expected, as even the low risk of bias reviews are not consistent regarding preregistered protocols, data sharing, reproducibility of primary research data, and reproducible search strings.

Place, publisher, year, edition, pages
John Wiley & Sons, 2023
Keywords
education, meta-review, open science, reproducibility, risk of bias, systematic review
National Category
Educational Sciences
Research subject
Pedagogics and Educational Sciences; Pedagogics and Educational Sciences, Education
Identifiers
urn:nbn:se:lnu:diva-125850 (URN), 10.1002/rev3.3443 (DOI), 001135375900005 (), 2-s2.0-85178214449 (Scopus ID)
Funder
Swedish Research Council, 2020-03430
Available from: 2023-12-01 Created: 2023-12-01 Last updated: 2025-05-12. Bibliographically approved
Kalmendal, A., Carlsson, R. & Nordström, T. (2023). Visible learning, best practice or boondoggle?: Challenges in assessing a meta-meta-analysis. Paper presented at the 2023 Unconference on Open Scholarship Practices in Education Research, Centre for Open Science, Charlottesville, United States of America.
Visible learning, best practice or boondoggle?: Challenges in assessing a meta-meta-analysis
2023 (English) In: Presented at Unconference on Open Scholarship Practices in Education Research, Centre for Open Science, Charlottesville, United States of America, 2023. Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

In 2009, John Hattie released the meta-meta-review Visible Learning, which summarized 800 meta-analyses into 138 possible influences on student achievement. The influences were all re-coded to a standard metric (Cohen's d) and ranked by effect size, ranging from negative effects (e.g., retention) and negligible effects (e.g., student personality) to strong influences on student achievement (e.g., Response to Intervention). To this day, the general criticism has focused on discovering examples of flaws in Hattie's approach, which proponents of Visible Learning have dismissed as cherry-picking. The purpose of this project is to conduct a rigorous, systematic assessment of the presented material. This talk will go through the syntheses made in Visible Learning and how the quality assessment of the material is done. For example, previous research indicates that several influences combine meta-analyses that do not share similar populations, interventions, comparison groups, outcomes, and study types (PICOS). The talk will also contain a practical demonstration of the codesheet and the coding of the influences. The approach provides resources for conducting or assessing any type of meta-review.
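
The re-coding to a standard metric described above amounts to expressing every effect as a standardized mean difference and then ranking the results; a minimal sketch with invented summary statistics (the influence names and numbers are hypothetical, not Hattie's data):

```python
import math

def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
                          / (n_treat + n_ctrl - 2))
    return (mean_treat - mean_ctrl) / pooled_sd

# Hypothetical influences with made-up group means, SDs, and sample sizes
influences = {
    "influence_A": cohens_d(105, 100, 15, 15, 60, 60),   # small positive effect
    "influence_B": cohens_d(98, 100, 15, 15, 60, 60),    # negative effect
    "influence_C": cohens_d(112, 100, 15, 15, 60, 60),   # strong effect
}
ranked = sorted(influences, key=influences.get, reverse=True)
print(ranked)  # strongest influence first
```

Ranking such re-coded effects is only meaningful when the underlying meta-analyses share comparable populations, interventions, comparisons, outcomes, and study types (PICOS), which is precisely the assumption the talk questions.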

Keywords
Meta-science, open science, educational research
National Category
Educational Sciences Psychology
Research subject
Social Sciences, Psychology; Pedagogics and Educational Sciences
Identifiers
urn:nbn:se:lnu:diva-119738 (URN)
Conference
2023 Unconference on Open Scholarship Practices in Education Research, Centre for Open Science, Charlottesville, United States of America
Available from: 2023-03-14 Created: 2023-03-14 Last updated: 2023-03-16. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-2871-9693