Latest publications
-
Simulation-based Active Learning for Systematic Reviews: A Scoping Review of Literature
Authors: J. J. Teijema, S. Seuren, D. Anadria, A. Bagheri, R. van de Schoot
Journal of Information Science • 2025/12/17
Background: Active learning is a proposed method for accelerating the screening phase of systematic reviews. While extensively studied, evidence remains scattered across a fragmented body of literature.
Objective: This scoping review investigates whether active learning is recommended for systematic review screening and identifies areas needing further research.
Design: We screened 1887 records published since 2006 using ASReview, an active learning tool, and included 60 relevant studies. We also analyzed 238 of 336 collected datasets for study design, dataset usage, and implementation.
Results: All 60 studies recommended active learning as a means to improve screening efficiency. Despite some methodological heterogeneity, consistent endorsement was found across the literature.
Conclusions: Active learning shows strong potential to support systematic review screening. Standardizing evaluation metrics, encouraging open data practices, and diversifying model configurations are key priorities for advancing this field.
-
Echo State and Band-pass Networks with aqueous memristors: leaky reservoir computing with a leaky substrate
Authors: T. M. Kamsma, J. J. Teijema, R. van Roij, C. Spitoni
Chaos: An Interdisciplinary Journal of Nonlinear Science • 2025/9/12
Recurrent Neural Networks (RNNs) are extensively employed for processing sequential data such as time series. Reservoir computing (RC) has drawn attention as an RNN framework because its fixed network does not require training, making it attractive for hardware-based machine learning. We establish an explicit correspondence between the well-established mathematical RC implementations of Echo State Networks and Band-pass Networks with Leaky Integrator nodes on the one hand, and a physical circuit containing iontronic simple volatile memristors on the other. These aqueous iontronic devices employ ion transport through water as the signal carrier and feature a voltage-dependent (memory) conductance. The activation function and the dynamics of the Leaky Integrator nodes naturally materialise as the (dynamic) conductance properties of iontronic memristors, while a simple fixed local current-to-voltage update rule at the memristor terminals provides the relevant matrix coupling between nodes. We process various time series, including pressure data from simulated airways during breathing, which can be fed directly into the network owing to the intrinsic responsiveness of iontronic devices to applied pressure. Throughout, the internal dynamics of the circuit follow established physical equations of motion for iontronic memristors.
-
Large-scale simulation study of active learning models for systematic reviews
Authors: J. J. Teijema, J. de Bruin, A. Bagheri, R. van de Schoot
International Journal of Data Science and Analytics • 2025/5/2
Despite progress in active learning, evaluation remains limited by constraints in simulation size, infrastructure, and dataset availability. This study advocates for large-scale simulations as the gold standard for evaluating active learning models in systematic review screening. Two large-scale simulations, totaling over 29 thousand runs, assessed active learning solutions. The first evaluated 13 combinations of classification models and feature extraction techniques using high-quality datasets from the SYNERGY dataset; the second expanded this to 92 model combinations with additional classifiers and feature extractors. In every scenario tested, active learning outperformed random screening, offering significant efficiency gains; the performance gain varied across datasets, models, and screening progression, ranging from considerable to near-flawless results. Since model performance differs, active learning systems should remain adaptable to accommodate new classifiers and feature extraction techniques. The publicly available results underscore the importance of open benchmarking for reproducibility and for the development of robust, generalizable active learning strategies.
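For readers unfamiliar with the screening loop these simulations evaluate, here is a minimal toy sketch in plain Python. The data, the word-count scoring rule, and all names are hypothetical stand-ins, not the classifiers or datasets from the study; the only point illustrated is the loop itself: rank unlabeled records with the current model, read the top-ranked record, update the model with its label, and repeat.

```python
import random

# Hypothetical toy corpus: each record is (tokens, is_relevant).
# Roughly 10% relevant, mimicking the class imbalance of review datasets.
random.seed(0)
records = []
for i in range(200):
    rel = i % 10 == 0
    tokens = (["screening", "review"] if rel else ["misc"]) + ["paper"]
    records.append((tokens, rel))
random.shuffle(records)

def score(tokens, pos_counts, neg_counts):
    """Naive word-frequency score: higher means more likely relevant."""
    return sum(pos_counts.get(t, 0) - neg_counts.get(t, 0) for t in tokens)

def simulate(records, n_screen=60):
    """Active-learning screening loop: always read the top-scored
    unlabeled record, update word counts with the revealed label."""
    pos, neg = {}, {}
    unlabeled = list(range(len(records)))
    found = 0
    for _ in range(n_screen):
        # rank unlabeled records by the current model's score
        best = max(unlabeled, key=lambda i: score(records[i][0], pos, neg))
        unlabeled.remove(best)
        tokens, rel = records[best]
        counts = pos if rel else neg
        for t in tokens:
            counts[t] = counts.get(t, 0) + 1
        found += rel
    return found

total_relevant = sum(rel for _, rel in records)
print(simulate(records), "of", total_relevant, "relevant records found after screening 60")
```

In this toy setup the model surfaces the relevant records long before a random reader would, which is the efficiency gain the simulations quantify at scale across real datasets and model combinations.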
Skills
Python
Programming for data analysis, automation, and research tooling.
Machine Learning
Applied ML methods for text classification, active learning, and explainable AI.
Data Science
Statistical analysis, simulation workflows, and research data pipelines.
Open Science Standards
FAIR data principles, reproducibility, open-source development.
NLP & Systematic Review Automation
Development of tools and workflows to accelerate literature screening.