Status: open / Type of Thesis: Bachelor Thesis, Master Thesis / Location: Leipzig
Despite the increasing adoption of Explainable AI (XAI) in Learning Analytics (LA), many
Explainable Learning Analytics (XLA) systems used to support competence-based
assessment (CBA) frameworks in education remain non-interactive. While these systems
can quantify learner performance across competencies, existing dashboards/platforms
tend to present static visualizations and summary-level metrics, offering little room for
learner interaction, scenario exploration, or personalized insight extraction. Without
interactivity, explanations remain surface-level and fail to accommodate individual
learning paths and cognitive diversity. Therefore, there is a critical need for a new
generation of interactive, explainable learning analytics tools that provide real-time,
learner-specific justifications for the analytics; allow users to query, explore, and reflect on
their assessment data; support personalized pathways and interventions based on
transparent insights; and uphold fairness, trust, and accountability in the evaluation
process. Addressing this gap is essential for aligning learning analytics with the core
values of competence-based education: clarity, mastery, feedback, and learner
autonomy.
This will be an implementation-focused study, blending system engineering, human-
computer interaction, educational theory, and responsible AI principles.
The objective is to develop and evaluate an Interactive Explainable Learning Analytics
(InXLA) platform that enhances transparency, engagement, personalization, and
actionable feedback in a digital competence-based assessment environment.