
How I Build Scorable xAPI Interactives for SLS Using AI — A Step-by-Step Workflow

Published: March 8, 2026 | Singapore

Creating rich, scored interactives for Singapore's Student Learning Space (SLS) traditionally required significant programming expertise and time. Today, with the help of AI-assisted tools developed by the Singapore educator community, teachers can generate fully functioning HTML5 interactives that report scores through xAPI in just a few hours — even with minimal coding experience.

This article documents a practical workflow that teachers can follow to go from prompt → interactive → scorable SLS activity. The process is illustrated using a real classroom example: a Primary 5 Triangle Area interactive.


The Workflow at a Glance

The entire pipeline consists of five stages:

Stage | Tool | Purpose
1 | SLS ACP Interactive Generator | Generate the base HTML5 interactive using AI
2 | SLS Prompt Generator | Design and refine effective prompts
3 | AIzipper | Package the code into an SLS-compatible ZIP
4 | xAPI Integrator Agent | Inject xAPI scoring and analytics
5 | SLS Platform | Upload, embed, and verify the activity

This workflow enables teachers to move from idea → classroom-ready interactive in a structured and repeatable way.


Stage 1: Generating the Interactive with SLS ACP and AI

The process begins inside the SLS ACP (AI Content Partner) Interactive Generator, where teachers describe the learning activity they want.

https://vle.learning.moe.edu.sg/my-drive/module/edit/3c9eb133-3948-4a14-9635-2d73fbabbf79/section/101671807/activity/101777129?quizPage=4

The AI model (typically Gemini or Claude) then generates a self-contained HTML5 interactive.

However, writing a good prompt from scratch can be difficult. To support this, teachers can use the SLS Prompt Generator library:

https://iwant2study.org/lookangejss/promptLibrary/ai-prompt-library.html


The tool helps teachers construct structured prompts by filling in key parameters:

• Topic / Subject Matter
Example: Triangle Area

• Grade Level
Example: Primary 5–6

• Subject Area
Example: Mathematics

• Interactive Type
Simulation, Educational Game, or Interactive Exploration

• RAT Framework Alignment

RAT Level | Purpose
Replace | Digital version of an existing quiz or worksheet
Amplify | Enhances learning through simulations or visualisations
Transform | Enables new learning experiences impossible before

• Specific Requirements

Examples include:

  • Include Exploration Mode and Quiz Mode

  • Display base, height, and area units

  • Allow students to manipulate triangle dimensions

  • Provide instant feedback

The prompt generator also contains a library of ready-made examples across subjects — including trigonometry tools, Chinese text-to-speech interactives, projectile simulations, and life-cycle quizzes. Teachers can adapt these examples instead of starting from scratch.
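Putting those parameters together, a full prompt for the Triangle Area example might look like the sketch below (illustrative wording, not the generator's exact output):

```text
Create a self-contained HTML5 interactive for Primary 5–6 Mathematics
on the topic of Triangle Area, as an Interactive Exploration aligned
to the "Amplify" level of the RAT framework.

Specific requirements:
- Include an Exploration Mode and a Quiz Mode
- Display base, height, and area with units
- Allow students to manipulate the triangle's dimensions
- Provide instant feedback on quiz answers
- Use a single HTML file with no external CDN dependencies
```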

Once the prompt is generated, it is copied into the SLS ACP Interactive Generator, which produces the first version of the interactive.


Stage 2: Iterative Refinement — “Keep Asking Until It's Right”

The first AI-generated output is rarely perfect.

For the P5 Triangle Area interactive, the initial version might:

  • lack the intended Exploration / Quiz structure

  • use incorrect labels or units

  • have incomplete instructions

  • require UI improvements

The solution is iterative prompting.

Teachers continue refining the interactive by asking the AI to:

  • adjust the interface

  • add missing features

  • correct mathematical logic

  • improve clarity for students

This process may involve 5–20 iterations.

The key principle is simple:

Do not settle for the first result — keep refining until the interactive genuinely supports learning.

The final version of the Triangle Area interactive contains:

• Exploration Mode
Students manipulate triangle dimensions to observe how area changes.

• Quiz Mode
Students answer questions about triangle area using the interactive.

• Learning analytics indicators, including:

  • Interactions

  • Elapsed Time

  • Unique Explorations

  • Diversity of actions

  • Sequence Changes

These metrics later feed into SLS learning analytics dashboards.
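To make these indicators concrete, a minimal JavaScript sketch of how an interactive might tally them client-side could look like this (the variable and function names are illustrative, not the actual code the generator produces):

```javascript
// Illustrative analytics tracker sketch; names are assumptions,
// not the generator's real API.
const analytics = {
  interactions: 0,          // total user actions
  startTime: Date.now(),    // for Elapsed Time
  explorations: new Set(),  // unique (base, height) pairs tried
  actionTypes: new Set(),   // diversity of actions (drag, click, input)
  lastAction: null,
  sequenceChanges: 0,       // how often the student switched action type
};

function recordAction(type, detail) {
  analytics.interactions += 1;
  analytics.actionTypes.add(type);
  if (analytics.lastAction !== null && analytics.lastAction !== type) {
    analytics.sequenceChanges += 1;
  }
  analytics.lastAction = type;
  if (type === "explore") {
    analytics.explorations.add(`${detail.base}x${detail.height}`);
  }
}

function snapshot() {
  return {
    interactions: analytics.interactions,
    elapsedSeconds: Math.round((Date.now() - analytics.startTime) / 1000),
    uniqueExplorations: analytics.explorations.size,
    diversity: analytics.actionTypes.size,
    sequenceChanges: analytics.sequenceChanges,
  };
}
```

A snapshot like this can then be attached to the xAPI statements injected in Stage 4.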


Stage 3: Packaging the Code with AIzipper

Once the interactive works as intended, it must be packaged into a format that SLS can load.

Simply copying HTML code will not work because:

  • External CDN scripts cannot load inside SLS

  • SLS requires a specific ZIP structure

  • The interactive must run offline

The AIzipper – Code to ZIP Converter solves this.

https://iwant2study.org/lookangejss/slsZipper/Code_to_ZIP_Converter/


The tool performs three key tasks:

1️⃣ Merges HTML, CSS, and JavaScript into a single structure
2️⃣ Downloads any external dependencies automatically
3️⃣ Generates an SLS-compatible ZIP package

The resulting ZIP contains a root index.html file and runs completely offline inside SLS.
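As a rough sketch, an SLS-compatible package typically looks like this (only the root index.html is required by the structure described above; the other file names are illustrative):

```text
Interactive.zip
├── index.html        ← entry point at the ZIP root
├── css/
│   └── style.css     ← styles, if split out of index.html
└── js/
    ├── app.js        ← the interactive's logic
    └── vendor.js     ← local copies of former CDN dependencies
```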


Stage 4: Injecting xAPI Scoring with the xAPI Integrator Agent (and Agentic AI to Connect xAPI Statements to HTML5 Variables)

An interactive becomes truly useful in SLS when it can report student performance and interaction data.

This is done through xAPI (Experience API).

The xAPI-Scorable Interactive Integrator Agent adds this functionality automatically.

https://iwant2study.org/lookangejss/appXapiIntegratorAgent/


Steps:

1️⃣ Upload the ZIP from Stage 3
2️⃣ Choose an integration mode

Mode | Purpose
Timeline Mode | Logs interactions such as clicks, input changes, and navigation
Minimal Mode | Adds xAPI libraries for custom implementations

3️⃣ Optional: add AI instructions for advanced tracking

Examples:

  • Weight later answers more heavily

  • Track time spent on sections

  • Log exploration patterns

The tool injects the required xAPI scripts and produces a new ZIP ready for SLS deployment.
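For readers curious what the injected scoring looks like, an xAPI statement follows a standard actor, verb, object, result shape. The JavaScript sketch below builds a minimal "completed" statement with a score (the actor and activity IDs are placeholders; in SLS the real learner identity and activity ID are supplied at runtime):

```javascript
// Build a minimal xAPI "completed" statement with a score.
// Actor and object values are placeholders for illustration only.
function buildScoreStatement(raw, max) {
  return {
    actor: { mbox: "mailto:student@example.com", name: "Student" },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/completed",
      display: { "en-US": "completed" },
    },
    object: {
      id: "http://example.com/activities/triangle-area",
      definition: { name: { "en-US": "P5 Triangle Area Interactive" } },
    },
    result: {
      // xAPI expects scaled in [0, 1], alongside raw/min/max.
      score: { raw: raw, max: max, min: 0, scaled: max > 0 ? raw / max : 0 },
      completion: true,
    },
  };
}
```

A learner who scores 4 out of 5 in Quiz Mode would thus be reported with a scaled score of 0.8, which SLS can map into its gradebook.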

For complex ACP-generated simulations, you now need to follow the 🤖 AI Agent workflow in Step 4b. It is the only reliable way to get customised and targeted data analytics into an interactive.

Step 4b: Follow the 🤖 AI Agent workflow for complex simulations

⚠️ Why web-based AI injection has limits for complex content

For complex simulations such as ACP-generated interactives, reliable xAPI injection usually requires browser-based trial-and-error verification.

A web tool can only analyse your source code statically. It cannot:

  • run your simulation,

  • observe which events actually fire,

  • check whether window.storeState() is receiving the correct data, or

  • verify that meaningful scores are being sent.

Because of this, the generated tracking code may look correct but still silently send score: 0.
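One quick local check, even before reaching for an agentic tool, is to wrap window.storeState() in the browser console and log everything it actually receives. The sketch below is an illustrative debugging aid; it assumes the payload carries a score field, which may differ in the actual injected scripts:

```javascript
// Debugging aid: intercept window.storeState() and keep a log of
// every payload, warning when a zero score is sent. The payload
// shape ({ score: ... }) is an assumption for illustration.
function wrapStoreState(win) {
  const original = win.storeState;
  const log = [];
  win.storeState = function (state) {
    log.push(state);
    if (state && state.score === 0) {
      console.warn("storeState received score: 0, tracking may be miswired");
    }
    // Pass the payload through to the original handler, if any.
    return original ? original.call(win, state) : undefined;
  };
  return log;
}
```

Running the interactive with this wrapper in place makes it immediately visible whether real, non-zero scores ever reach storeState().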

The correct approach: use an agentic AI tool installed on your computer

Please download and install one of these tools:

  1. Claude Code by Anthropic https://claude.com/download
    Download the installer and open your simulation folder in Claude Code. It can launch a browser with Playwright, observe which events fire, generate tracking code, inject it, rerun the simulation, and iterate until storeState() sends real data.
  2. OpenAI Codex by OpenAI https://openai.com/codex/
    Similar workflow: run the simulation, observe behaviour, generate code, verify it, fix issues, and repeat locally with full browser access.
  3. Visual Studio Code by Microsoft https://code.visualstudio.com/download (with generous credits)
    A strong alternative development environment for testing and debugging.
  4. AntiGravity by Google https://antigravity.google/ (with generous credits)
  5. Trae by ByteDance https://www.trae.ai/download

All of these tools run on your own computer with full terminal and browser access, which allows them to repeatedly test and refine the xAPI integration until it is actually working, something a web-based, single-pass API call cannot reliably do for complex simulations.

 


Stage 5: Uploading to SLS and Verifying

The final ZIP is uploaded into SLS as an Interactive Response Question.

Teachers then:

 

  1. Upload the ZIP file

  2. Add question pages

  3. Configure quiz scoring

  4. Preview the interactive as a student

During testing, teachers check:

  • Does the interactive load correctly?

  • Can students manipulate the triangle in Exploration Mode?

  • Do Quiz Mode questions work?

  • Is xAPI data being logged?

If any issue appears, teachers return to the AI workflow and refine the interactive.


Why This Workflow Matters

This workflow significantly lowers the barrier for creating high-quality digital interactives.

What previously required weeks of development can now be done by teachers in a single afternoon.

Benefits include:

• Teacher-created digital interactives
• Built-in learning analytics via xAPI
• Integration with SLS quizzes and gradebook
• Rapid prototyping and refinement with AI

Most importantly, teachers gain the ability to design interactive learning experiences tailored to their own students, rather than relying solely on pre-built resources.

The final packaged interactive from this workflow can be downloaded here:

https://iwant2study.org/lookangejss/math/AI/ice/Interactive_20260305042707.zip
