AI Transparency & System Documentation

Last updated: April 22, 2026
Operator: Gaman Lab, Mgarr MGR1021, Malta — info@edustories.ai

This document fulfils the transparency obligations applicable to EduStories as a deployer of AI systems under EU Regulation 2024/1689 (AI Act), and serves as internal technical documentation for audit purposes.


1. System Description

System name: EduStories Story Generator
Version: Production (continuously deployed)
Operator / Deployer: Gaman Lab
AI provider (upstream): Replicate, Inc. (replicate.com)
Underlying models: Third-party large language model (text generation) and image generation model, accessed via the Replicate API. Model versions may change; the current configuration is maintained in the application's environment configuration.
Input modality: Text (structured form fields: situation, goal, child's age range and gender, interests, characters)
Output modality: Text (structured story with scenes and conclusion) plus raster images (one per scene)
Deployment context: Web application at https://app.edustories.ai, accessible globally
Intended users: Adults only: parents, educators, therapists, and other professionals working with children or individuals with special educational needs (autism spectrum, BES)
Intended purpose: Generation of personalised social stories for educational and communicative support
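The note above that model versions are maintained in the application's environment configuration could be implemented along the following lines. This is a minimal sketch only: the variable names and fallback value are hypothetical and are not taken from the actual deployment.

```python
import os

# Hypothetical environment variable names; the real keys live in the
# application's deployment configuration and are not documented here.
TEXT_MODEL_VERSION = os.environ.get(
    "REPLICATE_TEXT_MODEL_VERSION",
    "unpinned",  # fallback value so an audit can spot a missing pin
)
IMAGE_MODEL_VERSION = os.environ.get(
    "REPLICATE_IMAGE_MODEL_VERSION",
    "unpinned",
)

def current_model_config() -> dict:
    """Snapshot of the configured model versions, e.g. for audit logs."""
    return {
        "text_model": TEXT_MODEL_VERSION,
        "image_model": IMAGE_MODEL_VERSION,
    }
```

Recording this snapshot alongside each generated story would make it possible to trace any given output back to the model versions in use at the time.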

2. Risk Classification

EduStories has assessed the system's risk level under AI Act Annex III and Chapter III:

  • High-risk classification (Annex III): The system does not fall under Annex III categories. It does not determine access to educational institutions, assess individuals for employment, make decisions about benefits, or provide clinical diagnoses. It generates content as a tool to assist adults; no automated decision with legal or significant effect on any individual is produced.
  • General-Purpose AI (GPAI) usage: EduStories is a deployer, not a provider, of GPAI models. The upstream providers (Replicate and the model developers) carry the provider-level obligations under Art. 53–55. EduStories' obligations are those of a deployer under Art. 26 and transparency obligations under Art. 50.
  • Conclusion: EduStories operates as a non-high-risk deployer of GPAI under the AI Act. Primary obligations are transparency (Art. 50) and appropriate human oversight (Art. 26).

3. Transparency Measures (Art. 50)

The following transparency measures are implemented in the product:

  • In-product disclosure: Every generated story displays the notice "Story generated with AI. Content may not always be accurate." (localised in all supported languages: IT, EN, FR, DE, ES) immediately below the story content.
  • Terms of Use: Section 4 of the Terms of Use contains an explicit AI disclaimer, identifies Replicate as the AI service provider, and states that the AI Act (EU Reg. 2024/1689) applies.
  • Privacy Policy: Section 4 describes how user inputs are sent to Replicate's API for AI processing.
  • No synthetic impersonation: The system does not generate content that impersonates real persons and does not produce deepfakes.
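The localised in-product disclosure could be implemented as a simple lookup with an English fallback. In the sketch below, only the English string is quoted from this document; the other translations are illustrative, not the product's actual copy.

```python
# Illustrative localisation table for the in-product AI notice.
# Only the English string is quoted from this document; the other
# translations are illustrative placeholders.
AI_NOTICE = {
    "en": "Story generated with AI. Content may not always be accurate.",
    "it": "Storia generata con l'IA. Il contenuto potrebbe non essere sempre accurato.",
    "fr": "Histoire générée par IA. Le contenu peut ne pas toujours être exact.",
    "de": "Geschichte mit KI erstellt. Inhalte sind möglicherweise nicht immer korrekt.",
    "es": "Historia generada con IA. El contenido puede no ser siempre preciso.",
}

def ai_notice(locale: str) -> str:
    """Return the disclosure notice for a locale, falling back to English."""
    return AI_NOTICE.get(locale, AI_NOTICE["en"])
```

The English fallback ensures the Art. 50 disclosure is always rendered, even for an unsupported locale.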

4. Human Oversight Measures (Art. 26)

  • Review before use: All generated content is presented to the authenticated user before any action is taken. The user reviews the story and decides whether to use, share, or discard it.
  • No automated delivery to minors: Stories are never sent directly to children. The registered user (an adult) receives and reviews the content.
  • Regeneration: Users can request story regeneration if the output is unsatisfactory.
  • Feedback mechanism: Users can submit quality ratings and comments on generated stories via the in-app feedback form.
  • Professional disclaimer: The Terms of Use and the in-story notice explicitly state that AI-generated stories are not a substitute for professional medical, therapeutic, or educational advice.

5. Data and Logging (Art. 26(6))

  • User inputs (form fields) and generated outputs (story text) are stored in the application database and associated with the authenticated user's account.
  • Stories are retained for as long as the user's account is active. Users may delete individual stories or their entire account at any time.
  • Processing timestamps (processing_started_at, processing_completed_at, generation_time_ms) are logged per story for operational monitoring.
  • Replicate API request/response logs are maintained by the application's logging system. Logs are subject to standard retention policies.
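As a sketch of how the per-story timing fields named above relate to one another, the record below shows generation_time_ms derived from the two timestamps. The dataclass itself is hypothetical; the real schema lives in the application database.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StoryProcessingLog:
    """Hypothetical mirror of the per-story timing fields named above."""
    story_id: str
    processing_started_at: datetime
    processing_completed_at: datetime

    @property
    def generation_time_ms(self) -> int:
        # Duration between start and completion, in whole milliseconds.
        delta = self.processing_completed_at - self.processing_started_at
        return int(delta.total_seconds() * 1000)

# Example: a generation that took 3.5 seconds.
log = StoryProcessingLog(
    story_id="story-123",
    processing_started_at=datetime(2026, 4, 22, 12, 0, 0, tzinfo=timezone.utc),
    processing_completed_at=datetime(2026, 4, 22, 12, 0, 3, 500000, tzinfo=timezone.utc),
)
```

Storing timestamps in UTC, as in the example, avoids ambiguity when logs from different regions are compared during an audit.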

6. Limitations and Known Risks

  • Hallucinations or factually incorrect content in generated stories (likelihood: medium). Mitigation: in-product disclaimer; user review before use; regeneration option.
  • Generated content inconsistent with the child's needs or situation (likelihood: medium). Mitigation: structured input form to guide the model; user review; professional disclaimer.
  • Inappropriate or offensive content in outputs (likelihood: low, filtered by the upstream provider). Mitigation: upstream model safety filters; user feedback mechanism; content reported to Replicate if detected.
  • Input data (child's situation) sent to a third-party AI provider (likelihood: inherent to the service). Mitigation: disclosed in the Privacy Policy and Terms of Use; data minimisation encouraged (no sensitive medical data required); Replicate DPA in place.
  • Over-reliance on AI-generated content without professional input (likelihood: medium). Mitigation: explicit disclaimer in-product and in legal documents; the service is not marketed as a clinical tool.

7. Upstream Provider Compliance

EduStories relies on Replicate, Inc. as the AI service provider. As a GPAI provider/intermediary, Replicate is responsible for:

  • Maintaining technical documentation for the underlying models (Art. 53).
  • Complying with applicable copyright and training data obligations.
  • Implementing safety measures and usage policies at the model level.

Replicate's terms of service and privacy policy are available at replicate.com (privacy policy: replicate.com/privacy).


8. Contact for AI-Related Queries

Users or regulators wishing to raise concerns about the AI system or request further documentation may contact:

Gaman Lab
Mgarr, MGR1021, Malta
Email: info@edustories.ai

EU users may also contact their national AI supervisory authority. In Malta, the competent authority is the Malta Digital Innovation Authority (MDIA), mdia.gov.mt.


9. Document Revision History

  • April 22, 2026: Initial version