AI and social stories: how to use artificial intelligence ethically and consciously in educational practice
Educational technology and special needs: can artificial intelligence support educators and therapists without replacing them?
Artificial intelligence is also entering the world of inclusive education. But can it really be useful in creating social stories? In this article we analyse what it can do, what it cannot do, and how to use it ethically and professionally, without losing human centrality.
When Time Is Not Enough (and Quality Suffers)
If you work as a special education teacher, educator, or therapist, you probably know this situation:
It’s evening. You still have two IEPs to update. A family is waiting for a personalised social story to help the child manage anxiety before a medical appointment.
You know the story needs to be carefully calibrated. Simple language. Positive perspective. No ambiguity.
But time is short.
This is where many people start wondering:
“What if artificial intelligence could help me?”
Not to replace professional expertise. But to lighten the workload.
What AI can really do in creating social stories
Generative artificial intelligence can:
- create a coherent first draft
- adapt language for different ages
- rephrase sentences to make them clearer
- structure a logical narrative sequence
This can reduce the initial writing time.
Real example
An educator needs to prepare a social story for:
Marco, 7 years old, autism, difficulty with changes in routine, anxiety when the teacher is absent.
AI can generate a basic structure:
- what happens when the teacher is absent
- who will be there
- what stays the same
- how I might feel
- calming strategies
But this is only a base.
AI does not know Marco. It does not know his preferred tone of voice. It does not know that for him the word “substitute” is already a trigger. Personalisation remains human.
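The workflow above — an anonymised profile in, a structured draft request out — can be sketched as a small helper. This is a hypothetical illustration, not any specific tool's API: the function name, fields, and outline are assumptions chosen to mirror the five sections listed above.

```python
# Hypothetical sketch: composing a draft-request prompt from an
# anonymised profile. No identifying data (name, school, diagnosis
# details) goes into the prompt — only what the draft needs.

OUTLINE = [
    "what happens when the teacher is absent",
    "who will be there",
    "what stays the same",
    "how I might feel",
    "calming strategies",
]

def build_story_prompt(age: int, reading_level: str, situation: str) -> str:
    """Compose a prompt for a first draft, without sensitive data."""
    sections = "\n".join(f"- {s}" for s in OUTLINE)
    return (
        f"Write a first-person social story for a {age}-year-old "
        f"({reading_level} reading level) about: {situation}.\n"
        "Use short, concrete, positive sentences. Cover these sections:\n"
        f"{sections}"
    )

prompt = build_story_prompt(7, "simple", "the class teacher is absent today")
print(prompt)
```

The point of the sketch is the separation of roles: the structure is reusable, while everything that makes the story *Marco's* — tone, trigger words, familiar names — is added by the educator afterwards.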
What AI CANNOT do (and should not do)
AI cannot:
- observe the child
- interpret subtle cues
- assess real emotional regulation
- replace the educational relationship
And above all: it cannot take on clinical or educational responsibility. An effective social story is born from observation, from knowledge of the context, from teamwork.
Why ethics is central
When we talk about AI and special educational needs, delicate issues come into play:
- data privacy
- sensitivity of information
- conscious use of generated content
- transparency towards families
Using AI ethically means:
- Not entering sensitive identifying data
- Always reviewing generated content
- Never delegating educational decisions
- Considering it a tool, not an authority
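The first rule above — never entering sensitive identifying data — can be made concrete with a simple pre-processing step. The function below is an illustrative assumption, not a prescribed workflow: it replaces known names with a neutral placeholder before any text reaches an external AI service.

```python
import re

def anonymise(text: str, names: list[str], placeholder: str = "the child") -> str:
    """Replace known identifying names with a neutral placeholder
    before the text is sent to an external AI service."""
    for name in names:
        text = re.sub(rf"\b{re.escape(name)}\b", placeholder, text)
    return text

note = "Marco becomes anxious when Marco's teacher is absent."
print(anonymise(note, ["Marco"]))
# The real name never leaves the educator's machine.
```

A step like this is no substitute for judgement — context can still identify a child — but it makes the privacy rule a habit rather than an afterthought.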
Change of perspective: from “replacement” to “assistant”
The biggest risk is thinking: “If AI writes the story, the problem is solved.”
In reality, the more useful perspective is a different one: AI produces a draft. You are the author.
It’s like having an assistant who prepares a first outline, but the narrative and pedagogical responsibility remains yours.
This distinction is essential to protect educational quality.
A realistic and sustainable use
In a primary school, a support team uses an AI generator to:
- create initial drafts
- adapt already written stories
- speed up linguistic revisions
Result? Less time on technical structure. More time on observation and relationship.
Technology becomes time reclaimed for the person.
Conclusion
Artificial intelligence is not a shortcut.
It is not a magic solution.
But it can become concrete support if used with:
- awareness
- professional supervision
- ethical attention
- human centrality
Social stories remain a relational tool. And the relationship cannot be automated.
If you want to explore the topic of social stories and their conscious use further, take a look at the other EduStories articles dedicated to personalisation and practical strategies.
You can also subscribe to the newsletter to receive practical reflections and useful tools for your daily work.