When the Road Safety Authority came to us, they didn't need a proof of concept. They had an existing platform — one that had been running in Irish schools for years — and they needed it rebuilt. More stable. More modern. More immersive. What they didn't say, because nobody says this at the start of a project, is that the rebuild would eventually be adopted into the Irish national primary and secondary school curriculum.
That outcome wasn't luck. It was the result of specific decisions we made about technology, accessibility, and what "educational" actually means when your users are eight years old and your delivery environment is a school computer lab with inconsistent internet.
This is the account of how we built it, what we got right, and what the broader implications are for anyone building browser-based 3D education for government or institutional buyers.
The Brief Was Deceptively Simple
Replace the old platform. Make it better.
In practice, that brief contained several competing constraints that don't resolve easily. The platform needed to work for primary school children and secondary school students — two populations with meaningfully different cognitive and motor skills. It needed to run in a browser, because schools don't have the IT capacity to manage software installations across hundreds of machines. It needed to be accessible to schools with limited bandwidth and aging hardware. And it needed to feel genuinely immersive — not a PDF dressed up with animation, but an environment where children actually make decisions and experience consequences.
The last constraint is the one most platforms in this space fail. They produce content that looks like an educational tool and behaves like one too: linear, passive, forgettable. The RSA brief was explicit that the original platform had value but needed to feel more alive. That word — "alive" — drove most of our design decisions.
Why WebGL Was the Only Real Option
We've built VR experiences on Meta Quest, spatial applications for Apple Vision Pro, and full Unity simulations for enterprise training. For this project, none of those delivery mechanisms were right.
A browser-based 3D experience built on WebGL meant zero installation friction. Teachers don't submit IT tickets to unlock a new application. Students don't need accounts. The platform loads the same way a website does — type the URL, press enter, start learning. In a classroom context, that matters more than most technologists appreciate. Every additional step between a teacher's lesson plan and students engaging with content is a step where adoption dies.
WebGL also solved the hardware problem. We weren't building for a specific device. The platform needed to perform on whatever computers a school happened to have — and Irish schools, like schools everywhere, have a wide spread. High-performance 3D rendered in-browser, accessible on any device, without downloads. That's the specification that makes institutional adoption possible.
The RSA client confirmed this after launch: the new platform felt "more stable, modern, and immersive than the original." Stability, in a school deployment context, is not a minor feature. It's the thing that determines whether a teacher uses the platform twice or twenty times.
Building for Children: What That Actually Requires
Building 3D educational content for children is different from building for adult enterprise users in ways that go beyond making things look colourful. The cognitive and perceptual constraints are real and specific.
The first thing we had to get right was decision density. Children, particularly at primary school age, can only process so much information before a screen becomes noise. Our 3D environments needed to be readable at a glance — clear visual hierarchy, unambiguous affordances, no UI element that required prior familiarity with game conventions to understand. Every scenario had to communicate its objective within seconds of loading.
The second thing was consequence feedback. Research on immersive safety training consistently shows that immediate, clear feedback following a decision is what separates effective scenario-based learning from passive content consumption. A child who decides to cross a road in our environment needs to understand immediately whether that decision was safe — not through a score screen two clicks away, but through what happens in the scene itself. We built the feedback into the environment, not the UI.
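To make "feedback in the environment" concrete, here is a minimal sketch of the idea: a decision is classified into a scene event (a normal crossing, a visible near-miss, a collision played out in the scene) rather than a score. The function name, fields, and thresholds are all illustrative assumptions, not the platform's actual logic.

```typescript
// Hypothetical sketch: classify a crossing decision so the consequence can
// be staged in the scene itself (a car braking, a near-miss animation)
// rather than shown on a score screen. Thresholds are illustrative.

type SceneEvent = "safe_crossing" | "near_miss" | "collision";

interface CrossingDecision {
  gapToVehicleSeconds: number;  // time until the nearest vehicle reaches the crossing
  crossingTimeSeconds: number;  // time the avatar needs to cross the road
}

function evaluateCrossing(d: CrossingDecision): SceneEvent {
  const margin = d.gapToVehicleSeconds - d.crossingTimeSeconds;
  if (margin > 2) return "safe_crossing"; // comfortable margin: cross normally
  if (margin > 0) return "near_miss";     // made it, but the scene shows how close it was
  return "collision";                     // the consequence plays out in the environment
}
```

The point of keeping the classification separate from the rendering is that the same decision model can drive different in-scene consequences for different age groups.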
The third constraint was session length. We designed scenarios to be completable in under five minutes each, because a teacher has forty minutes and three other things on the agenda. Modular, self-contained learning units that fit inside a normal lesson structure are what actually get used. Long-form immersive experiences are for consumer entertainment. Educational platforms live or die by how well they slot into an existing timetable.
The Technical Decisions That Made Curriculum Adoption Possible
When a government body formally adopts a platform into a national curriculum, they are making a long-term institutional commitment. That means the evaluation criteria are different from a standard procurement. They're not just asking whether it works — they're asking whether it will keep working, whether it can be updated, whether it fits into existing teacher workflows, and whether the evidence supports it as a genuine learning tool.
Several specific decisions we made addressed those criteria directly.
Standards alignment from day one. We worked to ensure that platform content mapped to existing curriculum objectives across relevant subject areas — not just road safety as an isolated topic, but connections to science, physical education, and social-emotional learning strands. Platforms that position themselves as add-ons face higher adoption friction than platforms that fit inside what teachers already have to teach.
Teacher-facing documentation built alongside the platform, not after. We produced lesson guides, learning objectives, and facilitation notes as part of the delivery scope. Teachers adopt tools they can use confidently. Confidence requires knowing what the tool is for and how to use it in a lesson — not just how to operate it technically.
Progressive difficulty across age groups. We didn't build one platform and apply it uniformly to eight-year-olds and sixteen-year-olds. The scenario complexity, decision speed, and environmental density scaled across the intended age range. Primary school scenarios involved simpler environments with clearer visual cues. Secondary school scenarios introduced higher traffic density, more ambiguous crossing situations, and faster decision windows. Age-appropriate calibration is the difference between a tool that builds genuine skill and one that produces frustration or boredom.
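Age-appropriate calibration of this kind can be expressed as a small parameter policy. The sketch below is a hypothetical illustration of the approach; the age bands and values are invented for clarity, not the RSA platform's actual tuning.

```typescript
// Hypothetical age-banded scenario parameters. Bands and values are
// illustrative assumptions, not the platform's real configuration.

type AgeBand = "primary" | "secondary";

interface ScenarioParams {
  trafficDensity: number;        // vehicles per minute entering the scene
  decisionWindowSeconds: number; // how long a safe crossing gap stays open
  ambiguousCrossings: boolean;   // e.g. no marked crossing, blocked sightlines
}

function paramsForAge(age: number): ScenarioParams {
  const band: AgeBand = age < 12 ? "primary" : "secondary";
  return band === "primary"
    ? { trafficDensity: 4, decisionWindowSeconds: 8, ambiguousCrossings: false }
    : { trafficDensity: 10, decisionWindowSeconds: 4, ambiguousCrossings: true };
}
```

Centralising the difficulty policy in one place also makes it auditable by educators, which matters when a curriculum body reviews the design.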
Performance headroom for low-spec hardware. We built quality tiers into the rendering pipeline, so the platform automatically adjusted visual fidelity based on the device's capability. A school with three-year-old laptops got a platform that ran smoothly at reduced detail. A school with modern hardware got the full visual experience. Neither group got a broken experience.
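A quality-tier system like this usually reduces to a policy that maps device signals to render settings. In a browser those signals would come from the WebGL context (for example `gl.getParameter(gl.MAX_TEXTURE_SIZE)`), `navigator.hardwareConcurrency`, and `window.devicePixelRatio`; the sketch below takes them as inputs so the policy itself is testable in isolation. The thresholds and settings are illustrative assumptions.

```typescript
// Hypothetical quality-tier selector. Input values would be read from the
// WebGL context and navigator at startup; thresholds are illustrative.

type Tier = "low" | "medium" | "high";

interface DeviceSignals {
  maxTextureSize: number; // e.g. gl.getParameter(gl.MAX_TEXTURE_SIZE)
  cpuCores: number;       // navigator.hardwareConcurrency
}

interface RenderSettings {
  tier: Tier;
  resolutionScale: number; // fraction of native canvas resolution
  shadows: boolean;
}

function selectTier(s: DeviceSignals): RenderSettings {
  if (s.maxTextureSize >= 8192 && s.cpuCores >= 8) {
    return { tier: "high", resolutionScale: 1.0, shadows: true };
  }
  if (s.maxTextureSize >= 4096 && s.cpuCores >= 4) {
    return { tier: "medium", resolutionScale: 0.75, shadows: true };
  }
  return { tier: "low", resolutionScale: 0.5, shadows: false };
}
```

In practice you would also want a runtime fallback: if measured frame times drop below a target on the selected tier, step down one tier rather than let the experience stutter.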
What "Exceeding Expectations" Actually Looked Like
The RSA client's assessment — that we "exceeded expectations" and delivered something "more stable, modern, and immersive than the original" — came from a specific combination of outcomes, not a single impressive moment.
The platform launched on schedule. In educational technology procurement, this is not routine. Projects that involve government clients, curriculum alignment, and multi-age content frequently overrun. We delivered within the agreed timeline, which mattered because the rollout was tied to a school year calendar.
The content held up under classroom conditions. This is the test that matters most and the one that most pre-launch testing fails to simulate adequately. A platform can perform perfectly in a controlled demo environment and fall apart when thirty children are simultaneously loading scenarios on school WiFi. We stress-tested against realistic school network conditions, not ideal ones.
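One simple way to reason about that worst case before testing it live is a load-budget estimate: what happens when a whole class hits "start" at once on a shared school uplink. The calculation below is a hypothetical back-of-envelope model (it ignores caching, CDN behaviour, and TCP dynamics), useful only for setting an asset budget.

```typescript
// Hypothetical load-budget estimate: worst-case seconds to load one
// scenario when a class shares the school uplink evenly. Ignores caching
// and protocol overhead; numbers are illustrative.

interface LoadScenario {
  bundleBytes: number;        // total assets for one scenario
  sharedLinkMbps: number;     // the school's uplink, shared by the class
  concurrentStudents: number; // everyone starts at the same moment
}

function worstCaseLoadSeconds(s: LoadScenario): number {
  const perStudentMbps = s.sharedLinkMbps / s.concurrentStudents;
  const bundleMegabits = (s.bundleBytes * 8) / 1_000_000;
  return bundleMegabits / perStudentMbps;
}
```

A 5 MB scenario bundle on a 40 Mbps link shared by thirty students works out to roughly thirty seconds, which is exactly the kind of number that looks fine in a demo and kills a lesson in practice.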
The client's internal stakeholders — teachers, curriculum coordinators, and the RSA team — could see the educational rationale in the design. That's a less tangible outcome but a critical one. Platforms that feel like they were built by people who understand education get adopted. Platforms that feel like they were built by people who understand technology and consulted an educator briefly do not.
What This Tells Us About WebGL for Government Education Buyers
We've seen a pattern across the educational and public sector projects we've delivered. The clients who commission the most durable, well-adopted platforms are the ones who understand early that the technology is not the product — the learning experience is the product, and the technology is what makes it accessible.
For government and institutional education buyers specifically, WebGL-based 3D experiences sit in a useful position. They deliver genuine immersion without the hardware cost, IT complexity, or procurement friction of VR headsets. They run on existing school infrastructure. They can be updated centrally without requiring schools to install patches or new builds. And they produce the kind of data — completion rates, scenario performance, session frequency — that procurement teams and curriculum coordinators need to justify continued investment.
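The shape of that data can be simple. Below is a hypothetical sketch of per-session records aggregated into a completion rate, the kind of figure a coordinator can report upward; the field names are invented for illustration, not the platform's actual schema.

```typescript
// Hypothetical usage-data shape: per-session records aggregated into a
// completion rate. Field names are illustrative, not the real schema.

interface SessionRecord {
  scenarioId: string;
  completed: boolean;
  durationSeconds: number;
}

function completionRate(sessions: SessionRecord[]): number {
  if (sessions.length === 0) return 0;
  const done = sessions.filter(s => s.completed).length;
  return done / sessions.length;
}
```

The discipline here is collecting only what administrators need to justify the investment, which keeps the platform on the right side of school data-protection policies.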
The RSA project demonstrated that browser-based 3D education, built with the right constraints in mind, can achieve outcomes that historically only physical, instructor-led programs could reach. That's not a small claim. It's a well-documented result from a shipped, deployed, curriculum-adopted platform.
A Framework for Building 3D Educational Platforms That Get Adopted
If you're building — or commissioning — a 3D interactive road safety education platform or any browser-based 3D learning tool for institutional buyers, these are the decisions that determine whether the project ends at launch or continues into long-term adoption.
Before you build:
- [ ] Map all content to existing curriculum standards in at least two subject areas
- [ ] Define separate experience parameters for each age group you're targeting
- [ ] Confirm the delivery environment — device specs, network conditions, IT policies
- [ ] Identify the teacher workflow the platform needs to fit inside
During development:
- [ ] Build feedback into the environment, not a separate UI layer
- [ ] Design scenarios to be completable in under five minutes
- [ ] Implement rendering quality tiers for hardware variability
- [ ] Build teacher documentation in parallel with the platform, not after
- [ ] Test on the worst-case hardware and network conditions in your target environment
Before launch:
- [ ] Run a classroom simulation test with actual teachers, not developers
- [ ] Confirm the platform loads and performs without IT intervention
- [ ] Validate that learning objectives are legible to educators, not just technologists
For long-term adoption:
- [ ] Build content update pathways that don't require a full redeploy
- [ ] Provide usage data that administrators can report upward
- [ ] Stay in contact with curriculum coordinators after launch — adoption decisions happen months after delivery
The RSA platform got into the national curriculum because it did all of these things. Not because it was the most technically sophisticated platform we've ever built. Because it was built for the context it would actually live in.
That's the standard we hold every educational project to.