Is this a training problem?

Original portfolio demonstration, built to model the same evidence-based thinking it teaches.

Most L&D teams accept a stakeholder's training request as though it were a diagnosis. It isn't. This course puts a better argument into practice, teaching practitioners and managers to ask two questions before committing to any intervention. It also shows what Rise 360 looks like when you go beyond the authoring interface. You can read the thinking behind it in full here.

The Challenge

In most organisations, training gets built in response to requests, not evidence. A manager identifies a performance gap, reaches for the most familiar solution, and L&D gets to work. Nobody stops to ask the harder question: is training actually the right answer? The result is a profession that measures its output in courses completed rather than problems solved. A lot of what gets built addresses symptoms rather than causes. Not because the design is poor, but because the diagnosis never happened.

The Strategy

The course is grounded in an evidence base drawn from Cathy Moore's Action Mapping reframe (what do people need to do differently?), Julie Dirksen's four gap types (knowledge, skill, motivation, environment), and the performance consulting lens of Clark Quinn, Neelen and Kirschner. The design principle was to model what the course teaches: if the argument is that you should diagnose before you build, the course itself should demonstrate that. Scenario first, root cause before solution, and no content included that doesn't serve a clear performance outcome.

The Methodology

The course opens with a custom HTML introduction block, not the standard Rise template, signalling from the first screen that this is not a default authoring tool output. The central scenario is a Microsoft Teams message from a stakeholder requesting a communication skills course, rendered as a fully custom HTML component: dark-themed, with a typing animation, timed message reveal, and an audio ping on arrival. The aim is to place the learner inside a moment they will recognise from their own working day before the instruction begins.

From that moment, the course teaches two diagnostic questions as its structural spine: what exactly do people need to do differently, and why aren't they doing it? A sorting activity asks learners to classify workplace situations by whether training is likely to help. A gated knowledge check, which requires a correct answer to continue, then applies the root cause framework to a realistic scenario before the course concludes.

The entire course runs on a custom dark theme applied via CSS overrides injected into the Rise export, covering quiz states, scenario blocks, interactive elements, sort activities, callout blocks, and navigation components. None of this is available through the Rise interface. It required working directly with the exported HTML and a patch script to apply changes on each export.
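To make the patch step concrete, here is a minimal sketch of what a post-export patch script for a Rise package might look like. This is an illustration, not the actual project script: the file names (index.html, dark-theme.css) and the injection point are assumptions, and the real overrides span many more selectors than a stylesheet this simple.

```python
"""Sketch: inject dark-theme CSS overrides into a Rise 360 web export.

Assumptions (illustrative, not from the actual project): the export
unzips to a folder containing index.html, and the override rules live
alongside it in dark-theme.css.
"""
from pathlib import Path

MARKER = "<!-- dark-theme-patch -->"


def patch_export(export_dir: str, css_file: str = "dark-theme.css") -> bool:
    """Inject a <style> block of overrides into the export's index.html.

    Returns True if the file was patched, False if it was already
    patched. A marker comment makes the script idempotent, so it is
    safe to re-run after every fresh export.
    """
    index = Path(export_dir) / "index.html"
    html = index.read_text(encoding="utf-8")
    if MARKER in html:
        return False  # already patched; nothing to do

    css = (Path(export_dir) / css_file).read_text(encoding="utf-8")
    style_block = f"{MARKER}\n<style>\n{css}\n</style>\n"

    # Insert just before </head> so the overrides load after Rise's own
    # stylesheets and win the cascade at equal specificity.
    html = html.replace("</head>", style_block + "</head>", 1)
    index.write_text(html, encoding="utf-8")
    return True
```

Placing the block at the end of `<head>` is the design choice that matters: later rules of equal specificity override earlier ones, so the custom theme can restyle Rise's components without fighting their selectors.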

The course includes a downloadable job aid: a one-page diagnostic reference card aimed at managers. It summarises the two questions and four root causes in a format designed for use at the moment of need, the next time a training request lands in their inbox. It is the course's primary transfer mechanism, not a summary of what was covered but a tool for what comes next.

The Result

This is an end-to-end demonstration of instructional design practice: evidence-based course architecture, scenario-first engagement design, and technical execution that goes beyond standard Rise authoring. The course asks the same question a skilled practitioner should ask about every brief that lands on their desk. The design, the custom components, and the job aid are all answers to it.