Author-e · Learning management system

Administration System

Built the administrative backbone for a growing e-learning platform that had outgrown its manual processes.

EdTech B2B SaaS
Passive observation · Stakeholder interviews · Service blueprinting · Workflow mapping · Jobs to be Done · Affinity mapping · Card sorting · Concept testing · Prototype testing · Risk and failure-state mapping
Role: Lead UX Designer (solo)
Team: Product owner, developers, finance
Type: New product functionality
Delivered: Shipped to production

New functionality built to automate operational workflows, reduce human error, and shorten processing time.

The learning platform had recently been redesigned for students and teachers, which successfully increased adoption among schools.

While the customer-facing experience improved, the operational side of the business remained heavily manual.

Core processes depended on system administrators:

  • bookstore orders arrived by email and were processed manually
  • delayed invoice payments often required administrators to manually grant student access
  • invoice records were fragmented across spreadsheets and email
  • quarterly VAT reporting required multiple days of manual reconciliation

"Seeing system administrators spend time on repetitive, automatable tasks highlighted clear inefficiencies in the process. It was time for improvement."

- from my passive observation notes

Operational workflows that could finally scale

01 · 70–85% · Less manual work

Reduction in manual administrative workload

Order processing, invoice creation, payment checks, and student access activation were largely automated. Manual work was reduced to exceptions only, supported by dedicated admin tools for returns, reminders, invoice recovery, and order corrections.

02 · Near zero · Incorrect orders

Near elimination of incorrect package orders

Bookstores could only view school-specific valid package options, removing incorrect orders at the source rather than relying on training or manual correction.

03 · Days → hours · VAT reporting time

VAT reporting cut from days to under three hours

A unified finance dashboard consolidated all invoice types and enabled one-click VAT export for quarterly reporting. The platform continued to grow in users, schools, and order volume without requiring equivalent growth in operations staff. That was the goal.

04 · 50–70% · Fewer support issues

Fewer operational support issues

Automated invoicing, payment-linked access, invoice regeneration, and clearer student communication reduced admin workload related to order issues, invoice corrections, and support requests.

Understand the operational ecosystem before designing screens

I started with passive observation of system administrators to understand real operational behavior, bottlenecks, workarounds, and dependencies across ordering, invoicing, finance, and student access.

This revealed that the problem was not isolated to a single workflow, but rooted in a fragmented service ecosystem involving bookstore purchasers, school administrators, students, and finance staff.

Using stakeholder interviews, workflow mapping, service blueprinting, and Jobs to be Done, I mapped how information, payments, invoices, and access moved across the system and where failures occurred.

This allowed the team to prioritise structural fixes over manual workarounds or training.

A role-based operational system with automated business logic

Bookstore Purchaser role

Designed a dedicated purchaser role with school-based package filtering, limiting bookstores to valid package options only and eliminating incorrect package selection at the source.

learningbox.nl/purchaser/orders
Bookstore Purchaser: school-filtered package selection prevents incorrect orders at the source.

School ordering & invoicing system

Enabled school administrators to order for students via voucher or direct assignment, with bulk ordering, duplicate prevention, and multiple invoice profiles per school.

learningbox.nl/school/orders
School administrator: bulk ordering, voucher assignment, and multi-profile invoicing in one flow.

Finance dashboard

Built a unified dashboard consolidating bookstore, school, and student invoices into one operational view with:

  • payment status management
  • one-click BTW (Dutch VAT) export
  • invoice re-download
  • payment reminders
  • order return handling with rules based on order and invoice type

This created a single source of truth for finance and operations.

learningbox.nl/admin/finance
Finance dashboard: unified invoice view across all order types with one-click BTW export.

Student invoice & temporary access improvements

Added temporary 14-day access during invoice processing, invoice regeneration for corrections, and clearer payment and access communication.

Post-launch iterations

Real-world usage revealed several operational edge cases that were not visible in prototype testing.

Post-launch iterations included:

  • 30-day expiry for unpaid student invoices
  • automatic credit invoices for expired orders
  • ability to reopen expired orders
  • restrictions preventing repeated temporary-access abuse

This project shifted a manual operation into an integrated solution that reduced admin dependency, improved financial control, and enabled sustainable growth.

The platform had grown. The operations behind it had not.

The learning platform had been redesigned for students and teachers, and adoption was growing. More schools meant more students and more orders. But every order still arrived by email, every invoice was created by hand, and one system administrator was holding the entire operation together with spreadsheets and manual checks.

The core problems were:

  • bookstore orders came in by email and were processed one at a time
  • student access had to be manually granted when invoice payments were delayed
  • invoice records lived across spreadsheets and email threads, not in one place
  • quarterly VAT reporting required several working days of manual work


Understanding how people actually worked, not how they said they worked

Because nothing existed yet, every method had to be chosen for what it could reveal about behaviour and expectations rather than reactions to an existing product.

Passive observation: where it started

As part of the team, I watched the system administrator work through the same manual processes day after day: processing bookstore orders by hand, tracking everything in spreadsheets, manually creating invoices every two weeks, granting student access when payment was delayed, and spending multiple days before each quarterly VAT declaration gathering orders across different VAT categories.

Passive observation was the right starting point here because people are unreliable narrators of their own workflows. When you ask someone how they process an order, you usually get the happy path. They leave out the three spreadsheets they cross-reference, or the email they dig up before filling in a form. When you observe someone working, you see everything: the workarounds, the double-checks, the small frictions they have stopped noticing.

What I observed directly included constant context switching between email, spreadsheets, and the platform just to process a single order. Students messaging about missing access while waiting for an invoice to clear. And returned bookstore orders caused by incorrect course assignments, something that happened often enough to be a pattern but never often enough for anyone to treat it as a design problem. Looking back, that last observation was the most important thing I saw in the entire research phase.

Service blueprinting across roles

Observation showed me how one administrator worked. But the operation involved four distinct groups: bookstore purchasers, school administrators, students, and finance staff. I used service blueprinting to map how information, payments, invoices, and access moved across all of them, and where the handoffs broke down.

Stakeholder interviews

Observation gave me hypotheses. Interviews helped me understand the mental models behind the behaviour. I ran interviews with three groups: bookstore purchasers, school administrative staff, and author partners.

Rather than asking what was frustrating, I framed the questions around expectations. Three questions produced the most honest answers:

  • "If a tool existed tomorrow to handle this, what would you expect it to do?"
  • "What would make you trust it enough to use it?"
  • "What would make you abandon it and go back to how you do things now?"

The third question was consistently the most valuable. People are quick to describe what they want. They are slower to describe what would break their trust. And trust thresholds, not feature lists, determine whether a tool actually gets adopted.

The interviews confirmed a few things I had suspected. Bookstore purchasers did not want a new process. They wanted a better version of the one they already had, with fewer steps to memorise. School administrators expected the tool to fit around teaching workflows, not require them to switch into a separate admin mindset. Students cared about one thing: knowing exactly when access started and what happened if something went wrong.

Service blueprint mapping how information, payments, invoices, and access flow across roles, channels, and backstage operations in the Learning management system admin system

Naming the real problem before touching any solution

With observation notes, service blueprint data, and interview transcripts, the next step was synthesis. I used affinity mapping to cluster the raw material into themes. This is a step that is easy to rush, and I did rush it slightly on the first pass. The initial clusters were too task-focused. I went back and regrouped around what each problem was actually costing, which produced something more useful.

Five problem areas emerged:

  • Manual processing burden: every order was touched by hand, often more than once
  • Student access delays: payment and access were disconnected, with no automated link between them
  • Incorrect course packages: curriculum and cohort logic was invisible at the point of order
  • Fragmented invoice records: VAT reconciliation had to be rebuilt from scratch every quarter
  • No self-service: schools, students, and partners all depended on one administrator for everything

These were not five separate problems. They were symptoms of one structural issue. The platform had been built for a simple student self-ordering journey and was never extended to support the schools, cohorts, and external partners that had grown around it.

Personas

I built personas because the team kept designing as if all admin users were the same person. Three primary personas shaped every decision from this point forward.

Bookstore Purchaser

Lars

Purchasing coordinator · Orders for 6+ schools · Semester-driven peaks

"Just show me the right options. I don't have time to cross-check against a spec sheet."

Goals
  • Process bulk orders quickly under deadline pressure
  • Avoid being blamed for incorrect package selections
  • Close the semester window without any follow-up
Frustrations
  • Wrong package only discovered after students redeem vouchers
  • No visible list of valid packages per school at order time
  • Correction requires back-and-forth emails with Learning management system
Behaviours
High-volume, seasonal · Works from memory · Minimal decision points · No manual, no system
Teacher-Administrator

Mia

Secondary school teacher · Admin on the side · 3 teaching days per week

"If I have to read a manual to use it, I won't use it."

Goals
  • Handle student enrolments without switching into an admin mindset
  • Manage students with different start dates in one flow
  • Keep teaching the priority, admin is secondary
Frustrations
  • Admin tasks interrupt teaching rhythm with no dedicated time
  • Systems that assume she already knows the process
  • Different start dates require separate flows, so she skips the edge case
Behaviours
Multi-role context switching · Zero-training expectation · Intuition over instruction · Low complexity tolerance
Company-Sponsored Student

David

Professional learner · Employer-funded enrolment · Invoice required before start

"I just need to know what to type, and what happens if I get it wrong."

Goals
  • Start the course on time without access interruptions
  • Submit an invoice his employer will actually accept
  • Understand what happens if something goes wrong
Frustrations
  • No guidance on legal entity name vs. display name at point of entry
  • Invoice rejection only discovered after access is already lost
  • Falls behind peers while waiting for correction and resubmission
Behaviours
Defaults to familiar contact · High anxiety about delays · Needs clear confirmation · One-time process

Together, the three personas pointed to the same structural gap: the platform had no infrastructure for anyone except a student ordering for themselves.

Problem statement: The Learning management system platform lacks the administrative infrastructure to support its real stakeholder ecosystem, forcing every order, invoice, and access decision through a single system administrator's manual work. This creates delays, errors, and an operation that cannot scale.

Jobs to be Done and How Might We

Before moving to solutions, I used Jobs to be Done to reframe each persona's need away from features and toward intent. The bookstore purchaser's job is not "to place an order." It is "to get the right materials to the right students without being blamed when something is wrong." That reframe directly shaped the package filtering decision: the system should make the wrong choice impossible, not just harder.

The three biggest tensions became How Might We questions to keep the solution space open before committing to anything:

HMW: help bookstore purchasers always select the correct package, without relying on email instructions?

HMW: allow school admins to order for students with different start times, without splitting the flow?

HMW: let students correct invoice details themselves, without submitting a support request?

Exploring the structure before designing any screens

Before sketching anything, two structural questions needed to be settled. Getting these wrong would make everything that followed harder to undo.

Separate admin application, or extend the current system?

A dedicated admin application felt clean on paper. But many smaller schools do not have a dedicated purchaser. The same person may teach, place orders, and occasionally order for themselves as a student. Forcing them to switch between two logins would create friction exactly where we were trying to remove it. The decision was to extend the existing system with role-based views, not build a parallel one.

Build from scratch, or use an off-the-shelf solution?

I looked at existing tools. The requirements were too specific: course packages tied to cohort logic, VAT across multiple percentages, school-specific invoicing rules. Adapting a generic tool would have cost more than building something tailored. Custom build it was.

MoSCoW with the development team

Before producing any wireframes, I sat down with developers to walk through the ideas together. The conversation focused on what was technically possible and what would need to be approached differently. We used MoSCoW to prioritise the work. The Won't Have list turned out to be as important as the rest. It gave the team permission to ship something solid over something complete, which sounds obvious but is genuinely hard to hold to when six stakeholder groups each have a decade of unaddressed needs.

Deciding what goes where, and designing out the failure points

Each role gets its own account on the platform, with one exception: not all schools have a dedicated administrator. For those schools, purchasing features attach to selected teacher accounts. A system administrator controls who can do what.

For each role and flow, I used workflow mapping to chart every step, including where each flow could go wrong and how the design would prevent it. The goal was to make certain errors structurally impossible, not just less likely.

Bookstore Purchaser role

Built around one principle: wrong packages should not be selectable, not just discouraged. A system administrator configures each school's valid package options once. After that, the purchaser selects the school and sees only what belongs to it. There is no other list to accidentally choose from. The fix was structural, not instructional. It just took observation to see that the problem was the list, not the purchaser.

Purchasers can add order numbers and any invoice details their organisation requires, then choose between bulk invoicing or sending invoices individually. Voucher codes are generated on completion. No training required.
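The filtering rule at the heart of the purchaser role can be sketched in a few lines. This is an illustrative sketch only, not the platform's actual code; the names (`Package`, `valid_packages_for_school`, `place_order`) and the data shapes are assumptions:

```python
# Hypothetical sketch of school-scoped package filtering. The purchaser UI
# is populated only from valid_packages_for_school(), so an invalid package
# is never selectable; place_order() is the server-side guard behind it.
from dataclasses import dataclass


@dataclass(frozen=True)
class Package:
    package_id: str
    name: str


# Configured once per school by a system administrator (assumed shape).
SCHOOL_PACKAGE_CONFIG: dict[str, set[str]] = {
    "school-a": {"pkg-math-1", "pkg-lang-2"},
    "school-b": {"pkg-math-1"},
}

ALL_PACKAGES = [
    Package("pkg-math-1", "Mathematics year 1"),
    Package("pkg-lang-2", "Languages year 2"),
    Package("pkg-sci-3", "Science year 3"),
]


def valid_packages_for_school(school_id: str) -> list[Package]:
    """Return only the packages configured for this school."""
    allowed = SCHOOL_PACKAGE_CONFIG.get(school_id, set())
    return [p for p in ALL_PACKAGES if p.package_id in allowed]


def place_order(school_id: str, package_id: str) -> None:
    """Reject anything the filtered UI should never have sent."""
    if package_id not in SCHOOL_PACKAGE_CONFIG.get(school_id, set()):
        raise ValueError(f"Package {package_id} is not valid for {school_id}")
```

The point of the sketch is the structure, not the code: there is no unfiltered list anywhere in the flow, so the wrong choice cannot be made, only rejected as a defence in depth.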

learningbox.nl/purchaser/orders
Bookstore Purchaser: school-filtered package selection prevents incorrect orders at the source.

School Admin role

Two ordering methods: voucher codes for students who choose their own start date, or direct assignment for groups starting together. Invoice profiles are configured per school so billing details are always correct and nobody re-enters them. Bulk ordering is available. The system flags students who already have active packages to prevent duplicates. If only one invoice profile exists, it auto-selects. A dropdown only appears when multiple profiles are configured, which removed a hesitation point that showed up in prototype testing.

learningbox.nl/school/orders
School administrator: bulk ordering, voucher assignment, and multi-profile invoicing in one flow.

Student ordering flow

My Courses shows a persistent banner with the access end date whenever a student is on temporary access. This came directly from prototype testing: one participant missed the deadline entirely because they never reopened the confirmation email. Keeping that information visible throughout the window, not just at confirmation, was a small fix with a meaningful impact. Invoice Request is a three-step flow that confirms temporary access dates and explains what happens at day 14. Payment Status shows the current invoice state at any time.

Finance dashboard

The Invoice Dashboard shows all invoices across all schools and all order types in one view, filterable by school, date range, payment status, and invoice type. Marking an invoice as paid automatically triggers student access activation. Previously those two steps were separate and sometimes done out of order. Now they are linked at the system level. One-click VAT export handles quarterly declarations.
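The payment-to-access link described above amounts to collapsing two previously separate steps into one operation, so they can no longer be done out of order. A minimal sketch, with `mark_paid` and the data shapes as assumptions:

```python
# Illustrative sketch of payment-linked activation. In the real system the
# access grant would be a service call; a list stands in for it here.

def mark_paid(invoice: dict, activated_students: list[str]) -> None:
    """Mark an invoice paid and activate student access in one step.

    Idempotent: marking an already-paid invoice does nothing, so access
    is never granted twice for the same invoice.
    """
    if invoice["status"] == "paid":
        return
    invoice["status"] = "paid"
    activated_students.append(invoice["student_id"])  # stand-in for activation
```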

Post-deployment, a large number of student invoices were being created and never paid, filling the invoice view with stale records. Unpaid student orders now expire after 30 days. Expired orders generate an automatic credit invoice and can be reopened if needed.
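The 30-day expiry rule can be sketched as a scheduled job over unpaid orders; names and data shapes are illustrative assumptions:

```python
# Hypothetical sketch of the post-launch expiry rule: unpaid student orders
# older than 30 days are expired, each generating an automatic credit invoice.
from datetime import date, timedelta

EXPIRY_DAYS = 30  # assumed constant matching the 30-day rule


def expire_unpaid_orders(orders: list[dict], today: date) -> list[dict]:
    """Expire stale unpaid orders; return the credit invoices generated."""
    credits = []
    for order in orders:
        age = today - order["created"]
        if order["status"] == "unpaid" and age > timedelta(days=EXPIRY_DAYS):
            order["status"] = "expired"
            credits.append({"type": "credit", "order_id": order["id"]})
    return credits


def reopen(order: dict) -> None:
    """Reopen an expired order, restarting the 30-day clock."""
    if order["status"] == "expired":
        order["status"] = "unpaid"
        order["created"] = date.today()
```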

learningbox.nl/admin/finance
Finance dashboard: unified invoice view across all order types with one-click BTW export.

Key IA decision: All invoice types flow into the same dashboard regardless of origin, whether from a bookstore order, a school administrator order, or a student invoice request. This was the architectural decision that made VAT reporting manageable and gave finance a single source of truth.
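With every invoice type flowing into one dashboard, the quarterly export reduces to grouping totals by VAT rate. A hedged sketch with assumed field names:

```python
# Illustrative sketch of the one-click VAT export: sum invoice amounts per
# VAT percentage for the quarterly declaration. Field names are assumptions.
from collections import defaultdict


def vat_export(invoices: list[dict]) -> dict[int, float]:
    """Total invoiced amount per VAT rate across all invoice types."""
    totals: dict[int, float] = defaultdict(float)
    for inv in invoices:
        totals[inv["vat_rate"]] += inv["amount"]
    return dict(totals)
```

The grouping itself is trivial; what made it possible was the IA decision above, which guaranteed every invoice, whatever its origin, is in the same store with the same fields.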

Why the sequence mattered as much as the output

On a project where nothing exists yet, the order you work in is itself a design decision. Starting with wireframes before the concept has been validated wastes everyone's time. Starting with a prototype before the IA is settled produces a polished version of the wrong thing. The sequence below was deliberate, and it is probably the thing I would defend most strongly if someone pushed back on the timeline.

1. Storyboards before wireframes: Map the real journey before touching any screen. Two costly gaps surfaced here.
2. Concept testing before screen design: Validate the idea, not the UI. Easiest step to skip. Most often regretted when you do.
3. Card sorting before labelling: Test navigation labels with no visual context. Getting this wrong means re-labelling everything.
4. Annotated lo-fi wireframes: Greyscale, rough, annotated with logic, not just layout. Behaviour written on the wireframe.
5. Hi-fi prototype for task testing: All four workflows end-to-end. Real enough to test. Clearly a prototype so feedback stays honest.
1. Storyboards before wireframes

I started with storyboards because the persona work was still fresh and the team needed to see how each person would actually move through a workflow, not just which screens would exist. A storyboard forces you to write the in-between moments that wireframes skip: what the user is thinking, what they have just done, what they expect next. Two gaps showed up here that would have been expensive to find later: a missing confirmation state in the bookstore flow, and the student invoice flow starting at the wrong point in the journey.

2. Concept testing before screen design

Once the storyboards held up internally, I ran concept testing with two school administrators and one bookstore purchaser before touching screen design. The question was not whether specific UI elements worked. It was whether the concept itself made sense to the people who would use it. This is the step that is easiest to skip under time pressure, and it is almost always the step that saves the most time later.

3. Card sorting before labelling anything

Navigation labels were tested as plain text cards with no visual context before any screen design started. Labels that seem obvious to a designer often mean something different to someone who does not share the same mental model. Getting this wrong at wireframe stage means rewriting labels, retesting, re-annotating. Getting it right here means none of that.

4. Annotated low-fidelity wireframes

Wireframes were kept greyscale and deliberately rough so feedback stayed focused on structure and logic. Every wireframe was annotated with the rule it enforced, not just the layout it showed. If a field auto-populates under a certain condition, that condition is written on the wireframe. Developers should not have to infer behaviour from a visual. This cut clarification questions during development significantly.

5. High-fidelity prototype for task testing

The Figma prototype covered all four primary workflows end-to-end. Realistic enough to complete real tasks, but clearly a prototype so participants could say when something did not make sense. The gap between "this looks real" and "this is a prototype" matters. People who think they are looking at a live product are less likely to speak up.

Testing a product that did not exist yet

Traditional usability testing was not an option. There was no live product. Instead, I ran prototype testing with a task-based format. Five participants: two school administrators from different school sizes, one bookstore purchaser who processed orders for six schools each semester, and two students. Each was given a realistic scenario and asked to work through the Figma prototype while thinking aloud.

What we observed · What it meant · What we changed

1 · Prototype testing
Observed: School administrators ordering for groups of students were forced to switch between two separate pages, voucher and direct order, whenever the order type varied across students. The constant context switching slowed the work down significantly.
What it meant: Splitting voucher and direct ordering into separate pages mirrored the system architecture, not the school admin's mental model. They thought of one task: order for these students. The interface had divided that task in two.
What we changed: Replaced with a single order flow. A checkbox toggles each line between voucher and direct order, and the steps the admin sees adapt automatically to that choice. One page, no switching mid-task.

2 · Prototype testing
Observed: A student participant missed the temporary access end date entirely. It was on the confirmation screen, but they never reopened it after closing the email.
What it meant: Time-sensitive information that lives in a one-time confirmation gets lost. It needs to live in the place users return to.
What we changed: Persistent banner on My Courses showing the access end date for the full 14-day window. Visible every time the student enters the course area.

3 · Post-launch
Observed: A small group of students kept requesting the 14-day grace period at the start of every school year, treating it as a recurring discount instead of a one-time bridge.
What it meant: A pattern testing could not surface. Scale revealed it: the grace period's intent was clear in design, but its rules were not enforced in the system.
What we changed: A once-per-year limit per student, validated with a focused follow-up test and shipped as a small targeted update, not a redesign.

4 · Post-launch
Observed: Large numbers of student invoices were created and never paid, filling the finance dashboard with stale records that masked the orders actually needing action.
What it meant: An unpaid invoice that lingers indefinitely is not just clutter. It erodes trust in the dashboard as a single source of truth for finance.
What we changed: Unpaid student orders now expire after 30 days, generate an automatic credit invoice, and can be reopened if needed. The dashboard stays clean by default.

"The bookstore purchaser described a significant drop in back-and-forth emails with the Learning management system team. The school administrator called the automated invoicing the change she did not know she needed."

post-launch follow-up interviews with two original research participants

Where it went sideways

Designing from nothing sounds like freedom. It is also the most disorienting kind of work. There is no wrong version to point to, which means there is no obvious right one either.

The abuse pattern that testing did not catch
The temporary access system tested cleanly. Five participants, no edge cases surfaced. Two enrolment periods later, students were repeatedly requesting the 14-day grace period across school years. No individual participant in testing would have had reason to try that. Scale revealed a pattern that controlled testing structurally could not. The lesson is not "test more people." It is "design for the user who has every incentive to find the edge of your system." We fixed it post-launch with a once-per-year cap, but it should have been in the original MoSCoW as a Should Have, not a post-launch discovery.
Stakeholder scope creep during MoSCoW
The MoSCoW session surfaced more must-haves than the timeline could support. A common pattern when stakeholders who have been doing everything manually suddenly see a system that could do it for them. I had to push back on several requests and reframe the conversation around what a broken first version would cost versus what a working core would unlock. The Won't Have list became the most important output of that session. It gave us permission to ship something solid over something complete.

A note on method: Because every feature in this project was net new, research focused on concept validation and expectation mapping rather than behavioural observation of an existing product. This shaped which methods were appropriate at each stage. Knowing which tool belongs in which context is as much a part of UX practice as knowing the tools themselves.
