Volunteering Platform · 2026

Designing Participation: Structuring decisions that feel safe to make

A 0-to-1 project that started from a blank page. I identified that the core barrier to volunteering wasn't motivation - it was commitment uncertainty. I then designed and tested an onboarding flow that broke participation into small, safe, adjustable decisions. Note: this project was tested, not launched.

Defined the problem space independently, used AI for rapid structural concept exploration, and validated the direction through user testing before handoff.


My Role

Product Designer

Starting point

Blank page

Status

Tested with users

0-to-1 Design

Problem Framing

User Research

Interaction Design

User Testing

AI-Assisted Exploration

problem

The barrier wasn't motivation
- it was uncertainty

Understanding
the status-quo

Starting from a blank page, the first question wasn't "what should the onboarding flow look like?" It was "why don't people participate when they want to?" Existing volunteering platforms assumed the problem was discoverability - surface more opportunities, get more sign-ups. But early research suggested something different.

People who expressed genuine willingness to volunteer still hesitated at the point of commitment. The friction wasn't finding something to do - it was not knowing what saying yes would actually mean. How much time? How often? What if circumstances changed? The ambiguity around commitment felt riskier than the contribution felt rewarding.

Uncovering navigation insights through workflow mapping and JTBD analysis.

Goal: Aligning the findings with our product vision and design principles

- Aligned product direction by collaborating with Customer Success and Sales
- Improved feature adoption through on-site visits and data-driven workflow insights
- Enhanced information architecture by identifying user logic via card sorting exercises.

AI in process

Using AI to explore structure
before committing to one

With the problem defined, the next question was structural: how should an onboarding flow be organised to reduce perceived commitment while still capturing enough preference to be useful? Rather than defaulting to the first reasonable structure that came to mind, I used AI to rapidly generate and stress-test multiple structural approaches.


The problem I chose to design for

Participation drops not because people lack motivation, but because the decision to participate feels irreversible before they have enough information to make it confidently. The design challenge was to make commitment feel specific, understandable, and adjustable - before asking for it.

DESIGN DECISIONS
Step 1

Choosing types of help

Users begin by selecting contribution categories - describing how they'd like to help rather than how much. Grouping related options creates clarity while keeping the decision lightweight. The structure supports quick recognition and reduces interpretation effort, helping users understand the scope of possible contributions without feeling overwhelmed.

Design rationale: Starting with contribution type rather than availability shifts the first question from "how much can I give?" to "what am I good at?" - a far less anxiety-inducing entry point that still produces useful signal.

Step 2

Adding local relevance
and personal context

Users refine where and how they want to help. Location-based filtering and a short open description allow expression of preference without committing to specific tasks. Structured filters combined with open input balance clarity with flexibility - users feel heard without feeling pinned down.

Design rationale: This step earns specificity gradually. By the time users reach it, they've already formed a mental picture of their contribution - so adding context feels like personalisation, not interrogation.

Step 3

Confirming participation

The confirmation step surfaces a clear summary of what the user has expressed, clarifies what happens next, and explicitly reassures them that preferences can be adjusted. Transparent feedback reduces residual uncertainty and strengthens trust in the platform at the most critical moment - just before commitment is formalised.

Design rationale: Most onboarding flows treat confirmation as administrative. Here it's a trust-building moment - the explicit message that "you're in control, and this is adjustable" is doing as much work as any earlier screen.

Validation

What testing confirmed
- and what it challenged

User testing validated the core structural decision: starting with contribution type before asking about availability consistently reduced expressed anxiety around commitment. Participants understood what they were signing up for earlier in the flow, and felt less hesitant at the confirmation step.

Testing also surfaced a tension the AI-generated concepts hadn't anticipated: some users wanted to see real opportunities before committing preferences - a hybrid of the need-matching and preference-first models. This remains an open question for a future iteration, and an honest reminder that AI-assisted exploration accelerates convergence but doesn't replace contact with real users.

key insight

"Starting from a blank page forced a discipline I've come to value: resist the interface until the problem is sharp. The most important design decision on this project wasn't a screen - it was the choice to frame commitment uncertainty as the core barrier, not discoverability.

AI helped me explore structural approaches faster than I could sketch them, but it also made the limits of that speed visible. The structural model it couldn't generate was the one users actually asked for. That tension is what I shared back with the team - and it's what made the project feel genuinely exploratory rather than just executed."

Charlotte Kleckers

Copenhagen