
From Silos to Shared Accountability

Company: ClimatePartner
Role: Head of Product Design
Timeline: January 2026 – Present
Industry: B2B Climate-Tech
  • Cross-team collaboration was structurally broken: teams optimised locally, ownership was unclear, and priorities were unaligned.
  • Diagnosed gaps across a 47-person department. Facilitated 8 workshops. Delivered a prioritised initiative roadmap with assigned owners.

In early 2026, I co-initiated this effort with the Head of Engineering and the Head of Data, owning the diagnostic design and workshop facilitation. The department comprised roughly 47 people across multiple teams.

The Digital Department had no shared goals or success metrics across teams. Each team optimised for its own priorities. Cross-team dependencies and priority conflicts became visible late, causing delays and friction.

Specific problems included unclear ownership, missing involvement of BA and Data teams, and absent decision documentation. The organisational structure reinforced local optimisation over shared outcomes.

These problems were connected. Without shared goals, teams set their own priorities. Without cross-team accountability, conflicts surfaced only when work was already blocked. Without decision documentation, the same disagreements recurred.

Constraints: workshop time was limited to two 90-minute sessions over two days. Groups worked without external moderation.

Phase 1 – Diagnosis. A Collaboration & Performance Survey was distributed to all 47 people in the department; 25 responded. The survey used Likert scales (1–5) across multiple collaboration dimensions. Analysis focused on the lowest- and highest-scoring items, plus a multiple-choice question identifying collaboration barriers. Free-text responses provided qualitative context.
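The ranking step in this phase – surfacing the lowest- and highest-scoring Likert items – can be sketched in a few lines of plain Python. The dimension names and response values below are hypothetical placeholders for illustration, not the actual survey data:

```python
from statistics import mean

# Hypothetical Likert responses (1–5) per collaboration dimension.
# Keys and values are illustrative only, not the real survey data.
responses = {
    "shared_goals":        [2, 1, 3, 2, 2],
    "decision_docs":       [1, 2, 2, 1, 3],
    "cross_team_planning": [3, 2, 4, 3, 3],
    "tooling":             [4, 4, 3, 5, 4],
}

# Rank dimensions by mean score to surface the lowest- and
# highest-scoring items, as in the Phase 1 analysis.
ranked = sorted(responses.items(), key=lambda kv: mean(kv[1]))

lowest, highest = ranked[0][0], ranked[-1][0]
print(lowest, highest)
```

With these placeholder numbers, the lowest-scoring dimension would flag the most urgent collaboration gap, feeding directly into the problem clustering of Phase 2.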

Phase 2 – Problem Structuring. Survey findings were synthesised into a problem landscape and clustered into prioritisable topic areas. For each area, a problem-oriented "How Might We" question was formulated.

Phase 3 – Workshop (2 days). Seven groups of 5–6 people each received randomly assigned problem fields. A separate leadership group of four worked on the goal-alignment problem. Each group followed a structured format ("From HMW to One Clear Initiative") using printed workshop guides. The target output: one concrete initiative per group, defined by action point, owner, and metrics.

Day 1 focused on developing initiatives within groups. Day 2 began with a Gallery Walk and Silent Reading phase for cross-group reflection, followed by a joint prioritisation of all initiatives.

No external moderation. Groups worked independently with written instructions. The rationale was to build ownership: if groups developed their own initiatives without a facilitator steering outcomes, commitment to follow-through would be higher. The trade-off was less control over workshop quality across groups.

Random assignment of problem fields. Teams did not choose their topics. This prevented groups from gravitating toward familiar territory and forced engagement with problems outside their direct experience. Whether alternative assignment methods were considered is not documented.

Initiatives over analysis. The workshop was scoped to produce concrete, actionable initiatives rather than a comprehensive organisational analysis. This kept the output decision-ready within the time constraint, but left broader structural questions unaddressed.

Scope expansion beyond formal mandate. The initiative extended beyond the formal product design scope to address an organisational problem that directly affected design team effectiveness. Collaboration bottlenecks between teams were slowing down product design work, making a department-wide diagnostic a practical prerequisite.

Delivered: Survey dataset (n=25 of 47, providing qualitative indicators rather than statistically robust findings), problem landscape with prioritised topic areas, HMW questions for each problem field, workshop structure and printed group guides, and a prioritised list of initiatives with owners and proposed metrics (evidence: high – artefacts documented).

Not documented: Post-workshop implementation status. Which initiatives were executed, whether proposed metrics were tracked, and what measurable change resulted are not recorded. Qualitative feedback on the workshop itself is also not documented.

Organisational structure shapes collaboration more than individual team behaviour. The survey confirmed what was observable: teams optimised locally because the structure incentivised it. Changing collaboration patterns requires changing the conditions under which teams operate, not asking them to collaborate better.

Diagnosis without implementation tracking is incomplete. The initiative produced a clear problem map and concrete initiatives, but no mechanism was established to track whether those initiatives were executed or effective. In retrospect, building a review cadence into the workshop output would have closed this gap.

Running workshops without moderation requires strong written materials. The printed guides carried the full weight of facilitation. This worked within the constraints, but the variance in group output quality is an open question – no systematic comparison across groups was conducted.

Whether the proposed initiatives produced lasting change in cross-team collaboration is not documented.

