Company:
Deliverable:
Role:
Timeframe:
4 Months
Industry/Category:
Customer Support Platform, B2B
Improving a customer support platform that didn’t adapt to the real-world workflows of different businesses. Each business had its own preferred customer engagement channels (voice, chat, email, social), but the platform lacked defined processes to support this variety, or to reflect how each client prioritised one channel over another.
By identifying these gaps, remapping the task flows, and designing around actual customer service agent behaviours, I helped transform a low-adoption tool into a daily-use platform that clients rely on to manage multi-channel customer support, improving task completion rates, reducing handling time, and giving clients clearer metrics to track and improve team performance.
The Problem
The Nubitel CX Agent Workbench was designed to streamline customer service operations by consolidating multiple communication channels into a single interface for improved agent productivity and seamless customer experience.
But it wasn’t landing…
The app’s low adoption rates signalled a disconnect between what the platform could do and what users actually needed. Feedback from clients revealed inefficiencies in managing customer interactions through their preferred channels: the earlier version of the product had no clearly defined workflows to support these diverse use cases, leading to a frustrating user experience.
My Role
I was brought in to uncover the bottlenecks in the agent experience and lead UX updates that would improve usability, efficiency, and adoption of the Agent Workbench platform.
Redefined key user personas with a focus on agents:
With the majority of platform users being agents, I focused on the agent personas, homing in on their job scope requirements, day-to-day workflows, pain points, and KPIs. This focus greatly clarified priorities for the redesign.
Mapped use-case-specific workflows:
I then built detailed UX flows that reflected how agents handled tasks across channels, revealing gaps, redundancies, and opportunities to streamline the current workflows.
Defined key success metrics:
Based on how agents were evaluated, I introduced two KPIs, Task Completion Rate and Time on Task, to help clients better measure team performance. These metrics also gave both the client and Nubitel a way to assess the platform’s usability and its impact on agent productivity.
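To make the two KPIs concrete, here is a minimal sketch of how they could be computed from an agent’s interaction log. The data shape and function names are illustrative assumptions, not the platform’s actual schema or API:

```python
from datetime import datetime, timedelta

# Hypothetical interaction log: (started_at, ended_at, resolved) per interaction.
interactions = [
    (datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 9, 4), True),
    (datetime(2024, 1, 1, 9, 10), datetime(2024, 1, 1, 9, 22), False),
    (datetime(2024, 1, 1, 9, 30), datetime(2024, 1, 1, 9, 36), True),
]

def task_completion_rate(records):
    """Share of interactions that ended in a successful resolution."""
    return sum(1 for _, _, resolved in records if resolved) / len(records)

def avg_time_on_task(records):
    """Mean handling time per interaction, as a timedelta."""
    total = sum((end - start for start, end, _ in records), timedelta())
    return total / len(records)

print(f"Task Completion Rate: {task_completion_rate(interactions):.0%}")  # 67%
print(f"Average Time on Task: {avg_time_on_task(interactions)}")          # 0:07:20
```

In practice these figures would be aggregated per agent, per team, and per channel, which is what makes them useful both for client-side performance reviews and for tracking the redesign’s impact over time.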
Working closely with Product and Engineering, I guided the required iterations and prototyping of the updated designs.
Introduced a job status menu feature for supervisors to monitor campaigns and agent activity in real time.
Built a team overview module displaying live interaction stats and KPI tracking for agents.
Added gamification features to reward successful resolutions and increase agent engagement.
Sharing edge cases and error validation screens during design-dev handoff made a noticeable difference, giving engineers better context, reducing back-and-forth, and making collaboration feel smoother and more aligned.