SaaS

Challenge

Verizon had noticed that customers of its IoT ThingSpace platform were barely using it: active usage had fallen to less than 10% of its B2B customers, despite all the advertised features and capabilities and the high subscription fees these customers paid.

Verizon hired me as a principal consultant to find out why this was happening and design a solution.

I laid out a roadmap of actions and processes to achieve this: discover and research the problem, pinpoint the issues, design a solution, then test and repeat.

I carried out the following action plan:

  • UX audit of the platform using usability heuristics, scoring the results against standard benchmarks
  • Contextual inquiry: field research in which I visited customers at their workplaces to observe them using the application
  • Usability testing of the existing platform
  • Analysis of frustration signals such as rage clicks, dead links, and mouse thrashing (a minimal detection sketch follows this list)
  • Remote interview sessions with salespeople and customers
  • Prioritization and categorization of all feedback and reviews
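
How a frustration signal like a rage click can be flagged from raw click logs is sketched below. This is a minimal, illustrative Python example under assumed data: the event schema and the thresholds (three clicks within one second on roughly the same spot) are my assumptions, not the actual ThingSpace analytics tooling or configuration.

```python
# Illustrative sketch: flag rage clicks (repeated clicks on roughly the same
# spot within a short time window) from a list of click events.
# The event schema and thresholds below are assumptions, not project tooling.
from dataclasses import dataclass
from typing import List

@dataclass
class ClickEvent:
    ts: float      # seconds since session start
    x: int         # click x-coordinate
    y: int         # click y-coordinate

def find_rage_clicks(events: List[ClickEvent],
                     min_clicks: int = 3,
                     window: float = 1.0,
                     radius: int = 30) -> List[List[ClickEvent]]:
    """Return bursts of >= min_clicks clicks that happen within `window`
    seconds and land within `radius` pixels of each other."""
    events = sorted(events, key=lambda e: e.ts)
    bursts = []
    start = 0
    for end in range(len(events)):
        # Shrink the window from the left until it satisfies the time constraint.
        while events[end].ts - events[start].ts > window:
            start += 1
        cluster = events[start:end + 1]
        close_together = all(
            abs(e.x - cluster[0].x) <= radius and abs(e.y - cluster[0].y) <= radius
            for e in cluster
        )
        if len(cluster) >= min_clicks and close_together:
            bursts.append(cluster)
    return bursts
```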

Research Results

After collecting all the qualitative and quantitative data above, I identified several usability issues that appeared to be driving customers to ignore the platform and instead rebuild the same features in their own in-house applications using the platform API, a risky situation for the platform. Below is a summary of my findings:

  • Recognition rather than recall: On average, simple and common tasks took far too long and required too many steps to complete.
  • Visibility of system status: Users easily got lost in the application because there was no clear sense of their location within it.
  • Feedback: Many actions produced no feedback at all. I heard and observed users complaining that they did not know whether their actions had taken effect.
  • Error prevention: Most actions on the platform could critically affect machines somewhere in the ecosystem, so users were afraid to click anything.
  • Help users recognize, diagnose, and recover from errors: Critical and destructive actions lacked confirmation. One user complained of several instances where items were deleted by mistake; it was far too easy for things to be removed, and there was no recovery or cooling-off period for deleted items.
  • Consistency and standards: Too much technical jargon, much of it non-standard in the customers' particular industries, which increased friction and confusion.
  • Flexibility and efficiency of use: The platform offered no personalization. Different sectors used it, but there was no easy, flexible, and intuitive way to tailor it to a specific industry.

On presenting this analysis to management, I got buy-in to start designing solutions.

Summary of solution process

Product strategy

  • Design thinking to establish red routes and mental models for the platform
  • UX auditing and usability testing
  • Field Research and Interviews
  • Tested prototypes
  • Error reporting

Design deliverables

  • Dashboard redesign
  • ODS security design
  • Devices Groups
  • Alerts
  • Firewall
  • Profiles
  • Reports

Measuring results

  • Quick animated click-through designs to rapidly test solutions
  • MVP prototype testing
  • Measuring results using analytics
  • Checking metrics to see if usage had improved
  • Reports
  • A/B testing of design solution options (a minimal significance-check sketch follows this list)
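
For the A/B comparisons, one simple way to check whether a redesigned flow genuinely outperforms the old one is a two-proportion z-test on a conversion-style metric such as task completion. The sketch below is illustrative only; the function and the counts in the example are assumptions, not data or tooling from the project.

```python
# Illustrative sketch: two-proportion z-test for comparing task-completion
# rates between an old design (A) and a redesigned variant (B).
from math import sqrt, erf

def two_proportion_z(success_a: int, total_a: int,
                     success_b: int, total_b: int):
    """Return the z statistic and two-sided p-value for the difference
    between B's completion rate and A's."""
    p_a = success_a / total_a
    p_b = success_b / total_b
    pooled = (success_a + success_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers, purely to show the calculation.
z, p = two_proportion_z(success_a=120, total_a=400, success_b=168, total_b=400)
print(f"z = {z:.2f}, p = {p:.3f}")
```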

The Old Design

The New Design Thinking Solution

Information Architecture 

Redesigned the site taxonomy using card-sorting techniques to make features and labels easier and more intuitive for users to locate.

Used interactive infographics and animated data graphics placed alongside data tables to make the data easier to understand.
