What makes the biggest difference to end-user IT support experiences? What would you say (even if it's a guess)? Speed, perhaps? Or, posing the question differently, what does your IT service desk's customer satisfaction (CSAT) questionnaire feedback tell you about the factors that strongly influence end-user experience, across good, indifferent, and poor experiences alike? I'd be willing to guess that you receive too few questionnaire responses, and too little detail in those you do receive, to tell what's making the most significant difference to end-user IT support experiences. So what can you do?
Start by questioning the experience-based value of your current metrics
IT organizations have long had a wealth of industry metrics to choose from, especially for the management of the corporate IT service desk. But how fit-for-purpose are these traditional IT-metric portfolios for measuring how well IT is doing from the end-user perspective?
Think for a minute about the IT metrics your IT service desk employs. How many of them relate to operational performance, in that they measure how well activities are undertaken rather than what's achieved? I'd be willing to bet that many concern "numbers of things," such as ticket volumes, or "the speed of things," such as average ticket handling time. But what about metrics that tell you how well your IT service desk is enabling employees and improving business operations and outcomes? There's, of course, the aforementioned CSAT questionnaire that many IT organizations employ, but what insight does this really give your IT organization into end-user IT support experiences?
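To make the contrast concrete, here's a minimal Python sketch that computes two traditional "numbers and speed" metrics alongside a basic CSAT-style score from a handful of hypothetical ticket records. The field names and values are illustrative only, not from any particular ITSM tool, and note how the CSAT figure rests on however few questionnaire responses happen to exist.

```python
# A minimal sketch contrasting traditional, operationally focused service
# desk metrics with a simple experience measure. Field names and sample
# tickets are hypothetical, not from any particular ITSM tool.

tickets = [
    {"handle_minutes": 12, "csat": 9},     # hypothetical ticket records
    {"handle_minutes": 45, "csat": 4},
    {"handle_minutes": 8,  "csat": None},  # no questionnaire response
]

# "Numbers of things" and "speed of things" are easy to compute but say
# little about what was achieved for the employee.
volume = len(tickets)
avg_handle = sum(t["handle_minutes"] for t in tickets) / volume

# A basic CSAT-style measure: the share of responses scoring 9-10.
# Only tickets that actually have a response can be counted.
responses = [t["csat"] for t in tickets if t["csat"] is not None]
csat_pct = 100 * sum(1 for s in responses if s >= 9) / len(responses)

print(f"{volume} tickets, {avg_handle:.1f} min avg handling, "
      f"CSAT {csat_pct:.0f}% from {len(responses)} responses")
```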
I could dig into the many flaws of CSAT questionnaires here, but I'll instead "cut to the chase": end-user perceptions of their organization's IT capabilities, including service and support, are commonly below what the many traditional IT metrics are telling IT leadership. This is partly a long-held expectations gap, because traditional IT metrics are supply-side rather than demand-side focused, but it's also an experience gap, accentuated by IT organizations usually having little or no insight into end-user experiences.
Then look at the insight experience data provides
There are many ways in which IT organizations can use collected employee experience data to gain greater insight into the delivered employee experience, to highlight and understand the impact of issues, and to focus improvement activities in the right areas. For example, consider how lost employee productivity differs by IT support channel (the data below is aggregated across HappySignals customers rather than taken from an individual customer). It's important to note that this data covers organizations that have already invested in identifying and addressing experience-related issues, rather than IT service providers per se.
This channel-based data shows that, on average, employees lose 78% more productivity when using the corporate IT self-service portal than when using the telephone channel. I'd bet this isn't what your IT organization expected when it introduced its self-service capability to make operations and outcomes "better, faster, and cheaper." It's also a key reason why employee self-service portal adoption levels, and the associated return on investment (ROI), are far lower than expected.
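As a rough illustration of how a comparison like this can be produced, here's a minimal Python sketch that derives a per-channel "lost productivity vs. phone" figure from average lost minutes per ticket. The channel names and minute values are hypothetical placeholders (chosen so the portal comes out at roughly +78%), not HappySignals benchmark figures.

```python
# A minimal sketch of how a channel comparison like the one above can be
# derived from per-channel averages. The numbers here are hypothetical
# placeholders, NOT HappySignals benchmark figures.

# Average minutes of productivity an employee reports losing per ticket,
# grouped by the support channel used (hypothetical values).
avg_lost_minutes_by_channel = {
    "portal": 220.0,   # self-service portal (hypothetical)
    "phone": 123.6,    # telephone (hypothetical)
    "chat": 150.0,     # live chat (hypothetical)
}

baseline = avg_lost_minutes_by_channel["phone"]

for channel, lost in sorted(avg_lost_minutes_by_channel.items()):
    relative = (lost / baseline - 1) * 100
    print(f"{channel:>6}: {lost:6.1f} min lost/ticket "
          f"({relative:+.0f}% vs. phone)")
```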
We can dig deeper into the experience end users receive when helped by IT support staff, because experience data can also highlight the key factors that affect the end-user experience. For example, the aggregated Global IT Experience Benchmark Report H2/2021 data shows that the top three contributing factors for employee happiness with IT (a feedback score of 9-10) were as follows (a sketch of this score-bucketing approach appears after the lists below):
- Speed of service
- Service personnel’s attitude
- Service personnel’s skills
Whereas unhappy employees (who gave a feedback score of 6 or lower) highlighted:
- That their issue wasn’t solved despite ticket closure
- Slowness of service
- Having to reexplain the issue and provide details repeatedly, i.e. being bounced between people
For completeness, the key factors contributing to neutral feedback responses (a score of 7-8) were:
- Speed of service
- Having to reexplain the issue and provide details repeatedly
- It was difficult for the employee to know where to start
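To make the score bands concrete, here's a minimal Python sketch of how such feedback could be bucketed into happy (9-10), neutral (7-8), and unhappy (0-6) groups, with the cited factors tallied per group. The sample responses and factor labels are hypothetical, and this isn't the HappySignals implementation.

```python
from collections import Counter

# A minimal sketch of the bucketing used above: feedback scores of 9-10
# count as happy, 7-8 as neutral, and 6 or lower as unhappy. The sample
# responses and factor labels are hypothetical, not benchmark data.

def bucket(score: int) -> str:
    if score >= 9:
        return "happy"
    if score >= 7:
        return "neutral"
    return "unhappy"

# Each response pairs a 0-10 score with the factors the employee selected.
responses = [
    (10, ["Speed of service", "Personnel's attitude"]),
    (8,  ["Speed of service"]),
    (3,  ["Issue not solved despite ticket closure", "Slowness of service"]),
    (5,  ["Had to reexplain the issue repeatedly"]),
]

# Tally which factors employees cite most often within each bucket.
factor_counts = {"happy": Counter(), "neutral": Counter(), "unhappy": Counter()}
for score, factors in responses:
    factor_counts[bucket(score)].update(factors)

for group, counts in factor_counts.items():
    print(group, counts.most_common(3))
```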
Such data is helpful when viewed one dimension at a time, but it's even more revealing when compared across the happy, neutral, and unhappy groups. For example, while service desk analysts' attitude is the second-highest factor in making employees happy, it's virtually insignificant in making them unhappy (it's the second-lowest factor contributing to employee unhappiness).
These examples only scratch the surface of what’s influencing end-user IT experiences – including the factors that cause both end-user happiness and unhappiness (and the associated adverse impact on employee productivity) – but hopefully, they’re enough for you to realize that you need to know more. If you would like to learn more about how to better understand and manage your end-user IT support experience, please get in touch.