The 10th Global IT Experience Benchmark: H1/2023

This report presents the findings from 860,339 end-user responses in over 130 countries. Discover our latest ITSM benchmarks as well as insights into response rates and the relationship between Time to Resolve and end-user perceived lost time with IT incidents.

HS_Report_hero

Digital vs. Human Experience: Understanding our Benchmark for IT Experience

If you're a CIO, IT Leader, Experience Owner, or Service Owner, you need to understand what the word "experience" refers to in the context of DEX, XLAs, and other three-letter acronyms.

The HappySignals Global IT Experience Benchmark tells the story from an internal end-user perspective based on their experiences with IT. It identifies how people feel about IT across different IT touchpoints. Every response in the data represents an actual human relying on IT to do their daily work in large enterprises.
 
The data in this half-year report is based on 860,339 end-user responses from 130+ countries.


 


Key Takeaways


  • End-user perceived lost work time per reassignment has increased, making it more important for IT to understand the impact of reassignments.

  • Response rates vary significantly across countries, providing valuable insights into where end-users may not believe their feedback matters.

  • The faster tickets are solved, the higher the response rate. Our data shows a clear inverse relationship between how long a ticket stays open and the response rate.

  • For the first time, remote work satisfaction surpasses IT service satisfaction, indicating strong end-user appreciation for remote work capabilities and benefits.


The ITXM landscape in late 2023

The IT experience landscape is currently marked by two dominant approaches. HappySignals advocates direct end-user feedback, while Digital Employee Experience (DEX) relies on endpoint data and KPIs for aggregated scores. A growing industry trend combines both data sources, focusing on enhancing the modern digital workplace experience. Understanding the benefits and trade-offs is crucial for IT organizations in this rapidly evolving field.

Defining and Measuring Experience

To navigate this landscape, IT organizations must first define and measure the user experience. At HappySignals, we prioritize a human-centric approach, where the end-user's perspective takes precedence. This perspective is similar to the experience of taking a car to the garage for a warning light and returning to find it fixed — for the end-user, the experience is the end-to-end view, while the diagnostic and resource planning steps are for internal purposes.

People, process, technology... in that order.

In the world of internal IT, users often lack choice in which technologies and services they can use. This means certain telemetry-based metrics, such as usage time, don't always indicate more than how long the application was used. Just because someone uses an application extensively doesn't guarantee a positive experience; the end-user may simply not have known how to use the application correctly, which is why it took so long. We believe a human-centric IT experience approach helps drive better decision-making and highlights improvements that matter most to end-users. While process and technology analysis play a role, they can't replace direct end-user feedback.

Definition of experience in XLAs?

Choosing to prioritize understanding end-users means asking end-users how they feel, preferably on a continuous basis, to get a steady stream of experience data. Because this relies on end-users choosing to give feedback, some can and sometimes will choose not to provide it. This is why response rates are often a topic of discussion when new customers are getting onboarded.

Some feel that the reliance on end-users to provide feedback can be replaced with process and technology data, omitting end-user feedback from XLAs. We believe this can lead, and has already started to lead, to "XLA Watermelons", reproducing the very symptoms that created the need for the "experience movement" in the first place.

Gartner recently highlighted this when studying how XLAs currently work in engagements with IT service providers, stating:

  • "... (IT) providers often disguise legacy SLAs as XLAs for clients, driving the wrong perception about the effectiveness of XLAs."

  • Clients often lack a proper understanding of what an XLA is and, therefore, are unable to articulate their requirements properly, leading to uncertainties and dissatisfactions related to implementation.

Humans are still the best sensor

It can sound cliché, but humans are still the best sensor when evaluating the actual quality of IT experience. While technology metrics and data analysis can provide valuable insights, they can never fully capture the nuances and emotions that users experience. The subjective feedback from end-users is essential to understand how IT services are truly perceived and utilized.

By prioritizing the voice of the end-user, organizations can gain a deeper understanding of their needs, frustrations, and expectations. This enables IT decision-makers to make informed choices that align with the actual user experience, leading to improvements that matter most to the people using the technology.

Choosing a human-centric IT experience acknowledges that technology alone cannot fully capture the user's perspective.

We hope that this benchmark report can help you, as representatives of IT organizations, better understand how you can improve your IT experience in alignment with end-user needs and expectations.

 

2023-IT-experience-management-people-process-tech

 


Data source and collection methodology

The Global IT Experience Benchmark H1/2023 Report presents and analyzes data from 860,339 end-user responses collected through the HappySignals IT Experience Management Platform between January and June 2023.

On this webpage, we primarily present the H1/2023 data, along with a few selected trends over a longer period of time.

Where does the data come from?

Our benchmark data is collected from all HappySignals customers. These include large enterprises, as well as Managed Service Providers (MSPs) who use the HappySignals Platform with their customers – enterprises and public sector organizations. 
 
Around 60% of HappySignals customers are using Outsourced Service Desk providers. 
 
All responses are from IT end-users – employees using internal IT services – and reflect their feelings and perceptions about IT. 

How is the data gathered?

HappySignals IT Experience Management Platform connects operational data (e.g. from customers’ IT service management (ITSM) platforms) with continuous survey data from end-users about Ticket-based IT and Proactive IT areas.

Ticket-based IT (Incidents and Requests):
End-user responses are immediately collected when tickets are resolved. Surveys are sent after each ticket, asking end-users to accept the resolution by giving feedback about their experience. The average response rate for HappySignals customers in 2023 is around 22%, with variations between different companies and geographies.

Proactive IT:
Surveys are sent proactively to end-users about Proactive IT areas (e.g. Overall IT Experience, Enterprise Applications, Laptops and Computers, Remote Work, Office Environment) rather than in connection with tickets. These surveys can be scheduled to target relevant end-users at optimal frequencies, enabling continuous measurement of non-ticket-based IT areas. 

What data is gathered?


Happiness:
End-users rate how happy they are with the IT area being measured (e.g. recent ticket-based service experience, Enterprise Applications, Mobile Devices, etc) on a scale from 0-10.


HappySignals then calculates Overall Happiness as the percentage of 9-10 scores minus the percentage of 0-6 scores, giving a number between -100 and 100.
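
To make the calculation concrete, here is a minimal sketch of how such a promoter-minus-detractor score can be computed from raw 0-10 ratings. The function name and the sample ratings are illustrative assumptions, not the HappySignals implementation.

```python
# Illustrative only: computing a promoter-minus-detractor score from raw 0-10
# ratings. The function and the sample data are hypothetical, not the
# HappySignals implementation.
def overall_happiness(ratings):
    """Return (% of 9-10 scores) - (% of 0-6 scores), a value from -100 to 100."""
    if not ratings:
        raise ValueError("no ratings to score")
    positives = sum(1 for r in ratings if r >= 9)
    negatives = sum(1 for r in ratings if r <= 6)
    return 100 * (positives - negatives) / len(ratings)

# 6 positive, 2 neutral (7-8), and 2 negative responses give a score of +40.
print(overall_happiness([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]))  # 40.0
```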

Productivity:
End-users estimate how much work time they lost due to the IT touchpoint being measured.

Factors:
End-users select from a list of suggested reasons – which we call Factors – that influenced their Happiness rating. Multiple factors can be selected. These factors could be seen as experience indicators, a term also used in the IT Experience Management area.

The surveys automatically tailor the factors shown to each end-user depending on what IT area is being measured and whether the Happiness rating given in the first question was positive, negative, or neutral. Examples of factors include “It was difficult to know where to start” (Ticket-based Services) and “Applications are too slow” (Enterprise Applications).

 

Chapter_1_HSReport

Chapter 1:

Experience across IT

IT Happiness across all measurement areas

Key insights

  • Ticket-based services (incidents and requests), Collaboration with IT, and Remote Work are still the highest-rated areas of IT. 
  • Service Portal Experience has gone up again and reached +46, a new high since we started measuring it right after the Covid pandemic broke out.  
  • While end-users feel they lose more time with Remote Work than with Office Environment-related issues, they still rate Remote Work +41 points higher than Office Environment.

 

H1_23_Global_IT_Experience_Benchmark-IT-Experience-Across-Measurement-Areas

 

Measurement Areas H2/2022 Happiness H1/2023 Happiness
Overall IT Experience +39 +39
Services +79 +80
Collaboration with IT +85 +83
Remote Work +80 +82
Service Portal +43 +46
Office Environment +42 +43
Mobile Devices +9 +5
Laptops and Computers +20 +15
Enterprise Applications +15 +9

 

 

NB! Numbers may vary from the last report due to data quality improvements, which have been applied to historical data as well for better comparability. Scores are calculated with the same mathematical model as NPS. (Read about What is the difference between NPS and HappySignals?)

 

What is the business impact of ITXM on the Overall IT Experience?

One of the common mistakes is assuming which IT touchpoints make IT end-users happy. New customers are often surprised by the touchpoints highly rated by their end-users. Contrary to popular belief, IT services are frequently among the most highly rated IT areas.

If the goal is to enhance the overall employee experience with digital technologies, it's crucial to be aware of which areas are liked the most and the least.

Real-time experience data across different IT touchpoints provides valuable insights that facilitate conversations between IT, HR, and business functions.

Having a comprehensive understanding of the IT experience enables the company to allocate resources based on employee feedback data rather than the gut feelings of leadership team members. This, in turn, leads to a higher success rate in digital transformation projects.

 

Chapter_2_HSReport

Chapter 2:

Overall IT Experience - What really matters?

In this section, we look at the concept of Overall IT Experience. It is not an average of our other IT experience measurement areas; rather, it's a distinct measure capturing the overall perception of IT within the organization. Think of it as the "reputation" of IT within the enterprise.

The Overall IT Experience survey is an alternative to annual IT surveys. It provides continuous insights into how people generally feel about IT and the amount of time end-users perceive as lost each month due to IT-related issues. After end-users provide their scores on a scale of 0 to 10, they're asked to identify which specific aspects of IT influenced their ratings.

Below, we present the findings of the areas that end-users selected as contributing factors for their given scores. This data offers a quick and digestible view of the factors shaping end-user perceptions of IT services.

 
H1_23_Global_IT_Experience_Benchmark-What-Areas-Impact-Overall-ITX
 
Understanding what contributes to the Overall IT Experience for end-users can be done by looking at how often certain factors are selected in negative, neutral, and positive experiences.
 
These results indicate that when end-users are asked why they're happy or unhappy, IT support services have the biggest impact on their overall IT experience. 
 
If certain factors appear frequently with positive responses but hardly ever with negative ones, it suggests that this aspect of the Overall IT Experience is important for a good experience but not so crucial in causing a bad one.
 
Positive Scores
Based on the percentages of factors selected by end-users giving a positive score (9 or 10) for IT service management, here are some conclusions:

  • IT Support Services and IT Personnel’s Attitude are Critical: The fact that 58% of respondents selected IT Support Services and 57% selected IT Personnel’s Attitude as contributing factors suggests that the performance and behavior of IT support staff play a pivotal role in end-user satisfaction. It indicates that responsive, helpful, and friendly IT support is highly valued.
  • Quality IT Communication and Training Matters: The 18% selection of IT Communication and Training as a contributing factor shows that effective communication and comprehensive training are essential for end-users to have a positive experience. This suggests that clear communication about IT changes and user education can enhance satisfaction.
  • Remote Work Tools & Support is Significant: With 29% selecting Remote Work Tools & Support, it's evident that the ability to work remotely and the support provided for remote work significantly impact end-user satisfaction. This may be particularly relevant given the increasing importance of remote work in the modern workplace.
  • Hardware and Software are Important but not Dominant: While Computers & Equipment (30%), Work Applications (22%), and IT Security (19%) are important, they're not the most frequently selected factors. This suggests that having reliable hardware and software is expected, but it's the human and support aspects of IT that strongly influence overall satisfaction.
  • Mobile Phones and Office IT Facilities have Limited Impact: The relatively low selection of Mobile Phones (13%) and Office IT Facilities (17%) as contributing factors implies that, in the context of work in large enterprises, these factors have less influence on end-user satisfaction compared to others.
  • Potential for Alignment with Research: These findings align with existing research that emphasizes the importance of strong IT support, employee training, and remote work capabilities in ensuring user satisfaction in the workplace.

In conclusion, the most critical factors for end-user satisfaction in the context of IT service management in large enterprises are the quality of IT support services, the attitude of IT personnel, and effective communication and training. While technology and hardware are important, they're not the primary drivers of user satisfaction in this context. These findings emphasize the human aspect of IT support and service as the key to positive user experiences.

Neutral Scores

Based on the percentages of factors selected by end-users giving a neutral score (7-8) for IT service management, we see some differences compared to the positive scores.


  • Hardware and Equipment improvements could make people happier: The fact that 37% of respondents selected Computers & Equipment as a contributing factor and 16% selected Mobile Phones suggests that hardware-related issues are more prominent for users with neutral satisfaction. This indicates that ensuring the reliability and quality of devices is critical to improving their experience.
  • IT Support Services Remain a Consideration: While IT Support Services were selected by 34% of respondents, they were less dominant compared to those giving positive scores. It implies that some users with neutral satisfaction might be seeking better or more responsive support, but it's not as significant a concern as for those with negative scores.
  • Importance of Work Applications and Remote Work Tools: Work Applications (18%) and Remote Work Tools & Support (22%) are factors that users with neutral scores are concerned about. This underscores the importance of software applications and remote work capabilities in shaping their satisfaction.
  • Limited Focus on IT Personnel’s Attitude: Unlike those with positive scores, users with neutral satisfaction (8%) selected IT Personnel’s Attitude as a contributing factor to a much lesser extent. This may indicate that their primary concerns lie more with technical aspects than with the attitude of IT personnel.
  • Minimal Emphasis on IT Security: Only 4% of users with neutral scores selected IT Security as a contributing factor. This suggests that, for this group, security concerns are not a significant factor in their neutral satisfaction level.
  • Potential Areas for Improvement: The responses from users with neutral scores provide insights into specific areas for improvement. Addressing hardware issues, enhancing work applications and remote work tools, and maintaining a responsive IT support service is crucial to shifting neutral experiences towards more positive satisfaction.

Negative Scores

Based on the percentages of factors selected by end-users giving negative scores (0-6) for IT service management, here are some conclusions you might draw:

 

  • IT Support Services are the Primary Pain Point: A significant 62% of respondents selected IT Support Services as a contributing factor to their negative experience. This underscores the critical importance of responsive and effective IT support in addressing end-user dissatisfaction.
  • Hardware and Equipment Problems are Prominent: The selection of Computers & Equipment (41%) and Mobile Phones (17%) as contributing factors highlights hardware issues as a significant concern for users with negative scores. This indicates that equipment problems can lead to a poor user experience.
  • Challenges with Work Applications and IT Communication: Work Applications (26%) and IT Communication and Training (25%) are also factors contributing to negative experiences. This suggests that issues related to software applications and ineffective communication/training can be sources of dissatisfaction.
  • Importance of Office IT Facilities: The 24% selection of Office IT Facilities indicates that the state of the workplace's IT facilities plays a significant role in shaping negative experiences. This may include issues with infrastructure and facilities provided to employees.
  • Limited Emphasis on IT Security: Similar to users with neutral scores, only 4% of users with negative scores selected IT Security as a contributing factor. This suggests that security concerns are not a primary driver of their dissatisfaction.
  • Moderate Concerns about IT Personnel’s Attitude: While 20% selected IT Personnel’s Attitude, it's not as prominent a factor for dissatisfaction as support services. This indicates that while the attitude of IT personnel is a concern, it's overshadowed by more pressing issues.

Conclusion

End-user happiness with IT services is an outcome influenced by a combination of factors, including both people and processes. Positive experiences are largely driven by the quality of IT support services, the attitude of IT personnel, and effective communication and training, highlighting the paramount role of the human element in delivering satisfactory IT services. However, the analysis also reveals that hardware and software-related concerns, such as Computers & Equipment and Work Applications, are equally vital in shaping both neutral and positive experiences.

In neutral experiences, the focus shifts to maintaining a balance between addressing hardware and software issues while ensuring the quality of IT support services. This suggests that neutral satisfaction is a result of addressing both people-centric and process-centric factors, emphasizing the significance of both the human touch and effective operational procedures.

In contrast, negative experiences are primarily process-related, with IT support services, hardware problems, and software applications playing central roles in shaping dissatisfaction. This highlights the critical need for process improvement, but it doesn't negate the importance of the human element in delivering IT services.

Ultimately, IT service management needs to consider a holistic approach to ensure a positive IT experience for end-users in a large enterprise setting.

 

Chapter_3_HSReport

Chapter 3:

Geographical differences in IT Experience

Like in previous reports, we observe cultural differences in how end-users perceive IT services and lost time. For example, end-users in Western Europe rate their happiness with resolved incidents lower than those in Eastern Europe despite reporting less lost time. 
 
In this report, we did additional data-quality work to eliminate the unwanted impact of high response volumes in certain countries on the average scores.

Key findings for regional differences in experience with incident resolutions

  • Western European end-users are still the most critical (+74) despite losing less time (2h34min) than in other regions.
  • Eastern European and South American end-users are the happiest (+88) with how incidents are resolved.
  • The spread of the happiness score between regions is higher with incidents than with requests. 

 

H1_23_Global_IT_Experience_Benchmark-Country-differences-incididents-experience-productivity

    

 

line

Happiness with fulfilled requests in different regions


Regional differences between incidents and requests

We observe interesting regional differences in how Happiness and Lost Time vary between incidents and requests.

These numbers appear to suggest that Western European end-users are more demanding when it comes to incident handling, while North American end-users have higher expectations for requests. We'll continue to track these numbers and report more in-depth in future reports.

Findings for regional differences in experience with resolved requests

  • Eastern Europe, the Middle East, and South America have the highest happiness with requests (+88) but display some differences in lost time, especially in South America, where the average lost time is 6h 20min.

FINAL-H12023-Global IT Experience Benchmark_Webinar Presentation

Key insights

Different cultures perceive and evaluate IT services in different ways. A specific score in one region is not directly comparable to the same score in another region. Having comparable benchmark data helps set expectations and provides an external angle for a better understanding of end-user experience.

 

How to use this information in practice

IT service desk leaders can compare their scores to the country benchmark data to choose which countries to focus on. Using the comparison to benchmark data (in addition to internal averages) can help avoid pushing agents towards unachievable goals or, conversely, avoid getting too comfortable in regions where higher scores are culturally more common.

  

Regional differences in response rates

Understanding the variation in IT survey response rates across different countries provides a perspective that the overall average fails to capture. These variations can be attributed to cultural differences, local work dynamics, and user expectations. Taking these considerations into account allows you to work specifically on communication methods and support approaches to encourage end-users to provide feedback. Making sure end-users know their feedback actually matters is the best way to drive higher response rates.  

 

H1_23_Global_IT_Experience_Benchmark-Response-rate-varies-between-countries

What influences survey response rates? 

In simple terms, if survey recipients don't believe their response will make a difference, their motivation to fill out the survey will be low.

McKinsey studied this in more detail in this article and concluded:

"A common belief is that survey fatigue is driven by the number and length of surveys deployed. That turns out to be a myth. We reviewed results across more than 20 academic articles and found that, consistently, the number one driver of survey fatigue was the perception that the organization wouldn’t act on the results."

Therefore, when looking at response rates across different countries, we encourage you to consider if the end-users in low response rate locations really feel that their voice matters as much as those in high response rate countries. 

Chapter_4_HSReport

Chapter 4:

Response rate findings

Response rate vs average response rate per end-user

In preparing this benchmark report, we looked at the initial response rate data and scratched our heads: noise from different sources seemed to be affecting it. After consideration, we opted to use the "average response rate per user" metric when assessing post-incident IT surveys, because we wanted insight into the response rate a typical end-user provides.

The standard way of calculating response rates is [surveys completed / surveys sent = response rate]. However, this method is less reliable in a ticket-based IT survey scenario where there is no limit to how many surveys a single user can receive.

Could you limit the number of surveys sent to a single end-user? Sure! But that could mean that the ticket that was business-critical for the user no longer offers the chance to give feedback, while the three previous tickets for simple things did.

To illustrate this approach, consider the following scenario:

  • User ID #1 received 20 surveys and provided 14 responses, while
  • User ID #2 received just one survey and responded accordingly
  • User ID #3 received 50 surveys and replied to all 50, but these tickets were a "creative" use of tickets as to-do items.
  • User ID #4 received 1000 surveys but did not respond, as it was an automated monitoring ID and email. 


Calculating a straightforward overall response rate by summing all responses and surveys could be misleading due to potential outliers, such as super-users or automated monitoring inboxes. Why not just check the data quality and keep it perfect? That takes time, and the sources of noise change as the environment changes.

Computing the average of per-user averages drastically reduces the impact of these outliers, giving a clearer and more comparable representation of the typical response rate per end-user. This method effectively minimizes the influence of specific users with high ticket volumes, system-generated responses, or super-users who raise tickets on behalf of other end-users.
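
As a rough illustration of the difference, the sketch below applies both calculations to the four hypothetical users described above. The numbers and structure are illustrative only, not real benchmark data.

```python
# Illustrative only: overall response rate vs. average response rate per
# end-user, using the hypothetical users from the scenario above.
surveys = {          # user_id: (surveys_sent, responses_given)
    1: (20, 14),     # regular end-user
    2: (1, 1),       # occasional end-user
    3: (50, 50),     # "creative" use of tickets as to-do items
    4: (1000, 0),    # automated monitoring ID that never responds
}

total_sent = sum(sent for sent, _ in surveys.values())
total_answered = sum(answered for _, answered in surveys.values())
overall_rate = total_answered / total_sent                     # dominated by outliers
per_user = [answered / sent for sent, answered in surveys.values()]
average_per_user = sum(per_user) / len(per_user)               # each user weighted equally

print(f"Overall response rate:          {overall_rate:.0%}")      # ~6%
print(f"Average response rate per user: {average_per_user:.0%}")  # ~68%
```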

Large variations within organizations

The typical response rate overall hides a more nuanced reality across different business units and locations. When we analyzed customer data with response rates across various offices in different countries, we found that the variations can be very high. 

We compared the country-specific response rates within customer environments and found variations of response rates between 5% and 45% within contexts that, in theory, have a standardized IT service delivery regardless of location. 

Just like with IT happiness, there seem to be very human influencing factors when it comes to response rates as well.

Response rates as an indication of statistical significance

Response rates alone do not tell the whole story. The number of responses needed to reach a 95% confidence level also depends on how large the surveyed end-user base is.

Below is a table from "The Good Research Guide" by Martyn Denscombe that outlines the response volumes needed to reach a given margin of error at a 95% confidence level.

 

The numbers in each margin-of-error column indicate the required number of responses.

Number in the population 5% margin of error 3% margin of error 1% margin of error
50 44 48 50
100 80 92 99
250 152 203 244
500 217 341 475
1,000 278 516 906
5,000 357 879 3,288
10,000 370 964 4,899
100,000 383 1,056 8,763
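
The figures in the table are consistent with the standard finite-population sample-size formula at a 95% confidence level (z = 1.96) with the worst-case proportion p = 0.5. The sketch below reproduces them under that assumption; it reflects how such tables are typically derived, not necessarily Denscombe's exact method.

```python
# Assumed derivation: finite-population sample size at a 95% confidence level
# (z = 1.96) with worst-case proportion p = 0.5. Reproduces the table above.
def required_responses(population, margin_of_error, z=1.96, p=0.5):
    numerator = population * z**2 * p * (1 - p)
    denominator = margin_of_error**2 * (population - 1) + z**2 * p * (1 - p)
    return round(numerator / denominator)

print(required_responses(1_000, 0.05))   # 278
print(required_responses(10_000, 0.01))  # 4899
```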

Response rate changes with tenure in a company

Our initial hypothesis was that new employees would be more likely to respond to surveys from IT than employees who had been with the company longer. This proved to be false.

According to our data, employees who have been with the company for more than four years are both the largest group of employees and the most likely to respond to IT surveys. The numbers below show each tenure group's response rate relative to the overall average.

The data below can give you an indication of what you might find in your own data. If your average response rate is 20%, according to our benchmark data, the response rate for more experienced employees with more than four years in the company would be 25.57% and new employees with less than six months in the company would have a response rate of 18.98%.

 

Employment age (tenure) Response rate compared to average Sample size
Employed 0-6 months -1.02 % 53,304
Employed 6-12 months -0.33 % 49,221
Employed 1-2 years -0.22 % 74,878
Employed 2-3 years +0.47 % 52,321
Employed 3-4 years -0.12 % 41,306
Employed 4+ years +5.57 % 231,700

Response rate decreases when tickets are open for a long time

Key finding: The longer a ticket is open, the lower the response rate is.

As tickets remain open for a longer duration, it's only logical that the response rate for post-incident IT surveys decreases. This can be attributed to various factors, such as the fading urgency of the issue, a decrease in the user's motivation to provide feedback, or simply the passage of time leading to forgetfulness. It's essential for IT teams to understand this correlation between ticket duration and response rate, as it allows them to gauge the effectiveness and relevance of their surveys. By recognizing the impact of ticket duration on survey participation, IT departments can adapt their strategies and prioritize prompt resolution to ensure higher engagement and response rates from end-users.

For this data, we looked at a longer time period of 12 months, from October 2022 to October 2023.

Time to Resolve Response rate compared to average Sample size
1 day +5.63 % 435,428
2 days +3.24 % 160,473
3 days +2.49 % 106,537
4 days +2.17 % 91,790
5 days +2.16 % 78,917
6 days +1.81 % 75,198
1 week +1.36% 75,227
2 weeks +1.04% 158,316
3 weeks +0.27% 81,245
1 month -0.37% 58,391
2 months -1.28% 64,501
3 months -2.27% 26,750
4 months -2.75% 15,443
5 months -3.42% 10,007
More than 5 months -5.65% 29,107

 

benchmark_data

Chapter 5

Time to Resolve findings

How MTTR changes by score and perceived lost time

Time to resolve shows interesting patterns

Some interesting patterns emerged when comparing how long a ticket is open in calendar time, independent of SLA-clock pauses, with the human perspective of end-users.

  • There isn't a linear relationship between how long a ticket is open and the score that end-users give a ticket resolution. 
  • The ratio between end-user perceived lost time and time to resolve is not constant either; it is highest at the two extremes of the score scale.

Lost time and median time to resolve

When end-users estimate a loss of five working days (40 hours), the median resolution time can stretch to 221 calendar hours, roughly 80 working hours. This connection between user-reported time loss and ticket duration offers valuable insights for IT service desk teams in resource allocation, particularly when users perceive significant productivity impacts due to extended ticket lifespans. 

 

Estimated work time lost Median time to resolve
15 min 40 minutes
1 hour 6 hours
6 hours 26 hours
40 hours 221 hours

Median time to resolve per score given

First, let's look at different scores and how the median time to resolve and the average perceived time lost change. All times have been rounded up to the closest full hour.

We observe an interesting disconnect between perceived lost time and time to resolve. While the end-user perceived experience (scores 0-10) and perceived lost time correlate linearly, there is an interestingly short time to resolve on tickets where the score is 0 or 1. 

What would cause this disconnect? 


Score given Average perceived time lost Median time to resolve
0 16 hours 55 hours
1 14 hours 94 hours
2 12 hours 140 hours
3 10 hours 141 hours
4 10 hours 120 hours
5 9 hours 99 hours
6 8 hours 104 hours
7 7 hours 89 hours
8 4 hours 49 hours
9 2 hours 24 hours
10 2 hours 4 hours

 

H1_23_Global_IT_Experience_Benchmark-Time-lost-per-score

H1_23_Global_IT_Experience_Benchmark-TTR-correlatin-score

Is the disconnect between perceived lost time and time to resolve an indicator of business-critical services? 

The end-user perception of lost work time can highlight areas within the business where IT issues impact employee productivity. 

The table below shows at which scores the end-user perception of lost time tracks most closely with how long a ticket is open.

Reading the table below can be a bit tricky. Think of the ratio of lost time to Time to Resolve (TTR) as an indicator of business criticality. If the ratio of end-user perceived lost work time to how long the ticket is open is high, it means that for every hour the ticket remains open, a significant part of the end-user's ability to work productively is hindered.

When the ratio of perceived lost time to ticket open time is high, it suggests that the service is crucial for the business: end-users indicate that every hour the ticket remains open significantly affects their work. To understand what this means for your IT organization, experience data allows you to dive into the operational data and other details to shed more light on the causes.

Our initial analysis is that these tickets could be worth prioritizing, as they're likely to have the highest impact on end-user productivity and potentially the business as a whole.

 

Score given Average perceived time lost Ratio lost time / TTR Median time to resolve
0 16 hours 29% 55 hours
1 14 hours 14% 94 hours
2 12 hours 8% 140 hours
3 10 hours 7% 141 hours
4 10 hours 8% 120 hours
5 9 hours 9% 99 hours
6 8 hours 8% 104 hours
7 7 hours 7% 89 hours
8 4 hours 9% 49 hours
9 2 hours 10% 24 hours
10 2 hours 39% 4 hours
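
The ratio column can be approximated directly from the other two columns. The sketch below is illustrative only; because the table's hours are rounded, the computed ratios only come close to the published percentages, and the 25% threshold used to flag scores is an arbitrary example rather than a benchmark value.

```python
# Illustrative only: deriving the lost-time-to-TTR ratio as a rough
# business-criticality signal. Hours are the rounded values from the table,
# so the ratios only approximate the published percentages.
rows = {        # score: (avg perceived time lost in hours, median TTR in hours)
    0: (16, 55),
    5: (9, 99),
    10: (2, 4),
}

for score, (lost_hours, ttr_hours) in rows.items():
    ratio = lost_hours / ttr_hours
    flag = "  <- candidate for prioritization" if ratio > 0.25 else ""
    print(f"score {score:>2}: lost time / TTR = {ratio:.0%}{flag}")
```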

 

 

Chapter_4_HSReport

Chapter 6

IT Service Desk Benchmarks

Incidents and requests


Our method for measuring end-user experience with Ticket-based IT (Incidents and Requests) involves sending surveys to end-users after each ticket resolution using HappySignals. These surveys ask end-users to provide feedback on their experience and cover the following metrics:

Happiness: End-users rate how happy they are with their recent service experience on a scale from 0-10. HappySignals then calculates Overall Happiness as the percentage of 9-10 scores minus the percentage of 0-6 scores (a number between -100 and 100).
 
Productivity: End-users estimate how much work time they lost due to the service experience.
 
Factors: End-users select from a list of suggested reasons – which we call Factors – that influenced their Happiness rating: e.g., “Service was slow,” “My ticket was not solved.” Multiple factors can be selected.

line

Happiness with IT incident resolutions has remained stable in H1/2023

The multi-year upward trend in happiness with IT incident resolutions has stabilized. Average lost time with IT incidents moves up or down by a few minutes between six-month periods, but all in all, the changes in H1/2023 were minor.

 

H1_23_Global_IT_Experience_Benchmark-Incidents-Happiness-Lost-Time

Average happiness is stable, but what does the typical IT Experience Management journey look like?

The stable average happiness in our benchmark might suggest that experience management eventually stagnates and improvements become increasingly hard to find, but the overall average hides the fluctuations over time in individual customers' data.

As you can see in the graph below, different companies start from very different starting points. When looking at the graph, remember that the top and bottom performers are not the same organizations throughout. Things happen in IT; as a result, even the best IT organizations can see dips in IT experience as perceived by their end-users.

Notably, after 18 months of IT Experience Management, not a single customer has an average score below +40, a remarkable improvement for organizations that started out hovering around +10 to +20.

 

H1_23_Global_IT_Experience_Benchmark-First-two-years-of-ITXM

The data analyzed here contains most of our customers, but we have excluded certain outliers that started from non-comparable situations. 

This graph, together with the following customer example graph, aims to highlight how experience is dynamic. The experience management journey looks different for each of our customers.

Customer example

The graph below is from one of our customers who started their IT Experience Management journey in September 2022. They have managed to improve their end-user experience in the first year, but as you can see, experience is dynamic. It changes daily, and understanding the dips and peaks allows IT to improve, step-by-step, improvement by improvement. 

Customer-graph-12m-first-year-Benchmark-H1-2023

 

Perceived lost time with incidents

The definition of lost time: End-users estimate how much work time they lost due to the service experience.

80% of lost time with IT incidents comes from only 13% of tickets. 

The data below was first reported in the H1/2022 Benchmark report but is still very relevant. The section below is identical to the previous report. 

Looking more closely at lost time across all incidents, we can see that perceived lost time is unevenly distributed across the spectrum. The historical trend shows a polarization in the speed of service: tickets solved in under eight hours are solved even faster year after year, while tickets that lead to more than eight hours of lost time take even longer.

This explains the paradox of rising average lost times in the last two reports: most end-users receive IT support for incidents faster than before, but those who don't have to wait increasingly long. The ever-longer waits on tickets that cause more than eight hours of perceived lost time push the average up.

 

H1_23_Global_IT_Experience_Benchmark-80-percent-13-tickets

Understanding where end-users lose the most time provides a valuable focus

When IT identifies where end-users lose the most time, it will find improvement opportunities with a large impact. Understanding where people lose only small amounts of productive work time, in turn, helps IT identify automation candidates that would free up agents' time for more complicated issues.

 

H1_23_Global_IT_Experience_Benchmark-Average-lost-time-is-not-typical

line

Factors: The ‘Why’ behind end-user happiness or unhappiness

Understanding the reasons behind end-users' satisfaction or dissatisfaction with ticket-based IT services is crucial for improving them, and the HappySignals IT Experience Management Platform excels in identifying these factors.

 

H1_23_Global_IT_Experience_Benchmark-top-three-factors-incidents

 

Using a standardized list of Factors developed from research with IT end-users, we ask end-users to select the factors that best reflect their satisfaction or dissatisfaction with the service in a survey sent to them after a ticket resolution. The timing of the survey delivery is optimized to assess end-user feelings about the service at the moment of their experience, making our Factors data more reliable.

Different factors are presented to end-users depending on their happiness rating on a 10-point scale, and they can select as many factors as they wish from the list. Factors related to service agents are included in all three scenarios of negative (0-6), neutral (7-8), and positive (9-10) experiences. For example, for IT incidents, 75% of responders who gave a positive happiness rating and selected at least one factor were happy about the speed of service.

End-users can select multiple factors, so the percentages can add up to more than 100%. By monitoring and analyzing these factors, IT organizations can gain valuable insights into end-user satisfaction and identify areas for improvement in their ticket-based services.
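
The arithmetic behind those percentages is straightforward: each factor's share is the fraction of respondents who selected it, counted independently, which is why the columns in the tables below can sum to well over 100%. The responses in this sketch are invented for illustration.

```python
# Illustrative only: multi-select factor percentages are computed per factor
# over all respondents, so they can sum to more than 100%.
from collections import Counter

responses = [  # hypothetical factor selections from four positive responses
    {"Speed of service", "Service personnel's attitude"},
    {"Speed of service"},
    {"Speed of service", "I learned something"},
    {"Service personnel's attitude"},
]

counts = Counter(factor for selected in responses for factor in selected)
for factor, n in counts.most_common():
    print(f"{factor}: {100 * n / len(responses):.0f}%")
# Speed of service: 75%, Service personnel's attitude: 50%, I learned something: 25%
# Total is 150%, because end-users may pick several factors.
```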

Historical data on factors for IT incidents

The factors that create positive, neutral, and negative experiences with IT Incidents for end-users remain very stable.

 
IT Incidents - Positive Factors 2019 2020 2021 2022 H1/2023
Speed of service 75% 74% 74% 75% 75%
Service personnel's attitude 52% 55% 56% 55% 55%
Service personnel's skills 48% 49% 50% 49% 50%
Service was provided proactively 28% 34% 36% 37% 38%
I was informed about the progress 29% 33% 35% 35% 36%
I learned something 21% 25% 26% 26% 26%
 
 
IT Incidents - Neutral Factors 2019 2020 2021 2022 H1/2023
Speed of service 58% 57% 55% 55% 55%
I had to explain my case several times 20% 21% 21% 21% 21%
It was difficult to know where to start 11% 11% 12% 12% 12%
I wasn't informed about the progress 11% 11% 10% 10% 10%
Service personnel's skills 10% 8% 8% 8% 8%
Instructions were hard to understand 7% 7% 8% 8% 8%
Service personnel's attitude 7% 6% 7% 7% 6%
 
 
IT Incidents - Negative Factors 2019 2020 2021 2022 H1/2023
 My ticket was not solved 40% 46% 47% 49% 51%
Service was slow 47% 44% 44% 43% 41%
I had to explain my case several times 29% 30% 29% 30% 29%
I wasn't informed about the progress 16% 16% 16% 16% 17%
Service personnel's skills 12% 12% 12% 12% 11%
Instructions were hard to understand 7% 8% 8% 8% 8%
It was difficult to know where to start 7% 7% 7% 7% 7%
Service personnel's attitude 6% 7% 7% 7% 7%

 

IT Support Channels

 

In service delivery, IT teams must also develop channels to enhance end-user satisfaction. To create channels that improve employee happiness, it's essential to obtain reliable and detailed experience data about how end-users utilize and perceive different channels.

Without acquiring and utilizing this data, IT teams may mistakenly allocate resources to add new channels unnecessarily, encourage end-users to use them, or focus on improving channels already performing well instead of those requiring attention. 

Usage of different channels for IT incidents

Our channel usage data reflects the recent trend in the ITSM industry of developing channels with automation and predefined user flows to reduce the workload on service desk agents. This trend is expected to continue as IT organizations strive to improve efficiency while enhancing the overall customer experience. Investments in service portals, smart AI-powered chats, and proactive monitoring of services with self-healing capabilities all aim to optimize the use of technology across different teams.

However, we advise against losing sight of end-user needs by continuously monitoring how their experience changes when support channel recommendations and usage are modified. If possible, establish a baseline for experience data before the change, track changes during the transition, and draw conclusions by assessing the experience a few months after implementation.

Note that the total percentages don't add up to 100% because we exclude channel categories that cannot be accurately categorized into the existing five categories.

Focus your resources on improving existing channels, not adding new ones

Based on the data from all our customers, there are only slight differences in overall happiness with the digital channels – Chat, Email, Phone, and Portal (all range from +74 to +81). The only channel with significantly higher happiness is Walk-in (+94). The perception of lost time is also by far the lowest for Walk-in IT support, with just 1h 24min on average per incident, 1h less than the second least time-consuming channel, Phone.

 

H1_23_Global_IT_Experience_Benchmark-Happiness-and-lost-time-channels

 

Below is historical data on year-on-year changes in the channel through which incident tickets are raised, with the latest result reflecting the first six months of 2023.

Numbers are rounded to the nearest full percent.

Channel usage for IT Incidents 2019 2020 2021 2022 H1/2023
Chat 8% 9% 9% 9% 8%
Email 19% 17% 15% 15% 12%
Phone 30% 30% 27% 25% 25%
Portal 33% 33% 38% 39% 39%
Walk in 6% 3% 3% 4% 5%
Other 4% 8% 8% 8% 11%
 

 

line

Ticket reassignments impact end-user happiness and productivity significantly

Each time a ticket is reassigned, end-user happiness decreases by nearly eight points and end-users lose, on average, an additional 1 hour and 42 minutes of work time (across 0 to 4 reassignments). When a ticket is reassigned four times, the total loss can reach 8 hours and 22 minutes!

Our data, collected over the past four years, has shown consistent trends in the frequency of ticket reassignments and the corresponding impact on end-user happiness and lost time. Over the years, the amount of time end-users lose with each reassignment has increased, while the number of reassignments has decreased for most customers.

This is one of the areas where IT Experience data offers the greatest potential for quick wins in end-user productivity, by ensuring incidents are directed to the right teams as soon as possible.


Benchmark-The Impact of Reassignments

Historical data on reassignments

Percentage of reassignments per incident 

The percentages of tickets reassigned three or four times do vary, but rounding to the nearest percent makes their share of all tickets appear unchanged.

Reassignments H1/2020 H2/2020 H1/2021 H2/2021 H1/2022 H2/2022 H1/2023
0 56% 55% 52% 51% 53% 54% 55%
1 27% 27% 30% 31% 30% 30% 29%
2 7% 8% 9% 9% 9% 8% 8%
3 3% 3% 3% 3% 3% 3% 3%
4 1% 1% 1% 2% 1% 1% 1%

 

Happiness per number of reassignments

Reassignments H1/2020 H2/2020 H1/2021 H2/2021 H1/2022 H2/2022 H1/2023
0 +76 +79 +81 +82 +81 +81 +82
1 +67 +70 +75 +77 +76 +77 +78
2 +58 +62 +65 +68 +68 +68 +69
3 +48 +51 +54 +61 +60 +63 +64
4 +43 +45 +46 +50 +51 +51 +51

 

Lost time per number of reassignments

Reassignments H1/2020 H2/2020 H1/2021 H2/2021 H1/2022 H2/2022 H1/2023
0 2h 9min 1h 53min 1h 45min 1h 54min 1h 55min 2h 6min 2h 2min
1 4h 1min 3h 41min 3h 23min 3h 28min 3h 37min 3h 38min 3h 37min
2 6h 6min 5h 23min 5h 0min 5h 10min 5h 9min 5h 45min 5h 44min
3 7h 56min 7h 9min 6h 27min 6h 49min 7h 7min 7h 5min 7h 5min
4 9h 51min 8h 2min 8h 29min 8h 16min 8h 15min 9h 23min 8h 22min

line

Different Support Profiles have different service expectations

While we've covered the experiences of IT end-users in previous sections, it's important to note that there are also differences in behavior and motivation among them. Knowing these differences can help tailor IT services for different types of end-users.

This is where HappySignals IT Support Profiles can be useful.

We conducted interviews with over 500 end-users and found that two main behavioral drivers, Competence and Attitude, have the greatest impact on end-user behavior and experience. Competence refers to the end-user's capability to fix IT issues independently, while Attitude pertains to their willingness to solve the problem independently.

By mapping these behavioral drivers, we defined four Support Profiles: Doer, Prioritizer, Trier, and Supported. For more information on using these profiles in the IT Service Desk, refer to our Guide.

Doers are still the most critical, Supported remain the easiest to please

Consistent with previous years, Doers again have the lowest Happiness of +75, while Supported are still the happiest with Happiness of +86.

 

H1_23_Global_IT_Experience_Benchmark-Happiness-lost-time-support-profiles

 

One interesting change in the data is the diminishing share of Doers and Supported across the benchmark data, while Prioritizers have increased. This means more people today are competent IT end-users, but they still wish for IT to fix their issues.

Does each Support Profile prefer different support channels?

Observing how different support profiles utilize various channels confirms the behavioral drivers identified in our original research about IT Support Profiles. The data on IT incident channel usage by different profiles reveals the following patterns:

Self-service portals are the preferred channel for Doers, who enjoy solving issues themselves and are least likely to use the Phone.

Prioritizers, who prefer minimal participation in issue resolution, use the Phone more frequently than other support profiles.

Supported and Triers use Walk-in IT support 30-35% more often than Doers and Prioritizers, reflecting their preference for personal assistance and learning from patient service personnel.

For further information on how to customize services to serve different end-users in the organization better, we suggest downloading our Definitive Guide on IT Support Profiles.

The image below is from our previous report, but the profile characteristics and preferences haven't changed.

2022-10-Support Profiles

 

What is the business impact of understanding end-user IT Support Profiles?

Although you can't change your end-users, you can customize your IT services to suit various support profiles. One way to do this is by adjusting how service agents communicate with each profile when they reach out to the service desk. For instance, Doers and Prioritizers may prefer technical jargon, while Supported and Triers may benefit from simple language and step-by-step instructions.

Another approach is to analyze the data by profiles to identify which channels work best for different profiles. Then, you can develop and promote these channels to the relevant end-user profile groups.

Check out our comprehensive guide to learn more about using support profiles to enhance ticket-based services!

 

 

line

A brief look at IT requests

Key insights

  • End-user Happiness has continued to increase yearly, but so has the lost time with requests. 
  • Requests submitted through the portal have a very high level of happiness but still lose more time than requests by phone, chat, or walk-in IT support. 

 

We traditionally look at consecutive periods in the data to spot trends over time, but to gain a new perspective, the graph below shows only the first six calendar months of each of the last four years. While the data does not show anything dramatically different from previous reports, it shows overall happiness very slowly improving, with increasing lost time per request. This is not unexpected, as IT organizations are actively promoting self-service to reduce the IT time spent on request management. While self-service reduces IT resource needs, it also shifts some of the time spent on requests to the IT end-user's work time.

 

2023-final-Requests_Lost_Time_Happiness

IT Request channel distribution



  • The trend continues, and lost time is higher for IT Requests in Email than Portal.
    • Happiness with requests submitted via Portal has steadily improved, while lost time with requests submitted via Email has worsened.
    • The increase in lost time through Email is also reflected in lower end-user happiness. According to our data, Email is the worst-performing channel for requests in terms of happiness and lost time. 

 

2023-Channel usage for IT Requests

 

  • New channels are being used
    • In the last six months, over 20% of IT requests have been submitted through channels that do not map to our five standard channel categories. These have been categorized as “Other”.
    • These “Other” channels are mostly a matter of non-standard configurations in the ITSM tools and new, more experimental support channels. 

Analyzing Request Ticket Channels: Yearly Changes in Historical Data

 

Channel usage for IT Requests 2019 2020 2021 2022 H1/2023
Chat 1% 2% 3% 2% 2%
Email 8% 8% 9% 6% 3%
Phone 20% 18% 12% 13% 11%
Portal 63% 54% 57% 57% 60%
Walk in 1% 2% 1% 1% 1%
Other 7% 16% 18% 21% 23%
      
IT Request factors for happiness

Historical data on Factors for Happiness and Lost Time with IT Requests

The saying “Technology changes, People stay the same” rings true in our data. The Factors that create positive, neutral, and negative experiences with IT Requests for end-users have remained stable over the last four years. 

The only slight change is a decrease in how often service personnel's attitude and skills are selected as factors, which could very well be explained by the increased share of requests that don't require service personnel to intervene; requests are increasingly handled in self-service portals.

 
IT Requests - Positive Factors 2019 2020 2021 2022 H1/2023
Speed of service 79% 79% 80% 80% 80%
Service personnel's attitude 49% 48% 47% 45% 45%
Service personnel's skills 46% 46% 45% 44% 44%
I was informed about the progress 31% 34% 36% 34% 36%
It was easy to describe what I wanted 31% 32% 33% 33% 34%
Instructions were easy to understand 29% 31% 32% 32% 32%
 
 
IT Requests - Neutral Factors 2019 2020 2021 2022 H1/2023
Speed of service 58% 57% 58% 56% 55%
I had to explain my case several times 15% 15% 15% 15% 15%
It was difficult to know where to start 12% 11% 12% 11% 13%
I wasn't informed about the progress 11% 12% 11% 11% 11%
It was difficult to describe what I needed 8% 8% 8% 9% 8%
Instructions were hard to understand 8% 8% 7% 8% 8%
Service personnel's skills 7% 6% 6% 7% 7%
Service personnel's attitude 5% 4% 6% 6% 5%
 
 
IT Requests - Negative Factors 2019 2020 2021 2022 H1/2023
Service was slow 55% 57% 56% 55% 52%
I had to explain my case several times 33% 31% 31% 31% 33%
I wasn't informed about the progress 23% 29% 27% 26% 26%
Service personnel's skills 17% 16% 13% 13% 14%
Instructions were hard to understand 12% 12% 11% 12% 13%
It was difficult to know where to start 10% 10% 10% 10% 11%
Service personnel's attitude 8% 9% 8% 8% 9%
It was difficult to describe what I needed 7% 6% 7% 7% 7%

 

Chapter_5_HSReport

Final Takeaway

The benefits of Human-Centric IT

Based on our 2023 research data on ITXM (IT Experience Management), a human-centric approach to experience management can have significant benefits for organizations. Our analysis of this data from different angles presents a holistic snapshot of end-user experiences in enterprise IT. It's worth noting that the data used in this report is solely from HappySignals customer organizations that have embraced a human-centric approach to experience management.

One of the main benefits of ITXM is its ability to empower enterprise IT leaders to drive data-driven change. By prioritizing the needs and experiences of employees, IT leaders can make informed decisions that improve productivity and overall business outcomes.

Additionally, a focus on human-centric experience management can help organizations transform their IT culture, making it more empathetic and responsive to employee needs.

Another key benefit of ITXM is its impact on employee happiness. By prioritizing a positive experience for employees, organizations can improve employee retention rates and reduce turnover costs.

Happy employees are more engaged and productive, which can lead to improved business outcomes. By prioritizing employee experiences and focusing on data-driven decision-making, organizations can achieve better outcomes for both employees and the business as a whole.

Continue learning with more resources

Intrigued? Discover experience management by reading the IT Experience Management Framework (ITXM™) Guide. This downloadable 10-page read introduces ITXM™ and how to lead human-centric IT operations with experience as a key outcome.

Do you prefer learning through short video courses? Check out our ITXM Foundation Training & Certification Course, where in about one hour, you can learn the foundations of IT Experience Management and get certified for free.

If you enjoyed this report, you may also want to visit our Learning Center for bite-sized videos and blog posts about topics from XLAs to optimizing your ServiceNow solution.

 


Download the full report

The Global IT Experience Benchmark Report H1/2023


benchmark_data

Interested in discovering more Benchmark Data? 

Read our previous Global IT Experience Benchmark Reports.

2022:

H1/2022 Report

H2/2022 Report

2021:

H1/2021 Report

H2/2021 Report


Ready to drive IT Experience Management in your organization?

Let's talk!


Book a 30-min meeting to discuss your challenges and how Experience Management can help.

Get in Touch