User Experience Analytics
What is User Experience Analytics?
User Experience Analytics gives actionable insights into the user and session performance parameters of your environment.
- User Experience Analytics provides a comprehensive analytical solution for all the Sites across an organization in a single consolidated dashboard.
- User Experience Analytics analyzes user sessions based on the important parameters that define their performance: Session Logon Duration, Session Responsiveness, Session Availability, and Session Resiliency.
- The performance metrics are baselined using dynamic thresholds. The thresholds help measure the Session Experience score and categorize sessions into Excellent, Fair, or Poor categories.
- The User Experience (UX) score is calculated from the individual Session Experience scores. The UX score quantifies the complete user experience in the Sites and allows users to be classified as having an Excellent, Fair, or Poor experience.
- The drilldown view further provides an overview of user performance across factors and subfactors, giving specific actionable insights for users facing a suboptimal experience.
How to access the User Experience dashboard
To view the User Experience dashboard:
1. Log on to Citrix Cloud and select the Cloud Customer.
2. On the Analytics Service tile, click Manage.
3. On the Analytics overview page, click Manage under the Performance offering.
4. Click the Users tab.
How to use the User Experience dashboard
The User Experience dashboard displays the session breakdown based on the session protocol. The total number of sessions that have been active in the selected Sites over the selected duration is displayed, along with their breakdown into HDX, Console, and RDP sessions. All the sections on the dashboard display analytics relevant to HDX sessions only.
Use the Time Filter to select the required duration. Site selection is available if multiple Sites are present in the environment. The dashboard gives an overview of the user and session experience. You get:
- User classification of users running HDX sessions based on User Experience.
- Trend of user classification for the selected duration.
- Trend of user sessions and session failures for the selected duration.
- Session classification based on Session Responsiveness and Session Logon Duration factors.
The following section describes the various elements in the User Experience dashboard.
User Experience score
The UX score is a comprehensive end-user experience index calculated from the performance factors that affect a user session. Metrics measured through the session life cycle, from launch attempt to session end, contribute to the calculation of the UX score.
Session Logon Duration represents the session launch experience.
Session Responsiveness represents the in-session responsiveness or session latency.
Session Availability represents the success rate of establishing a session connection when attempted by the user.
Session Resiliency indicates how the Workspace app recovers from network failures when the user is connected over a sluggish network. It measures the reconnection rate.
For more information about the UX score calculation and threshold calibration for user classification, see the UX score article.
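The UX score can be pictured as a combination of the four factor scores described above. The sketch below is purely illustrative: the factor names come from this section, but the equal weighting is a hypothetical assumption, not the actual formula, which is documented in the UX score article.

```python
# Illustrative sketch only: the real UX score formula is defined by the
# service (see the UX score article). The equal weighting used here is
# a hypothetical assumption for demonstration.
FACTORS = ["logon_duration", "responsiveness", "availability", "resiliency"]

def ux_score(factor_scores: dict) -> float:
    """Combine per-factor scores (each on a 0-100 scale) into a single
    UX score by simple averaging (hypothetical weighting)."""
    return sum(factor_scores[f] for f in FACTORS) / len(FACTORS)

print(ux_score({"logon_duration": 80, "responsiveness": 90,
                "availability": 100, "resiliency": 70}))  # 85.0
```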
Granularity of data collection is based on the selected time period. All data on the dashboard and the drilldown screens is obtained and refreshed from the database as per the data collection granularity. Click the refresh icon to update the data immediately.
User Classification by Experience
To view the classification of users based on the UX score:
1. On the Users tab, select the time period for which you want to view the User Experience. The last 2 hours (2H) time period is selected by default.
2. Select the Site. If you select All Sites, metrics consolidated across all the Sites are displayed.
The total number of active users in one or more selected Sites for the selected time duration is displayed.
The distribution of users across the Excellent, Fair, and Poor categories based on their UX scores is displayed. The UX score thresholds for the classification of users are calculated using statistical methods.
- Users with Excellent UX: Represents users with a UX score of 71-100. These users had a consistently good experience across all factors.
- Users with Fair UX: Represents users with a UX score of 41-70. These users had a degraded experience for a limited period across certain factors.
- Users with Poor UX: Represents users with a UX score of 1-40. These users had a prolonged degradation across several indicators.
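The score bands above translate directly into a classification rule. The Python sketch below mirrors the stated thresholds; the function name and the use of None for missing measurements are illustrative choices, not part of the product.

```python
# Classification bands as stated in this section: Excellent 71-100,
# Fair 41-70, Poor 1-40. Users without measurements are Not Categorized
# (represented here by a score of None, a hypothetical convention).
def classify(ux_score):
    if ux_score is None:
        return "Not Categorized"
    if ux_score >= 71:
        return "Excellent"
    if ux_score >= 41:
        return "Fair"
    return "Poor"

print(classify(85), classify(55), classify(20), classify(None))
# Excellent Fair Poor Not Categorized
```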
Users Not Categorized
Users might be Not Categorized if the session ICA measurements or Logon Duration measurements are not received.
Logon Duration requires Citrix Profile Management to be installed on the machines. Citrix Profile Management calculates the Logon Duration based on machine events and forwards it to the Monitor Service. If a Remote PC Access deployment exists and a machine upgrade is not required, you can deploy the Profile Management components separately: Citrix Profile Management and the Citrix Profile Management WMI plug-in. For more information, see the blog, Monitor and troubleshoot Remote PC Access machines.
ICA measurements require the End User Experience Monitoring (EUEM) service to be running and the corresponding policies to be configured on the machines. For more details on the configuration of EUEM-related policies, see End user monitoring policy settings.
User Classification Trend
The up/down arrows indicate the trend in the number of users, showing the increase or decrease in each category compared to the previous time period. For example, consider the following scenario:
In the last 2 hours, the Site had logons by a total of 157 users.
Of these, 27 users had an excellent user experience in the last 2 hours. This count is 4 fewer than the number of users who had an excellent user experience in the previous 2 hours. So, the previous 2 hours had 31 users with an excellent user experience.
23 users had a fair user experience in the last 2 hours. This count is 9 fewer than the number of users who had a fair experience in the previous 2 hours.
8 users had a poor user experience in the last 2 hours, compared to 5 users in the previous 2 hours.
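The arrow arithmetic in this example can be reproduced in a few lines. The dictionaries below encode the example's numbers; this is only a worked illustration of the delta computation, not product code.

```python
# Reproduces the arithmetic in the example above: the arrow next to each
# category reflects the current count minus the previous-period count.
current  = {"Excellent": 27, "Fair": 23, "Poor": 8}
previous = {"Excellent": 31, "Fair": 32, "Poor": 5}

for category in current:
    delta = current[category] - previous[category]
    arrow = "up" if delta > 0 else "down" if delta < 0 else "flat"
    print(f"{category}: {current[category]} ({arrow} {abs(delta)})")
```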
Click the categorized user numbers to further drill down into the factors affecting those users. For more information, see the Factor Drilldown article.
The User classification based on Experience trend displays the distribution of users across the categories during the selected time period. The length of each colored section of the bar indicates the number of users in that experience category.
Hovering over the chart displays a tooltip containing the user classification for the specific data interval. Click the Excellent, Fair, or Poor region on the bars to see the drilldown displaying the classification of the specific set of users for the data interval represented by the bar.
User Sessions
A user session is created when an app or a desktop is launched from the Workspace app. The user interacts with the app or desktop through the user session. The experience the user has in each session adds up to the overall experience of the user in the Virtual Apps and Desktops environment.
The User Sessions section of the User Experience dashboard displays important session metrics of HDX sessions for the chosen time period and Site.
You can view the following user session data:
Total Sessions: Total number of user sessions over the chosen time period. A single user can establish multiple user sessions. The number includes all sessions launched or active during the chosen period.
Total Unique Users: Number of unique users who either launched a session or have an active session during the chosen period.
Session Failures: Number of user sessions that failed to launch during this time period. Clicking the failure count opens the Sessions-based self-service search. Hover over the graphs to view detailed information for a specific collection interval. The charts help identify the pattern of failures versus the total number of sessions connected. The unique users trend helps analyze the license usage in the Site.
Failure Insights: Insights into the causes of session failures, with the ability to drill down to the specific users, sessions, or machines that the failures are associated with. A set of recommended steps to mitigate the failures is also available.
Failure Insights provide insights into the root causes for session failures in your environment. Drilling deeper into specific metrics with these insights helps troubleshoot and resolve session failures faster. Failure Insights specifically help administrators to improve the session availability, which is an important factor that determines user experience.
Black hole machines
Some machines in your environment, though registered and appearing healthy, might not service sessions brokered to them, resulting in failures. Machines that have failed to service four or more consecutive session requests are termed Black hole machines. The reasons for these failures relate to various factors that might affect the machine, such as insufficient RDS licenses, intermittent networking issues, or instantaneous load on the machine. These failures do not include failures due to capacity or license availability. The presence of black hole machines in the environment increases session failures, resulting in poor session availability.
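The black hole rule described above (four or more consecutive unserviced session requests) can be sketched as a simple streak check. The boolean event format below is a hypothetical simplification of the real broker data.

```python
# Hedged sketch of the black hole rule: a machine that fails four or
# more consecutive brokered session requests is flagged. The input
# (True = request serviced, False = failed) is a hypothetical
# simplification of the actual broker telemetry.
def is_black_hole(request_outcomes, threshold=4):
    streak = 0
    for serviced in request_outcomes:
        streak = 0 if serviced else streak + 1
        if streak >= threshold:
            return True
    return False

print(is_black_hole([True, False, False, False, False]))   # True
print(is_black_hole([False, False, True, False, False]))   # False
```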
The Black hole machines section of Failure Insights shows the number of black hole machines identified in your environment during the selected time period.
Recommended steps to help reduce the number of black hole machines are provided:
- Check the RDS license status.
- Put the machine in maintenance mode.
- Reboot the machine.
Clicking the black hole machines number opens the Machines based self-service view that is filtered to show all the black hole machines in your environment during the selected time period. Here, you can analyze the individual performance metrics of the machine to identify and understand possible reasons for the machine not accepting session requests. For more information about the performance indicators available in the Machines based self-service view, see Self-service search for Machines.
Further, clicking the machine name opens the Machine Statistics view that helps correlate the resource performance parameters of the machine with the session performance parameters during the same time period. For more information about the Machine Statistics view, see Machine Statistics view.
Communication Error
The Communication Error subpane lists the number of session failures due to communication errors between the endpoint (where the user launches the session) and the VDA. These errors can occur due to incorrect firewall configurations or other errors on the network path.
The two categories of communication errors are:
- Endpoint to machine—lists the sessions where communication errors have occurred between the endpoint and the machine.
- Gateway to machine—lists the sessions where communication errors have occurred between the gateway and the machine.
Additionally, the Communication Error subpane displays the following recommendations to resolve the errors.
- Check the firewall settings on the machine and gateway.
- Check network connectivity between the machine and gateway.
Clicking the failed sessions count opens the Sessions-based self-service view that is filtered to show all the sessions that failed due to communication errors in your environment during the selected time period. This view helps you analyze the individual failed sessions and determine a possible root cause. For more information about the indicators available in the Sessions-based self-service view, see Self-service search for sessions.
Note: This feature is supported only on Citrix Workspace app 2103 and later.
Session Responsiveness represents the ICA Round Trip Time (ICA RTT), which quantifies the response time: the time it takes for user input to reach the server and for the response to appear on the endpoint machine. It measures the in-session experience and quantifies the lag experienced while interacting with a virtual app or desktop.
The Session Responsiveness section has the following information:
Active sessions: Active sessions are user sessions currently in operation and connected to Citrix Virtual Apps and Desktops.
Session classification: Sessions are categorized as Excellent, Fair, or Poor based on their ICA RTT measurements over the selected time period. Click the classification numbers to view the Sessions-based self-service search for the selected set of sessions.
The thresholds for categorization are calculated for the current customer and are recalibrated dynamically. For more information, see the Dynamic thresholding documentation.
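To illustrate what per-customer recalibration might look like, the sketch below derives example cut-offs from observed ICA RTT samples using the mean and standard deviation. This is a hypothetical illustration only; the actual statistical method is described in the Dynamic thresholding documentation.

```python
# Hypothetical illustration of dynamic thresholding: derive example
# Excellent/Fair cut-offs from observed ICA RTT values using the mean
# and one standard deviation above it. Not the actual product method.
import statistics

def responsiveness_thresholds(rtt_samples_ms):
    """Return example cut-offs (in ms): sessions below excellent_below
    are Excellent, below fair_below are Fair, the rest Poor."""
    mean = statistics.fmean(rtt_samples_ms)
    stdev = statistics.pstdev(rtt_samples_ms)
    return {"excellent_below": mean, "fair_below": mean + stdev}

print(responsiveness_thresholds([40, 60, 80, 100, 120]))
```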
Sessions Not Categorized for Responsiveness
Sessions might be Not Categorized for Session Responsiveness in the following cases:
- The session is disconnected.
- The session launch message is received but not the ICA measurements. ICA measurements are available only for sessions:
  - launched from machines enabled for NSAP, and
  - new CGP (Common Gateway Protocol) sessions, not reconnected sessions.
Session classification trend
Session classification is plotted for the selected Site across the selected time duration. The legend displays the current thresholds used to plot the chart and the last updated time for the thresholds.
The session classification trend based on Session Responsiveness helps identify sessions facing network issues.
Session Logon Duration
The period from when a user clicks an application or a desktop in the Citrix Workspace app to the instant the app or desktop is available for use is called the logon duration. The logon duration includes the time taken for various processes in the complex launch sequence. Total logon time includes phases such as Brokering, VM Start, HDX Connection, Authentication, Profile Load, Logon Script, GPO, and Shell Launch.
Breaking down the Session Logon Duration data to individual phases helps troubleshoot and identify a specific phase causing a longer logon duration.
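As a worked illustration of the breakdown, the sketch below sums phase durations and flags the longest phase, which is the one to troubleshoot first. All durations are made-up example values, not measurements.

```python
# Illustrative breakdown of a logon into the phases named above.
# The millisecond values are hypothetical examples.
phases_ms = {
    "Brokering": 900, "VM Start": 0, "HDX Connection": 1200,
    "Authentication": 800, "Profile Load": 4500,
    "Logon Script": 600, "GPO": 1500, "Shell Launch": 700,
}

total = sum(phases_ms.values())               # total logon duration
slowest = max(phases_ms, key=phases_ms.get)   # phase to investigate first

print(f"Total logon: {total} ms; longest phase: {slowest} "
      f"({phases_ms[slowest]} ms)")
```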
This section has the following information:
Total logons: The total number of logons to virtual apps or desktops in the selected duration and Site.
Session classification: Sessions are categorized as Excellent, Fair, or Poor based on their Session Logon Duration measurements over the selected time period. Click the classification numbers to view the Sessions-based self-service search for the selected set of sessions. The thresholds for categorization are calculated specifically for the current customer and are recalibrated dynamically. For more information, see the Dynamic thresholding documentation. The legend displays the current thresholds used to plot the chart and the last updated time for the thresholds.
Sessions Not Categorized for Logon Duration
Sessions might be Not Categorized for Logon Duration if the subfactors are not configured to be measured as described in [Session Logon Duration subfactors](/en-us/performance-analytics/user-analytics/intermediate-drilldown.html#session-logon-duration-subfactors).
Session Logon Duration sorted by Delivery Groups
Session Logon Duration data is displayed in tabular format with the following information:
- Delivery Group and the corresponding Site.
- Session distribution chart based on the performance indicators: Excellent, Fair, or Poor.
- Total number of sessions.
- Number of Excellent, Fair, and Poor sessions.
By default, the table data is sorted based on the Poor Sessions column. You can choose to sort it based on any of the other columns. The first five Delivery Groups based on the sort criteria are displayed. Click See More Delivery Groups to see more data.
This table helps identify the Delivery Groups with the maximum number of poor sessions. You can troubleshoot further to identify policies causing higher logon duration on the specific Delivery Group.