User Experience Analytics
Citrix Analytics for Performance (Performance Analytics) is currently under Limited Technical Preview.
What is User Experience Analytics?
User Experience Analytics is an offering of Citrix Analytics for Performance that gives actionable insights into the user and session performance parameters.
- User Experience Analytics provides a comprehensive analytical solution for all Sites across an organization.
- User Experience Analytics analyzes user sessions based on important parameters that define their performance: Session Logon Duration, Session Responsiveness, Session Reliability, and Session Resiliency.
- The performance metrics are baselined using dynamic thresholds. The thresholds help categorize sessions into Excellent, Fair, or Poor categories.
- The User Experience (UX) score is calculated based on these performance factors. The UX score quantifies the holistic user experience and allows users to be classified as having an Excellent, Fair, or Poor experience.
- The drilldown view further provides an overview of user performance across factors and subfactors, offering specific actionable insights.
Using User Experience Analytics, you get:
- A comprehensive view of the performance of all Sites in your organization with a consolidated dashboard.
- An overall idea of Site performance from the User Experience score and how it has trended over time.
- Drilldowns to help you identify the actual factors and subfactors causing suboptimal user experience.
How to access the User Experience dashboard
To view the User Experience dashboard:
1. Log on to Citrix Cloud and select the Cloud Customer.
2. On the Citrix Analytics Service tile, click Manage.
3. The Citrix Analytics Service opens. Click the Performance tab.
4. Click the Users tab.
How to use the User Experience dashboard
Select the required time period and the Site to monitor. The dashboard gives an overview of the user and session experience. You get:
- User classification based on User Experience.
- Trend of user classification for the selected duration.
- Session classification based on Session Responsiveness and Session Logon Duration.
The following section describes the various elements in the User Experience dashboard.
User Experience score
The UX score is a holistic end user experience index calculated based on the performance factors that affect a user session. For this, the metrics are measured through the life cycle of a session from its launch attempt to its end.
Session Logon Duration represents the session launch experience.
Session Responsiveness represents the in-session responsiveness or session latency.
Session Reliability represents the success rate of establishing a session connection when attempted by the user.
Session Resiliency indicates how the Workspace app recovers from network failures when the user is connected over a sluggish network. It measures the reconnection rate.
For more information about the calculation of UX score and calibration of thresholds for user classification, see the UX score article.
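The exact UX score formula is defined in the UX score article; purely as an illustrative assumption, the four factors could be combined into one 0-100 index with a weighted average. The factor keys, the equal weighting, and the sample scores below are all hypothetical:

```python
# Hypothetical sketch only: the real UX score calculation is defined by
# Citrix Analytics. This illustrates a weighted combination of per-factor
# scores, each assumed to be on a 0-100 scale.

FACTORS = ("logon_duration", "responsiveness", "reliability", "resiliency")

def ux_score(factor_scores, weights=None):
    """Combine per-factor scores (0-100) into one holistic score."""
    if weights is None:
        weights = {f: 1.0 for f in FACTORS}  # assume equal weighting
    total_weight = sum(weights[f] for f in FACTORS)
    return sum(factor_scores[f] * weights[f] for f in FACTORS) / total_weight

sample = {"logon_duration": 80, "responsiveness": 60,
          "reliability": 90, "resiliency": 70}
print(ux_score(sample))  # equal weights -> 75.0
```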
- The granularity of data collection is based on the selected time period.
- All data on the dashboard and the drilldown screens is obtained and refreshed from the database every 15 minutes. Click the refresh icon to update the data immediately.
User Classification by Experience
To view the classification of users based on the UX score:
Go to Performance > Users tab.
Select the time period for which you would like to view the User Experience. The last 2 hours (2H) time period is selected by default.
Select the Site. If you select All Sites, metrics consolidated across all Sites are displayed.
The total number of active users in the selected Site(s) for the selected time duration is displayed.
The distribution of users across the Excellent, Fair, and Poor categories based on their UX scores is displayed.
- Excellent users: Represents users with a UX score of 71-100
- Fair users: Represents users with a UX score of 41-70
- Poor users: Represents users with a UX score of 1-40
The User Experience score thresholds for classification of users are calculated using statistical methods.
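The score bands above can be expressed as a small classification routine. Only the band boundaries (71-100, 41-70, 1-40) come from this article; the code itself is an illustrative sketch:

```python
# Band boundaries are the published UX score ranges from this article;
# the function itself is an illustrative sketch, not product code.

def classify_user(ux_score):
    if 71 <= ux_score <= 100:
        return "Excellent"
    if 41 <= ux_score <= 70:
        return "Fair"
    if 1 <= ux_score <= 40:
        return "Poor"
    raise ValueError("UX score must be between 1 and 100")

print(classify_user(85))  # Excellent
```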
The up/down arrows indicate the trend in the number of users. It shows an increment or decrement of the number of users in each category as compared to the previous time period. For example, in the following scenario,
In the last week, All Sites had logons by a total of 559 users.
Of these, 56 users had an excellent user experience in the last week. This is 20 users more than the number of users who had an excellent user experience the week before. This implies that the week before, 36 users had an excellent user experience.
473 users had a fair user experience in the last week. This is 173 users more than those who had a fair experience the week before.
30 users had a poor user experience in the last week. 21 users had a poor experience the week before.
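The arrow deltas in the scenario above are simple differences between the current period and the previous one. A quick sketch using the numbers from the example:

```python
# Worked numbers from the example above: current week vs. the week before.
current = {"Excellent": 56, "Fair": 473, "Poor": 30}
deltas = {"Excellent": 20, "Fair": 173, "Poor": 9}

# Previous-period counts recovered by subtracting the arrow deltas.
previous = {category: current[category] - deltas[category] for category in current}
print(previous)  # {'Excellent': 36, 'Fair': 300, 'Poor': 21}
```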
Click the categorized user numbers to further drill down into the factors affecting those users. For more information, see the Factor Drilldown documentation.
The User classification based on Experience trend displays the distribution of users across the categories during the selected time period. The length of a color on the bar indicates the number of users in an experience category.
Hovering over the chart displays a tooltip containing the user classification for the specific data interval.
Factor categorization by Experience
The Factors affecting UX score chart displays the score distribution across users for each factor.
The individual factor thresholds are calibrated dynamically for the Session Logon Duration and the Session Responsiveness factors. For more information, see the Dynamic thresholding documentation.
Hovering over the factors chart gives a tooltip with the number of users in each of the Excellent, Fair, and Poor category for the specific factor.
This data helps you identify and drill down into the factors that are major contributors to a poor user experience.
User Sessions
A user session is created when a user launches an app or a desktop from the Citrix Workspace app. The user interacts with the app or desktop through the user session. The experience the user has in each session adds up to the overall experience of the user in the Citrix Virtual Apps and Desktops environment.
The User Sessions section of the User Experience dashboard displays important session metrics for the chosen time period and Site.
You can view the following user session data:
Total sessions: This indicates the total number of user sessions over the chosen time period. This includes all sessions launched or active during the chosen period. A single user can establish multiple user sessions.
Total unique users: The number of unique users who either launched a session or have an active session during the chosen period.
Session failures: The number of user sessions that failed to launch during this time period.
This is followed by charts of the Total Sessions, Total Unique Users, and the Session Failures plotted over the selected time range. Hover over the graphs to view detailed information for a specific collection interval.
The trends help identify the pattern in the failures versus the total number of sessions connected. The unique users trend helps analyze the license usage in the Site.
Session Responsiveness
Session Responsiveness represents the ICA Round Trip Time (ICA RTT). ICA RTT is used to quantify the user experience and is based on the time it takes for user input to reach the server and for the response to appear on the endpoint machine. It measures the in-session experience and quantifies the lag experienced while interacting with a virtual app or desktop.
The Session Responsiveness section has the following information:
Active sessions: Active sessions are user sessions currently in operation and connected to Citrix Virtual Apps and Desktops.
Session classification: Sessions are categorized as Excellent, Fair, or Poor based on their ICA RTT measurements over the selected time period.
Session classification trend: The session classification is plotted for the selected Site across the selected time duration. The thresholds for categorization are calculated for the current customer and are recalibrated dynamically. For more information, see the Dynamic thresholding documentation. The legend displays the current thresholds used to plot the chart and the last updated time for the thresholds.
The session classification trend based on Session Responsiveness helps identify sessions facing network issues.
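The statistical method Citrix Analytics uses to recalibrate these thresholds is internal to the service. Purely as an assumption, one common approach to dynamic banding is to derive cut-offs from percentiles of recent measurements; the sample RTT values and quartile cut-offs below are hypothetical:

```python
# Hypothetical illustration only: Citrix Analytics calibrates thresholds
# internally. This sketch bands ICA RTT samples (milliseconds) using
# assumed quartile cut-offs over recent, made-up measurements.
from statistics import quantiles

rtt_samples = [45, 60, 80, 95, 120, 150, 200, 260, 400, 650]  # made-up data

# quantiles(..., n=4) returns the quartile boundaries Q1, Q2, Q3.
q1, q2, q3 = quantiles(rtt_samples, n=4)

def classify_session(rtt_ms):
    if rtt_ms <= q1:
        return "Excellent"
    if rtt_ms <= q3:
        return "Fair"
    return "Poor"
```

As new samples arrive, recomputing the quartiles shifts the band boundaries, which is the general idea behind recalibrated, per-customer thresholds.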
Session Logon Duration
The period from when a user clicks an application or a desktop in the Citrix Workspace app to the instant the app or desktop is available for use is called the logon duration. The logon duration includes the time taken for various processes in the complex launch sequence. Total logon time includes phases such as Brokering, VM Start, HDX Connection, Authentication, Profile Load, Logon Script, GPO, and Shell Launch.
Breaking down the Session Logon Duration data to individual phases helps troubleshoot and identify a specific phase causing a longer logon duration.
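Since the total logon time is the sum of its phases, finding the dominant phase is a matter of comparing per-phase durations. The phase names below come from this article; the durations are made-up sample values:

```python
# Illustrative only: phase names are from this article; the durations
# (in seconds) are made-up sample values for one logon.
phases = {
    "Brokering": 1.2, "VM Start": 0.0, "HDX Connection": 2.1,
    "Authentication": 1.5, "Profile Load": 6.8, "Logon Script": 0.9,
    "GPO": 3.4, "Shell Launch": 2.6,
}

total_logon = sum(phases.values())            # total Session Logon Duration
slowest = max(phases, key=phases.get)         # phase to investigate first
print(f"Total logon: {total_logon:.1f}s, slowest phase: {slowest}")
```

In this sample, Profile Load dominates the total, which is the kind of signal the phase breakdown is meant to surface.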
This section has the following information:
Total logons: The total number of logons to virtual apps or desktops in the selected duration on the chosen Site.
Session classification: Sessions are categorized as Excellent, Fair, or Poor based on their Session Logon Duration measurements over the selected time period. The thresholds for categorization are calculated specifically for the current customer and are recalibrated dynamically. For more information, see the Dynamic thresholding documentation. The legend displays the current thresholds used to plot the chart and the last updated time for the thresholds.
Session Logon Duration sorted by Delivery Groups: Session Logon Duration data is displayed in tabular format with the following information:
Delivery group and the corresponding Site.
Session distribution chart based on performance indicators: Excellent, Fair, or Poor.
Total number of sessions.
Number of Excellent, Fair, and Poor sessions.
By default, the table data is sorted based on the Poor Sessions column. You can choose to sort it based on any of the other columns. The first five Delivery Groups based on the sort criteria are displayed. Click See More Delivery Groups to see more data.
This table helps identify the Delivery Groups with the maximum number of poor sessions. You can troubleshoot further to identify policies causing higher logon duration on the specific Delivery Group.