User Experience Analytics
What is User Experience Analytics?
User Experience Analytics gives actionable insights into the user and session performance parameters of your environment.
- User Experience Analytics provides a comprehensive analytical solution for all the Sites across an organization in a single consolidated dashboard.
- User Experience Analytics analyzes user sessions based on important parameters that define their performance - Session Logon Duration, Session Responsiveness, Session Availability, and Session Resiliency.
- The performance metrics are baselined using dynamic thresholds. The thresholds help measure the Session Experience score and categorize sessions into Excellent, Fair, or Poor categories.
- The User Experience (UX) score is calculated from the individual Session Experience scores. The UX score quantifies the complete user experience across the Sites and enables users to be classified as having an Excellent, Fair, or Poor experience.
- The drilldown view further provides an overview of user performance across factors and subfactors, giving specific actionable insights for users facing a suboptimal experience.
How to access the User Experience dashboard
To view the User Experience dashboard:
1. Log on to Citrix Cloud and select the Cloud Customer.
2. On the Analytics Service tile, click Manage.
3. On the Analytics overview page, click Manage under the Performance offering.
4. Click the Users tab.
How to use the User Experience dashboard
Select the required time period and the Site to monitor. The dashboard gives an overview of the user and session experience. You get:
- User classification based on User Experience.
- Trend of user classification for the selected duration.
- Trend of user sessions and session failures for the selected duration.
- Session classification based on Session Responsiveness and Session Logon Duration factors.
The following section describes the various elements in the User Experience dashboard.
User Experience score
The UX score is a holistic end user experience index calculated based on the performance factors that affect a user session. Metrics measured through the life cycle of a session, from its launch attempt to its end, contribute to the calculation of the UX score.
Session Logon Duration represents the session launch experience.
Session Responsiveness represents the in-session responsiveness or session latency.
Session Availability represents the success rate of establishing a session connection when attempted by the user.
Session Resiliency indicates how the Workspace app recovers from network failures when the user is connected over a sluggish network. It measures the reconnection rate.
For more information about the calculation of UX score and calibration of thresholds for user classification, see the UX score article.
Granularity of data collection is based on the selected time period. All data on the dashboard and the drilldown screens is obtained and refreshed from the database as per the data collection granularity. Click the refresh icon to update the data immediately.
User Classification by Experience
To view the classification of users based on the UX score:
1. On the Users tab, select the time period for which you want to view the User Experience. The last 2 hours (2H) time period is selected by default.
2. Select the Site. If you select All Sites, metrics consolidated across all the Sites are displayed.
The total number of active users in the selected Site(s) for the selected time duration is displayed.
The distribution of users across the Excellent, Fair, and Poor categories, based on their UX scores, is displayed.
- Users with Excellent UX: Represents users with a UX score of 71-100. These users had a consistently good experience across all factors.
- Users with Fair UX: Represents users with a UX score of 41-70. These users had a degraded experience for a limited period across certain factors.
- Users with Poor UX: Represents users with a UX score of 1-40. These users had a prolonged degradation across several indicators.
The User Experience score thresholds for classification of users are calculated using statistical methods.
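As a minimal sketch, the classification described above can be expressed as follows. The score bands are the static ranges quoted in this article; the actual thresholds are calculated per customer using statistical methods and recalibrated dynamically, so treat the cutoffs here as illustrative only:

```python
def classify_user(ux_score: int) -> str:
    """Map a UX score (1-100) to an experience category.

    The bands match the ranges stated in this article; real
    deployments use dynamically recalibrated thresholds.
    """
    if not 1 <= ux_score <= 100:
        raise ValueError("UX score must be between 1 and 100")
    if ux_score >= 71:
        return "Excellent"
    if ux_score >= 41:
        return "Fair"
    return "Poor"
```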
The up/down arrows indicate the trend in the number of users. They show the increase or decrease in the number of users in each category compared to the previous time period. For example, consider the following scenario:
In the last week, all the Sites had logons by a total of 559 users.
Of these, 56 users had an excellent user experience in the last week. This count is 20 more than the number of users who had an excellent user experience the week before. So, the week before had 36 users with an excellent user experience.
473 users had a fair user experience in the last week. This count is 173 more than the number of users who had a fair experience the week before.
30 users had a poor user experience in the last week, compared to 21 users the week before.
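The arithmetic behind the up/down arrows in this example is a simple per-category difference between the current and previous periods. A sketch, using the counts from the scenario above (the week-before Fair count of 300 is implied by the stated delta of 173):

```python
# Counts per category from the example above: the current week
# and the week before.
current = {"Excellent": 56, "Fair": 473, "Poor": 30}
previous = {"Excellent": 36, "Fair": 300, "Poor": 21}

def trend_arrows(current, previous):
    """Return the per-category change shown by the up/down arrows."""
    return {cat: current[cat] - previous[cat] for cat in current}

print(trend_arrows(current, previous))
# {'Excellent': 20, 'Fair': 173, 'Poor': 9}
```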
Click the categorized user numbers to further drill down into the factors affecting those users. For more information, see the Factor Drilldown article.
The User classification based on Experience trend displays the distribution of users across the categories during the selected time period. The length of each color segment on the bar indicates the number of users in that experience category.
Hovering over the chart displays a tooltip containing the user classification for the specific data interval. Click the Excellent, Fair, or Poor region on the bars to see a drilldown displaying the classification of the specific set of users for the data interval represented by the bar.
User Sessions
A user session is created when an app or a desktop is launched from the Citrix Workspace app. The user interacts with the app or desktop through the user session. The experience the user has in each session adds up to the overall experience of the user in the Citrix Virtual Apps and Desktops (CVAD) environment.
The User Sessions section of the User Experience dashboard displays important session metrics for the chosen time period and Site.
You can view the following user session data:
Total Sessions: The total number of user sessions over the chosen time period, including all sessions launched or active during that period. A single user can establish multiple user sessions.
Total Unique Users: The number of unique users who either launched a session or have an active session during the chosen period.
Session Failures: The number of user sessions that failed to launch during this time period. Clicking the failure count opens the session-based Self-Service search.
This is followed by charts of the Total Sessions, Total Unique Users, and the Session Failures plotted over the selected time range. Hover over the graphs to view detailed information for a specific collection interval.
The charts help identify the pattern in the failures versus the total number of sessions connected. The unique users trend helps analyze the license usage in the Site.
Session Responsiveness
Session Responsiveness represents the ICA Round Trip Time (ICA RTT), which quantifies response time: the time it takes for user input to reach the server and the response to appear on the endpoint machine. It measures the in-session experience and quantifies the lag experienced while interacting with a virtual app or desktop.
The Session Responsiveness section has the following information:
Active sessions: Active sessions are user sessions currently in operation and connected to Citrix Virtual Apps and Desktops.
Session classification: Sessions are categorized as Excellent, Fair, or Poor based on their ICA RTT measurements over the selected time period. Click the classification numbers to view the session-based Self-Service search for the selected set of sessions.
The thresholds for categorization are calculated for the current customer and are recalibrated dynamically. For more information, see the Dynamic thresholding documentation.
Session classification trend: The session classification is plotted for the selected Site across the selected time duration. The legend displays the current thresholds used to plot the chart and the last updated time for the thresholds.
The session classification trend based on Session Responsiveness helps identify sessions facing network issues.
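As a sketch of how dynamic thresholding could categorize sessions by ICA RTT: the article states only that thresholds are calculated per customer and recalibrated dynamically, so the percentile-based cutoffs below are an assumption for illustration, not the documented method:

```python
def dynamic_thresholds(rtt_samples_ms):
    """Derive Excellent/Fair cutoffs from recent ICA RTT samples.

    Illustrative only: the percentile choice is an assumption,
    not Citrix's published calibration method.
    """
    ordered = sorted(rtt_samples_ms)
    p50 = ordered[len(ordered) // 2]        # median-ish cutoff
    p90 = ordered[int(len(ordered) * 0.9)]  # tail cutoff
    return {"excellent_below_ms": p50, "fair_below_ms": p90}

def classify_session(rtt_ms, thresholds):
    """Categorize a session by its ICA RTT against the cutoffs."""
    if rtt_ms < thresholds["excellent_below_ms"]:
        return "Excellent"
    if rtt_ms < thresholds["fair_below_ms"]:
        return "Fair"
    return "Poor"
```

Recalibrating the cutoffs from each customer's own recent samples, rather than fixing them globally, is what lets the legend's thresholds change over time.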
Session Logon Duration
The period from when a user clicks an application or a desktop in the Citrix Workspace app to the instant the app or desktop is available for use is called the logon duration. The logon duration includes the time taken for various processes in the complex launch sequence. Total logon time includes phases such as Brokering, VM Start, HDX Connection, Authentication, Profile Load, Logon Script, GPO, and Shell Launch.
Breaking down the Session Logon Duration data to individual phases helps troubleshoot and identify a specific phase causing a longer logon duration.
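The phase breakdown above can be sketched as follows. The phase names are the ones listed in this article; the per-phase timings are hypothetical values chosen for illustration:

```python
# Hypothetical per-phase timings (in seconds) for one logon,
# using the phases named in this article.
phases = {
    "Brokering": 1.2,
    "VM Start": 0.0,       # 0 when the machine is already powered on
    "HDX Connection": 2.1,
    "Authentication": 0.8,
    "Profile Load": 6.4,
    "Logon Script": 1.5,
    "GPO": 3.0,
    "Shell Launch": 2.2,
}

total_logon = sum(phases.values())
slowest_phase = max(phases, key=phases.get)
print(f"Total logon: {total_logon:.1f}s, slowest phase: {slowest_phase}")
```

In this hypothetical breakdown, Profile Load dominates the total logon time, which is exactly the kind of phase-level finding the drilldown is meant to surface.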
This section has the following information:
Total logons: The total number of logons to virtual apps or desktops in the selected duration on the chosen Site.
Session classification: Sessions are categorized as Excellent, Fair, or Poor based on their Session Logon Duration measurements over the selected time period. Click the classification numbers to view the session-based Self-Service search for the selected set of sessions. The thresholds for categorization are calculated specifically for the current customer and are recalibrated dynamically. For more information, see the Dynamic thresholding documentation. The legend displays the current thresholds used to plot the chart and the last updated time for the thresholds.
Session Logon Duration sorted by Delivery Groups: Session Logon Duration data is displayed in tabular format with the following information:
Delivery group and the corresponding Site.
Session distribution chart based on performance indicators: Excellent, Fair, or Poor.
Total number of sessions.
Number of Excellent, Fair, and Poor sessions.
By default, the table data is sorted based on the Poor Sessions column. You can choose to sort it based on any of the other columns. The first five Delivery Groups based on the sort criteria are displayed. Click See More Delivery Groups to see more data.
This table helps identify the Delivery Groups with the maximum number of poor sessions. You can troubleshoot further to identify policies causing higher logon duration on the specific Delivery Group.