User Experience (UX) Drilldown

Note:

Citrix Analytics for Performance (Performance Analytics) is currently under Limited Technical Preview.

The User Experience categorization in the User Experience dashboard shows users classified as having Excellent, Fair, or Poor experience. The classification is based on the readings of the performance metrics for the corresponding user sessions over the selected time period.
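
For illustration, here is a minimal sketch of how such a three-way classification could be computed from a per-user metric; the composite score, the user names, and the thresholds are assumptions for this example, not the actual Performance Analytics scoring model.

```python
# Illustrative sketch only: the score and thresholds below are assumptions,
# not the actual Performance Analytics scoring model.
def classify_user(ux_score):
    """Map a composite user experience score (0-100, higher is better)
    to an Excellent, Fair, or Poor category."""
    if ux_score >= 71:          # assumed threshold
        return "Excellent"
    if ux_score >= 41:          # assumed threshold
        return "Fair"
    return "Poor"

for name, score in {"userA": 85, "userB": 55, "userC": 20}.items():
    print(name, classify_user(score))
```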

How to use the UX Drilldown page?

To drill deeper into the factor metrics affecting user experience, click the number in any of the Excellent, Fair, or Poor categories on the dashboard.

  1. Consider a scenario where all Sites in your environment had 56 Excellent Users, 473 Fair Users, and 30 Poor Users during the last week. To understand why 30 users face a poor user experience, click the number 30 on the User Experience dashboard.

  2. A drilldown of the factors affecting the User Experience of Poor users in all Sites during the last week is displayed.

  3. The left panel displays the selection filters for the User Experience and the factors. Use these filters to refine the data further.

  4. Click the Selected users number to drill down to the user details table for the specific set of users.

Factors affecting the UX

On the UX Drilldown page, the Factors affecting UX chart displays the user classification for each factor. Hovering over the chart shows a tooltip with the number of users in each of the Excellent, Fair, and Poor categories for the specific factor.

This is followed by a drilldown of the individual factors into subfactors or types. The classification of users based on the specific factor measurements is displayed. You can click these classification numbers to drill down further into the Self-Service page with the current selection. There, you can see the actual measurements of the subfactors.

Session Logon Duration

Session Logon Duration is the time taken to launch a session. It is measured as the period from the time the user connects from Citrix Workspace app to the time when the app or desktop is ready to use. The classification of users based on the logon duration readings of their sessions is displayed. Clicking these numbers drills down to the metrics for that category.

Session Logon Duration is broken down into subfactors that represent individual phases in the complex launch sequence. Each row in the Session Logon Duration drilldown table represents the user categorization for an individual phase of the session launch. This helps troubleshoot and identify specific user logon issues.

Session Logon Duration Drilldown

For each subfactor, the number of users in each category is displayed in the Excellent, Fair, and Poor columns. Use this information to analyze the specific subfactors that are contributing to a poor user experience.

For example, if GPOs show the highest number of Poor Users, it indicates an issue with the GPO execution time for those users.

The last N/A column displays the number of users for whom the specific subfactor measurement was not available during the selected time period.
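
As an illustration of how these columns relate to individual readings, the following sketch tallies hypothetical per-user categorizations into Excellent, Fair, Poor, and N/A counts for each subfactor; the sample data is assumed, not product output.

```python
# Illustrative sketch: tally users into the Excellent / Fair / Poor / N/A
# columns for each logon subfactor. The per-user categories below are
# assumed sample data, not output of the product.
from collections import Counter, defaultdict

readings = {
    "userA": {"GPOs": "Poor", "Profile load": "Fair", "Brokering": "Excellent"},
    "userB": {"GPOs": "Poor", "Profile load": "Excellent", "Brokering": None},
    "userC": {"GPOs": "Fair", "Profile load": None, "Brokering": "Excellent"},
}

columns = defaultdict(Counter)
for user, subfactors in readings.items():
    for subfactor, category in subfactors.items():
        columns[subfactor][category or "N/A"] += 1  # None means not measured

for subfactor, counts in columns.items():
    print(subfactor, dict(counts))
```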

Note:

The total logon duration is not the exact sum of the subfactor durations. Some phases might overlap with each other, and in some phases additional processing happens, which can result in a logon duration longer than the sum.

The total logon time also does not include the ICA idle time.
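
A minimal numeric sketch of this point, using assumed durations rather than real measurements, shows how the measured total can differ from the sum of the phases:

```python
# Assumed example durations in seconds; not real measurements.
subfactor_durations = {
    "Brokering": 2, "HDX connection": 3, "Authentication": 4,
    "GPOs": 6, "Logon scripts": 2, "Profile load": 5,
    "Interactive Session": 12,
}
subfactor_sum = sum(subfactor_durations.values())   # 34 s
measured_total = 38                                  # end-to-end logon duration

# Extra, unattributed processing can make the measured total larger than
# the sum; overlapping phases can instead make the sum larger than the total.
print(f"Sum of subfactors: {subfactor_sum} s, measured total: {measured_total} s")
```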

Session Logon Duration subfactors

GPOs: The time taken to apply Group Policy Objects during logon. The GPOs measurement is available only if Group Policy settings are configured and enabled on the virtual machines.

Profile load: The time taken for the user profile to load. The Profile Load measurement is available only if profile settings are configured for the user or the virtual machine.

Interactive Session: The time taken to “hand off” keyboard and mouse control to the user after the user profile has been loaded. It is normally the longest duration of all the phases of the logon process.

Brokering: The time taken to decide which desktop to assign to the user.

VM Start: If the session required a machine start, it is the time taken to start the virtual machine. This measurement is not available for non-power managed machines.

HDX connection: The time taken to complete the steps required in setting up the HDX connection from endpoint to the virtual machine.

Authentication: The time taken to complete authentication to the remote session.

Logon scripts: The time taken for the logon scripts to run. This measurement is available only if logon scripts are configured for the session.

Session Responsiveness

Once a session is established, the Session Responsiveness factor measures the screen lag that a user experiences while interacting with an app or desktop. Session Responsiveness is measured using the ICA Round Trip Time (ICA RTT), which represents the time elapsed from when the user presses a key until the graphical response is displayed back.

ICA RTT measures the traffic delays in the server or endpoint machine networks and the time taken to launch an application. This is an important metric that gives an overview of the actual user experience.

Session Responsiveness Drilldown

The Session Responsiveness drilldown represents the classification of users based on the ICA RTT readings of their sessions. Clicking these numbers drills down to the metrics for that category. Excellent users in Session Responsiveness have had highly responsive sessions, while Poor users faced lag in their sessions.

Note:

While the ICA RTT readings are obtained from Citrix Virtual Apps and Desktops, the subfactor measurements are obtained from the Citrix Gateway. Hence, the subfactor values are available only when the user connects to an app or a desktop through a configured Citrix Gateway.

Further, these measurements are available for sessions that are:

  • launched from VDAs enabled for NSAP. This includes VDAs of version 7.16 and later.

  • new CGP (Common Gateway Protocol) sessions, and not reconnected sessions.

The rows in the Session Responsiveness drilldown table represent the user categorization in the subfactor measurements. For each subfactor, the number of users in each category is displayed in the Excellent, Fair, and Poor columns. This helps analyze the specific subfactor that is contributing to poor user experience.

For example, the highest number of Poor Users recorded for Data Center Latency indicates an issue with the server-side network.

The last N/A column displays the number of users for whom the specific subfactor measurement was not available during the selected time period.

Session Responsiveness subfactors

The following subfactors contribute to Session Responsiveness. However, the total ICA RTT is not a sum of the subfactor metrics, as only the subfactors of ICA RTT that occur up to Layer 4 are measurable.

  • Data Center Latency: This is the latency measured from the Citrix Gateway to the server. A high Data Center Latency indicates delays due to a slow server network.

  • WAN Latency: This is the latency measured from the virtual machine to the Gateway. A high WAN Latency indicates sluggishness in the endpoint machine network. WAN latency increases when the user is geographically farther from the Gateway.

  • Host Latency: This measures the Server OS induced delay. A high ICA RTT with low Data Center and WAN latencies, and a high Host Latency indicates an application error on the host server.

A high number of Poor users in any of the subfactors helps you understand where the issue lies. You can further troubleshoot the issue using Layer 4 delay measurements. None of these latency metrics account for packet loss, out-of-order packets, duplicate acknowledgments, or retransmissions. Latency might increase in these cases.
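
As a minimal numeric sketch of this relationship (the values are assumed, not real measurements), the three measurable subfactors account for only part of the end-to-end ICA RTT, and the remainder is delay that occurs above Layer 4 and cannot be attributed further:

```python
# Assumed example values in milliseconds; not real measurements.
ica_rtt = 180                # end-to-end ICA round trip time
data_center_latency = 20     # Citrix Gateway to server
wan_latency = 90             # measured up to the Gateway
host_latency = 40            # server OS induced delay

measurable = data_center_latency + wan_latency + host_latency
unattributed = ica_rtt - measurable   # delay above Layer 4, not broken down
print(f"Measurable subfactors: {measurable} ms, unattributed: {unattributed} ms")
```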

For more information on the calculation of ICA RTT, see How ICA RTT is calculated on NetScaler Insight. For more information about onboarding Citrix Gateway, see Install the system.

Session Reliability

Session Reliability represents the rate of successful session launches. It is calculated as the number of successful session connections with respect to the total number of attempted session connections. An excellent Session Reliability factor indicates that users are able to successfully connect to and use the app or desktop. A high number of Poor users here indicates that users are unable to connect to and use their sessions. Since failure to launch sessions disrupts user productivity, Session Reliability is an important factor in quantifying the user experience.
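
As a simple worked example of that calculation, with assumed figures rather than actual product output:

```python
# Assumed example: 27 successful launches out of 30 attempted connections.
attempted_connections = 30
successful_connections = 27

session_reliability = successful_connections / attempted_connections
print(f"Session reliability: {session_reliability:.0%}")   # 90%
```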

Session Reliability Drilldown

The rows in the Session Reliability drilldown table display the failure types, along with the number of users and the number of failures in each category. Use the listed failure types to further troubleshoot the failures.

For more information about the possible reasons within an identified failure type, see the Failure Reasons Troubleshooting document.

Session Resiliency

Session Resiliency indicates how well the Citrix Workspace app auto reconnected to recover from network disruptions. Auto reconnect keeps sessions active and on the user’s screen when network connectivity is interrupted. Users continue to see the application they are using until network connectivity resumes. An excellent Session Resiliency factor indicates a smooth user experience in spite of network disruptions. Session Resiliency is measured using the Reconnection rate.

Auto reconnect is enabled when the Session Reliability or the Auto Client Reconnect policies are in effect. When there is a network interruption on the endpoint, the following Auto reconnect policies come into effect:

  • The Session Reliability policy comes into effect (by default for 3 minutes), during which the Citrix Workspace app tries to connect to the VDA.
  • The Auto Client Reconnect policy comes into effect between 3 and 5 minutes, during which the endpoint machine tries to connect to the VDA.

The number of auto reconnects is captured in the Reconnection rate, calculated as the number of auto reconnects per unit time.
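
For example, a minimal sketch of that ratio with assumed figures; the observation window and counts are illustrative only:

```python
# Assumed example: 6 auto reconnects observed over a 2-hour window.
auto_reconnects = 6
window_hours = 2

reconnection_rate = auto_reconnects / window_hours
print(f"Reconnection rate: {reconnection_rate} reconnects per hour")   # 3.0
```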

Session Resiliency Drilldown