The HDX technology stack supports the delivery of multimedia applications through two complementary approaches:
- Server-side rendering multimedia delivery
- Client-side rendering multimedia redirection
This strategy ensures that you can deliver a full range of multimedia formats, with a great user experience, while maximizing server scalability to reduce cost-per-user.
With server-rendered multimedia delivery, audio and video content is decoded and rendered on the XenApp or XenDesktop server by the application. The content is then compressed and delivered over the ICA protocol to the Citrix Receiver on the user device. This method provides the highest rate of compatibility with various applications and media formats. Because video processing is compute-intensive, server-rendered multimedia delivery benefits greatly from onboard hardware acceleration. For example, support for DirectX Video Acceleration (DXVA) offloads the CPU by performing H.264 decoding in separate hardware. Intel Quick Sync and NVIDIA NVENC technologies provide hardware-accelerated H.264 encoding.
Because most servers do not offer hardware acceleration for video compression, server scalability is negatively impacted if all video processing is done on the server CPU. To maintain high server scalability, many multimedia formats can be redirected to the user device for local rendering. Windows Media redirection offloads the server for a wide variety of media formats typically associated with the Windows Media Player.
Flash redirection redirects Adobe Flash video content to a Flash player running locally on the user device.
HTML5 video has become popular, and Citrix introduced a redirection technology for this type of content.
You can also apply the general content redirection technologies, Host-to-client redirection and Local App Access, to multimedia content.
Putting these technologies together: if you don't configure redirection, HDX uses Server-Side Rendering. If you configure redirection, HDX uses either Server Fetch and Client Render or Client Fetch and Client Render. When those methods fail, HDX falls back to Server-Side Rendering as needed, subject to the Fallback Prevention policy.
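The fallback decision described above can be sketched as a simple selection function. This is an illustrative model only, with hypothetical names, not a Citrix API:

```python
def select_delivery_method(redirection_configured: bool,
                           client_can_fetch: bool,
                           client_can_render: bool,
                           fallback_prevented: bool = False) -> str:
    """Illustrative HDX delivery-method selection (hypothetical, not a real API)."""
    if not redirection_configured:
        return "server-side rendering"
    if client_can_render:
        # Prefer fetching on the client when the media source is reachable there.
        if client_can_fetch:
            return "client fetch and client render"
        return "server fetch and client render"
    # Redirection failed; fall back unless policy prevents it.
    if fallback_prevented:
        return "no playback (fallback prevented)"
    return "server-side rendering"
```

For example, with redirection configured and a client that can both fetch and render, the function selects Client Fetch and Client Render; if the client cannot render and fallback is prevented by policy, no playback occurs.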
Scenario 1. (Server Fetch and Server Rendering)
- The server fetches the media file from its source, decodes it, and then presents the content to an audio device or display device.
- The server extracts the presented image or sound from the display device or audio device respectively.
- The server optionally compresses it, and then transmits it to the client.
This approach incurs a high CPU cost and, if the extracted image or sound isn't compressed efficiently, a high bandwidth cost, resulting in low server scalability.
The Thinwire and Audio virtual channels handle this approach. Its advantage is that it reduces the hardware and software requirements on the client: because decoding happens on the server, this approach works for a wider variety of client devices and media formats.
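The server-side flow can be sketched as follows. The `decode` and `render` functions are trivial stand-ins for real codec and display work, used only to show where the CPU and bandwidth costs arise:

```python
import zlib

def decode(media: bytes) -> bytes:
    # Stand-in for CPU-intensive video/audio decoding on the server.
    return media[::-1]

def render(frame: bytes) -> bytes:
    # Stand-in for presenting the frame on the server's display device.
    return frame

def server_fetch_server_render(media: bytes) -> bytes:
    # The server decodes and presents the content, extracts the rendered
    # frame, optionally compresses it, and transmits it to the client.
    frame = render(decode(media))
    return zlib.compress(frame)
```

All the expensive steps (decode, render, compress) run on the server, which is why per-user CPU cost is high and scalability is low in this scenario.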
Scenario 2. (Server Fetch and Client Render)
This approach relies on being able to intercept the media content before it is decoded and presented to the audio or display device. The compressed audio/video content is instead sent to the client, where it is decoded and presented locally. The advantage of this approach is that decoding and presentation are offloaded to the client device, saving CPU cycles on the server.
However, it also introduces some additional hardware and software requirements for the client. The client must be able to decode each format that it might receive.
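The codec requirement on the client can be modeled as a capability check before the compressed stream is forwarded. This is a conceptual sketch, not Citrix's actual negotiation protocol:

```python
def server_fetch_client_render(media_codec: str, compressed: bytes,
                               client_codecs: set[str]) -> tuple[str, bytes]:
    # Intercept the compressed stream before server-side decoding and
    # forward it untouched, but only if the client can decode this format.
    if media_codec in client_codecs:
        return ("client-render", compressed)   # compressed bytes cross the wire
    # No matching client codec: fall back to rendering on the server.
    return ("server-render", b"<server-rendered frames>")
```

If the client advertises `{"h264", "aac"}` and the stream is H.264, the compressed bytes are forwarded as-is; an unsupported codec forces a server-side fallback.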
Scenario 3. (Client Fetch and Client Render)
This approach relies on being able to intercept the URL of the media content before it is fetched from the source. The URL is sent to the client where the media content is fetched, decoded, and presented locally. This approach is conceptually simple. Its advantage is that it saves both CPU cycles on the server and bandwidth because only control commands are sent from the server. However, the media content is not always accessible to the clients.
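The reachability constraint can be sketched as a pre-check before redirecting the URL. The reachable-host set is a hypothetical stand-in for whatever connectivity test a real implementation performs:

```python
from urllib.parse import urlparse

def client_fetch_client_render(url: str, client_reachable: set[str]) -> str:
    # Only a small control command carrying the URL crosses the ICA
    # connection; the client fetches, decodes, and presents the media.
    host = urlparse(url).hostname
    if host in client_reachable:
        return f"redirect:{url}"
    # The source may sit behind the datacenter firewall, unreachable
    # from the client; fetch on the server instead.
    return "server-fetch"
```

Redirecting only the URL is what saves both server CPU cycles and ICA bandwidth, but the check illustrates why this scenario fails when the media source is not accessible to the client.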
Framework and platform
Desktop operating systems (Windows, Mac OS X, and Linux) provide multimedia frameworks that enable the faster and easier development of multimedia applications. This table lists some of the more popular multimedia frameworks. Each framework divides media processing into several stages and uses a pipeline-based architecture.
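Such a pipeline can be sketched as a chain of stages, each consuming the previous stage's output. This is a conceptual illustration of the pipeline pattern, not any specific framework's API:

```python
def source(url):
    # Source stage: read container packets from the media location.
    yield from (f"pkt{i}" for i in range(3))

def demux(packets):
    # Demux stage: split the container into elementary streams.
    for pkt in packets:
        yield ("video", pkt)

def decode(samples):
    # Decode stage: turn compressed samples into raw frames.
    for stream, pkt in samples:
        yield (stream, pkt.upper())

def sink(frames):
    # Sink stage: present raw frames to the display or audio device.
    return list(frames)

frames = sink(decode(demux(source("clip.mp4"))))
```

Redirection technologies work by tapping into this chain at a well-defined stage boundary: intercepting after the demux stage yields Server Fetch and Client Render, while intercepting at the source stage yields Client Fetch and Client Render.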