The HDX technology stack supports the delivery of multimedia applications through two complementary approaches:
- Server-side rendering multimedia delivery
- Client-side rendering multimedia redirection
This strategy ensures that you can deliver a full range of multimedia formats, with a great user experience, while maximizing server scalability to reduce the cost-per-user.
With server-rendered multimedia delivery, audio and video content is decoded and rendered on the Citrix Virtual Apps and Desktops server by the application. The content is then compressed and delivered using the ICA protocol to Citrix Workspace app on the user device. This method provides the highest rate of compatibility with various applications and media formats. Because video processing is compute-intensive, server-rendered multimedia delivery benefits greatly from onboard hardware acceleration. For example, support for DirectX Video Acceleration (DXVA) offloads the CPU by performing H.264 decoding in dedicated hardware. Intel Quick Sync, AMD RapidFire, and NVIDIA NVENC technologies provide hardware-accelerated H.264 encoding.
Because most servers do not offer hardware acceleration for video compression, server scalability suffers if all video processing is done on the server CPU. You can maintain high server scalability by redirecting many multimedia formats to the user device for local rendering.
- Windows Media redirection offloads the server for a wide variety of media formats typically associated with the Windows Media Player.
- HTML5 video has become popular, and Citrix introduced a redirection technology for this type of content. We recommend browser content redirection for websites using HTML5, HLS, DASH, or WebRTC.
- You can apply the general content redirection technologies, Host-to-client redirection and Local App Access, to multimedia content.
Putting these technologies together, if you don’t configure redirection, HDX does Server-Side Rendering. If you configure redirection, HDX uses either Server Fetch and Client Render or Client Fetch and Client Render. If those methods fail, HDX falls back to Server-Side Rendering as needed and is subject to the Fallback Prevention Policy.
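The decision flow above can be sketched as a small Python function. The function and parameter names here are hypothetical, and the logic is a simplified reading of the rules, not the actual HDX implementation:

```python
# Illustrative sketch (hypothetical names) of how HDX picks a delivery method.
def choose_delivery(redirection_configured, client_can_fetch, client_can_render,
                    fallback_prevention=False):
    """Return the delivery method per the rules described above."""
    if not redirection_configured:
        return "server-side rendering"
    if client_can_render:
        # Content is rendered locally; it is fetched by the client when the
        # source is reachable from the client, otherwise by the server.
        return ("client fetch, client render" if client_can_fetch
                else "server fetch, client render")
    # Redirection failed: fall back to server-side rendering unless the
    # fallback prevention policy blocks it.
    if fallback_prevention:
        return "blocked (fallback prevention policy)"
    return "server-side rendering"
```

For example, `choose_delivery(True, False, True)` models a client that can decode the stream but cannot reach the media source directly, so the server fetches and the client renders.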
Scenario 1. (Server Fetch and Server Render):
- The server fetches the media file from its source, decodes, and then presents the content to an audio device or display device.
- The server extracts the presented image or sound from the display device or audio device respectively.
- The server optionally compresses it, and then transmits it to the client.
This approach incurs a high CPU cost, high bandwidth cost (if the extracted image/sound isn’t compressed efficiently), and has low server scalability.
The Thinwire and Audio virtual channels handle this approach. Its advantage is that it reduces the hardware and software requirements for the clients. Because decoding happens on the server, this approach works for a wider variety of devices and formats.
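The steps of this scenario can be traced with a toy sketch. The helper name and the string stand-ins are hypothetical; the sketch only shows the order of operations on the server:

```python
# Hypothetical sketch of server fetch + server render: every step up to
# compression runs on the server, which is why the CPU cost is high.
def server_side_render(media_url, compress=True):
    content = f"raw bits of {media_url}"   # 1. server fetches the media
    frames = f"decoded {content}"          # 2. server decodes (CPU heavy)
    screen = f"presented {frames}"         # 3. presented to display/audio device
    # 4. optionally compressed, then transmitted to the client over ICA
    return f"compressed {screen}" if compress else screen
```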
Scenario 2. (Server Fetch and Client Render):
This approach relies on being able to intercept the media content before it is decoded and presented to the audio or display device. The compressed audio/video content is instead sent to the client, where it is decoded and presented locally. The advantage of this approach is that decoding and rendering are offloaded to the client device, saving CPU cycles on the server.
However, it also introduces some additional hardware and software requirements for the client. The client must be able to decode each format that it might receive.
Scenario 3. (Client Fetch and Client Render):
This approach relies on being able to intercept the media content URL before it’s fetched from the source. The URL is sent to the client where the media content is fetched, decoded, and presented locally. This approach is conceptually simple. Its advantage is that it saves both CPU cycles on the server and bandwidth because the server sends only control commands. However, the media content is not always accessible to the clients.
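A minimal sketch of this interception, with hypothetical names: the server forwards only a small control message containing the URL, and the client fetches and renders the content itself:

```python
# Hypothetical sketch of client fetch + client render.
def server_intercept(media_url):
    # Instead of fetching and decoding, the server sends only a control
    # command with the URL over the virtual channel (minimal bandwidth).
    return {"command": "play", "url": media_url}

def client_handle(message):
    # The client fetches, decodes, and presents the content locally
    # (simulated here as a string; a real client would open the URL).
    assert message["command"] == "play"
    return f"fetching {message['url']} and rendering locally"
```

This also makes the limitation visible: `client_handle` only works if the client can actually reach the URL, which is not always the case.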
Framework and platform:
Single-session operating systems (Windows, Mac OS X, and Linux) provide multimedia frameworks that enable the faster development of multimedia applications. This table lists some of the more popular multimedia frameworks. Each framework divides media processing into several stages and uses a pipeline-based architecture.

|Framework|Platform|
|---|---|
|DirectShow|Windows (98 and later)|
|Media Foundation|Windows (Vista and later)|
|QuickTime|Mac OS X|
|GStreamer|Linux|
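The pipeline-based architecture these frameworks share can be illustrated with a toy Python pipeline (all names hypothetical): each stage consumes the previous stage's output, much as filters do in a DirectShow or Media Foundation graph:

```python
# Toy pipeline: source -> decoder -> renderer, each stage a separate step.
def source(url):
    # Simulated demuxer: ignores the URL and emits three media packets.
    yield from (f"packet{i}" for i in range(3))

def decoder(packets):
    # Simulated decoder: turns each compressed packet into a raw frame.
    for p in packets:
        yield p.replace("packet", "frame")

def renderer(frames):
    # Simulated sink: presents each frame to the display device.
    return [f"display({f})" for f in frames]

pipeline = renderer(decoder(source("video.mp4")))
```

The staged design is what makes redirection possible: a redirection component can tap the pipeline before the decode stage (Scenario 2) or before the source stage (Scenario 3).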
Double hop support with media redirection technologies

|Technology|Double hop support|
|---|---|
|Browser content redirection|No|
|HDX webcam redirection|Yes|
|HTML5 video redirection|Yes|
|Windows Media redirection|Yes|