Welcome to the HDX bandwidth test blog series! In the following posts I will share the results and key findings from a wide range of XenDesktop 5.6 / XenApp 6.5 bandwidth tests. Already using XenDesktop 7? Then stay tuned for the latest post in the series, where I will discuss it as well. Before we begin, I want to take this opportunity to thank Andy Baker and Thomas Berger for their help and guidance in driving this effort.
XenDesktop Bandwidth: The Full Series
Part 1 - Prologue: Methodology and Infrastructure
Part 2 - By The Numbers: Taking the Time to Optimize
Part 3 - Bringing It All Together: Daily User Averages and General Recommendations
Part 4 - What About XenApp?
Part 5 - Do It Yourself: Starter Kit
Part 7 - Who Needs a Part 6? Discussing XenDesktop 7.x (and XenApp 7.5!)
Bonus - Customer Analysis
Bonus - HDX 3D Pro
Part 1: Prologue
It is not uncommon these days to have large LANs with what seems like unlimited bandwidth. I find myself at the office running multiple virtual desktops, downloading my ShareFile files, and streaming Spotify to my machine, all at the same time and without noticing. Even at home you have a nice fast connection all to yourself. Unfortunately, many users do not have this luxury, as quality connections are expensive and companies often share them across many users. Overlooking this fact during a deployment can result in laggy and disconnected sessions and an overall poor user experience. This leaves IT departments frequently asking, "How much bandwidth do I need for XenDesktop?"
Of course, the answer to this question (in my opinion, more than to most questions) is: it depends. Why is that? Because it depends on what is on the screen at any given time. The amount of bandwidth consumed will be close to nothing when the session is idle, but it can vary considerably as the user types, scrolls through a document, runs a slide show, or watches a video. Of course, there is always the magic number of 20-30 kbps that has been around forever, but that was before the explosion of multimedia content both on the web and, now more frequently, in applications as well. (Although I would like to point out that we meet and beat this 20-30 kbps range even in some of our media-intensive testing below.)
To help better answer that question, our team decided to start running some tests... a lot of tests. We measured general everyday use, took a deep dive into individual application testing, and put some optimizations and best practices to the test. In part one of this blog, I will discuss the infrastructure and methodology for our first series of tests.
Infrastructure
The tests I will discuss were run in the environment below. A laptop was connected to an Apposite WAN emulator and used as the endpoint for both the manual and the Login VSI tests. The emulator was used to control the bandwidth caps for each scenario I describe later in this blog. The other end of the emulator was connected to a switch configured to send all packets entering and leaving the emulator to a mirrored port monitored by a server running Wireshark. This ensured all communication between the client and the virtual desktop was captured without interfering with the VSI scripts. The infrastructure for the environment and the desktop pool was also connected to the switch and communicated transparently with the laptop.
The environment was built using the product versions below. It is important to note that future tests are planned with XenDesktop 7, which has new codecs and rendering algorithms. These tests were also carried out with the latest Login VSI, which includes more intense workloads than previous versions and a large randomized content library.
- XenServer 6.1
- Microsoft Windows Server 2008 R2
- Windows 7 x86
- XenDesktop 5.6
- VDA 5.6.2
- Receiver 3.4 Enterprise
- Login VSI 4.0
Disclaimers
Before I start explaining the tests we ran, I have to make a few disclaimers. The first is that quality of service (QoS) was not implemented in this environment. We made this decision because we wanted to look at total ICA session consumption and felt QoS would add another layer of complexity. QoS can make a difference in a production environment, and I do recommend implementing it along with appropriate testing.
The other limitation we encountered was with Login VSI. The automated scripts run on the hosted infrastructure, not on the client. This means that mouse clicks and keystrokes were simulated locally and not sent over the wire. We do not believe this materially alters the results, since the vast majority of traffic during a session comes from the virtual desktop. In the results I discuss in the rest of this blog series, bandwidth averages cover only the bandwidth delivered from the XenDesktop or XenApp host to the client.
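To make that caveat concrete, here is a minimal sketch of how captured traffic can be split by direction so that only host-to-client bytes are counted. The IP addresses and the tuple layout are hypothetical placeholders, not values from our environment; a real capture would be exported from Wireshark first.

```python
# Sketch: counting only host-to-client bytes from a capture, matching the
# caveat that our averages cover host-delivered traffic only.
# Both addresses below are hypothetical placeholders.

HOST_IP = "10.0.0.10"    # hypothetical virtual desktop (VDA) address
CLIENT_IP = "10.0.0.20"  # hypothetical endpoint (laptop) address

def host_to_client_bytes(packets):
    """packets: iterable of (src_ip, dst_ip, length_bytes) tuples."""
    return sum(length for src, dst, length in packets
               if src == HOST_IP and dst == CLIENT_IP)

# Illustrative sample: two screen updates down, one small packet up.
sample = [
    (HOST_IP, CLIENT_IP, 1400),
    (CLIENT_IP, HOST_IP, 60),
    (HOST_IP, CLIENT_IP, 1400),
]
downstream = host_to_client_bytes(sample)  # 2800 bytes
```

Filtering this way in the analysis step mirrors what a Wireshark display filter on source address would do, without modifying the capture itself.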
Finally, since CPU, memory, and disk contention can also degrade the user experience, I made sure there was an adequate supply of all three in the tests, so that any degradation would be the result of limited bandwidth alone. It is important to note that some of the changes and policies that reduce bandwidth can come at a cost in CPU, and each environment will have different requirements in this regard.
Login VSI Tests
This series of tests was performed with a single user at a fixed network capacity, using Login VSI 4.0. The application deep dive was run with 5 distinct workloads corresponding to Microsoft Word, Excel, Outlook, PowerPoint, and Adobe Reader. Each workload was created by removing all actions from the default VSI medium workload except those belonging to the application of interest, and the workloads will be made available later in this blog series. They do not include pauses or breaks and therefore represent heavy usage rather than an "average daily user". The purpose of these tests was to focus on how different applications respond over a WAN and how low we could push the bandwidth with the right optimizations in place. These workloads were run at 6 different bandwidth caps with 3 different configurations. The test breakdown is presented below.
- Applications
- Microsoft Word 2010
- Microsoft Excel 2010
- Microsoft Outlook 2010
- Microsoft PowerPoint 2010
- Adobe Reader XI
- Caps Bandwidth
- LAN (1Gbps)
- 2.048 Mbps
- 1.536 Mbps
- 1.024 Mbps
- 512 kbps
- 256 kbps
- Configurations
- Default
- Optimized
- Max Optimized
5 applications X 6 bandwidth caps X 3 configurations X 5 consistency runs = 450 tests
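The matrix above can be sketched as a simple cross product. This is only an illustration of the arithmetic; the label strings and the kbps encoding of the LAN baseline are my own.

```python
# Sketch: enumerating the deep-dive test matrix described above.
# Names are taken from the lists in this post; the LAN baseline is
# approximated here as 1 Gbps expressed in kbps.
from itertools import product

apps = ["Word 2010", "Excel 2010", "Outlook 2010",
        "PowerPoint 2010", "Adobe Reader XI"]
caps_kbps = [1_000_000, 2048, 1536, 1024, 512, 256]  # LAN + five WAN caps
configs = ["Default", "Optimized", "Max Optimized"]
runs = range(1, 6)  # five consistency passes per combination

matrix = list(product(apps, caps_kbps, configs, runs))
# len(matrix) == 5 * 6 * 3 * 5 == 450
```

Enumerating the combinations up front also gives a natural way to name capture files or schedule runs, one per tuple.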
But How Much Bandwidth?
Now, because the Login VSI medium workload is considered by many to be a standard, I ran additional tests on each configuration to come up with a sort of "daily average". Not to be confused with the 450-test application deep dive above, the daily average results will be shared later in the series to bring everything together and summarize the recommendations from this exercise. This average includes the applications above plus Internet browsing, image viewing, 480p video, and idle time. You can find the official workload here.
Manual Tests
Each scenario was also tested manually to get a feel for the speed and quality of the session. I did this by performing specific tasks in each application and rating the responsiveness of each task. For example, using the default configuration at 1.536 Mbps, scrolling through slides in PowerPoint was rated "good", while playing a slideshow with pictures and transition animations was rated "fair" due to slight delay.
Bandwidth Caps
As I mentioned earlier, each test was performed at 6 different bandwidth caps. After much discussion we agreed on the 5 limits, plus LAN speed as the baseline. These caps allowed us to better understand when certain actions start to be affected during the manual tests, and then to compare the bandwidth consumed and packets delivered during the VSI workloads. We also saw improvements in responsiveness through optimization: several actions were rated "good" at the lower bandwidth caps once heavier optimizations were in place, such as the Excel example pictured below under a 256 kbps cap.
Every application at every bandwidth cap was evaluated, and anything rated below "good" was noted. In the case above, paging down the Excel sheet in the default configuration was rated "fair" due to a slight delay in rendering the text. That was a sheet entirely pre-filled with text and highlighted cells; a blank sheet with minimal text performed very well in our manual testing even at the lower caps. In the case of the ribbon in Excel 2010, there was a slight delay the first time I navigated to another tab, but once the tab had been cached I could browse the tabs as if the application were running locally.
Metrics
To examine the test runs and compare them to each other, we used a variety of metrics, which I will explain below. In terms of bandwidth, Wireshark captured every packet in the ICA session, so we decided to consider the following:
- Average bandwidth
- Total amount of data transferred
- Number of packets transferred
- Maximum bandwidth hit during the session
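The four bandwidth metrics above can all be derived from the packet timestamps and sizes in a capture. Below is a minimal sketch, assuming packets have been exported from Wireshark as (timestamp, size) pairs; the one-second peak window and the sample data are my own illustrative choices, not the exact method used in our tests.

```python
# Sketch: deriving the four session metrics listed above from a packet trace.
# Each packet is a (timestamp_seconds, payload_bytes) tuple.

def session_metrics(packets, window=1.0):
    """Return (avg_kbps, total_bytes, packet_count, peak_kbps)."""
    if not packets:
        return 0.0, 0, 0, 0.0
    total_bytes = sum(size for _, size in packets)
    duration = max(packets[-1][0] - packets[0][0], window)
    avg_kbps = total_bytes * 8 / duration / 1000

    # Peak: highest throughput seen in any window-sized interval that
    # starts on a packet arrival.
    peak_kbps = 0.0
    for i, (t0, _) in enumerate(packets):
        window_bytes = sum(size for t, size in packets[i:] if t < t0 + window)
        peak_kbps = max(peak_kbps, window_bytes * 8 / window / 1000)
    return avg_kbps, total_bytes, len(packets), peak_kbps

# Illustrative trace: four packets over two seconds.
sample = [(0.0, 1500), (0.5, 1500), (0.6, 1500), (2.0, 500)]
avg, total, count, peak = session_metrics(sample)
# avg == 20.0 kbps, total == 5000 bytes, count == 4, peak == 36.0 kbps
```

The gap between average and peak is exactly why an idle-heavy session can average near nothing yet still saturate a 256 kbps link during a burst.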
We also wanted a way to measure the user experience other than manual testing, which has an inherent element of subjectivity to it. To do this, we borrowed an engineering tool to help us capture two additional metrics for each session:
- Frames per second delivered to the client (FPS)
- ICA round-trip time (ICART)
You can learn more about the use of these metrics in Frank Anderson's user experience blog. Since you've already seen that we did, in fact, use manual testing, it may not surprise you to hear that there were limits to the use of these metrics in our specific tests.
As Frank mentions in his blog, when network capacity is reached, ICART starts to increase due to the added latency. We did see this happen as expected, but the ICART peaks measured across the various test runs varied too widely to use ICART as the basis for our ratings. Instead, we examined trends between the different configurations and bandwidth caps as an overall measure of responsiveness, which I will cover in the next part of this series.
FPS also showed its limits, particularly in the optimized configurations. As you'll see next time, some of our optimized configurations included limiting FPS using Citrix policies. This of course lowers the FPS we measure, so FPS proved a better comparison metric in our "default" tests.
Next Time
So what did we find with so many tests? Well, for starters, spending only a few hours optimizing the visual settings can provide huge bandwidth savings and improve session responsiveness and the overall user experience. In Part 2, I'll discuss the difference between the optimized and non-optimized tests and the optimizations we implemented.
Thanks
I would like to thank Marcel Calef, Derek Thorslund, Frank Anderson, and Mohit Oza for their contributions and expertise in making this testing possible.
Thanks for reading,
Amit Ben-Chanoch
Worldwide Consulting
Desktop & Applications Team
Project Accelerator
Virtual Desktop Handbook
Follow @CTXConsulting