Friday, October 26, 2007
VMware Capacity Planner: taking the data offline (Part 2: Cygwin & wget)
I started thinking about how to avoid the manual exports from the VMCP website and remembered that I had used wget in the past for this kind of task. The difference this time was that I needed to log in to the website before I could get any data from it.
Logging in to the VMCP website is done using a POST request, and luckily wget supports that. The next thing is to work out what the POST data needs to be. This can be derived from the source of the logon page:
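In the page source, the logon form looks roughly like this (a reconstruction based on the field names it contains; the actual markup on the VMCP site may differ):

<form action="index.cfm" method="post">
<input type="hidden" name="fuseaction" value="Security">
<input type="hidden" name="Page" value="users_login.cfm">
<input type="text" name="username">
<input type="password" name="password">
<input type="submit" value="Login">
</form>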
From this, the relevant POST data is derived to be:
fuseaction=Security
Page=users_login.cfm
username=???
password=???
Since I run Windows XP on my laptop, I use Cygwin to run wget. Cookies are used to store a session ID for your logon session, so wget has to be told to store those cookies in a file. This is the resulting command line:
wget --keep-session-cookies --save-cookies cookies.txt 'https://optimize.vmware.com/index.cfm' --post-data 'fuseaction=Security&Page=users_login.cfm&username=???&password=???' --no-check-certificate -O output.html
By inspecting the output.html file, you should be able to see whether the logon was successful. Success depends on many factors (local proxy, network settings, username/password, etc.). You can get more diagnostic output by adding suitable options to wget, such as -S to show the server response headers or -d for full debug output.
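To automate the check and then actually pull data with the saved session, something along these lines should work. Note that the "logout" marker and the export URL are assumptions on my part; substitute whatever string appears only on the real logged-in page, and the real report URL:

# crude success check: look for a string that only appears when logged in
grep -i "logout" output.html

# reuse the saved session cookie to fetch further pages, e.g. a CSV export
wget --load-cookies cookies.txt --no-check-certificate \
  'https://optimize.vmware.com/index.cfm?fuseaction=...' -O export.csv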
This concludes part 2 of this series of articles. In this part, we used wget to log on to the VMware Capacity Planner website.
Labels: Capacity, Monitoring, Performance, Virtualization, VMware
Thursday, October 25, 2007
VMware Capacity Planner: taking the data offline (Part 1: Introduction)
Lately, I have been involved in some VMware Capacity Planner (VMCP) engagements. VMCP deals with monitoring a set of (physical) servers in order to assess possible virtualization scenarios.
VMCP works in the following way: a monitoring server is set up with a tool that performs (among other things) regular performance measurements. This tool sends the data to a VMware database over the web (through a secure channel). Via a web interface, one can then query the information, get reports, view trends, show graphs, and configure and create consolidation scenarios. Usually, we let the system run for around four weeks to get a realistic idea of the performance characteristics.
What I like about VMCP is that the data is not locked away at the customer site but available at all times (once it has been uploaded). This gives me the opportunity to check regularly on the status of the performance measurements.
The biggest disadvantage of VMCP is that the web interface is neither the most flexible nor the fastest around. Some things I would like to do are not available (lack of flexibility) but could easily be done in, e.g., Excel; and at times when everyone in the world is awake, it takes ages to refresh a page and get the result of a query. Moreover, it is not easy to get good-looking information to paste into a document.
When it comes to writing a report, the customer is obviously not interested in just a statement like "you will need 5 ESX servers of type X to cope with the load." Therefore, I like to add tables with the most useful metrics (CPU, network, disk I/O, ...) for later reference. I add this information as an appendix.
This is where I would spend at least half a day exporting CSV files from the VMCP website, loading them into Excel, laying them out as nice tables, and pasting them into the document. I started thinking about automating some of the steps involved, and I have already covered the most time-consuming one: exporting the information from the website as a CSV file.
In the following part, I'll explain how I started this little adventure...
Labels: Capacity, Monitoring, Performance, Virtualization, VMware
Wednesday, October 03, 2007
VDI vs. TS/CTX
Check out this blog article by Brian Madden.
Labels: Virtualization, VMware