Powered by System Center
Posts tagged Hyper-V
It has been only a few days since Microsoft made the final bits of System Center 2012 SP1 available, although to a relatively limited audience. If you have a TechNet or MSDN subscription, you have access to SP1 now. Everyone else has to wait until general availability in January 2013.
One of my fellow MVPs in the Cloud and Datacenter Management discipline, Graham Davies, raised my interest by claiming there was something in VMM 2012 SP1 that had gone unnoticed: a brand-new SMI-S storage provider for the onboard iSCSI Target Server role in Windows Server 2012! People using the iSCSI Target Server for demo/test/development as well as production will get quite excited when they discover how well this software storage solution is integrated, not only in Windows but now also in System Center 2012 SP1.
Back in March 2011, I wrote several blogs about the new Fabric concept in Virtual Machine Manager 2012, which was still in its early beta. One blog focused on deep storage integration using a SNIA standard called the Storage Management Initiative Specification (SMI-S). When I was contributing my Fabric chapters to the Microsoft Private Cloud Computing book, only a limited number of usually very expensive storage arrays were available to really dig into this subject. I had access to some HP and NetApp storage to test SMI-S integration, which was still very limited at the time. When we saw how the iSCSI Target Server, previously a separate install on Windows Server 2008 R2, developed and became an included role in Windows Server 2012, we begged the product managers responsible for storage in System Center 2012 to also provide SMI-S support. Well guys … here it is!
Fast-forward about two years: a lot has happened in the storage space, and most storage vendors have introduced SMI-S support for their most important storage products. I now consider a lack of SMI-S support (and of some of the other cool technology such as ODX and UNMAP/TRIM) a sign that a product is on its vendor's end-of-life list and will soon be obsolete.
Especially for those who want to manage their Windows Server 2012 servers and clusters, an important moment has come. System Center 2012 SP1 is not only RTM but also ready for download on TechNet. Several of our projects were delayed because we needed the SP1 versions of Virtual Machine Manager 2012, Data Protection Manager 2012 and several of the other components to manage Windows Server 2012 Hyper-V and Hyper-V Server 2012 clusters. The long wait is over!
Because the new Windows operating system introduced such a shipload of new features, you can imagine what this meant for developing SP1.
It is good to know that some of the ISOs provided contain the full product including SP1, which makes it easier if you want to do a completely fresh install.
Release notes and documentation SCVMM 2012 SP1: http://technet.microsoft.com/library/gg702209.aspx
Installation Guide: http://technet.microsoft.com/library/gg610669.aspx
Release notes and documentation SCOM 2012 SP1: http://technet.microsoft.com/library/hh561709.aspx
Installation Guide: http://technet.microsoft.com/library/hh298609.aspx
Over the last two months, hundreds of Hyper-V users have taken the time and effort to participate in The Great Big Hyper-V Survey of 2012, which closed last week. We cannot share any details just yet, as we are still counting and analyzing the results. We'll let you know as soon as we can. Promise!
As an incentive, we offered a couple of Microsoft Private Cloud Computing books. Because Aidan Finn said he would select a winner for the book based on the responses he got on his own blog and on Twitter, I still have two books to give away: one offered by co-author Damian Flynn and the other by me.
It was quite interesting to see how much attention the survey raised on the web, on Twitter, on blogs, and perhaps also on Facebook (I can't tell, because I got fed up with FB and closed down my account about a year ago). I Binged "The Great Big Hyper-V Survey" and got 15,100 results; I Googled and found 72,600 results. Of course Google also placed an ad on top, but please rest assured that this survey has no connection with Microsoft whatsoever. They are probably just as curious as you and me, for that matter.
All right, now it is time to pick a winner… (please click on More)
As most of our readers know by now, we are currently asking users of Hyper-V to complete the second edition of The Great Big Hyper-V Survey. Last year, several hundred people took part in the survey.
Our original plan was to close the survey after one month, but after some consultation with Aidan Finn and Damian Flynn, we decided to keep the survey open until December 1st.
You can enter the survey here:
Meanwhile, I had also offered a free copy of our Microsoft Private Cloud Computing book, which is a great resource on several System Center 2012 products and on deploying Hyper-V in a private cloud scenario. Because we have kept you waiting for another month, I am pleased to triple the number of books offered as a prize.
Both Aidan Finn and Damian Flynn were so kind as to each offer an extra copy, so now we have three Microsoft Private Cloud Computing books available.
So here is another opportunity for you to win one of the three books, or to enter the survey if you have not yet been able to do so. We stress that this survey is read by Microsoft and that we are still able to influence what is important for the next release of Hyper-V. Be our guest!
All you need to do is tweet or blog with a reference to this blog:
Many businesses are changing their perspective on cloud services. Many of the arguments that have held back acceptance of the cloud are slowly starting to crumble: Internet connectivity is a common good, bandwidths are increasing, and even the legal aspects are being dealt with. Instead of a threat, the cloud now opens up possibilities. Microsoft anticipated this change years ago with numerous game-changing developments. They have set the bar with public cloud offerings like Office365.com, Outlook.com and Bing.com, to name just a few globally deployed services, and raised it further with their cloud operating system, Windows Azure. Providing these services to millions of people gave Microsoft great insight.
Leveraging the knowledge gained through their public cloud offerings, Microsoft has created a great platform for the private cloud with Windows Server 2012 and System Center 2012. This scalable solution provides businesses an elastic environment of pooled resources with self-service capabilities and usage-based metering.
Microsoft now takes it to the next level by bringing Windows Azure to Windows Server. This solution mainly focuses on service providers, but if you do not consider yourself one, please continue reading. The new service also has immense potential for enterprise organizations and might even trigger IT organizations to think about providing hosted services.
Windows Azure for Windows Server consists of a Service Management Portal and a Service Management API. Microsoft has released a beta of Windows Azure for Windows Server with two services and will continue to add services over time. The first two services are high density website hosting and virtual machine provisioning and management. I will focus on the virtual machine provisioning service in this blog.
Microsoft provides an end-to-end solution enabling service providers to create an IaaS (Infrastructure as a Service) offering. This end-to-end solution consists of four layers:
- Windows Server 2012: The virtualization layer
- System Center VMM 2012 SP1: The management layer
- Service Provider Foundation for SCVMM: Multitenancy for SCVMM and a REST OData API
- Windows Azure For Windows Server (WA4WS): The Service Management Portal and API
Windows Server 2012 is the most cloud-capable operating system ever released. Combined with System Center Virtual Machine Manager 2012 as the management tool, they form the building blocks of the modern private cloud, providing the scalability and availability features required by even the most demanding environments. Service providers running hundreds or even thousands of workloads for their customers have these requirements. Leveraging the power of the private cloud, Microsoft enables multitenancy with the Service Provider Foundation (SPF). SPF also exposes an industry-standard RESTful OData web service that developers can use to program against System Center VMM 2012. This means that service providers can retain their current portal solutions and still benefit from the private cloud platform provided by Microsoft.
Windows Azure serves tens of thousands of customers, and hundreds more are added to the platform each day. As you can imagine, the management portal, the primary tool for this platform, is used heavily. Since the launch of Windows Azure on February 1, 2010, the management portal has evolved to fulfill a diversity of requirements, from browser independence to easy administration, monitoring and diagnostics. Microsoft has taken these developments to complete their end-to-end offering for service providers and created a Service Management Portal combined with a Service Management API for Windows Server, both now available in beta.
If you install Windows Server 2008 R2 with Hyper-V, the pagefile is automatically managed and usually has a 1:1 ratio with physical memory. However, VMs use their own pagefile and do not use the host paging mechanism. If you have lots of memory, chances are your pagefile will be way too big.
If we compare two servers with 16GB of memory each, one running Hyper-V R2 and the other running Windows Server 2012 Hyper-V, we clearly see different numbers. Both servers have been running for several days.
In the figure to the left you will see that Windows has 16GB allocated but recommends 24GB (a 1:1.5 ratio).
In the figure to the right you see that Windows Server 2012 is much more intelligent and allocates considerably less memory for paging. This is fairly consistent with the best practice in R2 of setting a fixed maximum for the pagefile between 4 and 6GB.
Although I haven’t found any best practices yet for the pagefile in Windows Server 2012 Hyper-V, this might suggest that you can use the autopilot for virtual memory in the latest server edition of Windows. Of course we need a little bit more experience in the field to really call this a best practice.
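To make the numbers above concrete, here is the 1:1.5 rule the R2 host appears to follow, as a quick Python sketch. This is purely illustrative, based on the observed figures, and not an official Microsoft sizing formula.

```python
def recommended_pagefile_gb(physical_gb, ratio=1.5):
    """Pagefile size under the classic 1:1.5 rule that the 16GB
    Windows Server 2008 R2 host above appears to follow.
    The ratio is an observation from the figure, not a documented constant."""
    return physical_gb * ratio

# A 16GB R2 host -> 24GB recommended, matching the left-hand figure
print(recommended_pagefile_gb(16))
```

In practice, of course, the R2 best practice mentioned above caps the fixed pagefile maximum at 4-6GB regardless of what this ratio suggests.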
If you are looking for a very thorough blog on pagefile settings in R2, please visit this blog.
Here is another one on technet:
And finally a post about the Hyper-V Dynamic Memory and Host Memory Setting in R2:
Host Memory Reserve
The Host Memory Reserve, which reserves memory for the processes in the parent partition, can be found in the following registry key:
Registry Key: HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Virtualization
Value Name: MemoryReserve
Value Type: REG_DWORD
According to this post, the Host Memory Reserve in R2 is calculated in MB as follows: 384MB + (Memory in GB * 64)
So a 16GB host in R2 will have a default Host Memory Reserve of 384MB + (16 * 64) = 1,408MB.
In two 16GB Windows Server 2012 hosts I looked at, the default Memory Reserve was 2048MB. We have to conclude that the formula used for Hyper-V R2 does not apply to Windows Server 2012.
I also looked at a Windows Server 2012 Hyper-V host with only 4GB, which had a currently allocated pagefile size of 704MB and a recommended pagefile size of 3,582MB. This small host did not have a MemoryReserve entry at all, probably because a default 2GB Memory Reserve would be disproportionate to the physical memory.
I had an interesting discussion about MemoryReserve with Michel Luescher who works for Microsoft Consultancy Services at Microsoft Schweiz. Michel is writing the chapters in Windows Server 2012 Hyper-V Installation and Configuration which deal with this subject.
IMPORTANT NOTE: Do not manually configure the MemoryReserve entry; leave this to Windows Server 2012, which will auto-configure the value on an as-needed basis.
In fact, a newly installed Windows Server 2012 server with the Hyper-V role does not have a MemoryReserve entry in the registry at all, and we still need to figure out under what circumstances this entry is added.
Take a look at Michel’s blog (German) on the subject:
A few days ago I promised to send someone a free Microsoft Private Cloud Computing book, provided they tweeted or blogged about The Great Big Hyper-V Survey of 2012.
Of course we want as many people as possible to participate in this survey. That's why we have been rather persistent on Twitter, and we are glad so many of you have already given attention to our survey. Several hundred Hyper-V users from companies of all sizes, including several Fortune 500 companies, have already contributed.
If you haven’t completed the survey yet, let me offer the book as an incentive.
But I haven’t explained the rules yet:
Of course fellow authors and colleagues of mine are not eligible for this prize. Sorry Aidan, Patrick, Damian!
You must have retweeted my tweets on TGBHSof2012, or you must have named us in your blog.
Mentions on Facebook are not valid (as I abandoned Facebook some time ago). Not sorry, Facebook :-)
If you retweeted I must be able to find you in this list which keeps track of the retweets:
If you blogged and mentioned this page you are eligible:
I will announce a winner on the day the survey closes: October 31st 2012, 19:00 CET.
If you are using Red Hat Enterprise Linux, you'll find great support for Hyper-V as a standard feature in the new 5.9 minor release, for which a beta became available recently.
Running Linux distributions with native Hyper-V support saves you the trouble of separately installing the Hyper-V Integration Components, which provide support for multiple cores and synthetic drivers for mouse, video, network and storage. The Hyper-V Linux drivers were recently accepted upstream by the Linux community. For Red Hat Enterprise Linux 5 this means that running as a guest on Hyper-V will improve overall performance.
Microsoft has started offering online backup of Hyper-V virtual machines to Windows Azure using System Center Data Protection Manager 2012 SP1. In this blog I will explain how to set this up, and I can assure you it is absolutely not rocket science.
First of all, you need to register for an account to get access to the Windows Azure Online Backup preview. The registration process does not ask for a credit card and offers you 300GB for a limited period of 6 months to test with. Well, that sounded like an offer I couldn't refuse.
The best way to start is to select the Management workspace in the DPM 2012 SP1 Administrator Console. Once you are registered, you can click Manage Subscriptions on the ribbon in DPM. It will ask you to log in with your newly created [name]@[domain].onmicrosoft.com account.
After signing in you arrive at the Windows Azure Online Backup portal. Click on the Setup menu, then download and install the Windows Azure Online Backup Agent for Windows Server 2012. Note that there is a special module for Windows Server 2012 Essentials.