Wednesday 29 June 2011

Native VHD data integrity issues / V2P [2]

Last time around I posted on the issue of native VHD data integrity considerations. Native VHD is a very fast imaging solution to deploy. However, having your whole hard drive in a single file increases your vulnerability to disk corruption, which could make it much more difficult to recover data in the case of bad sectors. For that reason I do not recommend Native VHD for computers on which the user will store all their own data, unless that data is stored on a different disk partition.

There are a couple of scenarios for addressing this:

  1. Move the user’s profile storage onto a separate disk partition. Telling Windows to use a different drive for the Users folder should be possible using the MKLINK command, which creates what is known as a directory junction (see the sketch after this list). There is plenty of documentation on how to do this on the web. However, my primary concern is whether this scenario is officially supported by Microsoft. There are scenarios where moving directories that are expected to be on the C drive can break the installation of service packs or updates, so this might not be the best option unless you have tested it and are completely sure it works.
  2. Do a V2P, converting the VHD to a WIM file for final deployment. At the moment this is what I am doing for a trial. We will partition the disk just as we do with VHD boot, but instead of copying the VHD to a partition and setting it up there as the boot drive, the WIM file will be applied with ImageX to a partition and BCDBoot will then be invoked to make the computer boot from that partition.
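A minimal sketch of the junction approach in scenario 1, assuming the profiles have already been relocated, no users are logged on, and the D: partition exists (the paths are illustrative, and as noted above I can’t vouch for this being officially supported):

rem Mirror the profiles to the new partition (junctions excluded to avoid loops)
robocopy C:\Users D:\Users /mir /xj
rem Remove the originals, then junction the old path to the new location
rmdir /s /q C:\Users
mklink /j C:\Users D:\Users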

Getting started with a V2P is pretty easy. Simply mount the VHD on a physical computer. In our case I have a virtual server that has the image VHDs stored as files within its own virtual hard disk. Without shutting down that virtual server I can go into Disk Management and attach a VHD, which makes it accessible through a drive letter. I can then start a deployment command prompt as an administrator and run this command line:
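If you prefer the command line to Disk Management, the attach can also be scripted with Diskpart (the VHD path is illustrative):

diskpart
rem Select the image VHD and attach it; it then appears under a drive letter
select vdisk file=D:\Images\master.vhd
attach vdisk
exit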

imagex /capture source_path destination_image "image_name" /compress none /verify

  • source_path is the directory path to be imaged e.g. C:\
  • destination_image is the path and filename of the WIM file to be created
  • image_name is a text string that gets saved in the WIM file to say what it is
  • /compress is an optional switch to specify compression. Turning it off will speed things up
  • /verify turns on integrity checking of the files as they are captured
  • Note that the /capture switch requires all three of the parameters listed above (source_path, destination_image and image_name).
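Putting that together, a capture of a VHD attached as (say) V: might look like this (the paths and image name are illustrative):

imagex /capture V:\ D:\Deployment\Images\win7-master.wim "Windows 7 master" /compress none /verify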

This got me a WIM file in about 38 minutes, which is quite reasonable for an image of about the same number of GB. The actual WIM file itself is only 21 GB, which is interesting considering compression was turned off. ImageX automatically excludes a few paths (the page file, for instance), and it does raise the idea that the VHD file could be compacted, but I can’t be bothered doing this. WIMs also support single-instance storage, which presumably isn’t operating in VHDs; this could also reduce the storage needed.

I then booted the target into PE and performed steps similar to VHD imaging: I ran a script through Diskpart to create the partitions the same way, then rebooted so the disks would come up with the proper drive letters. Back in PE, I had to copy ImageX to my network deployment file share, as it is not included in a standard Windows PE boot image. It ships with the WAIK, and since my virtual server mentioned above is also a deployment server (it has the WAIK installed on it) I logged onto that and copied ImageX to the deployment share.
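The Diskpart script itself is nothing exotic; a sketch along these lines, run with diskpart /s (the partition sizes and labels are assumptions, not our actual layout):

rem partitions.txt – small system partition plus an OS partition
select disk 0
clean
create partition primary size=300
format fs=ntfs label="System" quick
active
create partition primary
format fs=ntfs label="Windows" quick
exit

I then ran ImageX to apply the image to the target with this command line: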

imagex /apply image_file image_number dest_path

  • image_file is the WIM file that contains the image
  • image_number is the number of the image in the WIM file (since WIMs can store multiple images). In this case 1
  • dest_path is where to apply the image to – in this case E:\
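In my case, with the image on the deployment share (mapped with net use) and the OS partition showing up as E: in PE, it came out along these lines (the server and share names are illustrative):

net use Z: \\deployserver\deployment
imagex /apply Z:\Images\win7-master.wim 1 E:\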

The apply process looked like it was only going to take about 15 minutes, so I found something else to do while it was working. This is quite quick; every other time I have used ImageX it seemed to take a lot longer, though maybe I was doing backup captures with a lot more data. In this case, incidentally, the image is stored on a network share, so ImageX is doing pretty well considering it has to pull the image over the network connection, although to be fair the server is physically about 2 metres away from the target. There is a bit more distance in copper and three switches involved, but all of those are in the same building and running at gigabit speed. As it happened the image was completely applied in 16 minutes, which is a pretty good achievement.

Once you have finished running ImageX, the next step is to run BCDBoot to designate your deployment partition as the one that gets booted. This comes about because Windows 7 (and Vista) use a separate system partition to start the computer, which then hands off to the operating system on its own partition (the one the image got loaded onto). The command here is pretty simple:

bcdboot windows_path /s system_path

where windows_path is the Windows directory in the OS partition and system_path is the drive letter of the system partition.
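With the drive letters from this walk-through – E: for the partition the image was applied to, and assuming the system partition is visible as, say, S: – that works out to:

bcdboot E:\Windows /s S: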

We already used this command for our native VHD deployments in a command script so it looks much the same.

Once this is complete, try rebooting to see if Windows will start up. I found that it indeed booted as expected. If your VHDs were built on that target platform already, it is likely they will have all of the required drivers incorporated, so you are unlikely to run into driver installation problems. As expected, Windows set the Windows partition to the C drive (which occurs irrespective of the drive letters that appear in Windows PE; in this case E:).

The likely scenario for us for laptop deployment is therefore to convert the final deployment image into a WIM and deploy it using a modified version of our Native VHD deployment process. We will keep the use of Native VHDs to two scenarios:

  • Directly imaging platforms where user data is not saved to the boot drive (such as student computers or other networked desktops)
  • Testing our laptop deployments only – the actual deployment will be physical.

I now have to decide whether to do a V2P on those laptops I sent out. Since we already partitioned the disks, that would be a relatively simple ImageX step, but I would have to make a backup of the VHD first. Both steps are relatively straightforward; the main issue is how big their VHDs now are, since we copied all of their data into them.

V2P added to VHD image development does require that extra step, so we hope that Microsoft will develop a version of ImageX that works directly with VHD files – at the moment ImageX only works with WIM files. This would eliminate the VHD-to-WIM conversion, saving time, removing the need to repeat the conversion each time the VHD changes, and saving the extra space needed to store the WIM files.

The overall lesson is not to be too bleeding edge, and to read all the documentation. If you were backing up your VHD regularly there wouldn’t be a problem, but people don’t do this. Microsoft really does talk about Native VHD as a test scenario; I haven’t seen them support it as a production scenario. It is mostly something that lets you service VHDs without special servicing tools (in other words, in scenarios where you can’t service them except by booting them up), and it is only supported on the Enterprise and Ultimate editions of Windows 7. We will continue to use it for both virtual and physical deployments as a useful image development system, but the actual deployment will be physical where necessary.

Data Recovery (and native VHD data integrity issues)

There are a couple of important considerations with the use of native VHD based imaging. These are particularly necessary to think about when you are creating an imaging system based around the native VHD boot capability.

Firstly, computer hibernation is not supported. Microsoft hasn’t given us any particular reason why this is the case; they just state that it is so. This could be an issue with laptop computers.

Secondly, and this is the big one, the whole of your C drive is stored in that single file. In theory, this increases the risk of data loss if for some reason that single file is affected by errors on the physical hard disk.

Let me expand a bit on the second scenario. From time to time we see laptops that have a damaged hard disk. Even if the computer can’t boot Windows, it is usually possible to recover most of the files on the disk simply by copying them. It is rare to lose all of the data on the disk; more often the situation is that only a few files are unreadable.

If you extrapolate this to the Native VHD scenario, then if all your data is stored in a single file and that file becomes unreadable, normal file-copying approaches will not work. You have to consider whether you might be able to recover only part of the file, and whether there are any data recovery tools or systems readily available that can retrieve files from a damaged or partially recovered VHD.

I have just about completed a deployment of laptops with Native VHDs and this scenario has somehow escaped me up to this point. It is very rare for me to have to deal with a hard disk that has failed to the point that data cannot be recovered off it, or that it is unreadable by Windows (or Windows PE). Usually, if there is a small number of bad sectors, I boot to PE and use Robocopy to copy as many files as possible onto a backup disk. If a few files were missed because of corruption we would look at recovery scenarios, but this has never happened.
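For reference, the sort of Robocopy salvage run I mean looks like this (the drive letters are whatever PE assigns; the retry switches stop it stalling for ages on bad sectors):

rem Copy everything we can reach, retrying each failed file only once
robocopy C:\ D:\Backup /e /r:1 /w:1 /log:D:\salvage.log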

If a VHD file could not be copied in its entirety, you would have to make use of more sophisticated data recovery tools. There are some good rescue CD options for this.

My idea for the moment is to take a look at V2P scenarios. I will post again later on once I have looked further into this scenario.

Monday 20 June 2011

Windows Thin PC Due For Release

If you have a subscription to the Volume Licensing Service Center, you may have noticed a MAK key for Windows Thin PC lately. WTPC is the replacement for Windows Fundamentals for Legacy PCs, and just as that was offered to NZ schools on the NZ MSA, it appears that WTPC will also be offered as part of SAB.

WTPC is a lite edition of Windows 7 and, according to the FAQs, it is based on Windows Embedded Standard 7. It will be downloadable through SA from 1 July. At this time it appears that it could support older PCs with 512 MB of RAM, used primarily as a terminal server client – basically an alternative to installing Windows XP on those old machines. However, we will have to see whether a machine that doesn’t have its own Windows 7 graphics driver can run WTPC successfully, because such machines lack the WDDM driver that is recommended.

We do have some older PCs in classrooms that aren’t able to run Windows 7 so I expect this will be tried out to see what we can do with them in due course.

Wednesday 15 June 2011

What works in Windows 7 imaging and what doesn’t

With all of the experience I have amassed in Windows 7 imaging this year, and with our experiences of different technologies, it is becoming abundantly clear what works and what doesn’t for the number of computers that we have.

  • What doesn’t:
    • MDT. Although this has a great deal of promise, it is too complex for an organisation of our size. To get the best out of MDT it is necessary to spend a lot of effort setting up and maintaining MDT itself, as well as scripting the installations for different platforms. This includes scripting the automatic installation of third party applications, which doesn’t always work and isn’t always well documented or supported.
    • Deploying images that have been sysprepped from installation on one particular hardware platform. Our experience has been that it only takes small changes in a target hardware configuration for Windows to refuse to complete installation from an image that has been sysprepped, regardless of which drivers are involved.

The second point is a particularly troubling one. Our experience with Windows 7 is that it can refuse to complete installation from an image if the hardware is slightly different, even where the affected drivers are not boot critical. This appears to be a fatal flaw in Windows 7 that should not even be an issue: if a device is not boot critical, it should simply be disabled and the installation continued, allowing the affected device to be set up at a later time.

The second point also has an impact on MDT, since MDT syspreps an image before capturing it. It may be that MDT’s driver injection process can work around this limitation, but not having been aware of it at the time, and not being a user of MDT now, I can’t say whether that is the case.

Along with the built-in limit on the number of times sysprep can be run on an installation, this has caused me to review our imaging strategy as below.

  • What does:
    • Native VHD imaging. A great technology that makes it easy to deploy and back up images on target platforms simply by file copying procedures.
    • Deploying pre-sysprep images (captured before sysprep has been run) to a target and sysprepping only that target as the final setup step. DISM is used only if there is a problem with a boot-critical driver.

We have basically proved with our imaging experience to date that a pre-sysprep image can be deployed to a relatively wide range of hardware. If such an image has a boot-critical problem, it is a relatively easy process to service it with DISM and inject the necessary driver to get it to boot on the target. Pre-sysprep images do not have the same problem with missing or incorrect drivers that sysprepped images do, unless the driver is boot critical, such as for a hard disk or disk interface.
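The DISM step is ordinary offline servicing; a sketch, assuming the VHD has been attached (for example via Diskpart, as earlier) and come up as V:, with the driver INFs in a folder on D: (both paths are illustrative):

rem Inject every driver found under the folder into the offline image
dism /Image:V:\ /Add-Driver /Driver:D:\Drivers\Storage /Recurse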

Because images are only sysprepped as the final deployment step on each computer, there should be no problem with drivers, provided the pre-sysprep image contains all the correct drivers for the target.

There will be fewer generations of images to store and thus a saving in disk storage space.

There will also be no need for as many different images for different hardware platforms, especially those that are not used very often.

Thursday 2 June 2011

VHD Resize [2]

A couple of posts back I wrote about getting a native VHD down to a smaller size to use on a system with a smaller HDD than the one it was originally built on. That time I just used a standard defrag along with shrinking in Disk Management. This time around that wasn’t going to get the VHD small enough to fit on an 80 GB HDD, so I had to try another option. Helpfully, Defrag puts information into the event log about immovable files. Using that, I worked out I needed to boot the VHD and turn off System Restore in the OS, then mount the VHD in another OS, defrag it again, and shrink again. This time, success!!!! The VHD partition shrunk massively, to 20 GB. After considering the further options, I then expanded it again, up to 50 GB.
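Incidentally, the shrink doesn’t have to be done through Disk Management; once the VHD is attached, Diskpart can do it from a script (the path, volume letter and size here are illustrative):

diskpart
select vdisk file=D:\Images\win7.vhd
attach vdisk
rem Find the volume letter with list volume, then shrink it (desired is in MB)
list volume
select volume V
shrink desired=77000
detach vdisk
exit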

After doing this you still need to change the VHD itself from a dynamic 127 GB file (in this case) to something smaller. Even though it now only contains a 50 GB partition, Windows 7 will still try to expand it to the full 127 GB at startup, which of course fails. So VHDResizer is the next step, to get it down to, in this case, a dynamic 50 GB file.

There are still a few things I don’t understand, like why a VHD that physically only needs 32 GB of disk space can’t be shrunk in partition size below 72 GB. I presume that the dynamic format compresses zeros or blank space, but if it can manage that then it should be possible to defrag that blank space as well; instead we get the fiction that the unmovable file can’t be shifted, even though we know it is possible with third party tools.

The next little hassle with the target was getting it to boot Windows PE. When I fed it the pen drive, it spat it out, so I had to spend a lot of time creating a Windows PE 3.0 boot CD, and that is like chalk and cheese compared to the pen drive: it displays the old Vista progress bar instead of the Windows 7 startup animation, and takes a lot longer, with much more blank screen, to get to the command prompt window. From there, after the standard initialisation of the disk and copying of the VHD, it was a surprisingly short step to booting up on the VHD and getting this unsysprepped image to start.
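For completeness, the PE-side steps after copying the VHD are the usual native VHD boot setup; a sketch, assuming the VHD landed at C:\win7.vhd and shows up as V: once attached:

diskpart
select vdisk file=C:\win7.vhd
attach vdisk
exit

rem Point the boot files on the system drive at the Windows installation inside the VHD
bcdboot V:\Windows /s C: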