Thursday 25 November 2010

Using Network Access Protection with Remote Desktop Gateway [1]

Among the new functionality in Windows Server 2008 R2 are the Remote Desktop Gateway (it acquired this name in R2, having been introduced as Terminal Services Gateway in Server 2008) and enhancements to Network Access Protection. NAP existed in previous editions of Server but was mainly concerned with enforcing protection against remotely connected clients, whereas the current version can also apply measures to computers on the local network. For the purposes of this post I am concerned with NAP in the context of Remote Desktop Services, since RD Gateway is used mainly in the specific context of a user logging in remotely to a network. Part of the enhancements concern the specific tasks of verifying and enforcing system health checks. The System Health Agent (SHA) functionality consists of components that run on a client system and report back to the NAP server the results of specific health checks (for example, the status of any antivirus software that is installed). The NAP server can then decide what kind of access to grant or deny depending on those results and on whatever constraints are applied, such as enforced logon hours for the connection.

Getting NAP working properly is a matter of setting up the server components, configuring the client computers’ NAP agents, and testing. Windows Server provides consoles for the server side and it is a matter of following some fairly straightforward steps. Then comes the client testing. Without any client configuration I assumed it would work out of the box, but the server reported my client as non-NAP capable. One of the first steps is to open the Windows Action Center and look at the NAP status recorded under the Security category. This told me that the NAP agent service was not running, so I went to Services, configured it to start automatically and started it. The next step was to find the NAP support forum on Technet, where a post gave me instructions on carrying out some diagnostic checks by opening a command prompt and running the command netsh nap client show state. This told me, among other data, that

Id                     = 79621
Name                   = RD Gateway Quarantine Enforcement Client
Description            = Provides RD Gateway enforcement for NAP
Version                = 1.0
Vendor name            = Microsoft Corporation
Registration date      =
Initialized            = No

Further testing and checking suggested the next step was to run the client configuration console on my computer by running NAPCLCFG.MSC. Alternatively, the same post explains how to run commands at a command prompt to change that Initialized status to Yes (along the lines of the sketch below). Although this worked, the next logon did not result in a remediation of this status and the client is still non-NAP capable according to the server. At this stage I have to leave my checking and go to other work, so I will continue with this process over the weekend.
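For reference, here is the command-prompt route as a minimal sketch, assuming the napagent service name and taking the enforcement client ID from the show state output above:

  rem set the NAP agent service to start automatically, then start it
  sc config napagent start= auto
  net start napagent
  rem enable the RD Gateway quarantine enforcement client (ID as reported above)
  netsh nap client set enforcement ID = 79621 ADMIN = "ENABLE"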

Saturday 20 November 2010

Excel import limitations in Access and Word

Earlier this week I wrote about the frustration of transferring Excel data to Access with flaky ODBC connections or drivers. I’m not sure that was exactly the cause, but a lot of error conditions were experienced trying to manipulate data through linked tables that were Excel spreadsheets. Eventually I got the data transferred using VBA, but it came home to us this week, in transferring data from Excel to Word (using mail merge) for our school reports, that there is a 255 character limit on some text fields. Excel’s import driver checks the first 8 rows of a spreadsheet to try to determine the type of each column; if the strings it finds there are all 255 characters or fewer, the column is typed as text and any longer values further down are truncated to 255 characters. From Access we know that to get more than 255 characters of text you have to use a memo field. If you use VBA code to automatically import a spreadsheet to a new table, the data type of each column is set automatically on the same basis (text for strings up to 255 characters, memo for more than 255 characters). This can lead to data loss.
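To illustrate the automatic path, here is a minimal VBA sketch with hypothetical table and file names; TransferSpreadsheet guesses each column’s type from the leading rows, which is where the truncation risk comes from:

  Sub ImportReportData()
      ' Import a worksheet into a new table. Column types are guessed
      ' from the first few rows, so a long text column can be created
      ' as Text (255) and its longer values silently truncated.
      DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
          "tblReportData", "C:\Data\Reports.xlsx", True
  End Sub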

On the other hand, if you manually import the spreadsheet from Excel into Access, you get to specify the data type for each field as part of the wizard and can choose Memo for the fields which have more than 255 characters of text in them. The data is then imported according to the specified settings and is fully preserved. However, I find it annoying that this can only be guaranteed to work if the manual import process is followed every time. If you create a table, change its field definitions, and then use VBA code to import into that table, Access will not honour the field data types; it just guesses them again, which is pretty stupid given that the user has no other way of programmatically specifying the data type for each field.

There is a registry value called TypeGuessRows which is supposed to control this behaviour in earlier versions of Office. It looks like support for it has been dropped in Office 2010, because changing the setting had no effect on the behaviour in either Word or Access.
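For completeness, these are the locations where I understand the setting lives; the first is the classic Jet path and the second the Office 2010 (ACE) path. Setting the value to 0 is supposed to make the driver scan all rows instead of the first 8:

  reg query "HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel" /v TypeGuessRows
  reg query "HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel" /v TypeGuessRows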

I would like to say I am extremely annoyed at the way this functionality has been sprung on Office users through successive layers of arbitrary design and coding decisions taken at Microsoft. There are likely a considerable number of people doing mail merges from Excel sheets with text fields longer than 255 characters who will be negatively impacted by this design limitation. In previous versions of Office the registry setting mentioned above provided a means of working around the problem, but its apparent discontinuation in Office 2010 makes it impossible to guarantee a successful import or merge unless a clumsy workaround is used (put a row at the top of your spreadsheet, preferably the 2nd row, that has more than 255 characters in each text field). Furthermore, when using VBA code to import a spreadsheet directly into Access there is again no surety. It would be acceptable if there were a means of supplying an import specification, as can be done with the TransferText method, but this is not possible when importing a spreadsheet because that option simply isn’t available. Only a manual import, setting each field type at import time, guarantees a successful transfer of all long text fields. There may be ways to work around this in VBA with methods other than DoCmd, but those seem to rely on the same drivers in which the problems exist.

My advice therefore is to use one of the following:

  • Migrate to MS Access (which has a lot better functionality as a database than Excel)
  • Migrate to some other database if you can so you don’t have to use any part of Office
  • Try the second-row workaround for current worksheets you need to import into Access or merge into Word (a sketch follows this list)
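A minimal VBA sketch of that second-row workaround, with the worksheet range being hypothetical; it inserts a dummy row 2 of over-255-character strings so the type guessing picks Memo, and the row can be deleted once the import or merge is done:

  Sub AddPaddingRow()
      ' Insert a dummy second row whose text cells exceed 255 characters,
      ' forcing the import driver to treat those columns as Memo.
      Rows(2).Insert
      Range("A2:D2").Value = String(300, "x")  ' adjust columns to suit
  End Sub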

Wednesday 17 November 2010

Native boot VHDs and MDT [4]

This has taken a break because I have been very busy with other things. However, I had a look at it at home last night and used Wim2VHD to make a VHD containing Windows 7 Enterprise x86. I then copied it to a portable HDD and took it to work this morning. I think the main error I made earlier is that the system partition on the computer has to be set as Active (i.e. the boot partition). This makes sense, as it is what the BIOS boots, and it then runs the bootloader to get Windows started; “active” is a very poorly chosen name for this function. Once I made that change with Diskpart, BCDEdit suddenly started working and I could see that it was configured to boot the VHD. Once the VHD had been copied to the computer and it was restarted, it immediately booted the VHD, and after going through the usual first-time steps it is now fully functioning. I noticed that a D drive is attached, and this is actually the physical hard disk with the VHD file present on it. Since we can’t remove drive D, we would look to limit access to it for ordinary users and provide them with another VHD as drive E for any scratch space.
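For the record, marking the system partition active in Diskpart looks like this, assuming the physical disk is disk 0 and the system partition is partition 1:

  diskpart
  DISKPART> select disk 0
  DISKPART> select partition 1
  DISKPART> active
  DISKPART> exit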

I have also tried VHD Attach at home and it works well for attaching files as drives after the computer has booted up, so that should be OK. The next thing is to work out how to add drivers to an image that has been sysprepped so it can automatically load them when the computer starts with the new VHD image. There are a couple of ideas out of all that I have looked at: test whether MDT can do this as part of a sysprep task, or use DISM to service the image by adding drivers to it. I think MDT uses DISM to perform the injection anyway. So the next part of this series will hopefully wrap up these options.

It turns out from examining the documentation that DISM is very easy to use with a VHD file. If you have the VHD mapped to a drive letter you just have to open a command prompt and type a few simple commands. If I have my platform drivers all nicely imported into MDT I can just pass the root folder to DISM and tell it to add all the drivers it finds there recursively, for example (if my VHD is mapped to drive letter V and my MDT share is on drive S):

  • DISM /image:V:\ /Add-Driver /driver:S:\drivers /recurse
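Expanded into the full attach/service/detach cycle, and with my file path assumed, the servicing pass would look something like this:

  diskpart
  DISKPART> select vdisk file=C:\vhds\win7ent.vhd
  DISKPART> attach vdisk
  DISKPART> exit
  rem (if the volume doesn't mount automatically, assign letter V inside Diskpart)
  DISM /image:V:\ /Add-Driver /driver:S:\drivers /recurse
  diskpart
  DISKPART> select vdisk file=C:\vhds\win7ent.vhd
  DISKPART> detach vdisk
  DISKPART> exit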

Then, in totality, my deployment process may well end up looking something like the following:

  • Develop the image on a VHD using a virtual machine
  • Run MDT sysprep only task to sysprep the VHD
  • Run DISM as noted above
  • Copy the VHD for deployment.

Ideally a future release of MDT would automate these steps in a task sequence.

The steps to set up a physical machine would be different, and I would like to see whether they can be scripted in some form; a sketch of the BCD part follows.
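Most of those steps are BCD manipulation, so they should script readily; a minimal sketch, assuming the VHD has been copied to C:\vhds\win7ent.vhd on the target machine (bcdedit /copy prints a GUID which is then substituted for {guid} in the following lines):

  bcdedit /copy {current} /d "Windows 7 Ent (VHD boot)"
  bcdedit /set {guid} device vhd=[C:]\vhds\win7ent.vhd
  bcdedit /set {guid} osdevice vhd=[C:]\vhds\win7ent.vhd
  bcdedit /set {guid} detecthal on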

NZ Open Source Society Praises Microsoft

Remember… you heard it here first…

http://www.interfacemagazine.co.nz/articles.cfm?c_id=&id=791

http://wikieducator.org/Microsoft_Launches_Open_Source_Filter_for_Mediawiki

Peter Harrison, Vice President of the New Zealand Open Source Society, commends the release.


The Internet provides humanity with an unequalled opportunity to leverage our communication technology to educate people across the globe. Through collaborative technologies such as Wiki people can work together to create rich common resources that are open to all. By enabling users to export their content from Word into MediaWiki Microsoft are encouraging the availability of a far wider range of educational resources online.

—Peter Harrison, New Zealand Open Source Society

Sunday 14 November 2010

MS Access and ODBC Drivers and “useful features” in Access 2007/2010

I have used MS Access for 15 years to do various types of reporting and it is my de facto tool of choice whenever the input data looks like some form of database and the output is a report, especially one that has multiple pages, requires calculations, or joins data together from multiple tables.

One thing I have become aware of over the years is that there are numerous ODBC drivers about that don’t like serving a linked live view of their data to queries that join multiple tables. I can’t quite explain it, but it is not uncommon in a production situation, when linking data via an ODBC connection, to get error messages or missing data.

The three situations I have seen this in are:

  • A holiday job where I had to write reports to extract data out of an accounting system and display it.
  • Integris, customised reporting from data in the SMS (student management system)
  • Excel, linking worksheets into Access for reporting

In all three cases, working live with the source data through an ODBC linked table seems to be the problem. The solution in all cases is to add an intermediate step whereby the data is imported into a local table in the database, and that table is then used to feed the reports. This has proved necessary so many times that it should almost be considered routine.
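A minimal sketch of that staging step in VBA, with hypothetical table names; it drops any old copy of the staging table and rebuilds it from the linked source, which also sidesteps the make-table error discussed next:

  Sub StageLinkedData()
      ' Drop the previous staging table if present, then rebuild it from
      ' the linked (ODBC/Excel) source so reports never query the link
      ' directly.
      On Error Resume Next
      DoCmd.DeleteObject acTable, "tblStaging"
      On Error GoTo 0
      DoCmd.SetWarnings False
      DoCmd.RunSQL "SELECT * INTO tblStaging FROM tblLinkedSource;"
      DoCmd.SetWarnings True
  End Sub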

The second part of this article is that I found, in the process of running make-table queries to import the data, I would get an error that I was violating the range of values that could be put into a table. For a make-table query that doesn’t make sense: the table is being created from scratch to contain the data. It couldn’t make sense unless Access somehow thinks it knows what type of data is supposed to be in each field of your new table, perhaps by looking at the previous version of the table, the one you have just agreed to delete before the query runs. But the whole point of a make-table query is that it creates the table from scratch; if I wanted to update the records of an existing table I’d code the query as that. Access asks you to confirm deleting the existing table and then gives you the error message anyway. If you delete the table manually and then run the query, you don’t get the error.

This unfortunate design “feature” of Access 2007 and Access 2010 cannot be turned off, and I’ve read that the problem didn’t exist in previous versions of Access. The only change I was making was to increase the length of a text field and put more data in it. The default text field length is something like 255 if I remember rightly, but Access must have somehow locked that field down to its previous exact length, and when the length changed it threw the error. Since Access has already deleted the table and is creating it from scratch, that shouldn’t be a problem at all. Note that here we are importing data from linked Excel sheets.

Further to the above: the whole business of Access importing or linking from Excel seems to be a complete crock. It’s a very sad situation when data can’t be linked from a product made by the same company without problems. Actually, there are problems like this all across different MS products; the whole linking/embedding story seems to be oversold. I had to make even more changes to the report I was working on to get around all the issues, and now I’m wondering whether it is really ODBC that is at fault, or Access itself.

Wednesday 10 November 2010

Native boot VHDs and MDT [3]

At the moment I am trying out the walkthrough for applying a sysprepped image directly to a VHD file that I created with Diskpart. Diskpart can create and attach VHD files as well as manage ordinary hard disks, and it can mount a VHD to a drive letter.
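Creating and mounting a VHD with Diskpart goes something like this; the path, size (in MB) and drive letter are just examples:

  diskpart
  DISKPART> create vdisk file=C:\vhds\win7ent.vhd maximum=20480 type=expandable
  DISKPART> select vdisk file=C:\vhds\win7ent.vhd
  DISKPART> attach vdisk
  DISKPART> create partition primary
  DISKPART> format fs=ntfs quick
  DISKPART> assign letter=V
  DISKPART> exit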

As noted in previous discussion, while this gives me an image that has been sysprepped, what it doesn’t have is drivers injected for a target platform, so I don’t know what will happen when it is booted on a specific platform. The debate so far has been over ways to inject those drivers; if my walkthrough works I will move on to trying out the different forms of driver injection.

The first problem with the walkthrough was that the PC is too old to run x64, so I set up a VHD for x86 instead; because I didn’t have a captured image for Enterprise x86, I used ImageX to apply install.wim directly from the x86 Enterprise ISO. Note that you need the Windows 7 version of ImageX; the version from WinPE 2.0 (Vista) does not support vdisk (VHD) operations. To run this I used an MDT LiteTouch boot CD in command prompt mode. The rest went smoothly enough until reboot, when I got “Bootmgr is missing”. The problem seems to be that the instructions in the walkthrough are wrong in some way. The VHD was assigned drive letter C, but on reboot that letter had been reassigned to the computer’s physical disk, and there seems to be no way to make C stay assigned to the VHD. When I run the recovery console and open a command prompt, changing to drive C and running dir shows my VHD file rather than the Windows directories that are inside the VHD image. I can use Diskpart to assign a different letter to the physical disk and assign C to the VHD, which is where the Windows files are, and on exiting Diskpart drive C looks like a normal Windows boot disk; then BCDBoot can be used to set up the boot. I tried this a few different ways, for example assigning drive letter F to the VHD, after which it should have booted, but the letter F was lost on reboot and it wouldn’t boot at all.
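For context, the apply and boot-setup steps reduce to something like the following, with my drive letters assumed (V is the attached VHD volume, D the DVD/ISO source, C the physical system partition):

  imagex /apply D:\sources\install.wim 1 V:\
  bcdboot V:\Windows /s C: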

On the other hand, I did find this posting, which told me about a useful tool called WIM2VHD, produced by Microsoft, that will make a VHD easily from a Windows install DVD; it appears to automate some of the manual steps listed in the walkthrough. Another post here gives more useful info. Right now, though, I am stalled until Technet can give me a way to make the drive letter assignment stick. Here is some more info on how to sysprep a VHD so that it can be copied to another machine; I plan to sysprep using MDT just to simplify things, especially if we can automate driver injection in MDT.
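As I understand its usage, the basic WIM2VHD invocation is along these lines, with the WIM path and SKU as examples:

  cscript WIM2VHD.wsf /wim:D:\sources\install.wim /sku:ENTERPRISE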

Since first writing the above, it has been determined that the limitation is built into Windows 7: attached VHDs are not automatically reattached at boot time unless they are required to boot an OS. I guess this could be because of the performance effect on startup, but I think Windows should provide support for turning automatic reattachment on, as I can see an attractive scenario in having a data VHD as well as an OS VHD in place of physical disk partitions. Fortunately, others have recognised this need and have addressed it with software and scripts (here’s another post suggesting how to set it up as a scheduled task). My preference would be the free software, if it is indeed free, followed by the PowerShell script (which automatically scans and attaches all VHDs in a folder). This puts pressure on MS to include the feature in subsequent editions of Windows.
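A minimal sketch of the scheduled-task approach, with paths assumed: a two-line Diskpart script reattaches the data VHD, and a startup task runs it as SYSTEM.

  rem contents of C:\vhds\attach-data.txt
  select vdisk file=C:\vhds\data.vhd
  attach vdisk

  rem register the task to run at boot
  schtasks /create /tn "Attach data VHD" /tr "diskpart /s C:\vhds\attach-data.txt" /sc onstart /ru SYSTEM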

I am now working on a new VHD build at home in my free time; the first step is to upgrade my PC from Professional to Enterprise. I also commend the addition of various useful features to recent editions of 7-Zip, the free archiver. The latest release, 4.65, can open and extract ISO and WIM files, and the development beta 9.14 can open and extract VHD files. So I didn’t have to buy software to get files out of an ISO, and you no longer have to mount a WIM with ImageX to get files out of it. The VHD functionality is mainly of benefit on older versions of Windows that don’t incorporate VHD attachment (i.e. versions prior to 7).

Thursday 4 November 2010

Native boot VHDs and MDT [2]

Continuing on from yesterday’s post, I did some thinking about this overnight and have made enquiries on Technet as well. I think the best scenario for creating your own native boot VHDs using MDT is to deploy an OS to a VHD using a standard deployment sequence and then run a sysprep-only task sequence on it from MDT, using the LiteTouch script in the deployment share. What is particularly needed is to be able to provide or inject custom drivers before sysprepping, rather than in a setup task, because we aren’t going to deploy using an MDT deployment task sequence. The basic plan is to sysprep the OS installation in the VHD and then deploy the VHD to platforms just by copying it; the platform is then native-booted from the VHD, runs through the normal first-time startup sequence and finalises installation.

One possibility for the driver injection is to see whether it can be added to a sysprep task sequence (it has to happen before the sysprep task executes). When you look at the standard Sysprep and Capture task sequence in MDT you will see a task that adds mass storage drivers to sysprep.inf for Windows XP and 2003. For Windows 7, however, I presume the correct approach is the Inject Drivers task; the questions are whether this would actually work, and exactly where you would put it.

The second option is to use the MDT DriverPaths variable or the DevicePath registry key. There are a few relevant posts around covering both approaches; a sketch of the registry method follows.
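DevicePath lives under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion and lists the folders that Plug and Play searches for drivers at installation time. As a sketch (written as a batch file, hence the doubled % signs), the SOFTWARE hive inside the attached VHD could be edited offline, assuming the VHD is mounted as V:, the drivers are copied to \Drivers on the VHD, and the hive alias OfflineSoft is arbitrary:

  rem load the offline SOFTWARE hive from the mounted VHD
  reg load HKLM\OfflineSoft V:\Windows\System32\config\SOFTWARE
  reg add "HKLM\OfflineSoft\Microsoft\Windows\CurrentVersion" /v DevicePath /t REG_EXPAND_SZ /d "%%SystemRoot%%\inf;%%SystemDrive%%\Drivers" /f
  reg unload HKLM\OfflineSoft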

Obviously my preference would be to leverage MDT and get the drivers injected via a step in a sysprep task sequence, so I am trying to find out whether this is possible. It should be, because my experience with Sysprep is that the unattend.xml file contains the information for all of the Sysprep passes, of which the relevant ones here are the specialize and offline servicing passes. Hopefully I do not have to hand-edit the XML file and run Sysprep by hand; it should be possible either to use an Inject Drivers task or some means of customising unattend.xml before Sysprep executes, and hopefully we don’t run into a lack of drivers.

There are a number of manual tasks involved in getting the VHD bootable on a target system, so I’d be looking for some means of automating these to an extent, and buying a 16 GB or larger pen drive for the copying; otherwise I can use an external HDD.

Wednesday 3 November 2010

Native boot VHDs and MDT [1]

In my previous post I wrote about the development of the native boot VHD capability in Windows 7. As I noted, the essence of this functionality is that a VHD can be copied to a destination computer’s hard disk and attached as a boot device, and Windows can then be booted from the VHD file. There are considerable advantages for small-scale imaging without a full WDS environment such as we ran in the past (about 4-5 years ago, when I had more free time, I invested considerable effort in setting up a Remote Install server and learning how to use RIS; the server did get upgraded to WDS, but since then, with Windows XP, we have simply used Ghost for a very occasional re-image).

My current plan is to have a look at a deployment walkthrough on Technet. However, the next idea to come along, pretty quickly, is whether MDT can help with creating a generalised VHD for native boot. So far I have found info on customising an MDT deployment task sequence to deploy to a VHD file on a physical computer. This doesn’t really address what I had in mind. What I want is a generalised VHD file that can simply be copied to target computers and that applies the steps of the MDT deployment sequence itself, i.e. when the machine is booted from the VHD for the first time the operating system is deployed as if a user had selected a particular deployment task through the wizard. Generalised is important because the image has to be sysprepped for deployment to multiple machines, and the MDT deployment task sequence can do useful things like injecting drivers and installing software, allowing MDT’s customisation capabilities to be fully leveraged. The scenario above basically creates an MDT deployment task sequence that deploys the OS to a blank VHD on the target system instead of a physical hard disk; that is a good step in the right direction, but I am looking for a bit more than that.

MDT does have support for using media to create bootable OS installs; for example, you could deploy a task sequence from an external hard drive. What I am looking for beyond that is simply being able to copy a VHD to a computer, attach it as a boot device as described, and then boot the computer so that it runs a task sequence saved on the VHD, applies an image and the deployment steps from the VHD, and deploys the OS, possibly to another VHD on the target platform. It may be that going through the media deployment steps can create the type of installation I am seeking, although the best scenario by far would be to automate the deployment wizard steps, for which there are already means provided.

Backup imaging, VHD native boot, network management & remote desktop management

A while back I wrote about my experiments in using MDT to back up laptops. Some of my MDT shares were moved to a new server recently and the backup share had to be set up again from scratch, which for some reason proved exceptionally difficult, but it got there eventually. While I was working on that, I realised there is another option for imaging a laptop: the Disk2VHD tool from Mark Russinovich. This uses the Volume Shadow Copy Service to image a live disk (you run it on the computer while Windows is running) and turns it into a VHD file. The great thing about a VHD file is that you can simply hook it onto Virtual PC or, in this case, a new virtual machine that I created in Hyper-V; the virtual disk worked when I booted the VM from it, so that particular laptop can be brought up in a virtual machine with all of the software and files the user had. Although in most cases the user does not need further access to their old data (we always transfer everything across when they change laptops), this is another option for special cases where they may have an old application, and there is even the possibility of putting the VHD onto their new laptop with Windows Virtual PC. Take note that there is a setting in Disk2VHD that is required when you create a VHD for Virtual PC, because otherwise the VHD won’t boot even in Windows 7 VPC. The next idea may be to add VPC to our standard laptop image.

I’ve noticed that there is a capability in Windows 7 to boot the operating system from a VHD, which can be deployed simply by copying the VHD file to the hard disk of the target computer. This appears worthy of further investigation, since it removes the need to use, say, WDS for a mass deployment of an image to multiple computers. If you can install an OS simply by copying a VHD file containing an image you have built on a virtual machine with MDT deployment, then upgrades to student PCs would be much simpler than the standard WDS-type deployment scenario, if you don’t want to install a WDS server. The idea is that you build your reference image on a VHD using a virtual machine so that drivers for that platform are injected into the image as it is built, and presumably when the VHD is deployed onto the target machine the drivers for that platform can be installed when the hardware is detected on first boot. Native VHD boot requires Windows 7 Enterprise (or Ultimate), and Enterprise is part of Software Assurance in the MS Schools Agreement. I am planning to have a look at this scenario, seeing if I can use one of my existing MDT deployment sequences to deploy an OS to a new virtual machine and then capture it using the MDT Sysprep and Capture task into a WIM file that can be turned into a VHD as described in the walkthrough. Hopefully MDT will be updated to simplify the steps needed.

Network management takes many forms, but one consideration is how to track the traffic moving through different physical branches of the network, and the SNMP protocols are designed for this. SNMP can be used for many things these days, but part of its fundamental functionality is monitoring network hardware such as switches and logging the utilisation of the network through individual ports. SNMP is not a technology I have had knowledge of before now; however, there are a number of software packages that can collect data over SNMP, and we are currently playing with the free edition of ManageEngine OpManager, which allows 10 interfaces to be tracked. I am using it to see what is happening on my wireless network, and hopefully the two managed switches on the backbone of our network will be able to report the traffic through each port, which will help us determine where congestion might be occurring.

If you have any number of desktops to manage, or at least to monitor to keep an eye on what is happening on them, it is advantageous to have remote control software. In the educational environment there are a number of packages whose most common features include controlling all remote computers, putting an image of one computer onto all of them, locking keyboards and mice, and watching what users are doing. I have looked at these packages for years, and while the capabilities and functional levels differ, they are also valuable for IT administration, saving visits to every remote computer and enabling useful functions like simultaneous logon and software installation. The cost also varies widely; for our number of computers there would seem to be little benefit in spending $100 per computer when there are packages with a flat licence of a few hundred dollars for any number of clients.