Wednesday 22 December 2010

Using Network Access Protection with Remote Desktop Gateway [2]

It is almost a month since I wrote about this topic. I had been able to defer the RD Gateway setup I was previously working on, but today I had to finish it without completing the NAP client setup. This means that for the moment all clients connecting to our server will be treated as non-NAP capable and the system health checks will not be performed on them. However, in the course of today's work I stumbled across these useful articles on MSDN:

The articles cover things I was not aware of until now, namely that there is client-side work to do: the RD Gateway server's certificate has to be installed on the client, and the RD Gateway server has to be added to the client's Trusted Servers list. The second article provides a link to a script to help configure the client. There is also a lot of information there about how to test for proper operation of the NAP client, the System Health Validators (SHVs) and so on. Although I am of course very busy at the moment with just a couple of days of the work year to go, I will have a look at this in more detail over the break when I am actually on leave.

We are about to take delivery of 35 brand new computers, which is a substantial order for us; they are all built locally by one of NZ's top educational-market computer companies. Installing these will be our major project over the summer holiday break, with only minor maintenance work being carried out on the servers. The majority of the computers will probably be configured to use native VHD boot with Windows 7.

We are also about to set up a proper backup system using removable hard drives and commercial-grade backup software, replacing our current use of spare server capacity and scripts. This is important because once we can link our sites by fibre we will no longer need duplicate servers at the two sites, and the main file servers can be consolidated into one.

Monday 13 December 2010

End of year Catchup

By now it will be apparent I have not posted any work related content here for a good while. The reason is mostly that we are hideously busy with end of year stuff.

Here is a useful code snippet: how to get a ping log with neat output, with the time and date displayed. It is great when you are having problems with your internet connectivity and need a continuous log of pings against time and date.

@echo off
rem Change 1.2.3.4 to the host you want to monitor
:top
rem Do a single ping and keep the last line of its output in Pingresult
for /F "tokens=*" %%i in ('ping 1.2.3.4 -n 1') do set Pingresult=%%i
rem Append a timestamped line to the log file and echo the same line to the screen
echo %time% %date% %Pingresult% >>pingrg.txt
echo %time% %date% %Pingresult%
goto top

Basically this tells the Windows shell command interpreter to:

  • Do a single ping and store the last line of its output in an environment variable called Pingresult
  • Send one line of text to the end of a text file
  • Send the same line of text to the screen.
  • Repeat infinitely.

Now for more substance. We are working through various things including buying lots of new computers so I expect that some of the process of deploying a large number of new Windows 7 computers for students will be written about here within the holiday period that is starting soon.

Sunday 5 December 2010

Slingshot complaints feature in the Herald

Around two years ago I wrote about my trials in changing ISP from Telecom to Slingshot. The experience was bad enough that I ditched Slingshot and changed to TelstraClear and haven't looked back; in fact I wondered why I hadn't gone to TelstraClear in the first place. Today there's an article in the Herald on Sunday about Slingshot. I don't know if the experiences I had were similar to those mentioned. However I was fortunate in that Slingshot gave me a refund of charges paid in advance and waived the early termination fee; I only had to send the modem back to them.

One thing to be wary of is that Slingshot is not a member of the Telecommunications Dispute Resolution (TDR) scheme. Most major carriers are, and it gives you an avenue to follow if you feel a complaint has not been adequately resolved by the telco.

Thursday 25 November 2010

Using Network Access Protection with Remote Desktop Gateway [1]

Among the new functionality in Windows Server 2008 is the Remote Desktop Gateway (it acquired this name in Server 2008 R2, having been introduced as Terminal Services Gateway in Server 2008) along with Network Access Protection enhancements. The latter technology existed in previous editions of Server but was mainly concerned with enforcing protection against remotely connected clients, whereas the current version can also apply measures to computers on a local network. For the purposes of this post I am concerning myself with NAP in the context of Remote Desktop Services, where RD Gateway is used mainly in the specific scenario of a user logging in remotely to a network. Part of the enhancements is concerned with the specific tasks of verifying and enforcing system health checks. The System Health Agent (SHA) functionality consists of components that run on a client system and report back to the NAP server on the results of specific system health checks (for example the status of any antivirus software that is installed). The NAP server can then decide what kind of access to grant or deny depending on the results of the health checks and what constraints are applied, such as enforcing logon hours for the connection.

Getting NAP working properly is a matter of setting up the server components, configuring the client computers' NAP agents and testing. Windows Server provides consoles for the server components and it is a matter of following some fairly straightforward steps. Then comes the client testing. Without any client configuration I assumed it would work out of the box, but it turned out that the server was reporting my client as non-NAP capable. One of the first steps is to open the Windows Action Center and look at the NAP status recorded under the Security category. This told me that the NAP Agent service was not running, so I went to Services, configured it to start automatically and started it. The next step was to find the NAP support forum on Technet, where a post gave me instructions on how to carry out some diagnostic checks by opening a command prompt and running the command netsh nap client show state. This told me, among other things, that:

Id                     = 79621
Name                   = RD Gateway Quarantine Enforcement Client
Description            = Provides RD Gateway enforcement for NAP
Version                = 1.0
Vendor name            = Microsoft Corporation
Registration date      =
Initialized            = No

Further testing and checking suggested the next step is to run the client configuration console on my computer, which is done by running NAPCLCFG.MSC. Alternatively, the same post explains how to run commands at a command prompt to change that Initialized status to Yes. Although I did this, the next logon did not remediate the status and the client is still non-NAP capable according to the server. At this stage I have to leave my checking and go on to other work, so I will continue with this process over the weekend.
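For my own reference, the client-side commands boil down to something like this (a sketch of what the forum post describes, using the enforcement client ID reported by netsh above):

rem Make the NAP Agent service start automatically and start it now
sc config napagent start= auto
net start napagent

rem Enable the RD Gateway Quarantine Enforcement Client (ID 79621 as reported
rem by "netsh nap client show state")
netsh nap client set enforcement id = 79621 admin = "enable"

rem Check that the enforcement client now shows Initialized = Yes
netsh nap client show state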

Saturday 20 November 2010

Excel import limitations in Access and Word

Earlier this week I wrote about the frustration of transferring Excel data to Access with flaky ODBC connections or drivers. I'm not sure that was exactly the cause, but a lot of error conditions were experienced trying to manipulate data through linked tables which were Excel spreadsheets. Eventually I got the data transferred using VBA, but it came home to us this week, in transferring data from Excel to Word (using mail merge) for our school reports, that there is a 255 character limit on some text fields. The Excel driver checks the first 8 rows of a spreadsheet to try to determine the type of each field, and if the strings it sees there are all 255 characters or less, the column is treated as ordinary text and longer values further down are cut to that length. From Access we know that to get more than 255 characters of text you have to use a Memo field. If you use VBA code to automatically import a spreadsheet into a new table, the data type of each column is set automatically on the same basis (Text for strings up to 255 characters, Memo for more than 255 characters). This can lead to data loss.

On the other hand, if you manually import the spreadsheet from Excel into Access you get to specify the data type for each field as part of the wizard, and can then choose Memo for the fields which have more than 255 characters of text in them. The data is then imported according to the specified settings and is fully preserved. However, I find it annoying that this can only be guaranteed to work if the manual import process is followed each time. If you create a table, change its field definitions and then use VBA code to import into that table, it will not honour the field data types; it just makes them up again, which is pretty stupid because there is no other way of programmatically specifying the data type for each field.

There is a registry value called TypeGuessRows which is supposed to control this row-sampling behaviour in earlier versions of Office. It looks like support for it has been dropped in Office 2010, because changing the setting had no effect on the behaviour in either Word or Access.
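For anyone who wants to try it anyway, my understanding is that for the Office 2010 (ACE) driver the value lives under the Access Connectivity Engine key, with 0 meaning scan all rows rather than just the first 8; treat the exact path as an assumption and adjust the version number (14.0 here) and 32/64-bit location to suit:

rem Ask the ACE Excel driver to scan every row when guessing column types
rem (0 = all rows; the default is 8) - path and behaviour as I understand them
reg add "HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel" /v TypeGuessRows /t REG_DWORD /d 0 /f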

I would like to say I am extremely annoyed at the way this behaviour has been sprung on Office users through successive arbitrary design and coding decisions at Microsoft. There are likely a considerable number of people doing mail merges from Excel sheets with text fields longer than 255 characters who will be negatively impacted by this design limitation. In previous versions of Office the registry setting mentioned above provided a means of working around the problem; its apparent discontinuation in Office 2010 makes it impossible to guarantee a successful import or merge unless a clumsy workaround is used (putting a row near the top of your spreadsheet, preferably the second row, that has more than 255 characters in each long text field). Furthermore, when using VBA code to import a spreadsheet directly into Access there is again no certainty. It would be acceptable if there were a means to specify an import specification, as there is with the TransferText method, but that option simply isn't available when importing a spreadsheet. Only a manual import, setting each field type at import time, guarantees a successful transfer of all long text fields. There may be ways to work around this in VBA using methods other than DoCmd, but those seem to rely on the same drivers in which the problems exist.

My advice therefore is to use one of the following:

  • Migrate to MS Access (which has much better functionality as a database than Excel)
  • Migrate to some other database if you can so you don’t have to use any part of Office
  • Try the workaround for the second row of the spreadsheet for current worksheets you need to import to Access/merge to Word

Wednesday 17 November 2010

Native boot VHDs and MDT [4]

This has taken a break because I have been very busy with other things. However I had a look at it at home last night and used Wim2VHD to make a VHD containing Windows 7 Enterprise x86. I then copied it to a portable HDD and took it to work this morning. I think the main error I made previously is that the system partition on the computer has to be set as Active (i.e. the boot partition). This makes sense, as it is the partition the BIOS boots, and it then runs the bootloader to get Windows started; "active" is a poorly chosen name for this function. Once I made that change with Diskpart, BCDEdit suddenly started working and I could see the system was configured to boot the VHD. Once the VHD had been copied to the computer and it was restarted, it booted the VHD straight away, and after going through the usual first-time steps it is now fully functioning. I noticed that a D drive is attached, which is actually the physical hard disk with its files present on it. Since we can't remove drive D we would look to limit ordinary users' access to it and provide them with another VHD as drive E for any scratch space.
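For the record, marking the partition active is a one-off Diskpart job, something like the following (a sketch only; the disk and partition numbers are assumptions, so check them with the list commands first). Running bcdedit /enum afterwards confirms the boot entry points at the VHD.

diskpart
list disk
select disk 0
list partition
select partition 1
active
exit
bcdedit /enum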

I have also tried VHD Attach at home and it works well with attaching files as drives after the computer has booted up so that should be OK. The next thing is to work out how to add drivers to an image that has been sysprepped so it can automatically load them when the computer starts with the new VHD image. There are a couple of ideas out of all that I have looked at: test if MDT can do this as part of a sysprep task, or use DISM to service the image by adding drivers to it. I think MDT uses DISM to perform the inject task anyway. So the next part of this series will hopefully be wrapping up these options.

It turns out from examining the documentation that DISM is very easy to use with a VHD file. If you have the VHD mounted to a drive letter you just have to open a command prompt and type a few simple commands. If I have my platform drivers all nicely imported into MDT I can just pass the root folder to DISM and tell it to pick up all the drivers under it, for example (if my VHD has drive letter V and my MDT path is on drive S:)

  • DISM /image:V:\ /Add-Driver /driver:S:\drivers /recurse
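In practice the whole servicing step can be scripted. Here is a rough sketch; the VHD path, drive letter and driver folder are placeholders for whatever you actually use:

rem Attach the VHD and give it drive letter V (paths are examples only)
echo select vdisk file="S:\VHDs\win7ent.vhd" > attach.txt
echo attach vdisk >> attach.txt
echo assign letter=V >> attach.txt
diskpart /s attach.txt

rem Inject every driver found under S:\drivers into the offline image
dism /Image:V:\ /Add-Driver /Driver:S:\drivers /Recurse

rem Detach the VHD again so it can be copied out for deployment
echo select vdisk file="S:\VHDs\win7ent.vhd" > detach.txt
echo detach vdisk >> detach.txt
diskpart /s detach.txt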

Then, in totality, my deployment process may well end up looking something similar to the following:

  • Develop the image on a VHD using a virtual machine
  • Run MDT sysprep only task to sysprep the VHD
  • Run DISM as noted above
  • Copy the VHD for deployment.

Ideally a future release of MDT would automate these steps in a task sequence.

The steps to set up a physical machine would be different and I would like to see if they can be scripted in some form.

NZ Open Source Society Praises Microsoft

Remember… you heard it here first…

http://www.interfacemagazine.co.nz/articles.cfm?c_id=&id=791

http://wikieducator.org/Microsoft_Launches_Open_Source_Filter_for_Mediawiki

Peter Harrison, Vice President of the New Zealand Open Source Society, commends the release.


The Internet provides humanity with a unequalled opportunity to leverage our communication technology to educate people across the globe. Through collaborative technologies such as Wiki people can work together to create rich common resources that are open to all. By enabling users to export their content from Word into MediaWiki Microsoft are encouraging the availability of a far wider range of educational resources online.

—Peter Harrison, New Zealand Open Source Society

Sunday 14 November 2010

MS Access and ODBC Drivers and “useful features” in Access 2007/2010

I have used MS Access for 15 years to do various types of reporting and it is my de facto tool of choice when the input data looks like some form of database and the output is a report, especially one with multiple pages, calculations, or data joined together from multiple tables.

One thing I have become aware of over the years is that there are numerous ODBC drivers about that don't like giving a linked, live view of their data to queries that join multiple tables. I can't quite explain it, but in a production situation it is not uncommon to see error messages or missing data when manipulating data linked via an ODBC connection.

The three situations I have seen this in are:

  • A holiday job where I had to write reports to extract data out of an accounting system and display it.
  • Integris, customised reporting from data in the SMS
  • Excel, linking worksheets into Access for reporting

In all three cases working live with the source data (Access linked table) via ODBC seems to be the problem. The solution in all cases is to implement an intermediate step whereby data is imported into a table in the database, and then that table is used to provide the data for reports. This has proved necessary so many times that it should almost be considered routine.

The second part of this article is that I found that in the process of running make-table queries to import the data I would get an error that I was violating the range of values that could be put into a table. Now, for a make-table query, that doesn’t make sense. The table is being created from scratch to contain some data. This couldn’t possibly make sense unless Access thinks that somehow it knows what type of data is supposed to be in each field of your new table. Maybe it looked at the previous version of the table; the one you may have just deleted before you ran this make-table query. But the whole point of a make-table query is that it creates the table from scratch. If I wanted to update the records of an existing table I’d code the query as that. Access asks you to delete the table and then gives you the error message. If you delete the table manually and then run the query, you don’t get the error.

This unfortunate design “feature” of Access 2007 and Access 2010 cannot be turned off. I've read that this wasn't a problem in previous versions of Access. The only change I was making was to increase the length of a text field and put more data in it. The default text field length is something like 255 if I remember rightly, but Access must somehow have locked that text field down to its previous exact length, and when the length changed it threw the error. Since it has already deleted the table and is creating it from scratch, that shouldn't actually be a problem. Note that here we are importing data from linked Excel sheets.

Further to the above: the whole business of Access importing/linking from Excel seems to be a complete crock. I think it's a very sad situation when data can't be linked between two products made by the same company without problems. Actually, there are problems like this all across different MS products; the whole linking/embedding story seems to be overblown. I had to make even more changes to the report I was working on to try to get around all the issues, and now I'm wondering whether it is really ODBC or Access itself that is at fault.

Wednesday 10 November 2010

Native boot VHDs and MDT [3]

At the moment I am trying out the walkthrough, applying a sysprepped image directly to a VHD file that I created with Diskpart. Diskpart can work with VHD files as well as ordinary hard disks, and it can mount a VHD to a drive letter.
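Creating and mounting the VHD in Diskpart looks roughly like this (a sketch; the file path, size in MB and drive letter are just examples):

diskpart
create vdisk file="C:\VHDs\win7.vhd" maximum=40000 type=expandable
select vdisk file="C:\VHDs\win7.vhd"
attach vdisk
create partition primary
format fs=ntfs label="Win7VHD" quick
assign letter=V
active
exit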

As noted in the previous discussion, while this gives me an image that has been sysprepped, what it doesn't have is drivers injected for a target platform, so I don't know what will happen when it is booted on a specific platform. The debate so far has been over ways to inject those drivers. If my walkthrough works I will move on to trying out the different forms of driver injection.

The first problem with the walkthrough was that the PC is too old to run x64. So I then set up a VHD for x86; because I didn't have a captured image for Enterprise x86, I used ImageX to apply install.wim directly from the x86 Enterprise ISO. The rest went smoothly enough until reboot. Note that you need to use the Windows 7 version of ImageX; the version from WinPE 2.0 (Vista) does not support vdisk (VHD). To run this I used an MDT LiteTouch boot CD in command prompt mode. When I rebooted I got “Bootmgr is missing”. The problem seems to be that the instructions in the walkthrough are wrong in some way. The VHD was assigned drive letter C, but on reboot drive letter C was reassigned to the computer's physical disk, and there seems to be no way to make C stay assigned to the VHD. When I run the recovery console and open a command prompt, I change to the C drive, run dir, and find my VHD file there instead of the Windows directories that are inside the VHD image. I can use Diskpart to assign a different drive letter to the physical disk and to assign drive letter C to the VHD (which is where the Windows files are), exit Diskpart, and find that the C drive looks like a normal Windows boot disk. Then BCDBoot can be used to set up the boot. I tried this a few different ways, assigning for example drive letter F to the VHD, and then it should have booted, but the drive letter F was lost on reboot and it wouldn't boot at all.
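For reference, the apply-and-make-bootable part comes down to a couple of commands like these (a sketch only; the drive letters and paths are whatever your setup ends up with, and as described above the drive letter assignments are exactly where things went wrong for me):

rem Apply image 1 from the Windows 7 install media (D:) to the mounted VHD (V:)
imagex /apply D:\sources\install.wim 1 V:\

rem Write the boot files and BCD entry so the machine boots from the VHD;
rem C: here is the active system partition on the physical disk
bcdboot V:\Windows /s C: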

However I did find this posting, which tells me about a useful tool called wim2vhd. It is produced by Microsoft and will easily make a VHD from a Windows install DVD, and appears to automate some of the manual steps listed in the walkthrough. Another post here gives more useful info. Right now, though, I am stalled until Technet can give me a way to make the drive letter assignment stick. Here is some more info on how to sysprep a VHD so that it can be copied to another machine. I plan to sysprep using MDT just to simplify things, especially if we can automate driver injection in MDT.

Since first writing the above, it has been determined that the limitation is built into Windows 7: attached VHDs are not automatically reattached at boot time unless they are required to boot an OS. I guess this could be because of the performance effect on startup. However, I think Windows should provide a supported way of turning on automatic reattachment, as I can see an attractive scenario of having a data VHD as well as an OS VHD in place of physical disk partitions. Fortunately others have recognised this need and addressed it with software and scripts (here's another post suggesting how to set it up as a scheduled task). My preference would be the free software, if it stays free, followed by the PowerShell script (which automatically scans and attaches all VHDs in a folder). Hopefully this puts pressure on MS to include the feature in subsequent editions of Windows.

I am now working on creating a new VHD at home in my free time; the first step is to upgrade my PC from Professional to Enterprise. While I'm here, I commend the addition of various useful features to recent editions of 7-Zip, the free archiver. The latest release, 4.65, can open and extract ISO and WIM files, and the development beta 9.14 can open and extract VHD files. So I didn't have to buy software to get files out of an ISO, and you no longer have to mount a WIM with ImageX to get files out of it. The VHD functionality is mainly of benefit on older versions of Windows that don't have built-in VHD attachment (i.e. versions prior to 7).

Thursday 4 November 2010

Native boot VHDs and MDT [2]

Continuing on from yesterday's post, I did some thinking about this overnight and have made enquiries on Technet as well. I think the best scenario for creating your own native-boot VHDs using MDT is to deploy an OS to a VHD using a standard deployment task sequence. You would then, from within that OS, run a sysprep-only task sequence from MDT using the LiteTouch script in the deployment share. What is particularly needed is the ability to provide or inject custom drivers before sysprepping, rather than this being done in a setup task, because we aren't going to deploy using an MDT deployment task sequence. The basic plan is that we sysprep the OS installation in the VHD and then deploy the VHD to platforms just by copying it. The platform is then native-booted from the VHD, runs through the normal first-time startup sequence and finalises the installation.

One possibility for the driver injection is to see if it can be added to a sysprep task sequence (it has to be done before the sysprep step executes). When you look at the standard Sysprep and Capture task sequence in MDT you will see a task that adds mass storage drivers to sysprep.inf for Windows XP and 2003. For Windows 7, however, I'd presume the correct approach is to use the Inject Drivers task; the questions are whether this would actually work, and exactly where you would put it.

The second option is to use the DriverPaths variable or the DevicePath registry key. There are a few relevant posts listed below:

Obviously my preference would be to leverage MDT and get the drivers injected via a step in a sysprep task sequence, so I am trying to find out if this is possible. It should be, because my experience with Sysprep is that the unattend.xml file contains the information for all of the Sysprep passes, of which the relevant ones here are the specialize and offlineServicing passes. Hopefully I do not have to hand-edit the XML file and run Sysprep by hand; it should be possible either to use an Inject Drivers task or some means of customising unattend.xml before Sysprep executes, and hopefully we don't run into a lack of drivers.
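For completeness, the DevicePath option mentioned above amounts to a one-line registry change inside the image so that Windows also searches a local driver folder during first-boot device detection. A sketch, assuming the drivers are staged in C:\Drivers within the image (the doubled %% is for use inside a batch file):

rem Append a local driver store to the PnP driver search path (assumed layout)
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion" /v DevicePath /t REG_EXPAND_SZ /d "%%SystemRoot%%\inf;C:\Drivers" /f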

There are a number of manual tasks involved in getting the VHD bootable on a target system, so I'd be looking for some means of automating these to an extent, and buying a 16 GB or larger pen drive for the copying; otherwise I can use an external HDD.

Wednesday 3 November 2010

Native boot VHDs and MDT [1]

In my previous post I wrote about the native boot VHD capability in Windows 7. As I noted, the essence of this functionality is that a VHD can be copied to a destination computer's hard disk and attached as a boot device; Windows can then be booted from the VHD file. There are considerable advantages for small-scale imaging without the use of a full WDS environment such as we have looked at in the past (about 4-5 years ago, when I had more free time, I invested considerable effort in setting up a Remote Install server and learning how to use RIS; the server did get upgraded to WDS, but since then with Windows XP we have simply used Ghost for the very occasional re-image).

My current plan is to have a look at a deployment walkthrough on Technet. However the next idea, which came along pretty quickly, is whether MDT can help with creating a generalised VHD for native boot. So far I have found info on customising an MDT deployment task sequence to deploy to a VHD file on a physical computer. This doesn't really address what I had in mind. What I want is to be able to create a generalised VHD file that can simply be copied to target computers and that applies the steps of the MDT deployment sequence, i.e. when the machine is booted from the VHD for the first time the operating system is deployed as if a user had selected a particular deployment task sequence through the wizard. Generalised is important because the image has been sysprepped for deployment to multiple machines, and the MDT deployment task sequence can do useful things like inject drivers and install software, allowing MDT's customisation capabilities to be fully leveraged. The scenario described above is basically to create an MDT deployment task sequence that deploys the OS to a blank VHD on the target system instead of a physical hard disk. That is a good step in the right direction but I am looking for a bit more than that.

MDT does have support for using media to create bootable OS installs; for example you could deploy a task sequence from an external hard drive. What I am looking for beyond that is simply being able to copy a VHD to a computer, attach it as a boot device as described, and then boot the computer so that it runs a task sequence stored on the VHD, applies an image and the deployment steps from the VHD, and deploys the OS, possibly to another VHD on the target platform. It may be that going through the media deployment steps can create the type of installation I am seeking, although the best scenario by far is to automate the deployment wizard steps, for which there are already means provided.

Backup imaging, VHD native boot, network management & remote desktop management

A while back I wrote about my experiments in using MDT to back up laptops. Some of my MDT shares were moved to a new server recently and the backup share had to be set up again from scratch, which for some reason proved exceptionally difficult, but it has happened eventually. While I was working on that I realised there is another option for imaging a laptop, and that is the Disk2vhd tool from Mark Russinovich. This uses the Volume Shadow Copy Service to image a live disk (you run it on the computer while Windows is running) and turns it into a VHD file. The great thing about a VHD file is that you can simply hook it onto Virtual PC or, in this case, a new virtual machine that I created in Hyper-V; the virtual disk worked when I booted the VM from it, so that particular laptop can be brought up in a virtual machine with all of the software and files on it that the user had. Although in most cases the user does not need further access to their old data (we always transfer everything across when they change laptops), this is another option in special cases where they may have an old application, and there is even the possibility that the VHD could be put onto their new laptop with Windows Virtual PC. Take note that there is a setting in Disk2vhd that is required when you create a VHD for Virtual PC, because otherwise the VHD won't boot even in Windows 7's Virtual PC. The next idea may be to add Virtual PC to our standard laptop image.

I've noticed that there is a capability in Windows 7 for a VHD-based boot of the operating system, which can be deployed simply by copying the VHD file to the hard disk of the target computer. This appears worthy of further investigation, since it removes the need to use, say, WDS for a mass deployment of an image to multiple computers. If you can install an OS simply by copying a VHD file containing an image you have built on a virtual machine with MDT deployment, then upgrades to student PCs would be much simpler than the standard WDS-type deployment scenario if you don't want to install a WDS server. The idea is that you build your reference image in a VHD using a virtual machine so that drivers for that platform are injected into the image as it is built, and presumably when the VHD gets deployed onto the target machine the drivers for the physical platform are installed when the hardware is detected on first boot. Native VHD boot requires Windows 7 Enterprise, which is covered by Software Assurance in the MS Schools Agreement. I am planning to have a look at this scenario: seeing if I can use one of my existing MDT deployment task sequences to deploy an OS to a new virtual machine, then capture that using the MDT Sysprep and Capture task sequence into a WIM file that can be turned into a VHD file as described in the walkthrough. Hopefully MDT will be updated to simplify the steps needed.

Network management takes many forms, but one consideration is how to track the traffic moving through different physical branches of the network, and the SNMP protocol is designed for this. SNMP can be used for many things these days, but part of its fundamental functionality is monitoring network hardware such as switches and logging the utilisation of the network through individual ports. SNMP is not a technology I have had much knowledge of before now; however, there are a number of software packages that can collect data via SNMP, and we are currently playing with the free edition of ManageEngine OpManager, which allows 10 interfaces to be monitored. I am using it to see what is happening on my wireless network, and hopefully the two managed switches on the backbone of our network will be able to give me information about the traffic through each port, which will help us determine where congestion might be occurring.

If you have any number of desktops to manage, or at least to monitor to keep an eye on what is happening on them, it is advantageous to have remote control software. In the educational environment there are a number of packages, whose most common features include being able to control all remote computers, put an image of one computer onto all of them, lock the keyboards and mice, and watch what users are doing. I have looked at these packages for years, and while there are different capabilities and functional levels, they are also valuable for IT administration, saving visits to every remote computer or at least enabling useful functions like simultaneous logon and software installation. The cost also varies widely, and for our number of computers there would seem to be little benefit in spending $100 per computer when there are packages that are flat-licensed at a few hundred dollars for any number of clients.

Monday 18 October 2010

Backing up laptops on a network

As a recent post noted, Offline Files is still a temperamental technology that I have decided is not worth the effort on Windows 7. So what other systems can you use? Take note of the following:

  • By default, Windows 7 will create a restore point on the laptop’s hard drive every day. This can be used to retrieve a previous version of a file
  • Windows Backup can be used to make a backup of files onto an external HDD. However, it has very limited options, and I recommend considering the use of either an external disk with its own software, such as the Seagate Replica, or a third party program such as Karen’s Replicator from karenware.com
  • Commercial backup software such as MS DPM or Symantec Backup Exec can have agents installed to automatically back up a laptop to a server, in conjunction with a server-based installation of the software.
  • Robocopy can be run from a server to automatically back up using a script. In the rest of this article I will describe this option further.

Robocopy is a command line tool that has been available since the days of Windows NT and has been progressively refined since then. It is a powerful tool that we have used to do some of our backups up until now. There are many parameters and settings that can be used. The following is a list of these and how they are being used in the evaluation so far:

Source directory: typically \\%1\%2\%3\documents and \\%1\%2\%3\desktop are used (Robocopy is being invoked twice, once for each path). %1 is the Netbios name of the laptop. %2 is the sharename, %3 is the foldername within that share. Typically you will need to share the Users folder in Windows 7 or Documents and Settings in Windows XP, in which case %3 is also the Windows username of the user. To avoid visiting each laptop user, if you are the domain admin, use Computer Management to connect to each laptop in turn and then create this share. I use random strings of 20 characters followed by the $ sign to name each laptop share individually and keep it hidden. Grant Read permissions to Everyone on the new share. Windows file permissions will still prevent anyone other than the user and Administrators from accessing these files.

Target directory: Typically this might look like \\server\share\%3\backup\documents or \desktop depending on the source directory as above. This makes use of a target folder based on the username of the user, which is the same parameter as was used on the source directory.

Filespec: I recommend you create a settings.rcj file and use it with the /JOB:settings parameter. In the RCJ file you place the option /IF on a line by itself and then one filespec per line thereafter. At the moment this will look like:

/IF
*.do*
*.xl*
*.md*
*.acc*
*.pp*
*.pub*

which will pick up most Office files. Add, delete or change to suit.

Other options: /R:45000 /w:1 /B /FP /V /S /MIR /MT /TEE /LOG+:%3.log. Put all of these except the last (log) option into the RCJ file (one per line), and specify the log option on the command line.

/R: and /W: specify a retry count and a wait between retries (in seconds). The settings shown retry for up to about 12 hours (45,000 retries at a 1 second wait). The recommendation would be to run this with Task Scheduler, get it to force termination at a lesser interval than the frequency you are running it on (e.g. 20 hours if you are running daily), and adjust the retry count as appropriate.

/B tells Robocopy to use backup mode, which lets it read files even if the full file permissions haven't been set up for the backup account. /FP, /V and /TEE are logging settings. /S copies files in subdirectories. /MIR ensures that the backup mirrors the source; this option deletes files on the target if they are deleted from the source, so it isn't the setting to use if you want to guard against file deletions. /MT makes Robocopy use multiple threads. /LOG is used to create a log file; the + option appends to an existing log file of the same name.

The actual calls to Robocopy are contained, in our case, in a Robobackup.cmd command script. I write an individual CMD file for each laptop and set up Task Scheduler to start it; that CMD script calls Robobackup.cmd, passing in the parameters specified. The idea is one backup per day in a 12 hour window, but you could be more versatile and try for three backups, each in a 4 hour window. I am still tweaking our test environment to see what we can do, and I expect a few different options will be tried out; these might also include turning on Shadow Copies on the server where the backups are stored, to see if it will keep previous copies.

In the end I changed R and W to 0 and scheduled my task to run 12 times a day, every hour. This sounds like a lot and might well prove to be, but given that the laptops will not be connected much of the time, and that the job quits immediately if a laptop is not reachable, the amount of traffic probably won't be as much as one would think. Another option is to trigger the backup only when the laptop is connected, but this requires a custom script or some other means of detecting the connection (pinging would achieve this); see the sketch below.
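The ping test could look something like this at the top of the per-laptop script (a sketch; %1 is the laptop name as elsewhere in this article, and note a ping can still report success if a router answers on the laptop's behalf):

rem Skip this backup run entirely if the laptop doesn't answer a single ping
ping -n 1 %1 >nul
if errorlevel 1 goto :eof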

UPDATE:  Task Scheduler seems to have problems when it starts a command script that starts another command script that finally runs Robocopy. There also seem to be problems with a command script that has more than one command line in it.

After a lot of trial and error, therefore, my scheduled task calls a cmd script that runs Robocopy directly using parameters that are passed into it. The task action passes four parameters into the command script, specifying the laptop name, share name, username and folder for the source path. One of these also specifies part of the destination path, and another specifies another part of the destination path and the name of the log file. There are two task actions for each user, which back up the Desktop and Documents folders respectively. The common settings are passed in a .rcj file. There may be a fifth parameter to specify the rest of the destination path once I get my head around where these backups are going to be stored. A rough sketch of what the script ends up looking like follows.
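For illustration only, the per-laptop script ends up looking something along these lines; the backup server share and path here are placeholders rather than our real ones:

@echo off
rem Robobackup.cmd - a sketch only. %1 = laptop Netbios name, %2 = hidden share
rem name, %3 = username, %4 = folder to copy (documents or desktop)
set SRC=\\%1\%2\%3\%4
set DST=\\backupserver\laptopbackups\%3\backup\%4
rem Retry/wait and the other switches come from settings.rcj; only the log file
rem is specified on the command line, as described above
robocopy "%SRC%" "%DST%" /JOB:settings /LOG+:%3.log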

Saturday 16 October 2010

Solving Group Policy deployment headaches in Windows 7

One of the most distasteful aspects of using Windows Vista and Windows 7 computers in a domain environment is their tendency to freeze during the Group Policy application phase of user logon. Although it is known that there were some instances of this occurring with Windows XP as well, we never saw them in our experience and first met the problem with Windows Vista and Windows 7. The most common scenario is during the deployment of Group Policy Preferences for Printers. Extremely long timeouts, up to overnight in duration, have been observed on some occasions, most commonly when a new computer is logged into for the first time. Today I saw a laptop take a very long time to apply Folder Redirection policy, but that is unusual; GPPP is by far the most common scenario. Even though event logging is greatly improved in Windows 7, there is still insufficient information recorded in the event logs for these instances. My latest effort is to get all the logging options turned on, both those specifically for GPPP logging and those for user environment processing, and with the logs now being produced routinely on all our computers we might hopefully make some progress towards resolving this. The fact that MS has failed to address this to date is a serious concern from my point of view; at the very least their logon processing should force the logon to continue after a reasonable period of time rather than allowing these excessive delays.
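As well as the Group Policy Preferences tracing settings available through Group Policy itself, my understanding is that verbose Group Policy service logging can be switched on with a registry value, producing a gpsvc.log under %windir%\debug\usermode; treat the exact key and value as an assumption to verify before rolling it out widely:

rem Turn on verbose Group Policy service (gpsvc) debug logging - as I understand
rem it the log appears at %windir%\debug\usermode\gpsvc.log (196610 = 0x30002)
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Diagnostics" /v GPSvcDebugLevel /t REG_DWORD /d 196610 /f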

Windows Live Essentials gets a makeover

Well, Windows Live Essentials has been updated to the 2011 edition and they have, at last, gone back to proper toolbars with icons, replacing the text-labelled buttons of the last few editions. In fact, the WLE applications (I'm using Writer to post this) now use the Ribbon like Office does; this is the first time I've seen another MS application use the Ribbon. I would guess there are more buttons and options on the toolbars as well, which is partly necessary because the menu bar has disappeared. There is a Quick Access Toolbar as well, just like in Office, except that you can only turn the preset options on or off; you can't add to it, just as you can't customise the Ribbon. Another feature copied from Outlook 2010 is the conversation view for email.

A new piece of functionality in WLE is Mesh, which lets you sync a folder on your computer with another computer running Mesh, or with a 5 GB space on Windows SkyDrive. The functionality provided is pretty restricted; it is just one folder and everything in that folder. An interesting additional Mesh feature is that it provides remote access to your computers from another computer; presumably they all have to have the same Windows Live ID and password configured on them.

One annoying WLM feature was the modal dialog it would put up to say it was changing message flags. It did this every time you clicked on a new message, and it was very slow, as was changing the message status from unread to read. This seems to be a lot faster in WLM 2011 and no longer needs that dialog box displayed.

Friday 15 October 2010

Offline Files is still a crock under Windows 7

Offline Files is a technology that was introduced in Windows XP. We used it for a time then but found it troublesome.

I have often wondered why Microsoft needed to use a special folder on the local computer instead of doing something like a straightforward automated sync between a visible local copy of the data and a server-based one. Instead they use the special folder called CSC in Windows, with the filenames specially changed and so on, and on top of that a complex system built into Windows that has to be configured through Group Policy settings, lacks functionality, and is difficult to get working properly.

The result is that Offline Files is difficult to configure and its problems are difficult to solve. Currently I have a laptop that stops syncing after about a minute with “The specified network name is no longer available”, and I cannot find any documentation on this problem.

There have been many times when I have thought there must be a simpler way to synchronise staff laptops, when all you need is a script running on the server using one of the freebie programs like rsync that just syncs every so often. Another option, with say Backup Exec, is to look at automating backup of a share on each laptop using VSS, though it is best to avoid having to purchase and install the expensive BE agents, as the cost soon stacks up and I hear they have to be upgraded regularly. At the moment I am having a play with rsync GUIs on our backup server, customised to back up only specific file extensions (truly documents only: *.doc, *.xls, etc.), and seeing how it copes with the remote laptop going offline partway through, among other considerations. Robocopy can cope with this (it can be told to retry or wait forever) but I don't know how it would manage if the remote file had changed in the meantime.

I have now discovered Windows Live Mesh, which can sync to SkyDrive (5 GB of free space), so I'm playing with that as well.

Monday 4 October 2010

New computer 2 days on

Still can't get over how quiet this thing is. From the next room you can't hear it at all, though admittedly the old one turned out to have quite a bit of dust blocking things up. I got a serial backplane connector from an Asus motherboard and am pleased to report it worked first time with the GPS. I am getting a 500 GB external HDD for backup and will also move two USB ports from the front panel to the back and put an eSATA port there as well; then we will be more or less finished customising.

Sunday 3 October 2010

New computer 1 day on

Well, such a big project and now it's over. I'm very pleased with the new box, with just a few niggles of things I maybe should have checked before I started. Mainly that this board only has provision for 8 USB sockets; one of those is taken by the card reader, and I wanted more of them on the back, so there are only 4 USB sockets on the rear panel. However, the suppliers don't bring every board model into NZ and there is quite a step up in price ($50 or more) to get more sockets (12). Given that there are several models of board in between, Intel is being silly not to put more sockets onto the lower priced boards – the board with 12 socket connections is a high performance model with DDR3 memory. But Intel is funny like that. All the Series 4 and 5 boards have PS/2 sockets for keyboard and mouse, but most of the Series 3 boards left these out altogether. My PC at work is such a creature and everything has to be USB, not that that is such a bad thing. Either I could swap two front USB sockets to the back (using a backplane adapter) or I could put in an accessory card with 4 more on it, which I may do. Also likely to be appearing in the backplane slots are an eSATA socket and the 9 pin serial socket.

The next thing to look at is backup. Many years ago I bought a 400 MB Iomega tape drive to back up a previous computer. Those were the days. After that died, it was occasional CD and DVD based backups. Then I bought an external HDD enclosure and put a 40 GB laptop HDD in it. A great move, the only problem is that it is now too small. So I have to get a bigger external disk, maybe 250 GB or more. I trialled running Windows Backup on the new system and it said it needed 70 GB to back up my documents and pictures. And the backup could also include a system image if I wanted it to, although I don’t know that I will do that. 250 GB is a lot but it will probably need to store more than one generation of backups. However, online backup services are starting to become affordable, except for our ridiculously high broadband charges in NZ. Maybe in the future it will be possible to use these more than I have looked at so far.

One thing I have finally worked out how to do is to change the location of My Documents and My Pictures. I couldn't work out, from the time I installed Windows 7 at home, why IrfanView wouldn't recognise the "new" location of My Pictures, which is on a different drive and so not in the normal user profile location. It turns out that even though you can change the folder locations in the Libraries, this doesn't actually change the stored locations in the system. You still have to find the folders in your profile and tell Windows (via each folder's Location properties) that they are supposed to be in a different location. Now I can use My Pictures in IrfanView to find all my photos like under XP.

Rebuilding My PC [4]

As we saw in the last post, I started it on the old PC and finished it on the new one. About halfway through writing, I saved the draft on the old PC and shut it down. I then booted from a Windows PE CD and started up ImageX to capture the boot disk to a WIM file. Once that was completed I did the same on the new PC with its new 250 GB HDD, first running Diskpart to partition the new disk and then again running ImageX to apply the boot disk image to it. After this I tried booting and got an error screen telling me to run the Windows Recovery Wizard. This is because the BCD command line tool needs to be run to fix the boot configuration. I had an MDT boot CD handy so I booted that and chose the Recovery Wizard option which fixed the boot configuration for me, then I rebooted and Windows 7 came up looking almost the same as it did on the old PC. While the old PC was imaging I tidied up all the power supply cabling on the new box.

A2000_20101002_003

You can see the power cabling has all been tied back to keep it out of the way of things. Particularly that CPU fan. This has become more necessary in the era where such large CPU fans are fitted that do not have enclosed blades like they did in the Socket370 era. But of course tidying up the cabling makes things easier all round for a nice and tidy PC inside.

Around this time I decided it would be safe to leave the new box on overnight so I went to bed leaving it running. Starting again in the morning I have lifted out the old box. Here is a picture of it:

A2000_20101002_004

I took the modem, 250 GB HDD and the old DVD writer across to the new box which will now have two 250 GB HDDs and two DVD writers and of course the modem for when the broadband falls over. At the moment I am copying some files off one of the 80 GB disks across. Then it will be all finished and ready to go. Here is the comparison of the WinSAT data for old and new PCs, both running Windows 7:

Component         Old spec                Old SAT   New spec                        New SAT
CPU               Celeron D325 2.66 GHz   3.4       Celeron E3300 2.5 GHz           5.9
Memory            2.00 GB                 4.4       2.00 GB                         5.5
Graphics          Intel D915              1.9       Intel G41 Express WDDM 1.1      3.3
Gaming graphics   Not detected            1.0       780 MB total available memory   3.4
Primary HDD       -                       5.3       207 GB free                     5.9
Overall score     -                       1.0       -                               3.3

Note that in spite of the similar clock speeds the CPU scores are quite different; obviously the new CPU is a fair bit faster. The memory is DDR2-800 instead of DDR-400. There is not much difference in the HDD score even though it has gone from SATA 1 to SATA 2. The low graphics scores of the old PC are in large part due to Intel's controversial decision not to produce WDDM drivers for the 915 chipset; the branding of these chipsets as "Vista Capable" led to a class action lawsuit against Microsoft in the US.

The one other issue with the old PC in particular was noise. The new one is almost silent. I checked up with the old PC and found the case and CPU fans are particularly noisy. The CPU is the “Prescott” series with the infamous “Netburst” microarchitecture, which was notorious for the high thermal envelope and resultant heat output. It was this class of CPU that first made it necessary for cases to have an extra ventilation duct put into the side to allow the CPU fan to exhaust directly from the side of the case. Even with 2 GB of RAM the CPU fan runs nearly at full speed almost all of the time even at idle with Windows 7 installed. When I stopped those two fans, the power supply fan was fairly quiet. I rebooted into the BIOS hardware monitor and read off temp of 72 degrees C for the CPU and its fan was turning at 2600 rpm which is flat out. This in a room temp of 20 degrees and with the case completely open so plenty of airflow. In flat out burn in tests with the new box it never got above 55 degrees (frequently a lot less) and the fans only turned around 1100 rpm. Another dumb thing is the old box will not boot off a 160GB HDD, only an 80 will do for those old 915s. I am planning to take the old box to school, purely as a stopgap until we upgrade, because it is so limited that it won’t even run 64 bit editions of Windows.

Rebuilding My PC [3]

Now that we have the board completed, the next task is to install it into the chassis. Your existing chassis should already have spacers installed, the board sits on these spacers to hold it up clear of the chassis. Check that the spacers are in the same number and positions as the board needs, if not you will need to remove or install spacers to suit. Carefully slide the board into position getting the rear panel connectors in place, you may have to bend tabs on the I/O shield to make it easy. Just slide the board around carefully to get the spacers to line up with the mounting holes. Then use a magnetic screwdriver and take it really easy and get the screws into the holes and carefully tightened. Don’t use a power screwdriver and don’t overtighten the screws. If your screwdriver slips and hits the board the chances are very high that the board will be fatally damaged. So take it real slow and careful here. Once you have the board fastened down, connect the power supply connectors. The usual requirement is the 2x10 or 2x12 for the main power connector and a separate 4 pin connector for extra CPU power (this is NOT optional, even though the connector looks the same as the extra 2x2 that gets tacked onto the end of the old 2x10 power connector). Modern PSUs typically have a 2x10 and 2x2 connectors that clip together for the main connector, as well as a separate 2x2 for the CPU connector. As sometimes modern supplies have replaced the 2x2 with a pair of 2x2s that clip together, you may find as I did with the Enermax that the pair of 2x2s will need to be separated so that one of them can be plugged into the CPU power connector on the board. Getting the main power connector in can be a little tricky, especially with this board where there isn’t adequate support under the connector, so you want to take it carefully to avoid bending the board too much.

The next bit of fun is to connect the front panel cables to the board. Typically there will be front panel USB and audio connectors, perhaps these days firewire or eSata could be installed, and there are also power and HDD lights, power switch and perhaps a reset button. The switches and lights will usually be on one set of combined headers, while USB and audio each have their own set. In this case the Foxconn TS001 shines out with the clear labelling of the various connectors. As it happened most of them seem to work so far although I haven’t tested out the audio jacks. I also put in an old DVD drive from my old PC and it had to be plugged into the IDE port.

A2000_20100930_006

Now for the moment of truth. Connect VGA, Keyboard, Mouse, turn the thing on and see if it comes up to the Bios screen. Then select the Bios settings and go into Hardware Monitor to read out the temps and fan speeds. I started off reading around 60 degrees C for the CPU which is probably in an acceptable range. If this all works out then install an OS and some kind of test and monitoring software (for example burn in software such as Burn In Test, temperature monitor software such as SpeedFan). Run the software for a reasonable time frame to give an acceptable burnin test period. This is an optional step, just one you might want to try to see that everything is working OK, especially that the heatsink and fan are doing their job properly and keeping that CPU nice and cool.

Finally, before you get the thing into use, tidy up inside the case. Typically you will need to reroute cables so that they have no chance of getting caught up in fans or other moving parts, and make everything look really neat and tidy inside. 

This is my first blog post from the newly rebuilt PC, I’ll explain that further in the next post.

Rebuilding My PC [2]

After the power supply, the next thing to do is to assemble and install the main system board. The board comes with an I/O shield which fits into a space on the back of the chassis and this has the cutouts in it for the onboard connectors. Which in this case are PS/2 mouse and keyboard, VGA, four USB ports, RJ45, and three sound minijacks. Uggh, I just noticed there is no serial connector, which means the GPS won’t be able to connect to it. The board has a serial header, but you need to get a slot bracket with the connector and cable mounted on it, or a USB to serial adapter. I have to think about this one; I don’t use the GPS much, but I will have to make arrangements for it one way or the other. I hadn’t thought much about it because hardly anything these days is dependent on the old parallel or serial interface connectors, but Garmin has a backward attitude and even a 2 year old GPS still only comes with a bog standard DE-9 serial port interface. I think I can find an old slot serial connector at work somewhere (we have a small number of very old towers lying around), or I could look at getting an external adapter as they are very cheap nowadays. Four USBs isn’t much these days. The board has headers for another four, of which the two built into the front of the chassis and the card reader’s one will give me three at the front for seven total, one more than the old PC. Intel does make some boards that have more USBs on the rear; this board is a budget model.

Putting the board together is pretty straightforward but you just need to take it slowly and carefully as there are plenty of bits on it that can get broken and can’t be repaired. The key task you need to do before you put it into the chassis is to install the CPU and fan. On older style boards (here I’m thinking really old, like Socket 370) it was possible to install the fan with the board in the chassis, though risky; the fan was held on with a metal spring clip that took a lot of physical force to hook/unhook, and I have vague memories that I may have killed a board once trying to get the clip hooked on, because I missed hooking it over the retaining lug on the socket and gouged the board instead. The LGA775 boards use a heatsink/fan assembly that has four posts that lock into holes in the board, and believe me, it is nearly impossible to lock these into place with the board in the chassis. So put the CPU and fan in before you put the board in. And buy the model of CPU that comes with a heatsink and fan in the package (unless you are an overclocker of course). Have a good look at the locking posts on the fan to see how they go down and lock in position, because I found the arrows on them were anything but helpful. Basically you need to start by turning the posts in the direction of the arrows, put the post in, push the top part firmly down and then turn it in the opposite direction to the arrow. When you push it down, it’s best to put your finger under the board so you don’t bend the board too much. Check all the posts are properly locked so the heatsink-fan isn’t going to fall off. Then run the power cable around the posts to the onboard connector so the wires don’t get caught in the fan. Of course you should orient the fan in the first place so there isn’t any excess wire to float around and get caught in the fan.

A2000_20100930_005

The not-so-flat board with the heatsink-fan assembly locked into place on top of the CPU. This appears to be absolutely normal with these boards, although you’d think they could have extra support under that part.

A2000_20100930_004

The assembled board with, in this case, the RAM in place as well, although that is easy to install later. The heatsink in the middle is on the northbridge, which needs it because of the onboard GPU. Fortunately the board comes with this heatsink already installed. This is a microATX board so it is quite compact.

A2000_20100901_002

An earlier picture showing the board without the CPU installed. This was when I thought I could put the fan on with the board in the chassis; I was sorely mistaken and had to take the board out again to get the fan on.

A2000_20100913_002

This picture shows the CPU being installed; the load plate still has to be lowered.

Rebuilding My PC [1]

Well, of course, we all know that PCs don’t last forever, and I am not interested in the extremist brigade who say you should try to get 8-10 years of life out of a PC. The way I see it, a PC is obsolete after about five years; while it will still do some things, it is getting pretty slow and the parts are wearing out.

However, depending on how much PC standards have changed in that timeframe, it is possible to rebuild a PC into virtually as-new specification. There are some important considerations of course:

  1. Whether the PC supports standard design components. Forget this idea if your PC was made by Dell, HP or IBM, etc. These manufacturers prefer to use proprietary case and board designs that usually can’t be upgraded. No, this series of articles is strictly for those who have purchased a locally manufactured PC using generic off-the-shelf components. My host system was built by Cyclone Computers. The chassis is a Foxconn TS001, the motherboard was also made by Foxconn (Intel brand) and the power supply was made by Enermax. All of these items are ATX standard and can easily be replaced or reused.
  2. Whether the chassis and power supply in particular meet modern specs. These days the power supply requirement is ATX12V version 1.x as a minimum for the average type of board, although new PSUs are v2.x. The chassis of course should be ATX spec. Baby AT just isn’t going to cut it. If you are reusing a power supply check that the board you are planning to use matches the connectors available from the power supply. It’s also important to have SATA HDDs / DVD drives, although most new boards still have one PATA connector able to connect up to 2 drives, and even new PSUs still have Molex 4 pin power connectors.
  3. If you use Windows, MS says you need to buy a new license when you change the motherboard, because effectively this is a new PC; this applies to OEM licenses (the sticker on the case), whereas a retail license can be moved to the new board. Of course this won’t be an issue with a free operating system.

In my case, this PC is 5 years old and the power supply spec happened to be ATX12V 1.x with the right connectors available. However I chose to replace the power supply with a new Enermax Tomahawk 400W supply, a bit of an overkill but pretty good value with the deal I got.

Assuming you need to install a new motherboard, the minimum number of parts you are likely to need is the new board, new CPU and new RAM, because the latter two components are often matched tightly to the board, and as specs change so often it is unlikely you could reuse an older CPU and RAM. However, I was pleasantly surprised to find that the 5 year old board (Intel D915GAVL) is still LGA775, the same as the new board (Intel DG41RQ), meaning the CPUs might fit each other’s board, though whether they would work is another question. The new CPU is a Celeron E3300, a dual core with Intel VT hardware virtualisation and some other features supported. I bought 2 GB of DDR2-800 memory as well.

The first job was to remove all the bits from the old chassis, after which I put in the new power supply, just a simple job of doing up the screws.

A2000_20100323_002

Here is the new power supply in the chassis. Well, you can see that this photo was taken on the 23rd of March, which is just over 6 months ago. That was when I first started on this project, and there have been a few glitches and holdups since then. But as I write this the task is nearly finished; the new PC is sitting next to me as I type this on the old PC at home, and as soon as I have finished testing and assembling the new PC and migrating the Windows 7 installation to it, it will be ready to replace the old one (which incidentally uses the same chassis type as seen above, but is a year older with a D915GAGL board inside).

Friday 17 September 2010

Advanced MDT

My production deployment of the Win7EntX64 image has had a few glitches we are looking into. Two deployments had errors, but two worked more or less as expected. I have started to leverage the VM-produced monolithic images by creating custom task sequences to deploy them to other platforms, which is one way we can prove the benefits of using a system like MDT to set up our computers.

The next task is to see how I can package Windows XP Mode and a preconfigured VM with certain legacy applications into the custom image deployment task sequence so that it is all set up and deployed to certain platforms ready for use.

Another part of MDT is using it to do backup captures of machines, something we do regularly with Ghost. To do this I set up another deployment share and customised it with a different path (to my backup share rather than my setup share), then put in a capture task and disabled the sysprep step in it. Captures get started by running the LiteTouch script on the share rather than using a boot CD, so it is very convenient. The first capture with this system has now been completed successfully, backing up a laptop, and the system will be used again with all the laptops that we need to return at EOL, as well as on the other occasions when we back stuff up.
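
For reference, kicking one of these captures off on the machine to be backed up is just a matter of running the LiteTouch script from the capture share. A minimal sketch, assuming a share and account along these lines (the server, share and account names here are made up, not our real ones):

@echo off
rem Connect to the capture deployment share (server, share and account are placeholders)
net use \\backupserver\Capture$ /user:SCHOOL\deployadmin
rem Run the Lite Touch wizard straight from the share - no boot CD required;
rem the capture task sequence on that share then does the rest
cscript //nologo \\backupserver\Capture$\Scripts\LiteTouch.vbs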

Thursday 16 September 2010

Switchcraft EH Series [2]

Our order of the inserts arrived from Jansens today, along with some plates we had also bought to go with them.

IMG_2862

Here is what one of these looks like straight out of the bag. You will see that the coupler looks like a standard VGA gender changer, and in fact that’s exactly what it is. This is good if you ever have to replace one of these, although the M/F changer is hard to find as it is actually a joiner, and I have never seen them in a supplier’s catalogue. The coupler is reversible when assembled (obviously only applies to the M/F type).

IMG_2863

And here, assembled to a plate. It can only be front mounted unless you make cutouts in your plates to clear the VGA plug locking screws. I had thought it would be necessary to do this anyway with the front mount as shown, but the design of the thing is such that the screws don’t go in deep enough to foul on the plate behind.

Thanks to Jansens, who have recently taken on the Switchcraft agency; this EH Series design is pretty well unique, especially with the range of different connectors (Neutrik produce RJ-45, USB and FireWire connectors in this XLR-size insert style as well, but not all the other stuff that Switchcraft have come up with).

Wednesday 8 September 2010

Building Windows 7 Enterprise x64 Image [5], Earthquake !!!!

On Friday I wrote about the difficulty I was having in getting proper support for software for schools making their own Windows images for HP laptops. After a very long and convoluted process involving the IT Helpdesk, the Tela Helpdesk and Axon, which is the Tela laptop repair agent for HP, I have finally had my enquiry passed on to HP, who have advised me that software discs will be sent out to me. The process has been quite difficult but I hope the HP discs will arrive soon. There is still uncertainty as to whether these discs will include WinDVD and Roxio DVD Creator, which are supplied on the Tela laptop images. As it happens Windows 7 Pro/Ent include DVD playback in Media Player and the new Windows DVD Maker software for authoring, so it is not so critical now in Windows 7. However I have determined that the 6730b discs did include Roxio DVD Creator and have copied this to my new installation so that I can get on with it.

So I am continuing to put the image together with the necessary drivers in MDT. Windows includes some of these, but one that will need to be injected is the network card driver, just as I had to inject it into the Windows PE image; otherwise the deployment task fails partway through when it needs the driver to reconnect to the network share that contains the task. There will be some other drivers and bits of software like on the 6730b laptops. We are just moving ahead to get the deployment finished and the laptops out to their users as quickly as possible.
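
Normally MDT looks after the boot image side of this itself: you import the NIC driver into the deployment share and then update the share, which regenerates the Lite Touch boot images with the driver included. Purely as a sketch, the manual equivalent with DISM (the deployment share path and the driver folder name here are made up) looks roughly like this:

rem Mount the Lite Touch boot image (index 1) to a working folder
dism /Mount-Wim /WimFile:D:\DeploymentShare\Boot\LiteTouchPE_x64.wim /Index:1 /MountDir:C:\Mount
rem Inject the extracted NIC driver package (the driver folder is a placeholder)
dism /Image:C:\Mount /Add-Driver /Driver:C:\Drivers\IntelNIC /Recurse
rem Commit the change and unmount
dism /Unmount-Wim /MountDir:C:\Mount /Commit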

One thing we are changing with Windows 7 on laptops is utilising Offline Files as a backup system for documents stored on laptops. Basically we change folder redirection policy to put Documents onto the server and then sync it onto the laptop for offline use. We also change the policy settings so that Pictures, Video and Music are stored directly on the laptop itself and are not part of the offline files sync. Offline Files has been around since XP, but was pretty poor then. It is better in 7, although some users are reporting issues in Technet Forums.

Today, what time I was able to give to work was spent getting the Probook 6550b-specific task sequence ready for a test run tomorrow. Basically there are three specific parts to the task sequence:

  1. Make the Windows 7 Enterprise x64 generic install image (operating system and software) which I described in previous steps. This becomes the basis of a custom install task sequence for deploying Windows to the target platform.
  2. Inject target platform-specific drivers
    1. Create a platform specific folder to store the drivers and import them to it
    2. Create a platform specific selection profile and include the above folder in it
    3. In the Postinstall group of the task sequence, following the generic “Inject Drivers” stage, insert a new custom “Inject Drivers” task and configure it to use the selection profile created above.
  3. Install target platform-specific applications
    1. Create a platform specific folder to store the application items
    2. Determine the means of automating each application’s installation
    3. Add the applications to the above folder along with their silent installation commands (see the sketch after this list). Configure each application item by checking the box “Hide this application in the Deployment Wizard” to avoid distracting the user who runs the automated installation task.
    4. Create a custom group for application installs in the State Restore group of the task sequence (suggested positioning is just before the generic “Install Applications” step).
    5. Insert an “Install Application” task into this group for each of the required applications. Configure each application install task by checking the “Continue on error” box.
    6. Determine how many reboots are needed and insert “Reboot computer” tasks between app install tasks as needed.
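
As for the silent installation commands themselves, these depend entirely on how each package is built; the lines below are only a sketch of the two common cases, and the file names are placeholders rather than the actual installers I used:

@echo off
rem An MSI-based package will usually take the standard Windows Installer switches
msiexec /i SomeApp.msi /qn /norestart
rem A setup.exe-based package needs whatever silent switch its vendor documents;
rem Firefox's installer, for example, accepts -ms
FirefoxSetup.exe -ms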

Now, err, that more delicate subject – the Canterbury earthquakes. There have probably been about 300 shocks in all since Saturday morning, but things took a bit of a new turn today when a number of shocks started to be centred in the Port Hills on the outskirts of Christchurch – or nearby, such as in Halswell or Quail Island. This explains why the Richter 5.1 quake at 7:42, widely cited as the most severe aftershock yet, centred in the Horotane Valley, could have such an impact with its far smaller magnitude than Saturday’s 7.1. I daresay that it has come as something of a rude shock for Cantabrians when things have seemed to be returning to normal. Civil Defence closed all schools in the region on Monday and Tuesday, then extended this to the whole week, then relented and left it up to individual Boards to decide if they wanted to open before next Monday – we will stay closed until then however.

The practical effect of the earthquake has been to throw a spotlight on our backup power systems. The power has been off twice, and each time a typical pattern of UPSs running down to flat batteries followed by servers going off has occurred. This brings its own challenges: on Saturday one of the UPSs failed to restart properly, so the servers could not automatically reboot as they kept losing power during POST in an endless cycle. Also, the ISA Server firewall service does not start automatically on a restart, which means it has to be started manually each time. The UPSs don’t seem to last very long, which suggests either inherently low battery capacity or batteries that have reached the end of their life.

Thursday 2 September 2010

Building Windows 7 Enterprise x64 Image [4], Thin Client, Home Computer

Firstly, the thin client. The T5720 does have support for TS Gateway but in actuality is unable to run it when flashed with the latest available XP Embedded image from HP. I am not going to delve too much into this, but suffice it to say that with thin clients you are tied to the vendor for operating system support, and if they can’t be bothered supporting a particular piece of functionality you are pretty well stuck. As I have not been able to get TSG functioning on the T5720 at present with our WS2008R2 RD Gateway server, I will do some more testing to see if there is a compatibility problem with R2 by setting up a 2008 TS Gateway inside our network. If this doesn’t work, this is just another thin client I can use for something, albeit a more expensive one (the total amount I have spent on it to date is about comparable with what T5720s generally go for secondhand, considering the extraordinarily low price I paid for the base unit in the first place). Whilst thin clients can generally run local apps quite well, the lack of space on the flash card (512 MB in this one) is a significant consideration, with less than 100 MB free at the moment.

The home computer is progressing with the arrival and installation of the motherboard. In a day or two I will order the CPU and RAM.

IMG_2500

The laptop image for Windows 7 Enterprise is still in progress as I need to get the software from HP. Whilst Toshiba gear is easy to get software for through the Laptop Company, which is a domestic retail outlet, the HP agents Axon are a different kettle of fish to deal with altogether, being more geared towards the enterprise market. It has previously been quite convoluted, with a big run-around to find out how to contact the right person in Axon to get the software, and at the moment what has been supplied is incomplete.

Tela have announced that in a couple of months they are switching over to Windows 7 on their laptops. Obviously we are still waiting to see the details. If what is shipped is x86 only then it will still be unsatisfactory, as all new laptops with the amount of memory now being supplied should be shipping with x64. As such I expect to have to continue imaging new laptops until such time as x64 becomes supported on the standard desktops.

Wednesday 1 September 2010

Building Windows 7 Enterprise x64 Image [3]

OK, so recapping. Getting on with Windows 7 x64 imaging: installing the feature apps, then another capture, then testing a full deployment. After that it will be customising a 6550b deployment task and testing the outcome on a 6550b.

The pre-release of Update 1 had problems, but the production release looks to be working satisfactorily, so I am migrating the previous (32-bit Pro) deployment share to it. Overall impressions of MDT are that it does what it says and that it is fairly straightforward for someone like me, who does not use it regularly, to pick it up again for a new deployment project at irregular intervals.

As MDT can do full captures it is probably the most straightforward means of backing up old laptops as well. I haven’t used it for this yet but will probably look at it when the old laptops are swapped over for these new ones we are setting up.

My second x64 capture, this time with the full apps, failed at first with an error message saying “there is not enough space on the disk”. I remember this situation happening before with MDT. Part of the capture stage is to create a reserved partition which, presumably, holds the Windows PE boot image; this is then used to reboot the captured system into Windows PE in order to carry out the actual capture. If this partition is already present from a previous capture then this error results. The solution is simply to go to Computer Management > Disk Management and delete the reserved partition. After applying this step the capture worked normally.
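
The same cleanup can also be done from the command line with diskpart if Disk Management isn’t to hand. This is only a sketch; the disk and partition numbers below are examples, so check them with the list commands first rather than copying them blindly:

rem Interactive diskpart session (or save the lines below "diskpart" into a
rem text file and run "diskpart /s cleanup.txt"); numbers are examples only
diskpart
list disk
select disk 0
list partition
rem select the small reserved partition left behind by the previous capture
select partition 2
rem "override" may be needed because diskpart protects some partition types
delete partition override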

The next step was to trial deployment to a target platform, in this case the same laptop I used to deploy the first x64 capture image. Once again this took about 40 minutes to deploy. The image is now a little over 6 GB which compares to about 2 GB for the base OS installation. I am unsure whether that really means 4 GB of stuff was installed or whether one is compressed and the other isn’t etc etc. The deployment was successful and everything on the laptop works about as expected.

This completes our task of building reference 64 bit images for the present. The next step is to create a platform specific task for deploying the Probook 6550b which I expect will be a similar process to that used to create the previous deployment for the 6730b laptops.

Tuesday 31 August 2010

Building Windows 7 Enterprise x64 Image [2] – Test Deployment Base OS & Core Apps

OK, start deployment capture using MDT 2010 Update 1 capture platform. Apply Windows PE capture step fails. Capture task eventually reports 8 errors. Logs are written to C:\MININT subdirectories and %temp%\SMSTSLog folder and %temp%\smsts.log.
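
For future reference, a quick way to sweep all of those logs off a machine onto a share for inspection (the destination server and share below are placeholders):

@echo off
rem Collect the Lite Touch / task sequence logs mentioned above onto a share
rem (\\server\logs$ is a placeholder - point this at whatever share you use)
set DEST=\\server\logs$\%COMPUTERNAME%
xcopy C:\MININT "%DEST%\MININT\" /s /i /y
xcopy "%temp%\SMSTSLog" "%DEST%\SMSTSLog\" /s /i /y
copy "%temp%\smsts.log" "%DEST%\" /y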

Try updating deployment share. Install 7Entx64 as OS in MDT. Update deployment share again to use new OS boot images. This time capture task is going OK. Can’t RDP into captured machine due to network disconnect during Sysprep. Continue with direct access. Capture completed successfully, 0 errors.

Test deployment to target platform (6550b laptop). Will try to deploy this image on MDT 2010 Update 1 instead of Gold. Received an error about networking driver not installed after starting the Deployment Wizard (The following networking device did not have a driver installed. PCI\VEN_8086&DEV_10EB&SUBSYS_1471103C&REV_05). Abort MDT2 deployment. Download Intel network drivers from HP, import drivers, update deployment share & burn new LiteTouch CD.

Retry deployment on 6550b. Wizard starts & connects OK. Wizard commences deployment @ 14:36. Windows 7 install sequence commenced. Deployment completed unattended around 15:11.

Still have a few devices not installed with this image so the drivers need to be injected in a custom deployment task sequence which is some way off yet. Next step is to install Feature Apps into the deployment image, recapture, test this then create the platform specific task for the Probook 6550b.

Building Windows 7 Enterprise x64 Image / Home PC

We have started deploying new laptops (HP Probook 6550b) with 4 GB RAM, so x64 Windows 7 is the preferred OS due to the roughly 3.5 GB usable memory limit of 32-bit Windows. The previous image, covered in the series of articles starting here, was 32-bit Pro generic and specific; now building 64-bit Ent generic / specific. I expect most desktops from here on in to have a 64-bit OS, eventually standardising on this except for special cases.

Basic deployment steps:

  • Deploy OS to target platform
  • Deploy “core apps” (Office, SEP, SMS, Adobe, Smartboard)
  • Capture generic x64 OS + core apps image
  • Deploy “feature apps” (e.g. Firefox, Google Earth etc)
  • Capture generic x64 OS + core apps + feature apps image
  • Either one of the following:
    • Deploy to specific platform for image capture & deployment
      • Deploy this image to specific hardware platform (Probook 6550b for example)
      • Make any specific hardware customisations
      • Capture platform specific image
      • Test image deployment
    • Create platform specific task sequence for deployment
      • Create specific task sequence
      • Inject platform specific drivers (in addition to Windows driver injection)
      • Install platform specific applications
      • Test task sequence deployment

The above is a summary of my experience to date in the MDT system as well as some possible options for future exploration. Obviously MDT is a specialised system that I need to have some knowledge of, but I am not an expert in this system as I do not use it every day. For deployment to specific platforms to date (only one so far) I have preferred to use a customised task and this will probably be the means implemented for 64 bit deployment. Also I have two MDT environments, one used for capture and the other for deployment.

Building new home PC continues with purchase of motherboard. Next month will buy the CPU and RAM hoping to complete assembly in a few weeks. Delivery of memory for T5720 thin client expected shortly so can test capabilities soon.

Monday 30 August 2010

LCD Form Factor Trend Hype

New laptops being shipped today have a native resolution of 1366x768. Previously it was 1280x800. Before that it was 1024x768.

Considering I can get 1280x1024 or 1440x900 on the desktop, the trend to increase only the width of the screen while keeping the height roughly the same seems backward. Application and OS design favours horizontal toolbars which increase in size over time, for example the Office 2007 Ribbon; this presumes that screen height will increase over time. However the emphasis of LCD manufacturers of late in base models has been geared towards increasing the width, leaving the height essentially unchanged. The screen starts to look very cluttered when you consider that the height of these screens is essentially unchanged since the days of 17” glass, and that’s probably going back more than 10 years. Recently I was working on an HP 8510 laptop with a native resolution of 1680x1050, a huge difference in resolution and one that makes the screen sizes of mid-range business laptops look positively antiquated.

Sunday 29 August 2010

HP T5720 Thin Client

A2000_20100823_001

This is my latest acquisition from Trademe, and a very good one at that – the T5720 is almost a current model (not quite) and this particular one was made only four years ago. This one, which cost me $83, has only 256 MB of RAM, so I have ordered some more for it in order to be able to flash it with the 2008 update of XP Embedded. This will give it RDC 6.1 capabilities and therefore I should be able to take it home and connect over my broadband to the school system via our RD Gateway.

Our evaluation of lower end TCs continues and it is likely we will have more in classrooms by the end of the year (we have just one at the moment). Presently, while there is a reasonably high volume being offered on Trademe, some of the prices being asked are more than I would expect to pay. However one of the major vendors (Core Technology Brokers) has told me they would discount to schools; this company also offers warranties, so I would be inclined to choose them over ad hoc dealings with one-off sellers and small players, which are sometimes inexperienced in this line of product. They would also have the technical knowledge to answer the various questions I have had up to now. Based on my experience to date I would recommend the T5300, T5510, T5520 and T5700 models as those which have sufficient capability to connect to a 2008 RDP server inside a network. So far I have only used the T5510 and T5520 models. Both of these run Windows CE, which is very suitable for RDP use because it has just enough capability and you don’t need to muck around too much to set it up. Here is a comparison table of some of the key specs of these different models of HP clients. I am preferring to standardise on HP thin clients for now, even though there are various other brands.

 

Spec | T5300 | T5510 | T5520 | T5700
CPU | TM5600 533 MHz | Crusoe 800 MHz | Eden 800 MHz | TM5800 up to 1 GHz
Flash ROM | 32 MB | 32 MB | 64 MB | up to 256 MB
RAM | 64 MB | 64 MB / 128 MB | 128 MB | 256 MB
Graphics | Rage XC 8 MB | Radeon 7000 16 MB | S3 | Rage XC 8 MB
Display modes | 640x480, 800x600, 1024x768, 1280x1024 at 32 bit; 1600x1200 at 16 bit | 640x480, 800x600, 1024x768, 1152x864, 1280x1024, 1600x1200 at 32 bit | 640x480, 800x600, 1024x768, 1280x1024 at 32 bit | 640x480, 800x600, 1024x768, 1280x1024 at 32 bit; 1600x1200 at 16 bit
Printer port | DB25 | DB25 | DB25 | DB25
Serial port | No | DB9 | DB9 | DB9
Display port | HD15 VGA | HD15 VGA | HD15 VGA | HD15 VGA
USB ports | 1.1 x4 | 1.1 x4 | 2.0 x4 | 1.1 x4
Network port | 100 Mbps RJ45 | 100 Mbps RJ45 | 100 Mbps RJ45 | 100 Mbps RJ45
Audio | Internal speaker, in/out ports | Internal speaker, in/out ports | Internal speaker, in/out ports | Internal speaker, in/out ports
Keyboard port | USB only | PS/2 or USB * | PS/2 or USB * | USB only
Mouse port | USB only | PS/2 or USB * | PS/2 or USB * | USB only
OS | WinCE 4.22.144 | WinCE 4.22.144 | WinCE 5.04.595 | WinXPe 5.1.212
RDP version | 5.1 | 5.2 | 5.5 | 5.2

* Note: T5510, T5520 have only one PS/2 port, for either a keyboard or mouse but not both. Unsure if port splitter can be used.

There are many other models but I have chosen four that can be priced somewhere around $100-130. We have screens that are either 1024x768 or 1280x1024, meaning any of these models would suit. Personally my preference from all of the above would be the T5520, which tends to be at the higher end of current pricing, being the newest model. However I would be just as happy with one of the other three models for our typical classroom situation (subject to testing). As you can see, the main difference between the 5300 and 5700 is the OS. For RDP support Windows CE is perfectly satisfactory (make sure you have the latest version on your platform; I had to flash the upgrade to the 5520 I had bought to fix problems with display redrawing). Note that all of these only support 4:3 native display ratios. If you are buying new screens make sure they are specced for the above list of resolutions. 15” screens (1024x768) are just about unobtainable new now, but there are still plenty of 17” 1280x1024 screens available new, and both sizes will be available second hand for years yet.