Wednesday 23 December 2009

Energizer Hard Case 2 Cell AA Swivelhead Torch PROSW2A


Since the advent of white LED technology several years ago, I have waited for torch manufacturers to come to grips with the advantages of LEDs, among them high efficiency and long life. Eveready-Energizer held out for a long time with their existing bulb-based products, so it is a pleasant surprise to see that at long last they have grasped the LED nettle and started to engineer products which use the LED to its fullest advantage. Although LED technology is becoming very widespread in torches of many different types, most of them use LEDs of fairly average brightness, or have no reflectors at all. Regardless of the light intensity of the emitter, the beam drops off fairly rapidly without a properly engineered reflector. As battery life is always a consideration for torches, it stands to reason that making the maximum use of the beam by fitting a proper reflector is an advantage worth having.

Energizer have developed a line of high quality, high durability torches called “Hard Case” which start at 2 AA and range up to 2 D size. Practically all of the smaller models now use LEDs, but a bulb is still used in the largest model. While I am still waiting for Energizer to give us a super bright LED 2 D torch with very long battery life, I’m happy to buy the 2 AA swivelhead which is the subject of this review. With a claimed light output of 75 lumens and a 5.5 hour run time (presumably on Energizer alkalines), this product fits the bill for a strong bright light in a package that will survive the rigours of working conditions and will fit in a pocket. At around $60, though, it’s not cheap.

The Hard Case’s head normally rests at right angles to the body but can be rotated up or down through an arc of just over 90 degrees. The rotating mount is made of 5 mm thick metal and, like everything else about this torch, it has a solid, rugged feel. The torch is claimed to withstand a 6 metre drop, the lens is claimed to be shatterproof, and the torch has gaskets fitted at joins to keep water out. However, no claims are made about submersibility. As the torch is obviously designed to resist water, I would expect it to survive being dropped into a bucket or some other container filled with liquid, but Energizer has not given any guidelines about this. One of the advantages of LEDs is that the emitter can be permanently sealed into the lamp: with a life of 100,000 hours and immunity to burnout and vibration shock, it never needs changing.

Now that I’ve mentioned the emitter, this torch has not one but in fact four of them, controlled by a pair of membrane-sealed push button switches. The upper switch alternates between two red (“night vision”) LEDs and one green (“pipeline inspection”) LED, all of which are, by modern standards, of relatively ho-hum brightness. The lower switch is the one you want to be more careful with. One push illuminates the main emitter’s fantastically bright 75 lumen beam, while a second push switches it to a lower power level. I would guess the emitter is probably about a 3 watt unit, and as everyone hopefully knows, on no account should you ever look into the light, as it has the capacity to permanently damage your eyesight. With the advantage of a properly designed reflector in its favour, the swivelhead’s full power output easily outperforms a cheaper 2 D Energizer with four “ordinary” LEDs and no reflector. The emitter is bright, but the reflector is what makes the beam blinding, and thus it outperforms anything else in its class by a wide margin. Simply put, as a torch it packs the best performance I have ever seen from 2 AAs, and I am now waiting for that 2 D high power LED Hard Case, or perhaps an LED Dolphin would be nice (maybe I can get one of those drop-in LED bulbs for my customised Dolphin, which has been upgraded with a gel cell).
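As a rough sanity check on those numbers (the cell capacity and nominal voltage below are typical AA alkaline figures I have assumed, not Energizer specifications):

```python
# Back-of-envelope check of the claimed 5.5 h run time against the
# "3 watt emitter" guess. Cell capacity and voltage are assumptions
# (typical AA alkaline figures), not manufacturer data.

CELLS = 2
NOMINAL_VOLTS = 1.5      # per AA alkaline cell (assumed)
CAPACITY_AH = 2.5        # typical AA alkaline capacity (assumed)
CLAIMED_HOURS = 5.5      # Energizer's claimed run time

energy_wh = CELLS * NOMINAL_VOLTS * CAPACITY_AH      # ~7.5 Wh stored
avg_power_w = energy_wh / CLAIMED_HOURS              # average drain

print(f"Stored energy: {energy_wh:.1f} Wh")
print(f"Average power over {CLAIMED_HOURS} h: {avg_power_w:.2f} W")
```

On these assumed figures the average drain works out well under 2 W, which suggests a 3 watt-class emitter would not be driven at its full rating for the whole claimed run time.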

FOOTNOTE: Since I made my original posting, a friend noted that the colours output by this torch are the same ones used in hand signals for trains – red, green and white. I have thought about this a little, and in theory you could use this torch for this purpose, but in practice it could be too slow and awkward to change the colours in situations where a rapid change is needed – for example, changing quickly from a Go to a Stop signal (green to red). When I was involved some years ago in a heritage railway, it was always a point of interest that LED technology was developing to the point where all three colours could be displayed in the same device. Such a device would probably have to be custom built due to the very specific requirements for the three colours used; there is virtually nothing readily available and cheap, and I expect the commercial devices produced for this purpose cost hundreds of dollars. This particular model of torch may be usable in situations like these, and it is a well made, rugged device for outdoor use in all conditions, especially where it may be dropped.

FOOTNOTE 2: Energizer do have a 2 D LED model available, the PRO2D1, though I have yet to see it in NZ. It has a Cree 3 watt emitter. Although its output of 65 lumens is actually lower than the PROSW2A’s (about the same as a standard Dolphin torch), the beam has double the useful distance, probably due to a larger reflector, and its runtime on alkalines is a very useful 15 hours. In general this model has similar overall performance to the standard Dolphin but is much more solidly made, and on 2 D cells it is cheaper to run than the 4 Ds or 6V lantern battery of the Dolphin. Also of interest is the TUFRC1, a bigger unit in a handheld spotlight form factor; it too uses a 3 watt LED, has about the same range and output as the PRO2D1, and is rechargeable. I think this model is available in NZ but is getting fairly pricey. Top of the range is the specially engineered high power Dolphin, the 108MK6R, with a special halogen bulb, which can run for about 1 hour on a charge and pumps out a massive 210 lumens; but it would be very expensive if available, and the short run time puts it in the same league of impracticality as the numerous handheld halogen spotlights already in the marketplace. As is typical with such lights, the 6 V 10 W halogen bulbs are hard to come by – in fact, from experience, 6V halogen bulbs for handheld spots are almost impossible to buy in NZ. So the TUFRC1 looks like the best option for replacing any of my large and bulky cordless spots, even if the beam won’t go as far. One day I hope a cordless spot engineered with LEDs will become an affordable and practical proposition at a similar power output to the 55/100W 12V models commonly available.

Tuesday 8 December 2009

Vegas Movie Studio shines where Premiere Elements fails (Update)

Well, DVD Architect did a superb job of the DVD. It was incredibly simple to send the completed project from VMS to it in order to put the DVD together, especially with the automation of creating a menu from the markers I had previously put into the project in Vegas. It came together very quickly and just needed a little tweaking to get things exactly as I wanted them. And still no crashes at all. What didn’t work out so well was my idea that I could burn three things at once. I am now going to set up a couple of old desktops with a burner each to enable three simultaneous burns; it is still cheaper than upgrading to the full version of Nero, which supposedly can do simultaneous burning. It seems from the testing I have done that ATAPI cannot handle more than one burn process running at a time – it just causes too many hiccups. Lightscribe now has competition from Labelflash, which at this time is not yet mainstream enough in NZ despite support from some big-name vendors. For example, Sony has models that do it, yet even the Sony Shop does not sell them.

UPDATE: Still no crashes in Vegas/DVD Architect yet :) And I learned how to trim a clip and how to fade it out at the end. Mainly I wrote this because another solution has come up for the multi Lightscribe burn debacle. Although Nero doesn’t have a multi Lightscribe option, another vendor, Acoustica, does. Their product is called CD/DVD Label Maker and I plan on testing it out very soon. So I will have my main computer with three Lightscribe drives installed to burn all three at once, and some old desktop heap with just an ordinary burner in it to burn the video part of the DVD. I had already heard of Acoustica when we had a look at their Mixcraft audio mixing software for possible school use. Although we haven’t yet determined a requirement for that level of mixing, I was very impressed with Mixcraft’s capabilities, and like most of Acoustica’s product range, the price is very cheap for such a good software package.

Monday 7 December 2009

Vegas Movie Studio shines where Premiere Elements fails

Last month I wrote about my experiences with Premiere Elements 4 and how useless a piece of software it has turned out to be. At the time I was struggling to find another piece of software with similar functionality, until I discovered there is a cut down version of Sony Vegas, Vegas Movie Studio, currently in v9. The initial testing with a demo version seemed straightforward enough, but serious editing had to wait until I could purchase a license. Our local Sonystyle shop was very helpful in getting a copy sent down from Auckland overnight, so I will forgive them for having relocated to the odious Westfield Riccarton mall. Having managed to get out of there still sane, I installed Vegas Movie Studio along with DVD Architect onto my optimised Vista box – basically the Vista side of this dual boot PC, with a minimal installation of applications. On the hardware side it has 2 GB of RAM, an extra HDD just for video, two Lightscribe DVD writers and a third writer. I’m looking forward to testing whether it can burn a DVD and two LS labels all at the same time.

Now, the contrast with Premiere Elements could not have been more extreme: Vegas installed very smoothly, and I got a project started, worked through its steps and output the result all within a couple of hours. I have yet to master the DVD side, which requires the separate Architect package that I haven’t used yet, and Vegas does have its own quirks and learning curve, but it is a huge improvement – if only for one thing, the fact that it can read the MPEG files from the camera almost natively. But VMS turns out to be a whole lot more: a whole lot more stable, a whole lot more reliable, a whole lot quicker, and so on. One tip is that you really need to update to release 9.0b, because the version on the install DVD, 9.0a, cannot open an AC3 audio stream on MPEG files that use this audio format, so your production will have no soundtrack. Once I got the upgrade installed, the audio was recognised without any hiccups and everything went very smoothly. With cameras that use their own “special” MPEG format (like JVC Everios with their MOD files) you may find it useful to download a utility from the Internet that changes the file format slightly from MOD to a proper MPEG. I did make use of this tool (SDCopy) in the process of trying to work out why there was no sound in the clips, but the real issue turned out to be the Vegas version, fixed in v9.0b for MPEG AC3 sound. So all in all it looks very much like Vegas is massively better value for money than Premiere Elements.
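For the curious, the MOD-to-MPEG step is essentially a copy under a different extension, since the Everio MOD container is ordinary MPEG-2. Here is a minimal sketch of that idea in Python – the function name and paths are my own illustration, not how SDCopy is documented to work:

```python
# Hypothetical sketch: copy camera .MOD files to .MPG so that video
# editors which key off the file extension will open them. Assumes the
# MOD container really is a plain MPEG-2 program stream (true for JVC
# Everio cameras), so no re-encoding is needed.
import shutil
from pathlib import Path

def copy_mod_as_mpg(src_dir: str, dst_dir: str) -> list[str]:
    """Copy every .MOD file in src_dir to dst_dir with a .MPG suffix."""
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    renamed = []
    for mod in sorted(Path(src_dir).glob("*.MOD")):
        target = dst / (mod.stem + ".MPG")
        shutil.copy2(mod, target)   # copy2 preserves timestamps
        renamed.append(target.name)
    return renamed
```

A real tool also fixes up the aspect-ratio flag in the MPEG header for widescreen footage, which a bare rename or copy does not do.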

And so to DVD Architect – a simple matter of pressing a button at the end of the VMS render, with all the scene markers already created and saved – then it turned out to be a very simple process to set up menus – and unlike Premiere Elements, still not one single crash. I am just getting ready to burn a sample DVD to take home with me for testing out, then there’s just a little bit of work tomorrow to get the Lightscribe label set up and burn a master for the school to sell to people who want it.

Saturday 7 November 2009

Dual boot Windows 7 and Windows Vista

As part of the march onward towards Windows 7, I turned my Windows 7 RC box into a dual boot system with Vista as the other operating system. Normally you’d install Vista first and then 7, but I already had 7 on the box and didn’t want to reinstall it. So I installed Vista on a separate HDD and then ran EasyBCD in 7 to create another boot entry. Interestingly enough, though Vista is on D drive from 7’s POV, when Vista comes up its boot drive is C as far as it’s concerned. Overall converting this to a dual boot system was far less difficult than trying to make my other Vista box into dual boot Vista/XP which as we know was rather traumatic and time consuming. The plan overall is that where I currently have two physical machines at my desk, in future I will only have one.

One of the reasons for setting up Vista on this newer box was to try to get a more optimal setup for Premiere Elements – which I have persevered with this week to get my video edited – and which, surprisingly enough, I have managed to cope with all that time despite many, many crashes. When it came to burning the DVD, though, there was no way it could burn one with a menu on it. Another very glaring issue was that it simply could not cope with a different disk path from the original location it and the files were saved in. After struggling with the silly “Media offline” message I gave up and moved the files onto a drive that I had mapped to the same drive letter as the original. Then it worked properly again. My overall impression has unfortunately not changed much, especially after finding that others had similar experiences. Adobe claims this software won some sort of award. I just don’t understand how they can have any real credibility with such a flawed product – why don’t they make a bit more effort to get it to work properly? Anyway, enough of that.

Once the release version of 7 reaches us, I’ll be putting that onto this machine and rearranging its disks a bit. It currently has four disks totalling 1 TB, but three disks of around 750 GB should be enough, including the two boot disks of 160 GB each. I’ve also spent a bit of time putting together a backup desktop on a Hyper-V virtual machine. The idea is that when the PC is booted into Vista, I can log onto this Hyper-V desktop (running 32 bit Vista) and access all my email in Outlook seamlessly. I can also work on stuff that won’t run properly on 7, for example our SMS has ODBC drivers that are only 32 bit and won’t install on 7 x64. The backup desktop will be useful in a variety of situations and again it emphasises the versatility of virtualisation.

Monday 2 November 2009

World’s most disappointing video editing software… Adobe Premiere Elements

Where do I start? I had prior experience of the full edition of Adobe Premiere, version 5.x. It impressed as a dependable, powerful software package on the PC, considering that Premiere was originally developed for the Mac. Premiere 5 was stable; it did a lot of good things and it did them well. So I expected a lot from Premiere Elements 4 when we purchased it here at school – just one copy, for my computer, so that I can author DVDs from footage shot of school events and occasionally a personal video as well. Unfortunately the experience with Premiere Elements 4 has proved extremely disappointing. I hate having to write a negative review of any product, but the cumulative problems add up to a lot of heartache. Based on previous experience with the basic DVD authoring package supplied with older versions of Nero, I expected that it would be possible to put together a reasonable DVD with this package. The experience has been so disappointing compared with those earlier ones that I cannot recommend Premiere Elements to anyone. A large part of that is a perception that Adobe have invested inadequate resources in this package, in its design and in supporting it.

Here is briefly a list of the major problems I experienced with Premiere Elements 4:

  • XP install is almost complete when the install suddenly self cancels and rolls back
  • When opening a project in XP it crashes with an error in <some file name.cpp>

So from these two errors, it’s a waste of time trying to make it work on Windows XP. If you can get it to install on XP you are doing very well indeed. The second issue is a glaring example of how not to write software. In programmers’ jargon, this is called an “unhandled exception”. Instead of getting a meaningful error message, you get a cryptic message about a line number in a file, which means nothing to most people.
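To illustrate the jargon, here is a hedged Python sketch (the file name is made up) of the difference between letting an exception escape – which is what produces that cryptic file-and-line-number crash – and handling it with a message the user can act on:

```python
# Illustration of an unhandled vs handled exception. An unhandled
# exception surfaces internal file names and line numbers to the user,
# much like the <some file name.cpp> crash message described above.

def open_project(path: str) -> str:
    with open(path) as f:   # raises FileNotFoundError if path is wrong
        return f.read()

# Handled: the user sees a meaningful message instead of a raw traceback.
try:
    open_project("missing_project.prel")
except FileNotFoundError:
    print("Could not open the project file. Check that it still exists.")
```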

  • Message about “running low on system memory. Please save and proceed with caution”.

Like, what does this mean? I am using a 500 GB HDD with over 200 GB free. In fact the PC has three HDDs, the other two have 50 GB each free. The PC has 2 GB of RAM and doesn’t have any antivirus package, and no other software running. So this is another example of a meaningless, useless error message.

  • Cannot change the menu structure

When creating a disc menu, I want my menu structure to have (I think) four menu items in the main menu, and no secondary menu. But this just can’t be done. The only menus you can use are the built in templates, which have a fixed structure that you can’t deviate from. If you try to insert too many main menu items, Premiere Elements completely ignores the fact that you obviously want them all to be on the main menu, and creates a Scenes submenu and puts your items onto that. I don’t want that at all, but the default behaviour can’t be overridden in any way.

OK, that is a short list, but it sums up a lot. You will run into numerous frustrations in this package; it will crash a lot without warning. I try to change to a different menu and it crashes; I try to get ready to burn a DVD and it crashes; or it crashes halfway through burning, as it did once. When you go to Adobe support, there are some user forums and that’s about it. Nothing can really disguise the fact that this is one flaky piece of software. I will never recommend Premiere to anyone, ever again. At the moment on this project I am up to about the 12th attempt to get even the simplest option put together – just all the clips one after the other, no menus or anything fancy, because you just can’t do it. As I say, this is about the 12th attempt, because on every other attempt the program has crashed and lost all the work I had done so far, in spite of pretending to save every few minutes.

You can add in a lot more issues, such as the fact that it won’t work with any source format except AVI, which means that files in other formats have to be converted to AVI first – hours more work if you have a lot of video files. When it works, it works well, but more often than not I find Premiere Elements finicky and temperamental. The other day I tried to save something to MPEG, as it has a file output option for that format, and somewhat predictably, it didn’t output anything. It just sat there and pretended to do something.

At the times when Premiere Elements 4 was working properly, I found it easy and convenient to use. I have not attempted to use some of the more advanced features, of which there are a lot buried in different layers of the menuing. The ever-present threat of a crash, however, overshadowed my entire editing experience and created an unpleasant overtone that is difficult to disperse.

I find it hard to understand how Premiere Elements 4 got such good reviews when it was released, or how it can still be considered a satisfactory product for this type of application. Given my experience I would definitely hesitate to recommend this product to anyone in the future, particularly as there are a lot of other packages in its price range, from Ulead and others, that are far less temperamental and can handle different file formats with greater ease.

Saturday 31 October 2009

Back to XP at home

Back in March, seven months ago, I wrote that I was putting Vista onto my home PC. Today I reinstalled XP instead. The main reason for doing this is that there is no reasonable upgrade path to Windows 7 for these old Intel 915 boards, and that is because there is no Vista driver produced by Intel for them. The result is they can only use the built in display driver on Windows 7, and that one won’t work with DirectX. Now it is true I could put a new graphics card into this box and get Windows 7 drivers on it, but it is still going to be an old slow heap. So I decided it will have XP on it until I upgrade. XP went on today after the usual major prep work, made a lot easier by having three HDDs, and I’m already enjoying the superior speed and stability.

Windows 7 RC at work, incidentally, seems to have a major stability problem whenever I try to open the Documents folder (Library) onto the old My Documents area: instead of showing all my files it just hangs. After you crash out of the app and try again, it works the second time. Roll on SP1.

Monday 19 October 2009

Getting There: Remote Access with ISA 2006, EX2007 etc etc

Our remote access setup through ISA 2006 and EX 2007 is getting closer. For the uninitiated, there are many steps that have to be completed to commission an ISA server and secure an Exchange server for web / remote access.

We determined first of all that, once ISA was more or less ready to go, it could be put in parallel with the direct connection through the hardware firewall. (The ISA server is in a back-end configuration, meaning it is inside a hardware firewall.) With a rule set up to give maximum access through the ISA firewall – for now – we started testing the web proxy and installing the ISA Firewall Client. The most major issue found to date is a conflict with AB Tutor Control v6’s client application, which had to be uninstalled from a group of PCs that we had hoped to administer remotely with this software. Maybe later on I will try testing it on a non-production machine. My Windows 7 box played up a bit the first time the FW client was installed as well, but a restart fixed this and so far there haven’t been further problems. As we have one Mac and the occasional non-domain Windows PC on the network, they will have to use the Web Proxy to authenticate when we force user authentication. When I first tried that I realised that people without the FW client installed would be locked out, so at that stage we programmed mass installation of the FW client through a GPO. DHCP was also set to provide the default WPAD URL, having previously been configured for a 4-hour lease time to effect the changeover as quickly as possible. User authentication is essential if you want to have usernames logged against every access; the main challenge is that the Web Proxy can only work for the protocols that support proxying on the computer in question.
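For anyone unfamiliar with WPAD: the URL that DHCP hands out points clients at a PAC file. A minimal sketch of generating such a wpad.dat in Python – the host name, port and subnet are illustrative placeholders, not our actual configuration:

```python
# Hypothetical sketch: emit a minimal wpad.dat (PAC) file that sends
# internal traffic direct and everything else via the ISA web proxy.
# Host names, port and subnet below are illustrative placeholders.

PAC_TEMPLATE = """function FindProxyForURL(url, host) {{
    // Keep traffic to internal hosts off the proxy
    if (isPlainHostName(host) ||
        isInNet(dnsResolve(host), "{subnet}", "{mask}"))
        return "DIRECT";
    // Everything else goes through the ISA web proxy
    return "PROXY {proxy_host}:{proxy_port}";
}}"""

def make_wpad(proxy_host: str, proxy_port: int,
              subnet: str = "10.0.0.0", mask: str = "255.0.0.0") -> str:
    """Return the text of a wpad.dat file for the given proxy."""
    return PAC_TEMPLATE.format(subnet=subnet, mask=mask,
                               proxy_host=proxy_host, proxy_port=proxy_port)

print(make_wpad("isa.school.local", 8080))
```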

At this time we will leave ISA running as is for a week while some other steps are completed. After purchasing the certificate from GoDaddy.com, I had to install it, which fortunately they provide instructions for. After I changed the IIS server’s https binding to use my new certificate, Outlook 2007 kept popping up a security warning saying “The name of the security certificate is invalid or does not match the name of the site”. I found a solution here, which involves configuring Exchange to recognise the new certificate; after that, all of the warnings went away. The next step is to put rules into ISA for the main services/ports that are needed – such as IMAP, IMAPS, SMTPS, POP3S etc. Then a proper rule for web browsing (HTTP/HTTPS) will be put in. The main timing issue is simply waiting to see what problems show up after each step. You may ask why we need two firewalls. The front firewall (hardware) is free and facilitates our web filtering service, while the ISA firewall facilitates easy configuration and publication of Microsoft services such as TSGS and OWA / OA. It also gives us full logging of internet access and the option to add internet quota management software at a later date. And finally, it is two layers of defence against the outside world. I am a bit paranoid about this, but I think it pays off, because in a smaller school with limited support resources, we need strong defences to cover for the fact that our primary focus isn’t security. Locking this thing down to the max is a better defence strategy, and it also gives us monitoring capabilities that simple hardware firewalls can’t provide (although the front firewall has Netflow, which we are monitoring with a free application, it can’t authenticate users the way ISA can).

So in a way I am glad we missed our planned deployment deadline… because it was unrealistic.

Wednesday 30 September 2009

The Mess of Managing Printers through Group Policy

This is a subject I have written about numerous times before. Here are the previous articles:

Here is a brief history of managed printer deployment options in Windows:

  • Prior to Windows Server 2003 R2, administrators mass deployed printers to workstations using logon scripts (VBScript etc).
  • WS2003 R2 introduced Print Management via Group Policy (PMCSnap). Using the Deploy Printers extensions to GP, and a client executable called PushPrinterConnections.exe (PPC), printers can now be specified in Group Policy and pushed out to XP and later Windows clients. This is supposed to work for both per-user and per-computer printers identically. In practice we have only made per-computer work reliably and find that the old printer connections are not always removed when the GPO is removed from the OU.
  • WS2008/Vista introduced printer management via Group Policy Preference extensions. This works a little differently from Deploy Printers. Network shared printers can only be specified per-user (rather than per-computer) and on Vista and later clients, printer drivers are not automatically installed as they are on XP. Both XP and Vista require a Client Side Extension to be installed (distributed as a KB update) to process GPP settings. One nice little feature of GPP is to set a default printer. I am somewhat of the opinion that a mixture of PMCSnap and GPP might overcome the various issues, where a default printer absolutely must be settable.

So… we started using PMCSnap when we got our first 2003 R2 server. Then we started using GPP when we got our first Vista workstation. Now we have gone back to PMCSnap for post-XP clients so that their drivers are installed. We were only able to make PMCSnap work per-computer, and GPP is only really practical per-user in its current form. To add further to this mish-mash, I decided to switch a select group of staff PCs running XP back to PMCSnap. Here I ran into yet another problem: different versions of PPC (PushPrinterConnections.exe). This is a client executable you deploy via a startup or login script to process the Deploy Printers GPO settings; it is only required on XP or below. The first time I tried it, I was using a Windows Server 2008 box to run GPMC. No problem, I thought – grab PPC from C:\Windows\System32, just like the documentation says. Version 6.0.something. But it didn’t work: printers weren’t pushed. I tried the -log parameter. GPResult told me the policy was applied, but there was no log file, so PPC hadn’t run. Strange. After a lot of testing I decided to grab the WS2003 version of PPC from the R2 server that we still have (C:\Windows\PMCSnap). Version 5.2.something. Well, to cut a long story short, it works. Just like that.
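The underlying rule is simple enough to capture in a few lines. Here is a sketch (in Python purely for illustration – the real deployment is a startup script) of which clients actually need PPC run, on the basis that Vista and later process Deploy Printers settings natively:

```python
# Sketch of the startup-script decision: Windows Vista (NT 6.0) and
# later process Deploy Printers GPO settings natively, while XP and
# Server 2003 (NT 5.x) need PushPrinterConnections.exe run from a
# startup or logon script.

def needs_ppc(nt_major: int, nt_minor: int) -> bool:
    """True if this Windows version needs PushPrinterConnections.exe."""
    return (nt_major, nt_minor) < (6, 0)

# XP is NT 5.1, Server 2003/R2 is NT 5.2, Vista is NT 6.0, 7 is NT 6.1
for name, ver in [("XP", (5, 1)), ("2003 R2", (5, 2)),
                  ("Vista", (6, 0)), ("7", (6, 1))]:
    print(f"Windows {name}: run PPC = {needs_ppc(*ver)}")
```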

Tuesday 29 September 2009

SMS Integris (Omnis) Compatibility on Windows Vista and Windows 7

My previous articles on this subject are published here and here. Our site has experienced considerable difficulties in making School Management Systems’ Integris 6.90.xx function successfully on Windows Vista, even though the vendor does not have a history of problems with it. The majority of difficulties to date are on 32 bit Vista systems; we do not have a 64 bit edition of Vista for testing. The Integris software is widely used by primary and secondary schools in the UK and Australia, as well as in New Zealand.

On Windows 7, 32 bit and 64 bit Hyper-V virtual machines, as well as physical 64-bit installations, have been used for testing. So far all problems have been experienced only in virtual machines; no difficulties have been found to date on the physical computers running the Windows 7 Release Candidate, all of which are 64-bit. Our 64-bit VM had Windows 2000 compatibility mode set, but no compatibility settings have been needed on any physical x64 PC, and after the compatibility settings were removed on the x64 VM, no further problems were experienced. It is not necessary to run this application as an administrator on physical x64 PCs, although from our experience it is recommended on Vista. Likewise there were no problems initially setting up Vista on a physical x86 computer, which was my own workstation; it was later x86 computers that had problems on Vista. The inconsistency of the problems across different machines leads me to suspect that an update or service pack, or a particular application that a hardware vendor may be supplying, has caused them. The Hyper-V server has just been updated to WS2008 Service Pack 2.

The vendor has highlighted that the Omnis runtime 3.3.3.x is not certified by Tiger Logic for Windows 7. It would therefore be inadvisable for any school sysadmin to roll out Integris on Windows 7 site wide until RM-SMS have updated the runtime to a Windows 7 certified edition. There are two possible workarounds for sites that wish to push ahead with a Windows 7 rollout:

  • Using Windows XP Mode to run Integris. This has to be set up on each client machine, and it requires that the CPU supports Intel VT.
  • Using a Terminal Server to run the Integris application for end users. This does not require individual configuration of clients, nor does it require clients to support Intel VT. We will be using this option at our site to allow for remote access to Integris, with the secondary benefit of resolving the compatibility issues. We are assuming the 3.3.3.x runtime is Windows Server 2008 compatible, as this is the environment that hosts our terminal server.

In the main, while XP Mode is a nice idea, the virtual machine has to be set up on each computer that it runs on. Moreover, it requires a compatible CPU, as the Windows 7 version of Virtual PC requires hardware virtualisation support. While AMD support hardware virtualisation on virtually all of their CPUs, lower end Intel models typically omit VT. I am in the galling situation that our recent desktop purchases do not support VT, because we did not know about this feature and its significance for future desktop OSs. I think that changing CPUs to get VT support is not worth the hassle for most of the PCs at our site which lack it, compared to the TS option, even though the latter requires CALs at additional cost. Those CALs do double duty by enabling remote access, so the cost is not wasted on physical machine resources that are irrelevant to remote access.
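The check itself comes down to two CPUID feature flags. A hypothetical helper (the function name and flag strings are my own, following the /proc/cpuinfo convention; on Windows you would read the same CPUID bits with a vendor utility):

```python
# Hypothetical helper: decide whether a CPU supports hardware
# virtualisation from its reported feature flags ("vmx" is Intel VT-x,
# "svm" is AMD-V). Flag strings follow the /proc/cpuinfo convention.

def supports_hw_virt(cpu_flags: str) -> bool:
    """True if the flag list contains the Intel VT-x or AMD-V flag."""
    flags = set(cpu_flags.split())
    return bool(flags & {"vmx", "svm"})

print(supports_hw_virt("fpu pae mmx sse2 vmx est tm2"))   # Intel with VT
print(supports_hw_virt("fpu pae mmx sse2 sse3 ssse3"))    # low-end, no VT
```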

Wednesday 9 September 2009

Ministry of Education renews Microsoft Schools Agreement for 2010-2012

The NZ Ministry of Education has renewed the Microsoft Schools Agreement for New Zealand schools for 2010-2012. Whilst I have yet to see the agreement, it continues the trend of these agreements and will provide welcome continuity for school administrators and IT staff. The new agreement provides effective transitions for most existing software packages, whilst also adding Windows 7 Enterprise Edition as a new operating system choice. As Windows XP support is phased out, schools will need to look hard at moving their Windows OS platform to Windows 7, preferably skipping over Vista due to the latter’s many problems in domain type environments. Our site is a Windows site for the most part. This leverages the strong cost-benefit of Windows Server operating systems for managing Windows desktop OSs, the latter being effectively free under these deals except for the modest cost of lower end desktop OEM licences on new PCs. Microsoft continues as a market leader in emerging technologies such as virtualisation, in which the developments are likely to benefit education significantly.

I expect that as Windows 7 becomes available it will start to be deployed to our staff computers next year, and that the Ministry’s leased laptops will start to be delivered with it preinstalled; if not, we will install our own house image, building on experience already gained with Vista. Vista has been a bit of a watershed, and I am still disappointed that Microsoft is not resolving the significant problems it has had with speed and reliability. I will still have a PC running Windows Vista at my desk for some considerable time, years even, along with XP, because there are still some things out there that won’t run on 7 or have not yet been ported. Although, it is fair to say, with a Hyper-V server I can run some of those things on an XP virtual machine (for example, the Remote Desktops MMC snap-in) to similar effect without the physical machine. Virtualisation continues to offer new opportunities, and schools such as ours could well extend the life of older PCs using Remote Desktop Services, the way it has traditionally been used in other institutions for years.

Wednesday 2 September 2009

Exchange Management Console issue on Windows 7, and miscellaneous firewall errors sending mail on Exchange Server 2007

ISSUE: When you install Exchange Management Console for Exchange Server 2007 SP1 onto a desktop PC running Windows 7 RC or Windows 7 Gold, you may experience error messages when attempting to access the Client Access item of Server Configuration in the console tree. The messages received are similar to the following:


The following error(s) were reported while loading topology information:

Get-ActiveSyncVirtualDirectory
Failed
Error:
An error occurred while trying to access IIS (Internet Information Service) metabase. Make sure the Internet Information Service Manager component is installed and configured properly.
Unknown error (0x80005000)

Get-OabVirtualDirectory
Failed
Error:
An error occurred while trying to access IIS (Internet Information Service) metabase. Make sure the Internet Information Service Manager component is installed and configured properly.
Unknown error (0x80005000)

Get-OWAVirtualDirectory
Failed
Error:
An error occurred while trying to access IIS (Internet Information Service) metabase. Make sure the Internet Information Service Manager component is installed and configured properly.
Unknown error (0x80005000)

Note: Get-ActiveSyncVirtualDirectory, Get-OabVirtualDirectory and Get-OWAVirtualDirectory are the names of PowerShell cmdlets for Exchange Server 2007, which implies the same errors will be encountered when using these cmdlets in the Exchange Management Shell as well.

CAUSE: The IIS management console is not enabled on the client machine.

FIX: To correct this problem, open Programs and Features in the Control Panel and select “Turn Windows features on or off”. In the tree, expand “Internet Information Services”, then “Web Management Tools”, and check “IIS Management Console”. Click OK, then close and restart EMC or EMS.
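The same feature can also be enabled from an elevated command prompt using DISM. This is a sketch rather than part of the original fix; the feature names below are as I understand them for Windows 7, so verify them against the output of the /get-features query first:

```shell
rem List available IIS features to confirm the exact names on this machine.
dism /online /get-features | findstr /i "IIS-"
rem Enable the IIS Management Console along with its parent features.
rem Feature names are case sensitive; run from an elevated prompt.
dism /online /enable-feature /featurename:IIS-WebServerRole /featurename:IIS-WebServerManagementTools /featurename:IIS-ManagementConsole
```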

REFERENCES:

http://social.technet.microsoft.com/Forums/en-US/exchangesvrtransport/thread/82e1587f-51bd-4839-8867-0ae904670e2d


ISSUE: When sending mail to some addresses using Exchange Server 2007 SP1, you may experience occasional errors similar to the following. This occurs when you are using a Smarthost send connector to send the mail and there is a Cisco firewall between your Exchange server and the external server specified in the Smarthost send connector settings.

Delivery has failed to these recipients or distribution lists:

Xxx Yyy
An error occurred while trying to deliver this message to the recipient's e-mail address. Microsoft Exchange will not try to redeliver this message for you. Please try resending this message, or provide the following diagnostic text to your system administrator.

The following organization rejected your message: <remote SMTP server FQDN>

  _____ 

Sent by Microsoft Exchange Server 2007

Diagnostic information for administrators:

Generating server: <local Exchange server FQDN>

xxx.yyy@domain
<remote SMTP server FQDN> #500 Firewall Error ##

CAUSE: The Cisco firewall has a configuration entry like the following (it may have additional parameters specified after <inspection-list-name> in addition to esmtp):

ip inspect name <inspection-list-name> esmtp

This problem occurs because of incompatibilities or restrictions caused by the Cisco firewall configuration. It is more likely to occur if you are sending an email to multiple recipients or using a distribution list in Exchange.

FIX: Disable this entry in the Cisco firewall configuration by inserting the word “no” at the beginning of the line as shown, so that it should now read something like

no ip inspect name <inspection-list-name> esmtp

REFERENCES:

 

Monday 24 August 2009

Changes to RSAT tools for Windows Vista SP2

RSAT is the Remote Server Administration Tools package, which replaced the Adminpak supplied with Windows Server 2003. RSAT is used for managing a Windows Server 2008 box from Vista. RSAT and the related Hyper-V management component were updated when SP2 of Vista was released. I had a lot of difficulty trying to locate the correct version of the Hyper-V tools for SP2, but eventually I got the right file version.

However, you may see an error message which reads “Access denied. Unable to establish communication between <server> and <client>”. The fix is to change DCOM permissions as described here.

Hyper-V is so good that I expect our site will eventually just have two physical servers (instead of four at present) and these will be two Hyper-V servers.

Last week I talked about rebuilding some desktop PCs. We have another scenario of computers in tower cases. Rebuilding these is a much more viable option because the power supply is likely to be a drop-in replacement (standard ATX style, although I haven’t yet checked). The rationale for rebuilding is that an old PC still has two useful components that may be in as-new condition even though their market value is next to nothing: the Windows license sticker and the chassis itself. Together these could be worth around $200-250, about the same as the resale value of a 4-5 year old PC system unit in working order, and obviously at that price you are getting much more than a chassis and license. This all depends on the Windows volume license allowing direct upgrades from the version of Windows on the sticker.

When we last upgraded the PCs at our site, the chassis were old-style pre-“Prescott” ATX, meaning they lacked the ventilation duct in the side, and the power supplies were failing. However, provided the chassis you have now has the duct fitted, modern boards should be OK in it. We discarded most of those old PCs, and when you do that, you are effectively throwing away the Windows OEM license. But if you can rebuild the PC by replacing most of its internal components, you retain the replacement value of both the chassis and the sticker. Rebuilding isn’t for the faint-hearted, however. Installing the CPU into the board, fitting the CPU heatsink/fan and mounting the mainboard in the chassis are tasks that I have found challenging in the past, even though these days the boards don’t have jumpers to worry about. A little slip could result in expensive damage due to the fragility of the components. Once you have the thing assembled you must run a burn-in test, for which you may have to buy software. The burn-in helps guard against future component failures by stress testing the components to ensure they are reliable.

Friday 21 August 2009

The Windows 7 Week Wrap-up

Although this is only the fourth day, today is Friday, so it is the last day of the week. The old 915 bit the dust and got binned, and the “new” 915 got a fresh Vista installation on its second 80 GB HDD after all attempts to restore the original installation were unsuccessful. Obviously I should have just ghosted the original image from the start, an easy thing to do here where we ghost PCs all the time. However, it is only fair to say that the problems I experienced were completely unpredictable, as hardware failure often is; the old 915 must have corrupted the hard disk so that it couldn’t be made to boot. I breathed a sigh of relief when the new Vista installation came up for the first time on that PC. It has a good XP installation so there won’t be any more issues with it. Four year old PCs with board problems are not worth repairing; the Foxconn TS-001 chassis and its Windows sticker are together worth about $250, and if form factors are still the same, they will come in handy when the time comes next year to rebuild my secondary work PC or my home PC (or both) with new parts, as referred to in a recent post.

And now back to Windows 7. This PC running the Release Candidate has rapidly become my main work computer, much as I expected. Being both Windows 7 and x64, there are going to be a few hiccups, but these haven’t been especially major. The general experience worldwide is that this RC is so good that many people are running it as a primary PC. I expect the production version will become my primary just as Vista did before it, but I need that secondary for the times when 7 doesn’t stack up, at least until the first SP comes out. As yet I haven’t done much experimentation with the new features like libraries, or tried any additional hardware on it; the card reader is going to stay on the secondary for now. Integris, which has been pretty patchy at our site on Vista, seems to be OK on this 7 installation so far. I haven’t really noticed any issues with other applications, except that the printer drivers seem to take two or three goes to install. There are a couple of other PCs here at the site running 7, and I expect I will get some feedback on them; what will come out eventually and predictably is that we will offer the release version of 7 to staff users as an option. Rolling out Vista has been a terrible disappointment. As an OS it is only any good for home PCs; in a domain environment there are massive challenges, and only now do I fully understand why people have chosen not to deploy it.

Next week I plan on being back to more humdrum matters, in this case ISA 2006 and making Outlook Web Access, Outlook RPC over HTTP and Terminal Services Gateway work through it. All three of those services run on the same server and make use of SSL, and ISA, which will be brought into production as a firewall on our site at that time, will filter and route all of the SSL traffic to those three applications. Having all of them on SSL reduces the number of ports we have to open on the firewall to just two for incoming traffic, plus a few more for outgoing traffic. All of the rules for these have to be set up in ISA. It is pretty much the final stage of a big year-long project: setting up the two servers, the mail/terminal server gateway and the firewall. And I’d hope that in future it would be viable to have Exchange hosted offsite, so that when the two oldest servers come up for replacement at the end of next year, we can downsize.

Thursday 20 August 2009

Windows 7 Week, Day 3

Today was the day that I rashly decided to do what I didn’t want to do yesterday: turn the D915 (old XP box) into a dual boot Vista/XP system, and the Q35 (old Vista box) into Windows 7. All this would be relatively simple, just a matter of moving some HDDs and peripherals around, wouldn’t it? Unfortunately, it has turned out to be a long way short of simple.

The first problems came with the D915, which refused to boot either 160 GB HDD – Vista or XP – with the dreaded “Error loading operating system” message. Even a fresh XP install onto its 160 GB disk initially seemed to work, until the first reboot. Vista diagnostics and EasyBCD could not make the Vista disk boot at all. The only thing that would work, initially, was the original 80 GB boot disk with XP from this machine – until something else phutted and it stopped booting, at which point the BIOS seems to have had a fit and reset itself to the original settings, or some default mode. I then tried another identical computer, which still couldn’t boot anything except its own 80 GB XP disk, so that is what I’m using, and right now I am ghosting the Vista image to see if it can be made to boot when loaded onto an 80 GB disk, which it will just fit onto. It seems strange these BIOSes would not be able to boot from a 160 GB disk, since 48-bit LBA has been around for quite a few years, and I don’t remember any problems with my D915 at home, which I was fairly sure was running Vista on a 160 GB HDD with an older BIOS version (Update: Well, I was wrong about that; it is booting off the second, newer 80 GB HDD). I tried putting the Vista HDD back into the Q35 but it is still no go. It looks like something in the BIOS of the D915 has scrambled part of the disk. The funny thing is that the D915 recognised a 160 GB HDD and a 500 GB HDD in the same machine… it just didn’t want to boot from a 160.

So far the only thing that has worked as planned is moving 7 from the temporary box into my Q35, completely problem free, as expected with identical hardware. Let’s hope that ghosting the Vista image onto a smaller disk works; if not, I could still try ImageX on this disk from XP and then restore everything except the boot files, which get installed with a fresh Vista install (now this is getting a bit desperate, LOL). The big disappointment of the day is that Intel has absolutely nothing on its website referring to this disk limitation on these boards. The ATA-6 standard that introduced 48-bit LBA first came out in 2001 and has been supported by Windows since then, and as the D915 chipset boards are able to recognise drives greater than 128 GiB when they are not boot disks (I’ve used a 500 GB tertiary disk in one of these machines for a couple of years with full access to all its capacity), it is not a very helpful situation when a major motherboard manufacturer has this kind of limitation in some of its boards and yet has absolutely no documentation of it.

Wednesday 19 August 2009

Windows 7 Week, Day 2

Today was relatively ho-hum; the main excitement has been turning the Vista box into dual boot XP/Vista and taking some bits out of the old XP box to put into the temporary 7 box. But that four year old Intel 915 system won’t get the boot just yet. When 7 becomes real next year, the Vista box will become the 7 box (just swap over the HDDs) and then the HDD from the Vista box goes into the old XP box. At that point XP may have to be reinstalled if there are HAL problems, and surely Vista will have to be reactivated, but this is going to be the quickest and most pain-free way of getting everything set up again. (Why don’t I do that now? Um, err… it hadn’t occurred to me.) Actually, I’m not quite ready to make the big box the 7 system with all the stuff in it like the Lightscribe DVD writer and card reader; I still need those to be able to run in Vista or XP in case there are problems in 7. Instead I spent a lot of time squeezing a 500 GB HDD into the 7 box’s low profile case and setting up the Vista box to dual boot XP. Which was itself fun as well… and also an opportunity to discover that neither of these systems ever had a floppy drive in them – Intel got all carried away with minimalism on some of the Q35 chipset boards and dropped the FDD connector, as well as the PS/2 keyboard and mouse ports. Well, guess which three connectors have re-emerged in the desktop line (and the parallel IDE connector still exists as well).

Anyway, workwise, things have settled down a bit with 7, except for compatible printer drivers (Brother); some work and some don’t, and new ones are coming soon. The lack of a driver that would work on the colour printer led me to fire up the Vista box out of sheer necessity in order to print some pictures, and to use its card reader to download them from the camera. That’s why I am setting it up to dual boot with XP: because XP may still be needed in some form, and in the meantime I want three OSs on only two PCs, which is quite reasonable. The day is concluding with SP2 going onto the Vista system.

Tuesday 18 August 2009

One day of Windows 7

Having set up a PC at work with the Windows 7 x64 RC installed on it a few weeks ago, I decided that today would be the day that I would start doing as much as possible of my day to day work on 7. It has been reasonably straightforward so far, although this PC grunts a bit with only 1 GB of RAM. This is not a long term solution, of course. Where we go with 7 is very dependent on a new Schools agreement between the MOE and Microsoft, and it may be that the RC’s time limit runs out before 7 becomes available to schools. So it is hard to say where any school will go in terms of 7, as I expect the agreement will have to be renegotiated, but also because it isn’t very clear at this stage what the MOE has planned for schools in terms of these agreements in the future. To administer a server from a Windows 7 PC you must have the Windows 7 version of the RSAT (Remote Server Administration Tools). Microsoft released a new RSAT soon after the RC of 7 came out, but has now withdrawn it. Fortunately, I downloaded the x64 edition of the RSAT when I first installed 7, and therefore I have been able to install it on this PC.

To use one of the more interesting features of Windows 7, Windows XP Mode, your PC must have a CPU that supports hardware virtualisation. This is the same requirement as for running Hyper-V on a Windows 2008 server. Of course, Hyper-V isn’t available on desktop OSs, and previous releases of Virtual PC for desktops would run on any old CPU, but the version of Virtual PC that XP Mode is based on (renamed Windows Virtual PC) requires the VT-x feature. It turns out that my main work PC that I bought two years ago to run Vista, which is the same hardware spec as the box that I’ve got running Windows 7, will only need the CPU changed (the cheapest VT-x model is the Pentium E6300 dual core) to make it compatible with this feature (and possibly a BIOS flash). The same goes for some of the other newer PCs that we have in our school. It would be preferable just to change the CPU in, for example, three relatively new PCs that our office staff got last year, since these are expected to have a life of several more years. At home it is different, of course: I would need to replace the motherboard, CPU, memory and power supply all in one hit to get the full works of 7 there. Sometime I will look at that kind of upgrade, or probably just swap that PC with a similar-aged one from work that has had the new bits put in.
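As an aside not covered in the original post: if you have a Linux live CD handy, the CPU flags tell you straight away whether hardware virtualisation is advertised (Intel CPUs report "vmx", AMD CPUs report "svm"). A minimal sketch:

```shell
# Read the CPU feature flags from /proc/cpuinfo; the "vmx" (Intel VT-x)
# or "svm" (AMD-V) flag indicates hardware virtualisation support.
# Note: the BIOS may still have the feature disabled even when present.
flags=$(grep -m1 '^flags' /proc/cpuinfo)
if echo "$flags" | grep -Eqw 'vmx|svm'; then
    echo "hardware virtualisation advertised"
else
    echo "no VT support reported"
fi
```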

Last time I posted about options for upgrading older PCs to Windows 7. In respect of that particular discussion, the main issue we would face for our PCs is getting the right power supply for this particular case. As far as I can tell, FSP of Taiwan is the only company that makes power supplies that will fit the Foxconn DH153A chassis, and even then I would have to find a supplier. The 300 watt model provides the rails that newer motherboards need. There is some question over peak power ratings, but I wouldn’t expect the power use of this type of PC to be high, so it may be OK, since the CPU power rating is practically the same. The model number needed is the FSP300-60GLV. Because of questions like the power supply, I would want to do a good deal of testing on a prototype before committing anything to production. The price we can get these parts for makes rebuilding quite a favourable option in the present economic climate, where prices have risen sharply in the last year.

So here ends the first blog post made on Windows 7…

Saturday 15 August 2009

Rebuilding vs Replacing?

With the imminent introduction of Windows 7 and the consequent phase-out of XP, all schools in NZ (and elsewhere) will be compelled to consider upgrading older PCs if their spec is insufficient to run Windows 7. I am currently interested in whether this would be a viable option for some of our school’s older PCs, which are almost 5 years of age. The economics are favourable if you have access to wholesale or nearly wholesale pricing, and can rebuild the PCs within your school. To get a Windows 7 PC with a reasonable amount of memory, 64 bit capability and Intel hardware virtualisation support, the following are examples of what would need to be purchased:

  • Intel DG31PR mainboard (may be available cheaper in a 10 pack)
  • Intel E6300 Pentium Dual Core boxed CPU (including heatsink and fan in the box)
  • DDR2-667 memory: 1 GB or 2 GB
  • If you have only CDRW drives, you may wish to purchase a replacement DVD writer.
  • If your power supply is less than 300W it may need to be replaced. In our case, a TFX supply is the type required and should fit into the case (Foxconn DH153)
  • A card reader is desirable in today’s media-conscious environment. Sony make the MRW6202, which is internal and can be installed in the FDD bay (the nearly-obsolete FDD being discarded). You could buy one or two USB external FDDs to keep for a rainy day when an occasional floppy disk might turn up from somewhere.

The key questions needing to be answered include how well everything will fit into an existing case. This depends very much on whether there has been much change to the microATX form factor in the past five years. Another question is how much life you can expect the existing HDD to provide. However an HDD replacement is a fairly straightforward procedure to carry out where it is required.

The main benefit is in recycling the case and the Windows XP Home license. We are assuming here that the MOE will get a new license deal for 7 that effectively gives a free upgrade from XP. I’d prefer to wait until the new licensing comes out, to confirm that will be the case, before proceeding with any upgrade programme. However, the case and license together could be worth as much as $300 depending on specs. Since most cases are well made, they can be expected to give many years of useful service and as such can be recycled without problems. If reuse of the case and license works out, this would be extremely worthwhile, since that value would never be recoverable by reselling these items. In our situation, our existing LCD screens, which are four years old, can be expected to last for years yet. In fact it is fair to say that no one really knows how long LCD displays will last, because they haven’t been in production that long.

Assembly is relatively straightforward if you are confident about your skills in putting a board together and installing it in a case. If you can find assistance within your school community for the assembly, then it could be cost effective to consider rebuilding rather than replacement.

Wednesday 5 August 2009

The Windows 7 lock-in / lock-out

Everyone knows that Windows 7 has progressed through its various stages and been well received. It is now in production, having been released to manufacturing a few weeks back, and will appear in retail channels towards the end of October. Windows 7 is a mixture of new technologies and features, and fixes to the parts of Vista that have caused endless trouble to users who adopted version 6.0 of Windows when it was released in January 2007. Herein lies the rub. W7 fixes a lot of problems people have been having with Vista, particularly the Business edition on corporate networks. It is smoother and more stable, but Microsoft expects you to pay an additional license fee rather than releasing further service packs for Vista to fix those problems. This situation is called lock-in or lock-out, and its previous appearances in Microsoft products led directly to the well-known anti-trust case against the company by the US Department of Justice, and to similar cases in other jurisdictions such as the European Union and Korea.

In a previous post I referred to the lockout that my site had experienced with ISA Server, which has forced us to set up an additional server because ISA is not supported on the Windows Server 2008 platform. The latest example of this approach in the server market is that Windows Server 2008 R2 will not support Microsoft Exchange Server 2007. I think it is very likely that the 2008 R2/Exchange 2007 and Windows 7/Vista scenarios in particular will result in further legal action against Microsoft in major jurisdictions, and probably (political) pressure in the US to extend the anti-trust case.

To me, Microsoft is something of an enigma. Sysadmins like me recommend and install Microsoft products because, in the education market at least, they offer the best combination of features, value and support out there. The Linux community at large has yet to get its head around the idea that a GUI, integrated documentation and professional levels of support are worth having. Until we see that kind of commitment from that community, I would hesitate to suggest that they have any idea of what is needed by administrators who don’t want to have to learn the nuts and bolts of a new, unfamiliar operating system. The comparison between the Linux startup screen, with its screeds of text gibberish, and Windows’ graphical initialisation with occasional progress messages is a case in point. Against this we have the constant monopolistic behaviour resulting in the lock-in/lock-out situations, with the result of extra expense to end users. Hmmm…..

I have also already made my views known on what appears to be a diminishing standard of free end user support, where a Google search will usually turn up answers on half a dozen third party or “community” sites before any official Microsoft site.

Saturday 25 July 2009

Exchange Server 2007 SP1 and IPv6 Issues

Our Exchange Server has been in production now for a week and some significant issues are coming up that relate to speed. It appears that running IPv6 on this server creates a major performance problem. This can be seen in very slow client access (oftentimes you will see a message like “Trying to connect to Microsoft Exchange”, and a user may even be prompted for their username and password), and the Exchange Management Console and Shell also perform very poorly, with waits of up to several minutes to access information or initialise. However, it is not as simple as turning off IPv6 on the network adapter, as the result can be that the Microsoft Exchange Transport service will hang and crash. That was our experience this week. Our consultant had turned off IPv6 in the network adapter’s properties, but the next time the server was restarted, the Exchange Transport service would not start and would crash roughly every 5 minutes. This service is key to the Hub Transport role, and your Exchange server will not be able to deliver any mail if it is not operating. Rechecking IPv6 on the adapter made the Exchange Transport service problems go away, but the performance issues immediately returned.

Our next step to try (tomorrow, probably) is to disable IPv6 outright as described in this MSKB article. Additional relevant information can be found here, here, here, here, here and here. This MSKB article is also very important because it details situations where the KB952842 article may be incorrect depending on the roles in use on the server. Note that most of the articles referenced above are concerned not with performance but with the Exchange Transport service failure. I have not yet found very much about performance issues, but it seems highly likely this is the cause of our problems at the present time.
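For reference, the KB method disables IPv6 via the DisabledComponents registry value rather than by unbinding it on the adapter. A sketch from memory follows; check the KB article for the exact value appropriate to your server roles before applying it, and note that a reboot is required:

```shell
rem Disable all IPv6 components by setting DisabledComponents to 0xFFFFFFFF
rem (per the KB guidance). Run from an elevated prompt, then reboot.
reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip6\Parameters /v DisabledComponents /t REG_DWORD /d 0xFFFFFFFF /f
```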

Tuesday 21 July 2009

Rolling out Vista to Student Desktops - 3

Last time I looked at this was some 9 months ago. It will become more important as time goes on, but it hasn’t been a priority much lately. The main difference is that with a Hyper-V server I now have some Vista VMs to use as test configurations. If we ever went to a lab of Vista machines there would be a lot of work needed on things like a Key Management Server for activations, for example.

The latest oddity to surface from Vista on some user accounts recently is being unable to browse the Internet unless the account is a local administrator of the machine. I noticed this working on a Vista laptop at our site today and have confirmed it is not confined to that machine. Since the virtual machines have a fairly minimal configuration, we can rule out a lot of the extra software that laptop makers typically provide these days (such as security managers or TPM-type utilities). At the moment that doesn’t really get me any further, except that I have confirmed it is not happening on XP machines, so it seems to be specific to Vista so far.

Just for a comparison I started up one of my Windows 7 virtual machines to see how it would handle a mandatory profile login. There were no issues that I could see. Both the Windows 7 machine and the Vista machine were able to redirect the Documents folder correctly. Both have the same problem in the web browser with what appears to be obsolete proxy configuration settings, meaning that I have to test out what the settings actually are for some of our computers and where they have actually been set up. I have not at this stage attempted further testing of Start Menu redirection which we have used in the past, and I still have to deal with Vista’s dual Document folders (we would want to remove the local folder).

Thursday 16 July 2009

OMNIS (RM Integris) on Windows Vista may be incompatible with Microsoft Groove

Refer to my previous article about setting appcompat settings for Omnis.exe to Windows 2000 in order to eliminate calls to EncodePointer and DecodePointer (functionality added to Windows XP Service Pack 2).
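For the record, that kind of per-executable compatibility mode is stored under the AppCompatFlags\Layers registry key. A sketch of the equivalent command follows; the path to Omnis.exe is a hypothetical placeholder for wherever Integris is installed on your system:

```shell
rem Force Omnis.exe to run in the Windows 2000 compatibility layer for
rem the current user. "C:\RMIntegris\Omnis.exe" is a hypothetical path -
rem substitute the actual install location.
reg add "HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers" /v "C:\RMIntegris\Omnis.exe" /t REG_SZ /d "WIN2000" /f
```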

On a new Windows Vista system, RM Integris (Omnis.exe) crashed upon first execution when it reached the “Open / Create Datafile” dialog. The application was traced with Microsoft Dependency Walker, which recorded an access violation (error 0xC0000005) in GrooveUtil.DLL, a component of Microsoft Groove (itself a part of Microsoft Office Enterprise 2007). The crash occurred without any user notification: the application quit unexpectedly without displaying any error dialog, and no event was logged by Windows. Uninstalling Groove from Office 2007 has cured the problem to date.

Saturday 11 July 2009

Automating Windows Vista Installations - 4

After capturing our new Vista image using Ghost and ImageX, the reference PC was rebooted, which effectively applies the image to it. A second PC of identical hardware configuration had the image applied to it using Ghost. First-time startup with the image takes around 20 minutes to complete before the normal Windows logon comes up.
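For comparison with the Ghost step, the ImageX side of a capture/apply cycle looks something like the following. This is a sketch only: drive letters and paths are examples, and the commands would normally be run from Windows PE after Sysprep has been run on the reference machine.

```shell
rem Capture the sysprepped reference installation into a WIM image.
imagex /capture C: D:\images\vista.wim "Vista reference image" /compress fast
rem Apply image number 1 from the WIM onto a target machine's formatted C: drive.
imagex /apply D:\images\vista.wim 1 C:
```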

Issues found for this specific installation:

  • Still asked for a user account and password to be created. In future we may specify a dummy account and a very long gibberish password so the account doesn’t have to be deleted later.
  • Still asked for a network location to be specified
  • Display drivers were not installed
  • Domain was not joined (Error 0x5 was recorded in the Setup logs, this translates to “Access denied” which could mean a million things in this context)
  • And of course, the Tablet PC input panel came up again.

In other words, the only change that was applied is that the administrator account is enabled with its password set. As I had researched the other changes and checked that the existing settings were correct, I don’t know what else I can do at this stage to rectify the above issues. They are minor points, but it is quite annoying to have gone through this amount of work, supposedly doing everything “by the book”, and still have the issues unrectified. The display drivers, of course, might require Intel’s custom setup application to be run, and so may have to be automated a different way.
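For what it’s worth, the user account and network location prompts are normally suppressed in the oobeSystem pass of the answer file. A fragment along the following lines is what I would expect to need, based on my reading of the Windows AIK documentation rather than a verified fix for the above; the account name is a placeholder, and the fragment assumes the usual wcm namespace declared at the root of the unattend file:

```xml
<!-- oobeSystem pass: suppress OOBE prompts and create a throwaway account -->
<settings pass="oobeSystem">
  <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="x86"
             publicKeyToken="31bf3856ad364e35" language="neutral"
             versionScope="nonSxS">
    <OOBE>
      <HideEULAPage>true</HideEULAPage>
      <NetworkLocation>Work</NetworkLocation>
      <ProtectYourPC>1</ProtectYourPC>
    </OOBE>
    <UserAccounts>
      <LocalAccounts>
        <LocalAccount wcm:action="add">
          <Name>TempSetup</Name>
          <Group>Users</Group>
          <Description>Dummy account created during setup</Description>
        </LocalAccount>
      </LocalAccounts>
    </UserAccounts>
  </component>
</settings>
```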

The Tablet PC input panel, of course, is not being affected by Sysprep. It is just a default setting for newly installed PCs and is a trivial matter to disable. This time around the Language Bar didn’t open so that is one less thing to do.

At this time I am still using Ghost, but have decided to install WDS on one of our servers and then get up to speed with the techniques. I like the PXE based solution because it eliminates the manual step of having to start up Ghost before each installation and doesn’t need a working CD drive on the target machine. We also don’t have enough Ghost licences at present. So if WDS works out I will be using it to deploy Vista and later, while still using Ghost with our legacy XP images.

Friday 10 July 2009

Automating Windows Vista Installations - 3

Continuing the series on automation of Windows Vista installations: the first article covered the creation of our first Vista image. Prior to this I had some experience of ImageX from using it to image a PC running Vista; however, I had not used Sysprep on that PC and therefore the image was not suitable for cloning. When we updated Ghost to version 11, I became aware that it was Vista capable. At that time I was mainly familiar with using Ghost without Sysprep, since with the Windows 98 machines that made up most of our fleet, SID regeneration was not required. When our network transitioned to XP, my next step was to learn RIS, with a view to possibly eliminating Ghost from our organisation. Due to the limitations of RIS (such as not being able to copy junction points, meaning incompatibility with the Microsoft .NET Framework), I moved back to Ghost and learned relatively recently how to use Sysprep with its answer file to prepare an image for cloning on our network. Thus it is only in the last two years that we have used Ghost with Sysprep for imaging Windows XP machines. Rather than learn WDS, I will be continuing with Ghost, but also hedging my bets by creating a WIM version of each Vista image as well, in case we decide to use ImageX or WDS in the future instead.

The second article covered the lessons learned in our first production deployment of an image to a laptop, and the issues created. Subsequently the same image was deployed to a different hardware platform as a first step to creating a specific image for that platform. This third article will cover changes made to our configuration and answer file, as well as the imaging and production deployment experience. As before, I am making use of a Hyper-V virtual machine to test the WIM version of the image.

The first issue I am addressing is the administrator account not being enabled. I identified that I had entered the NET USE command instead of NET USER. However, the WSIM documentation does not advocate using NET USER to enable the account; instead it suggests using Autologon settings. I addressed the issue of the domain not being joined by using the account specified in the answer file to join this reference machine to the domain manually, so that hopefully the account will be recognised this time. One thing that did work out was the locale settings, which turned out to be more or less correct, but I am changing them using this reference.

The final issue that I would like to resolve this time around is the installation of the graphics driver. The Vista Setup tools provide a number of methods for adding drivers. The easiest for me is to specify the drivers path in the answer file, since I am already familiar with this use of WSIM. The best place, of course, to have these drivers is in a folder on the laptop’s HDD that is deployed with the image. This is why HP laptops have the SWSETUP directory; a variation of this is very commonly seen in mass deployed PCs such as OEM built systems. One thing I learned with RIS, waaaay back, was how to deploy Intel NIC drivers at the early stages of RIS text based setup, when the target PC had to connect to the network for the very first time to get Setup started. That is a boot level driver deployment and may be required for WDS type scenarios or when using Windows PE. We don’t need it here because we are using Ghost. In this scenario the graphics drivers are to be deployed in the offlineServicing pass.
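The driver path approach in the offlineServicing pass looks roughly like the fragment below. The path is a placeholder (I’ve assumed an SWSETUP-style folder for illustration), and the component attributes and the wcm namespace declaration that belong in the full file are trimmed here.

```xml
<!-- Sketch only: the driver path is an assumed example location.
     The wcm namespace and the component's architecture/token
     attributes are declared in the full unattend file. -->
<settings pass="offlineServicing">
  <component name="Microsoft-Windows-PnpCustomizationsNonWinPE">
    <DriverPaths>
      <PathAndCredentials wcm:action="add" wcm:keyValue="1">
        <Path>C:\SWSETUP\Drivers\Video</Path>
      </PathAndCredentials>
    </DriverPaths>
  </component>
</settings>
```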

Having completed our answer file, I copy a predefined command script to C:\Windows\System32\Sysprep and then copy my answer file and rename it to the filename specified in the script. I then run the script to sysprep the reference PC and automatically shut it down, before Ghosting and ImageX-ing it.  The results will be published in Article 4 of this series.
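For anyone wanting to replicate the wrapper script step above, a minimal batch sketch follows. The answer file name and the use of sysprep.xml as the destination name are assumptions standing in for my actual script; the sysprep.exe switches are the standard Vista ones.

```bat
@echo off
rem Sketch of the sysprep wrapper described above. "unattend.xml" and
rem the destination filename are assumed names, not the real script's.
copy /Y unattend.xml C:\Windows\System32\Sysprep\sysprep.xml
cd /d C:\Windows\System32\Sysprep
rem Generalize the image, set it to run OOBE at next boot, and shut down
rem ready for Ghost/ImageX capture.
sysprep.exe /generalize /oobe /shutdown /unattend:sysprep.xml
```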

Thursday 2 July 2009

Automating Windows Vista Installations - 2

Following on from the comment in the first article, I created the xml file for Sysprep and used it to sysprep the image of the source machine. I then took two images of the machine, one using Ghost 11 and one using ImageX. I then loaded the Ghost image back onto the original machine after formatting its C drive. The ImageX image was applied to a new Hyper-V virtual machine for further testing. Both of these computers were first booted using Windows PE. After rebooting into the respective OS images, Sysprep did its usual thing, which is massively slower than on XP.

The documentation supplied with WSIM is quite comprehensive, yet at times it is too technical. For example, when setting locale information I had no idea where to put “en-NZ” and where to put “en-US”, because the only “en” locale I could find mentioned is “en-US”.
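As far as I can make out, the distinction is that the UI language must match a language pack actually installed in the image (“en-US” for a standard English install), while the input, system and user locales can be the regional variant. A sketch of the international settings component on that assumption, with component attributes trimmed:

```xml
<!-- Sketch only: assumes a standard en-US Vista image with New Zealand
     regional settings. Component attributes omitted for brevity. -->
<settings pass="oobeSystem">
  <component name="Microsoft-Windows-International-Core">
    <InputLocale>en-NZ</InputLocale>
    <SystemLocale>en-NZ</SystemLocale>
    <!-- UILanguage must name an installed language pack -->
    <UILanguage>en-US</UILanguage>
    <UserLocale>en-NZ</UserLocale>
  </component>
</settings>
```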

Issues with the process of the Sysprepped images:

  • Windows forces you to create a local user account with a password and hint, even though we set up the administrator account to be enabled and gave it a password. This creates an unnecessary extra step of having to remove this account at first login.
  • The administrator account is still disabled even though I specified that it be enabled.
  • The computer hasn’t joined the domain even though I put in all the settings for it into the sysprep xml file. Instead it has joined a workgroup with the name specified.
  • The network location setting that I specified with WSIM is not being applied.
  • The graphics driver that was installed on the reference computer has not been reinstalled at first startup. It would be quite reasonable and convenient for Windows to cache the previously installed drivers for use when the machine is loaded, but it doesn’t do this. I suspect the failure to join the domain is due to a lack of network card drivers, although it could also be a user account problem. This is all a complete pain because we now have to specify all the drivers to be loaded with each image, which just makes extra work when a particular image is set up for use with particular hardware.
  • The Tablet Input Panel has come back when I banished it originally (the interactive whiteboard drivers are treated as a tablet device. This does have the advantage that you can use the interactive whiteboard to do some tabletty type things; but you don’t have the button controls on the interactive whiteboard that an actual tablet PC would have).

I expect most of these issues will eventually be solved satisfactorily. It is a new learning curve with a complete new set of tools. The main advantage of using Ghost is that the disk doesn’t have to be partitioned and formatted; with ImageX this is an additional step. However these are not great differences that would really make me choose Ghost over ImageX. Ghost is still the best system for deploying to multiple machines and over a network, whereas ImageX is the simple one to use with a USB hard disk or some other file based mechanism (although I mapped a network drive to install it off the network). Unless we decide to use WDS in the future, I don’t really know whether Ghost or ImageX will turn out to be the better solution for our needs.

Automating Windows Vista Installations

When you are a system administrator, one of the most important things is to work out how to streamline setup and installation of computers using various kinds of mass duplication techniques. Historically one of the best known systems for doing this has been Binary Research’s Ghost software, now known as Symantec Ghost Solution Suite. Microsoft has always provided support for third party software such as this, while progressively devising its own equivalents over time. In the era of Windows Server 2003/Windows XP, the equivalent was provided for the first time in a system called Remote Installation Services (RIS). I learned how to use it alongside my existing Ghost experience, and eventually concluded that due to certain RIS limitations we would continue with Ghost for the time being; I then learned how to sysprep Windows XP images with Ghost and began cloning them across various architectures and platforms.

With the introduction of the Vista/2008 platform, Microsoft revisited the automation scene and created new tools; they reasoned, correctly, that new imaging systems are not only useful in mass duplication scenarios, as the techniques employed can also be applied to personal installations on single computers. Specifically, disk imaging techniques are used to perform Vista and Windows 7 installations onto computers from the setup DVDs. This has resulted in a much faster installation experience for the majority of users. Microsoft has also enhanced its network-based installation technology, RIS, into a new product called Windows Deployment Services (WDS).

Of course, as a sysadmin, I now have to learn the new systems that Microsoft has effectively mandated with Windows Vista. The latest version of Ghost can image Vista, but in order to get a useable image at the end of it, we still have to use Microsoft’s technologies to prepare the source computer; along the way, I’ll be covering my bases by using Microsoft’s imaging software, ImageX, as well. To get things going, I found these articles:

Deploying Vista with SysPrep and Imagex – The basics and getting started

Windows Vista USB Automated Install – creating the unattended installation file for Sysprep

To get started I am creating the sysprep.xml file using the settings shown in the first article. One open question is licensing: we are currently using a MAK, and I don’t know whether I will put the key into my Sysprep.xml file. I might leave it out so that Vista can try activating against a Key Management Service (KMS) server, because although we don’t have enough Vista computers to justify a KMS right now, we will in the future.

Before you can get going with this stuff, you have to install the Windows Automated Installation Kit (WAIK) from Microsoft. There have been a couple of releases; since I have SP1 of Vista I am using the 6001 release. It comes as a download of an ISO file; you can extract the files using 7-Zip. An annoyance I am finding with Windows SIM is that it won’t mount an image that is on a network share. This sort of thing happens a lot in Vista: installations won’t run from a network share, yet the error message reported is something else entirely. It seems in many ways that Microsoft has created this newly restrictive environment in Vista without integrating the restrictions into the operating system in a way that is transparent to users or produces meaningful error messages.

Friday 26 June 2009

Windows protection features may cause application compatibility problems

Since Windows XP SP2, Microsoft has added a range of protection mechanisms to Windows to guard against malware. These include Data Execution Prevention, a hardware feature of the CPU that is also implemented at a software level. Another, lesser known, technology is a pair of function calls in Kernel32.DLL of the Windows API: “EncodePointer” and “DecodePointer”. These functions protect pointers by obfuscating them with a “secret” value during an encoding or decoding process.

For reasons which I don’t really know about or have time to go into, we encountered a problem with a legacy application (RM Integris Classic) at our site, which turned out to run much more slowly than usual and eventually hung. I decided to use an API tracing tool (Microsoft Dependency Walker, in this case, a tool which shows DLL dependencies) to see if the application was experiencing problems with missing DLLs or function call exports.

In this case, a major difference noted between two systems running Windows Vista, one of which could run Integris and one which could not, was the execution on the latter system of numerous and repetitive calls to Kernel32.DLL’s EncodePointer and DecodePointer functions. When I looked up these functions to see what they are used for, they turn out to be part of this new functionality that has been put into later releases of Windows. Specifically, EncodePointer and DecodePointer were introduced in Windows XP SP2. However it is interesting to note that I have not seen any compatibility problems before now with Windows XP running RM Integris.

I haven’t gone further into why the application might be incompatible with these calls, instead I have just looked to use the compatibility settings in Windows Vista and Windows 7 (on another computer that is running the Release Candidate). In the Vista (x86) computer, setting Omnis compatibility mode to Windows 2000 fixed the problem, but trying the same on the Windows 7 (x64) computer did not. However this gives me a solution for our planned migration to Vista; I’m currently building a master image of Vista for our HP laptops and intend to start deploying it early next week, and Integris is important to our site. Having that fix is useful when we will need to implement a transition to Windows Vista over the next year or so.

Tuesday 16 June 2009

VOIP internet telephony still fraught with pitfalls

So, now I’ve been using VOIP services over the Internet for four months, and it hasn’t been a straightforward experience. I think that it is quite fair to say that you get what you pay for. The level of service on VOIP providers falls well below what you get for an ordinary phone line from Telecom or Telstra. A comparison is warranted here with setting up a broadband connection to an ISP. I suppose all in all it is roughly similar, because each broadband modem has to be configured to work with a particular provider. However in NZ, the broadband infrastructure is nearly all owned by two providers, and modems are specified to work with either one or the other. So even though there are lots of different brands of modems, they pretty much all work without much mucking around and configuration problems in NZ.

The VOIP situation is quite different, because there isn’t that kind of stratification in the infrastructure. VOIP doesn’t care much about what’s running underneath for the broadband connection, and in that sense it is like Gmail or some other commodity internet service that you buy from a range of providers. But whereas cable and ADSL are pretty much standardised worldwide, not enough of the VOIP protocols are standard, and neither are the telephones, so there is a lot more potential for problems and possibly a lot more work needed to get a VOIP connection working in a range of situations. I made, it must now be admitted, the mistake of buying a D-Link VOIP router to begin with, not because I think D-Link is any good, but because it was available and had the features I needed for handling routing with the cable modem and connecting a phone as well. The routing works very well, just like my experience with the D-Link ADSL router I had, but the phone side of things has been a lot of trouble, and I’ve had to flag away the DVG-1402S and replace it with a Linksys SPA2102, because, among other things, it turns out that the D-Link is “incompatible” with some providers’ hardware/software in NZ. Yet, overseas, there are companies that support these routers for their customers.

It appears that there is just a big lottery as to what will run on VOIP and what won’t. To add to that, the level of support from the two VOIP providers I have worked with falls well below what I’m used to with ordinary phone services. The support for VOIP should be higher, not lower, because getting a VOIP connection going is a lot more complex than just plugging a phone into the old POTS network.

The Slingshot Experience - Finished

Earlier in the year I wrote about my experiences in switching from Telecom internet and phone to Slingshot. Those posts made it clear that Slingshot didn’t live up to my expectations. Now, four months later, I have decided to cut out Slingshot altogether. Based on my own impressions, it is hard to see how Slingshot today can still point to their Netguide award (now several years old). The latest indignity is that they haven’t been open about the credits that were applied to my account. For various reasons I decided to switch my VOIP phone to 2talk. Two weeks later I had still heard nothing from Slingshot, so I rang them to get them to refund the remaining credit. They agreed to do this, but they aren’t exactly being open about it; they can make money off keeping that money on deposit for as long as I forget that it’s there. I don’t have any written record of any of these transactions on my accounts. They only send out an invoice each month, not a statement. They say you can look it up in their Visibill system, but you can only do that if you have a username and password, so in my case, where both my accounts are closed, I can’t look up anything at all.

Friday 29 May 2009

Windows 7, Windows 7, Windows 7

Now that Windows Vista is a mature product I believe the compulsory education sector needs to get up to speed on the process of upgrading to it, as Windows XP support is soon to be phased out by Microsoft. Windows 7 is due to be released in about three months’ time, and I’ve taken the opportunity to test it out by installing it into a Hyper-V virtual machine on our Windows 2008 server. I hope that it will be released to schools as part of the next MOE deal in 2009, but I will be seeking a new computer to run it on because this desktop has only an E2140 CPU, which doesn’t support Intel VT and thus won’t run Windows XP Mode. Our Windows Server 2008 x64 machine has plenty of memory and a quad core Xeon CPU, so it will not be overtaxed by running 7 as a virtual machine at all.

The installation of Windows 7 Release Candidate on Hyper-V was straightforward once networking issues were sorted out. Hyper-V by default installs a virtual network, which must be properly configured in order to give VMs proper network access. Specifically, any virtual network adapter that you set up in the Hyper-V Virtual Network Manager must be specifically configured to give access to the external network and then each VM must be hooked onto the correct VNA. I got fooled by the fact that the VNA has the same name as the physical adapter in the machine and wrongly assumed that this VNA would automatically connect to the external network when it was set up by default for private connection only. Apart from that issue, which stopped all the VMs of different OSs that I tried from connecting to the network, installing W7RC was extremely straightforward from a blank VM.

Now that we have it running I guess we will play with it a bit. I prototyped Vista in our school by running it on my own desktop for the last year and I will be an early adopter of 7 once it becomes widely available. In the meantime we are considering options for transitioning our users to Vista, much as we moved everything to XP a few years ago. This time around, though, there seems to be a marked level of resistance in the sector to Vista, which may to some extent be a reflection of the wider community’s attitude. There is not such resistance in tertiary institutions where Vista is being taught, and I would expect them to be on the forefront of adopting 7 when it is released. However, Microsoft has cautioned against organisations trying to bypass Vista and it makes no sense to jump straight from XP to 7 until the latter is well matured.

Monday 18 May 2009

Picasa 3, Take 2

Last year, Google updated the Picasa software to version 3. At the time I tried it, but was very unhappy with a few things. Principally, these were the forced automated update from 2.7 (it downloads the update without asking you and then installs it behind your back), and the way it interfered with my scanner and camera settings. The issue with the update is that comparable programs like Firefox never silently move you to a new major version, and they handle updates much more transparently. It is almost impossible to stop the update from 2.7 without being underhanded, because as soon as you start it up for the very first time, and before you get a chance to change the settings so it doesn’t check automatically for updates, it starts downloading 3.x right away and installs it automatically at the next startup. And if you do manage to stop it from updating, it keeps hassling you every few days to download the update anyway.

At the time I found Picasa had taken over from Canon CameraWindow as the default application when I connected a camera to download pictures. This issue has been addressed in the newest version that I installed, 3.1.0. Picasa also doesn’t interfere with scanner settings – the Epson’s front panel button still launched the Epson Scan software without any quibbles.

One of the most useful features of Picasa 3 is the automated web album sync, which removes the need to manually upload new or changed files. I expect to give this a good test over the next few weeks as I update all of my existing pictures in lepidopterophile’s albums with tags. Being able to add tags to pictures is another new feature, though the interface provided could use some design improvements. It was pleasing to see that this feature is implemented through existing IPTC tags. Last I heard, Google was having some trouble implementing these, so it’s good to see they have been able to make them work, because it means that when the occasion suits you can use another IPTC client (I use IrfanView). Another nice feature is being able to make a video from a set of pictures. I made the exploding camera into a video and uploaded it to YouTube.

Now that I have decided to stick with Picasa 3, Google will have to work hard to overcome my previous negative view of it. So far, it’s looking promising.

Saturday 16 May 2009

Camera Totals

  • Canon Powershot S1 IS: 9496 images, 47 months (202/mth). May 2005 – April 2009, display circuitry failed.
  • Canon Powershot A400: 5960 images, 21 months (283/mth). June 2005 – March 2007, sold.
  • Canon Powershot A450: 3265 images, 10 months (326/mth). March 2007 – January 2008, dust entered lens.
  • Canon Powershot A460: 4512 images, 16 months (282/mth). January 2008 – present.
  • Canon Powershot S5 IS: 1924 images, approx 9 months (TBD) (~213/mth). About August 2008 – present.
  • Total 25,157 images (524/mth).

Since June 2005 I have always owned two cameras, except for the last 9 months or so when I owned three. Now, I’m back to two, but I also still own the A450 although it’s next to useless, and will probably buy an A2000 next month. Not sure what I’ll do with the A460 then.

The Exploding Camera Part 2

OK, I now have the top off the camera so let’s see what is revealed, starting with the top panel itself. At top left you can see a sliding switch that is moved by the rotating zoom controller to zoom the lens in or out. Below it is some shiny metal holding the power switch and mode dial in place. To the right of them is the mechanism (relay) for popping up the flash, the electronics of the flash itself and its storage capacitor (the large black object).

Going onto the main body of the camera, the only part we haven’t seen already is the board on top of the battery compartment. This is connected by wires to the flash release relay (red and white, already mentioned).

After taking this photo I detached the top cover and decided there was no further need to examine it as there is little in there apart from the top switches and the flash. So I went back to looking at the main body of the camera.

This is the back of the camera body with the main logic board pulled off so you can see what’s on it. Prominent in the right centre of the board is the DiGiC processor (Digic-1 in this model). You can see that the camera has a chassis made of metal, to which the main components are attached. Looking at this chassis, on the left we have the lens with the imager attached to the rear of it. To the right, the black rectangle is the plastic membrane that encloses the Compact Flash slot. At right is the board carrying most of the control buttons on the camera’s rear. This normally sits behind the CF slot, but I bent it out of the way for the picture.

Here’s a shot of the lens mechanism. Curiously, the covers for this mechanism are partly held in place by strips of what looks like duct tape. The golden-coloured object bottom right appears to be the ultrasonic lens motor, and the gears obviously transmit the rotation of the motor to drive the lens in and out. The number of gears shown appears to be a simple way of getting the drive from the motor up to the top right corner where the lens is engaged. I would have thought that direct drive would be the way to go, but I suppose the shape of the camera partly dictates this, with the bulky motor cover being aesthetically desirable to locate low down on the front rather than high up.

And then I undid a pile more screws in the chassis, and the rest of the camera separated into three. In the top left is the chassis. Top right is the battery compartment, and in the foreground is the lens with imager, which is the only part that is worth bothering with now. The metal bracket on the back of the lens holds the CCD imager in place; obviously you need a good rigid assembly for this. The bracket is held on with tiny Torx screws which I am not going to attempt to remove as I’m sure I don’t have a bit small enough to do it.

And now, at last, here is the live side of the CCD imager itself. That multicoloured rectangle inside what looks like a slide frame. The flash has revealed a bit of detail, that it’s basically a chip mounted on its own little board. The ribbon cable coming off the back of this has a number of tracks on it, including what must be some test points that are covered up by a piece of tape. The CCD is pretty well sealed inside what must be an airtight space, most likely a cleanroom assembly.

At this point I stopped to feel a little bit nostalgic and sad. This camera, after all, cost me a lot of money when I purchased it on the 6th of May 2005. The sticker price for the camera itself was NZ$710.00. A power adapter (which I still use sometimes with the S5) cost another $98, and a Sandisk 256 MB Ultra II Compact Flash card was $80. I got a trade-in of $150 for my old Pentax MX 35mm SLR, so all up this was a $738 purchase. The decision to buy the S1 was a late choice in the process of selecting a camera. I had looked seriously at a similar but cheaper Fuji FinePix model, and the decision to buy the S1 was only made a few days beforehand. What swung it for me was the excellent video performance (almost Mini-DV resolution and a high data rate of around 100 MB/minute), the articulating LCD screen and some other smaller points. It was a stretch of my budget to buy the S1, but it was a worthwhile purchase overall and the camera served me well over the next four years as I took 9496 images with it. The first photo was taken the day it was bought, and the last photo of record on 19th April. Its replacement currently retails for $775, which is probably about par considering the recent decline of the exchange rate.

And finally, here’s the box of camera parts ready to throw in the rubbish. All I’m keeping is the neck strap, the CCD and the three Compact Flash cards. There was a 16 MB card shipped with it, and I eventually bought two 256 MB Sandisk cards. That’s pretty small by today’s standards, but each one could hold several hundred pictures at the 3.2 MP Fine setting. The camera served me very well for almost 10,000 images. Maybe that is all they are engineered to last for.