Friday 25 June 2010

Automating Windows Installations [5], Data Ownership in Major Databases

This has been a big project for us, starting from scratch to get a Windows 7 image built for our particular uses, and at times it has been kind of overwhelming. I wrote before that monolithic is out and componentised is in. However, the simple fact is that no single solution will suit all requirements. After my previous experience of being unable to capture a monolithic image, I tried componentised; then, when that didn’t seem to be going anywhere, I tried monolithic again, this time with a freshly built image on a virtual machine. Applications were installed in two stages – the optional or “Feature” applications that staff like to have (like iTunes and applications for specific teaching situations) and the required or “Core” applications (antivirus, Office, etc). Both capture stages completed successfully, leaving me with images that could be deployed to test platforms. The next step is to deploy to a production environment – our laptops – so I created a deployment sequence to add the hardware-specific drivers and the additional applications just for these laptops. At the moment the first production deployment is completing, and it has had just two minor glitches so far, with dialog boxes popping up that shouldn’t be there. I think the model we will follow in future is to build the monolithic image on a generic VM for the base OS and the applications for a specific scenario, and then customise it for the specific hardware with the drivers and any related apps. The great advantage of this approach is being able to use the VM to customise the image, so I don’t have to have an example machine at hand to test it on, although one will have to be borrowed for the production deployment test. This time around the settings I specified in the deployment wizard were applied (joining the domain etc.), so that is also good. With the particular image we are using at the moment, the base OS + apps image takes about 40 minutes to install and the platform-specific apps take about another 40 minutes. We are currently using MDT 2010 gold for everything except sysprepped captures, and the MDT 2010 Update 1 beta for the captures. I guess the total effort has taken about as long as our first Vista image did, but this time it is a lot more repeatable and reusable, so learning MDT will be worth a lot more to us overall.

The other issue I am writing about today is data ownership in major databases. The biggest database that a typical NZ school will have is their Student Management System, which I will use here as an example. There is a tendency to think of applications in terms of the old file-based DB approach, where each application has its own data and tends to use it exclusively. For those of us with DBA experience, that fails to exploit the key benefits that databases have to offer, mainly centralisation and ease of management. Most vendors are well aware of this, however, and provide the capability to export data for other uses and, in some cases, ODBC access to their backends. One of the important considerations is being able to allow other applications access to a major database such as an SMS, and being able to take that data to another product of the same type (e.g. another SMS) if for some reason you need to change vendor. An SMS vendor should be prevailed upon to give you, at least, read-only access to all of your data, and full documentation of its functionality and how it uses the data. Our experience with Integris has been that both of those requirements were fulfilled – the documentation has not always been complete, but enough information was available to begin with to enable me to link two reporting applications into Integris to analyse PAT and other assessment data (STAR, WL etc.), using Integris as the data entry frontend and Microsoft Access to do the reporting.
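In practice the linking side of this is pretty mundane: you set up a system DSN on the reporting machine pointing at the vendor’s backend and then attach the tables from Access. As a rough sketch only – the driver, server and database names below are placeholders, and whether your SMS backend actually exposes a SQL Server-compatible ODBC driver is something to confirm with the vendor – it amounts to something like:

    rem Create a system DSN for the SMS backend (driver/server/database names are placeholders)
    odbcconf /a {CONFIGSYSDSN "SQL Server" "DSN=SchoolSMS|Server=smsserver|Database=smsdata|Trusted_Connection=Yes"}

Once the DSN exists, Access can link the backend tables over ODBC and the reporting queries then run against live SMS data.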

The important consideration in getting free and full access to your data, and to proper documentation, is that at some future stage you may need to switch, for example, to a different SMS as your requirements change. Having your data locked into the vendor’s proprietary format with sparse documentation can make it difficult to switch to another SMS as easily as you would switch an operating system or other general-purpose application. To a certain extent the specialisation of SMSs is part of the problem, but vendor lock-in is a consideration that needs greater attention from the powers that be. I understand that there is official interest in getting NZ SMS vendors to move to client-server architecture, which gives better options for sharing SMS data. Within a school this can still be ensured if the backend provides access to other apps through a standard interface such as ODBC.

Tuesday 22 June 2010

Automating Windows 7 Installations [4]

Today we moved along to our first major deployment project using Windows 7 and MDT. We have several new laptops and have decided that they should be upgraded from Windows XP to Windows 7. In practice this means a clean installation of Windows 7 plus applications. Right here it is best to have a look at two methods that have long existed for setting up a PC. One of these is cloning, in which a reference PC is set up with the operating system and all applications. A complete image is then taken of its hard disk, and this image is loaded onto other computers. This is called a monolithic installation. If you only have a few different hardware or software specs then it is relatively quick and simple to use. Symantec Ghost is one of the well-known applications used to create monolithic installations, and because of this, cloning is also known as “ghosting”.

The second method which is more commonly used when there are a number of different hardware and software specs to be considered is a componentised system. This one works by automating the installation of the base operating system, drivers and software applications onto a computer. Although it takes more work to set up such a deployment, it is much easier to maintain, since you don’t have to reload or rebuild a monolithic image to make changes to it when something changes in the hardware or software; all you have to do is make changes in what is installed onto a target machine in the automated setup sequence. The result is that at all times you only have one copy of the component files – the operating system, drivers and applications – and changing these is as simple as a few clicks in the deployment management software that you are using.
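To make the componentised idea a bit more concrete: each application you add to the deployment system is really just its installer plus a silent install command line. The examples below are only placeholders (the package names and switches depend entirely on how each vendor has packaged their installer):

    rem A typical MSI-based application entry - fully silent, no reboot
    msiexec /i "\\server\Packages\SomeApp\SomeApp.msi" /qn /norestart

    rem A typical non-MSI setup.exe, where the silent switch varies by packager
    "\\server\Packages\OtherApp\setup.exe" /S

The deployment tooling then just runs these commands in sequence on the target machine after the base OS goes on.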

MDT is set up to handle both approaches; Ghost can only really handle monolithic installs. MDT builds on the deployment technology that has existed for previous versions of Windows – it has been possible before now to build customised installations of the componentised type, but MDT tidies all of it up. As I may have noted in previous articles, we experienced problems trying to perform a monolithic image capture, so work today has gone into putting together some sort of componentised install. This, being a learning curve from the ground up, is rather a lot of work and involves trying out various steps. The monolithic install has had lots of problems and for now I have had to ditch the idea. In the process I have had to set up my deployment share again from scratch in order to be using a stable production edition of MDT rather than an update beta. Basically, today I am putting together a ground-up process of deploying a base OS onto a laptop with its drivers, and working from there to get applications to go out as well, because that is what works in the current release of MDT 2010.

Anyway, what worked today and what didn’t:

  • The monolithic capture of a full PC didn’t work in MDT 2010 Update 1 (some yucky Sysprep error)
  • Update 1 trashed the deployment ISO images after I put in the drivers for the 6730b (it injects them into the boot images, and in the process it somehow threw a wobbly, so the deployment wizard failed the next time I tried to run it from a laptop). So I had to set up a new deployment share running the original release of MDT 2010.
  • As I think I said in a previous post, we can’t do a capture except with Update 1, but being a pre-release it has issues of its own anyway.
  • So I went back to the original release of MDT 2010 and used that to deploy the base OS with drivers, which did work. On top of the standard Windows driver injection, a further stage injects the HP-specific drivers, and it works well. Except for one or two little things…
  • I also booted to Windows PE on that fully installed laptop (the one that was going to be the source of the monolithic image) and captured it with ImageX, so at least if everything else turns to custard I can apply that to a laptop and then generalise it with a sysprep (a rough sketch of those commands is below this list).
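For the record, that fallback capture was done with the plain WAIK tools rather than MDT. From memory the commands were roughly as follows – the share path, image name and drive letters are from my setup and will differ elsewhere:

    rem In Windows PE on the reference laptop: map a share and capture the system drive
    net use z: \\server\captures
    imagex /capture c: z:\hp6730b-full.wim "HP 6730b full install" /compress fast

    rem Later, to fall back to this image: apply it to a target machine, then generalise it
    imagex /apply z:\hp6730b-full.wim 1 c:
    c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown

Not elegant, but it means the work that went into that reference laptop isn’t lost if the MDT capture keeps misbehaving.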

So my next step will be to add to my initial effort at a componentised install and figure out how to deploy applications, either as part of that install or as a separate task sequence. That will mean I can give captures away for the moment (the part of MDT that doesn’t work) and yet still have a fully automated install. I think four parts of this series is enough, though.

Friday 18 June 2010

Deploying printers using Group Policy in Windows 7

This is a subject I have written about various times before. We didn’t get the capability to install printers automatically on client computers via a GUI until Windows Server 2003 R2 was released, which came out about two years after I started working at the school where I am now employed. Along came the Print Management console and with it, GPOs to deploy printers to clients both per machine and per user. This did have a few hiccups and limitations, though. Not long after that, in 2006, Microsoft bought an outfit called DesktopStandard and in the process acquired a product called PolicyMaker. This was subsequently incorporated into Windows Server’s Group Policy as the Group Policy Preferences. GPP has effectively superseded the Print Management Console-based GPO system as the most effective way to deploy printers to a client computer or user, given its enhanced capabilities such as being able to set the default printer for a user, and to easily add, remove or update printers. These capabilities worked just fine on XP and pretty soon we had them in use across our whole network.

When I started to test out Vista a couple of years ago, one of the biggest issues to surface was the failure of GPPs to deploy printers to a Vista client. Even in Windows 7, you will get the printer deployed, but the drivers will not be seamlessly installed as they are on XP – the user will see the printer icon with “Driver Update Needed” displayed next to its name, and they will have to manually install the drivers for each printer (although this is pretty straightforward since it will automatically get the driver off the server where the print queue is shared from).

It appears this is pretty easily resolved by changing some settings in a GPO – specifically the Point and Print Restrictions policy, located at Computer Configuration > Administrative Templates > Printers. Enable the policy, decide whether to tick the first two boxes, and then go down to the two settings that control whether to show an elevation prompt or warning for installing new printers and for updating drivers, and set them so that no warning or prompt is shown. I am presuming that at logon the system is not able to show these prompts while the GPO is being applied, and that this is the reason the drivers do not get installed. After applying these GPO settings to a newly set up VM, I was pleased to see that at the next logon a pile of printers showed up (using a particular account that had the preferences applied) and that they all appear to be installed properly.
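For reference, my understanding – an assumption on my part rather than anything from official documentation – is that this policy just writes a handful of values under the Policies hive, so the manual equivalent on a test machine would look roughly like the following (the GPO itself is obviously the supported way to do it):

    rem Assumed registry equivalent of the Point and Print Restrictions settings described above
    reg add "HKLM\Software\Policies\Microsoft\Windows NT\Printers\PointAndPrint" /v Restricted /t REG_DWORD /d 1 /f
    reg add "HKLM\Software\Policies\Microsoft\Windows NT\Printers\PointAndPrint" /v NoWarningNoElevationOnInstall /t REG_DWORD /d 1 /f
    reg add "HKLM\Software\Policies\Microsoft\Windows NT\Printers\PointAndPrint" /v UpdatePromptSettings /t REG_DWORD /d 2 /f

Checking for those values on a client is also a quick way to confirm the GPO is actually being applied.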

This means we can stop having to play around with the old Print Management Console GPOs and use the same GPP printer deployment policies for all of our workstations regardless of OS. Resolving this issue is one further obstacle removed in our Windows 7 upgrade path.

Thursday 17 June 2010

Automating Windows Installations [3]

Once you have got started with MDT 2010, you will want to start playing with its more advanced features and functions. I may have mentioned that one of the scenarios for MDT is to deploy a base OS and install its applications automatically, instead of capturing a reference PC. This approach is ultimately the one you would follow where you don’t have access to a reference PC for each different hardware platform that you are deploying to (by which I mean different hardware combinations such as motherboard make/model, installed hardware devices etc.) or for each set of software requirements (you may have a lab that requires specific software and some classroom computers that don’t, for example). If you can package up all the deployment scenarios into tasks in MDT then you can ease the requirement to continually update reference images with new software as it is deployed. Since each reference image has first to be deployed to a reference PC, updated, and then captured before deployment, the MDT base-plus-apps scenario would save a lot of time spent updating these images as requirements change or new software is released. This presupposes that reloading is the best way to deploy updates, as opposed to deploying them through Group Policy or installing them on individual machines. MDT also eases the deployment of drivers, as they aren’t required to be injected into individual installs. MDT provides for the scenario where you may need to inject, say, RAID drivers into the boot CD image in order to be able to deploy the base OS image from the server, but it handles driver installation requirements for Windows itself by maintaining these in its own database on the deployment share, so all the administrator has to do is import the drivers once, and then let Windows Setup handle the download of those drivers from the share during the hardware scan. Importing for, say, an HP laptop that comes with the drivers in C:\SWSetup would be as easy as pointing the driver import wizard to that folder and letting it handle the import – provided all the drivers are compatible.
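By way of comparison, doing driver injection by hand with the WAIK tools means mounting the offline image and adding the drivers with DISM, roughly as below. The paths here are placeholders; MDT does the equivalent of this for you when you import drivers and update the deployment share:

    rem Mount the boot (or OS) image, inject a folder of drivers, then commit the change
    dism /Mount-Wim /WimFile:D:\DeploymentShare\Boot\LiteTouchPE_x86.wim /Index:1 /MountDir:C:\Mount
    dism /Image:C:\Mount /Add-Driver /Driver:C:\SWSetup /Recurse
    dism /Unmount-Wim /MountDir:C:\Mount /Commit

It works, but it is exactly the sort of fiddly step that is nice to have MDT keep track of instead.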

I guess one immediate thought that came to mind was whether MDT would be powerful enough to support differencing, that is, the ability to roll out updates to machines that haven’t got them installed, as opposed to a full reload. I’m guessing that this capability isn’t offered at the level MDT operates at; it is neatly handled by Group Policy for MSI-based deployment, or by the more sophisticated System Center products. Still, it is an interesting thought. The main constraint on packaging applications for deployment via MDT is likely to be whether an MSI-based install is available for each application and whether it can be scripted sufficiently. As an example, I have had trouble recently using Group Policy to install the Maori Keyboard driver files, because the silent install defaults to a per-user rather than a per-machine installation. However I have yet to work fully through the MDT documentation to confirm my expectations of how it will work.
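If the installer in question is a standard MSI (which I haven’t confirmed for the keyboard package – the file name below is just a placeholder), the usual way to force a per-machine rather than per-user install is the ALLUSERS property on the msiexec command line:

    rem Force a silent, per-machine installation of an MSI package
    msiexec /i MaoriKeyboard.msi ALLUSERS=1 /qn /norestart

Whether that works for any given package depends on how it was authored, so it is something to test rather than assume.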

Another scenario for MDT is to automatically update a PC from Windows XP to Windows 7. That doesn’t sound like much – surely you would just put the Windows 7 disc in and upgrade? Unfortunately it is not so simple, as there is no upgrade path from Windows XP to Windows 7. As such, MDT provides the option to automatically archive and migrate user data using the User State Migration Tool, before it wipes off XP, clean installs 7 and then restores this data. This is a scenario we probably won’t bother with too much, if at all, because our migration plans at this stage are based on installing Windows 7 onto clean new machines as they come up for replacement. Still, it’s good to see the depth of functionality and features provided for in MDT. Another nice feature is a customisations database using SQL Server Express (which is free) that streamlines keeping track of the customisations for different computers or groups. Like WDS, putting in SQL Server Express is an option I will look at when we set up a more formal deployment server later this year. Putting in the MDT database will be the prerequisite for customising installation parameters rather than mucking about with the default ini files that are provided with MDT, so I won’t be installing that feature just yet.
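For what it’s worth, the user state side of that refresh scenario is just USMT’s ScanState and LoadState being run for you. Done by hand the calls look something like the following – the store path is a placeholder, and MDT wraps these up in the task sequence so you never type them yourself:

    rem Capture the user state from the old XP installation to a network store
    scanstate \\server\migstore\%computername% /o /i:migdocs.xml /i:migapp.xml /v:5

    rem After the clean Windows 7 install, restore the captured state
    loadstate \\server\migstore\%computername% /i:migdocs.xml /i:migapp.xml /v:5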

My next stage of learning MDT will be to look further into a base OS plus applications plus drivers installation. The extent to which we can use this depends on how well our applications support the kind of automated install scenarios that MDT requires. I’m guessing some legacy apps won’t support silent installation, so it may not be so easy unless a repackager like WinInstall LE can be used. Ultimately we will probably opt to deploy a reference image where there are significant numbers of applications that cannot easily be automated, and while this is less flexible, it can be customised with installable apps in the same way as a base OS install can be. We have the advantage of not having too many different software configurations, so reference imaging might work better for us in MDT at this stage, with customisation of the Office installation for different requirements (staff vs students) being the main application deployment add-on to a reference image. I would expect that the Selection Profiles capability of MDT would enable us to choose which Office install customisation file gets deployed.
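For the Office 2010 part, the plan would presumably be the Office Customization Tool: run setup with the /admin switch once per scenario to produce an .msp customisation file, then give each MDT application entry the matching silent command line. The .msp file names below are just placeholders:

    rem Build the customisation files (run from the Office 2010 installation source)
    setup.exe /admin

    rem Separate silent install command lines for the two scenarios
    setup.exe /adminfile Staff.msp
    setup.exe /adminfile Students.msp

A selection profile or task sequence condition would then decide which of the two entries applies to a given group of machines or users.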

Wednesday 16 June 2010

Automating Windows 7 Installation [2]

Having completed my reference computer setup, I decided to jump in boots and all to the task of getting this computer imaged. Starting at Part 11 of the series at WindowsNetworking.com, I began by executing the script LiteTouch.vbs. In practice I have to use RunAs /user:<username> in order to log on to the deployment share (and it looks like it will only run on a domain-joined machine if you are using a domain server). On subsequent attempts I used a command prompt that I had run as Administrator and there was no need to use RunAs.
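For anyone trying the same thing, the command ended up looking something like this – the account, server and share names are from my environment and are only placeholders:

    rem Launch the MDT deployment wizard on the reference PC under a domain account
    runas /user:DOMAIN\deployadmin "cscript.exe \\server\DeploymentShare$\Scripts\LiteTouch.vbs"

From an already elevated command prompt you can drop the runas part and just run the cscript line directly.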

This first attempt was unsuccessful after the sysprep was completed (the Sysprep itself looked to have worked OK). That was because, in skimming through the previous steps, I had missed the necessity to update my deployment share, which generates the Windows PE images that LiteTouch needs in order to reboot the computer and complete the capture. The next problem, which is well known in the MDT community, is the loss of the network connection after Sysprepping, which in our case (and others) appears to be at least partly due to a conflict with Symantec Endpoint Protection. Microsoft is well aware of this situation and is addressing it in Update 1 of MDT 2010, currently in beta. So my next step was to obtain this update from Microsoft Connect for further testing. A key change in the deployment sequence is to apply the Windows PE boot image before the Sysprep stage. This means that the reference system can be rebooted immediately after completing the Sysprep, instead of trying to continue over the network share when it has effectively lost connectivity, as happened previously.

This time around it worked as expected. After Sysprep the reference computer automatically rebooted into Windows PE using the custom MDT boot image, and created the WIM file for this operating system image. That last step is the slowest part – making an image of the whole computer is always going to take a while. I couldn’t really say how it compares to Ghost in speed terms, but it seemed to get the job done relatively quickly: the captured image was around 9.5 GB and I would say the whole capture was completed in about an hour. After the capture task has completed, the next step is to import the captured computer image as described in Step 2 of Part 10 of the WindowsNetworking.com series, then deploy the image to a target PC. The deployment to a target was also successfully completed.

So far, the use of MDT is a huge step up from my previous efforts using a manual sysprep and imagex on Vista and looks to be the way to go with Windows 7 imaging for us in the future.

Monday 14 June 2010

Automating Windows 7 Installation [1] & Thin Client Evaluation

Almost two years ago I wrote that I was planning to roll out Vista to student desktops. For a variety of reasons that didn’t happen. Some of the reasons were alluded to in that article – there were so many headaches with the new security model, for a start. Last year I made a concerted effort to complete an image for staff laptops. This was an extremely time-consuming procedure, not the least of which was learning how to use the new version of Sysprep from the Windows Automated Installation Kit. It was around that time that I heard of the release of BDD (now MDT), which seemed in some respects to duplicate the AIK. I don’t know whether the AIK and MDT are supposed to be alternative options for installation, but all I can say from my MDT experience so far is that it is much superior to my Vista experience with the AIK. The reason I followed the manual Sysprep process for Vista was that I used Ghost to make the image. This time around I’m using MDT for the entire Windows 7 deployment process; it does the Sysprep automatically and makes the customisations a lot easier to produce.

MDT includes all the tools necessary to automate as much as possible of your Windows 7 imaging and deployment experience, and can streamline the entire process from beginning to end. Going to Windows 7 on our laptops is ideally something we would have deferred until they started to come with it preinstalled, but this has not occurred in the timeframe I had hoped. Because we will be looking to deploy new desktop PCs with 7 by the end of this year, and Office 2010 is now out, it is a good time to start looking into what is needed to roll out Windows 7. We definitely won’t be using Ghost any more; we would probably have had to buy more licenses to keep using it, and I think it offers no advantages now. I played with some of the advanced functionality when we first got the Ghost Solution Suite a few years ago, but have not bothered since as it is too much work to set up.

In order to begin learning about and using MDT, I accessed the series of articles at WindowsNetworking.com covering Windows 7 deployment. I downloaded and installed MDT 2010 in order to start working on this project. The reference PC was then set up from scratch using the Windows 7 x86 architecture. Although the x64 architecture is pretty good as well, we are keeping to 32-bit for compatibility reasons, the same as I did at home. While I have had no problems to date with x64 on my desktop, the same can’t necessarily be said for all of our laptop users, who may have older hardware for which 64-bit drivers are not available. As long as their laptops have less than 4 GB of RAM, we will use the 32-bit architecture for them. Once the Windows 7 installation was completed, I installed some applications, including Office 2010, that we typically use. I then installed MDT and started setting it up. One issue to be aware of with MDT is whether the machine it is running on is x86 or x64. There are two different editions of MDT for the two architectures, and you have to install the matching edition; but on an x86 machine you can service both x86 and x64 images, whereas on an x64 machine you can only service x64 images. Once I realised this, I transferred my MDT installation to a virtual machine running x86 Windows so that I can handle both types of images. MDT doesn’t require a WDS server to run – all you need is a computer or server that can host a share to allow the files to be accessed over the network during operations.
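The Deployment Workbench normally creates and shares the deployment folder for you, but the point is that there is nothing exotic about it – if you were setting the share up by hand on an ordinary PC it would be no more than something like this (the folder and share names are placeholders):

    rem Share an existing folder read-only for clients to pull deployment files from
    net share DeploymentShare$=D:\DeploymentShare /grant:everyone,READ /remark:"MDT 2010 deployment share"

Any box that can serve files at a reasonable speed will do the job while we evaluate things, with a more formal deployment server to follow later.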

The other big thing I have done a lot of work on lately is with HP thin clients running on a Remote Desktop server to see if they can do lots of things that a PC can do, almost as well. This has taken a lot of work in order to determine the right policy settings to use with an RDS server that is running WS 2008. Most apps have worked well, and it hasn’t been the issues with x64 that have caused problems, as few apps seem to be affected by this. In general, 99% of the time an app that will not run on 2008 x64 will not run on 2003 x86 either. We have tested the HP thin client with fully automated login, so that just turning it on will cause the desktop to come up on the server, and it can complete the entire process of client startup and login in less than a minute. There are still some technical issues to be solved, most notably that printing will have to be set up for it. I am aiming to have one in use in a classroom by the start of Term 3.

Wednesday 9 June 2010

HP Thin Client pt 2 / Windows 7 / Office 2010 / Office Live

This post is a bit of a wrap-up of a whole lot of stuff. Firstly, our experiments with the HP Thin Client are continuing. I successfully got it to connect to a WS2008 RD server with NLA turned off. I’m now experimenting with student logins and mandatory profiles. One of our standard configurations for students is to redirect the Start menu to the All Users Start menu. In Windows XP this is in one location, but for Windows 7/Vista the location has changed, so I have put in a loopback GPO for this particular RD server. I also had to put in another loopback policy because the Start menu options aren’t adjusted the way they would be on an RDC 6 or 7 client (to ensure a logged-in user can’t shut down the server!). This was also seen with the NComputing terminal connecting to vSpace on WS2003, so maybe it is not a limitation of NComputing’s client. I have got the HP to run in Kiosk mode and to connect automatically, so that when it is turned on, all the user has to do is wait for it to bring up the RD logon screen and then log in just as they would on a PC. We have just been busy experimenting with different software applications to see how well they work in an RD session, and so far it looks pretty good – a lot of software titles that don’t require full-motion video work pretty well in it. So it is not that far away before we try it out in an actual classroom.

Windows 7 continues to be my OS of choice for PC installations and I expect pretty soon, as soon as I have some of the necessary work done, that we will start building a Windows 7 image for HP laptops. It will incorporate Office 2010, which I am also slowly rolling out on other computers around the place. 2010 has many improvements, and the ability to save an Outlook Live password, which Outlook 2007 can’t do, will no doubt be welcomed by users. Office 2010 was released to volume customers (us) at the end of April, ahead of the retail launch. We will need a Windows 7 image for the next time we buy some student computers, which will probably be happening towards the end of the year. I have decided that future imaging will use the free deployment tools provided by Microsoft, specifically the MDT system or part of it. Last year I spent a lot of effort learning how to put together a sysprepped Vista image. Unfortunately Vista turned out to be such a crock on our desktops that the main thing I got from that effort was the knowledge and experience, which can be reused to image Windows 7, as the two are based on the same architecture. So I hope that the Windows 7 experience will be more productive, especially as there is not going to be an “official” image for these laptops for some time.

Today I also heard that Office Live has been launched. Office Live is the online (browser-based) version of Word, Excel, PowerPoint and OneNote, and a rival to Google Apps. I went to the website, logged in using my Windows Live ID, and saw that the four icons for these applications now appear alongside the SkyDrive section of my profile. What Live@Edu users are now waiting for is the SharePoint Live rollout, which will integrate Office Live into their organisational Exchange Live infrastructure; apart from the online Office apps, SharePoint Live will (I presume) also provision Live@Edu organisations with document sharing and collaboration features, the way the previous Office SharePoint Server technology did.

[Screenshot: Word online]

Monday 7 June 2010

New vs secondhand vs rebuilt / refurbished PCs for schools

OK, this is a big subject of discussion for schools. Should your school buy new or second hand PCs, or look at rebuilding existing computers it already has? In NZ there is no direct funding for computer hardware. Schools have to make the most of their tight budgets and the cost of computer purchases adds up.

First, what is the useful age range of modern computers? Although in many cases, even today, the base hardware can run for 10 years or more, such a computer will not be able to run modern operating systems or software. It doesn’t matter which OS architecture is being discussed – if you want a PC that can do all the whizz-bang latest things, especially high demand applications such as gaming, it will have a limited life. I would say this life is about 6-8 years. If you have very low demand needs, such as many home users do, the PCs could keep being used until they physically wear out. If you choose to use Windows, as I do, then it is useful to upgrade your PC about every 5-6 years according to the timetable of either major or minor Windows releases (MS alternates major and minor Windows releases). The same PC should be good for two releases.

Second, how fast does a new computer depreciate in cost? I would say at about $100 per year. A new PC for education might cost you around $800. At 6 years old, you might be able to resell it for $200. These are very approximate numbers of course. It pays to add up these numbers when you look at the argument of second hand vs new, because there is, I would argue, precious little money to be saved on second hand purchases unless you are really short of capital (in which case leasing might suit better).

Third, how much does it cost to rebuild an existing PC? The economic case for rebuilding really only kicks in at 6 years or more of life, when it can be compared favourably to the cost of complete replacement. To make this case work, the old computer has to be worth next to nothing and you have to have free labour for the job and a very cheap supply of parts. A rebuild as defined for the purpose of this article is replacing the motherboard, CPU and RAM, and it is quite typical that all three of those would have to be replaced together, such is the rapidity of change of signalling / interface standards in modern computing. However in some cases you would also be replacing the power supply, hard disk and CD/DVD drive. If the chassis doesn’t fit the components (particularly true of proprietary brand PCs) or is obsolete and would need to be replaced as well then the job is not worth doing at all. It is also worth noting for the OS licensing issue that replacing the motherboard with a different type voids an existing MS OEM license for a PC.

As an example, I’m rebuilding my old home PC, which I can do economically with free labour and a cheap parts supply. It is costing me about $370 (ex GST) to replace the motherboard, CPU, RAM and power supply in an existing chassis I already have, keeping the existing hard disks and DVD writer. That buys the cheapest Intel-brand motherboard, an Intel dual core CPU, 2 GB of DDR2-800 RAM and a modern ATX2.1 power supply. These all fit into an existing five-year-old chassis which I got for nothing. For Windows I will need a new OEM license, as the sticker on the side of the case is now invalid. These costs all add up and make it difficult to justify rebuilding in the majority of cases.

Fourth, should you buy brand-name computers, or generic? There is no doubt that brand-name PCs have achieved heavy market penetration in recent years, due to aggressive pricing and advertising, and in turn many local assemblers have gone out of business. However, in the education market, Insite Technology and Cyclone Computers continue to offer a viable local alternative to the multinationals. A very important consideration is the total cost of ownership of your computers over their whole life, which for many schools will be longer than the three years of warranty coverage. Proprietary spares and supplies for brand-name computers are often very expensive, and after five years you may not be able to get new parts, whereas a locally assembled box generally uses standardised parts that are available from multiple suppliers (watch out, however, for small form factor chassis that may use a custom power supply). It is for this reason that our school buys locally assembled PCs, just as we buy our printers on the basis of their total lifetime cost including the pricing of the major consumables.

Fifth, what are the major factors of computer design that affect speed? The biggest is probably the speed of the main memory. In the last 8 years (relevant because our school still uses some PCs that old) RAM has gone from PC133 SDRAM to DDR-266 through 400, to DDR2-533 through 800, and now some higher spec systems are running memory at 1066 or even 1333 MHz. Every advance has made a big difference. The amount of memory also matters, although it becomes less significant as the PC ages, because by then the speed matters more. CPU clock speed is less significant, because the speed at which the CPU can communicate with other components has much more of an impact, given that such communication is happening all the time; the increase in CPU cache sizes, however, has a big impact. Next would be disk speed: IDE gave way to the faster SATA, which has gone to SATA-II with a doubling of the data rate in recent years. If you do a lot of graphical work then the speed of the graphics processing components will be significant, and here a separate card has a big advantage over most onboard chipsets.

Sixth, are there advantages in standardising the design of your PCs? Yes, there are. It is tempting to replace older PCs individually as they break down, and therefore end up with a hotchpotch of different PCs of various specs and ages; however, the cost of supporting these will be higher. Once your school gets to, say, 50 computers or more, you start to be able to realise the benefits of using imaging software like Symantec Ghost to set up PCs from scratch and to update them regularly. This is a lot easier to do if the PCs are all very similar in spec, because you don’t have to handle multiple different drivers and installed devices.

Now, to the core argument: new or second hand? One thing that can’t really be changed is that the useful life of a PC is roughly what I have noted above, and that is relatively short. We can’t really compare buying a computer with buying a major electrical appliance or a car. The computer’s useful life, and thus our replacement cycle, is basically determined by the software it can run, not by the longevity of the hardware. As new versions of OSs and major software packages are released, they demand more of the hardware resources, so the computer becomes slower in relation to new computers, and a point is eventually reached where the computer is just too slow and the performance, particularly of graphics and sound, is compromised. Adding to this, and fairly significantly, software suppliers stop supporting their products on older OSs. All these things together drive the short working life of the average PC.

The big issue is that if you buy second hand, you are not saving much money, because while it may be cheaper it also has a shorter remaining working life. There are also higher support costs associated with second hand computers, both because they are older and probably going to break down more, and because you have to replace them more often. If the PCs are proprietary then they will be expensive to repair, and often may not be able to be repaired economically at all. When it comes to buying second hand, there is also less choice between local and proprietary; the locally assembled PCs are often a lot harder to find because of their lower market penetration, whereas when you buy new the choice is obviously greater. While we do have a number of companies in New Zealand refurbishing older computers for the education market, most of what they offer is proprietary, so the same disadvantages apply, and I am therefore unconvinced of the merit of their product.

So on balance I support the argument that a school should look to buy new PCs for a working life of 6-8 years, replace all of them at the same time, and buy them from a New Zealand based assembler that uses standardised PC parts in their systems. Leasing is available for NZ-assembled PCs as an option for schools that can’t quite get the cash together, but of course it adds up to a higher total purchase cost.

Saturday 5 June 2010

HP Thin Clients vs NComputing

As I noted in my previous post about NComputing thin client terminals, HP and other vendors have been producing thin client technology for much longer. In NZ, NComputing has been particularly effective at getting its name out there in the education community in a way that we haven’t seen with HP – in part, I suspect, because unlike HP they don’t have a conflict with an established market for full PCs. When you go to the HP website and look up thin client products, you see that HP considers them to have a place in education, but the resellers and agents in NZ aren’t actively promoting thin client terminals in the education marketplace to any significant extent that I am aware of.

Due to the well established nature of HP and other vendors in a number of sectors (not just education) where thin client hardware is commonly used, these brands of thin client terminal are widely available second hand. Therefore, in order to get a reasonable comparison and evaluation opportunity before we decide whether to take the plunge into thin client computing, we have purchased a couple of second hand HP thin clients on Trademe. One point to be aware of when purchasing second hand (as with PCs) is to make sure the seller can supply any passwords needed to log in and configure the device. All the thin clients I have encountered so far (including NComputing) can be locked down with passwords to limit the amount of end user configuration, and if you can’t get these passwords when you buy a thin client, its usefulness to you could be very limited.

[Photo: HP T5510 thin client]

This is an HP T5510 thin client, which has a Crusoe processor, 128 MB of RAM and a Windows CE shell, and supports various protocols, including RDP 5.2. It was manufactured exactly five years ago. In testing so far it has successfully connected to a Windows XP computer and a Windows Server 2003 server. I have been told it should also be able to hook onto a Windows Server 2008 server, but so far I haven’t managed it – I suspect the old RDP client doesn’t support Network Level Authentication, and Windows CE isn’t giving me an error message that would confirm this. We will try setting up a WS2008 virtual server to test with NLA turned off, since the risk won’t be that great for a server that is inside the network with no external access, or access only through the RD Gateway. Once I had figured out how to unlock the terminal from Kiosk mode it was very easy to get it going and connect to a remote desktop. Once we have a remote server set up for it to access we will be able to start testing to see whether it can do what we want.
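Turning NLA off on the test server is just the “allow connections from computers running any version of Remote Desktop” option in the server’s Remote settings. My understanding – an assumption I still need to confirm – is that the same thing can be done on the server from the command line with a registry value, something like:

    rem Assumed equivalent of the "any version of Remote Desktop" (no NLA) option on the WS2008 test server
    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v UserAuthentication /t REG_DWORD /d 0 /f

Either way it will only be applied to the internal test server, not to anything reachable from outside.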

A useful point of comparison is that the HP terminals (and others out there), unlike NComputing, can use standard protocols like RDP; NComputing makes you use their proprietary UXP system. This means you can use MS Terminal Server or one of the other supported session virtualisation technologies that are out there. The second point is being able to get cheap second hand terminals in the marketplace – even though they are proprietary and could be expensive to repair, they are cheap enough that this isn’t such a big deal as it would be with a much more valuable PC. The third point is that, like NComputing, these terminals are also supported by MS MultiPoint Server, so NComputing has no advantage there. The “shared resource computing” technology referred to in the MultiPoint publicity is, incidentally, a somewhat odd label, considering that it is just another kind of terminal server and these have existed for years. MultiPoint is good, though, if you want to delegate administration of the server to a less skilled person, and this may be one advantage if we switch to thin clients in our junior school. For now, I’ll just be testing this thin client in a classroom to see what use it is for a teacher and a junior class. It may well happen that we go with HP thin clients instead of NComputing if we switch our junior school over to replace their old desktop computers with thin clients.

Switching to Windows 7, part 2, & building a new computer

Well, I have carried on with this task since I installed 7 onto my home PC a couple of days ago. So far it has been reasonably straightforward. The following are the issues encountered so far:

  • Even if you are the administrator of the computer, you aren’t automatically granted full permissions on other hard drives in the system. You have to grant those permissions to yourself before you can change files on those drives (the command-line way of doing this is sketched below the list). This was also the case when I changed from XP to Vista with a clean install; 7 can’t upgrade directly from XP, so there is no possibility of the upgrade setup addressing this automatically.
  • Some applications like Picasa and IrfanView aren’t picking up the changes to My Pictures and other shell folder locations even though I have made the changes in the Library properties. It looks like either they are using the Shell Folder registry keys (now deprecated) or the API call that applications are supposed to make isn’t returning the correct location.
  • My Canon camera wouldn’t launch Canon CameraWindow when it was first plugged in after the installation of the original software and the Windows 7 update. I had to follow this procedure from Canon Support to get the autoplay settings of the camera to automatically launch CameraWindow (which has been updated). (I didn’t encounter this problem at work because I didn’t install the software there – I use Explorer to access photos so that the camera’s per-image “downloaded” flag is not reset.)
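On the first point, the GUI route is taking ownership of each drive and editing its security, but from an elevated command prompt the equivalent is roughly the following (the account name is a placeholder for your own user):

    rem Take ownership of the old data drive and grant your own account full control, recursively
    takeown /f D:\ /r /d y
    icacls D:\ /grant MyUser:(OI)(CI)F /t

It can take a while on a drive full of photos and music, but it only has to be done once per drive.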

However, Epson Scan was installed successfully and works as expected (using the update to the Vista version).

The Nero software that came with the DVD drive (a Sony IDE drive) is too old (as it was for Vista). I’m putting a new Sony Optiarc drive into the new computer when it is built, and the software supplied with that is Vista-compatible. The timeline for getting the new computer going is now about a month. The job got delayed a little due to other financial priorities, but I will be ordering the remaining components (RAM, CPU, mainboard and HDD) over the next four weeks so that I can get the assembly job done (in a friend’s home-based computer workshop) soon after that. I already have the case with the new power supply I bought installed in it.


Reused Foxconn TS-001 chassis with the new Enermax ATX2.1 power supply installed. When completed this will have 2 HDDs, a DVD writer, a CD writer and a card reader installed in the bays to the right. These chassis use special square-sided screws (like the Compaq ones) in their tool-less bay retention clips, but you can get by with ordinary computer screws or the Compaq ones.

Thursday 3 June 2010

Switching to Windows 7

Right now I am setting up Windows 7 on my home computer. This was not something I had planned to do, but it was driven by the fact that XP was working very poorly and would have had to be repaired or reinstalled. This computer is pretty old now – it has an Intel 915 motherboard with only the onboard graphics, for which there is no Windows 7 driver, although the generic driver supplied by MS will drive my LCD screen at its full 1280x1024 native resolution. The release version of Office 2010 is installed, as this has been available to Volume License customers since the end of April. When I upgrade my PC I will transfer the boot disk image to the new PC so I don’t have to reinstall again (at least that is the plan at this stage).

I am using Windows 7 x86 at home even though we have used x64 at work; the home situation is more likely to involve older hardware which may only have 32-bit drivers. Although there is a 64-bit version of Office, I am following the MS recommendation to use the 32-bit one. So far the whole installation has generally gone smoothly with no hiccups.