Friday 15 July 2011

V2P [4], Thin PC [4]

A couple of weeks ago I wrote about how our laptops set up with native-boot VHD would all have to be V2Pd, because VHDs on a native-boot system tend to get corrupted more often. This week it is time to put that into practice, getting started on the 14 laptops; the work will continue over the next couple of weeks. The first step is to back up the VHD. Then we mount it to a drive letter using Diskpart commands. We then use ImageX to capture it in place to a WIM. After ImageX is complete we apply the image to the same partition where the VHD was. We then remove the existing BCD from the boot partition (easiest just to format it) and use bcdboot to create a new boot configuration on the system partition. Then we boot, and the laptop should be exactly as it was.
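The sequence above can be sketched as a command-builder. This is a minimal sketch only: the drive letters, file paths, and image name are illustrative assumptions, and the Diskpart script referenced in the first step would contain the usual `select vdisk file=...` / `attach vdisk` lines.

```python
def v2p_commands(vhd=r"C:\images\laptop.vhd", wim=r"D:\capture\laptop.wim",
                 os_drive="V:", target_drive="C:", system_drive="S:"):
    """Return the command lines for each stage of the V2P conversion.

    All paths and drive letters are assumptions for illustration.
    """
    return [
        # 1. Mount the VHD to a drive letter with a Diskpart script
        #    (the script selects the vdisk file and attaches it).
        'diskpart /s mount_vhd.txt',
        # 2. Capture the mounted VHD contents in place to a WIM.
        f'imagex /capture {os_drive}\\ {wim} "Laptop image"',
        # 3. Apply image 1 of the WIM to the partition that held the VHD.
        f'imagex /apply {wim} 1 {target_drive}\\',
        # 4. Recreate the boot configuration on the system partition.
        f'bcdboot {target_drive}\\Windows /s {system_drive}',
    ]
```

In practice each command would be run in order from an elevated prompt (or Windows PE), checking the result before moving on.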

The more experienced of you will probably say I left a step out, and I did. That’s because it shouldn’t have been necessary to sysprep the VHD, since we were just moving the image from one partition to another. And because MS have limited sysprep to a maximum number of uses per installation, I try to use it as little as possible. But regrettably sysprep has proven necessary in this instance: Windows 7 detects the subtle changes in the hardware environment, locks down the computer, declares it not genuine, and there is absolutely nothing you can do to make it work normally. With all of the negative comment I have written about MS in the past couple of weeks, I have to say this adds further to that viewpoint, as does the rest of this post. I did get that computer working, but it was a real mess to go through it all again after sysprepping it. That should not have been necessary; it was totally unnecessary, but that is another example of the MS mentality.

The second thing I have been working on lately is Thin PC, the latest effort being to see if it can run a very old software package we have called SuccessMaker 5.5. This package has been around our school for pretty well the last six years; we started with it on Windows 98. The installer that comes with it had some trouble on Windows 7, and it wasn’t entirely due to lack of elevation; some of the things it was doing just wouldn’t work at all, for whatever reason. So I tried another tack. I built up a Windows XP machine and ran Ghost AutoInstaller (AI) on it to capture the machine state, installed SM on the machine, ran AI Snapshot again and built an AI installation. Then I went over to the Thin PC machine and ran this installation. After customising some config files I tested it and it worked properly. So it looks like we can load up some machines with this version of 7 and have them running this old legacy package.
I then decided to capture the installation with sysprep, and this is where I ran into problems. On reboot there was an error during installation, some problem with the product key, and Setup threw a fatal error and told me to reboot. Well, of course, that did not fix the problem at all. This became another unrecoverable setup error like others I have seen before; the only fix is to remove the image and completely replace it. As you would understand, this means I have to build the image again from scratch (my pre-sysprep image turned out to be corrupted). It’s becoming abundantly clear that Windows 7 Setup can’t actually recover from many errors, and even if MS fixes problems like this, they have earned themselves another bad rep with OEMs and organisations which use imaging.

UPDATE: The one good thing about reimaging the laptop with a sysprep was that all I had to do after setup finished was join it to the domain. It picked up the existing user profiles and settings that were already on the laptop without problems. Because I had to repeat nearly all the steps from the beginning to implement the sysprep, it looks like the way to speed up this process is to back up the VHD, then sysprep, then image, and so on. But sysprep would not have been necessary if some idiot at MS had not decided to make an image totally unusable instead of giving a user the chance to activate it again. And the same mentality exists when a setup fails and forces you to completely throw away your image and start again.

Thursday 14 July 2011

More problems with Windows 2008-R2-Vista-7 security elevation

Last week I wrote a rant about the changes MS has made due to its security elevation model implemented in 2008/R2/Vista/7. The post covered what is effectively turning the domain administrators group into a leper colony, by implementing built-in and irrevocable Deny permissions for the administrators group on a computer or server.

Today I’ve just discovered another problem – along with the explanation that it is “by design”. This is the issue that when you elevate a process to administrative rights, it loses access to mapped network drives that it would otherwise have access to on your computer. I already knew this happened at the command prompt level, but I wasn’t prepared to see it occur when I tried to elevate a setup process that also happened to be running on a mapped drive. Although the setup process was able to elevate, it couldn’t find the drives, including the one it was being run from (LOL). There is a means of working around this problem as described here.
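For the record, the workaround most commonly cited for this behaviour is the `EnableLinkedConnections` registry value, which tells Windows to link the elevated and non-elevated logon sessions' drive mappings. Whether that is the same workaround as the linked page, and whether it is appropriate for your environment, is an assumption; this sketch just builds the `reg` command:

```python
def linked_connections_command():
    """Build the reg.exe command for the commonly cited workaround
    that makes mapped drives visible to elevated processes."""
    key = r"HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
    return (f'reg add "{key}" /v EnableLinkedConnections '
            '/t REG_DWORD /d 1 /f')
```

A reboot is needed after setting the value before elevated processes pick up the linked drive mappings.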

Whilst I don’t regard rants as an effective means of communication, this one has served to make a point: the changes MS have implemented in Windows (along with many others in their service model) have, in my opinion, significantly diminished their credibility to claim they have produced a professional-grade product suitable for use by large enterprises in all the situations that would reasonably be encountered in large or multiple sites. MS’s response to the increasing competition they face at various levels has not been to produce a better product, but to slash their costs and service levels, and find new ways of stamping out the competition.

Thursday 7 July 2011

!@#$%^ Windows stupid ownership / permissions changes in Vista/Server 2008

[Screenshot: Windows Server 2008 folder-access permission prompt]
This is a response to the above message, which zillions of system administrators worldwide hate seeing on their server console. These messages were introduced as a new “feature” of Windows Server 2008, along with the changes that cause them. Microsoft arbitrarily gave ownership a different meaning in Vista/2008 from what it had in XP/2003: the ownership of a file or folder now takes precedence over permissions assigned to parent folders. For example, in a home-folders share where individual users have created their own home folders (or had them created by an automated process), they are automatically the owners of those folders. Even if the administrator has full control over the parent folder, this ownership blocks the normal inheritance of permissions. There may be situations where an administrator should not have access to users’ home folders, but this can already be catered for within the existing mechanisms for setting permissions on a parent folder and assigning them to different administrators, rather than imposing a one-size-fits-all solution based on a Big Brother idea of dictating to organisations how to run their own file servers.

Now, a solution to this is to change the ownership of all the files and folders in a location. Make the administrators group the owner and that will fix all these problems? Actually, it won’t. The second change which came about in Vista/2008 is that the administrators group in general no longer has the same authority over the server as it used to. Everyone has seen innumerable messages telling you that unless you tell something to run as administrator, being a member of the administrators group does not actually give you the rights you should normally have. The implication for ownership is that changing the owner to a group does not work: setting the owner to the “Administrators” group does not overcome the above message in the slightest. Windows basically will not honour those settings unless the owner is a single user account. This means that a group of administrators cannot administer files, because only one individual user account can own the files at any one time. Likewise you cannot grant other users administrative permissions to a file share, because they are blocked by the ownership issue on the files and folders in it.
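The bulk ownership change described above can be attempted from the command line with `takeown` and `icacls`. A sketch of the commands (the share path is an assumption, and as noted above, group ownership does not actually solve the problem on Vista/2008):

```python
def reset_ownership_commands(share=r"D:\Homes"):
    """Build the commands for resetting ownership of a whole tree.

    The share path is an illustrative assumption.
    """
    return [
        # takeown: /a gives ownership to the Administrators group,
        # /r recurses, /d y answers the "list directory?" prompt.
        f'takeown /f "{share}" /a /r /d y',
        # icacls alternative: set the owner explicitly, recursing (/t)
        # and continuing past errors (/c).
        f'icacls "{share}" /setowner "Administrators" /t /c',
    ]
```

Either command would be run from an elevated prompt; which is another irony, given the elevation model is part of the problem.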

These features might make sense on a desktop computer used by only one person. They don’t make sense on a server where an administrator has to be able to manage files. For example, we have scripted backups using Robocopy, and it is common to see “Access denied” messages in the logs from these scripts, purely because of this arbitrary ownership change.
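One mitigation for the backup-script case specifically is Robocopy's backup mode, which reads files via the backup privilege rather than the ACLs. This is a sketch, not our production script; the paths are assumptions, and `/B` only works if the process actually holds SeBackupPrivilege (i.e. runs elevated):

```python
def backup_command(src=r"D:\Homes", dst=r"\\backupserver\homes"):
    """Build a Robocopy command using backup mode.

    Paths are illustrative assumptions.
    """
    # /B reads in backup mode, sidestepping ACL-based Access Denied;
    # /MIR mirrors the tree; /R:1 /W:1 keep retries short in the logs.
    return f'robocopy "{src}" "{dst}" /B /MIR /R:1 /W:1'
```

This does not fix the underlying ownership nonsense, but it can at least keep the backup logs clean.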

Why has this happened? MS has come up with the solution to its massive security headaches that is cheapest and simplest for itself, and put these changes in without asking users what they wanted, because all that matters is getting the bad publicity about security breaches off the front pages of newspapers. Some way back I wrote a hard-headed post about all the ways that Vista lies to users. Those faults were to some extent fixed in 7, but not in Vista. The solution, as always: fork out more money for a new edition of Windows. A pattern that is becoming more and more common with Windows these days. Customer service has gone out the window.


As an aside: what happens when you click Yes in the dialog box shown at the top of this post? Windows automatically assigns you permissions (Read and Execute only) to the folder in question. It does this even if you have already inherited permissions to the folder through the Administrators group and the account you are logged on with is a member of that group. In other words, you can no longer use a group to manage security permissions for a resource, at least not if it is one of the built-in administrative groups. I haven’t quite figured out yet whether I can make up my own group of administrators and give them permissions, but so far everyone seems to be tainted by association with membership of the Administrators group.
 
By way of more testing I have confirmed that if I give permissions on the folder to individual user accounts then all the permissions work. If I create my own group and make my administrative accounts members of that group and apply permissions for that group, they don’t work. It is like MS has forced a Deny full control by default to the Administrators group. You can have read only access but not full permissions unless those permissions are granted to individual user accounts only.

None of these changes make any sense, nor does Microsoft appear to have any concept of accountability for them.

Wednesday 6 July 2011

Windows Thin PC [3]

Today’s Thin PC experience has been to set up a computer at work to see what I can do with it. I wrote a custom shell in Delphi 5 which has the very simple task of launching Remote Desktop Connection and passing an RDP file to it, so the user doesn’t have to do anything. By enabling the RD SSO settings in Group Policy, they don’t have to specify their username and password to log on to the RD server; the username and password they used to log onto Windows are used instead. The thin PC is joined to the domain, so all the settings are configured using Group Policy. When the user logs on to the computer they don’t get the standard Windows shell; instead, they are connected automatically to the RD server. When they quit Remote Desktop Connection by logging off (the only option they have), the custom shell detects the quit and automatically logs them off the computer as well.
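The custom shell's logic boils down to two commands run in sequence. Our shell is Delphi, not Python; this is just a sketch of the logic, and the RDP file path is an illustrative assumption:

```python
def build_shell_commands(rdp_file=r"C:\ProgramData\school.rdp"):
    """Commands the custom shell runs in order: launch Remote Desktop
    Connection (which blocks until the user's session ends), then log
    the local Windows session off as well.

    The RDP file path is an illustrative assumption.
    """
    return [
        f'mstsc.exe "{rdp_file}"',  # run and wait for exit
        'shutdown /l',              # then log off the thin client
    ]
```

The key design point is that the shell waits for `mstsc.exe` to exit before issuing the logoff, which is what makes the remote logoff flow through to the local machine.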

There is a bit of tweaking still to do but overall this is a very positive and worthwhile experience getting this working.

Tuesday 5 July 2011

Windows Thin PC [2]

Following on from today’s earlier post, I have found out that WTPC is supported by WAIK, which allows for a custom image in the same way as we would set one up with Windows 7 Enterprise. I also found out how to get Remote Desktop client to start automatically by replacing the shell. However I will probably have to write a custom shell (a fairly simple task) to provide a way to shut down or log off when Remote Desktop is closed, as otherwise the user is left with a blank screen.
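Replacing the shell comes down to changing the `Shell` value under the Winlogon registry key. A sketch of the per-machine version (the path to the custom shell executable is an assumption; there is also a per-user equivalent under HKCU):

```python
def shell_replacement_command(shell_exe=r"C:\Tools\RdpShell.exe"):
    """Build the reg.exe command that replaces the default Windows
    shell (explorer.exe) machine-wide.

    The shell executable path is an illustrative assumption.
    """
    key = r"HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
    return f'reg add "{key}" /v Shell /t REG_SZ /d "{shell_exe}" /f'
```

Worth noting that once this is set, there is no Start menu or taskbar at all, which is exactly what you want on a thin client and exactly why the custom shell must handle logoff itself.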

We can also use Group Policy to set up Single Sign On for Remote Desktop. Therefore our thin PC would be domain joined and the user would log onto it using their regular domain username and password. It would then automatically start up Remote Desktop Client and use their credentials to go straight through to the remote server, so the experience would be pretty seamless.
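We would set this through Group Policy ("Allow delegating default credentials" under Credentials Delegation), but the registry values the policy writes can be sketched for reference. Treat the exact value layout as an assumption to verify against the policy editor; the `TERMSRV/*` SPN pattern matches any Remote Desktop server:

```python
def sso_commands(server_spn="TERMSRV/*"):
    """Build the reg.exe commands equivalent to enabling the
    'Allow delegating default credentials' policy for RD SSO.

    The value layout mirrors what the Group Policy writes; verify
    against gpedit before relying on it.
    """
    base = r"HKLM\SOFTWARE\Policies\Microsoft\Windows\CredentialsDelegation"
    return [
        # Turn the policy on.
        f'reg add "{base}" /v AllowDefaultCredentials /t REG_DWORD /d 1 /f',
        # List the servers credentials may be delegated to.
        f'reg add "{base}\\AllowDefaultCredentials" /v 1 /t REG_SZ '
        f'/d "{server_spn}" /f',
    ]
```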

The next question is whether logging off the session can cause them to be logged off Windows, as well, or whether the custom shell would have to detect their logoff and then automatically log off Windows.

Windows Thin PC

Recently I was logging into the MS VLSC and I noticed a MAK for Windows Thin PC. So naturally I wondered what this was and took a closer look. Windows Thin PC was put on general release on 1 July, and it is essentially Windows Embedded Standard 2010 (i.e. the Windows 7 SP1 embedded edition) made available to SA customers (which includes all NZ schools signed to the MOE Microsoft agreements). The installation onto my old home PC, which is six years old and pretty limited these days (for example, it can’t run 64-bit), was as easy as it gets. You will need the minimum specs that have been around since Vista, the most important being 512 MB of RAM.

Although MS recommends a WDDM driver (i.e. Vista-compatible), as is the case with Windows 7 generally, the drivers that come with it do a pretty good job with a lot of hardware. The motherboard in this case is an old Intel D915GAG, one of those infamous “Vista Ready” motherboards that didn’t have a native Vista driver and therefore can’t do Aero. For this type of application that doesn’t matter. However, the generic MS driver doesn’t support widescreen resolutions, so ensure your displays have a 4:3 aspect ratio, or just put up with a slightly fuzzy picture. Windows installed drivers for the other devices and everything works as expected.

Windows Thin PC on closer inspection turns out to be similar to other thin client editions of Windows I have used, with limited features and functionality. However the considerable advantage is that Microsoft supports it, so you aren’t limited to the functionality or support level of an OEM. I got my fingers burned when I bought an HP thin client box that had support for RD Gateway, which I thought would let me log straight in from home, only to discover that some of the components needed weren’t installed and HP didn’t make them available. That was a waste of time and money.

The main application I could see us using Thin PC for is a Remote Desktop client. I assume that all our Thin PC machines will be domain joined and have an automatic logon to a shared account. The desktop will be completely locked down and the only allowed application will be the Remote Desktop client, which will automatically load and come up. The user then logs in remotely, using their own username and password, to a Remote Desktop server, which gives them the desktop they normally use. We have done testing with Remote Desktop configurations with student accounts and with hardware thin clients, so it is a pretty standard lite computer configuration used in lots of educational settings.

Well, as it happened, the Remote Desktop experience was pretty much what I expected. It fully supports the functionality of RD Gateway and the like, and I now have this Thin PC at home that I can use to log in to the school’s network, although this is mainly for demo purposes as my regular home PC can of course do this too. Thin PC is the replacement for Windows Fundamentals for Legacy PCs, which was based on XP Embedded rather than Windows 7 Embedded; WFLP is quite old now, having come out five years ago. It will be interesting to see if imaging can be used with WTPC, although I doubt it is really necessary, because the default installation does the job and nothing needs to be installed on it.

Monday 4 July 2011

How to fix Windows 7 logon error: “The Group Policy Client failed the logon. Access is denied.”

I have seen this particular error several times with Windows 7 and Vista and we are not helped by a lack of documentation from Microsoft for this problem.

In my case the most recent instance occurred when I had to drop and recreate a user account. The first time I got the error I tried deleting the local profile, deleting the server profile, giving the user administrative permissions on the laptop, you name it. In some instances, instead of the above message, the user would appear to log in as normal until “Preparing your desktop”, then be logged out with no further explanation.

After a great deal of frustration I came across this helpful page and adapted the instructions to my situation and the problem is solved. Here is how I applied the steps:

  1. Start up Regedit on the affected computer
  2. Go to HKEY_USERS
  3. On the File menu click Load Hive
  4. Go to the folder of the affected profile and open NTUSER.DAT
  5. Name the new key e.g. Profile
  6. Right click this key and select Permissions
  7. Select Advanced
  8. Add the account of the user whose registry this is and give them Full Control and replace permissions for Child Objects with inherited permissions from this object.
  9. On the File menu click Unload Hive
  10. Close Regedit.
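The hive load and unload parts of the steps above can also be done with `reg.exe` rather than the Regedit GUI. A sketch (the profile path is an illustrative assumption, and the ACL fix in steps 6–8 still needs the Permissions dialog or a dedicated tool):

```python
def hive_commands(ntuser=r"C:\Users\jsmith\NTUSER.DAT",
                  mount=r"HKU\Profile"):
    """Build the reg.exe commands for loading and unloading the
    affected profile's registry hive (steps 2-5 and 9 above).

    The NTUSER.DAT path is an illustrative assumption.
    """
    return [
        f'reg load {mount} "{ntuser}"',
        # Steps 6-8 (granting the user Full Control and replacing
        # child-object permissions) are done in Regedit's Permissions
        # dialog, or with a tool such as SetACL.
        f'reg unload {mount}',
    ]
```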

In this instance at Step 7 I found the SID of the previous user account had full control.

I still don’t have the foggiest idea why the new user account didn’t get the permissions assigned. But it’s been a long day and time to go home.

FOOTNOTE: This all went well until I tried logging in the user on the Remote Desktop server – which picked up their new roaming profile, instead of the local one on their laptop (naturally), and threw the same error. To cut a long story short I had to repeat the steps on the server copy of the profile. Since this profile was created new, I don’t have any clue as to how the incorrect permissions got set in its NTUSER.DAT file.

Friday 1 July 2011

Native VHD data integrity issues / V2P [3]

The first thing to say is that we are now moving all of our deployments which have been done as native VHD to physical, i.e. V2P. This includes desktops, although since they are networked, with users’ personal files redirected to network shares, they are not as critical as laptops, which have all the users’ stuff in the same VHD. Simply put, we are finding with desktops a higher incidence of boot failures with VHD, indicating we are perhaps pushing the technology beyond what was expected of it.

However this doesn’t take away from the greatness of native VHD as an image development/build scenario: you can still base the development process around native VHD and then deploy to physical. Doing so is currently a multi-step process using ImageX: mount the VHD to a drive letter, capture it with ImageX, then apply the WIM to the target using ImageX. What I am hoping is that in future Microsoft will change ImageX to work directly with VHDs, so we don’t have to keep WIM and VHD versions of the same image.

I wrote further back that I had figured out we only need to keep pre-sysprep images and sysprep them on each machine at deployment. Our remaining post-sysprep images will be wiped soon, so I will have enough disk space to store the WIMs I have to make of the deployment VHDs.

Compared with our native VHD deployments to something like a computer suite, it actually takes no more time to deploy with ImageX from a network share than it does with native VHD. You save the time it takes to copy the WIM locally to the target by having ImageX pick it up from the network share and apply it in one operation; this is the equivalent of the copy-VHD-to-target stage. The rest of the steps take exactly the same time as they would for VHD: you run BCDBoot the same as you would for virtual, except perhaps giving it a different drive letter. In due course I will have scripts set up to run all the various steps, maybe including the ImageX step.

The good thing for us is that the same technologies used to prepare VHDs for deployment can also be used with ImageX WIM images, so there is an easy transition between the two. Microsoft have given us this great technology for image testing and development (it really is only suitable for test environments), and they have integrated the ability to mount VHDs into the Windows 7 and Server 2008 R2 GUI as well as the command line (Diskpart). So I am quite hopeful they will come to the party with ImageX enhanced to work directly with VHD, so that these images can be deployed to physical, as that is what ImageX does.