Wednesday 14 December 2016

Android updates not as good as Windows Phone

If you buy an Android phone, the key issue is getting software updates from the manufacturer. Windows is the nearest equivalent (one OS running on many different hardware devices), and MS has a very good system that provides updates to the core operating system for many years after it is first released. This is possible because MS keeps fairly tight control over the Windows OS, and because the nature of the OS makes it easy to customise for different hardware.

Android is a very different beast, and that creates problems for end users. Typically the stock Android is customised by a hardware manufacturer for their product, and then further customised by the carrier who sells it to the customer. There is a big shoving match over which of the three - Google, the manufacturer or the carrier - is responsible for updates to the software. The result is that the vast bulk of Android devices out there are not able to receive updates for security issues (and Android has had its fair share of exploits). Google can't produce an update for the customised Android that the manufacturer has built for their platform, and that is partly down to the way Android (and Linux) is adapted to different hardware platforms. By and large, manufacturers are passing the buck on supporting non-current models of their phones, and it is rare to see a version upgrade (compared to the Apple scenario where new iOS versions come out regularly), let alone a guarantee of security fixes for any period of time.

It is for this reason (and a few others) that I have reluctantly concluded that, to be sure of getting a phone that will receive a good level of support from the manufacturer for security and version updates, I will have to spend a little more and buy a Google Nexus. Google makes these phones (and the newer Pixel series) to compete head on with Apple, but the lower end models are still affordable compared with the iPhone (I have many friends who have iPhones and I can't quite bring myself to ask them how they were able to justify the huge expense, since at the time of writing the cheapest new model is $749). Noel Leeming has an unlocked Nexus 5X for $399. That is really as high as I could justify for a phone, and a long way up from the $149 you could pay for a locked Samsung Galaxy J2 at current pricing (I paid more than that for mine) - $149 is also what I paid for my Lumia 635 and its predecessor.

The Nexus has 32 GB of onboard storage but no microSD card slot, and it has a USB-C socket, which means different charge cables. There is also a fingerprint sensor for unlocking, and it runs stock Android, so you aren't missing out on features the OEM has chosen not to implement (as I found with Samsung's implementation of Do Not Disturb on my Galaxy). It comes with Android 6 but Google has guaranteed availability of Android 7. So we shall see.

Saturday 10 December 2016

More Samsung self serving crap and ripoffs

My first Android phone was a Motorola Moto-E dual sim (XT1022) that I picked up from The Warehouse relatively cheaply as it was end of line at the time. A great little phone, especially as it was my first taste of Android, with the main issue being the lack of storage (only 4 GB). Because Motorola ships pretty much stock Android with relatively few added apps, you get all of the standard Android features and it doesn't gobble up the storage on the phone.

I thought the solution would be a Galaxy J2, which, while offering 8 GB of storage, has filled a lot of the extra space with useless Samsung apps that can't be uninstalled. Some of these are core apps like Phone and Messaging, which have terrible usability that you can't fix in any way. Samsung is another huge greed-driven corporation whose agenda is to drive everyone towards the high priced Galaxy S premium phones, which no doubt give you the nice customisations and all the extra features, but frankly the prices people have to pay are ludicrous, up there with iPhones for sure. Who wants to pay that much for a phone when you can have a tablet with a much larger screen?

The biggest hassle with the J2 is that Samsung decided they wouldn't put the standard Priority Notifications feature into their low-end phones. Instead they replaced it with a one-size-fits-all feature called Do Not Disturb. Both systems are intended to do the same thing: put the phone onto silent when you don't want to be disturbed. But Priority Notifications has a timer, so you can turn it on for, say, 2 hours, and - this is crucial - it will put your phone back to normal after that time elapses, which is exactly what you want if you are in a meeting. Samsung's DND feature only allows one predetermined timed schedule; there is no way to turn it on casually "on the fly" without mucking about with a whole lot of settings. So there have been many occasions when I have missed calls because I put the phone on silent and then forgot to put it back to normal after a meeting or something.

Apart from that, I also find the phone has problems detecting that it should stop trying to connect to Wi-Fi when I am no longer at home, and it has to be restarted quite often because it has dropped internet connectivity. An expensive lesson I have learned, with successive Microsoft and Samsung phones as well as Canon cameras, is that big corporate agendas see the main purpose of the lower end of the product range as free advertising to push people up to the higher and more profitable end of the market. On the other hand, those old phones do make great media players. My old Lumia, which only had a few months' use as a phone and is still in top condition, is now a Bluetooth-capable media player, which makes it very useful around the house with a portable Bluetooth speaker I got recently and the ancient sound system in the lounge with a Logitech Bluetooth receiver.

The other thing is Samsung dragging the chain with updates: Marshmallow should be available for this phone but there is no sign of it, and in a lot of cases there is extreme irresponsibility, with security updates simply not being pushed out by vendors. MS has also strung along or abandoned older Lumia owners, as there is no sign of Windows 10 Mobile coming anytime soon.

At the moment I am considering whether to use the Moto E again or go for a new Moto G Play, which is $299 (unlocked). The catch is that you can't get a cheaper locked phone, as Vodafone don't carry Motorola, but at least you get stock Android instead of Samsung's proprietary and stripped-down implementation of the UI, 16 GB of storage which is massive (I am already getting "out of memory" errors on the J2 when app upgrades are needed), and a phone that won't be clogged up with useless Samsung apps.

I think the lesson for all of us is that these days we have to be that much more vigilant about being locked in to proprietary hardware and the software limitations that come with it. Thankfully I developed that instinct long ago with PCs, which saved me when it got to the point that Windows was no longer a palatable offering from MS, and apart from the HP tablet (which could probably run Linux) I haven't got any computer hardware that restricts software/UI choices. Of course the Samsung Android tablet I have is a locked-in platform, but that is a wholly different case as I mostly use it with third party apps and it is not a phone. So it is an acceptable compromise provided Samsung keeps up to date with security updates for Android 6.

Monday 28 November 2016

Undocumented and stupid Samsung TV sound settings for HDMI input

I am currently the owner of a Samsung UA22ES5000 TV, which is a small (22") set, probably the smallest Samsung TV made. Since I purchased this TV it has almost exclusively been used as a PC monitor, with no audio connected to it.

The TV has the following inputs: HDMI, VGA, component and composite video, analogue audio (RCA and minijack) and the following outputs: analogue and optical audio. The TV features the ability to rename these inputs, and somewhere along the way I chose to rename the HDMI input as "DVI PC" as that reflected what the input was being used for.

All was fine until I very recently signed on to a satellite TV service and connected the set top PVR to the TV using the HDMI input. My intention was that the sound from the PVR would come over the HDMI cable to the TV, and the speaker output from the TV would then be connected to external speakers for better sound quality. This was done so that I didn't need switch boxes or other complications to connect the PVR to my existing sound system, and it would also let me have both the media PC and the PVR playing back video and audio at the same time. (Last night I did in fact watch two programs in this way, both of them Hillsong Church broadcasts, and I was able to keep track of both at once :)

On doing the installation I discovered that the TV would not produce sound from the HDMI input. Although it is possible to run a separate sound cable to the minijack input (as you would if a computer were connected with a DVI cable), I preferred to use the HDMI cable, partly because I suspected the digital audio output would not work with the analogue audio input and partly because I wanted to be sure the TV was not actually faulty.

After going through every menu and setting and the instruction manual, it appeared the only option left was to try a hard reset of the TV. Once this was done, the initial setup procedure had been completed and the TV had retuned all the channels, the sound was working from the HDMI input.

But that still left the mystery of how the sound came to be turned off in the first place, and as there was no clear setting or instructions I Googled it and eventually found a thread on, of all places, the Ubuntu forums. Here it is.

The gist of this is that if you use the input rename feature of the TV and change the name of the HDMI input then it changes where it gets its audio signal from. In particular if you rename this input to "DVI PC" which is one of the preset names, then the sound comes from the DVI/PC minijack input and not from the HDMI input. If you leave the input at its default (no) name, then the sound comes from the HDMI input and not the DVI/PC minijack.

As I noted above, there is no documentation of this behaviour by Samsung in the instruction manual for the TV, and I had not had cause to use any of the audio inputs of the TV up to this point since it was connected to computers which had separate speakers. That has now changed: not only is this the correct display to use with the PVR (which has TV-type resolution settings like 1080p and 720p in its options, not computer monitor resolutions like 1680x1050), we are also getting Shine TV on Freeview HD in a few days' time and I will watch that channel on the TV as well. So it will be used as a TV for two channels, Shine TV and Hillsong Channel. And if I could really be bothered, there are about 10 other Christian channels between the two signal sources.

Wednesday 9 November 2016

Linux So Far [3]

Well the household still has 3 PCs - but they are all in the same room now. The idea is that I can drive the display in the lounge off a bedroom PC using VGA-over-Cat5 adapters and a wireless keyboard and mouse, which means I can cap the number of computers in the house at three while making maximum use of them all the time. That has benefits for me in the bedroom, where I do most of my computing, as it means I can go back to having the two best computers both running Xubuntu (one for general work and one just for video playback) and just a small lower powered computer doing the occasional Windows tasks (scanner and cameras, DVD ripping, some image editing and stuff like that).

Bluetooth is one of the features that is nicely implemented in Xubuntu. A couple of weeks ago I bought a dodgy USB Bluetooth adapter called a Bluemate 4. The problem was that the (Windows) software runs in evaluation mode and doesn't actually come with a license; apparently the answer to this is to buy the Bluemate 5, which does come with a license and sells for the same price. As it turns out, when I took the adapter out of the Windows computer and put it into a Xubuntu PC, the built-in Bluetooth support in Xubuntu worked straight out of the box and lets me do file transfers to my tablet just fine, which is all I really wanted it for. Kudos to the retailer (Ascent) for giving me a credit on the price of it as well. It sort of makes up for that Dick Smith gift card being dishonoured at the beginning of this year when they went into receivership.

So putting Xubuntu back on the Ivy Bridge (B75) computer was, as usual, really fast once it could be booted - sometimes the boot gets hung up, and you have to try UEFI boot or the regular boot, whatever works. That computer will be hooked up to the little TV I have in that room, and the VGA output of its two-head $50 video card will also be connected to the display in the lounge, which at the moment is an old one that only has VGA inputs. But sound could still be an issue and I have to figure out whether to get a converter that also puts sound down the same cable.

The tricky question is where to put the satellite set top box when it gets installed in a week or two. In the bedroom where I do the most watching, or in the lounge where it is nicer to sit on the couch? If it has only HDMI output, can it be adapted to VGA easily?

UPDATE: Well the latest solution has been a lot more radical and that was to move the computer desk out of the bedroom and into the living room. I found the idea of linking between rooms with cable to be relatively impractical and also the satellite box will be in the living room so it makes sense to bring everything together in one place. Still waiting on the satellite TV company to arrange the install and will be chasing them up this week to see where it has got to.


Monday 7 November 2016

Is Apple abandoning the professional creative marketplace?

The idea that Apple might abandon the professional market segment altogether and focus on the consumer marketplace has gained ground as the direction of their recent product updates has become clearer. In fact this trend has been obvious for some time, with a growing range of pro/business equipment being dropped over the last decade.

I have more than a passing interest in this segment of the IT marketplace because my musical interests have intersected with education and church settings over many years. Due to the efforts Apple has previously made to support professional music production, Apple hardware is very prevalent in many educational settings, particularly classroom music suites, and also in churches, where contemporary worship environments these days make major use of their products.

The trend is obvious, however: Apple has been losing interest in the professional end of the marketplace for some considerable time. In one sense this is welcome. It has been blindingly obvious for a while that the snobbery and elitism that seems to be inherent in the choice of Apple hardware by default (and the willingness to unquestioningly fork out large sums of money for the nice bells and whistles) has, unfortunately, carried over to Christian circles. The self serving aspects of Apple's business model are something we should be more willing to challenge, and it is most regrettable that I have had people in church circles attempt to paint Apple gear as inherently superior to anything else out there.

The main interest Apple has these days is making a lot of money, and the trends in hardware have seen Apple gear not only become more expensive over time, but sacrifice serviceability as well. Witness that it is now virtually impossible to make any user upgrades to the MacBook laptops, that the Mac Pro is frozen in time at its 2013 hardware spec and lacks internal expansion capabilities, and that the newer versions of the iMac are basically impossible for anyone other than a repair centre to open up for service, as they are similar to a tablet in design and manufacture. All of this serviceability sacrificed in the name of superior visual elegance, in the form of thinner, lighter product, means they are very expensive to repair, and once out of warranty they may in fact be considered almost throwaway due to the high cost of servicing.

Leaving that aside, it is now time to focus on viable alternatives, which are of course Windows and Linux. By and large it appears the mainstream commercial focus has shifted to Windows, but MS's attempts since Windows 10 to maximise revenue by virtually taking control of the computer away from the end user (witness the default update settings, which suit MS's convenience rather than the user's, which cannot be turned off in the Home editions, and which in the Pro and higher editions are deliberately hidden from the Start Menu user interface) have alienated many expert users such as myself, which is why almost all my computers now run Xubuntu. However there is a growing base of commercial creative software being produced for Windows: in schools, the Windows alternative to the likes of GarageBand and even Logic has been achieved with packages like Mixcraft and a lot more besides; Adobe shifted its focus to Windows some years ago and has a very good range of software now available for the platform; and a lot of worship presentation packages are well supported on both platforms.

Linux is a bit of a dark horse when it comes to the pro-creative market segment but is steadily gaining ground. My views must be tempered with the reality that I have not actually been involved in production activities at any level except worship multimedia presentation. What I can say is that the operating system can be customised to an extent virtually impossible with Windows or macOS, giving the ability to ensure high stability and reliability, and that is what attracts growing support from segments of the music community; in fact there are two distros, AVLinux and KXStudio, that are aimed squarely at the needs of musical creatives. Linux is also now extensively used in motion picture production studios, being the OS offered by the big commercial hardware vendors like IBM and HP who supply the hardware used for movie production. This started with graphics rendering, and Linux is steadily making inroads into other areas like video editing. One big advantage for Linux is its overall compatibility with Unix, which was the previous market leader for the high end hardware needed at this level of production, meaning a lot of the software used has been readily ported over.

The biggest advantage Linux has is its ability to run on virtually the same hardware as Windows PCs, something Apple has passed up with its reversion to hardware lockdown since the clone Macs were discontinued in the mid 1990s. Since there is not that much difference between macOS and Windows, it will be relatively easy for Apple to exit the desktop marketplace and switch its entire focus to laptops and handheld devices. It only remains, therefore, for its professional segment fans to find something else that will meet their requirements for the future. I think that future is very likely to be Linux on x86.

Saturday 22 October 2016

Linux So Far [2]

Obviously a bugbear of any platform is that the software you want to run isn't always available for it, and these days it is not unusual to have several computers running multiple platforms if you are a power user. I have had to maintain at least one Windows computer since I switched to Linux, but there is no way I would ever go back to solely Windows, because there is so much good software available for Linux now that I have discovered it. Because there are technical limitations with virtualisation (mainly around hardware access), some Windows tasks need a physical computer rather than a VM, while others are fine in a VM.

Having decided I would prefer to downsize the household to three computers instead of the five it had until very recently, I have determined that the Intel DB75EN is the best hardware to run the Windows computer on, rather than the AMD E350, and this also allows one computer in the household to be eliminated. Recently I have also removed the second computer from the living room, as in the longer term the expectation is that a single small form factor HTPC will be the sole requirement there.

The DB75EN will be running Windows 8.1 indefinitely and will have the scanner and camera software installed, as well as an Android emulator to enable it to be used with Instagram; at this stage the Windows versions of Kodi and VLC will be installed along with a media library. It can also run Cobian Backup for all the backups that I do. I will be looking at which computer can run video editing software. After testing the BlueStacks Android emulator, I found it was difficult to use on my PC (which is a pretty powerful beast with 12 GB of RAM), so I uninstalled it and put Leapdroid on instead, which has been much more useful with Instagram. However, not all Instagram functionality is available, so as far as I am concerned the use of an Android emulator to post Instagram content is severely limited, and I will be going back to my original plan to buy an Android tablet instead.

The number of screens will be changed to two for each computer (the main PC had three up until this point, but there is no real need for the third one), and two of those screens will have a resolution of 1920x1080, the other two will be 1680x1050. The Xubuntu PC has the two screens side by side in landscape mode as usual; the Windows PC has the screens stacked vertically, with the lower screen at 1680x1050 being in a portrait orientation, and the upper screen at 1920x1080 being in the usual landscape orientation. The orientation of the lower screen works better in some situations such as editing documents or viewing Instagram content in a web browser because of the way some of these apps are optimised for vertical form factors such as a typical smart phone. With the second screen on this computer horizontal I have a choice of both form factors depending on need.

UPDATE: Well there have been problems with Kodi and VLC on Windows when playing the Matroska format video files that I produce with AviDemux when splitting DVD video files into tracks (it can't save them as MP4). Basically, at a certain point of playback (this varies for each video but is always the same location for a particular clip) the video would stop momentarily and then resume itself a few seconds later, and the sound also skipped. After trying lots of tweaks I decided I could just install Kodi onto my main PC and reconnect the TV to it as a third screen (since it has a quad head video card). Kodi on Linux doesn't give these problems. So now I have one computer with three screens and one computer with one screen, and that is how it is going to stay for now.

Wednesday 28 September 2016

MS Smoke and Mirrors over Live Email storage

Microsoft deliberately won't tell you what the size limit of your Live mailbox is; they sort of imply it's unlimited. This is a smoke and mirrors game in which they can move the goalposts any time they like, and it is similar to how they very quietly announced they were chopping OneDrive storage from 15 GB to 5 GB. They have in fact twice chopped OneDrive from a large amount (it was 25 GB at one time) to a smaller amount, which obviously means they have also twice gone from a small amount back up to a large one.

The reason we were given for OneDrive being chopped was that they had had a period of unlimited storage and a small number of people abused it, so now they had to chop everyone to 5 GB. There was no explanation why they couldn't go back to 15 GB. Well, I just opened a new Gmail account and moved everything to Drive.

Going back to Live Mail, as soon as you hit some sort of limit, you'll keep getting these emails saying your Outlook account is growing too fast. Even if it's growing really slowly. I don't actually know what the limit is except that I keep getting these messages, and there is no actual way to know how much space my emails have used. I am actually getting to the point of being annoyed by the messages, and have now started moving mail off the Outlook.com server to another Gmail account. Maybe I will start redirecting new mails to that account as well.

It looks like this capacity limit for Live could be as low as 2 GB - compared to the 15 GB mailbox that Google gives you it is not much, and there isn't a lot going for MS's cloud services now that OneDrive only stores 5 GB.

Kodi Media Player on Xubuntu

Last time I was looking at putting Kodibuntu onto a low spec computer (in this case an AMD E350, as I have a couple of those and they are not much use for anything else). I duly installed the Kodibuntu image but there was no sound on the computer when it was set up. These types of issues are unfortunately quite common with Lubuntu, upon which Kodibuntu is based. So I went back to the drawing board and installed Xubuntu, and then Kodi on top of it. It works very well for media playback on that computer and doesn't run into resource limits doing it.

So right now I have five computers. In the bedroom I have the main PC and the media PC, both on Xubuntu, then there is a third computer (one of the AMD E350s) with Windows 8.1 that is there for the scanner and cameras mainly. In the lounge is the computer I use for general stuff, and then the other AMD E350 - both computers run Xubuntu as well. The two media player computers are both running Kodi which works very well. 

One of the niceties of a proper media centre is that it isn't limited to playing back files. With the libdvdcss2 library you can play back DVDs, and with the plugins supplied, YouTube videos can be played. So I have been playing a lot of Youtube stuff on the lounge media computer and it is good that Kodi is really easy to use. Both media players have Logitech cordless keyboards with built in trackpads and all my Logitech keyboards have keys to control media playback and volume. 
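
For anyone wanting to reproduce this, the install went roughly along these lines - a sketch only, assuming the package names on a current Ubuntu base are still kodi and libdvd-pkg (the latter downloads and builds libdvdcss2 for you); a newer Kodi build can also come from the team-xbmc PPA:

sudo apt install kodi
sudo apt install libdvd-pkg
sudo dpkg-reconfigure libdvd-pkg     # fetches and builds libdvdcss2 for DVD playback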

The next task is to bring the media library together on the media PC in the bedroom, so that all the CD and DVD clips are organised on it and I can then copy the library to the lounge media computer. At the moment I don't have all of the DVDs I ripped recently, and I suspect some of them are stored on the second disk in the lounge media computer, which was previously the Win81 PC and DVD ripper.

The long term plan is to have one less computer in the living room, and no desk like now with a PC hanging off the bottom of it. Instead the small-chassis (Antec) AMD E350 would be parked behind a big screen TV and used with that system to play stuff back. There is still a lot of Kodi I have not explored, like the other types of media it can handle (mostly pictures) and the heaps of addons for different video websites. I believe modules can also be installed for PVR type stuff, but I have no intention of using those.

Thursday 15 September 2016

Linux So Far

As one would be aware, about six months ago I took my first tentative steps towards the Linux platform. Since then I have been using Linux extensively on as many of my computers as I can. I did so in the full knowledge that there is something of a religious zealotry associated with this platform, almost like the Apple zealotry but for different reasons. Whilst wishing to stay above that kind of fanaticism, I still have to admit that I have developed a fierce loyalty to the Linux platform in those six months. Every day there is some new opportunity or experience or learning to be had with this platform. Running VNC is the latest great accomplishment: I have spent the last few days exhaustively VNCing to my 24 GB computer from the low spec computer in another part of the house, and it has been a brilliant working experience, with only occasional hiccups, in terms of being able to do a lot of stuff from a computer that is innately incapable of handling it. As for the browser issue, switching to Firefox Developer as my primary browser at home hasn't been the problem I thought it would be; the issues I saw back in July when I last tried it haven't actually been there at all this time.

It took me several goes to find a distro that worked well for all the different computers. They could have all run different distros, but now they are all running Xubuntu, which I think gives the best use of resources on both low spec and high spec computers, and also has a very pleasing visual impact for the most part. The fact that many applications I already use have been available for Linux for a long time has meant I only need to keep Windows on one actual computer and one virtual machine on a regular basis. There is a question mark over the AMD E350 computer running Xubuntu, which has not been as good a performer as I expected and tends to max out its CPU relatively quickly. One thought is that the AMD video driver supplied in Ubuntu may be an issue, but I have yet to investigate further, as being able to VNC to another computer in the house has been a useful workaround. Both Firefox and Chrome max out the CPU playing fullscreen YouTube clips, with Remmina being the only other substantial system load, and there is virtually nothing between the browsers as far as performance goes. However it is notable that the CPU is at 100% while the amount of RAM being used is less than 500 MB, so I don't really know where the bottleneck is, and it will be interesting to compare the other AMD E350 that is running Windows 8.1 to see what its performance is like.

So far - very good.

The comparison with the AMD E350 running Win 8.1 turned out to show similar performance issues. Well, the boards were cheap (the CPU is onboard) and were originally bought to use as essentially thin client terminals, so they were never expected to have a lot of power. Still, it is disappointing that an old Intel DQ35JO board with 8 GB doesn't max out its CPU anywhere near as quickly as these do, and that it was essentially a waste of money upping both the ones I have to 8 GB. I have now replaced the AMD E350 in the lounge with that selfsame DQ35JO on a number of grounds, including that it can take one of my $50 Nvidia NV210 cards and therefore run both screens digitally - the card has VGA, HDMI and DVI connectors and can use any two - for the sharpest picture. The plan at the moment is to pick up the DG41RQ from work, swap it for something else low spec, bring it home and use it as the shed PC in place of that DQ35JO. It would be interesting to see how that AMD E350 goes as a multimedia computer running Kodibuntu, and I will have a play with that as soon as I work out which chassis to put the board into.

Wednesday 14 September 2016

Browsers that work on Xubuntu

When I started using Xubuntu it came with Firefox, but I preferred Chrome and Opera. Since then, Opera's sync servers have been hacked, which led me to make less use of it and stop syncing data altogether. Chrome has been very good on Windows for me in the past, but seems to have a lot of issues in the Linux edition. Primarily these revolve around the Flash plugin, which often stops or freezes; some tabs quite often put up messages to this effect. At other times, without any such message, tabs will simply crash for no obvious reason. It does not take very long before this starts to happen, even on the computer with 24 GB of RAM; it can happen with as few as 12 tabs open.

So first of all I tried Chromium, which behaved in exactly the same way, and then I started to test Firefox Aurora. Not so long ago I tried out Aurora because Opera was giving me some headaches. I am not totally sure Aurora is the best option, but I know it can perform well with a massive number of tabs open, and at the moment it is showing no signs of breaking down like Chrome has been. I don't remember what the other issues were that stopped me from wanting to use it for personal stuff more recently. However I am having another go with it, because Chrome is not satisfactory at the present time.

I don't know what is going on with Chrome but I do know I have seen its issues across several of my computers so it is not specific to any one computer, and as mentioned above, having a lot of RAM for it to work in (and it does use a lot, in any case) does not solve the problem. For now, I am therefore working with Firefox Aurora (FFDE) to see if it can do better.

Tuesday 13 September 2016

More Remote Access with VNC Servers and Clients

Last time I wrote about remote access it was to talk about RDP clients for Linux. Now it is time to have a look at using VNC, which is better supported as a server for Linux than RDP. Ubuntu comes with what is called "Desktop Sharing" which can also be installed as the vino package. This is a straightforward and easy to set up VNC server for your computer. I am only using it on my home network, not over the Internet - if you need to do the latter you should set it up to work over SSH. I have never set up any remote access to my home network over the Internet and don't plan to do so in the future.
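
For reference, setting it up from the terminal goes something like this - a minimal sketch, assuming the vino package, its GSettings schema and the server path haven't changed (the Desktop Sharing GUI does the same job):

sudo apt install vino
gsettings set org.gnome.Vino require-encryption false   # some clients can't negotiate the encrypted protocol
/usr/lib/vino/vino-server &                             # start the server against the current desktop session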


You will also need a VNC client. TightVNC, which I used on Windows, is capable, but the native Linux client is no longer produced, as the developers have focused exclusively on Windows for present and future development. It can be run as a Java viewer, but I want a native client.


From this I can see a few recommended options like Vinagre. This was easy to install, but terrible to use. It was incredibly slow to update the screen. So I ditched it pretty quickly.

Here is another article that is worth a look:

From that article I tried installing TigerVNC (which is in fact a TightVNC fork), but it won't run on my system. So for now I am playing with Remmina. This at least gives me access to all three screens (on MainPC) in one wide window I can scroll across. There is still a lag I am not quite used to when typing and moving the cursor around; having used Remmina quite extensively for RDP connections, you don't see these sorts of delays over RDP, so maybe there is just something to adjust in the configuration, but it could also be Vino that is responsible for the slowish performance. I noticed in the article they used X11VNC rather than Vino. TurboVNC looks good as a client but turns out to require Java, which I don't see the point of, as this computer is going to have resource challenges running it.

Back to the Ubuntu.com website, and there is a separate section on VNC servers worth looking at:


This one talks about both Vino and X11VNC. I chose at this juncture to remove the former and install the latter. The instructions for X11VNC are very useful, simply because this software's man page reveals a huge number of configuration options. As soon as I got X11VNC running I could see that it is immediately way better than Vino. This well and truly justifies the high praise that the combination got in the TechRadar report mentioned above. In fact I am finishing off this blog post from a remote machine running Remmina, accessing MainPC running X11VNC, and the result is as good as RDP: flawlessly smooth response for both keyboard and mouse, the colours look great, and so on. Although Remmina doesn't appear to provide a way of displaying the client window over multiple local monitors, in full screen mode it gives quite a painless way of scrolling between the server's monitors: it displays them from left to right, and moving the mouse to the right or left edge causes the client to scroll across to the next monitor. Switching between virtual desktops on the server was quite straightforward as well. So if I choose to work in the lounge, where the only PC now is a fairly lightweight AMD E350 which even with 8 GB of RAM is quite stretched running more than a web browser and definitely struggles with more than a few browser tabs open, I do have the option of VNCing to one of the other computers. It is further interesting that starting VirtualBox on MainPC provided another means of interacting with the VM, as a pair of windows on MainPC, compared to remoting to it with Remmina. As Remmina does not have the ability via RDP to handle the VM's multiple displays, using it under VNC is a more desirable option, although of course I also have the ability to use XFreeRDP on this computer to get to the VM, as previously mentioned.
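
For the record, getting X11VNC going is only a few commands - a sketch, with the flags as I understand them from the man page:

sudo apt install x11vnc
x11vnc -storepasswd                    # save an access password to ~/.vnc/passwd
x11vnc -usepw -forever -display :0     # serve the current X display; -forever keeps it running between clients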

So all round getting VNC working on my computers is a great move forward due to the location of where computers are currently being used and their various capabilities.

Friday 9 September 2016

Remote Access with Remmina and XFreeRDP

In the title of this post I have listed two remote access clients: Remmina, which is GUI based, and XFreeRDP, which is command line based. Right now these are the tools I am playing with to try out different remote access setups, which in my scenario means remoting from a Xubuntu desktop to a Windows 7 virtual machine running on VirtualBox. The big issue is that this VM is running with dual displays, which Remmina in particular doesn't support. I certainly want both of those displays appearing on the screen of the lounge computer when I work there.

Remmina is otherwise a good package that I use all the time for work purposes; it is similar to RD Tabs, with a tabbed interface to access multiple remote desktop sessions at the same time. XFreeRDP is something I have tested and shown to work with the dual screens of the VM, but I was not able to find a way to exit from its full screen mode, so the aim of this post is to document its functionality more fully as a reference for later use.


The command line syntax for XFreeRDP goes along these lines

xfreerdp [file] [options] [/v:server[:port]]

There are a lot of possible switches, plus the ability to store settings in a .rdp file, which I presume has the same format as the Windows one. The simplest option by far is to simply pass a server name and optional port number. These settings, and perhaps the -f (fullscreen) option, could be saved in a shortcut without having to create the .rdp file. Pressing Ctrl-Alt-Enter turns out to be the keyboard shortcut that toggles full screen mode, but only if you put -f in to start with; to get out you can also select Disconnect from the Start menu. The next question is how you make it use the multiple monitors in this case, and the answer is the -multimon switch. So a useful command line will look something like the following:

xfreerdp -v 192.168.20.103 -u admin -p password -f -multimon

As it happens, Remmina is good enough to use for non-multimon purposes, so I am just going to use xfreerdp for this exact situation: connecting to that virtual machine, which as it happens I was doing with MSTSC when this computer was running Windows. I made a shortcut with those parameters, and it works flawlessly.
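
For what it's worth, the "shortcut" is just a .desktop launcher along these lines - a sketch only, with the Exec line reusing the command above (and the real credentials in place of the placeholders); the file and menu names are whatever you like:

[Desktop Entry]
Type=Application
Name=Win7 VM (RDP full screen)
Exec=xfreerdp -v 192.168.20.103 -u admin -p password -f -multimon
Terminal=false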

Here is a description of the man page for xfreerdp:

Great image viewers for Xubuntu

I just want something that lets me zoom in and out with the mouse wheel, lets me run multiple instances, and is easy to use. Gliv fits the bill for that, and you can easily install it directly with apt. However, it is not currently maintained.

GPicView, the default viewer of LXDE, was also evaluated, but it uses the mouse wheel to scroll between images rather than to zoom, which is not the behaviour I want, so it is not my choice either.

I eventually discovered Eye of Gnome (Eog), which is better than Gliv as it allows you to turn the mouse wheel the correct way to zoom in and out (that matches Google Earth and Qgis) and also names its windows directly after the image, which is handy if you have a lot of them open. Eog is therefore my preferred image viewer on my Linux computers.
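
All three viewers are a one-line install if you want to compare them - package names as I remember them, which may differ between releases:

sudo apt install gliv gpicview eog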

Thursday 8 September 2016

More Xubuntu

As a result of rearranging things in the house and having three computers in one room, I now have three Xubuntu computers in the house and one Windows 8.1 computer. One of those Xubuntu computers is acting as a media player with Kodi. The function of the Windows computer is mostly to do with hardware I don't have Linux drivers for - scanner and cameras etc, as I can do all software requiring functions in a VirtualBox VM running Win7.

Xubuntu is a very nice system with a much cleaner appearance than Lubuntu, although on paper the latter is still the most resource efficient option for low spec computers. In practice, however, I have found there is little difference between them in resource usage, and Xubuntu is a lot easier to set up power-user applications on. So that is why I am using Xubuntu for everything now.

So, having just wiped Windows off the lounge computer and installed Xubuntu, I will now set it up to be used much as it was before, except that all the hardware will get connected to the Antec-Gigabyte computer, which has joined the two other PCs in my bedroom and is the sole Windows computer in the house.

Saturday 3 September 2016

Multimedia playback in Linux & remote access

One of the things people like to do with PCs is to use them as media centres for playback and recording of videos, TV broadcasts and music etc. I haven't really got into this in a big way other than for playback of my CD and DVD collection. Nevertheless I have used a computer for playback of these things for quite a few years. In Windows, DVD playback capability was not included after Windows 7, requiring either an app purchased from MS, or some third party software. VLC Media Player has proven to be reasonably good, although not perfect, as the playback sometimes stutters (although it is possible the computer's hardware may be at fault). 

VLC is one of the free cross platform packages available, but others exist. Kodi (formerly XBMC) is a dedicated media player package that is produced for various platforms, including Linux (precompiled specifically for Ubuntu). I have installed it onto the secondary Xubuntu computer and am just getting used to it. It looks very good, and it is designed for media-centre type scenarios where the user is some distance away from the screen and perhaps (as in my case) using a media-centre type of keyboard (I use the Logitech K400 Plus, which incorporates a trackpad) with less precise pointer movements, and therefore needs larger visual targets and so on.

So as I rearrange where the computers are situated here at home, the second Xubuntu computer will sit in the bedroom alongside the main PC as a second PC and media centre, which means Kodi will get a lot of use, and with all the plugins for various features it looks very good for that purpose.

Remote access is another interesting subject for Linux. In my lounge I have a computer I use for study, as it is sometimes a quieter working space than the bedroom, and it ideally remotes to the computer I actually use most of the time. That computer is in fact a VirtualBox VM running on the main PC, and it can be remoted onto in just the same way as if it were a regular Windows computer. VirtualBox also has its own remote display (VRDP) system which is backward compatible with RDP, meaning it is a way of getting RDP on virtual machines that aren't running Windows. I haven't looked into whether this is a better option than using the regular RDP for a virtual machine running Windows.
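
If I do look into it, enabling VRDP is done per VM from the host - a hedged sketch, assuming the Oracle extension pack is installed and using a made-up VM name and port:

VBoxManage modifyvm "Win7" --vrde on --vrdeport 3390

Then any RDP client (Remmina, mstsc and so on) points at the host machine's address on that port rather than at the VM itself.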

So that is how I actually work with remote access at the moment as I haven't got to the stage of needing to remote access the main part of the computer running Xubuntu just now. If I do get to that point at some time in the future then it will probably be VNC I will use as this is well supported in Linux.


Linux RAID-1 [9]: Removing an array

The last step having set up a new array is to remove the old one from the computer. This is relatively straightforward.

Firstly the array has to be unmounted, which is a simple umount command with the path, in this case umount /oldhome. The next thing is to run an mdadm command to remove the array from the disk.

mdadm --manage --stop /dev/md0

is the command that will stop the running md0 RAID array.

Then another command removes the array:

mdadm --remove /dev/md0

Final action is to remove the superblock on the individual disks (only one disk present in this case as the failed drive has already been taken out of the computer):

mdadm --zero-superblock /dev/sdd

We also need to remove its entries from /etc/fstab  and from /etc/mdadm/mdadm.conf
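
The tidy-up is probably nothing more than this - a sketch, assuming the array was recorded with an ARRAY line in mdadm.conf and that the initramfs gets rebuilt from that file:

sudo nano /etc/mdadm/mdadm.conf     # delete or comment out the ARRAY line for md0
sudo nano /etc/fstab                # delete the /oldhome line
sudo update-initramfs -u            # so the old array is not assembled at boot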


At this point GDU (gnome-disk-utility) is showing me the disk is just a disk with nothing on it. It still shows "Block device /dev/md0" however; I have not got much of an idea whether that will just disappear the next time the computer is restarted, but I assume it will.

Wednesday 31 August 2016

Linux RAID-1 [8]: Setting up the replacement array

As I mentioned last time, I have bought the 2 TB disks so that I can have a RAID-1 array in the main computer that is big enough to hold all of my stuff, meaning the second computer (and by extension, its screens and desk) will be effectively redundant. So here I am essentially repeating the steps from the previous articles in this series.

After backing up my stuff, I shut the computer down, found the faulty disk from the 1 TB array and removed it from the PC. The other half of the array was left in the computer and reconnected, while the two new disks were put into the other bays. With everything plugged in, the computer was turned back on. It should have come back up straight off, but it was not so easy, as the 1 TB disk was now plugged into a different port, so an mdadm --assemble --scan was needed to get the 1 TB array back online. After another reboot everything is now where it is supposed to be, /home is back in action, and the two new disks are ready to be made into a new array.

The new disks are /dev/sdb and /dev/sde this time, so the first thing to do is to partition them, using gnome-disk-utility, with one partition on each disk. Then we issue this command to create the array:
sudo mdadm --create --verbose /dev/md1 --level=mirror --raid-devices=2 /dev/sdb /dev/sde

After that we can see that /dev/md1 is online in GDU. Now have a look at /proc/mdstat and it tells us that /dev/md1 is resyncing and will be finished in about three hours. This does not preclude it from being used, however. So our next step is to format the 2 TB partition, mount it and then rsync the existing home drive to it. At the same time we can start to move data across from the 2nd computer, but before we do that, the 2nd computer needs to have its own local backup, so that is also started with another rsync.
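
The format and the initial copy are nothing special - a minimal sketch, assuming the new array is formatted ext4 like the old one and mounted somewhere temporary while the copy runs:

sudo mkfs.ext4 /dev/md1
sudo mkdir /mnt/newhome
sudo mount /dev/md1 /mnt/newhome
sudo rsync -aAXH /home/ /mnt/newhome/     # preserve permissions, ACLs, xattrs and hard links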

Next is to save the details of the array for mdadm:
mdadm --detail --scan >> /etc/mdadm/mdadm.conf

I then rebooted, and on running blkid I could see the new array had been renamed md127. Then we put it into /etc/fstab as the mount point for /home, and change the old disk to /oldhome. So the fstab first looks like
# <file system> <mount point>   <type>  <options>       <dump>  <pass>
UUID=f49e5734-6c56-4f17-81b1-6423e4045e75 /               ext4    errors=remount-ro 0       1
UUID=cd9d465e-5574-4afd-85a6-851ace3959b7 none            swap    sw              0       0

UUID=6f1ea562-fb78-44ea-9262-d4234886064d /home           ext4    defaults        0       2

and now it will look like

UUID=f49e5734-6c56-4f17-81b1-6423e4045e75 /               ext4    errors=remount-ro 0       1
UUID=cd9d465e-5574-4afd-85a6-851ace3959b7 none            swap    sw              0       0
UUID=6f1ea562-fb78-44ea-9262-d4234886064d /oldhome        ext4    defaults        0       2

UUID=dfc4fa75-63e1-49ad-9a82-d9d0b826db3f /home           ext4    defaults        0       2

A reboot is the simplest next step. On coming back up, everything is mounted correctly and some simple checks show that stuff is where it's meant to be.
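
The "simple checks" are just the obvious ones, something like:

cat /proc/mdstat          # both arrays assembled, the new one clean or still resyncing
df -h /home /oldhome      # mounted from the expected devices with the expected sizes
ls /home                  # the usual folders are actually there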

In this case the final step is to move - over the network - the files from the second computer, leaving it with only the VirtualBox and Google Drive folders in its home drive for when I do my study in there. At some point in the future I will remove the md0 RAID array and its disk from the system.

The relocation of data from both sources was completed during the day, and during the next night all of the data from the old 1 TB array (oldhome) was copied into the 2 TB disks again, this time as a move rather than a copy, in order to clean up the 1 TB array, the new array having proved its operational capability. Actually this is what I should have done the first time, instead of rsyncing from oldhome to home. The hidden folders (dot folders) on oldhome will get moved into a backup folder rather than pasted over into home, as a lot of their contents will have changed, unlike the visible folders.

Saturday 27 August 2016

Linux RAID-1 [7]: Recovery and fault-finding

/dev/md0:
        Version : 1.2
  Creation Time : Sat Jul  2 04:03:31 2016
     Raid Level : raid1
     Array Size : 976631360 (931.39 GiB 1000.07 GB)
  Used Dev Size : 976631360 (931.39 GiB 1000.07 GB)
   Raid Devices : 2
  Total Devices : 1
    Persistence : Superblock is persistent

    Update Time : Sat Aug 27 15:49:27 2016
          State : clean, degraded 
 Active Devices : 1
Working Devices : 1
 Failed Devices : 0
  Spare Devices : 0

           Name : patrick-H97-D3H:0
           UUID : fb59dadf:72ab4a7e:f821e41a:daa5988b
         Events : 524054

    Number   Major   Minor   RaidDevice State
       0       0        0        0      removed
       1       8       48        1      active sync   /dev/sdd

As you can see from the above, the RAID-1 array on my main computer has problems, with one of the disks falling out of the array. Hence in the table at the bottom, where the two devices in the array are listed, one is shown as removed, and the overall state of the array shows as "degraded".

The cause is shown in this extract from the kernel log:

Aug 27 11:05:09 MainPC kernel: [25592.706805] ata3: link is slow to respond, please be patient (ready=0)
Aug 27 11:05:12 MainPC kernel: [25595.794847] ata3: SATA link up 6.0 Gbps (SStatus 133 SControl 300)
Aug 27 11:05:12 MainPC kernel: [25595.801763] ata3.00: failed to IDENTIFY (I/O error, err_mask=0x100)
Aug 27 11:05:12 MainPC kernel: [25595.801765] ata3.00: revalidation failed (errno=-5)
Aug 27 11:05:21 MainPC kernel: [25604.206942] ata3: SATA link up 6.0 Gbps (SStatus 133 SControl 300)
Aug 27 11:05:21 MainPC kernel: [25604.213541] ata3.00: failed to IDENTIFY (I/O error, err_mask=0x100)
Aug 27 11:05:21 MainPC kernel: [25604.213543] ata3.00: revalidation failed (errno=-5)
Aug 27 11:05:21 MainPC kernel: [25604.213556] ata3: limiting SATA link speed to 3.0 Gbps
Aug 27 11:05:29 MainPC kernel: [25612.239030] ata3: SATA link down (SStatus 0 SControl 320)
Aug 27 11:05:29 MainPC kernel: [25612.239034] ata3.00: link offline, clearing class 1 to NONE
Aug 27 11:05:29 MainPC kernel: [25612.239037] ata3.00: disabled
Aug 27 11:05:29 MainPC kernel: [25612.239048] sd 2:0:0:0: rejecting I/O to offline device
Aug 27 11:05:29 MainPC kernel: [25612.239051] sd 2:0:0:0: killing request
Aug 27 11:05:29 MainPC kernel: [25612.239056] ata3.00: detaching (SCSI 2:0:0:0)
Aug 27 11:05:29 MainPC kernel: [25612.239183] sd 2:0:0:0: [sdb] Start/Stop Unit failed: Result: hostbyte=DID_NO_CONNECT driverbyte=DRIVER_OK
Aug 27 11:05:29 MainPC kernel: [25612.242814] sd 2:0:0:0: [sdb] Synchronizing SCSI cache
Aug 27 11:05:29 MainPC kernel: [25612.242840] sd 2:0:0:0: [sdb] Synchronize Cache(10) failed: Result: hostbyte=DID_BAD_TARGET driverbyte=DRIVER_OK
Aug 27 11:05:29 MainPC kernel: [25612.242841] sd 2:0:0:0: [sdb] Stopping disk
Aug 27 11:05:29 MainPC kernel: [25612.242847] sd 2:0:0:0: [sdb] Start/Stop Unit failed: Result: hostbyte=DID_BAD_TARGET driverbyte=DRIVER_OK
Aug 27 11:05:29 MainPC kernel: [25612.251453] ata3: exception Emask 0x10 SAct 0x0 SErr 0x4040000 action 0xe frozen

Clearly the disk is identified as /dev/sdb, which was the disk missing from the mdadm output. There were plenty of other similar messages in the kernel log from later in the boot cycle, and the physical symptom while the computer was coming up, during which time I was seeing these messages scrolling up the screen, was that the disk would keep spinning up, then stopping and resetting again. The messages come up on the screen because I have text mode set in grub; this computer needed text mode startups after it was first installed, to facilitate swapping from the broken Nouveau drivers to the NVidia ones for the graphics card.

The sequence of kernel messages finally ends at 11:06:53 when the disk apparently came back online, so it took nearly 2 minutes to get the disk working, and in the meantime, even though the SMART tests look OK, mdadm decided to drop it out of the array. These disks are a pair of WD Caviar Black 1 TBs that I purchased four years and five days ago. Obviously a consideration is whether to replace them with 2 TB units, as I am currently using two computers to store data, each with a 1 TB array, only because I ran out of disk space on the 1 TB array some time ago and decided splitting the data across two computers was a cheaper option, as some old spare 1 TB disks were available for the second computer.

I have decided to replace both disks in the array with 2 TB disks, a long deferred decision as mentioned above. This will bring the array up to 2 TB and therefore allow the other computer to be eliminated as an extra storage location, because I am looking ahead to a day when having two separate computers in two rooms of the house takes up too much space - and in fact having four computers in two rooms, as is actually the case, is extravagant. The idea then would be to have just one computer desk with two computers attached to it.

Obviously there are the usual lessons about backups to be learned. In this case I had a recent backup that was done when Xubuntu was put onto the computer. I also have a regular series of backups on removable disks, and the use of the RAID array is in itself a backup technique. But if there was a fire here I could still lose a month's worth of data; currently none of those backup disks are out of the house, so I have to get back to taking at least one of them offsite regularly again. However at the moment not many photos are being taken, and most other stuff like maps and study is backed up on Google Drive, so I could probably manage. Full rsync backups are what I mainly do at the moment because I haven't worked out anything else.

mdadm and some other systems exist for monitoring RAID arrays but I haven't used any monitoring tool to date.
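
That is something to fix, because mdadm itself has a monitor mode that can send a warning when an array degrades - a sketch, with a placeholder address and assuming the machine can actually send mail:

sudo mdadm --detail --scan                                           # one-off summary of all arrays
sudo mdadm --monitor --scan --daemonise --mail=someone@example.com   # watch in the background and mail on failures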

Well, I ran two SMART tests on the disk using smartctl; the short one was fine and the long one came back with a read failure. In fact gnome-disk-utility, the GUI we all use to look at disks in distros like Ubuntu, has now updated itself to show the disk as failed. So I am now working on the basis that I need to make some backups and replace this disk, because it really is stuffed. New disks will arrive next week and the new array will get put in soon after that. So I make a backup of /home to a removable drive in the caddy and redirect to it, then take out both the old disks, put the new ones in, copy /home back to them and then redirect back to them.
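
For reference, the tests were run with commands along these lines (smartctl comes from the smartmontools package, and /dev/sdb is the suspect disk here):

sudo apt install smartmontools
sudo smartctl -t short /dev/sdb     # quick test, a couple of minutes
sudo smartctl -t long /dev/sdb      # full surface read, hours on a 1 TB disk
sudo smartctl -a /dev/sdb           # view the attributes and self-test results afterwards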

Sunday 21 August 2016

Virtualisation and Qgis

UPDATE: The below comment does not make a lot of sense as I have since used Virtualbox VMs very extensively to run Qgis and edit maps. Right now I don't have need to do this, but in the past it has certainly been a major part of the Qgis work that I have done.

(Unfortunately, after further testing, the use of VMs - at least under VirtualBox - to run different Qgis versions has been found to be unsatisfactory. It is easy to see from a technical viewpoint why this would be the case in an environment where the video, keyboard and mouse are emulated rather than native.

The solution that has been arrived at for now is to use the VM to edit project properties like the rule based styles in an older version of Qgis, while continuing to use the development version for everything else. However should the development version have too many issues to make this work I would be forced to revert to the current release version, since in Linux it appears to be much more difficult to have multiple versions on the same computer simultaneously)

One of the perils of running a beta version of software is that the bugs may make it unusable. This has happened from time to time with Qgis but it has never been as much of a problem as it is with the current version 2.17-master. A bug has been found that freezes Qgis when attempting to edit rule-based styles in data layers. Since all of my layers use rule-based styles (there is a "type" field in each record which is used in all the rules) obviously I have a problem with this version.

For now I have set up two VMs to use the current release version of Qgis. One of these is running the same version of Xubuntu as the host and the other is running Windows 10. Since the host now has 24 GB of RAM (3/4 of the maximum of 32 GB the board supports) it is easily possible to run one or even both of these VMs at the same time. I won't actually need to do this but I did run the Xubuntu one and set up the Win10 one at the same time.

Here you can see the Xubuntu VM running Qgis 2.16. The Maps folder on the host has been shared and, after a fashion, I was able to work out how to permanently mount this to a Maps folder in the home folder of the user on the VM. So when I save stuff on the VM, it goes back into the right folder on the host. (I used a network share because the shared folder capability in VirtualBox ran into permissions problems.)
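
For the record, a permanent mount of a host share inside a Linux guest can be done with an /etc/fstab entry roughly like the one below; the host name, share name, credentials file and uid/gid are placeholders for illustration, and the cifs-utils package needs to be installed in the guest:

  # /etc/fstab
  //hostpc/Maps  /home/user/Maps  cifs  credentials=/home/user/.smbcredentials,uid=1000,gid=1000,iocharset=utf8  0  0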

And here we have Qgis running on the Windows 10 VM. Again using a mapped network drive, this time drive M mapped to the share. As you can see a key difference is that the Windows 10 guest additions don't support full automatic resizing of the VM's display, which becomes a window centred inside the VirtualBox window. 

So at least I can continue working with Qgis (probably preferring the Xubuntu VM, but may need the Win10 one to run IrfanView when needed) until the bug in 2.17 is addressed.

Both these VMs are using the host's folders to access the map content. This also means the overGrive app running on the host will synchronise changes to the NZ Rail Maps Google Drive.

The third VirtualBox VM I have running on this machine is a Windows 7 one. One of the reasons for having it is to give me access to a different Google Drive - this being the one linked to my main email account - which in this case is used for a whole lot of personal stuff. So for example my budget spreadsheet and all of my study materials are on that one. There is also software I need for those purposes which is not available on Linux.

I also had a play with VirtualBox on one of my work computers, which has a Wolfdale Celeron E3300 dual core and only 4 GB of RAM. This particular CPU was specially selected at the time of purchase because it supports VT-x (and 64 bit), so it just manages to meet the requirement for hardware assisted virtualisation. The computer is 6 years old and has long since ceased to be of any use at home because the 4 GB memory limitation (only two memory slots, and DIMMs larger than 2 GB being unavailable for it) means it runs out of steam pretty quickly. Still, with Xubuntu on it, which has a lower memory footprint than a lot of other Ubuntu-based distros, it has managed to set up and run a Windows 10 VM in 1 GB of memory. This has been pretty slow work, but it looks like it should be able to achieve what is needed, and it's possible I might be able to increase its memory allocation if it can take over some of the tasks I currently put on my Xubuntu desktop. It worked surprisingly well with just 1 GB allocated, and the host also worked very well with a number of other resource-intensive applications such as Google Earth and Thunderbird running at the same time. It so happens that if I put the VM up to full screen it is able to adjust the resolution to 1920x1080, but on my home computer, where 1680x1050 is the screen resolution, that VM can't be made to go up to anything higher than 1152x864, so it would seem the VBox Guest Additions have limited adaptability to different host screen sizes.
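
Incidentally, whether a CPU has the necessary extensions can be checked from Linux before committing to VirtualBox; a non-zero count from this indicates VT-x (or AMD-V) is present, though it still has to be enabled in the BIOS:

  grep -E -c '(vmx|svm)' /proc/cpuinfo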

Subsequent to this I discovered issues in the Xubuntu VM that were peculiar to it and not seen in the Win10 VM, so I switched to the Win10 VM and then investigated whether the screen resolution could be bumped up. It turns out this is possible with the VBoxManage command on the host. Specifically, the command VBoxManage setextradata "VM Name" CustomVideoMode1 1680x1050x32 did the trick, and after restarting the VM the new fullscreen resolution appeared in the display settings. VBoxManage is the command line tool for changing all of a VM's settings, and it appears to be a superset of what is available in the GUI.
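
Putting that together, the sequence run on the host was roughly as follows (substitute the VM name reported by the first command):

  VBoxManage list vms
  VBoxManage setextradata "VM Name" CustomVideoMode1 1680x1050x32
  # shut down and restart the VM; the new mode then shows up in the guest's display settings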

Thursday 18 August 2016

The Linux systemd debacle

If you are reasonably familiar with the Linux ecosystem you will be aware that there has been a big rift over a component called "systemd". This process (or set of processes) is concerned with Linux system and session management. The issue, for many, is that systemd steps beyond its role as the "init" process, with its creators wishing to extend it into an increasingly invasive takeover of more and more of the system management functions.

My conviction is that the developers of systemd are trying to make their project bigger than anyone else's, including the kernel project, which they don't lead or control. Rightly or wrongly, this is seen partly in terms of their own egos, and partly as the implied desire of their employer, Red Hat, to have a strong influence on the direction Linux takes in the future, presumably to the corporate and financial benefit of its business interests.

The Linux community has thus become strongly polarised between those who see the implied dominance of Red Hat as the issue it undoubtedly is, and those who are less concerned with the overall implications. There is now a specific fork of Debian, Devuan, dedicated to ensuring systemd remains completely optional, following a bitter debate and split over the issue in the Debian community.

As an end user I am somewhat concerned about having limited choice, since the distros I install have transitioned to systemd, and I am not willing to switch to a distro that specifically excludes it just for the sake of doing so. Unfortunately, due to software compatibility issues and my own needs and expectations of a distro, I have very little practical choice in the matter.

The Devuan admins and others are quite right to highlight the risks of allowing a for-profit corporation to dominate the Linux platform, given where Linux has come from and the kind of future difficulty that many people came to Linux to escape. That makes the heated debate and divide, in the Debian community particularly, easy to understand.


Monday 15 August 2016

Enable hibernation in Xubuntu

Hibernation is a somewhat controversial subject in the Linux desktop community. It is considered difficult to implement and to have the potential to cause a lot of problems. I have used hibernation wherever possible on all my Linux desktops, but it is not always straightforward. Older hardware may not support it, and even where the hardware is modern, changing a card can be enough to break hibernation and require a reinstallation to fix.

Here are the instructions for changing the settings to enable hibernation on Xubuntu. These have been verified as applicable to 16.04 LTS.

Hibernation uses the swap partition to store the memory contents of the computer. You should ensure the swap partition is at least twice the size of the computer's RAM.
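
As far as I know, the usual change on Ubuntu-based 16.04 systems is a polkit override that re-enables the Hibernate option in the session menus; a sketch of that file follows, and the exact list of action names should be checked against the installed polkit and logind versions:

  # /etc/polkit-1/localauthority/50-local.d/com.ubuntu.enable-hibernate.pkla
  [Re-enable hibernate by default in upower]
  Identity=unix-user:*
  Action=org.freedesktop.upower.hibernate
  ResultActive=yes

  [Re-enable hibernate by default in logind]
  Identity=unix-user:*
  Action=org.freedesktop.login1.hibernate;org.freedesktop.login1.handle-hibernate-key;org.freedesktop.login1.hibernate-multiple-sessions
  ResultActive=yes

Once the file is in place, hibernation can be tested from a terminal with sudo systemctl hibernate before relying on the menu option.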

My experience has been that older computers such as my dedicated work computers, which are both of advanced age, will not hibernate reliably in any OS. However, Xubuntu hibernates well on the four year old Ivy Bridge and two year old Haswell Refresh computers.

Saturday 13 August 2016

Installing Chrome on Ubuntu

What's interesting about Chrome is the different icons. On Mint computers and Lubuntus the icon has been square, while on Xubuntu we are back to the old round icon. This may be because of the different installation method I have used.

In general it is better to use a package supplied with your distro rather than one you download directly from a website. There are various reasons for this, but key among them is that a distro package hooks into an update system so that it is updated automatically. Usually this is done by having a package repository added to apt's sources.list, so that new package versions are detected by an apt update command.
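
The Chrome installer is a good example: as far as I can tell it drops a repository entry similar to the following onto the system, which is what lets apt keep it updated (contents shown for illustration):

  # /etc/apt/sources.list.d/google-chrome.list
  deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main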

Opera's downloadable browser package is an example of incorporating this capability within the install, and the Chrome and Google Earth packages also have it. On the other hand, Firefox Developer Edition (Aurora) downloaded from their website does not give you that, as no install script is provided, which means it cannot self-update; use ubuntu-make instead to install Aurora.

The method for installing Chrome as found on the Internet is as follows:

Install the signing key for the Chrome archive using this command

wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | sudo apt-key add -

Download the Chrome package (google-chrome-beta and google-chrome-unstable are also options)

wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
Install the package using dpkg: sudo dpkg -i google-chrome-stable_current_amd64.deb

If you get errors about missing dependencies, then run sudo apt -f install, which fixes them and automatically completes the install of Chrome.


Friday 12 August 2016

Xubuntu vs Lubuntu

Well, of course, after playing with Xubuntu at home it didn't take much to have it installed on my work computer, a very old Wolfdale with a Celeron CPU and 4 GB of RAM; a very slow computer and relatively old as well, not sure exactly but I would guess around 6 years. This has been running Lubuntu up to now. The result of this test showed there is not really a great deal of difference, performance wise, between these two variants of Ubuntu, at least for the apps I use. I understand that the memory footprint of Lubuntu itself may be slightly less.

Nevertheless, the window manager in Xubuntu (Xfwm, rather than Openbox) gives a much cleaner and more polished appearance to the operating system, which is highly desirable for someone like me. The further advantage is that it is not difficult to install sophisticated apps like Google Earth. Therefore it is more of an all-round system than either Mint, which is too resource hungry for old computers, or Lubuntu, which has too many components missing to make it easy to install power applications.


Thursday 11 August 2016

Xubuntu vs Mint

Today I decided I needed something other than Mint on one of my systems. I chose to do this because Mint has had security issues that led to their website being hacked and giving out malware, and their forum database passwords being stolen, which unfortunately brings down the whole reputation of their project.

There were only ever going to be two possible alternatives to Mint, and they are Debian and Ubuntu. I had a play with Debian, which provides Cinnamon within the distro as a supported desktop environment. Although the current Debian release has quite an old version, 2.2, you can switch to the unstable repository to get it to update to 3.0. The main issue with Debian is under the hood. Software packages I use all the time are unavailable from the official repositories because of their dislike of the licensing. They even went so far as to rebrand both Firefox and Thunderbird on the grounds that the logos are copyrighted by the Mozilla Foundation, and the resulting Icedove mail application wouldn't recognise my Thunderbird profile. Is that a big deal? Well yes, when it means reinstalling all your email accounts and downloading all the mail over again.

So what looked quite promising ended up being a waste of time: while I can hack installs as well as a lot of expert users, the lack of support for many things that Ubuntu/Mint have catered for was going to be a really big issue, making it nearly impossible to have a smooth running system and a painless transition from Mint.

So then I had a live preview of Ubuntu followed by Xubuntu, and quickly chose the latter because its desktop environment Xfce is as good as if not better than Cinnamon. And the biggie of course is it can install Ubuntu packages, which comes down to much wider support in some areas than Debian. Ubuntu itself is limited in that Cinnamon is not officially available on it, so the way seemed clear to break with Cinnamon completely. As it happens, Xfce bears more than a passing resemblance to Cinnamon.

We shouldn't knock Debian, of course, as it underlies Ubuntu and all its flavours; it is just more of a system that favours expert users, and has some limitations. Which is why building Ubuntu on top of it has the best all-round outcome.

So LoungePC is getting another makeover, the third in the same week. The installer is almost identical to Mint's, both of which are somewhat better than Ubuntu's. The RAID-1 got brought back up fairly quickly and I am just finishing off some installs. 

I guess eventually the map computer will also be migrated to Xubuntu but there is no real hurry for that.

Wednesday 10 August 2016

Linux Kiosk computer with Chrome Browser [3]

So, having built our image and got it going pretty much as we please, the next thing you will want is to be able to clone it from one computer to another.

The default Linux install can be a single partition for the OS plus a swap partition. Since the OS is all in one partition, you can just copy that partition to each computer you want to install on. The main issue is that you aren't copying the Master Boot Record, so the computer won't be able to boot directly after the partition has been copied. You also need to create a swap partition.

After looking at a few different tools I could use to do the cloning, the GParted Live CD is the one to go for in this case: a manual clone in which I copy the OS partition, set up a swap partition and reinstall the bootloader. GParted is a partitioning tool, but you can also use it to copy partitions between disks. You can write the iso image they supply to a pen drive and boot from that pen drive, then copy the partition from your source machine to another pen drive. Then, using the same two pen drives, you can restore the image to another machine.

The two additional things you need to do, which can be done from the same Gparted live CD session:
  • Create a swap partition. Make sure you turn on swap ("Swapon" in the right-click menu) for this partition after creating it. I used 1-2 GB as a swap partition (my computers generally have 256 MB of RAM)
  • Reinstall the boot loader (GRUB). The steps for this are as follows
    • Find out in the Gparted main window where the bootable (OS) partition you copied to the disk is e.g. /dev/sda1
    • Go back to the main menu of the Gparted live CD and select Terminal
    • Type in the following commands:
    • sudo mount /dev/sda1 /mnt
    • sudo mount --bind /dev /mnt/dev
    • sudo mount --bind /proc /mnt/proc
    • sudo mount --bind /sys /mnt/sys
    • sudo chroot /mnt
    • grub-install /dev/sda
  • Then reboot and away you go.

I guess the Grub install steps could be scripted.
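
A minimal sketch of such a script, assuming the copied OS partition is /dev/sda1 and the target disk is /dev/sda (adjust both to suit), run from the GParted live terminal:

  #!/bin/bash
  # Reinstall GRUB on a freshly cloned disk
  set -e
  sudo mount /dev/sda1 /mnt
  sudo mount --bind /dev /mnt/dev
  sudo mount --bind /proc /mnt/proc
  sudo mount --bind /sys /mnt/sys
  sudo chroot /mnt grub-install /dev/sda
  sudo umount /mnt/sys /mnt/proc /mnt/dev /mnt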

The one thing possibly I may do with these computers is to test Ofris (a deep freeze tool) to see if they can be made completely bombproof. And I have made a simple website that will come up as the default in the web browser when you start them up. So there we have it, they just have to be good for 3 months of use. 

For remote maintenance you can just use SSH to log into them and check anything or maintain anything on them.

The last issue I needed to resolve when cloning was that the clones would come up with the network card as eth1 instead of eth0, and as the settings which enable the network card referred to eth0, which no longer existed on the clone, it had no networking. With the help of the Ubuntu community I discovered that a file called /etc/udev/rules.d/70-persistent-net.rules contains an entry that effectively locks the network card (eth0) to the MAC address of the original machine. The entries relating to this should be deleted before cloning the computer, but they are regenerated each time the computer boots, so you have to be sure you remove them immediately before booting into the clone environment, or else edit this file directly on the source partition while in your cloning environment.
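
In practice that just means removing the file at the right moment, either on the source machine immediately before shutting down to clone, or on the copied partition while it is mounted in the GParted live session; for example:

  # on the source machine, just before shutting down to clone
  sudo rm /etc/udev/rules.d/70-persistent-net.rules

  # or on the cloned partition mounted at /mnt in the live session
  sudo rm /mnt/etc/udev/rules.d/70-persistent-net.rules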

Monday 8 August 2016

Linux Kiosk computer with Chrome Browser [2]

So the steps I followed are:

  1. Install Ubuntu Server 14.04.5 from a bootable pen drive
  2. Set my username to user with a password
  3. Enable automatic updates
  4. Install OpenSSH server in the tasksel at the end
  5. Reboot
  6. Log in, or SSH into the system (the latter, from another system, is preferred since you can copy and paste instructions from this web page)
  7. Install some packages:
    1. sudo apt install --no-install-recommends chromium-browser
    2. sudo apt install  --no-install-recommends xorg openbox pulseaudio
    3. sudo usermod -a -G audio $USER
  8. Edit the kiosk.sh file:
    1. sudo nano /opt/kiosk.sh
    2. Enter the following lines:
      #!/bin/bash

      xset -dpms
      xset s off
      openbox-session &
      start-pulseaudio-x11

      while true; do
      rm -rf ~/.{config,cache}/chromium/
      chromium-browser --incognito --disable-background-mode --disable-sync --start-maximized --no-first-run 'http://www.example.com'
      done
    3. Ctrl-O to write the file then Ctrl-X to exit
  9. sudo chmod +x /opt/kiosk.sh to make the script executable
  10. Edit the kiosk.conf file:
    1. sudo nano /etc/init/kiosk.conf
    2. Enter the following lines:
      start on (filesystem and stopped udevtrigger)
      stop on runlevel [06]
      
      console output
      emits starting-x
      
      respawn
      
      exec sudo -u user startx /etc/X11/Xsession /opt/kiosk.sh --
      
      
    3. Ctrl-O to write the file then Ctrl-X to exit
  11. sudo dpkg-reconfigure x11-common and select "Anybody" can start the X server
  12. try sudo start kiosk for testing and away it should go (or reboot)
  13. Change grub configuration to hide startup messages:
    1. sudo nano /etc/default/grub
    2. Find GRUB_CMDLINE_LINUX_DEFAULT and put quiet splash inside the quotes.
    3. Ctrl-O to write the file then Ctrl-X to quit
    4. Run the command sudo update-grub
    5. Reboot to test
I am still looking at whether to clone the disk to copy it to another computer. It looks complex to do this, because we really want to be copying a whole disk (two partitions) instead of just a partition at a time.

Chromium policies are also a possible scenario as referred to in the previous post.
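
As a rough illustration of that mechanism: Ubuntu's chromium-browser reads managed policies from JSON files under /etc/chromium-browser/policies/managed/, so something along these lines, in a file such as /etc/chromium-browser/policies/managed/kiosk.json, could lock browsing down to the kiosk site (the file name and policy values are examples only and would need checking against the installed Chromium version):

  {
    "IncognitoModeAvailability": 2,
    "URLBlacklist": ["*"],
    "URLWhitelist": ["example.com"]
  }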

There is some Openbox configuration we want to change, particularly the number of virtual desktops offered and the right-click root menu that we don't want: only one virtual desktop, and the popup menu disabled.

To do this, copy /etc/xdg/openbox/rc.xml to /home/user/.config/openbox and then use nano to edit the copy. Look for the <desktops> section to set the number of desktops to 1, look for the <keybind> entries that relate to switching desktops, and look for the <mousebind> for a right mouse click that relates to the root menu, in order to get rid of the right-click menu. The relevant parts look roughly like the excerpt below.
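
A rough sketch of the relevant sections of rc.xml, based on a stock Openbox config from memory, so the element names should be checked against the actual file:

  <desktops>
    <number>1</number>
  </desktops>

  <!-- delete keybinds like this one so desktops cannot be switched from the keyboard -->
  <keybind key="C-A-Left">
    <action name="GoToDesktop"><to>left</to></action>
  </keybind>

  <!-- deleting this mousebind in the Root context disables the right-click menu -->
  <context name="Root">
    <mousebind button="Right" action="Press">
      <action name="ShowMenu"><menu>root-menu</menu></action>
    </mousebind>
  </context>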