Sunday 24 December 2017

NUC as HTPC or small portable computer [2]

Last time I posted about this topic; here are some further thoughts after using a NUC for two and a half weeks. In addition to HTPC duty, I took the NUC with me on a holiday in the south of New Zealand, leveraging its tiny form factor to save on transport space and weight.

The NUC has been a very useful and dependable replacement for the AMD E350 computers I used to use in the HTPC role at home. With the driver support available in current Xubuntu, graphics performance is very satisfactory, as is performance generally. While down south I used it to run the map drawing software Qgis as well as for more general tasks, and it handled the heavy load of rendering the screen and composer output just fine. If anything the hardware seems quite fast, with its Celeron J3455 quad core and 8 GB of RAM, compared with the Pentium G dual core and 24 GB of RAM in my main PC.

The main issue you do have to be aware of with these NUCs is the built-in wireless capability. The internal wireless card comes preinstalled with a couple of small antenna wires already attached to the case. The performance of these will not be as good as a regular whip antenna, so you have to be careful where you put the NUC. On holiday I was only getting 1 Mbps from the campground's free wifi, which was very disappointing and almost unusable (the iwconfig command gives details of wireless link performance). I even burned through several GB of Vodem data (and had to pay top-up charges as well) to be able to do all the usual things I do at home on the internet. However, as soon as I moved the chassis to a high windowsill the wireless speed vastly improved, to around 65 Mbps, and now everything seems to be working just fine. Even at home the wireless speed is slower than what the Ubiquiti wireless bridge cabled onto the old computer was capable of, but I will have a look at the location of the box if I decide more speed is needed.
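For the record, the bit rate and link quality figures can be pulled straight out of iwconfig's output. This is a sketch using sample output captured into a variable so the pipeline is self-contained; on the NUC itself you would pipe `iwconfig wlan0` (the interface name varies by machine) into the same grep.

```shell
# Sample iwconfig output held in a variable for illustration;
# on a real machine, replace the printf with: iwconfig wlan0
sample='wlan0   IEEE 802.11  ESSID:"campground"
        Bit Rate=65 Mb/s   Tx-Power=20 dBm
        Link Quality=60/70  Signal level=-50 dBm'
# Extract just the bit rate and link quality lines
printf '%s\n' "$sample" | grep -oE 'Bit Rate=[0-9.]+ Mb/s|Link Quality=[0-9]+/[0-9]+'
```

Watching these two figures while moving the box around (windowsill versus floor) makes the antenna placement problem obvious.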

Thursday 7 December 2017

Using a NUC as a HTPC

So I have been having a play with a NUC as a HTPC. It looks like it could be a good replacement for the AMD E350 I am currently using to play movies and do a few unrelated things in my bedroom.

The main issue with the NUC is getting Linux support for its hardware, so I chose to test it with Xubuntu, because Intel specifically supports only a few distros (particularly Ubuntu and Fedora). This makes it easier to get the drivers needed. (There is a wireless and Bluetooth card built into the one I am testing, the NUC6CAYH, which can take one 2.5" laptop HDD and up to 8 GB of RAM.)

There have been issues in the past with EFI support on NUCs. Suffice it to say I couldn't get Debian to boot on this one, but the Xubuntu installer was able to do the EFI install properly. It is the first time I have set up a computer as an EFI installation by choice, which requires a new skill set: a separate 300 MB FAT32 partition is required specifically for this function. The system restarted just fine - it looks like the Xubuntu installer puts the required files into the folder the NUC expects to find them in on startup.
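A quick way to confirm the EFI setup took, once the system is up: the kernel exposes a directory under /sys only when the machine was started by UEFI firmware. This is a standard Linux convention, not anything NUC-specific.

```shell
# If the kernel was started by UEFI firmware, /sys/firmware/efi exists;
# on a legacy BIOS boot it does not.
if [ -d /sys/firmware/efi ]; then
    echo "booted via UEFI"
else
    echo "booted via legacy BIOS"
fi
```

The FAT32 partition itself ends up mounted at /boot/efi, which `lsblk` or `df` will show.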

The messy part was updating the NUC's firmware - the update tool provided by Intel only works on Windows 10 x64, so I had to install that first and then run the updater to get all the updates needed. After that I took Windows off to install (firstly) Debian, which couldn't boot because the BIOS wouldn't recognise it. At that point I did some checking for compatibility issues and decided to go with Xubuntu using the UEFI boot.

Out of the box it looks like most hardware is OK - it recognises the wireless card, and possibly Bluetooth will work, which will be interesting. I used to use Bluetooth on mainpc to transfer photos and files to my phone when I wanted to post on Instagram or the like. It was flaky, however, and I haven't bothered with it since reinstalling some considerable time ago - possibly Debian doesn't install the driver stack by default - because I started using Google Drive to transfer files instead. I can't think of any particular reason to use Bluetooth with a HTPC, so it probably isn't a big deal. It would only be useful if I could use it to play music from a phone through the HTPC's speakers, and I am pretty sure that is not possible.


The NUC's claim to fame is its very small package - smaller than Mini-ITX - which is partly achieved by having sideways memory sockets, eliminating any expansion slots on the board, and providing only a single SATA port and limited other headers. Another part of it is that the CPU, a Celeron J3455 quad core, is mounted directly onto the mainboard without a socket, and because it has a very small TDP of only 10 watts maximum, the fan can be quite small as well. In fact it is possible in the BIOS to configure the system to turn the fan off when it isn't needed; in truth I have never noticed the fan running, so it must be very quiet.

The system has four USB ports, VGA, HDMI, an SD card reader and two sound connectors, one of which doubles as optical and analogue output. It comes with a universal power adapter with different plug heads, so you just clip on the head that fits your local mains socket, and a VESA mounting plate which can also be used as a general wall mount. The system uses an Intel Visual BIOS which has the annoying habit of not properly recognising the Logitech multimedia keyboard I use - specifically its function keys - so I have had to plug in a corded keyboard to change the BIOS settings.

So now I am using Xubuntu again. This doesn't mean I have any intention of switching other computers back to it (especially since mediapc was upgraded to Debian recently) - it is just easier on the NUC because of better hardware support. Debian would tell you on startup about a pile of missing firmware files for the Linux kernel; Ubuntu installs all the right files itself, so it just works without the error messages.
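Those complaints are easy to spot in the kernel log. Here is a sketch using a sample dmesg line (the driver and firmware file names here are made up for illustration); on a real Debian box you would run `dmesg | grep -i firmware` and then install the matching non-free firmware package.

```shell
# Sample kernel log line of the kind Debian emits when a firmware blob is missing
sample='[    2.1] r8169 0000:02:00.0: Direct firmware load for rtl_nic/rtl8168g-2.fw failed with error -2'
# Pull out which firmware file the kernel was looking for
printf '%s\n' "$sample" | grep -io 'firmware load for [^ ]*'
# On Debian the usual fix is: apt install firmware-linux-nonfree (or a
# driver-specific firmware package), after enabling the non-free repository.
```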


Saturday 2 December 2017

Interview with Gary Sherman, the founder of the QGIS project

Gary Sherman is the GIS industry professional who started the Qgis project way back in 2002. Here is a two-part interview with him on YouTube.

Wednesday 29 November 2017

Windows 10 downgrade

When I started using Linux I did not know how successful a changeover to it could be. At the beginning of this year I set up a new computer with Windows 10, and deemed it worthwhile to have a PC in a space-wasting tower case, with new hardware, to run Windows 10 on, because I expected to keep using it a lot.

Now it's nearly a year later and I hardly use Windows 10 at all. The most I need it for is printing, scanning, downloading photos from my cameras, and using IrfanView to edit the cameras' images. These days I don't even turn that computer on once a week. Until very recently I thought I would still need it for graphics editing, but having learned Gimp on my Debian desktop in the last month, I won't need Paint.net at all. I still have some documents I edit in Word, but that doesn't need many resources.

So, as I have an old, slow AMD E350 mini-ITX in a small chassis available, I have decided this will be my Windows 10 computer, and the Home edition (which I am buying a license for) will serve just fine for this purpose. Even though MS famously forces updates to Windows 10 on you at the drop of a hat, testing shows this older computer will run 10 and the display will work well. I don't need much hardware support beyond the screen and USB ports, and with the screen I have it connected to, the default Windows drivers work well enough. The E350 box has 8 USB ports, which should be enough, as the screen it connects to has another 4-port hub on it. The tower has had a lot of extra USB ports added by me - I think there are 4-6 on the back of the motherboard, a 4-port USB card, a couple of 2-port brackets on the motherboard headers, the two on the front panel, and a 2-port hub on the monitor - and I don't expect the same need for ports as when the tower ran Windows. There were three or four camera/phone USB cables connected, and I don't know if I will keep them all plugged in at once as I did with the tower, which is less accessible.

This means I can reclaim that nearly new tower computer for Debian, and that's what I am working on at the moment. I should get the Windows 10 license I have ordered in a day or two, and then I can start setting up the E350 with the stuff it needs for Windows 10. The tower is going to run Debian 10 a lot better for many tasks because it has an Intel graphics chipset instead of an older Radeon chipset, for which there aren't any drivers any more (in either Windows or Linux) and which redraws the screen really slowly. The tower also has room to add more disks in future (it already has two inside) and a removable drive bay, like all my computers. There won't be much in the way of local resources on it, but should I get into audio recording with some of the great software available on Linux, it is possible it could be used that way. I am planning to evaluate Ardour and a couple of other serious audio packages on it just to see how good they are, as well as Qgis. Mostly it will be testing Debian 10, which is a development release.

Monday 27 November 2017

More Debian

Since last writing, the remaining Linux computers in the house have been changed over to Debian. As usual this was a straightforward procedure; in the case of mediapc, reconnecting the RAID array was as simple as it was on mainpc. All the computers use XFCE as their desktop environment.
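For anyone doing the same, the quickest sanity check after reconnecting a software RAID array is /proc/mdstat. A sketch using sample contents (the device names are illustrative); on the machine itself you would just `cat /proc/mdstat` and look for the same marker.

```shell
# Sample /proc/mdstat content for a healthy two-disk RAID1 array;
# [UU] means both members are up (a failed member shows as _)
sample='md0 : active raid1 sdb1[0] sdc1[1]
      976630336 blocks super 1.2 [2/2] [UU]'
printf '%s\n' "$sample" | grep -q '\[UU\]' && echo "array healthy"
```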

There are now four computers in this household running Linux and one running Windows 10. The plan is to move Windows 10 to the lowest-spec machine, since I use Windows quite rarely and struggle to see the rationale for having a capable computer sitting there unused most of the time. However, the case for the fourth Linux computer isn't that strong either; of the four computers on this desk, at present the lowest-spec one runs Linux and the third-lowest runs Windows. The other issue with switching computers is ensuring there are enough USB ports, as quite a few camera cables and other devices, such as the printer and scanner, are plugged into the Windows computer.

With a fourth computer now at my desk (a small form factor box that used to play videos in my bedroom), it is mounted via its VESA bracket on one of the monitor support posts at the side of the desk, and has its own keyboard and mouse plus the KVM to share the main keyboard and mouse. It has one display of its own and shares a display with mediapc. At least this lets me make reasonable use of it to do some actual work.

Debian 10 is a test installation, as this is a development version of the software, but it will run the Qgis development edition. I am using it where I can, and may install additional applications to test this version of Debian reasonably well. I may also have a look at a good audio editor, if one exists, to see how well it would work on the platform.

The bedroom PC is hopefully getting replaced with another low-end system that can play videos, using better supported hardware (an Intel chipset), because this has been an issue with the AMD E350 board (the same board as in the fourth computer on my desk). Since AMD has dropped support for the Radeon chipset on these boards in Linux as well as Windows, these computers now struggle to play even Youtube videos full screen. I could put an older version of Debian or Ubuntu on it and regain that performance, but replacing the computer with one using an Intel chipset is the preferred scenario.

Monday 13 November 2017

Best open source software

Here are my picks:


  • Debian GNU/Linux. One of the oldest distros around and definitely among the best, many others are derived from it. First released 24 years ago. I now use Debian on my desktops daily, generally with XFCE desktop environment.
  • Xubuntu Linux. After trying a few variants of Linux and Ubuntu this was the one I used the most until recently and is still very well worthwhile. It is basically Ubuntu with XFCE as the desktop environment and compared to Cinnamon that runs on Linux Mint, it is very economical on system resources, you will appreciate this whether you run low end or high end hardware.
  • Qgis. A great GIS package and a great FOSS package. I've been using it since 1.8 (the current stable release is 2.18 but there is a 3.0 due early next year) and have frequently used the development masters for day to day work on my maps project.
  • Gimp. A brilliant graphics package every bit as good as Photoshop, yet completely free and open source. It has a well deserved reputation as high quality, well supported software.
  • Inkscape. The other piece of high quality graphics software, whereas GIMP works with rasters, Inkscape works with vectors. It can open one of my maps produced in a PDF file and edit every element in the map easily. I haven't yet had occasion to use Inkscape in a production environment but it is standing by for any time in the future that I might have to heavily customise any maps for special purposes.
  • Firefox Developer Edition. This adds on to the basic functionality of the regular Firefox release. It runs e10s out of the box and also has many tools provided to aid web development. Whilst I don't use these tools much myself, FFDE (formerly Firefox Aurora) is a great general purpose web browser.
  • Disk Usage Analyser (Baobab). Every so often you run out of disk space; this package does a great job of analysing disk usage and helps me keep on top of my computer's home drive free space.
  • LibreOffice. I haven't made much of this other than Writer but the capability looks to be very good. I must spend some time digging more into what you can do with this software suite.
  • Thunderbird. One of the best email clients ever written, its strength lies in its common heritage with other Mozilla projects, which includes the ability to be customised and added on to with extensions. The calendar which can work with Google Calendar is an example of this.
  • Youtube-dl. If you have ever installed some dodgy "youtube downloader" only to find your PC was taken over by spyware, you'll appreciate this great command line package. Very easy to use, it is also hugely customisable with dozens of switches and settings.
  • Kodi. I have a couple of my computers running this software in use every day for playing video clips, ripped DVDs and extracted CD tracks etc. It is designed to be used with a multimedia keyboard and has a wide range of plugins and extensions available.
  • Simple Screen Recorder. A great and easy to use package for capturing your screens. Does not have any technical limitations or put any watermarks into your video clips.
  • Bluefish Editor. A great text editor, I mainly use it to hand code the HTML on my web site. It has all the usual stuff like syntax highlighting, code completion, colour coding etc etc.
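As an aside on the Disk Usage Analyser entry above: the question Baobab answers graphically ("what is eating my home drive?") can also be answered from a terminal with du, which is worth knowing for machines without a desktop environment.

```shell
# Ten biggest items directly under the home directory, largest first
du -sh "$HOME"/* 2>/dev/null | sort -rh | head -n 10
```

The -h flags keep the sizes human-readable while still sorting them numerically.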

Monday 23 October 2017

Evolution and digiKam

UPDATE: Unfortunately, in practice Evolution has turned out to be another poorly supported piece of open source software that gets only minimal updates and bug fixes. It is not too much of an exaggeration to say its Imap support resembles that of the Outlook it tries to emulate. As far as I can tell the software has not had a major update since 2012. If all the bugs were fixed it would be great software, but with serious issues like being unable to connect to calendars, hanging when sending or receiving messages, and other things that make it unreliable, I have to go back to Thunderbird, which at least is dependable even with the (relatively minor) issues it has.

As noted the main issue with Thunderbird is its inability to reliably update counts in additional Imap folders other than the Inbox. One way around this is to have, as I do now, three general purpose email accounts which are graded as low, medium and high priority.


Having written about both last time, I am going into more depth this time, taking a serious look at what I can do with each piece of software.

Evolution as an email package is written to emulate the look and feel of Outlook. It is of course much superior, and also better than Thunderbird in some ways. At the moment the biggest thing for me is using filters and labels to move messages to different folders. Thunderbird handles this poorly by not updating the subfolder counts until you actually enter them; Evolution performs better in this regard. Its limitation is that it doesn't have the folder view options, like selecting favourite folders, that Thunderbird provides. However, you can unsubscribe from folders - I dislike Gmail automatically marking messages as important or starred and putting them into those folders, so I can unsubscribe from them and not have them wasting space in Evolution.

Since Evolution has turned out to perform better than Thunderbird, I have decided to use Evolution for my day-to-day email, and Thunderbird for the accounts I am closing down, until they are gone. Evolution can handle Gmail and Google Calendars and Contacts quite well without problems, whereas Thunderbird has a lot of trouble updating the counts in additional folders, so I am working to implement this decision.

digiKam is an interesting piece of software that I need to evaluate further as well; a key consideration is whether it could take over from the Windows computer for downloading and renaming photos from my cameras. At the moment the main question is whether I can get it to connect to any of my cameras at all. Initially Debian did not recognise a camera out of the box, and it seemed the necessary capabilities are not enabled in Debian by default. When I plugged the camera into mediapc, running Xubuntu 17.10, it was detected and mounted immediately. After some investigation I installed a package called pmount on mainpc, plugged the camera back in, and it was detected and mounted automatically as a USB mass storage device. digiKam has been able to import pictures from the camera, and I am now testing it further to see how useful it might be and what I can do with it.
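The order of checks I used boils down to: confirm the camera is visible on the USB bus, then mount it. A sketch with sample lsusb output (the Canon device ID shown is illustrative, not taken from my camera); on the real machine you would run `lsusb` itself, and pmount takes the device node that appears in the kernel log.

```shell
# Sample lsusb output line; on a real machine: lsusb | grep -i canon
sample='Bus 001 Device 004: ID 04a9:32a2 Canon, Inc. PowerShot SX60'
printf '%s\n' "$sample" | grep -i 'canon'
# Once the camera is visible as mass storage, pmount mounts it without root:
#   pmount /dev/sdb1 camera    # photos then appear under /media/camera
```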

Sunday 22 October 2017

Facebook sucks...

As I have written before, apart from interacting with friends, Facebook is a giant experiment, and not one that actually achieves a lot of good. The whole premise of Facebook is to be bigger than everyone else and to churn out fantastic profits from advertising. "Social media" is about creating a product people think is amazing so that they will use it, so that the company can sell the advertising.

It is when you start to read between the lines and see all the cracks that you realise many of these free social media platforms are a crock. At the moment Google is the only social media platform I haven't had a bad experience with. I suppose there could be one coming - or I could count against Google their annoying repetitive advertisements, which mine your emails and search history daily and immediately. Fortunately there aren't that many pages where advertising for Russian singles sites pops up, but at times it seems like those are the only adverts you see all day long. I must see about getting an ad blocker for the browser, because some of the other adverts, especially the ones for fake products and scams, are really annoying as well.

I did previously comment on my experiences with Flickr. Facebook and Flickr are about equal in my list of loathsome social media experiences. But most of my contempt is reserved for Facebook. As many people are finding out to their cost, the amount of trolling and self-aggrandisement on Facebook knows no bounds. There have been numerous shocking examples of bullying behaviour and trolling on Facebook. Here is an example:
In this case a coffee shop manager made some stupid comments about their business; the result was that a group of people vandalised their Facebook page and forced the business to close down.

Closer to home there have been several examples that come to mind. One was a local school where an unsubstantiated allegation of bullying was made by a parent who put a sign up in the street and then got the news media to publish a story highly biased towards him. The story resulted in the school having to take their Facebook page offline because of people vandalising it with attack comments. All on the basis of hearsay, not proven fact. I posted a comment in favour of the school, which got 250 thumbs down on the Stuff newspaper website. 

Another case is this one
Essentially there is no system that can prevent someone signing up a fake social media profile and having a go at someone. I have not had these experiences, but I do get friend requests almost every week from profiles that search out people and make a lot of cold-call friend requests. This becomes clear when you click on the profile to see who it is and find it has already been deleted by Facebook. You can click "Delete request" and then "Mark as spam"; if there are too many "Mark as spam" responses to a particular profile, FB automatically suspends it. This is the best response, because if you do not know who a person is, you should certainly not trust them on Facebook or any other platform.

When we started doing things like Yahoo Groups for communities of interest such as railfanning, it was tame compared to what happens on Facebook now. As a lot of admins of these groups have found, trolling and flame wars are everyday occurrences, and the moderators have their hands full dealing with them. Add the people who use these groups for their own ends, such as stalking or criminal activity against group members, whose background is entirely unknown (including whether they have a criminal record), and there are huge potential problems. There are also problems because the news media exploits social media for stories: a lot of what appears on the major news sites in NZ originated on social media, and is often exaggerated or not checked for factuality before publishing. Another example is the Neighbourly site, which Fairfax NZ has taken a shareholding in - they are obviously looking for a source of news stories.

I think the fundamental problem is that the whole notion of social media is that everyone has something worthwhile to say, and it gives people a vehicle to say things whether they are smart or not. So it has created a platform for a massive amount of online trolling and for half-truths masquerading as fact. For me, social media is best used to interact with groups or agencies I have good relationships with, like Christian ministries that I follow. When it strays into self-opinionated drivel it is best avoided, which is why my own FB pages largely stick to fact rather than opinion where possible. The news media of all hues are busy using FB, and in many cases their own websites, to peddle the most inflammatory opinionated rhetoric, as if they were someone special or we owed them a living. Duncan Garner and Sean Plunket are both fools who use social media to spout opinionated drivel under the guise of "free speech" and then stalk off in high dudgeon when challenged. The well known Christian conservative lobby group Family First has come out all guns blazing against the new government, attacking its policies before half of them are even known, and some of their comments are at best half-truths. It's a great time to be limiting one's social media activity, because there is so much rubbish out there.

Tuesday 17 October 2017

Digikam / Android 8 not perfect / Ditching Outlook.com email

For some strange reason this message (which only appears once in my inbox) has some magical properties that enable it to push itself to the top of the list every so often.

Android 8 seems to have issues with DefaultCarrierApp, which puts up a notification, telling me I have run out of mobile data, that won't go away. This is quite a common issue that a lot of people are experiencing, as it turns out.

Aside from testing Evolution for email (quite positively so far), I am now experimenting with digiKam as a possible replacement for Irfanview. I don't know whether it can do watermark text or renaming the same way, or automatically download images from the camera, so it will take a lot of work to test these features.

Sunday 15 October 2017

Upgrade MainPC (and BedroomPC) to Debian [3]

This is really just a quick followup on two of my four PCs replacing Xubuntu with Debian running the Xfce desktop. It's interesting that I tried Debian a few times in the past and never liked it much; maybe using the familiar Xfce desktop has helped, but far more likely my increased familiarity with Linux has been the determining factor. In both cases the installation took more work than Xubuntu, but I am quite confident of being able to overcome any challenge with Debian by now. It has a well deserved reputation as the king of Linux distros, due to its long and continuous development over more than 20 years, its stability, and its widespread adoption and support in the OSS community.

I still have one PC running Xubuntu and there is absolutely no rush to change it over. Like the bedroom PC, which had a non-booting installation that needed to be fixed, it would only be out of actual necessity that I would replace the installation on this computer, since everything runs just fine on it at present. The impetus for MainPC was mostly Qgis compatibility, after a perhaps rash decision to run a beta version of Xubuntu for a time.


More useful Linux software

Obviously one thing you need with an operating system platform is software that will run on it. Every platform has software that is produced cross-platform and software that is platform-only, and we constantly run into the issue that something we want is not available on the platform we use most of the time. It's for this reason that many of us using a less well supported platform like desktop Linux or macOS need to keep a computer, or at least a virtualisation platform, running Windows, for the software we haven't found or can't get an equivalent replacement for. After one year of using Linux I am still using Windows for the automated download of photos from my cameras; scanning and printing software for those two hardware devices; IrfanView for photo editing; MS Office for some functions I haven't yet checked are possible in OpenOffice; Paint.net, because the Linux equivalent Pinta is much less stable; and a few less used capabilities like DVD ripping, iTunes, and syncing another Google Drive for personal files (rather than the maps on MainPC). I am sure with a lot of hard work I could eliminate the need for a Windows computer for all these things, but it only cost about $300 to build that new computer from scratch (although I will have to buy a Win10 license for it someday), and a VM would have trouble running some of the hardware-dependent software. This computer only needs one screen, and can be used if I need a third screen for something I am doing on mainPC, to which it is networked.

As the comment about Paint.net reveals, sometimes what looks like it should be a direct replacement is not of good quality. Unfortunately there is a lot of software on Linux that doesn't live up to its promise, whether because development has been abandoned or because it is inadequately resourced. I don't want to knock a lot of software, or claim the open source model causes this, because that would be patently untrue. However, it remains an issue that the lack of widespread desktop Linux adoption means we are really piggybacking on the massive adoption of Linux on servers, or using software that has a Linux port from some other platform, or that is natively developed for desktop Linux and then ported elsewhere. Qgis is an example of very good software that started on Linux and has been ported to Unix, Windows and macOS; I started using it on Windows and then switched to using it on Linux. I'm happy to say the experience on Linux has been at least as good as on Windows. The only concern has been that the package install folder can't easily be configured (maybe there is a way around this I don't know about yet), precluding multiple installations; this is the price you pay for being able to upgrade to a new edition painlessly and automatically by typing a command into a terminal window. The build-from-source option I have used to have two versions running on my Debian desktop hasn't been too difficult. Qgis has the great benefit of being well supported and stable, despite the at times slow-seeming evolution and resolution of issues. Much of the graphics and video editing software I have tested on Linux tends to be unstable, and more often than not this is a consequence of poor design: the programmers have not anticipated and coded for possible error situations in a way that lets the software recover, so it just crashes instead.

The aim of this post is to write about some more software tools I have adopted recently for the Linux platform. As a result of recently becoming webmaster for nzrailmaps.nz, I had to find software that would let me edit and maintain my site. With no custom site builder integrated at trainnweb.org, it's back to using a desktop graphical editor, or a specialised text editor, plus separate FTP software to sync updates to the web server. I spent a lot of time looking for and trying several GUI editors before finding Seamonkey for Windows (which for some reason is not available as a .deb package for Linux and is hard to install on that platform), and rejected it due to its apparent lack of CSS support and a few other issues like clunky design. These days, with the rise of free web hosts like Google Sites and many others with site builders integrated into them, there is not much call for basic GUI web editors and not many people writing them. I came to the conclusion it was far easier to use a specialised text editor with HTML/CSS templates built in, of which a great many more are produced because they can work with a whole range of coding languages and environments. Editing HTML in a text editor is not very hard, and it's something I have plenty of experience with. So Bluefish is the editor of choice: very stable and useful. The FTP software for now is FileZilla, something I remember from Windows and quite suitable for a small site; you have to manually browse to each directory and select the files to upload, though it does have a capability to highlight which directories or files have been updated.

Another type of software I am experimenting with is email, to see if there is a better client than Thunderbird. Really the only realistic alternative that is not tied to some other package (like the Seamonkey suite, for example) is Evolution, which is interesting, as it is rare for non-MS software to support Exchange (there is only a proprietary Exchange plugin available for Thunderbird). I have installed Evolution on MainPC and set it up to access a few lower priority or soon-to-close email accounts for now. As Outlook is known to have issues with Imap support, it will be interesting to see if Evolution is significantly better in this regard, and also how it handles Google calendars and so on. Evolution used to be cross-platform but is now Linux-only. It was started by Ximian/Novell and is now developed commercially by Red Hat, for whom it obviously benefits the wider goal of deploying Linux commercially in corporate environments, but it remains an open source free variant that can be downloaded and installed automatically by apt.

As long as there is a lot of software for Linux, which I have no reason to doubt, it will continue to be a very useful desktop platform, but I can't see the Windows desktop disappearing any time soon. Although some Windows software has been ported to Linux using the Mono adaptation of .NET, I have generally felt that it is not as good as native software. I have found both Pinta and Flickr Downloadr, which are built on Mono, to be not very stable or well developed, but this could just be a perception rather than a limitation of the platform.

Saturday 7 October 2017

Why I no longer have confidence in Flickr

Flickr used to be a great photo site. It was started in 2004 in Canada, but it didn't stay independent for very long: Yahoo took it over in 2005, and for quite a while it looked good. It was quite distinctive from some of the other photo sharing sites, and has retained a quirkiness all of its own. The problem is that Yahoo has had a lot of problems in recent years, and properties like Flickr have suffered along with the parent as they fall further into Google's shadow.

I thought Flickr was pretty good until relatively recently. I even set up a new home site for some of my photos as a result of changing my core online identity this year, thinking everything was well with Flickr. What has shaken my faith in them has been a total failure in customer support. If you try to contact them using the methods provided on their website, you cannot get a reply. Your ticket will go into what looks like a regular customer support queue, but you will never get a response. Therefore, it's practically impossible to get any issues resolved.

The biggest concern I have had with Flickr, which I only just discovered last week, is that they have removed some of the photos from my albums. I only found this out when I tried to transfer the photos to a new Flickr account, and received messages that a few images were considered "infringing content" and had been previously removed. I then checked and found that the pictures in question had been replaced by templates with the text "this photo is no longer available" printed on them.

I haven't been able to determine how the photos came to be removed. The photos were unremarkable, and the most likely scenario is that malicious allegations of copyright infringement were made by persons unknown in order to force the images to be taken down. One album with around 2000 pictures in it had 200 images replaced by this template. Simply put, Yahoo decided they did not need to notify me of the reporting of the images: they removed them without advising me in the slightest. US law requires notification of a DMCA-related takedown of an image, and NZ copyright law also requires this. If Yahoo's case is that neither of these laws applied, then they are responsible for informing their users under what other circumstances and what other law they consider themselves empowered to remove images.

Needless to say, Yahoo has not responded to any requests in relation to this issue. I have just filed another support request, to move ownership of the NZ Rail Maps Flickr albums to a new Yahoo ID. This process used to be available directly from Flickr account settings but has been removed; you now follow a process that involves filing a support request with Yahoo's helpdesk. Of course this has gone unanswered like all the other requests.

More than any other reason, this situation has precipitated the decision to register a domain name for the NZ Rail Maps project. It has come out of the realisation that Flickr, which has been where I hosted the map tiles for a long time, cannot be depended upon any more. Google Photos will probably end up being where a lot of the tiles are hosted in future. I am still thinking about whether to ditch the rest of my Flickr albums and move all my stuff to Google Photos. At that time I would still follow some people on Flickr, but not use it for anything else.

Thursday 5 October 2017

Upgrade mainpc to Debian 9.1 [2]

Well, as usual I have decided to rush in boots and all and get Debian onto this computer. Creating the pen drive for the netinstall image that I previously used to create VMs was a little tricky until I realised it had to be in FAT32 format rather than ext4. The Debian installer threw up a warning about using the Unetbootin tool to create the pen drive, but it worked flawlessly. The installation is practically the same as Xubuntu except you get a choice of desktop environments, in this case Xfce. It being old hat, I soon had the partitions set up on the SSD (2x 60 GB: one the boot partition and the other swap) and the installer flew along as expected. One of the reasons for wanting to change the installed OS is some issues that have come up with Xubuntu 17.10, which naturally proves one has to be careful about installing the latest bleeding edge in a production environment. For this reason I decided not to install the testing repository and will find some other way to update Xfce if it is an old version.
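The pen drive preparation can be sketched roughly like this; /dev/sdX is a placeholder for the actual device name, and the dd line is an alternative route that skips Unetbootin entirely:

```shell
# Hedged sketch: prepare a FAT32 pen drive for the Debian netinstall image.
# /dev/sdX is a placeholder - check the device name first, this wipes the drive.
sudo umount /dev/sdX1 2>/dev/null
sudo parted -s /dev/sdX mklabel msdos mkpart primary fat32 1MiB 100%
sudo mkfs.vfat -F 32 /dev/sdX1
# then point Unetbootin at the netinstall ISO and /dev/sdX1;
# or, since Debian ISOs are hybrid images, write one directly instead:
# sudo dd if=debian-9.1.0-amd64-netinst.iso of=/dev/sdX bs=4M status=progress
```

The direct dd route avoids the installer's warning about Unetbootin, at the cost of the drive no longer being usable for ordinary file storage until repartitioned.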

Once the base install is completed, the next step is, as usual, to set up the RAID array, which we do by logging in as root and then installing and configuring the RAID software, just as I have done numerous times before. I'm being incautious by not having made a backup before starting the install, but I did do a backup a couple of weeks ago, there are two disks in the array that are exact copies of each other, and it's unlikely to be an issue to get the array working again. So actually it was very straightforward. For the rest of the day I will just be tweaking the installation and re-installing bits of software. As usual, as soon as I pointed /home at the RAID array and logged on as myself, everything came back, especially the Xfce panel and menus, and it is hard to see any difference from Xubuntu. One of the key things I can do with this version of Debian is install a stable release of Qgis natively alongside the most up to date version; at the moment it is building an installation from source that should work OK, as it has on all the Debian VMs I have put together up until now.
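The RAID step can be sketched as below; the device names are placeholders, as the post doesn't specify them, and this assumes the usual mdadm software RAID rather than any other scheme:

```shell
# Hedged sketch: re-adopt an existing mdadm mirror on a fresh Debian install.
# /dev/md0 and the fstab line are illustrative placeholders.
sudo apt install mdadm
# scan the disks for existing arrays and assemble them
sudo mdadm --assemble --scan
# record the array in mdadm.conf so it comes up at boot
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
sudo update-initramfs -u
# finally mount it as /home via /etc/fstab, e.g.:
# /dev/md0  /home  ext4  defaults  0  2
```

Because the array metadata lives on the member disks themselves, a reinstall of the OS doesn't touch it; assembly is all that's needed, which is why pointing /home at it brings everything straight back.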

[UPDATE] So MainPC has been OK. The PC in the bedroom crashed a couple of days ago, so I just jumped in and installed Debian on it as well. This is a bit tricky to do in general because of the differences between Debian and Ubuntu, and it took a bit more work to get it up and running. I still have a bit more software to install on it to finish it off, but it is going fairly well. I don't have a plan to migrate all the computers, and will only address each one when necessary. So I still have one PC running Xubuntu and have no plans to change that.

Upgrade mainpc to Debian 9.1 [1]

I wrote about this last time. Like all the other times I have upgraded, this will be a multi-step process because of the various things needed on the system. However, none of my other computers will be changed from Xubuntu. I don't have any real ideological reasons, but I do want a system that is more widely supported than Xubuntu and yet still has the Xfce desktop, which is light and fast. I remember that I tried Debian once before and didn't like it for some reason, but I think that time is now past and I will go ahead with the upgrade this time. All of the other computers will probably stay with Xubuntu (Artful currently).

To get more up to date packages, the system needs the testing repositories added to apt. This is simple enough to do with a line or two in /etc/apt/sources.list:

deb http://ftp.nz.debian.org/debian/ testing non-free contrib main
(you may also have a deb-src line as well)
which gives me access to the NZ mirrors of the testing repositories. When I put this into my test VM and ran apt update, it reported 778 new packages available. Whether all of these are actually required is somewhat moot, as the system is pretty well stripped down to the minimum required components - which is probably going to be the case anyway. There were actually 892 gets needed, which took 11 minutes in total to download. About the same amount of time again was needed to install everything.
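If the aim is to avoid moving the whole system to testing, apt can be told to keep stable as the default release and pull only selected packages from testing. This is a standard apt technique rather than anything the post used, so treat it as a hedged alternative:

```shell
# Hedged sketch: keep stretch (stable) as the default release so apt upgrade
# does not move everything to testing after the sources.list change above.
echo 'APT::Default-Release "stretch";' | sudo tee /etc/apt/apt.conf.d/99default-release
sudo apt update
# then install just one package (and its dependencies) from testing:
sudo apt install -t testing xfce4
```

This trades the 778-package mass upgrade for a targeted one, at the cost of occasionally having to chase dependencies into testing by hand.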

So this is just an early stage of looking into it; there are numerous steps to follow, software to reinstall, and various disruptions etc.

Wednesday 4 October 2017

How to set a custom display resolution in Xubuntu

This is the way I mirror the pair of screens on my bedroom PC. One screen is 1360x768 (a little odd) and the other is 1680x1050. The default mirroring in Xfce will only run them both at a 4:3 resolution like 1280x1024, since this is the highest resolution they are both compatible with, even though the aspect ratio is wrong and part of each screen goes unused.

The X.Org server ships with a command line tool called xrandr, which is capable of changing display settings on the fly. In this case the command we want is either of the following:
xrandr --fb 1360x768 --output VGA-0 --mode 1360x768 --scale 1x1 --output HDMI-0 --same-as VGA-0 --mode 1680x1050 --scale-from 1360x768
Alternate form (which I also tested): 
xrandr --fb 1680x1050 --output HDMI-0 --mode 1680x1050 --scale 1x1 --output VGA-0 --same-as HDMI-0 --mode 1360x768 --scale-from 1680x1050
Which one you use depends on which screen it is more important to run at native resolution. At the moment I am using the first option, as it gives a native-size picture on the 1360x768 screen, which is the easiest bedside screen to read. The important difference over what xfce4-display-settings can achieve is that one screen can be scaled to match the other. It would also be possible to have some oddball configurations, like three displays with two of them forming an extended desktop and one mirroring one of the others.

Running this command only lasts until the next reboot. To make it take effect at every startup, it needs to be put into a file that the LightDM display manager executes when it initialises.

In this case, after looking at the Ubuntu documentation on the wiki, I made a file in /etc/lightdm/lightdm.conf.d called 50-myconfig.conf.

The contents of this file are just two lines:
[Seat:*]
display-setup-script=xrandr --fb 1360x768 --output VGA-0 --mode 1360x768 --scale 1x1 --output HDMI-0 --same-as VGA-0 --mode 1680x1050 --scale-from 1360x768
Basically, display-setup-script is a key that tells LightDM to run the given command after it starts the X server; the command is the xrandr invocation mentioned above.

Just as an aside, right now I am looking at moving MainPC to Debian 9.1 with Xfce as the front end. The issue to be resolved is how to get a more up to date version of Xfce than the default in the Debian repository; this will probably entail using the unstable repositories. This will all be tested in a VM before migrating. No other computers are planned to be shifted; it is mostly about getting the best system on the PC I use for the most work.

[UPDATE] When I reinstalled the bedroom PC with Debian recently, I had to put this setting back in. That PC also uses Xfce as its GUI and the settings were the same, except I put the display-setup-script line into the /etc/lightdm/lightdm.conf file. An interesting twist was that I had to install some extra packages for the Radeon drivers before xrandr could detect the display settings and ports properly. Since there have been questions over video performance on this computer with late releases of Xubuntu, it will make an interesting comparison. This tiny PC is due for a makeover soon with an ASRock Q1900 board, whose Intel chipset and graphics will help a lot because of better support in Linux.

Friday 29 September 2017

Using VNC for remote control again

Almost exactly one year ago I wrote about using VNC with a couple of home computers. My computer arrangement changed after that and remote access was not needed between two rooms of the house. However, this week I have decided there are scenarios for remoting from the bedroom to the lounge, where the three main computers are. Mainly, it will enable me to do some computing stuff and have a devotional time at the same time, as the bedroom is set up for devotional time and there are many small tasks that can be done on the computer without me sitting in front of it all the time; so it enables me to increase my devotional time, which is what I really want to be able to do. It is also better for being able to work on the computer while in bed, so that I can avoid staying up way past my bedtime as I have at times. Bed is a much better place to be if you are tired; falling asleep at the desk is risky, as many times I have fallen off my chair and landed heavily on the floor.

Remmina as the client and x11vnc as the server are the combination used, just as my previous post described. Remmina presents the screen as two windows side by side, and you can just move the mouse to the edge of the monitor to scroll to the other screen. I will not be setting up remote access to the other computers in the lounge, as only MainPC really justifies this; for more in-depth stuff I can just work in the other room. Whether working from bed or sitting in front of a music keyboard, a multimedia computer keyboard is not very productive compared with a regular desk setup, but I won't be doing this at the desk: even though it was fitted in the past with a keyboard slide, it has since been lowered to the right height for playing the music keyboard. In effect this option is really only for relatively simple tasks that do not require a lot of typing or a regular mouse in place of a touchpad, and I don't want the distraction of a full keyboard/mouse setup in the bedroom because it is of secondary importance. I already had the E350 HTPC set up to play videos and it has adapted easily to this new role with a pair of Dell 22" screens. Unfortunately at this stage I cannot use the 1366x768 Sony 32" monitor as one of the screens, as that resolution is not available when mirroring with the 1680x1050 screen, so it is three screens in the bedroom for now.
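The server side of this setup can be sketched as follows; the hostname is a placeholder and the flags are one reasonable set of choices, not necessarily the exact ones used:

```shell
# Hedged sketch: share the existing MainPC desktop over VNC with x11vnc.
sudo apt install x11vnc
# store a view password once (writes ~/.vnc/passwd by default)
x11vnc -storepasswd
# attach to the running X session; -forever keeps serving after disconnects,
# -shared allows more than one viewer at a time
x11vnc -display :0 -rfbauth ~/.vnc/passwd -forever -shared
# then point Remmina on the bedroom PC at mainpc:5900 using the VNC protocol
```

Unlike a dedicated VNC server, x11vnc exports the session already running on the physical screens, which is what makes the two-windows-side-by-side view in Remmina possible.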

I had already been using a second screen with the computer (two screens mirrored) for a dual purpose: enabling me to look at PDF music charts with my Casio keyboard, and having the computer on while playing music for night-time intercession (a common scenario because of the time differences in international intercession), since I can just turn my head to the opposite side of the bed and the light from the screen won't keep me awake. At a time when it is all-important to have more devotional time, being able to use this setup is really useful. This week I have been doing lots of small repetitive tasks on the PC that don't really warrant a lot of time sitting in front of it but still have to get done, so being able to retreat to the bedroom is the best of both worlds, and I am sure it will continue to be in the future.

This is the first and last blog post written using this arrangement, as being limited to two-finger typing on a multimedia computer keyboard in bed is very slow and tedious.

Wednesday 27 September 2017

Useless Flickr processes and tools

Today I found that Flickr has been flagging some of my own personal photos as "infringing content". I only discovered this when I downloaded some albums from one of my sites and tried to upload them to another. I got a few emails about content being removed that had previously been marked as infringing, and discovered that for a few photos what I had actually downloaded was the replacement image Flickr puts in place of the original, which says "this photo is no longer available".

Whilst Flickr has obviously used some process to identify images they believe are infringing, they never gave me an opportunity to respond to the claims of infringement. Instead this has happened automatically and without notice. I checked one of the albums I had downloaded and it contained more than 200 of these replacement images. They only came to my attention because I had unknowingly attempted to upload them again.

It is highly likely Yahoo uses an automated system to scan photos and removes them completely automatically, without human review. Yahoo will then only respond if the owner finds their photo has been removed and complains. At the scale this could be happening, some people would simply not bother complaining, given all the steps they have to go through to complain to Yahoo.

The Flickr uploader that you get for paying Pro fees is also a crock. As I have found, once a photo has been uploaded it cannot be uploaded again, even if you delete the photo off Yahoo or uninstall and reinstall the software. It also deduplicates your photos, but doesn't bother putting the original of a duplicate into the album that contained the duplicate. Finally, it makes the photo private. Evidently this is a very simple piece of software that does not serve the purpose for which I used Flickr uploaders in the past, i.e. creating albums and uploading photos for public display from my computer, and as such is fairly useless and not worth the subscription fee.

I have also found that an app I used to automatically upload photos from my Instagram to my Yahoo albums has stopped working lately, and I believe Yahoo has started blocking these apps because they are upload tools and only Pro accounts are allowed upload tools.

As I have found in dealings with Facebook, with these large corporations you are guilty until proven innocent; they shoot first and ask questions later. They frequently employ automated tools to block accounts and will just fall back on terms that let them delete or disable your account any time they feel like, without any recourse available to you. Facebook is particularly hard to contact if your account is disabled, although there is now a system for your friends to raise a case on your behalf. Facebook will even suspend your account if their automated systems detect something that merely looks dodgy, like using a third party app to forward posts from a blog, or liking too many pages within a certain period of time.

These types of experiences are why these large cloud services are coming under increasing scrutiny from regulators: draconian behaviour that locks people out without justification, every day.

Tuesday 26 September 2017

Flickr uploaders and downloaders

So, I'm moving some of my Flickr photos from an older account into a new one. This means I have to download all the existing photos, and then upload them again on the new account.

For downloading, go and look on the internet for Flickr Downloadr, which is cross-platform through Mono for Linux as well as Windows. It can be made to work some of the time, but I am seeing constant crashes when working with some of my larger albums. The biggest problem with this software is that you can only download one album at a time, and it will often crash partway through, forcing me to download 100 photos at a time.

Still, so far I have downloaded more than 10% of the 55,000 photos on this account with this software.

Then, to upload again, you can use the built-in upload web page, which constantly has timeout errors, or pay $6 for a one-month Pro subscription which gives you access to the Uploadr app; that is what I am doing. It is pretty reliable, but it will still take all day to do the upload, maybe longer.

So I hope the download will be all completed today, and maybe the upload to the new account will take 24 hours or something. Then I can delete the two old accounts and just keep the two current ones.

Apart from the Downloadr crashing every few albums, I have deleted the first batch of uploads twice. This is because some albums contain large numbers of photos, some duplicated, and since the Uploadr can deduplicate, it gives me a chance to remove some duplicates automatically at the same time.

Flickr Downloadr, while good when it works, has obviously suffered from a lack of resources: the download package has not been updated in 10 months, there are no new packages in the repository, and xenial is the last supported version of Ubuntu. Hence I doubt these bugs will be fixed any time soon.

UPDATE: Unfortunately Flickr Downloadr has turned out to be a bigger piece of crap than I thought: it completely refused to work after downloading about 25,000 images, so I have to come up with a way to get the rest. I will probably have to get the originals from somewhere else, like the Picasa albums I downloaded from Kahukowhai last week.

Meanwhile Flickr Uploadr has some kind of inbuilt history that I can't get rid of: I have uploaded everything and then deleted it again, and now it says everything has already been uploaded when it hasn't. LOL. All very messy.

Monday 18 September 2017

Switching Google accounts

Just to make my life more complex (hopefully simpler in future), I have decided all my email accounts will consist of three Google accounts, classified as low, medium and high priority. This involves two completely new Gmail accounts and one existing one. At the moment I am working on switching my default email over to the new low priority account, the catch-all for all sorts of web site signups. Switching over Google accounts is quite a big process. The fact this blog is going up under one of my new accounts is part of the completion, which has involved changing all the blogs over to two of the new accounts (I always have two author accounts on each blog).

Some of the big steps undertaken have been transferring and deleting email, and downloading all the Google photos. Google provides an email and contacts transfer feature in Gmail, which I used to transfer multiple accounts, including my soon-to-disappear Microsoft accounts. As content has been moved, old content has been removed. The next step was to download 16 GB worth of photos from Google Photos. The reason there was so much in there is that it included all my old Picasa photo albums, long since forgotten about, which evidently were migrated to Google Photos some time back. I currently use Flickr to host these photos and have not done anything with the Picasa stuff for a long, long time. Consequently there are many duplicates of Flickr albums, and these will not be going back up on the Google Photos of my new account; only the photos taken on the phone are planned at this stage.

I had to use Takeout to get the photos (there seem to be nearly 100,000 of them) and a download manager to pull down the nine zip files totalling over 16 GB. Apart from Picasa, I also briefly trialled the Google photo uploader to back up photos from my PC; this was abandoned when I realised it was just a backup tool, not a sync tool for public albums like Picasa was. I will have to find the phone photos and download them back onto the phone, because resetting it (see below) wiped them out. The biggest potential problem is whether deleting these photos will have an impact on old blogs: Google stores blog illustrations in Google Photos, so deleting my old account has the potential to remove photos, possibly including ones that were used in blogs.

Next steps include migrating calendars and the Chrome profile, and this will take some investigating and testing. I have also factory reset my phone to force it to use one of my new accounts, so I have had to reinstall all its apps. There is still a lot of account migration work to do and it will be some time before I actually delete anything. But I did delete a very old Microsoft account going back about five years, and it may well be that there are still some very old Google accounts forwarding occasional emails that will come to light as this project progresses.

Tuesday 12 September 2017

Qgis testing

Since my last post it seems fair to say that Qgis is, like a lot of open source projects, largely a community effort. Some people may be paid, but in reality a lot of contributions are voluntary. I shouldn't be too demanding with issues, because they are working to a deadline at the moment to produce the 3.0 release.

I have built a Debian VM with the latest development release (1caaa2e) and will be looking at what my projects need migrated to work with it.

Monday 11 September 2017

Debian vs Xubuntu: The ideal Qgis VM production environment

So, having discovered that a bug in the Qgis composer is the reason my home-built Qgis VMs crash, and not any other reason, this idea of building from source has actually not been such a bad thing at all. It has left me with something that actually works, all in one VM, and one that is legitimately licensed, because I don't actually own any Windows licenses and don't much like relying on legacy licensing rights to Windows environments that I will sooner or later have to bring up to current.

This was achieved by building what seems to be the last reliable build (313ec55) of the Qgis development master, on versions of both Xubuntu and Debian that can run a release of Qt without a silly floating point rendering bug that never existed until quite recently. As it happens, the buggy Qt was the version deployed with Xubuntu 16.04, which is the last version that can auto-install 313ec55 from the packages without installing the latest version. So building from source on a later version of either Xubuntu or Debian was the only way of getting this fairly stable, few-bugs 313ec55 build of Qgis 2.99 that I can use to develop my 2.99 map projects, and use for virtually every part of the process without issues. At the moment there is still a bug relating to saving new projects, or new versions of a project, that requires the use of Windows, because the 2.99 project format is incompatible with 2.18, in which this particular bug has been fixed.

The 2.99 development masters are of course part of the process of developing the new 3.0 release of Qgis, and I guess a 3.0 version will be out soon, probably in a couple of months; hopefully all the bugs will have been fixed in it.

However, a big discouragement is that the latest builds have introduced an incompatible project format, and the developers have said they do not care that their new software is incapable of reading and converting older 2.99 project formats, because they don't guarantee to support projects created with previous development versions of the software. This will have to be followed up, to see whether this is a new policy and whether it is reasonable.

In general, as far as the VMs go, both Debian 9.1 and Xubuntu 17.10 are very efficient with resources. A 12 GB VM with Qgis loaded, but no project, uses 1 GB of memory in total for the OS and the software. This is a lot better than Windows bloatware, where the OS gobbles up memory like there's no tomorrow. You can build a really lean VM or PC optimised for low resource use, and that's where a GUI environment optimised for low resource use like Xfce (which is what I run with Debian on my Qgis build VMs) really shines. That is one of the reasons I stopped using Linux Mint (which I have almost forgotten): its GUI gobbles resources unnecessarily.

Meanwhile the map work continues.


Friday 8 September 2017

Building Qgis from source code

UPDATE 2: Well, in spite of my comments below, it seems I have discovered a new bug in the 2.18 LTS version of Qgis, as well as in the home-built master, with the composer crashing the software when attempting to render a map. It may well be that the bug is in this master, the 2.18 LTS and some other versions of Qgis, and at the moment I am focusing on attempting to test and document the bug for the Qgis project. So it looks like the crashes in Debian were from the same issue, and that it is possibly a bug specific to the Linux versions, though it could also be an issue that affects all versions.

It looks at this stage like I will be able to use either of those home-built VMs to do my work, as the crashes are probably an inherent limitation of Qgis as a whole rather than specific to the VMs I have built from source, but it is still a perplexing issue.

UPDATE: Whilst building from source is an interesting experience, the outcome has not been more reliable operation of my home-built VMs. Building 313ec55 on both Debian 9.1 and Xubuntu 17.04 has resulted in unstable working environments that crash a lot more than the 16.04/313ec55 and Windows 7/11812846 VM environments I have used up until now. My decision is therefore to continue developing the project on 313ec55, with another Linux VM running the 2.18 stable release used for production output, as 2.18 can read enough data from a 2.99 project file to properly render the output.

With a FOSS package like Qgis, there will be times you want to build it from source. For me this is about getting a working version of the development master, because newer versions sometimes have bugs, or are unavailable for the OS running on your computer. Whilst it may sound like a big deal to build from source, it is surprisingly straightforward.

In my case I have always tried running development versions of Qgis, but it gets risky when a new version comes out with major bugs, as is currently the case with the latest builds, which trash the CRS settings and fail to load some layers (these are possibly connected).

 The steps for building Qgis in a nutshell are:

  • Set up a development environment. In this case, a Vbox virtual machine is the best development platform. Once you have one set up with the basics you can easily clone it, and they don't need much memory or disk space. 
  • You'll need to install some packages specific for the Qgis build.
  • Set up some folders to hold the build files, the destination installation, and change a few settings here and there.
  • Get a build of the source code from the appropriate branch on the QGIS github site. In this case the master branch. Master builds happen every day, so there are plenty to choose from.
  • After you get the zip file downloaded, check the version number in CMakeLists.txt to ensure you have the right version. There are currently 48 different branches on Github, all at various versions too. Near the top of the file you will see, in the case of 2.99, variables spelling out "2", "99", "0", which tell me I have build files for 2.99.0 as intended. My first build attempt on a Xubuntu zesty VM accidentally got a 2.18 build instead of the master. 
  • I have tested both Debian 9.1 (stretch) and Xubuntu 17.04 (zesty) VM environments. In the end, Xubuntu was more stable than the Debian VM with Xfce, which had a few problems and crashes. Debian would be fine as long as it was stable; I found it straightforward to set up and it went well until it became unstable and had to be reset, after which it was extremely slow to stabilise.
  • Set up the CMake environment using ccmake. Mostly use the default settings, but set a variable to provide the destination folder the build will be installed to for actual usage. This is important because the default will install a new build over the top of an old one, whereas we want each build to have its own install folder, so we can (theoretically) have more than one of them.
  • Tell ccmake to generate the makefiles for the build.
  • Run make to build Qgis. This will take a while, perhaps an hour to 90 minutes. This part is actually quite painless considering how many steps are involved, with thousands of lines of output scrolling up the screen. Most files take only 5-10 seconds to build, but occasionally everything slows down as something really big is compiled. There are more than 13,000 source files in the build, occupying 436 MB of disk space. At 35% through, make had produced 6,500 new files occupying 200 MB in about 20 minutes, so there is a lot of file crunching going on.

  • In my case I chose build 313ec55, a release with which I have had some success on 16.04. The main problem running it on 16.04 is the version of Qt, which has a bug that causes decimals to be rendered without rounding; a distance you want displayed as 214.38 km will often show as 214.37999999999. To address this I followed the instructions (despite warnings) to build against Qt5, and the version installed from Ubuntu's zesty archives was 5.7.1, which includes the fixes.
  • The steps listed for building against Qt5 with this release differ slightly from those for Qt4 (the recommended option), but the release build of 313ec55 installed in some of my other VMs was built against this edition, so there shouldn't really be an issue. So far there hasn't been.
  • The build went well and I am now testing this VM. Since it works, and since I have fixes for the limit on the number of files that can be opened at once (see a recent post), this custom-build VM looks like the ideal environment for both developing and publishing the maps: it removes the main restriction that has prevented publishing from being done with a Xubuntu VM based version.
  • It looks like many of the more recent builds (from June onwards) have serious issues with project data not loading correctly, so I can't expect to find a more reliable later build, and they have other issues with the designer that I don't like either.
  • 313ec55 still has an issue with relative paths not being followed, and I have not been able to find out which later master build fixed it. If I have to save a new project at any stage, a Windows VM running 1182816 will have to be used, as that build doesn't have this issue.
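The build steps above can be sketched in shell form. The folder layout, source path, and dependency list here are illustrative assumptions, not exact commands from my notes:

```shell
# 1. Install build tools and the Qgis build dependencies, e.g.:
#      sudo apt install build-essential cmake cmake-curses-gui
# 2. Create folders for the build tree and a per-build install target,
#    so each build keeps its own copy instead of overwriting the last:
mkdir -p ~/qgis-build ~/qgis-install
# 3. Unpack the GitHub zip, then confirm the version variables near the
#    top of CMakeLists.txt ("2", "99", "0" for a 2.99.0 master snapshot):
#      grep CPACK_PACKAGE_VERSION ~/qgis-source/CMakeLists.txt
# 4. Configure with ccmake, setting CMAKE_INSTALL_PREFIX to the install
#    folder, generate the makefiles, then build and install:
#      cd ~/qgis-build && ccmake ~/qgis-source
#      make && make install
```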

Tuesday 5 September 2017

How to change the account order in Thunderbird

Thunderbird doesn't provide a user interface for changing the order of accounts in the left-hand pane, but the Config Editor can be used to change it.

See the guide here

In my case I have 11 accounts listed, including the two special ones, so basically nine email accounts. I expect to tidy all that up and have only three or four listed by the end of this year. Being able to put the most important ones at the top has been good.
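From memory, the preference involved is mail.accountmanager.accounts, a comma-separated list of internal account keys; moving a key earlier in the list moves that account up the pane. The key names below are illustrative:

```
mail.accountmanager.accounts = account3,account1,account2
```

Restart Thunderbird after changing the value for the new order to show.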

Thursday 31 August 2017

Linux is still miles better than Windows any day

My Windows PC doesn't even get turned on that much now. I only really need it for a few holdout applications like the cameras and IrfanView. At the moment I am using it to author some of the maps, but that isn't happening all of the time. It can cope with aerial images coming over the network and is quite quick compared to the virtual machine I was using, but Linux can process them a lot faster even in a VM, so probably better still on a real PC.

Now that I have been reassured that the Qgis issue with many open files isn't a design flaw in the Linux platform, I can go back to a Linux VM for map design, and it is faster than the Windows VM I have used until now. The main problem is that on Linux we don't get to choose which build of the master gets installed, so needing an older master can be a problem. That explains why I use a VM running Xubuntu 16.04: it will always pull the 313ec55 build if you choose not to use the ubuntugis repository. Alternatively, I could pull 313ec55 from the repository and build it from source on 17.04, or even on mainpc, and have it running reliably and doing everything I need.

Now here is a great piece of software for analysing disk space usage: Baobab

This does a great job of analysing disk space usage graphically on a PC. As you can see, I have an issue in that my home folder is using 1.5 TB on a system with at most 2 TB of disk. Sooner or later I will have to clean out some stuff, like all those aerial photos I have downloaded to draw maps, and some of the virtual machines I have installed under VirtualBox.
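Baobab is graphical, but a similar first pass can be done in a terminal with GNU coreutils; a minimal sketch:

```shell
# Summarise the first-level directories under the home folder,
# largest first (GNU du and sort)
du -h --max-depth=1 "$HOME" 2>/dev/null | sort -hr | head
```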

Fix for Qgis unable to open lots of aerial images on Linux editions

After some investigation the lovely Qgis people have identified the issue as a limit on the number of open files. This is a deliberate security feature designed into Linux. Fortunately it is not, as was first alleged, an architectural limitation of Linux.

I have used the prlimit command with --nofile to increase the limit from the default 1024 to 10000, which deals with the problem most effectively. To make the increase permanent, I changed the setting in /etc/security/limits.conf
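A minimal sketch of the commands involved; the PID and username are placeholders, and prlimit is part of util-linux:

```shell
# Show the current per-process open-file limit (commonly 1024)
ulimit -n
# Raise it for an already-running process (PID 1234 is a placeholder):
#   prlimit --pid 1234 --nofile=10000:10000
# To make the change permanent, add lines like these to
# /etc/security/limits.conf (replace "patrick" with the login name):
#   patrick soft nofile 10000
#   patrick hard nofile 10000
```

A fresh login is needed for limits.conf changes to take effect, since they are applied at session start.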

This means I can go back to using a Linux virtual machine instead of a Windows one to work on NZ Rail Maps projects. Specifically I will be using a VM running Xubuntu 16.04 and Qgis 2.99 build 313ec55, which has the fewest issues of the Qgis 2.99 builds for Linux. It is actually the last master build available for Xubuntu 16.04 due to architectural changes. Testing the Linux VM on a project with a large number of aerial images open, Qgis handles the images much faster in Linux than in Windows.

I just can't remember right now whether there were other issues in Linux that made me switch to Windows; from memory it was mainly about being able to run different builds on the same computer (which was a physical computer for a while). This is much harder in Linux, as the packages are not configured for side-by-side installs; updating to a new version always installs over the old. Running different VMs is one way around it. Windows editions also don't auto-update the way Linux ones do (when you run apt upgrade), which is handy. EDIT: there was one issue, to do with the rendering of distances on station labels, with 21.2 commonly rendered as 21.1999999, for example. This means I need another VM running a different edition of Qgis to produce the outputs.

Coincidentally this fix came on the same day as another longstanding problem was resolved at home, this one concerning the power supply to my house, which has been unreliable, with significant voltage drops on two previous occasions. The last, in February or March this year, resulted in a call to Orion, who claimed that since no other customer had experienced the fault, it must be inside the house and therefore outside their responsibility.

The problem started again this week and had been observed on three days up until yesterday, but being an intermittent fault it was, as usual, difficult to diagnose. This time I was helped when I discovered that my next-door neighbour, in the other flat on this property (our flats have separate phase wires and a common neutral from the power pole), was reporting the same issues. When we managed to get Orion to come out, they said the most likely scenario was a faulty neutral connection. Over two visits in the same evening they replaced all three neutral joints: the one on the outside of the house, the one on the pole directly outside, and the one on the pole across the road where our power is actually tapped off the lines network. We had to call them back when replacing only one of the joints didn't fix the issue, but despite the time lost that evening, the supply has been rock solid since. The furthest joint was found to be loose and corroded, so that was most likely the cause.

Thursday 24 August 2017

Artful release date set for 19 October

So the date for the Artful release (17.10) of Xubuntu has been set for 19 October, about eight weeks away. I have been running it on all three of my Linux computers for several months and have noticed only a few minor issues. 

Mainly the issues boiled down to:
  • In a number of cases I have had to configure third-party repositories to use zesty as the release, as few of them seem to have considered making a version available for artful, even for development software like Qgis.
  • Kodi not being available in a compatible version from the official Kodi repositories; solved by using the version from the Debian repository.
  • There has been an issue on the mediapc with Kodi constantly freezing after restoring from hibernation.
  • VirtualBox did not update to the latest version automatically from the zesty repository, but the version downloaded manually from the website did. The manual update was necessary because the installed version of VirtualBox stopped working after a major package upgrade.
In each case where there is a software freeze or some other challenge, I try updating to the latest release. Almost every time I check, there are hundreds of new packages. Three hundred new packages a month is not unusual, and installing them all can easily take half an hour or more. However all these updates have gone smoothly.
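For example, a third-party repository pinned to the zesty branch looks something like this (the file name and URL are illustrative, not an exact repository line):

```
# /etc/apt/sources.list.d/qgis-dev.list
deb https://qgis.org/ubuntu-nightly zesty main
```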

The mainpc has been much more reliable at hibernating since I got rid of the Nvidia four-head card and fitted the cheap two-head card that is supported out of the box by the Nouveau drivers. If I should want more than two displays in future, I could put a second two-head card alongside the existing one, as the Gigabyte H97-D3H mainboard has two PCIe x16 slots, although I think the second slot only runs at 4x; for a cheap card that may well be fast enough. Alternatively, the onboard graphics would also be an option, as Intel graphics is well supported in Xubuntu and the hardware supports three displays out of the box. However, apart from map drawing there has been no ongoing call for three displays, and with the maps project scaling down a lot at the end of this year there is no real urgency. In fact there is a much simpler solution: now that each PC on the desk has its own keyboard and mouse, I can simply use, for example, the Windows PC's display as a third display for something.

VirtualBox running Qgis in a Windows 7 VM has worked out very well for completing the major part of map development, running an older version of the Qgis development master because of major problems with later versions. The best thing about the Windows versions of Qgis, apart from being able to handle more resources (250 or more GeoJPEGs loaded simultaneously), is that they don't automatically update the way Linux ones do (if you have specified the repositories in your sources.list file or imported an update list into /etc/apt/sources.list.d), so I don't have to worry about an update breaking Qgis. I won't be updating that VM with any newer Windows release of Qgis before the end of this year, and I don't know if I will bother with the newer Linux versions either; the later development releases, with their many challenges and project file corruption issues among other things, are not something I can be bothered wasting time on at present. This VM has been allocated 12 GB of RAM, and it is surprising how much this drains mainpc's resources. I guess the browsers are sucking up all the RAM in the computer; absolutely typical, as they are major resource hogs, even Firefox Developer with e10s.

The bedroom pc is going to be updated with a new motherboard I am being given, an Asrock Q1900 Mini-ITX. The advantage is an Intel Bay Trail CPU instead of the AMD E350 in the current incarnation of this computer; AMD has now officially deprecated those graphics, so with the latest release of Xubuntu this computer has had a lot of problems with graphics stuttering. The Bay Trail in the Asrock board means Intel graphics, which are much better supported under Linux. The board's mini-PCIe slot also gives me the option of adding a wireless card; again, as long as I stick to an Intel card it will be supported out of the box. Interestingly, Asrock have anticipated a few different uses for this series of boards: there is a Q1900DC model which, instead of the 24-pin ATX power connector, has a DC jack that accepts 9-19 volts DC from a laptop power adapter. If I were starting from scratch with an enclosure that didn't have its own power supply (the Antec I bought has a separate laptop-style adapter and a small internal board to produce the standard voltages), that board would be well worth a look.

Saturday 5 August 2017

The joys of running development software

Obviously I'm having a bit of fun with a GNU/Hurd VM, and the FreeBSD one was to see if I could get Qgis running on a platform other than Linux. That ultimately failed because I could not map the network share from MainPC. The Win98 VM is not running Windows 98 because I don't have an ISO to install from. ReactOS is another bit of fun, although it does work. And then there are seven VMs running Qgis: four stable versions, with some apparent duplication at 2.18, plus three different development versions.

Qgis map development

So as announced on my maps blog I am working to wind the maps project up by the end of this year and I am writing a series of articles that will be published in this timeframe. They will cover the Otago Central Railway but I am working concurrently on updating several different sections of the maps.

As far as general map editing with aerial photos goes, I have to use a Windows-based edition of Qgis because the Linux edition cannot load all the aerial photos at once. The Qgis team claims this is a Linux limitation, which I doubt; it may be a limitation of whatever cross-platform development system they are using.

I have been using my Windows 10 PC to do this, but it has been very slow work, and unfortunately I have concluded that this is because the computer's screens are off to one side, forcing me to work sideways, which slows me down. I can't afford this time loss when drawing maps, so I am updating a Windows 7 VirtualBox VM on my main computer so that I can do the maps on it; the VM can be given the same 8 GB of memory that the physical computer has.

The version of Qgis installed on this computer will be 2.99.0-9 (the Windows version number for an earlier development master that doesn't have the major issues some of the more recent masters have). There are quite a few advantages to using a VM on my mainpc: all the other resources needed for the maps are local to it, and they can all be loaded on the mainpc while the Win7 VM runs in just another application window. This is how I already do some work, with other VMs running older or different versions of Qgis on different Xubuntu releases.

One issue with both the VMs and the physical Windows 10 computer is accessing the map data over the network from MainPC. I sped things up on the Win10 computer by putting all the aerial photos on a local drive instead of loading them over the network, and I will set up the Win7 VM with a separate virtual disk for the same purpose. Of course, MainPC's disk space is being used up by the demands of this project: it has 2 TB of disk, and scary as it seems, three quarters of that is used as of now. Once this project is completed I expect to recover space by deleting some of the resources like aerial photos, since adding bigger disks would be a major expense: not only would the two disks in the computer have to be replaced, but also the two removable backup disks. The other option is to move all the music and videos to the mediapc, as they are essentially duplicated on this computer.

The Win7 VM took a bit of work to set up with two virtual screens, and copying the aerial photos across and loading them took quite a lot longer than I expected, but once it was up and running it has been very easy to do the maps on this machine, so it is working well.

Wednesday 2 August 2017

Download managers for Linux

Well, the day arrived when I needed a download manager to handle tricky downloads, so here we are. After some trial and error, uget, which is in the standard repositories, turns out to be a good GUI-based download manager. It can be integrated into Firefox with the FlashGot extension.

I also had to install Archive Manager (file-roller) to extract the downloads.

I have been busy again downloading truckloads of aerial photos from Koordinates to use with the maps, and the download manager has been needed because the downloads kept stalling and timing out. This seems to be an issue with the new Koordinates website, although they are trying to blame Linz for it. Looking at uget, some of the ten downloads I have done in the last day took five retries to complete. They varied in size from 1 to 5.5 GB.

Friday 21 July 2017

Debian GNU/Hurd 2017 edition

Just for a bit of fun I have set up a VM running Debian GNU/Hurd 2017 edition, just to see what it will do. The installer is much the same as the ones that come with regular Debian, and quite similar to the text-mode installer for Xubuntu. As with regular Debian installs you have a choice of GUI, and I chose Xfce for familiarity. I downloaded the Netinstall image to create the VM, and unsurprisingly for something pre-release there is only one repository, in the Netherlands. Downloads from it are quite slow, so it took a while to get all the files needed, as the Netinstall image (equivalent to a mini installer for Xubuntu) is a 159 MB ISO containing only what is needed to get the install running to the point where the rest can be downloaded from a repository.

Debian GNU/Hurd is quite interesting as the only well supported and current open source operating system based on the Mach microkernel. There is a very well known proprietary Mach-based system called macOS (OS X), which is probably the best known and really the only commercially successful implementation of Mach. It would seem that the purported benefits of microkernels have not attracted widespread support in the IT community at large, except where they have been force-fed to users of Mac computers.


The Google homepage in Midori, the default browser installed with Xfce on Debian GNU/Hurd.

I don't expect to spend a lot of time playing with this; it is of academic interest only, but it will be interesting to see what it can do in future.

Monday 17 July 2017

Xubuntu Artful

So now I have all three computers in my household running Artful, which everyone assumes will come out as 17.10 in a few months. The issues haven't been too major, but the latest big one was installing it on computers like mediapc, which has a problem with two screens of different resolutions; I have had to file a bug report on Launchpad. This is actually why I gave up trying to upgrade it in place: when it got to 17.04 the screen was all messed up and I couldn't figure out how to fix it at the time. I then tried the mini install of Artful Core, which had other significant issues (failing to load kernel modules), so I had to download the full install image for 17.10 and install it from a DVD (although a pen drive probably would have worked). After install I had to hook the RAID-1 array back up, as on mainpc, and then reinstall Samba to share the media volume as before.
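Re-sharing the media volume amounts to a small smb.conf entry; the share name and path below are assumptions, not my actual configuration:

```
# Fragment of /etc/samba/smb.conf
[media]
    path = /mnt/media
    read only = no
    guest ok = no
```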

The main issue when reinstalling software is that not everyone has an artful branch in their third-party repositories. For Kodi I used the version already in the Ubuntu repositories, which is branded "Kodi by Debian". For Qgis I followed the steps listed on the Qgis website for the development build, then configured sources.list to use the zesty branch of the repository; on mainpc this has resulted in Qgis updating itself through several different master builds as new ones are produced. On all three computers the video hardware is natively supported by Xubuntu, although the Radeon chip in the bedroom pc is significantly more limited in performance than the NVidia 210 cards in the other two. I think the Radeon issue probably stems from the recent change in AMD driver architecture in Ubuntu, with the open source drivers significantly compromised; higher-bitrate MP4s really struggle on it. In time I may replace that computer's motherboard with the Asrock Q1900 mini-ITX board, since Intel video chips are well supported by open source software, but it is not a high priority at present with my limited funds.

I have been running development versions of Qgis forever as well. Some of the latest versions on Linux have been more difficult to work with; there seem to be some architectural limitations in the software that restrict the maximum number of layers in a project. I doubt Linux itself has this limitation, as it is used to handle serious data on big servers and with high-end graphics and audio software. The Windows version of Qgis is more reliable, and I have kept it as a backup option both on my Windows computer and in a Windows VirtualBox, but for now running an older development version in a Linux VirtualBox is what keeps me sane as I work towards winding up the project at the end of this year.