Friday 29 September 2017

Using VNC for remote control again

Almost exactly one year ago I wrote about using VNC with a couple of home computers. My computer arrangement changed after that and remote access between two rooms of the house was no longer needed. This week, however, I have decided there are scenarios for remoting from the bedroom to the lounge, where the three main computers are. Mainly it will let me do some computing and have a devotional time at the same time, as the bedroom is set up for devotional time and there are many small tasks that can be done on the computer without me sitting in front of it the whole time. The main aim is to increase my devotional time, which is what I really want to be able to do. It is also better for working on the computer while in bed, so I can avoid staying up way past my bedtime as I have at times. Bed is a much better place to be if you are tired, as falling asleep at the desk is risky: many times I have fallen off my chair and landed heavily on the floor.

Remmina as the client and x11vnc as the server are the combination used, just as my previous post described. Remmina presents the remote desktop's two screens side by side, and you just move the mouse to the edge of the monitor to scroll across to the other screen. I will not be setting up remote access to the other computers in the lounge, as only MainPC really justifies it; for more in-depth work I can just go to the other room. Working sometimes from bed and at other times sitting in front of a music keyboard, using a multimedia computer keyboard is not very productive compared with a regular desk setup, but I won't be doing this at the desk either: although it was fitted with a keyboard slide in the past, it has since been lowered to the right height for playing the music keyboard. In effect this option is really only for relatively simple tasks that do not require a lot of typing or a regular mouse in place of a touchpad, and I don't want the distraction of a full keyboard/mouse setup in the bedroom because the computing there is of secondary importance. I already had the E350 HTPC set up to play videos and it has adapted easily to this new role with a pair of Dell 22" screens. Unfortunately at this stage I cannot use the 1366x768 Sony 32" monitor as one of the screens, because that resolution is not available when mirroring, so I cannot use a 1680x1050 screen mirrored with the Sony; so it is three screens in the bedroom for now.
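For anyone curious about the setup itself, the server side is just a matter of pointing x11vnc at the existing desktop session on MainPC and then adding a VNC connection in Remmina. The commands below are a minimal sketch rather than my exact configuration; the display number, password file path and default port are assumptions about a typical single-session setup:

    # one-off: store a VNC password so the session is not left wide open
    mkdir -p ~/.vnc
    x11vnc -storepasswd ~/.vnc/passwd

    # share the existing X session (display :0) on the default port 5900;
    # -forever keeps the server running after a client disconnects,
    # -shared allows more than one viewer at a time
    x11vnc -display :0 -rfbauth ~/.vnc/passwd -forever -shared

On the client, Remmina just needs a new VNC connection pointing at the lounge machine's address on port 5900.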

I had already been using a second screen with this computer (two screens mirrored) for a dual purpose: it lets me read PDF music charts at my Casio keyboard, and it lets me have the computer on while playing music for night-time intercession, a common scenario because of the time differences involved in international intercession, since I can just turn my head to the opposite side of the bed and the light from the screen won't keep me awake. At a time when having more devotional time is all-important, being able to use this setup is really useful. This week I have been doing lots of small repetitive tasks on the PC that don't really warrant a lot of time sitting in front of it but still have to get done, so being able to retreat to the bedroom is the best of both worlds, and I am sure it will continue to be in the future.

This is the first and last blog post written using this arrangement, as being limited to two-finger typing on a multimedia keyboard in bed is very slow and tedious.

Wednesday 27 September 2017

Useless Flickr processes and tools

Today I found that Flickr has been flagging some of my own personal photos as "infringing content". I only discovered this when I downloaded some albums from one of my accounts and tried to upload them to another. I then got a few emails about content being removed that had previously been marked as infringing, and discovered that a number of photos had come down not as my originals but as a replacement image Flickr had put in their place, saying "this photo is no longer available".

Whilst Flickr has obviously used some process to identify images it believes are infringing, it had not given me any opportunity to respond to the claims of infringement until now. Instead this has happened automatically and without notice. I checked one of the albums I had downloaded and it contained more than 200 of these replacement images. They only came to my attention because I had unknowingly attempted to upload them again.

It is highly likely Yahoo uses an automated system to scan photos and that it removes each photo completely automatically, without human review. Yahoo will then only respond if the owner of the photo finds it has been removed and complains. Given the scale at which this could be happening, some people would simply not bother complaining because of all the steps they have to go through to raise it with Yahoo.

The Flickr uploader that you get for paying the Pro fee is also a crock. As I have found, once a photo has been uploaded it cannot be uploaded again, even if you delete the photo from Yahoo or uninstall and reinstall the software. It also deduplicates your photos, but doesn't bother putting the original of a duplicate into the album that contained the duplicate. Finally, it makes the photo private. Evidently this is a very simple piece of software that does not serve the purpose for which I used Flickr uploaders in the past, i.e. creating albums and uploading photos for public display from my computer, and as such it is fairly useless and not worth the subscription fee.

I have also found that the app I use to copy photos from my Instagram automatically into my Yahoo albums has stopped working lately, and I believe Yahoo has started blocking these apps because they are upload tools and only Pro accounts are allowed upload tools.

As I have found in my dealings with Facebook, with these large corporations you are guilty until proven innocent; they shoot first and ask questions later. They frequently employ automated tools to block accounts and will just fall back on their terms, which let them delete or disable your account any time they feel like it, without any recourse available to you. Facebook is particularly hard to contact if your account is disabled, although there is now a system that lets your friends raise a case on your behalf. Facebook will even suspend your account if its automated systems detect something that merely looks dodgy, like using a third-party app to forward posts from somewhere else such as a blog, or liking too many pages within a certain period of time.

These types of experiences are why these large cloud services are coming under increasing scrutiny from regulators: their draconian behaviour locks people out without justification every day.

Tuesday 26 September 2017

Flickr uploaders and downloaders

So, I'm moving some of my Flickr photos from an older account into a new one. This means I have to download all the existing photos, and then upload them again on the new account.

For downloading, go and look on the internet for Flickr Downloadr, which is cross-platform: it runs on Windows and, through Mono, on Linux. It can be made to work some of the time, but I am seeing constant crashes when working with some of my larger albums. The biggest limitation of this software is that you can only download one album at a time, and it will often crash partway through, forcing me to download in batches of around 100 photos.

Still, so far I have downloaded more than 10% of the 55,000 photos on this account with this software.

Then, to upload again, you can either use the built-in upload web page, which constantly has timeout errors, or pay $6 for a one-month Pro subscription which gives you access to the Uploadr app, which is what I am doing. This is pretty reliable, but it will still take all day to do the upload, maybe longer.

So I hope the download will all be completed today, and maybe the upload to the new account will take 24 hours or so. Then I can delete the two old accounts and just keep the two current ones.

Apart from the Downloadr crashing every few albums, I have deleted the first batch of uploads twice. This is because some albums contain large numbers of photos, some of which are duplicated, and since the Uploadr can deduplicate, starting again gives me a chance to remove some duplicates automatically at the same time.

The Flickr Downloadr software, while good when it works, has obviously suffered from a lack of resources: the download package has not been updated in 10 months, there are no new packages in the package repository, and xenial is the last supported version of Ubuntu. Hence I doubt these bugs will be fixed any time soon.

UPDATE: Unfortunately Flickr Downloadr has turned out to be a bigger piece of crap than I thought; it completely refused to work after downloading about 25,000 images, so I have to come up with another way to get the rest. I will probably have to get the originals from somewhere else, like the Picasa albums I downloaded from Kahukowhai last week.

Meanwhile Flickr Uploadr has some kind of built-in history that I can't get rid of: I have uploaded everything and then deleted it again, and now it says everything has already been uploaded when it hasn't. LOL. All very messy.

Monday 18 September 2017

Switching Google accounts

Just to make my life more complex (and hopefully simpler in future) I have decided all my email will be handled by three Google accounts, classified as low, medium and high priority. This involves two completely new Gmail accounts and one existing one. At the moment I am working on switching my default email over to one of the new ones (the low priority one), which is the catch-all account for all sorts of website signups. Switching over Google accounts is quite a big process. The fact this blog post is going up under one of my new accounts is part of that: it has involved changing all the blogs over to two of the new accounts (I always have two author accounts on each blog).

Some of the big steps undertaken have been transferring and deleting email, and downloading all the Google Photos content. Google provides an email and contacts transfer function in Gmail that I used to transfer several accounts, including my soon-to-disappear Microsoft accounts, and as content has been moved, the old copies have been removed. The next step was to download 16 GB worth of photos from Google Photos. The reason there was so much in there is that it included all my old Picasa photo albums, long since forgotten about, which evidently were migrated to Google Photos some time back. I currently use Flickr to host these photos and have not done anything with the Picasa material for a long, long time. Consequently there are many duplicates of Flickr albums, and these will not be going back up on Google Photos under my new account; only the photos taken on the phone are planned for at this stage.

I had to use Takeout to get the photos (there seem to be nearly 100,000 of them) and use a download manager to pull down the nine zip files totalling over 16 GB. Apart from Picasa, I also briefly trialled the Google photo uploader to back up the photos on my PC; this was abandoned when I realised it was just a backup tool for photos from a PC, not a sync tool for public albums like Picasa was. I will have to find the phone photos and download them back onto the phone, because resetting it (see below) wiped them out. The biggest potential problem is the impact on old blogs, because Google stores blog illustrations on Google Photos, so deleting my old account could remove photos, possibly including ones used in blog posts.
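As an aside, once the nine zip files were down, putting the export back together was just a matter of unzipping them all into one folder, since Takeout splits a single export across the archives. Roughly along these lines, noting the file names and destination folder are only examples rather than my actual paths:

    # check each archive survived the download, then extract them all
    # into a single folder so the albums come back together
    mkdir -p ~/google-photos-export
    for f in takeout-*.zip; do
        unzip -t "$f" && unzip -o "$f" -d ~/google-photos-export
    done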

The next steps include migrating calendars and the Chrome profile, which will take some investigating and testing. I have also factory reset my phone to force it to use one of the new accounts, so I have had to reinstall all the apps on it. There is still a lot of account migration work to do and it will be some time before I actually delete anything. I did, however, delete a very old Microsoft account going back about five years, and it may well turn out that there are still some very old Google accounts forwarding occasional emails that will come to light as this project progresses.

Tuesday 12 September 2017

Qgis testing

Since my last post it seems fair to say that Qgis, like a lot of open source projects, is largely a community effort. Some people may be paid, but in reality a lot of the contributions are voluntary. I shouldn't be too demanding with issues, because the developers are working to a deadline at the moment to produce the 3.0 release.

I have built a Debian VM with the latest development build (1caaa2e) and will be looking at what my projects need in order to be migrated to work with it.

Monday 11 September 2017

Debian vs Xubuntu: The ideal Qgis VM production environment

So, having discovered that a bug in the Qgis composer, and not anything else, is the reason my home-built Qgis VMs crash, this idea of building from source has actually not been such a bad thing at all. It has left me with something that actually works in a single VM, and one that is properly licensed, because I don't actually own any Windows licenses and don't much like relying on legacy licensing rights to Windows environments that I will sooner or later have to update to something current.

What made this possible was being able to build what seems to be the last reliable build of the Qgis development master (313ec55) on versions of both Xubuntu and Debian that can run a release of Qt without the silly floating point rendering bug that never existed until quite recently. As it happens, the buggy Qt version is the one deployed with Xubuntu 16.04, which is also the last version that can auto-install 313ec55 from the packages without pulling in the latest build. So building from source on a later version of Xubuntu, or on a recent version of Debian, was the only way of getting this fairly stable, relatively bug-free 313ec55 build of Qgis 2.99 that I can use to develop my 2.99 map projects and carry out virtually every part of the process without issues. At the moment there is still one bug, relating to saving new projects or new versions of a project, that requires the use of Windows, because the 2.99 project format is incompatible with 2.18, in which that particular bug has been fixed.

The development masters for 2.99 are of course part of the process of developing the new 3.0 release of Qgis, and I guess there will be a 3.0 release out soon, probably in a couple of months; hopefully all these bugs will have been fixed in it.

However, a big discouragement is that the latest builds have introduced an incompatible project format, and the developers have said they do not care that the new software cannot read and convert older 2.99 project formats, because they don't guarantee to support projects created with previous development versions. This will have to be followed up to see whether this is a new policy and whether it is reasonable.

In general, as far as the VMs go, both Debian 9.1 and Xubuntu 17.10 are very efficient with resources. On a 12 GB VM, with Qgis loaded but no project open, total memory use for the OS and the software is about 1 GB. This is a lot better than Windows bloatware, where the OS gobbles up memory like there's no tomorrow. You can build a really lean VM or PC optimised for low resource use, and that's where a GUI environment designed for low resource use like Xfce (which is what I run with Debian on my Qgis build VMs) really shines. It is one of the reasons I stopped using Linux Mint (which I have almost forgotten): their GUI gobbles resources unnecessarily.

Meanwhile the map work continues.


Friday 8 September 2017

Building Qgis from source code

UPDATE 2: Well, in spite of my comments below, it seems I have discovered a new bug in the 2.18 LTS version of Qgis as well as in the home-built master, with the composer crashing the software when attempting to render a map. It may well be that the bug exists in this master, in 2.18 LTS and in some other versions of Qgis, and at the moment I am focusing on attempting to test and document it for the Qgis project. So it looks like the crashes in Debian were from the same issue, and that it is possibly specific to the Linux versions, but it could also be an issue that affects all versions.

It looks at this stage like I will be able to use either of those home-built VMs to do my work as the crashes are probably an inherent limitation of Qgis as a whole, rather than specific to the VMs I have built from source, but it is still a perplexing issue.

UPDATE: Whilst building from source has been an interesting experience, the outcome has not been more reliable operation of my home-built VMs. Building 313ec55 on both Debian 9.1 and Xubuntu 17.04 has resulted in unstable working environments that crash a lot more than the 16.04/313ec55 and Windows 7/11812846 VM environments I have used up until now. My decision is therefore to continue developing the project on 313ec55, while another Linux VM running the 2.18 stable release will be used for production output, as it is capable of reading enough data from a 2.99 project file to render the output properly.

With a FOSS package like Qgis there will be times when you want to build it from source. For me that means getting a working version of the development master, because newer builds sometimes have bugs, or are unavailable for the OS running on your computer. Whilst it may sound like a big deal to build from source, it is surprisingly straightforward.

In my case I have always tried running development versions of Qgis, but it gets risky when a new build comes out with major bugs, as is currently the case with the latest builds that trash the CRS settings and fail to load some layers (these issues are possibly connected).

 The steps for building Qgis in a nutshell are:

  • Set up a development environment. In this case, a VirtualBox virtual machine is the best development platform. Once you have one set up with the basics you can easily clone it, and they don't need much memory or disk space.
  • You'll need to install some packages specific to the Qgis build.
  • Set up some folders to hold the build files and the destination installation, and change a few settings here and there.
  • Get a copy of the source code from the appropriate branch on the QGIS GitHub site, in this case the master branch. Master builds happen every day, so there are plenty to choose from.
  • After you have downloaded the zip file, check the version number in CMakeLists.txt to ensure you have the right version. There are currently 48 different branches on GitHub, covering various versions. Near the top of the file you will see variables spelling out, in the case of 2.99, "2", "99" and "0", which tell me I have build files for 2.99.0 as intended. My first build attempt on a Xubuntu zesty VM accidentally picked up a 2.18 build instead of the master.
  • I have tested both Debian 9.1 (stretch) and Xubuntu 17.04 (zesty) VM environments. In the end, Xubuntu was more stable than the Debian VM with Xfce, which had a few problems and crashes. Debian would be fine as long as it stayed stable; I found it straightforward to set up and it went well until it became unstable and had to be reset, after which it was extremely slow to stabilise.
  • Set up the CMake environment using ccmake. Mostly the default settings are fine, but we set a variable to provide the destination folder the build will be installed to for actual use. This is important because the default will install a new build over the top of an old one, whereas we want each build to have its own install folder, so we can (theoretically) have more than one of them.
  • Tell ccmake to generate the makefiles for the build
  • Run make to build Qgis (the sketch after this list summarises these commands). This will take a while, maybe an hour to 90 minutes. This part is actually quite painless considering how many steps are involved, as there will be thousands of lines of output scrolling up your screen. Most files take only 5-10 seconds to build, but occasionally it all slows down as something really big gets built. There are more than 13,000 source files in the build, taking up 436 MB of disk space. At 35% of the build, make had produced 6,500 new files occupying 200 MB, which had taken about 20 minutes, so there is a lot of file crunching going on.

  • In my case I chose build 313ec55, a release with which I have had some success on 16.04. The main problem running it on 16.04 is the version of Qt, which has a bug that causes decimals to be rendered without rounding; this means a distance that you want displayed as 214.38 km will often show as 214.37999999999. To address this I followed the instructions (despite the warnings) to build against Qt5, and the version installed from Ubuntu's zesty archives was 5.7.1, which has the fix in it.
  • The steps listed for building against Qt5 with this release are slightly different from those for Qt4 (the recommended option), but the release build of 313ec55 installed in some of the other VMs I have used is built against Qt5, so there shouldn't really be an issue. So far there hasn't been.
  • The build went well and I am now testing this VM. Since it works, and since I have workarounds for the limit on the number of files that can be opened at once (see a recent post), it looks like this custom-built VM will be the ideal environment for both developing and publishing the maps, since it eliminates the main restriction that has so far prevented publishing being done from a Xubuntu-based VM.
  • It looks like a lot of more recent builds (from June onwards) have the serious issues with project data not loading correctly, so I can't expect to find a more reliable later build, and they have other issues with the designer that I don't like either.
  • 313ec55 still has an issue with relative paths not being followed, and I have not been able to find out which later master build fixed it. If I have to save a new project at any stage, a Windows VM running 1182816 will have to be used, as that build doesn't have this issue.
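For the record, the whole build boils down to a handful of commands, roughly as below. This is only a sketch: the build-dep step assumes deb-src entries are enabled so apt can fetch the build dependencies, the folder names and the -j4 value are just examples, and the grep line simply prints the version variables near the top of CMakeLists.txt.

    # install the packages needed to compile Qgis (assumes deb-src lines are enabled)
    sudo apt-get build-dep qgis

    # unpack the source snapshot downloaded from the master branch on GitHub
    unzip QGIS-master.zip && cd QGIS-master

    # confirm which version these build files are for (expect 2, 99 and 0 for 2.99.0)
    grep -n "VERSION_MAJOR\|VERSION_MINOR\|VERSION_PATCH" CMakeLists.txt | head

    # build in a separate folder, with a dedicated install folder for this build
    mkdir build && cd build
    ccmake ..    # set CMAKE_INSTALL_PREFIX to something like ~/apps/qgis-313ec55,
                 # then configure (c) and generate (g)

    # compile and install; -j4 assumes a four-core VM
    make -j4
    make install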

Tuesday 5 September 2017

How to change the account order in Thunderbird

Thunderbird doesn't provide a user interface for changing the order of accounts in the left-hand pane, but the Config Editor can be used to change it.

See the guide here
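For the record, the preference behind this is, as far as I know, mail.accountmanager.accounts, which holds a comma-separated list of internal account keys; rearranging the keys in the Config Editor rearranges the accounts in the pane. The key names below are only an illustration of what the value looks like, not my actual accounts:

    Preference:  mail.accountmanager.accounts
    Before:      account1,account2,account3,account4,account5
    After:       account5,account1,account2,account3,account4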

In my case I have 11 accounts listed, including the two special ones, so basically nine email accounts. I expect to tidy all that up and have only three or four listed by the end of this year. Getting the most important ones to the top has been a good improvement.