Create an encrypted backup from your NAS directly to a cloud-based file store such as Google Drive, OneDrive, Amazon S3 or Dropbox.

I have a lot of files. Not compared to some people; there’s only about 5TB actually. But they are almost all valuable. They include pictures of the children, family members, various guide dogs, holidays and so on. I also have priceless audio recordings and videos that I really don’t want to lose. Of course, I back everything up to the NAS regularly, and the NAS is in a RAID configuration so that if a disk fails, I can get the data back. I even have a few disks lying around with full backups taken over the years. However, I wanted something a lot more reliable than this. I don’t consider myself a tinfoil-hat-wearing member of the paranoid club, but I don’t necessarily fully trust companies to keep my data safe and private either. So I had never backed up all of my treasured files to the cloud, because I didn’t want them to be available to someone in the event of a breach on that platform.

A friend was talking about rclone last week. He surprised me by saying that he could even mount the encrypted backups and browse through them to recover specific files. More interestingly, the files didn’t need to be archived before copying them off the local NAS. He was using it through Nextcloud, however, so I wrongly assumed that it was either going to be difficult to set up or that it wouldn’t do what I expected. I’m really glad I was wrong.

Getting it up and running is very easy. Install it, create a remote for your cloud provider, then create a second, encrypted “crypt” remote layered on top of it.

You then copy files to the crypt remote, and it in turn encrypts them and sends them over to your cloud provider.
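To make that concrete, here’s a rough sketch of the command line side. The remote names gdrive and secret, and the paths, are just examples of my own choosing, not anything rclone requires. First, run the interactive configuration and create a remote for your cloud provider, then a crypt remote pointing at a folder inside it:

rclone config

Then copy files through the crypt remote; rclone encrypts them on the fly before they ever leave your network:

rclone copy /volume1/photos secret:photos

And to browse the backup as though the encryption wasn’t there at all (this needs FUSE installed):

rclone mount secret: /mnt/restore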

If I were to write all the instructions out, I would be doing nothing more than replicating the work of people who know a lot more about rclone than I do. So instead, please visit the following fantastic resources.

Google drive (rclone.org)

Tutorial for making an encrypted backup on cloud storage using rclone. · GitHub

Crypt (rclone.org)

If you are trying this out and you need a hand, give me a shout.

Maintaining personal security online. Passwords must go.

Like everyone, I have multiple accounts online. Probably hundreds. However, unlike most people, I have a slightly higher than average level of exposure online. Search for DigitalDarragh or Darragh Ó Héiligh and you will find that I’m involved in quite a few things. So my personal security, or PerSec, online must be held to a higher than normal standard.

I use multi-factor authentication for every online application I can. I also mix the types of multi-factor that I use. I have some that use the LastPass authenticator app, some that use Google’s and some that use Microsoft’s offering. I also have more than a few that use Duo, and for the most crucial accounts, I have a third factor of security in the form of a physical key or token that I must have in my possession before I can log in.

The front line of defence, of course, is my passwords. I’m therefore using two password management applications, both with enterprise-grade protection. My passwords are usually between 36 and 50 characters long.

I hear you cry out: that’s overkill! And yes, for some services, you’re absolutely right. It is. But I nearly got caught in the past, and it made it very clear to me that the level of risk I hold is particularly high. If someone got access to one of my core accounts associated with my identity, they could possibly use it to gain access to other accounts through social engineering, or even through vulnerabilities that have yet to be exploited. My life is online. I rely on the Internet for more than most people do. So locking up my online identity is as important as ensuring there’s a good lock on the doors and windows of a house.

But this is getting absolutely ridiculous. I spend about half an hour every two months resetting passwords and updating them in the various applications I have running on the desktops, laptops, phones and tablets that I use almost every day. When passwordless authentication really becomes viable for commonly used applications, I’ll be first in line.

Even with the risks I’ve just briefly written about, I feel frustrated, and that my time is being wasted, several times a day when I open an accounts package or my email and, even though I was just using that device a moment ago, that application in its own sandbox must validate my identity using Face ID. Yes, I’m reasonably confident in the lengths I’ve gone to with the aim of improving my personal security online. But how much more can we do? Microsoft, Google, Facebook, Twitter, Apple, Red Hat, IBM: all of these companies need to get serious about the next evolution in security. Perhaps it will be FIDO3. This burden can’t be carried by the end user. I’m an unusual case, but I can see this kind of weight being placed on end users within the next ten years if passwords remain the front line.

Lockdown on a Wednesday night. Server performance monitoring scripts. A Wednesday night during lockdown: let’s do some scripting to get details from a few dozen servers.

Objective: monitor CPU, RAM and disk activity on 54 hosts running Windows Server 2016. If possible, also monitor GPU RAM usage.
Why? I had a conversation today where the performance of this large cluster was questioned. I’m certain that the performance is well within tolerance but, as they say, when you’re explaining, you’re losing. So I’m bypassing all of that, and I’ve decided to use PowerShell to create a dashboard so that the service owners can independently track realistic server usage.

I put the children to bed around 8pm. A colleague started migrating this server to a different VM cluster two minutes after I got logged in, at about ten past eight. After a moment of writing about how there’s not much else to do in the evenings other than catch up on work as a result of lockdown, I decided to try to save some time by getting the PowerShell commands written and in one place.

I wrote the scripts to grab CPU and RAM usage. They take the server name as a parameter and return the value as a percentage to two decimal places. Unfortunately, I couldn’t find a way of getting the percentage of free GPU RAM, but I know I can get this from the event viewer when I need to know if a server is running low.
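These aren’t my exact scripts, but a minimal sketch of the approach looks something like the following, saved as something like Get-ServerLoad.ps1 (my name for illustration, not a standard one). I’m sketching it with Get-CimInstance, which talks WinRM by default; the older Get-WmiObject equivalent uses DCOM/RPC, which is the kind of traffic the firewall rule mentioned below blocks.

param([string]$Server)

# Average CPU load across all processors, as a percentage
$cpu = (Get-CimInstance Win32_Processor -ComputerName $Server |
    Measure-Object -Property LoadPercentage -Average).Average

# RAM usage worked out from free versus total physical memory
$os = Get-CimInstance Win32_OperatingSystem -ComputerName $Server
$ramUsed = 100 * (1 - ($os.FreePhysicalMemory / $os.TotalVisibleMemorySize))

# Both values rounded to two decimal places
[PSCustomObject]@{
    Server     = $Server
    CpuPercent = [math]::Round($cpu, 2)
    RamPercent = [math]::Round($ramUsed, 2)
}

Called as .\Get-ServerLoad.ps1 -Server HOST01 (HOST01 being a placeholder), it returns one object with both percentages; loop it over the 54 hosts and you have the makings of a dashboard.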

I have the scripts running perfectly on my local machine, but there’s a firewall rule on the network side blocking RPC. I’ve sent a request to our fantastic senior network engineer, so I’m sure the rules will be changed in the next day or two.

Here’s a great resource on the Microsoft website detailing the inbound ports required by the various services that run on Windows.

Using Nextcloud with a NAS that doesn’t natively support it.

My uncle has this saying. You’ll regularly find him making some contraption, fixing something or digging a hole. He’s the kind of person you expect to have his head stuck under the bonnet of a car. You know the kind of fella. If the car doesn’t start, your heating isn’t working or there’s a plague of zombies sweeping the land, he is the kind of person you want beside you because, as sure as day follows night, he’s going to come up with something useful. Anyway, his saying is that he’s just “tricking around”. Well, I find myself doing a lot of that, in a very different way of course. I admire him. I would love to be able to do what he does, but my tricking around is usually something technology related. So when a friend said he was looking at UnRaid, and I had absolutely nothing to do that evening, I thought I’d spin up a machine and take a look at it along with him.

UnRaid is an amazing project! It has support for software RAID, storage caching, backups and everything else you would expect from a powerful and versatile NAS operating system. But it also has a lot more. It has support for running Docker containers, and a really smart bloke extended this functionality so that it can use Docker Hub, a registry of community-contributed and private Docker containers. There are thousands of them! Homebridge, UniFi, Apache, nginx; the list goes on and on. I got UnRaid set up that evening, and the next night I got Docker containers configured for Let’s Encrypt, MariaDB and Nextcloud. But instead of being happy with this solution, it gave me a thirst to learn more about Docker. I had of course heard of Docker before, and I had used it in a very limited way, but now I saw an actual use case where I had a good incentive outside of work to use it.

I gave UnRaid its last rites and promptly formatted the operating system disks. I then spun up a VM on my trusty Dell PowerEdge server and began the very interesting and very fast process of installing Docker. Of course, installing and configuring Docker can be as complex as you like, but I’m a newb at this, so within about an hour I had the installation running as I expected it to.

I then installed containers for Let’s Encrypt, MariaDB and Nextcloud. My YAML file wasn’t quite right though, so I made a mess of the networking. A few days later, I finally got time to get back to this. Armed with a little more experience and knowledge, my YAML file was perfect, and with just one command it worked beautifully. I already had the port open to the outside world, so Let’s Encrypt gave me the certificate right away, and within seconds I had Nextcloud singing and dancing on my screen. The MariaDB configuration had already been completed through the YAML file template, so I was delighted with the simplicity.
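For anyone curious, the skeleton of a compose file for that stack looks roughly like the following. This is a sketch rather than my exact file: the image names are the official Docker Hub ones, the passwords and host port are placeholders, and I’ve left the Let’s Encrypt proxy container out to keep it short.

version: "3"

services:
  db:
    image: mariadb
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: changeme
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: changeme
    volumes:
      - db:/var/lib/mysql

  nextcloud:
    image: nextcloud
    restart: always
    depends_on:
      - db
    ports:
      - "8080:80"
    environment:
      MYSQL_HOST: db
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: changeme
    volumes:
      - nextcloud:/var/www/html

volumes:
  db:
  nextcloud:

The one command that brings the whole lot up, assuming a file laid out like this, is docker-compose up -d.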

I’m not an expert. In fact, I’m still absolutely clueless when it comes to Docker. I’m only learning what can be done with it. I realised, for example, that CIFS support wasn’t installed in the Nextcloud container. Fortunately, getting a shell inside the container was incredibly simple, and in no time I had CIFS support installed.
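If you hit the same gap, a sketch of the fix, assuming the container is simply named nextcloud and is Debian-based like the official image: the first command drops you into a shell inside the container, and smbclient is the package Nextcloud’s external storage app uses to talk to SMB/CIFS shares.

docker exec -it nextcloud bash
apt update && apt install -y smbclient

Bear in mind that anything you install this way disappears when the container is recreated; building a custom image is the proper fix once you know what you need.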

Why did I need CIFS? Simple. I love my existing NAS. It’s been running for a long time and it has every feature it could possibly need. Here’s the thing that’s probably going to make you laugh at all of this time I invested: my NAS has all the features of Nextcloud and more. But at least now I have something that I don’t quite need but is still useful in its way, and while making and breaking it over several evenings, I’ve managed to learn more about Docker. I have more to learn. I think I’m going to install the UniFi container next. I don’t want to put too much into Docker; I don’t understand it well enough yet, and things like the UniFi controller have been working from a Raspberry Pi for years. I really don’t want to break that. The controller manages all of the wireless access points in the house, so without it, I would be a little stuck.

I learn so much more with this kind of thing than I do in courses or through reading. I just need to find the motivation.

Uploading files to the vSphere 6.x web UI fails

This is a really simple problem to fix. But it’s irritating so I’m going to write a quick blog post about it.

You go to upload a file to a datastore in vSphere. The upload fails, and you see an error message saying that there’s an untrusted certificate.

Control + left click the link in the error to open the ESXi host in a new window and accept the untrusted certificate. It’s rather stupid, in my opinion, that the upload requires a direct client-side connection to the ESXi host and that this shows up as an error. But the fix is easy.

The way the error is presented is quite senseless as well, though. There’s a “read more” link in the upload list which presents a modal dialog. Surely, if you’re uploading something, it’s not unreasonable to show a status window for the duration of the upload where errors and other contextual information can be displayed.

Overall, I like the vSphere web interface. I have been using vSphere for about 11 years now, and the web UI is generally quite okay. But sometimes the design decisions are a bit nuts in my opinion.

Adding unsupported Zigbee lights to Apple Home.

I bought a few GU10 light bulbs, some sockets and two LED strip lights from Lidl last week. They comply with the Zigbee 3.0 standard. I already had a Philips Hue hub, so when Lidl had an offer on smart home gadgets, I jumped at the chance.

But I quickly ran into a problem! I could see my Philips Hue bridge in the Apple Home app, but I could never find the lights there. A bit of digging told the story. Well, it actually told several stories. Certain blog posts say that the devices must be from the same maker as the bridge, which would mean that I could only use Philips lights with the Philips bridge. Other posts said that only devices compatible with the Friends of Hue programme would work. Either way, I had two choices: send the lights back or try to find a way of making them work. I picked the second option, mainly because while searching on Google for various reasons why this might not be working, I found people talking about a cool little HomeKit-compliant application written using Node.js. It acts as a bridge between HomeKit and Philips Hue so that devices that aren’t quite compliant are recognised. It’s a beautifully light application that packs a huge amount of power. Isn’t open source amazing!

Please enter Homebridge!

Because I have plenty of Raspberry Pi devices hanging around, and one is actually already used as my Wi-Fi controller, it was easy to get this up and running. There are plenty of guides on the interwebs. However, I encountered a few problems, so I thought I should write down my experiences. The troubleshooting section is at the bottom. Scroll down if you have already gone through the installation.

Installing Homebridge on a Raspberry Pi

Add the NodeSource repository:

curl -sL https://deb.nodesource.com/setup_14.x | sudo bash -

Install Node.js, Python and the GCC compiler:

sudo apt install -y nodejs gcc g++ make python

Verify that Node.js is installed:

node -v

With any luck, you will be presented with the version of Node.js that’s installed. Make sure that it’s at least version 14. Also, keep an eye on the homebridge.log file later on to see if there are errors relating to the Node.js version. If Homebridge takes advantage of new features in later versions of Node.js, you will want to make sure you have the latest version.
Update NPM to the latest version, if it isn’t already:
sudo npm install -g npm

Now you can install Homebridge. You have all the dependencies at this point.

sudo npm install -g --unsafe-perm homebridge homebridge-config-ui-x

Install the service so that it runs at boot:

sudo hb-service install --user homebridge

Now you need the IP address of your Raspberry Pi. Type one of the following to obtain it:

ip a


hostname -I

Homebridge listens on TCP port 8581 by default. If you have a firewall, please open this port. The default username and password are both admin. Use the following URL to access the Homebridge UI, replacing <your-pi-ip> with your Raspberry Pi’s IP address:
http://<your-pi-ip>:8581

Troubleshooting

The first thing I noticed was that when I moved between the Wi-Fi zones in my house covered by different access points, I lost the connection to HomeKit, and it took iOS ages to reacquire it. This led to discoveries one and two.

  • Discovery 1
    If you are using Homebridge, you should really leave an Apple TV or iPad connected and available at all times. Something needs to keep a persistent connection to Homebridge; your phone, watch and so on will then connect back through that hub.
  • Discovery 2
    If you try to connect to Homebridge a second time from the same device, because the connection was dropped when you moved between wireless access points, you will first need to manually delete your device from the persistence cache on the Homebridge server.

Remove the persistence cache from Homebridge

  1. The Homebridge installation is at /var/lib/homebridge and there is a directory in there called persist. That’s where the files need to be removed. Change to that directory:
    cd /var/lib/homebridge/persist
  2. Remove the files in that directory.
    rm -rf *

Restarting Homebridge

Restart the Homebridge service.
sudo service homebridge restart

Check the Homebridge logs

You can always find the logs in the web interface, but I like to be able to grep them and so on, so here goes:
cat /var/lib/homebridge/homebridge.log
You can also see the new log entries as they are written with this command:
tail -f /var/lib/homebridge/homebridge.log
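And because it’s just a text file, grep works as you’d expect. For example, to pull out only the lines that mention errors (a trivial sketch; adjust the pattern to taste):
grep -i error /var/lib/homebridge/homebridge.log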

WordPress update. Categories are back again.

This was probably an accessibility problem, but prior to WordPress 5.6.1, adding categories to a post was stupidly complicated, to the point that it was hit and miss. I’m probably one of the best around at finding hard-to-reach content. I look through the source code, the document object model and every other part I can think of to find content that might be hidden in drop-down menus, tabs and the like. But I couldn’t find a consistent way of reaching the categories list. It was really bugging me! It put me off writing anything new on the blog for the past month. I’m delighted today. I updated WordPress core and the plugins, and now categories are finally back where I expect them. This wonderful news deserves a blog post all of its own.

Lock down podcast with the family – Week 2

There wasn’t actually a week one of this podcast. There might not even be a week three or week four either. But here’s how we are getting on with lockdown so far. This includes a lot of input from Méabh and Rían and recordings of their experiences while home schooling. It’s best listened to with headphones.

NVDA keyboard command modification for the review cursor.

There is absolutely nothing wrong with the default NVDA keyboard commands. But I had a little say in how Orca’s keyboard commands work in the terminal a very, very long time ago, and I always thought they suited really nicely. To be clear, when I say that I had a say, I mean that I was vocal in my opinions on how it should work. The implementation was by people who worked for Sun at the time, such as Bill, Peter and many others who knew a hell of a lot more about how this worked than me. Sorry, I’m getting off track yet again. I like the way Orca handles reading the console, and that’s primarily where I need the review commands in NVDA, so I like to replicate the functionality. However, every time I reinstall NVDA I go through the process of recreating my keymap manually. This is a huge waste of time. So for the future Darragh who wants to grab this quickly, here you go; these lines live in gestures.ini in the NVDA user configuration folder. You can thank me later.

[globalCommands.GlobalCommands]
review_nextLine = kb(laptop):nvda+o
review_previousLine = kb(laptop):nvda+u
review_nextCharacter = kb(laptop):.+nvda
review_nextWord = kb(laptop):l+nvda
review_currentLine = kb(laptop):i+nvda
review_currentWord = kb(laptop):k+nvda
review_previousCharacter = kb(laptop):m+nvda
review_previousWord = kb(laptop):j+nvda
review_currentCharacter = kb(laptop):,+nvda

FT232 USB UART driver installation, step by step, for the TinyDuino board.

About 7 years ago, I bought a TinyDuino board. It seemed like the perfect way for me to work on some cool Arduino projects. It’s modular and tiny! By modular, I mean that there are modules for LEDs, Wi-Fi, IO ports, battery power, sensors and more, all without any soldering. It got left in a drawer though, as I had problems getting it up and running at the time. But I’ll get to that later, because I know most of you just want me to get to the good stuff: how do you get this thing working in Windows 10? The supplied Arduino drivers don’t work.

Windows sees this as an FT232 USB UART device under the “Unknown Devices” node of Device Manager. In fact, if you are unlucky like I was, it might not see it at all. The TinyDuino setup page actually writes this out in black and white: not all USB cables are created equal. Some are made to deliver power only, for charging, and not all are made to handle data transfer. This caught me out six or seven years ago, and it caught me out tonight as well. I tried two cables before I decided to take the warning on that page into account; I just thought they were being overcautious. But after rummaging in my daughter’s room for the cable from her camera, I was delighted, and a little frustrated, to find that it worked perfectly where the other two cables from my stash did not. She will not be happy when she finds out I’ve robbed her cable.

I was just about to celebrate when I heard Windows moan that it couldn’t fully install the driver for an FT232 USB UART device. Fortunately, a bit of Googling quickly took me to the download page for that driver. I extracted the zip file into a folder and told Windows to look in a specific location for the driver files. I assume that if you are working on a device like this, you probably know how to update a driver, but just in case:

  1. Go to Start
  2. Type
    Device Manager
  3. Expand Unknown Devices
  4. Right-click the FT232R USB UART device
  5. Click Update Driver
  6. Choose the second option in the wizard to install the driver locally
  7. Browse to the location where you extracted the zip file
  8. Follow the remainder of the wizard

Are you following this to get the TinyDuino working? If so, you will notice that you still don’t have a serial port available.
But Darragh, how do I get my serial port? That’s simple too.

  1. Click Start
  2. Type
    cmd
  3. In the prompt type the following command then press enter:
    chgport

You will either see a COM port or a message saying that no COM ports were found. If you don’t see a COM port, you need to go back and update another driver.

Go back into Device Manager, refresh the list, and under Unknown Devices you will find a serial adapter. Right-click this, click Update Driver and follow the steps shown above. Now you have a serial adapter.

You might want to do one more thing if you are working on the TinyDuino board: configure the Arduino IDE to use the right COM port and board. Select the Arduino Pro under the board type.

However, I’m not using the Arduino IDE, because it’s a horrible, steaming pile of crap. Instead, I’m using the PlatformIO extension in VSCode. This is beautiful. It works so nicely. Unfortunately, I needed to have the Arduino IDE installed as well, but that can just sit on the disk gathering dust. There’s one thing you need to do in the platformio.ini file in your project:

Add the correct board and port:
board = pro16MHzatmega328
upload_port = COM3

It will likely find the port automatically but after the other trouble I had, I didn’t want to leave this to chance.
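For context, those two lines live inside an environment section in platformio.ini. A minimal complete file for this board looks something like the following; the environment name is whatever you fancy, and upload_port assumes the COM port that chgport reported:

[env:tinyduino]
platform = atmelavr
board = pro16MHzatmega328
framework = arduino
upload_port = COM3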

I hope that helps you. But I usually write these blog posts more for my benefit than anything else. So I hope this helps me out in the future. No doubt in a few years I’ll come back to this and will have forgotten all about what I had to do tonight to make it work.

Pictures of the TinyDuino from two angles.