Comparison between the Shure MV88+, M-Audio Uber Mic and Neumann TLM 102 microphones

I haven’t edited these sound samples at all, and I have left each microphone flat, without any EQ. You can sometimes hear two computers in the background. I also haven’t done anything with the volumes. I could probably have nudged the desk a little higher, but to be honest, I just couldn’t be bothered re-recording that sample.
Each sample has a test phrase spoken by me, then a quick line of a song sung by my daughter Méabh. I included her because hers is a contrasting voice, and it shows off the clarity and brightness of some of the microphones.
I won’t say what microphone I think is better here. That’s open to your own determination.
What I can say is that I was less than happy with the extra background noise from the Shure MV88+, but I love its well-rounded sound. The Neumann TLM 102 is crystal clear. The M-Audio Uber Mic blew me away though. Very little background noise; maybe it’s not as well rounded as the other two, but very, very crisp.

I don’t know if it needs to be said, but of course, all three microphones have their place. The Shure MV88+ is portable and is perfect for video recording or recording live music when out and about. The Neumann TLM 102 is a studio microphone that requires an XLR connection to a mixing desk. The M-Audio Uber Mic is a studio microphone perfect for podcasting, and it has the advantage of not requiring a mixer. Still, the comparison was fun.

Bose S1 Pro – Cannot update firmware

I have had a Bose S1 Pro since they came out around 2018. I have used it in small gigs very regularly, and it has always just worked, so there was never any need to update the firmware. However, because it’s looking like gigs will be outdoors for the foreseeable future and the S1 Pro has a 6 hour battery, I decided that two speakers were better than one. Off I went to my local stockist and purchased another. However, I quickly found that it wasn’t possible to link the two over Bluetooth, because the old speaker had never received a firmware update. Connecting the old speaker to the computer was fruitless; no connection was detected. Connecting the new speaker worked, so I knew there was obviously something wrong with the old one. A quick search online showed me that there were indeed defects in the early 2018 models, and the overwhelming majority of responses recommended sending the speaker back to Bose for repair.

I called Bose, explained the problem and they booked it in for repair. First though, they tried to charge me 150 euro because the speaker was three years old. I firmly told them that this wasn’t going to be acceptable, as I had forum posts proving that this was a known problem. After escalating my demand, they finally agreed. That was just three weeks ago.

I sent the speaker away and was informed that it would be at least six to eight weeks before I would get it back. But three weeks later, here it is, working perfectly.

I love Bose products. I have the S1 Pro, the L1, the QC30s and the QC65s. But now I can also say that I’m quite impressed by their support as well.

Of course, I just had to test the stereo sound in my home office earlier.

Shows the two Bose S1 Pro speakers on floor stands in front of a wall.

Uilleann pipes with a top-class brass band

Back in 2019, when the world was a little more normal, I was honored to be asked to perform with the multi-award-winning Drogheda Brass Band. So a few months ago, when their musical director John Carpender asked me to record some music with them, I jumped at the chance. Here it is: The Parting Glass, arranged for uilleann pipes and brass band.

Getting out in the air

The weather is finally starting to turn a little milder. The evenings are stretching out a little and the mornings are finally losing their bitterness.

I’ll tell you now that I’m not staying within 5km. But I’m using common sense. I’m traveling to areas where there aren’t many people, and at times when I’m unlikely to meet any kind of gathering.

There are a few reasons for this, but they all boil down to a simple fact: I sit at a desk for way too many hours each day. Even before and during Christmas, we didn’t relax our precautions, even when the rest of the country was settling in for a slightly more relaxed Christmas. But this has been taking a toll. Very simply, my head needs open spaces, air and sun. I have walked the dog for between 6 and 8km almost every day for the past seven months. But traipsing along the same old streets in circuits that always lead back to the start of the route is just not providing enough stimulus.

So, let me treat you to a few pictures taken over the past few weeks when we ventured a little outside the 5km cordon.

Looking out to sea from the head at Clogher Head.
It was still very mucky and clambering up to these heights was hard. But we really enjoyed it.
Yet another very mucky walk, this time across from Old Bridge, in Townley Hall.

Create an encrypted backup from your NAS directly to a cloud-based file store, such as Google Drive, OneDrive, Amazon S3 or Dropbox

I have a lot of files. Not compared to some people; it’s only about 5TB actually. But they are almost all valuable. There are pictures of the children, family members, various guide dogs, holidays and so on. I also have priceless audio recordings and videos that I really don’t want to lose. Of course, I back everything up to the NAS regularly, and the NAS is in a RAID configuration, so if a disk fails I can get the data back. I even have a few disks lying around with full backups taken over the years. However, I wanted something a lot more reliable than this. I don’t consider myself a tinfoil-hat-wearing member of the paranoid club, but I don’t necessarily fully trust companies to keep my data safe and private either. So I have never backed up all of my treasured files to the cloud, because I didn’t want them to be available to someone in the event of a breach on that platform.

A friend was talking about rclone last week. He surprised me by saying that he could even mount the encrypted backups and browse through them to recover specific files. More interestingly, the files didn’t need to be archived before copying them off the local NAS. He was using it through Nextcloud, however, so I wrongly assumed that it would either be difficult to set up or wouldn’t do what I expected. I’m really glad I was wrong.

Getting it up and running is very easy: install it, create a new remote for your cloud provider, then create a second remote on top of it for the encryption.

You then copy files to the encrypted remote, and it in turn sends the files over to your cloud provider.
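To give a flavour of it, here’s a sketch of those steps using rclone’s command line. The remote names (gdrive and secret), the passphrase and the paths are all examples of mine rather than anything prescribed, and the Google Drive step normally involves an interactive OAuth prompt, which is why many people just run rclone config instead.

```shell
# 1. A remote pointing at the cloud provider (Drive will prompt for OAuth)
rclone config create gdrive drive

# 2. A crypt remote layered on top; encrypted files land in gdrive:encrypted-backup
rclone config create secret crypt \
    remote=gdrive:encrypted-backup \
    password=$(rclone obscure 'my-passphrase')

# 3. Copy to the crypt remote; rclone encrypts, then uploads to Drive
rclone sync /volume1/photos secret:photos

# Optional: mount the decrypted view to browse or recover individual files
rclone mount secret: /mnt/backup
```

The nice part, as my friend pointed out, is that step 3 needs no archiving beforehand and the mount in the last line lets you pull back a single file without restoring everything.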

If I were to write all the instructions out, I would be doing nothing more than replicating the work of people who know a lot more about rclone than I do. So instead, please visit the following fantastic resources.

- Google Drive
- Tutorial for making an encrypted backup on cloud storage using rclone · GitHub
- Crypt

If you are trying this out and you need a hand, give me a shout.

Maintaining personal security online. Passwords must go.

Like everyone, I have multiple accounts; probably hundreds of accounts online. However, unlike most people, I have a slightly higher than average level of exposure online. Search for DigitalDarragh or Darragh Ó Héiligh and you will find that I’m involved in quite a few things. So my personal security, or PerSec, online must be held to a higher than normal standard.

I use multi-factor authentication for every online application I can. I also mix the types of multi-factor that I use. Some accounts use the LastPass authenticator app, some use Google’s and some use Microsoft’s offering. I also have more than a few that use Duo, and for the most crucial accounts I have a third factor in the form of a physical key or token that I must have in my possession before I can log in.

The front line of defence, of course, is my passwords. I therefore use two password management applications, both with the highest level of enterprise-grade protection. My passwords are usually between 36 and 50 characters long.
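Nobody types passwords like that by hand, of course; the manager generates and fills them. Just to illustrate the idea, here’s a throwaway sketch of generating one at the command line. The length and character set are my own arbitrary choices, not a recommendation.

```shell
# Pull random bytes from the kernel, keep only characters from a chosen
# set, and take the first 40. 4096 input bytes is comfortably more than
# enough to yield 40 characters from this set.
pw=$(head -c 4096 /dev/urandom | tr -dc 'A-Za-z0-9!@#%^&*_-' | cut -c1-40)
echo "$pw"
```

In practice the password manager does this for you, with the bonus that the result is stored rather than printed and forgotten.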

I hear you cry out: that’s overkill! And yes, for some services you’re absolutely right; it is. But I nearly got caught in the past, and it made it very clear to me that the level of risk I hold is particularly high. If someone got access to one of my core accounts associated with my identity, they could possibly use it to gain access to other accounts through social engineering, or even through vulnerabilities that have yet to be exploited. My life is online. I rely on the Internet more than most people. So locking up my online identity is as important as ensuring there are good locks on the doors and windows of a house.

But this is getting absolutely ridiculous. I spend about half an hour every two months resetting passwords and updating them in the various applications I have running on the desktops, laptops, phones and tablets that I use almost every day. When passwordless authentication really becomes viable for commonly used applications, I’ll be first in line.

Even with the risks I’ve just briefly written about, I feel frustrated, and that my time is being wasted, several times a day when I open an accounts package or my email and, even though I was just using that device a moment ago, that application in its own sandbox must validate my identity using Face ID. Yes, I’m reasonably confident in the lengths I’ve gone to with the aim of improving my personal security online. But how much more can we do? Microsoft, Google, Facebook, Twitter, Apple, Red Hat, IBM: all of these companies need to get serious about the next evolution in security. Perhaps it will be FIDO3. This burden can’t be carried by the end user. I’m an unusual case, but I can see this kind of weight being placed on end users within the next ten years if passwords remain the front line.

Lockdown on a Wednesday night. Server performance monitoring scripts. Let’s do some scripting to get details from a few dozen servers.

Objective: Monitor CPU, RAM and disk activity on 54 hosts running Windows Server 2016. If possible, also monitor GPU RAM usage.
Why? I had a conversation today where the performance of this large cluster was questioned. I’m certain that the performance is well within tolerance but, as they say, when you’re explaining, you’re losing. So I’m bypassing all of that, and I’ve decided to use PowerShell to create a dashboard so that the service owners can independently track realistic server usage.

I put the children to bed around 8pm. A colleague started migrating this server to a different VM cluster two minutes after I logged in at about ten past eight. After a moment of writing about how there’s not much else to do in the evenings other than catch up on work as a result of lockdown, I decided to try to save some time by getting the PowerShell commands written and in one place.

I wrote the scripts to grab CPU and RAM usage. They take the server name as a parameter and return the value as a percentage to two decimal places. Unfortunately, I couldn’t find a way of getting the percentage of free GPU RAM, but I know I can get this from the event viewer when I need to know if a server is running low.

I have the scripts running perfectly on my local machine, but there’s a firewall rule on the network side blocking RPC. I’ve sent a request to our fantastic senior network engineer, so I’m sure the rules will be changed in the next day or two.

Here’s a great resource on the Microsoft website that details the various inbound ports required for various services that run on Windows.

Function GetAvailableRam {
    param (
        [Parameter(Position=0, Mandatory=$true, HelpMessage="Provide a server name")]
        [string] $Server
    )
    # Free and total physical memory (both reported in KB) from WMI
    $os = Get-WmiObject Win32_OperatingSystem -ComputerName $Server
    # Free RAM as a percentage, rounded to two decimal places
    return [math]::Round($os.FreePhysicalMemory / $os.TotalVisibleMemorySize * 10000) / 100
}

Function GetCPUPerformance {
    param (
        [Parameter(Position=0, Mandatory=$true, HelpMessage="Provide a server name")]
        [string] $Server
    )
    # Average CPU usage over 5 samples taken 2 seconds apart
    $CPUAveragePerformance = (Get-Counter -ComputerName $Server -Counter "\Processor(_Total)\% Processor Time" -SampleInterval 2 -MaxSamples 5 |
        Select-Object -ExpandProperty CounterSamples |
        Measure-Object -Property CookedValue -Average).Average
    return [math]::Round($CPUAveragePerformance, 2)
}

# Lists each video controller with its total (not free) adapter RAM in MB
Get-WmiObject -ComputerName $Server Win32_VideoController |
    Select-Object Name, AdapterRAM, @{Label="MB"; Expression={$_.AdapterRAM / 1MB}}

Using Nextcloud with a NAS that doesn’t natively support it.

My uncle has this saying. You’ll regularly find him making some contraption, fixing something or digging a hole. He’s the kind of person you expect to have his head stuck under the bonnet of a car. You know the kind of fella. If the car doesn’t start, your heating isn’t working or there’s a plague of zombies sweeping the land, he is the kind of person you want beside you, because as sure as day follows night, he’s going to come up with something useful. Anyway, his saying is that he’s just “tricking around”. Well, I find myself doing a lot of that, in a very different way of course. I admire him. I would love to be able to do what he does, but my tricking around is usually something technology related. So when a friend said he was looking at UnRaid, and I had absolutely nothing to do that evening, I thought I’d spin up a machine and take a look at it along with him.

UnRaid is an amazing project! It has support for software RAID, storage caching, backups and everything else you would expect from a powerful and versatile NAS operating system. But it has a lot more besides: it supports running Docker containers, and a really smart bloke extended this functionality so that it can use Docker Hub. Docker Hub is a registry of community-contributed and private Docker containers, and there are thousands of them: Homebridge, UniFi, Apache, nginx, the list goes on and on. I got UnRaid set up that evening, and the next night I got Docker containers configured for Let’s Encrypt, MariaDB and Nextcloud. But instead of being happy with this solution, it gave me a thirst to learn more about Docker. I had of course heard of Docker before and had used it in a very limited way, but now I saw an actual use case where I had a good incentive outside of work to use it.

I gave UnRaid its last rites and promptly formatted the operating system disks. I then spun up a VM on my trusty Dell PowerEdge server and began the very interesting and very fast process of installing Docker. Of course, installing and configuring Docker can be as complex as you like, but I’m a newbie at this, so within about an hour I had the installation running as I expected it to.

I then installed containers for Let’s Encrypt, MariaDB and Nextcloud. My YML file wasn’t quite right though, so I made a mess of the networking. A few days later, I finally got time to get back to it. Armed with a little more experience and knowledge, my YML file was perfect, and with just one command it worked beautifully. I already had the port open to the outside world, so Let’s Encrypt gave me the certificate right away, and within seconds I had Nextcloud singing and dancing on my screen. The MariaDB config had already been completed through the YML file template, so I was delighted with the simplicity.
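I haven’t reproduced my YML file here, so by way of illustration, this is roughly the shape such a compose file takes. The image names come from Docker Hub, but the service names, passwords and volumes are placeholders of my own, and the Let’s Encrypt proxy shown (linuxserver/swag) is one common choice rather than necessarily the exact container I used.

```yaml
version: "3"
services:
  db:
    image: mariadb
    environment:
      MYSQL_ROOT_PASSWORD: changeme   # placeholder
      MYSQL_DATABASE: nextcloud
    volumes:
      - db:/var/lib/mysql
  nextcloud:
    image: nextcloud
    depends_on:
      - db
    environment:
      MYSQL_HOST: db
    volumes:
      - nextcloud:/var/www/html
  proxy:
    image: linuxserver/swag   # nginx reverse proxy + Let's Encrypt certificates
    ports:
      - "80:80"
      - "443:443"
volumes:
  db:
  nextcloud:
```

With a file like this in place, a single `docker compose up -d` is the one command that brings the whole stack up.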

I’m not an expert. In fact, I’m still absolutely clueless when it comes to Docker; I’m only learning what can be done with it. I realized, for example, that CIFS wasn’t installed in the Nextcloud container. Fortunately, getting a shell inside the container was incredibly simple, and in no time I had CIFS installed.
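For what it’s worth, that step sketches out to something like the following. The container name is an assumption on my part, and smbclient is the Debian package that provides the SMB/CIFS tooling Nextcloud’s external storage relies on.

```shell
# Run a shell in the running Nextcloud container and install the SMB/CIFS
# client tools. "nextcloud" is an assumed container name; check docker ps.
docker exec -it nextcloud bash -c \
    "apt-get update && apt-get install -y smbclient"
```

Changes made this way don’t survive rebuilding the container, so a custom image is the longer-term fix, but for tricking around it does the job.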

Why did I need CIFS? Simple. I love my existing NAS. It’s been running for a long time and it has every feature it could possibly need. Here’s the thing that’s probably going to make you laugh at all of this time I invested: my NAS has all the features of Nextcloud and more. But at least now I have something that I don’t quite need but is still useful in a way, and while making and breaking it over several evenings, I’ve managed to learn more about Docker. I have more to learn. I think I’m going to install the UniFi container next. I don’t want to put too much into Docker though; I don’t understand it well enough yet, and things like the UniFi controller have been working from a Raspberry Pi for years. I really don’t want to break that. The controller manages all of the wireless access points in the house, so without it I would be a little stuck.

I learn so much more from this kind of thing than I do from courses or reading. I just need to find the motivation.