phpMyAdmin missing mbstring


You open phpMyAdmin and you get this error:
phpmyadmin missing mbstring


It’s easy to fix. Just type this at a prompt on the server:

sudo apt-get install php-mbstring

Then restart Apache with:

sudo systemctl restart apache2
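After the restart, you can confirm the extension is actually loaded. This is a quick check, assuming the command-line PHP on the server is the same version Apache uses:

```shell
# List the loaded PHP modules and look for mbstring.
# If this prints "mbstring", phpMyAdmin's requirement is satisfied.
php -m | grep -i mbstring
```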


Now, why oh why would this suddenly happen for no reason that I can think of?  I haven’t used phpMyAdmin in years on this server, so it’s probably a package that got removed during an apt-get autoremove at some point, but still. It’s obviously in use, so why remove it automatically?

Nothing in the job that I’m doing at the moment has been easy. I’ll explain it properly very shortly, but here’s a very brief idea of what I’m doing.

I’m spending too much on cloud hosting. I have several projects on the go, most of them aimed at promoting traditional Irish music.

  • Ceol FM online streaming service. Promotes traditional Irish music around the world.
  • Music at the Gate. Promotes traditional Irish music in Drogheda. I would love my children to grow up surrounded by Irish culture.
  • Darragh Pipes. I love playing music. This promotes my own performances.

Then I also run this site, Computer Support Services and a few other websites from different servers as well.
It’s costing far too much.
So I’ve bought a reasonably powerful box, installed Hyper-V and I’m running everything off several virtual machines.
When I’ve had free time, I’ve been working on migrating everything across. But sites like Ceol FM provide additional functionality over and above a simple website, so migrating that functionality isn’t straightforward. Obviously, hosting everything on one server is tricky to do properly. Security needs to be a priority, as do monitoring and bandwidth / resource control.

Recording from the Grey Goose weekly session

Every Monday night there’s a great session in the Grey Goose in Drogheda.  This is a tune that I recorded last night.

I picked up this tune about twelve years ago while touring around Israel playing music.  By chance I met a Romanian group that loved traditional Irish music.  For days we swapped music, and this was one of the tunes I picked up then.  Years later, it’s still a tune that I really enjoy playing.

Windows 2019 product key fails with error (0x80070490)

This seems to be a very commonly encountered problem based on about three minutes hopping around sites from Google.

But the solution is very easy. I stumbled across it because, while preparing a test server running Windows 2019, I’m also multitasking by installing a KMS server.   That’s a story for another time.

Anyway. The installation of the product key fails with the error (0x80070490).

To get around it, just install the product key from the command line.

Open up the command line as an administrator and issue the following command.  Of course, replace [your product key] with the MAK product key that you’ve obtained from Microsoft.

cscript c:\windows\system32\slmgr.vbs /ipk [your product key]
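If the key installs cleanly, the next steps are activating and checking the licence state. These use standard slmgr options from the same elevated prompt (a sketch only; the output varies by edition and activation method):

```shell
rem Activate Windows using the installed product key
cscript c:\windows\system32\slmgr.vbs /ato

rem Display detailed licence information to confirm activation succeeded
cscript c:\windows\system32\slmgr.vbs /dlv
```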

If only things could be straightforward.

Tech failures

I’m good at my job. Hand me a virtual infrastructure built on VMware, Hyper-V or even Xen, Active Directory, any version of Windows, most Linux distributions, PHP, MySQL, Nginx, Postfix, Courier, Exchange, Azure, Office 365, G Suite and all that, and I’ll give you a system that’s efficient, stable and cost effective. I can even dabble in .NET, PHP, VBS, PowerShell, Bash, SQL and, at a stretch, C and MongoDB.

So I’m quite confident in what I’m doing normally.

But January has been a complete Fuster Cluck of problems that have either taken me far too long to figure out or are still sitting on my ever-growing pile of things I need to get around to fixing or finishing.

This kind of thing happens.  Integrating technologies doesn’t exactly come with an instruction manual.  Sometimes it is a suck it and see situation.  So, here’s a few of the things that haven’t gone right in January.

  •  During a meeting last week, I decided it would be brilliant if I could share my Microsoft To Do notes with other people on a project that I’m working on. I love Microsoft To Do. It’s just so easy to keep track of what I need to be doing and the priorities of different tasks, and I would have appreciated being able to see what others on this project are doing as well.  So, I configured SCCM to connect to Azure and then to the Microsoft Education Store, provisioned the To Do app and, after some messing about, got it installed on a test machine.  It ran perfectly under my test account. Next I installed it on someone else’s PC.  It finally installed, but it’s not usable because it requires Exchange Online to be enabled. We’re a Google house here, so To Do is simply not going to work.  I wouldn’t mind so much, but to even get this far I had to:
    • Create an Azure active directory app in Azure.
    • Figure out where the private key was located in the Azure UI. It turns out I was in a slightly different place than I should have been.  That’s the stupid thing about the Azure UI: you can’t necessarily get to the keys for an application from the properties of that application.
    • Then I encountered the problem of how to assign devices to Azure active directory accounts.  This is required for the Microsoft Education Store to allow apps to be installed.
    • Then finally, most of the documentation said that when you look under Online Licenses in SCCM, you should see the available store apps. What they don’t say is that you will need to manually provision any apps that don’t get provisioned by default.
  • I was very pleasantly surprised by PFSense.  The UI is nearly 100% accessible. I’m delighted, because a few years ago it wasn’t that straightforward for a screen reader user to navigate.  But getting this running wasn’t all that easy.  I wrote a blog post about configuring PFSense last week. But in summary:
    • Routing broke.
    • After a rebuild of the config, it worked again. No idea why.
    • Key generation and association for OpenVPN didn’t work as expected.  But I got that working eventually.
    • Got load balancing working. Yay!
    • My Irish Broadband router then decided it was going to see the wrong IP address for the PFSense virtual machine. I’ve flushed that device several times and each time it gets it wrong: it’s seeing an old device.  No idea why.  The device is only running at 15% usage and 28% power, but its user interface is running very slowly. That’s a problem for another day.
  • .NET problems have terrorised me for nearly two months now. Here’s the problem:
    • I inherited a large application from a company who are no longer supporting it effectively.  This application broke as a result of a change, made outside our control, to the infrastructure that is hosting this service.
    • I worked with this bad company for about a month, but it was clear to everyone that I was coming up with better ideas to fix this than the people who were actually meant to be developing it. So, in frustration, I took over the code in December.
    • The part of the code that broke as a result of the infrastructure change is now fixed.  But the fix depends heavily on .NET 4.6.2.  The system was written in .NET 4.5, so upgrading it should be straightforward. But no.  I was struck again.  The code that I’ve written uses newer versions of the libraries that are already in use in .NET 4.5, and updating those libraries breaks the main application.
    • I could go on and on about this, but it’s very complicated.  I’ve sat at my desk until 3:30 in the morning trying to get my head around this, but I’m not getting very far.  I have 27 conflicts left.  Each time I encounter a conflict, I need to explicitly reference the correct library and version.  However, when that conflict involves communication with a class in the main application, directing the code at that class, and therefore including it for compilation, may or may not add dozens of other conflicts.  If I’m lucky, it will just compile.  Generally it does, but when it doesn’t, it can set me back days.  Each time I find a conflict, I have to open the old version of the code and verify the library and namespace that it was previously using.

Yep. It’s all a complete Fuster Cluck.

I’m tired. I’m not getting enough sleep because I’m not good at switching off while I have things that need to be fixed or finished.  And I’m finding it very hard to get motivated because I have had such a long string of problems and I’m constantly tired.

Don’t worry. I’ll break through this cycle. Things will start falling into place.  I’ll keep working away at it.  This isn’t the first or the last time several systems have caused me problems all at the same time.

A note about Microsoft To Do: if you haven’t tried it and you find yourself needing that kind of thing, give it a go.  I think you’ll like it.

OpenVPN configuration in PFSense.

I spent about six hours this weekend installing PFSense, configuring the firewall and setting up OpenVPN. Here’s a quick run through of the problems and the solutions.

LAN to WAN access

I’m not using VLANs.  The main purpose of running PFSense is that I wanted traffic filtered through a reasonably decent firewall sitting on a virtual machine.  All the servers that I’m going to use are on one Hyper-V host.   I don’t want to open up a lot of ports to these services for both general front end access and back end administration.

With the use of a VPN for back end administration, I’ll have three networks in total to set up.

  • WAN interface.
  • LAN network.
  • VPN client network.

At the start, prior to configuring OpenVPN, routing between the WAN and the LAN was fine.  But after configuring OpenVPN, I had problems with routing from the LAN out to the WAN.

I wish I could say I found a solution to this, but I didn’t.  When I fixed the routing issue, I then lost all access to the LAN.  So I restored the factory defaults and began configuration again.

The second time I configured PFSense, I didn’t encounter the routing issues.


When configuring OpenVPN, I had problems generating the client. The first time, it said I had no CRL.  The second time, I had no user cert, and the third time, the server cert wasn’t from a trusted CA.

Here’s what I did to fix all of that:

  • Created a new user. This user doesn’t have admin access, which is a good idea for VPN use anyway. This new user has a user certificate assigned, created from the CA on the server.
  • I don’t know why, but the server certificate created by the OpenVPN server wizard wasn’t signed by the root CA on the server. I also couldn’t delete that certificate. Instead, I just created a second server certificate and, in the properties of the server, selected that new certificate.
  • No CRL. If a CRL is required by the OpenVPN server, I’m not sure why it wasn’t created by the wizard. But in the properties of the OpenVPN server, a handy link is provided to bring you right to the CRL tab under the certificate options.

All of these items were easy to fix.  They seem like bugs in the OpenVPN server creation process.


This one took a while to fix.   I was able to access the PFSense LAN address from VPN clients but I couldn’t access any other devices on the PFSense LAN.

  • Using netstat -r in Windows confirmed that the route was added.
  • There were no firewall rules blocking traffic. While I was in there, I also added tighter rules to specifically allow the traffic that was needed between VPN devices and the LAN.

I thought it was strange that I could access the LAN devices from the PFSense console.  So after a lot of thinking, I finally decided to add the routes from the other direction: from the LAN devices to the OpenVPN network.  I’m sure there’s a way of doing this in OpenVPN itself, but if you’re explicitly configuring the routes on the LAN devices, try one of these two commands.

For Windows:
route add [OpenVPNNetwork] mask [OpenVPNNetmask] [PFSenseLANGateway]

For Linux:
ip route add [OpenVPNNetwork/PrefixLength] via [PFSenseLANGateway]
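As a concrete sketch of those two commands, suppose the OpenVPN tunnel network is 10.8.0.0/24 and the PFSense LAN address is 192.168.1.1 (both are example values; substitute your own):

```shell
# On a Windows LAN device (elevated prompt): send traffic for the
# VPN client network back via the PFSense LAN address.
route add 10.8.0.0 mask 255.255.255.0 192.168.1.1

# On a Linux LAN device (as root): the equivalent return route
# using iproute2.
ip route add 10.8.0.0/24 via 192.168.1.1
```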

Finally. It’s all working.

I used PFSense a lot about eight years ago. I had problems at the time running it in a VM. Now, though, I’m delighted that I’ve started using it again. I love the UI and it’s all very logical.  I look forward to using the load balancer functionality soon as well.

Trying to drag the good out of a busy week.

While walking home from work last Friday evening, I was feeling particularly thankful.  It had been a very busy week and I was really looking forward to switching off for a few hours.  I reflected to myself: Each week, we all learn something.  It might be from a casual conversation, a book, a news report or even online on social media. So here’s a few things that I picked up last week.

  • I spoke to a medical missionary nurse on the way home from Dublin on Wednesday.  She started travelling around the world offering help in 1982. Since then she has helped in Nigeria, Kenya, Ghana and Brazil. When not working as a nurse and a midwife, she was also helping young nuns to, as she put it, “make sure this life suited them”.  I found the conversation with this woman fascinating.  She explained that many of the areas she worked in transitioned from providing hospital care / primary care facilities to providing community support and preemptive care. The lady didn’t agree with this entirely, but she saw the motivation behind it.  The young nurses she had once helped to train are hoping that reducing the need for hospitals will reduce the burden on the emerging health care system.  She was also telling me that NGOs such as the Red Cross took over from medical missionary nurses from the mid to late nineties.  At their peak, there were 250 medical missionary nurses in her organisation. In recent times there are about 150 MMNs at any one time.
  • I took three short courses this week on LinkedIn Learning. The first was on Azure architecture fundamentals, the second was on the differences between Windows 2019 and 2016 and the third was specifically related to Active Directory on 2019. Each course was about two and a half hours long.  I can’t say I learned anything groundbreaking.  Especially relating to Azure, I had done all of this before, but it’s good to go back over the basics in case there’s something that’s been forgotten over time. Still, it was reasonably useful.

Azure Point to Site VPN – Add or replace certificates.

A year ago I set up a new environment for a company who decided to host everything in Azure.

I set up the virtual machines, the storage, the backups and everything that came along with that.  I also gave them a Point to Site VPN connection so they could independently make changes and modify / add data as needed.

Today that VPN connection stopped working.  Why? Simple: the cert expired. Microsoft have written great documentation on this topic, but by default the root and client certificates only last for one year.  That’s for security reasons, of course.  Each year you renew your certificates, and if someone has a certificate that should no longer be allowed, that cert becomes invalid. Nice and easy.

However, in addition to using certs, I also have accounts that I can modify on the local machines, and each group of people has a different root cert, so replacing certs isn’t a major problem.

That said, I wanted the certs to last longer than 1 year.  I could have made them last 10 years but I thought 3 years was a happy medium.

You could of course create the certs using a GUI, but here’s a faster way that uses PowerShell.

$date_now = Get-Date
$extended_date = $date_now.AddYears(3)
$cert = New-SelfSignedCertificate -Type Custom -KeySpec Signature `
-Subject "CN=P2SRootCert" -KeyExportPolicy Exportable `
-HashAlgorithm sha256 -KeyLength 2048 `
-CertStoreLocation "Cert:\CurrentUser\My" -KeyUsageProperty Sign -KeyUsage CertSign -NotAfter $extended_date

Now create the client cert using this.

New-SelfSignedCertificate -Type Custom -DnsName P2SChildCert -KeySpec Signature `
-Subject "CN=P2SChildCert" -KeyExportPolicy Exportable `
-HashAlgorithm sha256 -KeyLength 2048 `
-CertStoreLocation "Cert:\CurrentUser\My" `
-Signer $cert -TextExtension @("2.5.29.37={text}1.3.6.1.5.5.7.3.2") -NotAfter $extended_date

When you’re ready, open the root cert.  Remove the lines at the top and bottom of the file that indicate the start and end of the certificate. Then, in Azure, browse to All Resources, select your VPN gateway and open the point-to-site configuration.
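Stripping those marker lines can also be scripted. Here’s a small sketch using a dummy file; the Base64 content and filenames are made up for illustration:

```shell
# Create a dummy exported certificate file (contents are fake).
cat > root.cer <<'EOF'
-----BEGIN CERTIFICATE-----
MIIDFakeBase64LineOne
MIIDFakeBase64LineTwo
-----END CERTIFICATE-----
EOF

# Delete the BEGIN/END marker lines; Azure wants only the Base64 body.
sed '/^-----BEGIN CERTIFICATE-----$/d; /^-----END CERTIFICATE-----$/d' root.cer > root_b64.txt

cat root_b64.txt
```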

Now add the new root certificate.

When you’re ready, download the VPN client.  On the same screen in the Azure portal, click Download VPN client.


If needed, remember to export your certificate.  Include the private key and give the exported PFX file a good strong password.

Getting stuck in.

It’s been one of those weeks. There are no major projects looming, yet there’s a lot of what would be called BAU (Business As Usual) tasks that need attention.  BAU tasks are not the most thrilling, but nevertheless they need to be done, I suppose.

So. I’m here at my desk. I’m signed in, I’ve the Bose QC35’s on my ears, the Ceol FM energetic stream is playing and I’m ready to go.


  1.  Email the department about their Azure subscription renewal.  They have several reserved instances, and an email thread with the sales partner has finally answered the questions that I had, so I’m now in a position to make informed decisions so that they can renew their reserved instances and decide on their monetary commitment for the next twelve months.  On a separate note, I created that infrastructure in Azure exactly a year ago and it has had 100% uptime.
  2. An integration project that has been ticking away needs attention.  The people whose system I’m integrating have no technical problems in particular, but I can tell from their responses that they are worrying, so I think I’m going to arrange to meet them for a coffee later just to explain what’s happening and to put them at ease.
  3. I’ve been working on a Shibboleth IdP integration project for the last while, but I inherited code and, instead of objectively looking at it, I just dived right in and started trying to make progress.  A month in, I’ve had to take a step back and look at what I’m trying to do.  The previous developer had tried to reinvent the wheel by manually writing the SAML using an XML writer. That’s fine for login: it’s not ideal, but it will work.  For logout, though, there’s just too much XML to write and the requirements are too complex.  For example, although you might get the SAML right for sending the logout request, the SPNameQualifier metadata that’s needed is generated by an HTTP request that originates from the IdP.  If you query the SP for that directly, it won’t expose the data, so it’s very hard to find out what should be written by investigating a working system.  Therefore, I’ve found a library that handles the Shibboleth conversation without needing to write all of the SAML by hand.  I did some work on this on Tuesday, but I will need to spend another few hours on it today.
  4. There’s a career progression task on my list. I’ll explain what that will involve later but that’s another hour gone.

There you have it. It’s going to be a full day.


Please send coffee.


Oh, I’m also studying Azure enterprise architecture on LinkedIn Learning at the moment. I’ve completed several projects in this area and I’ve attended at least a dozen courses and workshops on Azure as well, but cloud platforms are constantly evolving.   This particular course uses templates for everything, which is a really good idea, because using the web UI is inefficient. So far the infrastructure I’ve been working on is small enough: forty to fifty servers at most. But as I start to look into ASR (Azure Site Recovery), and as high availability workloads are pushed to the cloud, I need to be more confident deploying high availability resources in bulk and verifying that configuration remains consistent, using templates and desired state configuration PowerShell scripts.

That’s what’s in my head this morning. You’re welcome to it.

Windows Weekly 603. Paul Thurrott is so very wrong.

Paul Thurrott, a tech journalist, contributes to a rather useful podcast on the TwiT network every week called Windows Weekly. I’ve listened to this podcast almost every week for probably at least 10 years.

This week, Paul went on one of his many rants, but this rant was ill informed, damaging and utterly unhelpful. He makes certain arguments that could be perceived as being against inclusion and accessibility. Here are a few quotes from the podcast excerpt.

It’s done in the name of accessibility. That’s a crock.
Accessibility at any cost is just a brain dead mentality.

There’s this belief that anything that you add that is accessible is a win.

No offence to people who can’t see or who see poorly and who want to set up Windows 10 on their own, but if your vision is that bad, the act of setting up Windows 10 is not a priority. In the background, Leo laughs. Paul continues: It’s something that is going to happen once and you probably have someone else that can help you with that.

I have recorded a podcast that includes several extracts from this week’s Windows Weekly. Between the extracts, I have given my considered views on certain parts of Paul’s rant.

I would really hope that Paul listens to this and, more importantly, that as many people as possible who heard the latest Windows Weekly hear this as well.

Please comment here on the Blog. Facebook comments and Twitter mentions are great but it would be really nice to have the comments right under the post. Thanks.

In this podcast, a number of audio recordings are used. Here they are in full.

Differing opinion

An author on a site called BSG has shared an alternative opinion. The author states:

The blog has gone quite a while without me writing about twitter drama, but that is about to come to an end now. Blind twitter is all riled up and flipping out over something someone said on the Windows Weekly podcast. Spoiler alert, all the rage and offense is being blown out of per portion and there is 0 reason for any of it. People just seem to want to be offended, but of course I’ll show why this is all based on someone taking everything out of context to manufacture outrage.

The link to the full post is here.

I assume the someone the author is talking about is me. I have a few points in response and I have left a comment on that post, but for your convenience, I will include my comment here as well:

You have expressed a few interesting opinions and I accept that you are entitled to them. However, I disagree that the original blog post / podcast was out of context. I deliberately left most of the Windows Weekly podcast in my recording so that I couldn’t be accused of taking things out of context. I also provided a link to the Twit.TV Windows Weekly 603 show so that people could listen to it in full. In addition, I provided a link to a YouTube video showing the full Windows 10 setup experience.

My podcast is here for anyone who is interested:

And to show that I hold no ill will toward BSG for your differing opinion, I will include a link to your post as an edit to my original piece.

I have been on social media for 10 years and I have had a blog for nearly 20 years. I stand by my content and my record. I have never incited negativity toward another person; however, in this instance, I firmly believe that Paul Thurrott’s comments were destructive, damaging, incorrect and misleading. Time and time again, during the podcast and in messages, I have explained that my issue isn’t with the Windows 10 setup. Microsoft could disable that feature and I wouldn’t give it a second thought. As many have pointed out, it’s not an accessibility consideration. The huge problem I have here is with the way Paul Thurrott ranted. I have listened to Paul on podcasts for nearly ten years now and, in the past few years, I have noticed his tendency to launch into rants. This particular one, however, went too far.

I am happy to discuss this with anyone, including Paul Thurrott directly. My aim here has always been to ensure that the message Paul gave on Windows Weekly 603 is corrected, not that he is attacked directly. And in fairness, I don’t see any indication that he has been personally attacked. In fact, messages to him on social media have been well worded and considered.

One final point. I respectfully submit that your take on Paul’s thoughts might be very different if you worked in the tech industry.

The podcast has now been listened to over 380 times. It has had 18 responses on Twitter, 4 on Facebook and 9 comments here on the blog. Not many in the grand scheme of things but for this low traffic blog it’s significant. I remain hopeful that the objective of this post will be achieved and TwiT will correct Paul Thurrott’s statements on Windows Weekly 604 due to be aired this Wednesday 16th January.
