SEATT updated to 1.5.1

Simple Event Attendance has been updated to 1.5.1.

This release fixes an issue on PHP 7.2, where debug logs fill up with a count() function warning:

Warning: count(): Parameter must be an array or an object that implements Countable in /var/www/html/wp-content/plugins/simple-event-attendance/seatt_events_include.php on line 103

The few lines where this was a problem have been updated to check whether the variable is empty before counting it. See the changelog on the PHP count() function documentation – as of PHP 7.2, passing a non-countable value to count() raises this warning.

The plugin is live on wordpress.org, and available on GitHub.

Using your own domain with AWS S3 and Cloudflare

If you’re using a CNAME on your root domain, you’re gonna have problems. That’s just a DNS thing – and if you want to host a root domain on S3, AWS won’t provide you with an IP address to point an A record at. You can solve this if you use Route 53, but what if you want to keep your domain in Cloudflare?

You’ll also have problems if you want to use Cloudflare Full SSL on an S3 bucket configured for static website hosting – S3 website endpoints don’t serve HTTPS, so you’ll get nothing but Cloudflare error 522 (Connection timed out) pages.

My use case is a set of simple redirects, following on from a post about 301 redirects in S3.

The easy solution to both problems is to use CloudFront to serve HTTPS requests for your bucket; but I’m going to assume that you want this solution to be as cheap as possible – and to use only S3 from within the AWS ecosystem.

Continue reading “Using your own domain with AWS S3 and Cloudflare”

301 Redirects using AWS S3 with Static Hosting and an empty Bucket

S3 buckets allow you to host static content with a pay-per-get model. No monthly fees and no servers – so I considered how I could use this to redirect a limited number of URLs from an old website to a new site.

It couldn’t be that straightforward, as the URLs aren’t the same (so a CNAME, a domain forward, or the S3 “redirect requests” option were all out), but I wanted to preserve the links, and I was previously using a .htaccess file to do this. Enter static website hosting, on an empty bucket.

We’re setting up the example with “my-redirect-test”
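The redirects themselves are presumably handled by the bucket’s website redirection (routing) rules. As a rough sketch only – the bucket name follows the example above, but the hostname and key prefixes are placeholders – applying one rule with boto3 looks something like this:

import boto3

s3 = boto3.client("s3")

# Hypothetical rule: send requests for old-page/* to new-page/* on the new
# site, returning a 301. Hostname and prefixes below are placeholders.
s3.put_bucket_website(
    Bucket="my-redirect-test",
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "RoutingRules": [
            {
                "Condition": {"KeyPrefixEquals": "old-page/"},
                "Redirect": {
                    "Protocol": "https",
                    "HostName": "www.new-site.example",
                    "ReplaceKeyPrefixWith": "new-page/",
                    "HttpRedirectCode": "301",
                },
            },
        ],
    },
)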
Continue reading “301 Redirects using AWS S3 with Static Hosting and an empty Bucket”

AWS S3 Cross-Region Replication – migrating existing files using AWS CLI

I’m setting up CRR on two buckets: one new, and one existing which already contains files.

When you enable cross-region replication on an existing bucket, it doesn’t copy existing files from the source to the target bucket – it only copies those objects created or updated after the replication was enabled. We need to copy the original files manually using the AWS CLI.
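Purely as a sketch of that one-off backfill – the post uses the AWS CLI, and the bucket names below are placeholders – the same copy with boto3 looks roughly like this:

import boto3

s3 = boto3.client("s3")
SOURCE_BUCKET = "my-source-bucket"  # placeholder names
TARGET_BUCKET = "my-target-bucket"

# Copy every object that existed before replication was enabled;
# replication itself handles anything written afterwards.
# (copy_object is fine for objects up to 5 GB; larger ones need a multipart copy.)
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SOURCE_BUCKET):
    for obj in page.get("Contents", []):
        s3.copy_object(
            Bucket=TARGET_BUCKET,
            Key=obj["Key"],
            CopySource={"Bucket": SOURCE_BUCKET, "Key": obj["Key"]},
        )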

Continue reading “AWS S3 Cross-Region Replication – migrating existing files using AWS CLI”

Reviewing SPF, DKIM and DMARC settings for Google GSuite Mail on your domains

In the past, I’ve configured these on my domains (and wrote about SPF with GSuite – which at the time was Google Apps). In the last 9 years the rest of the DNS config has changed a lot, and as I’ve never had issues with mail, I never reviewed these settings. Until today.

For another reason, I checked my config on MX Toolbox – and spotted that some tuning was required.

The DNS report shows a few MX errors, and more warnings

As it happens, Google offer a similar tool for their users in their Google Apps toolbox.

It seems that at some point the recommended SPF record has changed from:

v=spf1 include:aspmx.googlemail.com ~all

To a different domain:

v=spf1 include:_spf.google.com ~all 

OK; no problem – that one’s easy to fix. Setting up DKIM was straightforward as well, using the guidance here, and again highlighted that the existing records were incorrect. At some point a cPanel server had managed the DNS config and added its own records!
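If you want a quick look at what SPF record a domain is actually serving without reaching for a web tool, a minimal sketch with dnspython (the domain below is a placeholder):

import dns.resolver  # pip install dnspython

# Print any SPF-style TXT records published for the domain.
for rdata in dns.resolver.resolve("example.com", "TXT"):
    txt = b"".join(rdata.strings).decode()
    if txt.startswith("v=spf1"):
        print(txt)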

Reminder to self – review my MX settings at least every couple of years!

Automating AWS Lightsail backups using snapshots and Lambda

Some of the most glaring omissions from Lightsail are scheduled tasks and triggers, which would provide the ability to automate backups. Competitors in this space like DigitalOcean are all set, as they offer a backup option; for AWS, I’m assuming they hope you’ll shift over to EC2 as fast as possible to get the extra bells and whistles.

Of course you can manually create snapshots – just log in and hit the button. It’s just the scheduling that’s missing.

I have one Lightsail server that’s been running for 6 months now, and it’s all been rosy. Except that for backups I’d been using first some AWS CLI automation (not ideal, as it needed a machine to run it), and then some GUI automation via Skeddly. While Skeddly works just fine, I’d rather DIY this problem using Lambda and keep everything in cloud-native functions.
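Roughly the shape of the function I’m after – a minimal sketch, assuming a Lambda role with Lightsail snapshot permissions and a CloudWatch Events schedule to trigger it; the instance name is a placeholder:

import boto3
from datetime import datetime, timezone

INSTANCE_NAME = "my-lightsail-instance"  # placeholder instance name

def lambda_handler(event, context):
    lightsail = boto3.client("lightsail")
    # Name the snapshot after the instance plus the current date/time.
    snapshot_name = "{}-{}".format(
        INSTANCE_NAME, datetime.now(timezone.utc).strftime("%Y-%m-%d-%H%M")
    )
    lightsail.create_instance_snapshot(
        instanceName=INSTANCE_NAME,
        instanceSnapshotName=snapshot_name,
    )
    return {"snapshot": snapshot_name}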

Continue reading “Automating AWS Lightsail backups using snapshots and Lambda”

Using the Cloudflare API to provide free Dynamic DNS with Windows and Powershell

This post details my switch over to using Powershell and Cloudflare to update a DNS record to a server’s current IP. This effectively emulates dyndns for this host – except it’s free.

There are a load of other options out there, which even include some simple-but-quite-clunky apps for domain registrars like NameCheap; but installing third party software is not the route I want to take.

I previously had my target domain (let’s call it targetdomain.com) hosted on a Linux box, and used SSH from a Windows server to update the DNS settings. This worked well for three years without a blip – but it was clunky. I was using a scheduled task to start a .bat file, which then ran PuTTY to run the shell script… to update a config on a server which was only hosting the domain to serve this purpose.

Although the scheduler>bat>shell tasks have been running well for years, it’s time to simplify!

I’ve been using Cloudflare for years, and set aside time to write a script to use their service for this purpose. As it turns out, people have done this for years – so I’ve taken one off the shelf.

Continue reading “Using the Cloudflare API to provide free Dynamic DNS with Windows and Powershell”

Migrating a private repository from Bitbucket to GitHub with Git

As GitHub private repositories have just become free, I’m jumping on the bandwagon and shipping over a few of the repos I have on Bitbucket to reduce the number of providers I store code with.

The end result – a private GitHub repository with all the metadata from the old Bitbucket repository – note we have maintained the last commit from 10 months ago

The option below uses the shell to migrate the repo, but you can also use the GitHub importer if you prefer an automated solution (you’ll just have to wait while it does it).

Continue reading “Migrating a private repository from Bitbucket to GitHub with Git”

Deleting AWS Glacier Vaults via AWS CLI using a Lightsail Instance

Amazon Web Services (AWS) offers some very affordable archive storage via its S3 Glacier service. I’ve used this on a backup account in the past to store archives, and have decided it’s time to clear down this account (oh, and save $0.32 a month in doing so).

The main challenge with doing this is that, unlike S3, S3 Glacier objects (archives stored directly in Glacier, rather than via the Glacier storage tier within S3) can only be deleted via the AWS CLI. And to delete a Glacier vault, you’ve got to delete all of the objects in it first.

This account has some wild spending. $4.90 a month!

In this post I’ll spin up a Lightsail box and wipe out the pesky Glacier objects through the AWS CLI. This doesn’t require any changes on your local PC, but will require some patience.
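The post does all of this with the AWS CLI from the Lightsail box; just to illustrate the sequence of calls involved, a boto3 sketch (the vault name is a placeholder):

import json
import boto3

glacier = boto3.client("glacier")
VAULT_NAME = "my-old-backups"  # placeholder vault name

# Step 1: request an inventory of the vault - this job takes several hours.
job = glacier.initiate_job(
    vaultName=VAULT_NAME,
    jobParameters={"Type": "inventory-retrieval"},
)
print("Inventory job started:", job["jobId"])

# Step 2 (once the job has completed): fetch the inventory, delete every
# archive listed in it, then delete the now-empty vault. The vault's own
# inventory only refreshes about once a day, so the final delete_vault
# call may need to wait - hence the patience.
def empty_and_delete_vault(job_id):
    output = glacier.get_job_output(vaultName=VAULT_NAME, jobId=job_id)
    inventory = json.loads(output["body"].read())
    for archive in inventory["ArchiveList"]:
        glacier.delete_archive(vaultName=VAULT_NAME, archiveId=archive["ArchiveId"])
    glacier.delete_vault(vaultName=VAULT_NAME)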

Continue reading “Deleting AWS Glacier Vaults via AWS CLI using a Lightsail Instance”

A quick performance comparison with Qlik Sense – AWS EC2 vs Azure Virtual Machines

Previously, I tested the performance of a load script while using RecNo() and RowNo() functions. This conveniently gave me a script which consumes up to 25GB of RAM, along with considerable CPU power.

So, what about testing it on two cloud boxes? I’ve chosen a machine from both AWS and Azure, loaded them with Qlik Sense September 2018 and run the load script.

Total Test Duration by Host

The summary: The AWS box was approx 8% faster than the Azure box.

Continue reading “A quick performance comparison with Qlik Sense – AWS EC2 vs Azure Virtual Machines”