OpenLiteSpeed
Oh, there's going to be some stuff to say here, provided we end up feeling comfortable with the migration "sticking."
Blocking AI scraper user agents via .htaccess (per-vhost rewrite rules)
As per: https://neil-clarke.com/block-the-bots-that-feed-ai-models-by-scraping-your-website/
In the Virtual Host's Rewrite section, before the 'index.php' bits, include this:
RewriteCond %{HTTP_USER_AGENT} (CCBot|ChatGPT|GPTBot|anthropic-ai|Omgilibot|Omgili|FacebookBot|Diffbot|Bytespider|ImagesiftBot|cohere-ai) [NC]
RewriteRule ^ - [F]
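Once the rule is live, a quick spot-check from the shell should come back 403 for a blocked agent and 200 for a normal one. The domain below is a placeholder; substitute the real vhost:

```shell
# A blocked user agent should get HTTP 403, a regular browser-ish
# agent should still get 200. 'yourdomain.example' is a stand-in.
curl -s -o /dev/null -w '%{http_code}\n' -A 'GPTBot' https://yourdomain.example/
curl -s -o /dev/null -w '%{http_code}\n' -A 'Mozilla/5.0' https://yourdomain.example/
```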
Ubuntu 24.04 LTS
Forget ifconfig; the new hotness is ip address show and so forth.
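A rough cheat sheet mapping the old habits to their iproute2 replacements:

```shell
ip address show      # roughly the old 'ifconfig -a'
ip link show         # interface list and link state
ip route show        # the old 'route -n' / 'netstat -rn'
ip neigh show        # the old 'arp -a'
```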
Ubuntu Firewall (ufw)
To sum up:
sudo ufw allow http
sudo ufw allow https
sudo ufw allow ssh
sudo ufw allow ntp
sudo ufw default allow outgoing
sudo ufw default deny incoming
And if we're still using Icecast (maybe..?)
sudo ufw allow 8000
Then we check /etc/ufw/user.rules to make sure things look like they should. If we're happy, issue this to kick things off:
sudo ufw enable
And hopefully we don't get kicked out. Use the ufw status command to check status, obvs.
If someone at a particular IP is being a butt, well, there's:
sudo ufw deny from IPADDRESS to any
(or swap deny out for reject if we actually want the IP in question to "know" they're blocked: deny drops the packets silently, reject sends back a refusal).
Don't forget to check status: sudo ufw status
NTP
ntpq -p
That should show us the status of the various configured NTP "peers" (for making sure we have good sources to remain a good Pool participant). If we find that one of the source peers (we should always have five working options) is kaput, use https://support.ntp.org/Servers/StratumTwoTimeServers to find a replacement or two.
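For decoding that output, the tally character in the first column is what matters. The sample below is illustrative only (made-up hostnames, documentation IPs), not from a real box:

```shell
ntpq -p
#      remote           refid      st t when poll reach   delay   offset  jitter
# ==============================================================================
# *time.example.net 192.0.2.1       2 u   33   64  377   12.345   -0.123   0.456
# +ntp2.example.org 198.51.100.7    2 u   61   64  377   23.456    0.789   1.012
#
# Tally column: '*' = current sync source, '+' = good candidate,
# '-' = outlier, 'x' = falseticker, blank = discarded.
# 'reach' is an octal bitmask of the last eight polls; 377 means
# all eight succeeded.
```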
WordPress 'admin' rename
UPDATE wp_users SET user_login = 'NewUsername' WHERE ID = 1;
PostgreSQL
Making backups is... weird. What seems to be working is setting up a pg_dump cron job as my "normal" user, but first one must create a ~/.pgpass file that's set chmod 0600, which looks a bit like this:
localhost:5432:databasename:databaseuser:databasepassword
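Putting that together, creating the file and locking down its permissions (placeholder credentials) looks like:

```shell
# Fields are host:port:database:user:password. libpq ignores the file
# entirely if its permissions are looser than 0600.
cat > ~/.pgpass <<'EOF'
localhost:5432:databasename:databaseuser:databasepassword
EOF
chmod 0600 ~/.pgpass
```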
And then the cron job entry itself looks a bit like this (note the escaped % signs; cron treats a bare % in the command field as end-of-command):
8 3 * * * pg_dump -h localhost -U databaseuser -d databasename > /opt/backups/db-`date +'\%Y\%m\%d'`.dump.txt
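Worth pairing with a cleanup job so /opt/backups doesn't grow forever. A sketch, assuming 30-day retention (adjust path and age to taste):

```shell
# Delete dump files older than 30 days; run from cron alongside pg_dump.
find /opt/backups -name 'db-*.dump.txt' -mtime +30 -delete
```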
SSMTP
Getting mail out of a new Linux box is relatively easy with SSMTP, which is still available in Ubuntu 20.04, thankfully. We need two files set up:
- /etc/ssmtp/ssmtp.conf contains the actual server & authentication info.
- /etc/ssmtp/revaliases supposedly allows aliasing the sender, but that doesn't seem to have worked yet on the new box. Needs confirmation/testing.
The ssmtp.conf file looks a bit like:
# The person who gets all mail for userids < 1000
# Make this empty to disable rewriting.
root=greyduck@greyduck.net

# Gmail settings
UseTLS=YES
UseSTARTTLS=YES
#AuthMethod=LOGIN

# The place where the mail goes. The actual machine name is required; no
# MX records are consulted. Commonly mailhosts are named mail.domain.com
mailhub=smtp.gmail.com:587

# Where will the mail seem to come from?
rewriteDomain=greyduck.net

# The full hostname
hostname=node3.greyduck.net

# Are users allowed to set their own From: address?
# YES - Allow the user to specify their own From: address
# NO - Use the system generated From: address
FromLineOverride=NO

# Username and password for Google's Gmail servers
# From addresses are settled by Mutt's rc file, so
# with this setup one can still achieve multi-user SMTP
AuthUser=greyduck@gmail.com
AuthPass=APP_PASSWORD_GOES_HERE
And revaliases contains basically just this:
root:admin@frell.co:smtp.gmail.com:587
Your mileage may vary.
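Once both files are in place, a minimal smoke test looks like this (the recipient address is a placeholder; substitute a real one):

```shell
# ssmtp reads an RFC 822-style message on stdin; the blank line
# separates the headers from the body.
printf 'To: someone@example.com\nSubject: ssmtp test\n\nIt works.\n' \
  | ssmtp someone@example.com
```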
Duf
Duf is basically a "better df": https://github.com/muesli/duf