Bashing away

When using Ubuntu under Windows (WSL), the eye-searing blue-on-green directory display is thanks to the "OTHER_WRITABLE" flag, because every NTFS directory meets that criterion. To fix it, create ~/.dircolors with this content:

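One way to build that file (an equivalent reconstruction, since the exact snippet is the part I keep losing — the key line is the OTHER_WRITABLE override; 34;42 is the stock blue-on-green, 01;36 is bold cyan):

```shell
# Regenerate the full default color database, then override the offending entry
# (default OTHER_WRITABLE is 34;42 = blue on green; 01;36 = bold cyan)
dircolors -p > ~/.dircolors
sed -i 's/^OTHER_WRITABLE .*/OTHER_WRITABLE 01;36/' ~/.dircolors
```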

That will make it bold cyan text on black for future Ubuntu shell sessions (run eval "$(dircolors -b ~/.dircolors)" to apply it to the current one).

Vimming away

The 'desert' color scheme is much more readable on a black background for things like commented code, which is usually dark blue.

Edit ~/.vimrc and add:

colo desert
syntax on


Screening away

  • screen -S [name] - Create a new named screen session (that's a big S, not a little s)
  • screen -r [name] - Reattach to a named screen session
  • Ctrl-A then D - Detach from the current screen session
  • screen -ls - List existing screen sessions

Fixing timestamps with RSYNC

Did you just use SCP to copy over a bunch of, let's say, music files from one server to another, thus breaking the timestamps? Luckily for us, rsync has a way to fix that:

rsync -vrt --size-only /src /dest

Curl and Wget not happy with Let's Encrypt certs

If you try to use 'wget' or 'curl' to access a site protected by a Let's Encrypt cert, you get an error like:

curl: (60) SSL certificate problem: unable to get local issuer certificate


The fix is to add the Let's Encrypt certificates (available as the corresponding files on https://letsencrypt.org/certs/) to the local CA store and rebuild it:

sudo curl -o /usr/local/share/ca-certificates/isrgrootx1.crt
sudo curl -o /usr/local/share/ca-certificates/letsencryptauthorityx1.crt
sudo curl -o /usr/local/share/ca-certificates/letsencryptauthorityx2.crt
sudo curl -o /usr/local/share/ca-certificates/letsencryptx1.crt
sudo curl -o /usr/local/share/ca-certificates/letsencryptx2.crt
sudo curl -o /usr/local/share/ca-certificates/letsencryptx3.crt
sudo curl -o /usr/local/share/ca-certificates/letsencryptx4.crt
sudo dpkg-reconfigure ca-certificates

That seems to do the trick, on both Ubuntu 16.04 and 18.04 at any rate.
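For the record, dpkg-reconfigure ca-certificates is interactive; the non-interactive way to rebuild the store after dropping .crt files into /usr/local/share/ca-certificates is update-ca-certificates. A quick sanity check afterwards (any Let's Encrypt-protected site will do):

```shell
# Rebuild /etc/ssl/certs from the installed .crt files, no prompts
sudo update-ca-certificates

# If the chain is now trusted, this succeeds instead of failing with error 60
curl -sSI https://letsencrypt.org/ > /dev/null && echo "certs OK"
```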

Mirroring with wget

Speaking of wget...

wget --mirror --page-requisites --convert-links --adjust-extension --compression=auto --reject-regex "/search|/rss" --no-if-modified-since --no-check-certificate --no-parent https://example.com/some/dir/

Important note: If using --no-parent, the URL needs the trailing slash and has to be a directory; otherwise wget just kind of ignores that option and downloads the whole dingdang site anyway.

Find first file in each subdirectory?

find . -type d -exec sh -c 'find "{}" -maxdepth 2 -type f | grep ogg$ | sort | head -n 1' ";"

The grep takes care of filtering to Ogg files, and the $ at the end of the grep input means "at the end of the string". The duplicates turn out to have a cause: the outer find hands every directory in the tree (artists and albums alike) to the inner find, so any file sitting within -maxdepth levels of more than one of those directories gets reported once per directory. Run from the main Music\Library directory, -maxdepth 2 makes each file visible from both its artist and its album directory — hence the dupes. Within an artist directory, -maxdepth 1 means each file is only visible from its own album directory, which is why that variant is dupe-free while -maxdepth 2 brings the dupes back. We can always de-dupe anyway.
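A tighter variant that sidesteps the duplicates entirely — run the inner find with -maxdepth 1 so each file is only ever seen from its own parent directory (the demo tree is a made-up stand-in for Music/Library):

```shell
# Demo tree standing in for Music/Library/Artist/Album
lib=$(mktemp -d)
mkdir -p "$lib/Artist/Album1" "$lib/Artist/Album2"
touch "$lib/Artist/Album1/01 - a.ogg" "$lib/Artist/Album1/02 - b.ogg" \
      "$lib/Artist/Album2/01 - c.ogg"

# First .ogg in each directory that directly contains one -- no dupes,
# because -maxdepth 1 means a file is only visible from its own parent
find "$lib" -type d | sort | while read -r d; do
  find "$d" -maxdepth 1 -type f -name '*.ogg' | sort | head -n 1
done
```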

Unique entries from text file

The method of gathering a list of Ogg files (one from each directory containing such) results in a lot of dupes, as noted above. Here's a quick fix for that:

sort -f -u sourcefile.txt > outfile.txt

The -f folds case when comparing, so with -u, entries that differ only in capitalization ("dada" vs. "Dada") collapse to a single line — which helps in an artist-based sorting operation.
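A tiny demonstration of the case-folding dedupe (made-up names):

```shell
list=$(mktemp)
printf '%s\n' 'dada' 'DaDa' 'Dada' 'Brian Eno' > "$list"

# -f folds case for the comparison, so -u treats all three dada variants
# as duplicates and keeps just one of them
sort -f -u "$list"
```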

Ogg info - Vendor without the "Vendor: " part.

ogginfo "path/to/file.ogg" | grep Vend | sed -En "s/Vendor: //p"

Seems to get what I need.
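The same extraction can be tried against a canned ogginfo-style line, no actual .ogg needed; this tightened sed also strips the leading whitespace that ogginfo indents the Vendor line with (the vendor string here is just a typical example):

```shell
# Fake ogginfo output line, tab-indented like the real thing
line="$(printf '\tVendor: Xiph.Org libVorbis I 20020717')"

# Print only the vendor string, dropping the label and leading whitespace
printf '%s\n' "$line" | sed -En 's/^[[:space:]]*Vendor: (.*)$/\1/p'
```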

Ogg Vendors from File List

Putting it all together:

while IFS= read -r filepath; do ogginfo "$filepath" | grep Vend | sed -En "s/Vendor: //p" ; done < /path/to/filelist.txt > /path/to/outfile.txt

TODO - Maybe find a way to output the filepath and the Vendor data with a comma between as a CSV?
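One way to knock out that TODO — print filepath,vendor pairs. A sketch with a stub function standing in for ogginfo (so it runs anywhere; in real use, delete the stub and let the actual ogginfo do the work). Caveat: a vendor string containing a comma would need quoting for strict CSV.

```shell
# Stub for illustration only: real ogginfo prints a tab-indented "Vendor:" line
# among its other output
ogginfo() { printf 'Processing file "%s"...\n\tVendor: Xiph.Org libVorbis I 20020717\n' "$1"; }

filelist=$(mktemp)
printf '%s\n' '/music/a.ogg' '/music/b.ogg' > "$filelist"

# One CSV row per file: path, comma, vendor string
while IFS= read -r filepath; do
  vendor=$(ogginfo "$filepath" | sed -En 's/^[[:space:]]*Vendor: (.*)$/\1/p')
  printf '%s,%s\n' "$filepath" "$vendor"
done < "$filelist"
```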

Linux commands on Windows text files

Problem seems to be the carriage return characters (Windows CRLF line endings) resulting in "No such file or directory". Sigh.

while IFS=$'\r' read -r -u 3 filepath; do ogginfo "$filepath" | grep Vend | sed -En "s/Vendor: //p" ; done 3< /mnt/e/Temp/FileList.txt


The -u 3 part (with the corresponding 3< at the end) reads the list on file descriptor 3 to avoid stdin conflicts. It's the IFS=$'\r' that really does the trick, though: it makes read treat the carriage return as a field separator, so the stray \r gets stripped off the end of each filename.
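A stripped-down demo of what the IFS trick buys — a fake CRLF file, with bracketed output to expose any stray carriage returns (the names are arbitrary):

```shell
# A CRLF-terminated list, like a text file saved on Windows
list=$(mktemp)
printf 'alpha\r\nbeta\r\n' > "$list"

# IFS=$'\r' strips the trailing carriage return from each line;
# the brackets would show up broken if any \r survived
while IFS=$'\r' read -r -u 3 name; do
  printf '[%s]\n' "$name"
done 3< "$list"
```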

Page last modified on January 03, 2023, at 06:20 PM