(how did my Hugo public folder get to be a mess?)
I'm thinking changing folder names while hugo serve -D is running isn't such a good idea. I somehow ended up with a public folder that was a mess, with multiple folders that were clearly iterations of names I was workshopping. If I had thought that through, of course it would do that: each rename got built into public/, and nothing ever removed the old output.
And since I never looked, my rsync script was uploading all of that stale output, and it would never delete any of it from the server.
I did two things to fix it:
1 - Cleaned the public folder
hugo --cleanDestinationDir
This rebuilds the site and removes anything in the destination directory (public/ by default) that isn't part of the current build.
2 - Updated my upload script
The original line excluded .DS_Store but nothing else, and it never deleted anything, because what if I wanted to put non-Hugo items up there?
rsync -zaP --exclude='.DS_Store' "$SOURCE" "$DESTINATION"
The new one, shown in context below, deletes everything on the server that isn't in the local public folder, except for patterns listed in an exceptions file. If the destination is short on space, the slower --delete-before might be a better fit, since it frees space before the transfer starts instead of during it.
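That variant would only change the delete-timing flag (a sketch, using the same variables as the script below):
rsync -zaP --delete-before --exclude='.DS_Store' --exclude-from='exclude-rsync.txt' "$SOURCE" "$DESTINATION"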
References:
- rsync utility home: https://rsync.samba.org/
- rsync man page (man rsync)
- https://rsync.samba.org/examples.html
- https://rsync.samba.org/resources.html
SOURCE="$HUGO_ROOT/public/"   # trailing slash: sync the contents of public/, not the folder itself
DESTINATION="$USER@$HOST:/home/$USER/$SITENAME"
#echo "#rsync -zaP $SOURCE $DESTINATION"
# z: compress
# a: archive mode (recursive; preserves permissions, times, symlinks, owner/group)
# P: show progress, keep partial files (I watch every time)
rsync -zaP --delete --exclude='.DS_Store' --exclude-from='exclude-rsync.txt' "$SOURCE" "$DESTINATION"
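The first time you run anything with --delete, a dry run is cheap insurance; -n (--dry-run) reports what would be transferred or deleted without changing anything:
rsync -zaPn --delete --exclude='.DS_Store' --exclude-from='exclude-rsync.txt' "$SOURCE" "$DESTINATION"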
Example exceptions file (exclude-rsync.txt):
The difference between dir/* and dir is that the first matches the directory's contents and the latter the directory itself as well. Because these patterns are excluded, --delete also leaves matching files on the server alone (unless you add --delete-excluded).
some_special_file.txt
pre*
dir1/*
dir
A real one might have things like:
favi*
robots.txt
.well-known
Note: some things you might be tempted to add here can be handled by Hugo instead of this file (see the sketch after this list):
- via the static folder - e.g. favicon
- or configuration - e.g. robots.txt (poor robots.txt)
- 404 - via a layouts/404.html template
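A minimal sketch of those Hugo-side alternatives; the file names under static/ are just examples:

static/
  favicon.ico      # everything in static/ is copied into public/ verbatim on each build
  .well-known/     # so it survives --cleanDestinationDir and --delete
layouts/
  404.html         # Hugo renders this to public/404.html

# hugo.toml
enableRobotsTXT = true   # Hugo generates robots.txt from its template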
Running the script
I run this script by hand, with its variables loaded from a .env file, but NOT THE SERVER PASSWORD. If typing the password every time isn't an option, use an SSH key.
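The .env is just shell variable assignments matching the names used above (these values are placeholders, not my real ones):

# .env - sourced by the upload script; deliberately no password in here
HUGO_ROOT="$HOME/sites/mysite"
HOST="example.com"
SITENAME="mysite"
# USER is normally set by the shell already; override it here only if
# the server account name differs from the local one

and near the top of the script, something like:

source ./.env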