It is in the Main Web simply so we do not have to put Main. in front of every WikiName.
Just protected TWiki.cfg from web browsing, as recommended by the TWiki core team. This required changing Apache's httpd.conf for the gnhlug.org site.
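A sketch of the kind of Apache directive that blocks direct web access to TWiki.cfg (the container and syntax shown are assumptions, not a copy of the actual httpd.conf change; older Apache 2.2 installs would use `Order deny,allow` / `Deny from all` instead of `Require`):

```apache
# Deny web access to the TWiki configuration file.
# Apache 2.4 syntax; adjust to the actual install.
<Files "TWiki.cfg">
    Require all denied
</Files>
```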
I changed templates/mailnotify.tmpl to not have HTML in it. This substantially reduces the content of the message, but I think that's what people want anyway.
Upgrade TWiki from release 3.x dated 2004 September 01 (Cairo?) to 6.1 (2018-07-16). Also move to a new server: was justice, now petra. First significant upgrade to our TWiki software in almost two decades. (Not something to be proud of.)
This was a clean install. The GNHLUG and Org Webs had their content migrated over (data and pub), but other Webs were created anew. The old install had a ton of spam accounts, mostly neutered, but still cluttering things up. And I judged it easier/better to start fresh than try to figure out what should be kept in webs like Main and TWiki. This does mean users have to create new accounts again, but that is very easy to do, and if one uses the same name, all the history links even keep working.
%DATA% will use logrotate to rotate daily.
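A daily rotation of that sort might look roughly like the stanza below; the log path is a hypothetical stand-in for whatever %DATA% expands to, and the retention and compression settings are assumptions:

```
# Hypothetical logrotate stanza for the TWiki data logs.
/var/lib/twiki/data/log*.txt {
    daily
    rotate 14
    compress
    missingok
    notifempty
}
```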
I set up Apache to require authentication for the register script. This was
(intended to be) done outside of TWiki, as a separate process, with Apache
using HTTP Basic authentication to get credentials. It worked fine on the
Apache side, but TWiki was picking up the HTTP Basic Auth username "register"
from Apache via the REMOTE_USER environment variable, and treating that as
a TWiki user (despite the lack of a user topic page). I found others with
similar problems at
https://twiki.org/cgi-bin/view/Codev/UnregisteredUsersShouldBeTWikiGuests
but no built-in way to handle it. My fix was to modify lib/TWiki.pm
and insert at the top
$ENV{REMOTE_USER} = undef;
so that REMOTE_USER is always removed from the environment for every TWiki
script. This seems to have worked.
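For reference, the Apache side of the registration lockdown looked roughly like this; the location, realm name, and password file path are assumptions, not the actual configuration:

```apache
# Require HTTP Basic authentication for the TWiki register script only.
# Paths and realm are hypothetical placeholders.
<Location "/twiki/bin/register">
    AuthType Basic
    AuthName "TWiki registration"
    AuthUserFile /etc/apache2/twiki-register.htpasswd
    Require valid-user
</Location>
```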
disabled all plugins as part of debugging OOM
2023-12-17 11:28:15 enabled Apache 2 module suexec
2023-12-17 14:20:13 added some robots to Apache blacklist
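A user-agent blacklist of this sort can be sketched with the standard SetEnvIfNoCase pattern; the agent names below are illustrative placeholders, not the robots actually blocked:

```apache
# Flag misbehaving crawlers by User-Agent, then refuse them.
SetEnvIfNoCase User-Agent "BadBot" bad_bot
SetEnvIfNoCase User-Agent "GreedyCrawler" bad_bot
<Directory "/var/www">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Directory>
```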
Still getting OOM crashes. It looks like a single random IP address in the AWS cloud is spidering the entire site as fast as it can, so we get tons of Perl view processes running, sucking up all the RAM. Tried adding MemoryHigh=70% and MemoryMax=90% to /etc/systemd/system/apache2.service, but all that is going to do is make sure it's Apache that crashes and not something else. What we really need is a way to limit the running CGI process count that doesn't break everything instantly.
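The systemd change above amounts to something like the excerpt below (shown as a sketch; whether the directives went into the unit file directly or a drop-in override is a detail of the actual setup):

```ini
# /etc/systemd/system/apache2.service (excerpt)
# Cap Apache's memory so it gets throttled/killed before the rest of the box.
[Service]
MemoryHigh=70%
MemoryMax=90%
```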