I discovered the WikiUniverse just a few days ago and installed UseModWiki (0.88, I believe) on my Linux box at home. Great forum. However, it seems that if I have "$UseCache=1" and some user has "Display 'Random Page' link" set in his preferences, then every page he has visited will show a "Random Page" link for all subsequent users, no matter what they have set in their prefs. Is this a bug or a feature? --KlausAlexanderSeistrup
Yes. It is a bug or a feature. I haven't decided which yet. The behavior occurs because the cached copy is saved when the first reader views the page after an edit. The cache feature (obviously) has some difficult interactions with individual user features. Since I'm planning some more user-customization in the future, I'll have to make a decision. The only ways to be consistent are either to skip the cache entirely for users with non-default display preferences (regenerating every page they view), or to ignore per-user display preferences when serving cached pages.
I tend to prefer the cache-skipping option, but other sites might not want the extra CPU work of generating every page viewed by one of these users. Any other opinions? --CliffordAdams
How about caching only the main page contents without the common header and footer? This should eliminate the most CPU-intensive part of rendering the page while still allowing for (some) user customisations. --AnonymousCoward
For a really quick/dirty fix, you could go into the code in the GetGotoBar subroutine, and change the line reading:
if (&GetParam("linkrandom", 0)) {
...to:
if (1) {
...if you want the Random link on every page. (Change the 1 to a 0 if you don't want the link.) Then remove the html subdirectory in the wiki db-directory. --CliffordAdams
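For those who prefer a non-interactive edit, the same change can be made with sed. The snippet below is only a sketch: it applies the substitution to a scratch copy of the line under /tmp (the file name and path are examples, not part of UseModWiki), so you can see the effect before touching your real wiki.cgi.

```shell
# Demo of the quick fix as a sed one-liner, run against a scratch copy.
# The /tmp file stands in for wiki.cgi; adjust paths for a real install.
printf 'if (&GetParam("linkrandom", 0)) {\n' > /tmp/wiki-cgi-demo
sed -i.bak 's/&GetParam("linkrandom", 0)/1/' /tmp/wiki-cgi-demo
cat /tmp/wiki-cgi-demo   # prints: if (1) {
```

As with the hand edit, remember to remove the html subdirectory in the wiki db-directory afterwards so stale cached pages are regenerated.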
Export as static HTML
Is there any way to export pages (maybe starting at a specific page) as a set of HTML pages? I notice that there are caching options in the script already, so I assume much of the code is already pretty close (I have not looked in detail). This could be really useful for FAQ-type situations, where the on-line version is updated regularly, but every now and again a snapshot could be taken (for example, for sending around on a CD, etc.).
(I have been reading the existing discussions about export and import (eg via XML, etc) on UseMod and MeatBall, but these appear to centre around the exchange of the Wiki database and pages, rather than what I describe above ...)
I'm also interested in the ability to do static pages. I have a few web sites in mind that could be totally implemented using Wiki, but I want search engine web crawlers to come and be happy. Usually, they do not follow cgi-bin links so the crawlers won't be crawling the Wiki. In general, I'm interested in anything that makes UseMod Wiki more interesting to the crawlers. -- RayCote
#!/bin/sh
# Quick-and-dirty mirror of the UseMod HTML cache as static pages.
# Run from an empty mirror directory; $1 is the wiki's html cache dir.
rm -rf *.htm *.bak
cp $1/*/* .
# Strip the CGI Content-Type header left at the top of each cached page.
perl -p -i.bak -e 's/Content-Type: text\/html; charset=ISO-8859-1//' *
# Rewrite links of the form href="wiki.cgi?PageName" to href="PageName.htm".
perl -p -i.bak -e 's/href=\"[\w-\.]*\?([\w_]*)\"/href=\"$1\.htm\"/g' *
#rm -rf *.bak
wiki-mirror.sh /path/to/usemod/cache
Config for serving public static and private dynamic pages together
I found I could serve static pages alongside the proper editable wiki pages by doing the following:
Make a new copy of wiki.cgi; I call it static.cgi.
Edit static.cgi to use a new config file ($ConfigFile = "$DataDir/config_static";).
Make a new copy of your config file as $DataDir/config_static. Edit the new config_static file to serve uneditable cached pages as an embedded wiki ($UseCache = 1; $EditAllowed = 0; $EmbedWiki = 1; ... I also set an Admin and Edit password because I was paranoid). $SiteBase and $FullUrl should be changed to reflect the new static.cgi location.
Make a new cached-page directory ($DataDir/html_static), and change $HtmlDir = "$DataDir/html_static"; to reflect this.
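Taken together, the overrides in config_static might look like the following sketch (the URLs are placeholders, and only the lines that differ from the ordinary config are shown):

```perl
# config_static -- overrides for the read-only copy (example values)
$UseCache    = 1;                                  # serve cached pages
$EditAllowed = 0;                                  # no editing through static.cgi
$EmbedWiki   = 1;                                  # minimal header/footer
$SiteBase    = "http://www.example.org/";          # placeholder URL
$FullUrl     = "http://www.example.org/cgi-bin/static.cgi";  # placeholder URL
$HtmlDir     = "$DataDir/html_static";             # separate cache directory
```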
I secure the editable wiki.cgi version with .htaccess/.htpasswd files, and allow the static.cgi version to be world-accessible. It seems to work (no in-depth 'audit' has been done) ... any comments on potential security flaws or other problems would be appreciated.
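For reference, the .htaccess side of this setup might look like the sketch below, assuming Apache with basic authentication (the AuthUserFile path and realm name are placeholders):

```apache
# Require a password only for the editable script; static.cgi stays public.
<Files "wiki.cgi">
    AuthType Basic
    AuthName "Wiki editing"
    AuthUserFile /path/to/.htpasswd
    Require valid-user
</Files>
```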