How do I run DokuWiki on a server with limited disk space?
To be more precise:
How can I limit the disk space consumed by the cache, index, meta, etc. directories?
From what I can tell as a rookie, the disk usage grows continuously, even if the wiki is only used "read-only".
But what if disk space is limited, so that 100 MB cannot be spared just for such files?
So how can DokuWiki run for months or years on a server
without having to delete the contents of the cache, index and meta directories regularly?
1)
Let's assume the easiest case: "no data is added or changed". This holds if the wiki is run as a personal homepage by someone who does not update it regularly (or who can delete the old revisions of pages and images after editing).
2)
Of course, if the data changes heavily, old page revisions and old images accumulate.
So what then?
If DokuWiki runs on a "managed server", there may be no way to install a CGI/Perl script as a cron job that regularly deletes files from those directories...
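For hosts that do allow cron jobs, here is a sketch of what such a cleanup job could look like (the path is hypothetical; `find` deletes cache files untouched for more than 7 days, and DokuWiki simply regenerates them on demand):

```
# hypothetical crontab entry -- adjust the path to your installation
15 3 * * * find /path/to/dokuwiki/data/cache -type f -mtime +7 -delete
```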
Any solutions ?
Or am I seeing a problem where there is none?
I am going to run DokuWiki on A_PLACE_YOU_SHOULD_NOT_MENTION_HERE, so my space is limited (100 MB). I don't have that much disk space to spare for garbage like cache files... I want almost all of the 100 MB for my data, not for the cache.
The only solution I have found so far is
http://www.dokuwiki.org/caching
"To prevent a page from ever being cached, use the NOCACHE tag anywhere in the document."
Hmm... do I really have to put that in EVERY document? Can't this be done once in a configuration file?
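For reference, the NOCACHE instruction from the caching page is just a tag in the page source, e.g. (the page title and text are made up):

```
~~NOCACHE~~
====== Some Page ======
This page is re-rendered on every request instead of being served from cache.
```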
I also read
http://www.dokuwiki.org/devel:caching
and came across the configuration option
cachetime
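If I understand the docs correctly, cachetime can be set in conf/local.php; a minimal sketch, assuming a standard install and that -1 marks every cache file as expired (which the docs describe as disabling caching):

```php
<?php
// conf/local.php
// cachetime = maximum age of a cache file in seconds.
// Setting it to -1 disables caching wiki-wide, at the cost of
// re-rendering every page on every request.
$conf['cachetime'] = -1;
```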
So what is the best strategy ?
Sincerely
Rolf