Can anyone tell me how many pages DokuWiki can contain without noticeable performance loss, and how a large number of pages will affect search?
My wiki runs on a rather slow Synology Disk Station DS115j. It holds about 3500 pages of text and about the same number of images.
Working with the text pages is not noticeably slower than when the wiki was nearly empty. Managing media (adding images to the collection) is definitely sluggish, but displaying a small number of images at a time is crisp.
Searching is really fast. The Wiki maintains an index.
I moved the whole wiki to a Banana Pi (similar to a Raspberry Pi) for testing. It was much faster than the Synology, presumably on account of the solid-state storage used by the Pi.
It works, though at times you need patience (very busy pages with huge edits). With 20,000+ pages (most single pages are books of several hundred pages each), possibly up to some 100,000, the "big" problems seem to be RAM usage during indexing and the disk space taken by the history that is saved with every edit. Normal use and search, even quicksearch, work fine.
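Since the history saved with every edit was named as the main space eater, here is a minimal sketch for checking where the disk space actually goes. It assumes a default DokuWiki install, where current page text lives under `data/pages`, old revisions under `data/attic`, uploads under `data/media`, and the search index under `data/index`; adjust the base path for your own setup.

```python
import os

def dir_size(path):
    """Total size in bytes of all regular files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

if __name__ == "__main__":
    # Default DokuWiki data layout (assumption, not verified for your install):
    # pages = current text, attic = old revisions, media = uploads, index = search index.
    base = "data"
    for sub in ("pages", "attic", "media", "index"):
        path = os.path.join(base, sub)
        if os.path.isdir(path):
            print(f"{sub}: {dir_size(path) / (1024 * 1024):.1f} MiB")
```

If `attic` dwarfs `pages`, the revision history is indeed your space issue, and DokuWiki's own cleanup recommendations for old revisions are the place to look next.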
Note that these are the comments of an IT-ignorant and unskilled user. But all in all I am pretty fascinated by what is possible with "simple" text files, which are so easy to handle.