Read https://www.dokuwiki.org/config:indexdelay - since I'm the only one editing pages in one of my wikis, I have set this value to 0, so every change is indexed immediately.
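In DokuWiki this option goes into conf/local.php. A minimal sketch, assuming a standard install (the comment describes the intent, check the linked docs for the exact semantics):

```php
<?php
// conf/local.php
// indexdelay: how long (in seconds) a page must be unchanged before
// it is added to the search index. 0 = index immediately.
$conf['indexdelay'] = 0;
```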
For me, the robots.txt is necessary and helpful: to exclude discussion pages, for example, and I also exclude all of my images with a Disallow rule.
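Such rules might look like the following. The paths here are only examples (I don't know the original poster's actual namespaces); adjust them to where your discussion pages and images actually live:

```
User-agent: *
Disallow: /discussion/
Disallow: /images/
```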
After this comes a section (commented "# Allow Google Adsense" in my file) that overrides the disallow, so the AdSense crawler can still read the pages and match ads to their content.
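Google documents this pattern for AdSense: its ad crawler uses the Mediapartners-Google user agent, and a more specific per-agent record takes precedence over the generic `User-agent: *` rules. A typical version looks like this:

```
# Allow Google Adsense
User-agent: Mediapartners-Google
Allow: /
```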
Since my images are in a subfolder of the root domain, my robots.txt sits in the root folder. A robots.txt in a subfolder has no effect: crawlers only ever request /robots.txt from the root of the host.
Google provides pages (on its support and developer sites; you can google for them) where you can test the exact effect of your robots.txt against specific URLs. This is the best approach, because then you can be sure what is actually happening.
Google's own robots.txt testing tool doesn't work at the moment. As an alternative, https://technicalseo.com/tools/robots-txt/ is very easy to use.