andi wrote:
"A Google sitemap is meant to help Google's spider index your site. Google's spider is an anonymous user and thus needs anonymous read access to the pages. It makes no sense to build a sitemap of read-restricted pages."
Well, I've got somewhat of a problem with my sitemap.
If I delete the almost empty sitemap.xml.gz, replace it with an empty file, and run the indexer (with debug), I get this:
runIndexer(): started
metaUpdate(): started
runSitemapper(): started
runSitemapper(): using sitemap.xml.gz
runSitemapper(): creating sitemap using 0 pages
runSitemapper(): pinging google
runSitemapper(): finished
The sitemap then contains only this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
</urlset>
Just to be sure: is the wiki capable of distinguishing between pages that are readable by everyone (in my ACL: * @ALL 1) and the pages I have restricted (in the ACL: category:* @ALL 0)? I mean, parts of my wiki are not public, but I'd nonetheless like a sitemap of all the parts that are accessible to everyone (nearly 4000 pages, I'd say...).
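To illustrate what I'd expect the sitemapper to do, here is a rough sketch in Python, not the wiki's actual code: only pages whose most specific matching ACL rule grants @ALL at least read access should end up in the urlset. The names (acl_allows_all, build_sitemap) and the glob-matching logic are my own assumptions for illustration.

```python
from xml.sax.saxutils import escape

def acl_allows_all(page, acls):
    # acls is a list of (pattern, level) rules, e.g. [('*', 1), ('category:*', 0)].
    # Assumption: the most specific (longest) matching pattern wins,
    # and a level > 0 means @ALL may read the page.
    best = None
    for pattern, level in acls:
        if pattern == '*' or page.startswith(pattern.rstrip('*')):
            if best is None or len(pattern) > len(best[0]):
                best = (pattern, level)
    return best is not None and best[1] > 0

def build_sitemap(pages, acls, base_url):
    # Emit only the pages an anonymous user is allowed to read.
    visible = [p for p in pages if acl_allows_all(p, acls)]
    entries = "\n".join(
        "  <url><loc>%s%s</loc></url>" % (base_url, escape(p)) for p in visible
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
        + entries + "\n</urlset>"
    )
```

With my two ACL rules, a public page like "start" would be listed while anything under "category:" would be skipped, so the sitemap should come out with roughly 4000 entries rather than zero.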