Michitux wrote
I don't really understand why we are breaking the specification. As far as I know, it is completely valid to provide just a single sitemap file, and I don't see how we are breaking the robots.txt sitemap syntax, since it should be easily possible to reference the sitemap that DokuWiki generates from a robots.txt file.
Not breaking it. We are missing parts of it.
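For illustration, the robots.txt `Sitemap:` directive takes an absolute URL, so pointing crawlers at DokuWiki's generated sitemap is a one-liner. A sketch (the filename and the `Disallow` line are assumptions; DokuWiki writes e.g. `sitemap.xml.gz` when sitemap compression is enabled, and the exact name depends on your settings):

```
# robots.txt at the site root
# sitemap filename is an example -- check your wiki's sitemap settings
User-agent: *
Disallow: /lib/
Sitemap: https://example.org/sitemap.xml.gz
```

So the spec itself poses no obstacle here; the question is only whether our generated robots.txt covers this part of it.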
Michitux wrote
The hidepages option is for hiding these pages, this means that it is intended for hiding these pages from all listings and in my opinion it is a contradiction to the intended usage of the hidepages option to provide a listing of these pages.
Again, guessing at the intention is secondary to exploiting existing and established features. IMO the latter can be considered a publicised contract.
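For context, hidepages is a regular expression matched against page IDs to exclude pages from listings. A sketch of how it is typically set in conf/local.php (the pattern below is purely an example, not a recommendation):

```php
<?php
// conf/local.php -- example pattern only:
// hide the sidebar page and everything in the internal: namespace
$conf['hidepages'] = '^(sidebar$|internal:)';
```

Users who have built workflows on top of exactly this behaviour are relying on that contract, whatever the original intention was.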
Michitux wrote
The feed and the sitemap have nothing in common and don't share any code. While the feed just lists a certain number of pages, respects ACLs, and is only generated on demand, the sitemap (if enabled) is generated automatically, uses a different (and, for collecting all pages, more efficient) mechanism, and only includes pages that can be read by anonymous users.
IMO both features differ only in how they present a larger superset of functionality. As you said, both provide lists (that is the common denominator) for certain actors in appropriate formats (a presentation issue). In terms of behavioural classification they share the same ground.
I suppose you've given me enough background to provide a little something in the future.
Thanks