Hey,
I have an online DokuWiki (with htaccess-restricted access) for my server, where I keep everything I need to manage it (sometimes even trivial passwords). So it's full of critical data.
Sometimes I have to go offline to do updates and such, but that's exactly when I write into the DokuWiki heavily: what I updated and how, which errors came up, how I fixed them, etc., for the next update 😉
So I copy the whole DokuWiki to a local instance and write there.
Fine. But once I'm back online, I want to continue writing online, so I copy everything from the local instance back to my online DokuWiki. And this is a bit tedious; I copy these folders back:
data/pages - contains the current pages
data/attic - old revisions of the pages
data/meta - metadata about the pages (who created the page, …)
data/media - contains the media files (images, PDFs, …)
data/media_attic - old media revisions
data/media_meta - metadata for the media files
as described in FAQ - Backup, and it takes a while. I also found the sync plugin, but it doesn't seem too secure, and with the htaccess restriction in place I guess it won't work anyway. I didn't find a way to tunnel it through SSH or the like, and in the end it would be more work than just copying via (S)FTP(S).
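For context, this is roughly what that copy step looks like as a script. This is only a minimal sketch of what I do today: it assumes the online wiki is reachable as a mounted path (e.g. via sshfs), and both paths are placeholders I made up.

```python
#!/usr/bin/env python3
# Sketch of the bulk copy-back, assuming the online wiki's data
# directory is mounted locally (e.g. via sshfs). Paths are placeholders.
import shutil
from pathlib import Path

LOCAL = Path("/var/www/dokuwiki-local/data")   # hypothetical local instance
REMOTE = Path("/mnt/server/dokuwiki/data")     # hypothetical sshfs mount

FOLDERS = ["pages", "attic", "meta", "media", "media_attic", "media_meta"]

for name in FOLDERS:
    src = LOCAL / name
    dst = REMOTE / name
    # copytree with dirs_exist_ok=True (Python 3.8+) merges into the
    # existing remote folder; files with the same name are overwritten.
    shutil.copytree(src, dst, dirs_exist_ok=True)
    print(f"copied {src} -> {dst}")
```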
Is there maybe a trick to copy only the most recently changed files, to keep the two DokuWikis identical?
I already sort the files by date before copying, but it's six folders, not just one button 😉
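To make it concrete, something like this sketch is what I have in mind: copy only files modified since the last run. The `.last_sync` marker file and the paths are just assumptions of mine, not anything DokuWiki provides.

```python
#!/usr/bin/env python3
# Sketch: copy only files changed since the last sync, remembered via
# a marker file's mtime. Paths and the marker file are hypothetical.
import shutil
from pathlib import Path

LOCAL = Path("/var/www/dokuwiki-local/data")
REMOTE = Path("/mnt/server/dokuwiki/data")
STAMP = LOCAL / ".last_sync"                   # hypothetical marker file

since = STAMP.stat().st_mtime if STAMP.exists() else 0.0

for name in ["pages", "attic", "meta", "media", "media_attic", "media_meta"]:
    for src in (LOCAL / name).rglob("*"):
        if src.is_file() and src.stat().st_mtime > since:
            dst = REMOTE / src.relative_to(LOCAL)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)             # copy2 preserves the mtime
            print(f"synced {src.relative_to(LOCAL)}")

STAMP.touch()  # remember this run for next time
```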
Thanks for any hints.
frank