Forum: General Help and Support · Server Setup
DokuWiki tries to load over 400 non-existent files or directories, results in performance hit
NickD #1
Member since Feb 2019 · 4 posts
Group memberships: Members
Hi,

I'm using apache2 on one system, that then accesses Dokuwiki over a "network drive" on another system.

While both the systems in use have low technical specifications, I've assumed DokuWiki is light enough that it doesn't need much resource throwing at it for such a small use case. This is for a "family wiki": maybe thirty or so pages, and just the standard plugins.

However it can take 10 to 15 seconds to load the wiki in the browser the first time, and maybe 5 to 10 seconds on each subsequent load. Looking into this, on every page load DokuWiki makes over 400 requests for files or directories that don't exist. I've listed the files and directories it looks for most often, with the number of times each is searched for. Is this standard behaviour? Does it point to a typical configuration error?

24 auth.php
24 auth
22 action.php
22 action
19 speech.css
18 speech.less
18 screen.less
18 screen.css
18 print.less
18 print.css
18 all.css
17 all.less
16 style.less
13 style.css
08 .htaccess
turnermm (Moderator) #2
Member since Oct 2009 · 4643 posts · Location: Canada
Group memberships: Global Moderators, Members, Super Mods
Where are you getting these counts from?
Myron Turner
github: https://github.com/turnermm
plugins, templates: http://www.mturner.org/devel
NickD #3
Member since Feb 2019 · 4 posts
Group memberships: Members
Quote by turnermm:
Where are you getting these counts from?

Essentially "strace -p <apache2> | grep "No such file or directory" | cut and awk stuff just to list the file or directory accessed | basename | sort | uniq", but slightly more complicated than that ;)
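In case anyone wants to reproduce the counts, here is a rough sketch of such a pipeline. The trace file path and the `awk` field splitting are illustrative assumptions (the exact output format depends on your strace version), not the exact commands I ran:

```shell
# First capture file-related syscalls from one Apache worker, e.g.:
#   strace -f -e trace=file -o /tmp/dw.trace -p <apache2-pid>
# Then count the missing files/directories by basename:
grep 'No such file or directory' /tmp/dw.trace \
  | awk -F '"' '{ print $2 }' \
  | xargs -r -n1 basename \
  | sort | uniq -c | sort -rn
```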
turnermm (Moderator) #4
Member since Oct 2009 · 4643 posts · Location: Canada
Group memberships: Global Moderators, Members, Super Mods
I'm not familiar enough with strace to know how you got your result. But all the files you list are DokuWiki files and are required for the running of the wiki. They are not "non-existent" files.
andi (Administrator) #5
User title: splitbrain
Member since May 2006 · 3432 posts · Location: Berlin Germany
Group memberships: Administrators, Members
In reply to post #1
Quote by NickD:
I'm using apache2 on one system, that then accesses Dokuwiki over a "network drive" on another system.

Don't. DokuWiki makes use of the filesystem for everything and assumes that file accesses are reasonably fast and are cached by the underlying OS. Running DokuWiki off a networked filesystem will give you a big performance hit.
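If you want to see the difference concretely, a quick-and-dirty timing sketch along these lines can help. The paths and round count are illustrative; substitute a file on local disk and one on your network share:

```shell
# Time N stat() calls against a path. Even a "No such file" result costs
# a full round-trip on a remote filesystem, while local lookups are cached.
bench_stat() {
  path=$1; rounds=${2:-200}; i=0
  start=$(date +%s%N)
  while [ "$i" -lt "$rounds" ]; do
    stat "$path" >/dev/null 2>&1
    i=$((i + 1))
  done
  end=$(date +%s%N)
  echo $(( (end - start) / 1000000 ))  # elapsed milliseconds
}
# Compare e.g.:
#   bench_stat /etc/hostname                      # local disk
#   bench_stat /var/lib/dokuwiki/conf/local.php   # networked share
```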
Read this if you don't get any useful answers.
schplurtz (Moderator) #6
Member since Nov 2009 · 439 posts · Location: France, Finistère
Group memberships: Global Moderators, Members
In reply to post #4
Quote by turnermm on 2019-02-10, 21:16:
I'm not a familiar enough with strace to know how you get your result.  But all the files you list are dokuwiki files and are required for the running of the wiki.  They are not "non-existent" files.
I was surprised too, and tried strace myself (though I could/should have looked at the code instead). The explanation is that DokuWiki looks for given files in certain places. For example, for each plugin, DW can't know whether the plugin is an action plugin or not, so it tries to access its action.php. When the plugin is not an action plugin, you get a "No such file or directory" error.
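That probing is easy to see from the shell. A small sketch (the example path below is the Debian layout; adjust for your install):

```shell
# List installed plugins that have no action.php component - each of these
# produces one ENOENT probe per page load when DokuWiki checks for it.
missing_action() {
  for d in "$1"/*/; do
    [ -e "${d}action.php" ] || basename "$d"
  done
}
# e.g.: missing_action /var/lib/dokuwiki/lib/plugins
```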
This post was edited on 2019-02-12, 05:42 by schplurtz.
schplurtz (Moderator) #7
Member since Nov 2009 · 439 posts · Location: France, Finistère
Group memberships: Global Moderators, Members
In reply to post #5
Quote by andi on 2019-02-10, 21:57:
DokuWiki makes use of the filesystem for everything and assumes that file accesses are reasonably fast and are cached by the underlying OS.
Even when cached, system calls are time-consuming by nature (well, that's what I've always thought). Wouldn't DW benefit from a software cache here?
NickD #8
Member since Feb 2019 · 4 posts
Group memberships: Members
Firstly, and most importantly, thank you all for the replies in response to my wandering into the forum and starting with "your software doesn't work on my weird setup".

I need to update all of the plugins anyway, and see if there are any optimisations I can make so that the networked file system runs faster. If that makes a huge difference I'll report back.
Michaelsy #9
Member since Jun 2015 · 925 posts · Location: Düsseldorf, Germany
Group memberships: Members
Quote by NickD:
so that the networked file system will run faster
Just out of curiosity, why this system structure?
By Patreon.com a few eurons can be fed into the code phasers of
the DokuWiki engine. Besides, Andi's posts are worth reading.
NickD #10
Member since Feb 2019 · 4 posts
Group memberships: Members
Quote by Michaelsy:
Quote by NickD:
so that the networked file system will run faster
Just out of curiosity, why this system structure?

This is for an "internal" wiki, just so everyone at home can share a few common notes, or lists of procedures on how to fix things.

Overall it's an unusual setup - there's a pair of Debian virtual systems, in an HA configuration, that have /usr/share/dokuwiki and /var/lib/dokuwiki running on a networked "share", which is run using GlusterFS, and is also running on a pair of virtual systems.  So any single virtual system can break, or the host running one of the Apache2 servers and the backend gluster servers can break, and the wiki should just stay up.

Partly this is just to see if it will work - I do realise this is non-standard.

Partly, it's not as though an internal wiki needs this kind of resilience and uptime, but it means various components on the network can die and the wiki will stay up - it can take a while before I have either the money to replace something or the time to rebuild it.
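If you do stay on GlusterFS, its small-file metadata caching options may take some of the sting out of all those stat/open probes. This is a sketch only - the volume name "dwvol" is made up, and you should verify each option name and its defaults against your Gluster release (`gluster volume set help`) before applying anything:

```shell
# Hypothetical tuning for a Gluster volume named "dwvol" - check that each
# option exists in your version before running these.
gluster volume set dwvol performance.stat-prefetch on
gluster volume set dwvol performance.cache-size 256MB
gluster volume set dwvol performance.md-cache-timeout 60
```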
Michaelsy #11
Member since Jun 2015 · 925 posts · Location: Düsseldorf, Germany
Group memberships: Members
Thanks for the reply.

Ok, so you want to achieve greater reliability through physical separation. But for that separation you are definitely using the wrong tool at the moment. Better approaches would be data replication / mirroring, or iSCSI. The latter is a network protocol developed specifically for your use case:

It is a storage area network (SAN) protocol, allowing organizations to consolidate storage into storage arrays while providing clients (such as database and web servers) with the illusion of locally attached SCSI disks.
Source: https://en.wikipedia.org/wiki/ISCSI
This board is powered by the Unclassified NewsBoard software, 20150713-dev, © 2003-2015 by Yves Goergen