Recently I have been spending a lot of time tuning a PHP application. The main finding was: it's not the application logic that slows the server down, but loading the PHP files and running PHP itself. Most PHP scripts load a bunch of further PHP scripts (framework, configuration, etc.), and that makes a script so much slower than serving a plain HTML file.
Based on this finding, I would like to suggest a "hardcopy cache" feature to optimize caching. In most cases (leaving aside some plugins) the page DokuWiki serves to a user is static. So why not store an HTML copy of each (public) page in the wiki? For read-only users this could speed up page delivery by a factor of roughly 100 or more while massively reducing CPU load.
Only a few changes would be necessary for such a feature:
- Check if a page is suitable for caching (e.g. ACL restrictions, dynamic content from plugins)
- Store an HTML copy
- Link to the HTML file instead of doku.php from other pages
- Rewrite all HTML hardcopies whenever a page is created or deleted (because links may change in that case)
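A rough sketch of how these steps might fit together (PHP, sketch only: `is_cacheable()`, `render_page()`, `all_public_pages()` and the `static/` directory are made-up placeholders, not DokuWiki APIs):

```php
<?php
// Sketch only - every function name below is a hypothetical placeholder.

function write_hardcopy($pageId) {
    // Step 1: skip pages that are not suitable for a static copy
    // (ACL-restricted pages, pages using dynamic-content plugins).
    if (!is_cacheable($pageId)) return;

    // Step 2: render the page once and store the full HTML output.
    $html = render_page($pageId);
    file_put_contents(hardcopy_path($pageId), $html);
}

function hardcopy_path($pageId) {
    // Step 3: other pages would link to this file instead of doku.php.
    return DOKU_INC . 'static/' . str_replace(':', '/', $pageId) . '.html';
}

function on_page_created_or_deleted() {
    // Step 4: links may change, so all hardcopies have to be rebuilt.
    foreach (all_public_pages() as $pageId) {
        write_hardcopy($pageId);
    }
}
```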
If you see major or minor objections to such a feature, please leave a comment.
PS: There is no place on the homepage to say this, so let me say it here: I love DokuWiki. It is slim but has all the features my documentation requires - and the built-in ACL support is great!
DokuWiki already uses a cache of the full page content (i.e. the part you enter in wiki syntax). The PHP code that is executed on every page load for every user is basically there to check permissions, to check whether the caches are valid, and to display the template. I don't think it is easily possible to dramatically improve on this without removing a lot of features, e.g. the breadcrumbs that display the recently visited pages, or the possibility for plugins to decide on their own whether the cache should be used for a certain page. Some templates (including the new default template) also include a sidebar that has its own cache and is independent of the page caches. Your suggestion would imply that changes to the sidebar lead to a full rewrite of all HTML copies, and with some thousands of pages that simply doesn't work.
Furthermore, you suggest that URLs for logged-in and logged-out users should be different. I really don't like this idea. The cache systems I know of, e.g. for WordPress, usually check directly in the web server whether a user is logged in and either serve a static HTML file or call PHP accordingly. This is not possible in DokuWiki, however, as external authentication backends can be used that run arbitrary PHP code in order to decide whether a user is logged in. And if the URLs are different, then whenever a page changes from static to dynamic (e.g. because a syntax involving dynamic data is used on the page), all links to the page will break unless the web server is configured to redirect when the cache file is missing.
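For illustration, the WordPress-style web-server check mentioned above usually looks roughly like this in Apache (a sketch only: `DWAUTH` is a made-up placeholder cookie name and `static/` a placeholder directory; as noted, such a cookie check cannot cover external authentication backends):

```apache
# Sketch: serve a static copy only for anonymous GET requests.
RewriteEngine On
# Only if no login cookie is present ("DWAUTH" is a placeholder name)
RewriteCond %{HTTP_COOKIE} !DWAUTH [NC]
RewriteCond %{REQUEST_METHOD} =GET
# Only if a cached HTML file actually exists for this URL;
# otherwise fall through to doku.php as usual.
RewriteCond %{DOCUMENT_ROOT}/static%{REQUEST_URI}.html -f
RewriteRule .* /static%{REQUEST_URI}.html [L]
```

Because the rule falls through to PHP when the file is missing, it avoids the broken-link problem that separate URLs would create.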
I think there is also one point in your list that definitely doesn't work: "Rewrite all HTML hardcopies if a page is created or deleted (because links may change in this case)". dokuwiki.org has some thousand public pages; you can't rewrite all of them whenever a page is created or deleted.
I think that with some modifications (such as not using different URLs for the static HTML files) your suggestion might work, at least with the normal user/password-based authentication, which sets special authentication cookies the web server could look for. But I don't think this is something DokuWiki should include, especially as sidebars in DokuWiki can contain dynamic data, which makes the whole cache system very difficult. This could be implemented as a plugin, though.
Actually, I did not consider page permissions and dynamic content enough. When writing the suggestion, I was thinking of documentation in the old-fashioned sense - something that looks pretty much the same to everybody reading it.
But your arguments are quite convincing - so I retract this idea :)