This is a static dump of issues in the old "Flyspray" bugtracker for DokuWiki. Bugs and feature requests
are now tracked in the issue tracker on GitHub.
Closed
Fixed
FS#2593 js_compress() causes a PHP "Maximum execution time of 60 seconds exceeded" error because of a useless while loop
Backend
2012-09-14 entreloup
Hi,
I use this version: Release 2012-09-10 "Adora Belle RC1" (I did not find it in the select field).
When I installed DokuWiki, it crashed in js_compress() and I was unable to use it.
Deactivating compression worked around the problem (though not during installation, since the configuration cannot be altered before installing), and so did commenting out this function in the code.
I looked closer and made a small code change that solved my problem:
In lib/exe/js.php, line 304, I commented out the second while loop and replaced it with an "else $j++;" after the next test.
In other words, this (sorry for poor formatting):
while( ($s{$i+$j} != '\\') && ($s{$i+$j} != '/') ) {
    $j = $j + 1;
}
if($s{$i+$j} == '\\') $j = $j + 2;
Fewer loops, fewer tests, less time spent, no more error.
I double-checked that the resulting JavaScript is the same (I allowed a bigger timeout for testing).
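For illustration, the logic of the loop after this change can be transcribed into Python (a sketch under the assumption that the scanner is looking for the '/' that closes a JavaScript regex literal; the function name is hypothetical, not from js.php):

```python
def find_regex_end(s: str, i: int) -> int:
    """Return the index of the '/' closing a JS regex literal.

    `i` is the index of the opening '/'. Backslash escapes are
    skipped, so an escaped '/' does not terminate the literal.
    """
    j = 1
    while s[i + j] != '/':
        if s[i + j] == '\\':
            j += 2  # skip the backslash and the escaped character
        else:
            j += 1
    return i + j

print(find_regex_end(r'/a\/b/', 0))  # → 5, the closing '/'
```

The key point of the fix is that each character is examined by a single loop with one escape check, instead of being re-tested by nested loops.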
Regards.
Regis.
2012-09-14 Michitux
I think the code you propose is functionally identical to the old code, so I see no problem with it (the loop around these two lines does the '/' check). I just wonder why this actually improves the speed, as this code should only be executed for regular expressions, and I wouldn't expect there to be that many regular expressions. How much did this increase the speed for you?
2012-09-14 entreloup
Well, I'd like to be able to give you a full mathematical proof, but I can't.
What I can tell you:
- with your code, I had a timeout after 60 seconds (the script only finished once I raised the timeout to 160, but I didn't measure an exact duration)
- with my code, it stopped before the 60 seconds timeout
- while in while in while is pure evil ;p
- more seriously, in each iteration of the main loop you first do one test that is repeated in the inner loop (adding another test), and when you find a "\" you double the test again: since the loops are nested three deep (roughly n³), this could add a few seconds
If you really don't believe it, you can still test it with a microtime() call before and after this loop, with your code and with mine.
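The suggested measurement is just a before/after timestamp around the loop. As a Python sketch of the same idea (the thread discusses PHP's microtime(); the helper name here is illustrative):

```python
import time

def timed(fn, *args):
    """Run fn(*args) and return (result, elapsed wall-clock seconds),
    the before/after measurement suggested above (illustrative helper)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

result, elapsed = timed(sum, range(1_000_000))
print(result)               # → 499999500000
print(f"{elapsed:.6f}s")    # elapsed time varies by machine
```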
Last, I have a question: why do the compression on each page load? Don't you think it would be better to save the compressed version to a cache directory and refresh it only if the cache is missing or the source is newer? On my intranet server (not a very fast one, but still 1 GHz and 1 GB RAM), disabling compression gained a lot of speed.
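The refresh rule proposed here (reuse the cache only if it exists and is at least as new as the source) can be sketched as follows; this is a simplified illustration with hypothetical file names, not DokuWiki's actual caching code:

```python
import os
import tempfile
import time

def cache_is_usable(cache_file: str, source_file: str) -> bool:
    """Reuse the cached compressed file only if it exists and is at
    least as new as the source (the refresh rule proposed above)."""
    return (os.path.exists(cache_file)
            and os.path.getmtime(cache_file) >= os.path.getmtime(source_file))

# Hypothetical demo paths, not DokuWiki's real layout:
d = tempfile.mkdtemp()
src = os.path.join(d, 'script.js')
cache = os.path.join(d, 'script.min.js')
open(src, 'w').write('var x = 1;')
print(cache_is_usable(cache, src))   # → False (no cache yet)
open(cache, 'w').write('var x=1;')
os.utime(cache, (time.time() + 1, time.time() + 1))  # bump mtime past src
print(cache_is_usable(cache, src))   # → True
```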
If this seems OK to you, I can write this code; I have already done something similar.
Regis.
2012-09-14 Michitux
The compressed version is saved; it should only be generated once and then loaded from the cache (the http_cached_finish() function saves the cache file, and at the beginning it is checked whether the cache file can be used). If the JS is not cached, something must be wrong.
Any other opinion? If not, I'll change this as suggested in the next few days.
2012-09-14 entreloup
OK, I guess something is wrong then.
In my case, disabling compression is OK, so it's not a real problem.
Regis.
2012-09-18 Michitux
Fixed in e71b260a - feel free to open another bug report for the caching problem.