Fewer loops, fewer tests, less time spent, no more errors.
I think the code you propose is functionally identical to the old code, so I see no problem with it (the loop around these two lines does the '/' check). I just wonder why this actually improves the speed, as this code should only be executed for regular expressions, and I wouldn't expect there to be that many regular expressions. How much did this increase the speed for you?
Well, I'd like to be able to give you a full mathematical proof, but I can't.
What I can tell you:
- with your code, I had a timeout after 60 seconds (the script only finished after I raised the timeout to 160, but I didn't record an exact duration)
- with my code, it finished before the 60-second timeout
- a while inside a while inside a while is pure evil ;p
- more seriously, in each iteration of the main loop you do one test that you then repeat in the inner loop (adding another test), and when you find a "\" you double the tests again: since this is effectively an O(n³) loop, that overhead can add up to a few seconds
If you really don't believe it, you can still measure it with a microtime() call before and after this loop, with your code and mine.
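To make that measurement concrete, here is a minimal sketch of timing a scan with microtime(). The input string and the single-pass loop below are illustrative only (they are not the project's actual compressor code); the point is how to bracket the suspect loop with two microtime(true) calls:

```php
<?php
// Placeholder input; the real script would use its actual JS source.
$js = str_repeat("var re = /a\\/b/; ", 10000);

$start = microtime(true);

// Single-pass scan: when a backslash is found, skip the escaped
// character in one step instead of re-testing it in nested loops.
$i = 0;
$len = strlen($js);
while ($i < $len) {
    if ($js[$i] === '\\') {
        $i += 2;   // consume the backslash and the escaped character
        continue;
    }
    $i++;
}

$elapsed = microtime(true) - $start;
printf("scan took %.4f s\n", $elapsed);
```

Run once with the old nested-loop version and once with the new one on the same input, and the two elapsed times settle the question directly.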
Last, I have a question: why do the compression on each page load? Don't you think it would be better to save the compressed version to a cache directory and refresh it only when the cache is missing or the source is newer? On my intranet server (not a very fast one, but still 1 GHz and 1 GB RAM), disabling compression gained a lot of speed.
If this seems OK to you, I can write this code; I have already done something similar.
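The check I have in mind could be sketched like this. All names here ($source, $cache, compress_js()) are hypothetical stand-ins, not the project's real API, and the one-line compress_js() is just a dummy so the sketch runs on its own:

```php
<?php
// Dummy stand-in for the real JS compressor.
function compress_js(string $code): string {
    return preg_replace('/\s+/', ' ', $code);
}

// Set up a throwaway source file so the sketch runs standalone.
@mkdir('cache');
file_put_contents('script.js', "function f ( x ) { return x ; }");

$source = 'script.js';
$cache  = 'cache/script.js.gz';

// Rebuild only when the cache is missing or the source is newer.
if (!is_file($cache) || filemtime($cache) < filemtime($source)) {
    file_put_contents($cache, gzencode(compress_js(file_get_contents($source))));
}

// Serve the cached, compressed version (decoded here just for display).
$out = gzdecode(file_get_contents($cache));
echo $out, "\n";
```

On every request after the first, the filemtime() comparison is the only work done, so the compressor never runs again until script.js changes.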
The compressed version is saved; it should only be generated once and then loaded from the cache (the http_cached_finish function saves the cache file, and at the beginning it checks whether the cache file can be used). If the JS is not cached, something must be wrong.
Any other opinion? If not, I'll change this as suggested in the next few days.
OK, I guess something is wrong then.
In my case disabling compression is OK, so that's not a real problem.
Fixed in e71b260a - feel free to open another bug report for the caching problem.