This is a static dump of issues in the old "Flyspray" bug tracker for DokuWiki. Bugs and feature requests
are now tracked in the issue tracker on GitHub.
This task was never closed in our old bug tracker.
Feel free to open a new task on GitHub if you feel this is still relevant.
FS#2004 Large tables required too much memory to parse.
Large tables required too much memory to parse. I've had to increase the memory_limit in my PHP configuration a few times as certain tables grow in size. Currently it takes more than 34 megabytes to process a 41k wiki page that's mostly a table.
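For anyone working around this the same way, here is a minimal sketch of the two usual places to raise the limit. The 128M value is just an example, not a recommendation, and this assumes ini_set() is allowed to change memory_limit at runtime on your host:

  <?php
  // Raise the per-request memory ceiling before DokuWiki starts parsing.
  // 128M is an arbitrary example value; size it to your largest table pages.
  ini_set('memory_limit', '128M');

  // The equivalent global setting in php.ini would be:
  //   memory_limit = 128M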
I can confirm that the recommended 32M memory limit is not enough to parse a table in the range of 450 x 12 cells (roughly 40k page size).
I can confirm this still exists with release 2011_05_25a. I can't confirm it yet, but it looks like it might take even more memory now. I'm currently tracking down a bug where the indexer fails on pages with large tables due to memory constraints. The indexer dies on a 400kb file consisting of two headers and a table with 4 columns and 11757 rows (hitting its 128MB cap; I'm testing at 256MB). My suspicion that tables now take more memory comes from the fact that the indexer didn't die on this file before. This might be due to a change in the indexer, though.
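For reference, a quick sketch for reproducing a page of that shape. The filename is just an example (place the generated file in your data/pages directory yourself), and the row/column counts are taken from the case described above:

  <?php
  // Generate a DokuWiki page roughly matching the failing case:
  // two headers followed by a 4-column table with 11757 rows (~400-500kb).
  // 'bigtable.txt' is an example path; move it into data/pages/ to index it.
  $out = fopen('bigtable.txt', 'w');
  fwrite($out, "====== Big table ======\n");
  fwrite($out, "===== Data =====\n");
  fwrite($out, "^ col1 ^ col2 ^ col3 ^ col4 ^\n");
  for ($i = 1; $i <= 11757; $i++) {
      fwrite($out, "| r{$i}c1 | r{$i}c2 | r{$i}c3 | r{$i}c4 |\n");
  }
  fclose($out);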
I'm not in favor of such large tables, but they're created by some of the teams here regardless of the fact that they don't actually render. I can't enforce a file size restriction because files are written to an NFS share.
Are specific improvements possible here? Which ones?
In five years no other complaints have been added here. So for most people this use case is not a real bottleneck?
I would say it's a known issue, but I have no idea whether it's possible to fix.