A number of 'special' files, including 'robots.txt', 'sitemap.xml', 'sitemap.xml.gz' etc., should always produce a 404 response if they don't exist under the DokuWiki base directory, regardless of namespace.
This behaviour should be independent of whether $conf['send404'] is enabled (it is disabled by default).
Perhaps a configuration file, or an array listing those 'special' files, is needed.
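A minimal sketch of the requested check, in Python for illustration only (DokuWiki itself is PHP, and the list and function names here are hypothetical, not part of any existing API):

```python
import os

# Hypothetical list of 'special' files that should 404 when absent,
# regardless of the send404 setting; names taken from the request above.
SPECIAL_FILES = {"robots.txt", "sitemap.xml", "sitemap.xml.gz"}

def should_send_404(request_path: str, basedir: str) -> bool:
    """Return True if the request targets a 'special' file that does
    not exist under the wiki base directory."""
    name = os.path.basename(request_path)
    if name not in SPECIAL_FILES:
        return False
    # Special file requested but missing on disk: send a 404.
    return not os.path.exists(os.path.join(basedir, name))
```

Keeping the list in a set (or a config array in PHP) makes it easy for administrators to extend without touching the dispatch logic.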
This mostly affects search engine spiders, especially Google sitemaps when the sitemap lives at a URL other than the top level.