> This time I have a question about Metabase XML schemas. I've built one
> setting up my open source application. It contains both the schema and
> the initialization for the DB tables, but the problem is that it grew to
> over 6000 lines, and even though it is "only" 360 KB, the Metabase
> schema parser wants 18 MB of memory to install it.
Yes ... I also had an MDB user with a 7 MB dump. He wanted to move the
data from one RDBMS to another, IIRC ...
It took a couple hundred MB of RAM and quite some time to read that
file. He therefore reimplemented the parser using the PEAR XML_Parser
class and made it feasible to parse such large XML files. We took great
care to ensure that the new parser behaves like the old one, but of
course the switch could still be a source of trouble.
I think you will probably just have to modify the
ParseDatabaseDefinitionFile() method in the Metabase manager to work
like parseDatabaseDefinitionFile() in the MDB manager.
And of course you will need the XML_Parser package.
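The memory difference comes down to building the whole document tree in
memory versus handling the file event by event, which is the approach an
expat-based parser like PEAR XML_Parser takes. As a rough illustration
only (in Python rather than PHP, with made-up tag names, not Metabase's
actual schema format or code), an event-driven parser never holds more
than the current element:

```python
import xml.parsers.expat

# Count <table> elements in a schema without building a tree in memory.
# Tag names here are hypothetical, not Metabase's real schema format.
schema = b"""<database>
  <name>example</name>
  <table><name>users</name></table>
  <table><name>groups</name></table>
</database>"""

table_count = 0

def start_element(name, attrs):
    # Called once per opening tag; only this event is held in memory.
    global table_count
    if name == "table":
        table_count += 1

parser = xml.parsers.expat.ParserCreate()
parser.StartElementHandler = start_element
parser.Parse(schema, True)

print(table_count)  # 2
```

With a handler like this, memory use stays roughly constant no matter
how large the input file grows, which is why the XML_Parser-based
rewrite made multi-megabyte dumps feasible.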
> I'm wondering if it is possible to take the initialization for a
> table out of this file and keep it in a separate file.
I think it is even described in the documentation.
An important tag to use there is the <create> tag, which should then be
set to '0' so the second file only loads the data without recreating the
tables.
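As a sketch of what that second file might look like (the element names
below are reconstructed from memory and may not match the Metabase
schema format exactly, so check them against the documentation):

```xml
<?xml version="1.0"?>
<database>
 <name>example</name>
 <!-- create set to '0': do not (re)create the tables, only run the
      initialization data below -->
 <create>0</create>
 <table>
  <name>users</name>
  <initialization>
   <insert>
    <field><name>id</name><value>1</value></field>
   </insert>
  </initialization>
 </table>
</database>
```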