
Re: [XSL-FO] FOP: Will not process large file

  • jasmine2501@netzero.com
    Nov 2, 2005
      Yes, thanks. We are going to try generating it in smaller chunks and putting it together using ActivePDF. My JVM is apparently running out of memory, but I cannot get it to stop doing that no matter what I set the memory to. I'm using a command like this:

      java -mx150m -jar fop.jar -fo "C:\770470789E.fo" -pdf "c:\770470789E.pdf"

      Is this the correct way to do it, or am I missing something? This should set the JVM to use 150MB, but it seems like it's blowing up way before it gets to 150MB. My machine has 256MB of on-board memory, but when the process is running, it doesn't come anywhere close to using that before it blows up with the OutOfMemoryError:

      C:\Program Files\fop-0.20.5\lib>java -mx250m -jar fop.jar -fo "C:\770470789E.fo" -pdf "c:\770470789E.pdf"
      [INFO] Using org.apache.xerces.parsers.SAXParser as SAX2 Parser
      [INFO] FOP 0.20.5
      [INFO] Using org.apache.xerces.parsers.SAXParser as SAX2 Parser
      [INFO] building formatting object tree
      [INFO] setting up fonts
      Exception in thread "main" java.lang.OutOfMemoryError
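
      One quick way to confirm that the -mx flag is actually taking effect is a
      tiny diagnostic like the one below (just an illustration, not part of FOP;
      -mx is an old shorthand, and -Xmx150m is the more usual spelling):

      public class MaxHeap {
          public static void main(String[] args) {
              // Prints the maximum heap the JVM is willing to grow to, in MB.
              // With -mx150m (or -Xmx150m) this should report roughly 150 MB.
              long maxBytes = Runtime.getRuntime().maxMemory();
              System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
          }
      }

      If that number matches what you pass on the command line, the flag is fine,
      and it may simply be that FOP needs more heap for this file than the machine
      can provide.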

      -----------------------------------------------
      Jasmine wrote:
      > I have a really large file I want to process with FOP. The XSL-FO file
      > is 21MB. It is as small as we can make it. We have considered breaking
      > it up into smaller files, but for our client, this would kind of
      > defeat the purpose of generating the PDF in the first place. They can
      > look at smaller chunks on the web site, but the PDF report is supposed
      > to be a complete listing. We only have one client out of 8,500 that
      > has this problem, but they are the biggest and most valuable client,
      > and we would like to make this work for them. Am I running up against
      > some kind of limit for FOP? I don't think I'm exceeding the limit for
      > PDF, because I know I've seen PDF files much larger than this. This PDF,
      > if processed correctly, would result in 547 pages. I cannot send
      > sample code because it contains confidential healthcare information.

      You could always process it in chunks to produce separate PDFs and then
      concatenate them using either Acrobat or any number of PDF processing
      libraries out there. It shouldn't be hard to do at all.
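
      For the concatenation step, here is a minimal sketch using Apache PDFBox's
      PDFMergerUtility (PDFBox 2.x)--just one of the many libraries that could do
      this, and not one mentioned in this thread; the file names are placeholders:

      import java.io.File;
      import java.io.IOException;
      import org.apache.pdfbox.io.MemoryUsageSetting;
      import org.apache.pdfbox.multipdf.PDFMergerUtility;

      public class MergeChunks {
          public static void main(String[] args) throws IOException {
              PDFMergerUtility merger = new PDFMergerUtility();
              // Add the per-chunk PDFs in the order they should appear.
              merger.addSource(new File("report-part1.pdf"));
              merger.addSource(new File("report-part2.pdf"));
              merger.setDestinationFileName("report-complete.pdf");
              // Buffer through a temp file so the merge itself stays light on heap.
              merger.mergeDocuments(MemoryUsageSetting.setupTempFileOnly());
          }
      }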

      The other thing to check is that you are setting the memory size for the
      JVM to the highest setting your machine can accommodate--you might just
      be hitting a Java memory limit.

      Cheers,

      Eliot