opening *really* big files

  • Ron Aaron
    Message 1 of 3, Dec 2, 2003
      I innocently tried to open a log file (from a debugging run) and vim was
      unable to handle it. The file is 1Gig in size, so perhaps I shouldn't
      expect vim to deal with it.

      But I would like some reasonable way of dealing with extremely large
      files in vim. Dragging my machine down while it counts all the lines in
      the file (and uses 900Meg++ of RAM on my XP box) is not a reasonable
      option.

      It doesn't seem that the 'maxmem' etc. settings are useful on Windows.
      I have 'maxmem=357378' and 'maxmemtot=357378', yet vim was taking every
      byte of RAM in the system, making it page and thrash horribly.

      I would like to see a 'bigfile' option, telling vim that if a file is
      bigger than 'bigfile' megabytes (defaulting, say, to 50% of available
      RAM), it should load it as a 'bigfile', perhaps with reduced
      functionality. It is a *very* common thing to want to view very large
      files without editing them, and currently vim does a poor job of
      dealing with VLFs.
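
      Vim has no such option today, but the threshold idea can be sketched as
      a shell wrapper around existing flags. The `bigvim` name and the 50 MB
      cutoff below are illustrative choices, not anything built in; the flags
      themselves are real (`-R` opens read-only, `-n` disables the swap file,
      and 'undolevels=-1' turns undo off):

```shell
#!/bin/sh
# bigvim: sketch of the proposed "bigfile" behaviour as a wrapper script.
# Hypothetical helper, not a built-in vim feature; the cutoff is arbitrary.
THRESHOLD=$((50 * 1024 * 1024))   # 50 MB

bigvim() {
    size=$(wc -c < "$1")          # file size in bytes
    if [ "$size" -gt "$THRESHOLD" ]; then
        # Big file: read-only, no swap file, no undo history.
        vim -R -n -c 'set undolevels=-1' "$1"
    else
        vim "$1"                  # normal editing session
    fi
}
```

      A real implementation would presumably also skip syntax highlighting
      and filetype plugins, which cost extra memory on large buffers.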
    • William Natter
      Message 2 of 3, Dec 2, 2003
        I would welcome a bigfile option. Let me propose some related ideas
        regarding big files (my colleagues and I very often open files in the
        300+ MB range, sometimes GB). As I am only a user and not a developer,
        you are welcome to ignore them, but I think they make some sense.

        If all I want to do is view a large file, "less" works fine for me
        (on a Solaris box, soon to become Linux), and I strongly encourage
        other people not to use vim in that case. I don't know what the
        equivalent is for Windows.
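
        On Unix, the interesting region of a huge file can also be carved out
        before any editor sees it; a small demonstration (the file names are
        placeholders, and `seq` merely stands in for a real log):

```shell
# "less" streams a file on demand instead of loading it, so viewing is cheap.
# When a slice must actually be edited, extract it first:
seq 1 100000 > big.log                   # stand-in for a huge log file

head -n 1000 big.log > first.log         # first 1000 lines
tail -n 1000 big.log > last.log          # last 1000 lines
sed -n '500,502p' big.log > slice.log    # only lines 500-502 reach the editor

cat slice.log                            # prints lines 500, 501 and 502
rm -f big.log first.log last.log slice.log
```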

        If I want to edit a big file, then I would love vim to:
        - not use a swap file (sometimes, I am limited in disk space)
        - not try to remember changes too much (limited undo, to avoid RAM
        explosion)
        - not load the whole file in memory (which is probably what you meant, Ron)

        A file size threshold would be a handy way to let vimmers handle such
        issues.
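
        Until such a threshold exists, the whole-file-in-memory problem can
        also be dodged by editing a big file in pieces with standard tools
        (a sketch; the chunk size is arbitrary):

```shell
# Split a huge file into chunks small enough for vim, then reassemble.
seq 1 10000 > big.log              # stand-in for a huge file
split -l 2000 big.log part_        # 2000-line chunks: part_aa, part_ab, ...

# ...edit the individual part_* files; each is a fraction of the size...

cat part_* > big.log               # glob order (aa, ab, ...) restores the file
rm -f big.log part_*               # demo cleanup
```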

      • hotmail
        Message 3 of 3, Dec 3, 2003
          Ron,
          I am surprised, because I remember opening extremely big files with
          vim (log files, too) when all else failed.
          It was actually on AIX, but it saved me, as the built-in editors
          didn't work.
          On the other hand, I have noticed that vim doesn't like files with
          very long lines.
          Regards,
          Benoit
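
          Very long lines can be hard-wrapped before the file is opened, at
          the cost of altering its contents (`fold` is a standard Unix tool;
          the width here is arbitrary):

```shell
# Hard-wrap long lines so vim handles the file comfortably.
printf 'aaaaaaaaaabbbbbbbbbb\n' > longline.txt   # stand-in: one 20-char line
fold -w 10 longline.txt > wrapped.txt            # wrap at 10 columns
cat wrapped.txt                                  # prints two 10-char lines
rm -f longline.txt wrapped.txt
```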
          ----- Original Message -----
          From: "Ron Aaron" <ron@...>
          To: <vim-dev@...>
          Sent: Wednesday, December 03, 2003 3:33 AM
          Subject: opening *really* big files

