Re: [xenu-usergroup] Xenu running out of memory
Dec 22, 2013

Hi Tom,
Error 80 is "file exists". That doesn't really make sense. Please go to your %temp% (C:\Users\Ento\AppData\Local\Temp\) directory, clean up everything (is there a lot?), and maybe reboot your PC...
All I found on Google is this, which is similar to what I suggest:
This one claims (at the end of the page) that GetTempFileName is unreliable.
If this still doesn't work (and you're the first person ever to hit this), I will try locking, or retries.
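Retrying is the standard workaround here: GetTempFileName derives names from a short prefix plus a hexadecimal counter, so a very crowded temp directory can exhaust the unique names (or race with other processes) and fail with error 80 (ERROR_FILE_EXISTS). A minimal sketch of the retry idea in Python; this is not Xenu's actual code, and the prefix and retry count are arbitrary:

```python
import errno
import os
import tempfile

def create_temp_file_with_retry(tmp_dir, prefix="xen", retries=100):
    """Try random unique names until creation succeeds.

    O_EXCL makes the create atomic: a name collision simply raises
    FileExistsError, and we try again with a fresh random name.
    """
    for _ in range(retries):
        name = os.path.join(tmp_dir, prefix + os.urandom(4).hex() + ".tmp")
        try:
            fd = os.open(name, os.O_CREAT | os.O_EXCL | os.O_RDWR)
            return fd, name
        except FileExistsError:
            continue  # another file already has this name: retry
    raise OSError(errno.EEXIST, "no unique temp file name found")

fd, path = create_temp_file_with_retry(tempfile.gettempdir())
```

Random names make collisions vanishingly rare even with thousands of leftover temp files, which is why cleaning %temp% usually also fixes the problem.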
On 22.12.2013 11:14, Tom Wenseleers wrote:
After a few more tries I got it to run for a while now, but after 1 million links or so I now get the error message
“Fatal error: GetTempFileName() failed, szTmpDir=’C:\Users\Ento\AppData\Local\Temp\’, GetLastError()=80”
Please try again... I did have a crash too when testing (I hadn't touched the 64-bit version for two years), and then the next time nothing happened. No idea why.
On 20.12.2013 17:52, Tom Wenseleers wrote:
Many thanks for this! That’s great! When I run it the program crashes though after checking the first link in the site. The reported error is
Problem Event Name: APPCRASH
Application Name: Xenu.exe
Application Version: 184.108.40.206
Application Timestamp: 52b3da1c
Fault Module Name: Xenu.exe
Fault Module Version: 220.127.116.11
Fault Module Timestamp: 52b3da1c
Exception Code: c000041d
Exception Offset: 000000000002ec35
OS Version: 6.1.7600.2.0.0.256.48
Locale ID: 2057
Additional Information 1: 8d31
Additional Information 2: 8d314098954c7e7d286a2b7ff688fe99
Additional Information 3: aae1
Additional Information 4: aae1f4c2b8d313ff58125b2c9ddd56ae
(I am using Windows 7 64 bit)
There is no such option; my question was whether such a feature would help. Anyway, here it is:
There are now two files in the ZIP, because there's a DLL for gzip compression, which speeds up downloading. Although not with that site: I looked at it, and it doesn't use gzip transport.
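For context on why a gzip DLL speeds up crawling: when a server supports gzip transport (Content-Encoding: gzip), the highly repetitive HTML of link-heavy pages compresses dramatically, so each page arrives in far fewer bytes. A small illustration in Python; the sample page below is made up, not taken from the site in this thread:

```python
import gzip

# A placeholder page with many repeated links, as a crawler would see
# on a large, template-driven site.
html = b"<html><body>" + b'<a href="/page">link</a>' * 1000 + b"</body></html>"

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)  # well under 1.0 for repetitive HTML
```

If the server doesn't advertise gzip support, the client just receives the full uncompressed bytes, which is why the DLL made no difference for that particular site.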
On 19.12.2013 13:43, Tom Wenseleers wrote:
Yes I did in fact uncheck all those extra options, and that gets me a little bit further before it crashes, but not much.
Is there any other way perhaps in which Xenu could be made more memory efficient and successful at indexing large sites?
Trying with HTTrack now, to download the whole thing…
Does this mean you could
- go without titles, description, keywords
- go without external links
On 18.12.2013 14:10, Tom Wenseleers wrote:
Further to my previous message: I have now tried this site again using the Xenu 64-bit beta, but after downloading about 3 million links it consistently crashes, even though I tried on several PCs, including one with 32 GB of internal memory. Do you have any thoughts on how I could get around this, or is there perhaps another good free crawler that could collect all the links without crashing on such big sites?
As I mentioned I don’t really need the sitemap, just the links of all the pages on the site…
See the FAQ...
http://home.snafu.de/tilman/xenulink.html Nr. 9
However: a sitemap with a million URLs doesn't make sense. Such a web page would be huge. (Or do you mean a Google sitemap?)
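If a Google-style sitemap is what's meant: the sitemaps.org protocol caps a single sitemap file at 50,000 URLs (and 50 MB uncompressed), so a million-URL site needs several sitemap files plus a sitemap index. A minimal sketch of the splitting step in Python; the example.com URLs are placeholders, not from this thread:

```python
# Split a URL list into sitemap files of at most 50,000 URLs each,
# per the sitemaps.org protocol limit.
MAX_URLS = 50_000

def chunk_urls(urls, size=MAX_URLS):
    """Return the URL list split into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_xml(urls):
    """Render one chunk as a sitemap file (URLs assumed XML-safe here)."""
    entries = "".join(f"<url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{entries}</urlset>")

urls = [f"https://example.com/page{i}" for i in range(120_000)]
chunks = chunk_urls(urls)  # 120,000 placeholder URLs fit in 3 files
```

Each chunk becomes its own file, and a small sitemap index file then lists the locations of those files.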
On 05.12.2013 11:01, Tom Wenseleers wrote:
I was trying to use Xenu to make a sitemap of this site:
I considered links starting with
as site links.
After collecting about 1.7 million links (25% progress), though, it fails on my machine, either giving me an out-of-memory error (I have 6 GB of internal memory and am using a 64-bit Windows 8 OS) or crashing outright.
I was wondering if there is anything I could do about this? E.g., does there happen to be a 64-bit version of Xenu that can use all available memory?
(I have some other machines with 32 GB of internal memory, but I don’t think using those will help if Xenu can’t use all this memory.)