
Ethereal-users: Re: [Ethereal-users] Tethereal analyzing a large capture file crashes...

Note: This archive is from the project's previous web site, ethereal.com. This list is no longer active.

From: Lars Roland <lars.roland@xxxxxxx>
Date: Mon, 07 Mar 2005 23:04:46 +0100
Joel R. Helgeson wrote:
I have a 24-hour packet capture that I did at a customer's network. I only captured the 68-byte headers and the file size came out to a mere 40 gigs. I'd now like to do analysis of the file using tethereal and give them a graph showing kbps at 5-minute intervals:
tethereal -r capture.cap -z io,stat,300 -q >io_stat_results.txt
That command, or any other similar analysis, just causes my machine to process the file for a while and then bomb out. I'm running the analysis on a Windows XP SP2 machine with 2 GB of RAM and 400 GB of disk space. The processor is an Intel 3.0 GHz. What seems to happen is that tethereal uses up the RAM and swap, up to a total of 2 GB, then it crashes and states that it failed to allocate xxxx bytes of memory... Can anyone help me? Is there any way I can run reports on a file this huge without tethereal bombing on me?

40 gigs!! Hmm. As you noticed, on Windows you will certainly run out of memory unless, I guess, you have at least 8 gigs of RAM. Try splitting the file into smaller pieces with splitcap and analyzing those. 500 MB should be OK for your machine for sure, perhaps even 1 GB or more. Or give Linux a try. :)
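As a minimal sketch of that approach (assuming editcap, the splitting tool that ships with Ethereal, is the one meant here, and using a Unix-style shell and made-up file names purely for illustration), the split-and-analyze loop could look something like:

# Split the 40 GB capture into pieces of 1,000,000 packets each;
# editcap inserts a counter (and, depending on the version, a timestamp)
# into the output names, so the pieces should match chunk*.cap below.
editcap -c 1000000 capture.cap chunk.cap

# Run the 5-minute IO statistics on each piece separately and
# collect the output into one report file.
for f in chunk*.cap; do
    tethereal -r "$f" -z io,stat,300 -q >> io_stat_results.txt
done

Each piece then stays small enough to process in the available RAM; the per-piece interval tables still have to be stitched together afterwards, since each run's intervals are relative to that piece's first packet.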

I'd like to hear from you about how big a file can successfully be analyzed on Windows with 2 GB of RAM using tethereal.

Regards,
Lars