
Ethereal-users: Re: [Ethereal-users] Tethereal analyzing a large capture file crashes...

Note: This archive is from the project's previous web site, ethereal.com. This list is no longer active.

From: "Joel R. Helgeson" <joel@xxxxxxxxxxxx>
Date: Mon, 7 Mar 2005 16:54:50 -0600
The exact error I get is:
GLib-ERROR **: gmem.c:140: failed to allocate 16384 bytes
aborting...
 
Ideas? Is it just not possible to analyze a 40 GB capture file in one piece?
 
Joel R. Helgeson
 
"Give a man fire, and he'll be warm for a day; set a man on fire, and he'll be warm for the rest of his life."
----- Original Message -----
Sent: Monday, March 07, 2005 3:44 PM
Subject: [Ethereal-users] Tethereal analyzing a large capture file crashes...

I have a 24-hour packet capture that I took on a customer's network. I captured only the 68-byte headers, and the file size still came out to a mere 40 GB.
 
I'd now like to analyze the file with tethereal and give them a graph showing kbps at 5-minute intervals:
tethereal -r capture.cap -z io,stat,300 -q >io_stat_results.txt
 
Running this, or any similar analysis, just causes my machine to process the file for a while and then bomb out. I'm running the analysis on a Windows XP SP2 machine with 2 GB of RAM and 400 GB of disk space; the processor is a 3.0 GHz Intel. What seems to happen is that tethereal uses up RAM and swap, up to a total of 2 GB, and then crashes, stating that it failed to allocate xxxx bytes of memory...
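A possible workaround, sketched under a couple of assumptions rather than confirmed by the thread: split the capture into fixed-size chunks with editcap and run the statistics pass on each chunk, so tethereal never has to hold per-packet state for the whole 40 GB at once. This assumes the editcap shipped alongside tethereal supports splitting by packet count (-c), and a Unix-style shell (e.g. Cygwin on the XP box); chunk file naming varies by editcap version, hence the loose glob:

    # Split capture.cap into 1,000,000-packet pieces
    # (named chunk_00000... or similar, depending on version).
    editcap -c 1000000 capture.cap chunk.cap

    # Run the 300-second I/O statistics on each piece in turn,
    # appending all the tables to one results file.
    for f in chunk*; do
        tethereal -r "$f" -z io,stat,300 -q >> io_stat_results.txt
    done

One caveat: -z io,stat,300 buckets its intervals relative to the first packet of each file it reads, so the per-chunk tables would need realigning against absolute timestamps before being merged into a single 24-hour graph.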
 
Can anyone help me? Is there any way I can run reports on a file this huge without tethereal bombing on me?
 
Any ideas?
 
Joel
 

