
Wireshark-users: Re: [Wireshark-users] Filtering a very large capture file

From: Jeff Morriss <jeff.morriss@xxxxxxxxxxx>
Date: Fri, 26 Jan 2007 12:58:25 +0800


Stuart MacDonald wrote:
I have a very large capture file from tcpdump, 16 GB. Wireshark
crashes trying to open it, a known issue.

For some of my investigation I used editcap and split it into smaller
captures, and that worked okay, but there were 1000 of them and each
is still slow to load/filter/etc; the size ranges from 14 to 28 MB.

I need to locate a small handful of packets within the large capture;
there's some infrequent traffic I'm interested in. It's not feasible
for me to open, apply display filter, close, each of the 1000 smaller
files. I estimate it would take about 20 hours to do a brute force
search.

I've read the man pages on the tools that come with Wireshark. I was
hoping to find a tool that opens a capture, applies a filter and
outputs matching packets to a new file. Here's a sample run of the
hypothetical filtercap tool:
# filtercap -r very-large.eth -w only-infrequent.eth -f "tcp.port==50000"

What about:

- split the files into 1000 smaller files
- use a (decent) shell with tshark to process those files

The latter could be achieved in a Korn-style shell with something like:

(for f in *.eth
do
    tshark -r "$f" -w - -R "tcp.port==50000"
done) > only-infrequent.eth

That would work on Unix, though I'm not sure about Windoze (IIRC there have been issues in the past with reading/writing stdin/stdout on that OS, though maybe they're all fixed).
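If writing to stdout turns out to be a problem, a variant that avoids it (a sketch, assuming tshark and mergecap are on the PATH and there's disk space for the intermediate files; the "filtered-" prefix is just an illustrative name) is to write one filtered capture per input file and merge them afterwards with mergecap, which ships with Wireshark:

```shell
# Filter each split capture into its own output file...
for f in *.eth
do
    tshark -r "$f" -w "filtered-$f" -R "tcp.port==50000"
done
# ...then merge all the filtered files into one capture.
mergecap -w only-infrequent.eth filtered-*.eth
```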