Subject: Re: [Wireshark-dev] Collection of captures for each supported dissector?

From: Peter Wu <peter@xxxxxxxxxxxxx>
Date: Mon, 30 Jun 2014 15:05:57 +0200
On Monday 30 June 2014 07:12:56 Evan Huus wrote:
> The "menagerie" is our collection of capture files that the fuzz-bot uses to
> test with. It contains a substantial number of files across as many
> protocols as we have been able to accumulate. However, I am not sure it is
> entirely publicly accessible?

I have seen the menagerie mentioned in bug reports, but could never find it 
publicly.

> Additionally, it is not indexed. There is a script somewhere to use tshark
> to extract the protocols contained in each capture and build a list, but it
> only works for protocols which are dissectible by default (no "decode as",
> decryption, or other special settings usually).
> 
> One of the ideas floated at sharkfest this year was the possibility of a
> proper interface to the menagerie, but I don't think anything really came
> of it. What protocol are you interested in right now?

There is no particular protocol I am interested in; it was an idea to improve 
regression testing. Right now I am looking at all dissectors below TCP (or on 
top of it, depending on how you look at it).
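
For what it's worth, a rough sketch of what such a tshark-based script could 
look like (untested; assumes tshark is on the PATH, that the capture dissects 
with default preferences, and the file name is only an example):

  tshark -r some-capture.pcap -T fields -e frame.protocols | tr ':' '\n' | sort -u

That prints each protocol layer occurring in the capture once per line, which 
could then be collected into a per-file list.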


By the way, could I get permission to delete attachments on the SampleCaptures 
page on the wiki? There are a bunch of duplicates (and even some empty files) 
listed as attachments but not linked. Some are not even capture files, although
their extensions suggest so.

Empty files:
mount-de.pcap.gz
omron-test-csum.pcap
wireshark.org.pcap.gz

Not pcap (but tcpdump text output or even a media file):
packetout.pcap
RTSP.pcap

Duplicates can be found with:
md5sum * | sort | uniq -w32 -D | while read sum file; do
    echo $sum $(date +"%Y-%m-%d %H:%M" -r "$file") "$(du -hD "$file")"
done

Are there known efforts to index the files? I don't think that the wiki is a 
sustainable way to collect them.
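
If nothing exists yet, even something naive along these lines could be a start 
(untested sketch; the file patterns and the output name capture-index.tsv are 
only examples):

  # one tab-separated line per capture: file name, then the protocols tshark
  # sees in it (default preferences only)
  for f in *.pcap *.pcapng *.pcap.gz; do
      [ -e "$f" ] || continue
      protos=$(tshark -r "$f" -T fields -e frame.protocols 2>/dev/null | tr ':' '\n' | sort -u | tr '\n' ' ')
      printf '%s\t%s\n' "$f" "$protos"
  done > capture-index.tsv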

Kind regards,
Peter
https://lekensteyn.nl