
[Wireshark-dev] Limiting amount of memory used to analyze TCP (HTTP) traffic. Qu

From: Vitaly Repin <vitaly.repin@xxxxxxxxx>
Date: Thu, 25 Feb 2016 23:53:36 +0200

I am trying to understand how the Wireshark TCP dissector utilizes memory.

If I analyze a huge HTTP session (e.g., a download of an ISO image over
HTTP), Wireshark starts to consume a lot of memory.

In our setup, Wireshark runs as an online traffic analyzer, which makes
it important to control the amount of memory it consumes. We need the
HTTP payload and tracking of the connection (request/response
relations).

My first idea was to limit the number of fragments collected per PDU.

I added the following condition to packet-tcp.c (in desegment_tcp(),
just before a new fragment is added for desegmentation), simply to test
the behavior:

if (msp->nxtpdu < 64*1024) {
    ipfd_head = fragment_add(&tcp_reassembly_table, tvb, offset,
                             pinfo, msp->first_frame, NULL,
                             seq - msp->seq, len,
                             (LT_SEQ (nxtseq, msp->nxtpdu)));
} else {
    ipfd_head = NULL;
}

I see that if the content size is more than 64 KiB (nxtpdu is set by
the HTTP dissector, which parses the HTTP Content-Length header), no
data is passed to the PDU desegmentation routine. This means that
tcp_reassembly_table.fragment_table does not grow and memory
consumption is reduced. (This works only if the huge response is sent
at the beginning of the session, since the check compares an absolute
sequence number rather than the PDU's size.)

But unfortunately, memory consumption is still very significant.

Then I took a look at tcpd->acked_table. According to the comment, it
"contains a tree containing all the various ta's keyed by frame
number".
I see that this table grows monotonically during the analysis. Is this
expected behavior?

Any ideas how I can decrease memory consumption (even at the price of
not being able to analyze the whole TCP session if it contains a huge
amount of data)?
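One knob I am aware of that matches this tradeoff: the desegmentation preferences of the TCP and HTTP dissectors can be switched off from the command line, which prevents multi-segment reassembly entirely (bodies spanning segments are then not reassembled, but the fragment tables stay small). A configuration sketch, assuming tshark on an interface named eth0:

```shell
# Disable TCP stream desegmentation and HTTP body desegmentation,
# so the dissectors do not accumulate multi-segment fragment data.
tshark -i eth0 \
  -o tcp.desegment_tcp_streams:FALSE \
  -o http.desegment_body:FALSE \
  -Y http
```

This loses the full-payload view, though, so it does not by itself cover our requirement of extracting the HTTP payload.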

Thanks in advance!

WBR & WBW, Vitaly