Re: [Wireshark-dev] memory allocation assertion failure reading 219MB file with

From: "Ravi Kondamuru" <ravikondamuru@xxxxxxxxx>
Date: Thu, 24 Aug 2006 22:47:09 -0700

Along with your proposed per-dissector memory limit, Wireshark will still need another, overall limit [I listed 3 possibilities in my earlier email]. The use case I am referring to is one where each dissector allocates memory well below its defined limit, but the number of packets in the file is large enough to slow the system down and make Wireshark unresponsive until it aborts [the behavior of Wireshark on Windows].

We do a lot of trace analysis on big files like the one I was testing with. Being able to view a large file in Wireshark quickly is what I am trying to get to. If that means processing only part of the file at a time, with an option to process subsequent parts on demand, that is absolutely fine.
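To make the idea concrete, here is a rough sketch of the kind of chunked loading loop I have in mind (the read_next_packet() and memory_in_use() helpers and the limit constants are placeholders for illustration, not existing Wireshark APIs; capture_file is Wireshark's capture state structure):

    #include <time.h>

    #define MAX_PACKETS 1000000        /* example packet-count limit */
    #define MAX_MEMORY  (1UL << 30)    /* example memory limit: 1 GB */
    #define MAX_SECONDS 30             /* example load-time limit */

    /* Hypothetical chunk loader: read packets until any one of the
     * three limits is hit, then return so the GUI stays responsive;
     * the user can ask for the next chunk (or the rest) later. */
    int
    load_next_chunk(capture_file *cf)
    {
        time_t start = time(NULL);

        while (read_next_packet(cf)) {               /* placeholder */
            if (cf->count >= MAX_PACKETS ||
                memory_in_use() >= MAX_MEMORY ||     /* placeholder */
                (time(NULL) - start) >= MAX_SECONDS)
                return 1;   /* stopped early; more of the file left */
        }
        return 0;           /* reached end of file */
    }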

thanks,
Ravi.

On 8/24/06, ronnie sahlberg <ronniesahlberg@xxxxxxxxx> wrote:
I think a significant part of the memory used is memory allocated by Wireshark itself, for things like reassembly and state management of protocols.

Since most se allocation is done from within a dissector, once more/most of it is converted to use the emem allocators we could maybe do something like this:

1. Set an upper limit on how much memory the se allocator is allowed to allocate.
2. When se_alloc() is called and we have reached this limit, throw a new exception, MemError, to abort dissection of the current packet but allow Wireshark to continue.

This might actually work.
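A minimal sketch of what 1 + 2 could look like (SE_ALLOC_LIMIT, the running byte counter, the MemError exception, and the se_alloc_capped() wrapper are all hypothetical; THROW() and se_alloc() are the existing exception macro and se allocator):

    #include <stddef.h>   /* plus epan/emem.h and epan/exceptions.h */

    #define SE_ALLOC_LIMIT (512 * 1024 * 1024)   /* example cap: 512 MB */

    static size_t se_total_allocated = 0;   /* reset on file close */

    /* Hypothetical capped wrapper around the se allocator: throw a
     * new MemError exception once the cap is exceeded, so dissection
     * of the current packet is aborted but Wireshark keeps running
     * (the exception would be caught around the dissect call). */
    void *
    se_alloc_capped(size_t size)
    {
        if (se_total_allocated + size > SE_ALLOC_LIMIT)
            THROW(MemError);
        se_total_allocated += size;
        return se_alloc(size);
    }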

On 8/25/06, Jeff Morriss <jeff.morriss@xxxxxxxxxxx> wrote:


Ravi Kondamuru wrote:
>
> Thanks for the wiki link.
>
> In the workarounds highlighted, I feel that point 3 (split the capture
> file into several smaller ones) can be made more appealing by
> programmatically limiting the amount of data (packets, memory consumed,
> load time) Wireshark has already read/used.
>
> Wireshark already does something similar when a large file is selected
> in the "Select a capture file" dialog box. After 3 seconds
> (prefs: file_open_preview_timeout) of reading the file, the preview
> stops reading further and displays "more than xyz packets (preview
> timeout)".
>
> My point being, can the same approach be taken with large files during
> the actual display?
>
> One option would let the user make Wireshark parse the subsequent or
> previous packets until the timeout happens again. Another option would
> let users make Wireshark read the complete file before displaying it.
> How much to read at a time could be determined, as mentioned earlier,
> by one of 1) the number of packets read, 2) the memory consumed so far,
> or 3) the amount of time spent reading.
>
> Please reply if you think of any issues that might make this approach
> not worth pursuing.

I think the problem with this approach is that it's difficult to know
[at least in a cross-platform manner that works on all the platforms
Wireshark runs on] when you're going to run out of memory until you
actually have run out of memory (and malloc() fails).  As mentioned in
the Wiki, Wireshark and (more importantly as it's a bigger job to
change) some of the libraries Wireshark uses simply call abort() when
malloc() fails.

-J

> On 8/22/06, *Jeff Morriss* <jeff.morriss@xxxxxxxxxxx> wrote:
>
>
>
>     Guy Harris wrote:
>      > Ravi Kondamuru wrote:
>      >
>      >> My question:
>      >> Is there a known limit on the number of packets that wireshark
>     can deal
>      >> with in a single file?
>      >
>      > The number of packets that Wireshark (or, I suspect, any network
>      > analyzer) can deal with is limited; due to a number of factors,
>     the GUI
>      > widget used to implement the packet list display being one of
>     them (it
>      > allocates a string for the text value in every column, which eats
>     a lot
>      > of memory), Wireshark's limit might be lower than some other
>     analyzers.
>      >
>      > This is not a limit saying something such as "Wireshark can't
>     read more
>      > than 1,227,399 packets"; the point at which it'd run out of memory
>      > depends on the contents of the packets.
>
>     See this page for more info:
>
>     http://wiki.wireshark.org/KnownBugs/OutOfMemory
>


_______________________________________________
Wireshark-dev mailing list
Wireshark-dev@xxxxxxxxxxxxx
http://www.wireshark.org/mailman/listinfo/wireshark-dev