When processes written in Go work with unstructured data, they tend to consume a lot more memory than expected. This blog gives insight into how we at Druva overcame this problem and leveraged Go in a memory-intensive application.

The backup process works on GBs of data per minute. It reads hundreds of files in parallel, does some processing such as compression, and sends the data over the network. This is a typical use case where the process requires frequent memory allocation. To read these files, it allocates and processes buffers. However, buffers that were no longer in use were not being freed (or at least that's what it looked like at the start). The end result was a backup process that consumed huge amounts of memory.

How did we identify the root cause of the issue?

At first, it looked like a memory leak. We scanned our code multiple times to see if we were holding on to any object references. With Go being a garbage-collected language, that possibility was very minimal. We tried forcing frequent GC by setting different config values - it did not help.

We used pprof to do memory profiling and find out where the maximum memory allocation happens. As expected, it was happening at the place where we were allocating buffers before reading data into them. But then we noticed a surprising result in our memory profiling. We took memory profiles at frequent intervals, and specifically at times when we were seeing high memory usage. The profile showed huge allocated memory but very low in-use memory. What does this mean? It means the GC had released the memory; if it had not been released by the GC, it would have been captured in the in-use memory profile. The top command on Linux showed the same information: very low RSS (resident set size) but high VSZ (virtual memory) usage.

What was happening here? Because we were allocating buffers of widely differing sizes - from 1 byte to 16 MB - Go's memory scavenger was not handling those requests efficiently. It was allocating new pages instead of reusing already allocated but unused pages. This was a result of fragmentation due to random buffer sizes.

We used a BufferPool to solve the memory fragmentation. We kept four different sizes for the buffer: 1 KB, 100 KB, 1 MB, and 16 MB. Even if the request is for 1 byte, we allocate a 1 KB buffer.