Take advantage of Microsoft.IO.RecyclableMemoryStream to eliminate LOH allocations and avoid memory fragmentation and memory leaks in your .NET Core applications.

Microsoft.IO.RecyclableMemoryStream is a high-performance library designed to improve application performance when working with streams. It is a drop-in replacement for MemoryStream that delivers better performance than MemoryStream instances. You can use RecyclableMemoryStream to eliminate LOH (large object heap) allocations and to avoid memory fragmentation and memory leaks. This article discusses the Microsoft.IO.RecyclableMemoryStream library, its purpose, and how it can be used in .NET Core applications to boost application performance.

To work with the code examples provided in this article, you should have Visual Studio 2019 installed on your system. If you don’t already have a copy, you can download Visual Studio 2019 here.

Create a .NET Core console application project in Visual Studio

First off, let’s create a .NET Core console application project in Visual Studio. Assuming Visual Studio 2019 is installed on your system, follow the steps outlined below to create a new .NET Core console application project.

1. Launch the Visual Studio IDE.
2. Click on “Create new project.”
3. In the “Create new project” window, select “Console App (.NET Core)” from the list of templates displayed.
4. Click Next.
5. In the “Configure your new project” window shown next, specify the name and location for the new project.
6. Click Create.

This will create a new .NET Core console application project in Visual Studio 2019. We’ll use this project to work with Microsoft.IO.RecyclableMemoryStream in the subsequent sections of this article.

RecyclableMemoryStream benefits

Microsoft.IO.RecyclableMemoryStream provides the following benefits:

- Eliminates LOH allocations by using pooled buffers.
- Incurs far fewer generation 2 garbage collections and spends much less time paused while a GC operation is in progress.
- Avoids memory fragmentation and memory leaks.
- Provides metrics that can be used for tracking and analyzing performance.

How RecyclableMemoryStream works

RecyclableMemoryStream stores the large buffers used for streams in the generation 2 heap and ensures that these buffers stay there for the lifetime of the process. This also ensures that full collections occur infrequently. The RecyclableMemoryStreamManager class maintains two separate pools:

- Small pool – contains small buffers that are used in read/write operations
- Large pool – contains large buffers that are used only when you need a contiguous buffer

By default, the small pool hands out blocks of 128 KB each and the large pool hands out buffers in multiples of 1 MB. The large pool comes in two versions: the linear large pool, which is the default and grows linearly, and the exponential large pool, whose buffer sizes double for each slot.

A RecyclableMemoryStream instance starts by allocating a small buffer; additional buffers are chained together as the stream’s capacity increases. Usually, the small pool is used for normal read/write operations, and memory usage stays optimized and efficient because the buffers are abstracted from you. This is exactly why RecyclableMemoryStream is much more efficient than MemoryStream. The large pool is used only when the application needs a contiguous memory block. When you call the GetBuffer method, as shown below, the small buffers are converted to a single, large, contiguous buffer.

var buffer = recyclableMemoryStreamManager.GetStream().GetBuffer();

Again, the large pool can be either of two types, linear (the default) or exponential. Although the streams themselves are not thread-safe, the RecyclableMemoryStreamManager class is thread-safe.
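To make the contiguous-buffer behavior concrete, here is a minimal sketch. The tag string and payload size are arbitrary choices for illustration; the key point is that the pooled array returned by GetBuffer may be longer than the data it holds, so it should always be paired with the stream’s Length property.

using System;
using Microsoft.IO;

class ContiguousBufferDemo
{
    static void Main()
    {
        var manager = new RecyclableMemoryStreamManager();

        using (var stream = manager.GetStream("ContiguousBufferDemo.Main"))
        {
            // Writing more than the default 128 KB block size causes the stream
            // to chain additional small-pool blocks behind the scenes.
            var payload = new byte[200 * 1024];
            stream.Write(payload, 0, payload.Length);

            // GetBuffer replaces the chained blocks with a single contiguous
            // buffer drawn from the large pool. The pooled array may be longer
            // than the data, so pair it with stream.Length.
            byte[] contiguous = stream.GetBuffer();
            Console.WriteLine($"Data: {stream.Length} bytes, buffer: {contiguous.Length} bytes");
        } // Disposing the stream returns its buffers to the pool.
    }
}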
Install the RecyclableMemoryStream NuGet package

The RecyclableMemoryStream library is available as a NuGet package, so to get started working with it you must install it from NuGet. You can install it either from the NuGet package manager or by running the following command in the NuGet package manager console window.

Install-Package Microsoft.IO.RecyclableMemoryStream

Create a memory stream instance in .NET Core

Assuming that Microsoft.IO.RecyclableMemoryStream has been installed in your project, you can write the following code to write data to a memory stream. Note the usage of the RecyclableMemoryStreamManager class. The GetStream method of the RecyclableMemoryStreamManager class returns a memory stream instance.

class Program
{
    private static readonly RecyclableMemoryStreamManager recyclableMemoryStreamManager =
        new RecyclableMemoryStreamManager();

    static void Main(string[] args)
    {
        string data = "This is a sample text message.";
        var buffer = Encoding.ASCII.GetBytes(data);

        using (var memoryStream = recyclableMemoryStreamManager.GetStream())
        {
            memoryStream.Write(buffer, 0, buffer.Length);
        }

        Console.ReadKey();
    }
}

It should be noted here that we’ve declared the RecyclableMemoryStreamManager instance just once; the instance should live for the lifetime of the process. You can optionally provide a string tag when calling the GetStream method. The following code snippet illustrates this.

using (var memoryStream = recyclableMemoryStreamManager.GetStream("High_Performance_Stream_Demo.Program.Main"))
{
    memoryStream.Write(buffer, 0, buffer.Length);
}

Change the parameters of a memory stream pool in .NET Core

It is also possible to customize the pool, i.e., change the parameters of the pool, when creating an instance of RecyclableMemoryStreamManager. The following code snippet illustrates how this can be achieved.

int blockSize = 1024;
int largeBufferMultiple = 1024 * 1024;
int maximumBufferSize = 16 * largeBufferMultiple;
int maximumFreeLargePoolBytes = maximumBufferSize * 4;
int maximumFreeSmallPoolBytes = 250 * blockSize;

var recyclableMemoryStreamManager = new RecyclableMemoryStreamManager(blockSize, largeBufferMultiple, maximumBufferSize);
recyclableMemoryStreamManager.AggressiveBufferReturn = true;
recyclableMemoryStreamManager.GenerateCallStacks = true;
recyclableMemoryStreamManager.MaximumFreeLargePoolBytes = maximumFreeLargePoolBytes;
recyclableMemoryStreamManager.MaximumFreeSmallPoolBytes = maximumFreeSmallPoolBytes;

RecyclableMemoryStream best practices

Memory fragmentation can impact the performance of your application, and the large object heap in .NET is prone to fragmentation. The following guidelines should be adhered to when working with RecyclableMemoryStream:

- Set blockSize, largeBufferMultiple, maxBufferSize, MaximumFreeLargePoolBytes, and MaximumFreeSmallPoolBytes to values appropriate for your workload.
- Dispose of any stream object as soon as you’re done using it.
- Never call the ToArray method.
- Avoid calling the GetBuffer method; copy the stream’s contents to their destination instead, as shown in the sketch below.
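The last two guidelines amount to copying the stream’s contents directly to wherever they need to go rather than materializing them in a new array. Here is a minimal sketch of that pattern; the output file name is a hypothetical placeholder, and WriteTo writes the stream’s entire contents to the destination stream without allocating a new array.

using System.IO;
using System.Text;
using Microsoft.IO;

class CopyWithoutToArrayDemo
{
    private static readonly RecyclableMemoryStreamManager manager =
        new RecyclableMemoryStreamManager();

    static void Main()
    {
        byte[] buffer = Encoding.ASCII.GetBytes("This is a sample text message.");

        using (var memoryStream = manager.GetStream("CopyWithoutToArrayDemo.Main"))
        using (var file = File.Create("output.bin")) // Hypothetical destination file.
        {
            memoryStream.Write(buffer, 0, buffer.Length);

            // WriteTo copies the stream's entire contents to the destination,
            // so there is no need for ToArray (which allocates a new array)
            // or GetBuffer (which exposes the pooled buffer).
            memoryStream.WriteTo(file);
        } // Disposing the stream promptly returns its buffers to the pool.
    }
}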
Microsoft.IO.RecyclableMemoryStream is a pooled memory stream allocator that is adept at reducing GC load and improving the performance of your applications. It takes advantage of pooled buffers to eliminate large object heap (LOH) allocations, and it not only avoids memory fragmentation and memory leaks but also provides metrics that can be used for tracking performance.