.NET Memory Profiling

Introduction

Memory profiling is a crucial skill for .NET developers who want to build efficient, responsive applications. Even in a managed environment like .NET, where the garbage collector automatically handles memory management, memory issues can still occur and impact your application's performance and stability.

Memory profiling is the process of analyzing how your application uses memory during execution. It helps you identify:

  • Memory leaks (when objects remain in memory longer than needed)
  • Excessive memory usage
  • Inefficient object allocation patterns
  • Garbage collection bottlenecks

In this tutorial, we'll explore memory profiling in .NET applications, from understanding the basics to using professional tools for in-depth analysis.

Why Memory Profiling Matters

Even with .NET's automatic garbage collection, several memory issues can arise:

  1. Memory leaks: Objects that remain referenced when no longer needed
  2. High memory pressure: Frequent allocations causing excessive garbage collections
  3. Large object heap fragmentation: Leading to inefficient memory use
  4. Excessive memory usage: Causing application slowdowns or crashes

These issues can manifest as:

  • Applications that slow down over time
  • Increasing memory usage (visible in Task Manager)
  • OutOfMemoryException errors
  • Frequent garbage collection pauses

Basic Memory Monitoring

Monitoring with the Process and GC APIs

.NET exposes process- and GC-level statistics through the Process and GC classes, which you can read programmatically to get insight into your application's memory usage:

csharp
using System;
using System.Diagnostics;
using System.Threading;

class MemoryMonitoringDemo
{
    static void Main()
    {
        // Get the current process
        Process currentProcess = Process.GetCurrentProcess();

        // Initial memory snapshot
        Console.WriteLine("Starting memory monitoring...");
        PrintMemoryStats(currentProcess);

        // Create some objects to see memory impact
        var list = new System.Collections.Generic.List<string>();
        for (int i = 0; i < 1000000; i++)
        {
            list.Add($"Item {i} with some extra text to use more memory");

            if (i % 100000 == 0)
            {
                PrintMemoryStats(currentProcess);
                Thread.Sleep(500); // Wait to observe changes
            }
        }

        // Final memory snapshot
        PrintMemoryStats(currentProcess);

        Console.WriteLine("Press any key to exit");
        Console.ReadKey();
    }

    static void PrintMemoryStats(Process process)
    {
        // Refresh the process info
        process.Refresh();

        // Convert bytes to MB for readability
        double workingSetMB = process.WorkingSet64 / 1024.0 / 1024.0;
        double privateMemoryMB = process.PrivateMemorySize64 / 1024.0 / 1024.0;
        double managedMemoryMB = GC.GetTotalMemory(false) / 1024.0 / 1024.0;

        Console.WriteLine($"[{DateTime.Now:HH:mm:ss}]");
        Console.WriteLine($"Working Set: {workingSetMB:F2} MB");
        Console.WriteLine($"Private Bytes: {privateMemoryMB:F2} MB");
        Console.WriteLine($"Managed Memory: {managedMemoryMB:F2} MB");
        Console.WriteLine($"GC Collections: Gen 0: {GC.CollectionCount(0)}, " +
                          $"Gen 1: {GC.CollectionCount(1)}, " +
                          $"Gen 2: {GC.CollectionCount(2)}");
        Console.WriteLine();
    }
}

Output (example):

Starting memory monitoring...
[14:23:15]
Working Set: 11.42 MB
Private Bytes: 17.85 MB
Managed Memory: 0.03 MB
GC Collections: Gen 0: 0, Gen 1: 0, Gen 2: 0

[14:23:16]
Working Set: 32.68 MB
Private Bytes: 39.22 MB
Managed Memory: 15.26 MB
GC Collections: Gen 0: 1, Gen 1: 0, Gen 2: 0

...

[14:23:20]
Working Set: 156.42 MB
Private Bytes: 164.35 MB
Managed Memory: 138.55 MB
GC Collections: Gen 0: 3, Gen 1: 1, Gen 2: 0

Press any key to exit
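
On .NET Core 3.0 and later, GC.GetGCMemoryInfo() offers a richer point-in-time view of the managed heap, including heap size, fragmentation, and overall memory load. A minimal sketch:

csharp
// Requires .NET Core 3.0+ / .NET 5+
GCMemoryInfo gcInfo = GC.GetGCMemoryInfo();

Console.WriteLine($"Heap size:       {gcInfo.HeapSizeBytes / 1024.0 / 1024.0:F2} MB");
Console.WriteLine($"Fragmented:      {gcInfo.FragmentedBytes / 1024.0 / 1024.0:F2} MB");
Console.WriteLine($"Memory load:     {gcInfo.MemoryLoadBytes / 1024.0 / 1024.0:F2} MB");
Console.WriteLine($"High load limit: {gcInfo.HighMemoryLoadThresholdBytes / 1024.0 / 1024.0:F2} MB");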

Triggering Garbage Collection

During development, you may want to explicitly trigger garbage collection to observe memory behavior:

csharp
Console.WriteLine($"Before collection: {GC.GetTotalMemory(false) / 1024.0 / 1024.0:F2} MB");

// Force a full garbage collection
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

Console.WriteLine($"After collection: {GC.GetTotalMemory(true) / 1024.0 / 1024.0:F2} MB");

Common Memory Issues and How to Detect Them

1. Memory Leaks

A common cause of memory leaks in .NET is event handlers that aren't properly unsubscribed. Here's an example:

csharp
public class LeakyClass
{
    public event EventHandler SomeEvent;

    public void DoSomething()
    {
        // This method raises the event
        SomeEvent?.Invoke(this, EventArgs.Empty);
    }
}

public class Program
{
    static void Main()
    {
        var publisher = new LeakyClass();

        // This loop creates memory leaks
        for (int i = 0; i < 1000; i++)
        {
            var subscriber = new TemporaryClass();
            subscriber.Subscribe(publisher); // Subscribes to the event

            // subscriber goes out of scope here, but the publisher still has a reference
            // to it via the event delegate. Memory leak!
        }
    }
}

public class TemporaryClass
{
    public void Subscribe(LeakyClass publisher)
    {
        publisher.SomeEvent += HandleEvent; // This creates a reference from publisher to this object
    }

    private void HandleEvent(object sender, EventArgs e)
    {
        Console.WriteLine("Event handled");
    }

    // Problem: We never unsubscribe from the event!
}

Fix:

csharp
public class TemporaryClass : IDisposable
{
    private LeakyClass _publisher;

    public void Subscribe(LeakyClass publisher)
    {
        _publisher = publisher;
        _publisher.SomeEvent += HandleEvent;
    }

    private void HandleEvent(object sender, EventArgs e)
    {
        Console.WriteLine("Event handled");
    }

    public void Dispose()
    {
        // Proper cleanup
        if (_publisher != null)
        {
            _publisher.SomeEvent -= HandleEvent;
            _publisher = null;
        }
    }
}
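
With this fix, the loop from the leaky example can wrap each subscriber in a using block so the handler is detached before the object goes out of scope. A minimal sketch of the corrected loop:

csharp
var publisher = new LeakyClass();

for (int i = 0; i < 1000; i++)
{
    // Dispose() unsubscribes HandleEvent, so the publisher no longer holds
    // a reference to the subscriber and it becomes eligible for collection
    using (var subscriber = new TemporaryClass())
    {
        subscriber.Subscribe(publisher);
        publisher.DoSomething();
    }
}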

2. High Allocation Rates

Creating too many short-lived objects can put pressure on the garbage collector:

csharp
// Inefficient - creates new strings for each iteration
string result = "";
for (int i = 0; i < 10000; i++)
{
    result += i.ToString() + ", "; // Creates a new string each time
}

// Better - uses StringBuilder
var sb = new System.Text.StringBuilder();
for (int i = 0; i < 10000; i++)
{
    sb.Append(i).Append(", ");
}
string betterResult = sb.ToString();

Professional Memory Profiling Tools

While simple monitoring is helpful, professional tools offer much deeper insights:

1. Visual Studio Memory Profiler

Visual Studio includes a built-in diagnostic tool:

  1. In Visual Studio, go to Debug > Performance Profiler
  2. Select "Memory Usage" and start your session
  3. Take snapshots during execution
  4. Compare snapshots to find retained objects

2. dotMemory by JetBrains

JetBrains dotMemory provides comprehensive memory analysis:

csharp
// Sample code to analyze with dotMemory
using System;
using System.Collections.Generic;
using System.Threading;

class Program
{
    static List<byte[]> _cache = new List<byte[]>();

    static void Main()
    {
        Console.WriteLine("Press Enter to allocate memory...");
        Console.ReadLine();

        // Allocate some data
        for (int i = 0; i < 100; i++)
        {
            _cache.Add(new byte[1024 * 1024]); // Add 1MB chunks to our "cache"
            Console.WriteLine($"Allocated {i + 1} MB");
            Thread.Sleep(100);
        }

        Console.WriteLine("Press Enter to exit");
        Console.ReadLine();
    }
}

With dotMemory, you can:

  • Track object allocations
  • Find memory leaks
  • View object retention paths
  • Analyze GC behavior

3. ANTS Memory Profiler by Redgate

ANTS Memory Profiler is another powerful memory profiling tool. After capturing a memory snapshot, you can:

  • See memory usage trends
  • Identify which objects are using the most memory
  • Find references keeping objects alive
  • Detect memory leaks

Practical Memory Profiling Walkthrough

Let's walk through a complete memory profiling scenario:

Step 1: Identify the Problem

Imagine a web API that slowly consumes more memory over time. Users report it gets slower after running for several days.

Step 2: Reproduce and Monitor

Create a load test or simulate real usage patterns while monitoring memory:

csharp
// In a diagnostic endpoint of your API
[Route("api/diagnostics/memory")]
public IActionResult GetMemoryStats()
{
    var stats = new
    {
        TotalMemoryMB = GC.GetTotalMemory(false) / 1024.0 / 1024.0,
        Gen0Collections = GC.CollectionCount(0),
        Gen1Collections = GC.CollectionCount(1),
        Gen2Collections = GC.CollectionCount(2),
        ProcessMemoryMB = Process.GetCurrentProcess().WorkingSet64 / 1024.0 / 1024.0
    };

    return Ok(stats);
}

Step 3: Capture Memory Snapshots

Using Visual Studio Memory Profiler:

  1. Attach to the running process
  2. Capture a baseline snapshot
  3. Perform operations that might cause memory issues
  4. Take another snapshot
  5. Compare snapshots

Step 4: Analyze a Real Memory Leak

Let's examine a common memory leak in ASP.NET Core:

csharp
public class LeakyService
{
    private static List<byte[]> _leakedData = new List<byte[]>();

    public void ProcessRequest(int id)
    {
        // Simulating a memory leak - storing data in a static collection
        _leakedData.Add(new byte[1024 * 1024]); // 1MB per request
    }
}

This service, when used with dependency injection, would cause the application to leak 1MB per request.
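
For example, a registration like the following (hypothetical wiring, assuming ASP.NET Core's minimal hosting model) looks harmless, yet the chosen lifetime makes no difference because the static list outlives every instance:

csharp
// Program.cs - hypothetical registration; even Transient or Scoped lifetimes leak here,
// because _leakedData is static and is never trimmed
builder.Services.AddScoped<LeakyService>();

// Each request grows the static list by 1MB
app.MapGet("/process/{id:int}", (int id, LeakyService service) =>
{
    service.ProcessRequest(id);
    return Results.Ok();
});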

Step 5: Fix the Issue

After identifying the leak, we fix it:

csharp
// Requires the Microsoft.Extensions.Caching.Memory package
public class FixedService
{
    // Using a cache with size limits
    private static MemoryCache _cache = new MemoryCache(
        new MemoryCacheOptions
        {
            SizeLimit = 100 // 100 size units; each entry below counts as 1 unit (1MB)
        });

    public void ProcessRequest(int id)
    {
        var cacheKey = $"request-{id}";
        var data = new byte[1024 * 1024]; // 1MB

        // Add with expiration and size
        _cache.Set(cacheKey, data, new MemoryCacheEntryOptions
        {
            Size = 1, // Size units (1MB in our case)
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
        });
    }
}

Step 6: Verify the Fix

Run the profiler again to verify memory usage stabilizes.
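
You can also automate a rough check: drive the fixed service in a loop and confirm that managed memory levels off instead of growing linearly. A simple sketch reusing the FixedService above:

csharp
var service = new FixedService();

for (int i = 1; i <= 1000; i++)
{
    service.ProcessRequest(i);

    if (i % 100 == 0)
    {
        // With the size-limited, expiring cache this value should plateau
        // instead of growing by roughly 1MB per request
        double mb = GC.GetTotalMemory(forceFullCollection: true) / 1024.0 / 1024.0;
        Console.WriteLine($"After {i} requests: {mb:F2} MB");
    }
}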

Advanced Memory Profiling Topics

Large Object Heap (LOH)

Objects larger than 85,000 bytes (roughly 85KB) are allocated on the Large Object Heap, which is collected less frequently (only as part of Gen 2 collections):

csharp
// This allocation stays on the small object heap (80 * 1024 = 81,920 bytes)
byte[] smallArray = new byte[80 * 1024]; // 80KB

// This allocation goes to the Large Object Heap (86 * 1024 = 88,064 bytes > 85,000)
byte[] largeArray = new byte[86 * 1024]; // 86KB

To avoid LOH fragmentation:

  • Reuse large objects instead of repeatedly allocating them
  • Consider object pooling for large objects

csharp
// Example of object pooling for large arrays
using Microsoft.Extensions.ObjectPool;

public class LargeBufferPoolPolicy : PooledObjectPolicy<byte[]>
{
    public override byte[] Create()
    {
        return new byte[86 * 1024]; // Create a large buffer
    }

    public override bool Return(byte[] obj)
    {
        // Clear the buffer before returning to pool
        Array.Clear(obj, 0, obj.Length);
        return true;
    }
}

// Usage:
var pool = new DefaultObjectPool<byte[]>(new LargeBufferPoolPolicy());

// Get a buffer from the pool
byte[] buffer = pool.Get();

// Use the buffer...

// Return it to the pool when done
pool.Return(buffer);
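
For raw buffers, the built-in ArrayPool<T> from System.Buffers is often simpler than writing a custom policy (note that Rent may hand back an array larger than requested):

csharp
using System.Buffers;

// Rent a buffer of at least 86KB from the shared pool
byte[] buffer = ArrayPool<byte>.Shared.Rent(86 * 1024);

try
{
    // Use the buffer (only the requested length is guaranteed)...
}
finally
{
    // Return it; clearArray wipes the contents before the array is reused
    ArrayPool<byte>.Shared.Return(buffer, clearArray: true);
}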

Memory Dumps Analysis

For production issues, you can create and analyze memory dumps:

  1. Create a dump using Task Manager, ProcDump, or DebugDiag (or from code, as sketched after this list)
  2. Analyze it with WinDbg, Visual Studio, or dotMemory
  3. Look for large object counts and reference chains
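
If you need to capture a dump from inside the application itself (for example, behind a diagnostics endpoint), the Microsoft.Diagnostics.NETCore.Client package can write one for the current process; the dotnet-dump and dotnet-gcdump global tools cover the same scenario from the command line. A hedged sketch, assuming that package is referenced and the target directory exists:

csharp
using System.Diagnostics;
using Microsoft.Diagnostics.NETCore.Client;

// Write a full dump (including the managed heap) of the current process
int pid = Process.GetCurrentProcess().Id;
var client = new DiagnosticsClient(pid);
client.WriteDump(DumpType.Full, @"C:\dumps\myapp.dmp");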

Summary

Memory profiling is an essential skill for .NET developers. While the garbage collector handles most memory management automatically, understanding how your application uses memory helps you create more efficient, reliable software.

Key takeaways:

  • Monitor your application's memory usage during development
  • Use specialized tools for deep memory analysis
  • Watch for common issues like event handler leaks and high allocation rates
  • Consider object pooling for large or frequently allocated objects
  • Test memory usage under realistic loads

By mastering these memory profiling techniques, you'll be equipped to diagnose and solve memory issues in your .NET applications, leading to better performance and stability.

Exercises

  1. Create a simple console application that deliberately leaks memory, then use a memory profiler to identify the leak.
  2. Compare the memory usage between using string concatenation and StringBuilder for building large strings.
  3. Implement an object pooling solution for a class that uses large arrays or collections.
  4. Profile a real application you've developed and identify opportunities for memory optimization.
  5. Create a web API endpoint that exposes memory statistics, then monitor it under load to observe memory patterns.

