I'm playing around with BenchmarkDotNet and its MemoryDiagnoser feature.
Consider the following benchmark:
```csharp
[Benchmark]
public void Dummy()
{
    var buffer = new byte[1];
}
```
I expect it to allocate exactly 1 byte.
But the benchmark result shows that a total of 32 bytes were allocated. How come? I find this quite misleading.
| Method | Mean | Error | StdDev | Median | Ratio | Rank | Gen 0 | Gen 1 | Gen 2 | Allocated |
|------- |---------:|----------:|----------:|---------:|------:|-----:|-------:|------:|------:|----------:|
| Dummy | 4.486 ns | 0.1762 ns | 0.5196 ns | 4.650 ns | 1.00 | 1 | 0.0038 | - | - | 32 B |
Why does the `Allocated` column say 32 B and not 1 B?
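To rule out BenchmarkDotNet itself, I also tried cross-checking the number with `GC.GetAllocatedBytesForCurrentThread()` (available on .NET Core 3.0+). This is just a rough sketch, not a precise measurement harness; the `NoInlining` attribute is there so the allocation isn't optimized away:

```csharp
using System;
using System.Runtime.CompilerServices;

class Program
{
    [MethodImpl(MethodImplOptions.NoInlining)]
    static byte[] Allocate() => new byte[1];

    static void Main()
    {
        // Warm up so one-time allocations (JIT, tiering) don't skew the delta.
        Allocate();

        long before = GC.GetAllocatedBytesForCurrentThread();
        var buffer = Allocate();
        long after = GC.GetAllocatedBytesForCurrentThread();

        // On a 64-bit runtime this reports 32 as well: the array carries
        // ~24 bytes of overhead (object header, method table pointer,
        // length field) plus 1 byte of data, rounded up to the heap's
        // alignment granularity.
        Console.WriteLine(after - before);
        GC.KeepAlive(buffer);
    }
}
```

So the diagnoser seems to be reporting the true heap cost of the object, not just its payload, but I'd like to confirm that reading.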