I ran a benchmark example and got this table:
BenchmarkDotNet=v0.12.0, OS=Windows 7 SP1 (6.1.7601.0)
Intel Xeon CPU E5-4660 v3 2.10GHz, 1 CPU, 28 logical and 14 physical cores
Frequency=2050214 Hz, Resolution=487.7540 ns, Timer=TSC
[Host] : .NET Framework 4.8 (4.8.4018.0), X86 LegacyJIT [AttachedDebugger]
DefaultJob : .NET Framework 4.8 (4.8.4018.0), X86 LegacyJIT
| Method | Mean | Error | StdDev |
|------- |----------:|---------:|---------:|
| Sha256 | 173.60 us | 3.466 us | 9.604 us |
| Md5 | 29.95 us | 0.599 us | 1.709 us |
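For reference, the code I ran is essentially the hash-comparison sample from the BenchmarkDotNet getting-started guide; this is a rough sketch of it (the exact buffer size and random seed are my guesses, but the `Sha256`/`Md5` method names match the table above):

```csharp
using System;
using System.Security.Cryptography;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class Md5VsSha256
{
    // Size of the input buffer is an assumption; it only affects the absolute numbers.
    private const int N = 10000;
    private readonly byte[] data;

    private readonly SHA256 sha256 = SHA256.Create();
    private readonly MD5 md5 = MD5.Create();

    public Md5VsSha256()
    {
        // Fill the buffer with deterministic pseudo-random bytes.
        data = new byte[N];
        new Random(42).NextBytes(data);
    }

    // Each [Benchmark] method becomes one row in the results table.
    [Benchmark]
    public byte[] Sha256() => sha256.ComputeHash(data);

    [Benchmark]
    public byte[] Md5() => md5.ComputeHash(data);
}

public class Program
{
    public static void Main(string[] args) => BenchmarkRunner.Run<Md5VsSha256>();
}
```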
Well... how do I read it?
What exactly do the Mean, Error, and StdDev columns mean?
I'm new to this and can't find any reference that explains these columns.
Can anyone provide a link that explains them?