I'm trying to understand the exact behaviour of DecimalFormat. I'm currently experimenting with the scientific-notation capabilities of this class, and I'm having trouble controlling the exact number of significant digits in scientific notation. According to the Java 7 Javadoc:
The number of significant digits in the mantissa is the sum of the minimum integer and maximum fraction digits, and is unaffected by the maximum integer digits. For example, 12345 formatted with "##0.##E0" is "12.3E3". To show all digits, set the significant digits count to zero. The number of significant digits does not affect parsing.
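As a sanity check on my reading of the pattern, the digit-count getters seem to confirm that "##0.##E0" asks for 1 + 2 = 3 significant digits (this is just my own check, assuming I'm interpreting the getters correctly):

import java.text.DecimalFormat;

DecimalFormat df = new DecimalFormat("##0.##E0");
System.out.println(df.getMinimumIntegerDigits());  // 1 (the single '0' before the '.')
System.out.println(df.getMaximumFractionDigits()); // 2 (the two '#' after the '.')
// minimum integer + maximum fraction = 3 significant digits expected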
Consequently, I tested it:
double toF = 12345; // the Javadoc's example value, inferred from the output below
DecimalFormat formatone = new DecimalFormat("##0.##E0");
System.out.println(formatone.format(toF));
And I obtain the following output (the comma being the decimal separator of my default locale):
12,345E3
Based on the Javadoc excerpt above, I thought I would obtain:
12.3E3
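To make sure the comma itself wasn't the issue, I also tried pinning the symbols to the US locale (my own variant; I would expect it to print the same digits with a dot, i.e. 12.345E3 rather than 12.3E3):

import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

DecimalFormat usFormat = new DecimalFormat("##0.##E0",
        DecimalFormatSymbols.getInstance(Locale.US));
System.out.println(usFormat.format(12345)); // I'd expect "12.345E3" — still all five digits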
Am I doing something wrong? Have I misunderstood something?
Thanks in advance for all your clarifications :-)