I'm trying to display a number as a percent by rounding it with _.round
and then multiplying the result by 100. For some reason, when I multiply the rounded number, the precision gets messed up. Here's what it looks like:
var num = 0.056789,
    roundingPrecision = 4,
    roundedNum = _.round(num, roundingPrecision),
    percent = (roundedNum * 100) + '%';

console.log(roundedNum); // 0.0568
console.log(percent);    // 5.680000000000001%
Why is 0.000000000000001 being added to the number after multiplying by 100?
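
For reference, I get the same result in plain JavaScript without lodash involved at all, so it doesn't seem to be _.round itself doing it:

// Multiplying the already-rounded value by hand reproduces the offset
console.log(0.0568 * 100);          // 5.680000000000001
console.log(0.0568 * 100 === 5.68); // false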