Because of JSLint, I almost always use `i += 1` to increment a JavaScript for loop, but for quick and dirty scripts, I use `i++` instead.
However, I see a lot of for loops in other people's code in which they increment `i` by doing `++i` instead.
As far as I know, `i++` and `++i` behave identically as the update expression of a for loop, since the value the expression returns (the old value for `i++`, the new value for `++i`) is simply discarded there, and jsPerf shows no difference in performance.
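For example (just a quick sketch of what I mean), both forms produce the same loop; the distinction only shows up when the expression's value is actually used:

```js
// Both loops print 0, 1, 2: the value of the update expression is discarded.
for (var i = 0; i < 3; i++) {
  console.log(i);
}
for (var j = 0; j < 3; ++j) {
  console.log(j);
}

// The difference only matters when the returned value is used:
var a = 0;
console.log(a++); // 0 (post-increment returns the old value, then increments)
var b = 0;
console.log(++b); // 1 (pre-increment increments first, then returns the new value)
```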
As such, I'm wondering where the convention of doing `++i` comes from and why people tend to do it.
Does anyone know why a lot of JS coders tend to prefer `++i` over `i++` when incrementing the counter in a for loop?
Thanks.