
%f in user-facing strings is dangerous. Depending on the architecture, the programming language involved, the version of that language, and compiler optimization flags, the results can vary slightly. And if multiple languages are involved in the serving stack, it is almost impossible to reason about the outcome. If those variations are immaterial, use %.1f or %.2f to get one or two digits of precision after the decimal point, respectively. Otherwise, don't use %f at all.
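A minimal sketch of the underlying problem, using Python's printf-style formatting (the same issue applies to C's printf and its relatives): the stored binary value of a decimal number is rarely exact, %f's default six digits can mask or expose that depending on the environment, while an explicit %.2f pins the output to a fixed, predictable precision.

```python
# 0.1 + 0.2 is not exactly 0.3 in binary floating point.
value = 0.1 + 0.2

print("%f" % value)    # default precision (6 digits) happens to hide the error
print(repr(value))     # the actual stored value, with its representation error
print("%.2f" % value)  # explicit two-digit precision: stable, user-friendly output
```

Here the default %f output looks clean only by accident; a different value, language, or rounding mode may leak the representation error into what the user sees, which is why pinning the precision is the safer choice.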