I'm baffled by what these numbers mean. It seems to me that printf is giving me wrong results.
echo printf("%.2f", 1);
// 1.004
echo printf("%.3f", 1);
// 1.0005
echo printf("%.2f", 1.1234);
// 1.124
First of all, it seems to print too many decimals, and I have no idea where those extra trailing digits come from. Can someone shed some light on this?
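In case it helps, here is a minimal sketch of how I tried to narrow it down, separating whatever printf sends to the output from whatever it returns (the variable name $result is just for illustration):

<?php
// echo printf(...) mixes two things on one line:
// what printf itself prints, and what echo then prints.
// Capture printf's return value so the two can be seen separately.
$result = printf("%.2f", 1); // printf writes its formatted string here
echo PHP_EOL;
var_dump($result);           // shows the value that echo was appending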