Please look at the code below and the result in console.
NSString *strRatio = @"0.03";
float f = [strRatio floatValue];
NSLog(@"%f, %@", f, f == 0.03 ? @"equal" : @"not equal");
result:
0.030000, not equal
I also have a screenshot: when I set a breakpoint at the NSLog line, the debugger shows a different value for f:
0.0299999993...
Can anyone explain this?
- Why is f == 0.03 false?
- Why is f printed as 0.030000 in the console, but shown as 0.0299999993 in the debugger?
Edit:
I expected the value of f to be exactly 0.03 after converting from @"0.03". How can I achieve that?
It seems that float can't represent 0.03 exactly. Even if I assign 0.03 to a float directly, I get 0.0299999993 as the result.