I understand that:
public int divide() {
    return 23/4; /* gives 5: since 23 and 4 are both ints, this is integer
                    division and the fractional part is truncated -- at least
                    that's my understanding */
}
and also:
public double divide() {
    return 23.0/4.0; /* gives 5.75, since 23.0 and 4.0 are double literals,
                        not ints */
}
I have this code:
public double divide() {
    double result = 23/4;
    return result;
}
Why is it that even when I assign 23/4 to a double, I still get just 5.0? Also, please check whether I understood the first two examples correctly.
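For context, here is a small sketch of what I have tried (the class and method names are just mine for illustration). The right-hand side 23/4 is evaluated as an int before the assignment, so the truncation happens before the value is widened to double; casting one operand first seems to avoid it:

```java
public class DivisionDemo {
    // Both operands are int, so 23/4 is computed as integer division (5)
    // and only then widened to double (5.0).
    static double divideInt() {
        double result = 23 / 4;
        return result;
    }

    // Casting one operand to double first makes the division itself
    // happen in double arithmetic: 23.0 / 4 -> 5.75.
    static double divideDouble() {
        double result = (double) 23 / 4;
        return result;
    }

    public static void main(String[] args) {
        System.out.println(divideInt());    // prints 5.0
        System.out.println(divideDouble()); // prints 5.75
    }
}
```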
Thanks.
EDIT:
I got my answer. Thanks to everyone who helped.