I'm having a problem with some C code for a microcontroller. It looks like a language or compiler problem, yet the program compiles cleanly; the problem only shows up when the code is run on the microcontroller.
(This is for an Atmel AVR ATtiny26.)
I'm reading a value from the analog-to-digital converter (ADC) and multiplying it by 10:
int SD;
SD = ADCH * 10;  /* read the ADC high byte and scale it in one expression */
This did not work correctly. I thought that reading the ADC register directly inside a calculation might be the problem, so I tried:
int SD;
SD = ADCH;     /* copy the ADC high byte first */
SD = SD * 10;  /* then scale it */
This also did not work, and it makes the micro unstable. (It works as long as the analog value stays low, but once a certain level is reached, no value is ever obtained from the ADC again until a reset.)
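For context, here is roughly how I configure and poll the ADC (a minimal sketch: adc_init/adc_read are stand-ins for my actual routines, and the register/bit names are the avr/io.h ones; on the tiny26 the control register may be spelled ADCSR rather than ADCSRA):

#include <avr/io.h>

void adc_init(void)
{
    ADMUX  = (1 << ADLAR);               /* left-adjust the result so ADCH holds the top 8 bits */
    ADCSRA = (1 << ADEN) | (1 << ADPS2); /* enable the ADC with a /16 clock prescaler */
}

unsigned char adc_read(void)
{
    ADCSRA |= (1 << ADSC);        /* start a single conversion */
    while (ADCSRA & (1 << ADSC))  /* ADSC clears itself when the conversion finishes */
        ;
    return ADCH;                  /* 8-bit reading, valid because ADLAR is set */
}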
The following, however, does seem to work:
int SD;
int TEMP;
TEMP = ADCH;     /* read the ADC high byte into its own variable */
SD = TEMP * 10;  /* multiply only after the register read */
Introducing another variable fixes the problem, but I don't see why it should. What's going on?
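If it's relevant: ADCH is not an ordinary variable but a memory-mapped special function register, and as I understand it avr/io.h declares it as a volatile access, roughly like this (the I/O address here is from memory and may be off for the tiny26):

#define ADCH _SFR_IO8(0x05)
/* ...which avr/sfr_defs.h expands to something like: */
#define ADCH (*(volatile uint8_t *)(0x05 + 0x20))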
Related question on EE: https://electronics.stackexchange.com/q/38404/2028
Edit:
This may have something to do with compiler optimization. I get different results when I compile with -Os versus -O2 or -O3. Could the optimization level change how such a variable assignment is compiled?
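For reference, I'm comparing builds along these lines (file names are placeholders; only the -O flag changes between runs):

avr-gcc -mmcu=attiny26 -Os -c main.c -o main.o
avr-gcc -mmcu=attiny26 -O2 -c main.c -o main.o

and then looking at the generated code with avr-objdump -d main.o.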