
Is there any performance difference between

if (flag) {...}

and

if (flag != 0) {...}

considering both compilation time and the resources the compiler needs?

Stoica Dan
  • Types, types... – Sourav Ghosh Jan 18 '17 at 09:36
  • To clarify, you're not asking about the resulting code being generated by the compiler, but about the compilation process itself? – Some programmer dude Jan 18 '17 at 09:40
  • More text means the compiler has to parse more text. Might be a couple of nanoseconds in difference there. – Lundin Jan 18 '17 at 09:41
  • @Some programmer dude YES – Stoica Dan Jan 18 '17 at 09:54
  • 1
    Then yes, there's a difference, but you would need an *extremely* high-resolution clock to notice it, or millions or even *tens* of millions of such statements in a single source file. There might not even *be* a measurable difference, since the compiler must do different things for the two expressions, and those differences might cancel each other out. – Some programmer dude Jan 18 '17 at 09:55
  • To be fair, the author is clearly asking about the _performance differences_ of the two approaches, while the linked duplicate seems exclusively about questions of style, readability, maintenance, etc. The author could have made it clear in the title though, rather than leaving it to the last sentence. – BeeOnRope Jan 18 '17 at 20:02

1 Answer


No, there is no practical difference if flag is an integral type: both expressions evaluate to true exactly when flag is non-zero, so the compiler treats them identically.
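You can verify this yourself with a minimal sketch like the one below (the function names are illustrative, not from the original post): compile it with, say, gcc -O1 -S or clang -O1 -S and compare the assembly; mainstream compilers emit identical code for both functions.

/* Both forms test whether flag is non-zero; the compiler normalizes
   them to the same internal representation, so the emitted code is
   identical. */
int test_implicit(int flag) {
    if (flag) {          /* implicit comparison against zero */
        return 1;
    }
    return 0;
}

int test_explicit(int flag) {
    if (flag != 0) {     /* explicit comparison against zero */
        return 1;
    }
    return 0;
}

As for compilation time, which the comments above address: the explicit form merely adds a few tokens for the lexer and parser to process, so any difference is on the order of nanoseconds and far below the noise floor of a normal build.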

Marc Balmer