Just curious whether someone can provide a short example of code that causes a compiler's optimizer to make a counterproductive decision:
- without optimization (or at lower optimization), code takes X seconds to run
- with optimization (or at higher optimization), code takes Y seconds to run where Y > X
In other words, the optimizer's cost model causes it to make a poor decision in a specific case, even though (presumably) it makes good decisions in general.
This is meant for C (with gcc, the comparison should be between -O0, -O1, -O2, and -O3, in increasing order of optimization), but any compiled language should work.
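For concreteness, here is the sort of candidate I have in mind. This is only a minimal sketch, and I have no idea whether it actually demonstrates the effect on any given machine; the idea is that -O3's auto-vectorization and unrolling of a loop with an unknown but tiny trip count might add enough prologue/check overhead to lose to the plain scalar code from -O2. The function name and iteration counts are placeholders.

```c
#include <stdio.h>

/* noinline so gcc cannot see that the trip count is tiny and
 * constant-fold the whole thing away. gcc-specific attribute. */
__attribute__((noinline))
static long sum(const long *a, long n)
{
    long s = 0;
    for (long i = 0; i < n; i++)
        s += a[i];
    return s;
}

int main(void)
{
    long a[4] = {1, 2, 3, 4};
    long total = 0;

    /* Many calls with n = 3: at -O3 the loop in sum() may be
     * vectorized/unrolled, and for such a short trip count the
     * extra setup could cost more than it saves. */
    for (long i = 0; i < 200000000L; i++) {
        total += sum(a, 3);
        a[0] = total & 1;   /* feed the result back so the call cannot be hoisted */
    }

    printf("%ld\n", total);
    return 0;
}
```

Something like `gcc -O2 bench.c -o bench2` versus `gcc -O3 bench.c -o bench3`, timed with `time ./bench2` and `time ./bench3`, is the kind of comparison I mean; whether -O3 actually comes out slower here will depend on the gcc version and CPU.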
I once had code that ran faster when compiled with -Os (which is basically -O2 plus trying to save space) than when using -O3. Does that count?