Since, as far as I'm aware, none of you produce end-user software: do you leave debug symbols on in your production builds or not? And what level of compiler optimization do you use?
I suppose I am assuming you're producing compiled code. Are you?
> debug symbols
It varies by project. Safety-critical and time-critical systems generally ship with optimizations disabled and debug symbols left in, on the assumption that you did most of your testing that way and don't dare add uncertainty.
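For concreteness, that policy usually comes down to the compiler flags. A sketch for gcc and VC++ (illustrative, not from any particular project above):

    # Ship the binary exactly as tested: no optimization, debug info kept
    gcc -O0 -g -o app main.c

    # Rough VC++ equivalent: /Od disables optimization, /Zi emits debug info
    cl /Od /Zi main.c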
Another system was supposed to ship optimized, but optimization had to be turned off when we detected a bad optimization that corrupted data. We tracked that one down, but couldn't be sure there weren't others, so it was safer to leave it off.
Something similar just happened here: an optimized Linux (gcc) build corrupted data while the Win32 (VC++) optimized and debug builds and the Linux debug build all worked fine. For that one, we stuck a couple of "volatile" qualifiers in to stop the local optimizations and called it a day.
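For the curious, here's a minimal sketch of what that kind of one-word fix looks like. This is a made-up reduction, not the actual code: assume the optimizer was caching a loop index in a register and the cached copy went stale.

    #include <stddef.h>
    #include <stdio.h>

    /* Hypothetical stand-in for the miscompiled routine. Marking the
     * index volatile forces every read and write of it through memory,
     * suppressing the local optimization that broke the build. */
    static void copy_bytes(char *dst, const char *src, size_t n)
    {
        volatile size_t i;  /* was: size_t i; -- the whole fix */
        for (i = 0; i < n; i++)
            dst[i] = src[i];
    }

    int main(void)
    {
        const char src[] = "production data";
        char dst[sizeof src];
        copy_bytes(dst, src, sizeof src);
        printf("%s\n", dst);
        return 0;
    }

volatile is a blunt instrument since it pessimizes every access to that one variable, but it stays local to the spot that miscompiled, which is the whole appeal over dropping the entire build back to -O0.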
I feel like I just beat a kitten to death... with a bag of puppies.