
Chicago C/C++ Users Group Message Board

Undefined Behavior

Nevin ".
nliber
Libertyville, IL
Post #: 71
Conrad W.
CWeisert
Chicago, IL
Post #: 92
I suppose some of that is amusing to insiders, but for serious software development there's nothing at all amusing about the cost of such beginners' gaffes as using the value of an uninitialized variable, integer overflow, division by zero, buffer overrun, or out-of-bounds array indexing.

Blaming the compiler or, worse, the language for permitting such easily avoided errors is childish. Competent programmers have known how to avoid them since the era of assembly-language programming, preferably in initial coding, but certainly after thorough testing. Object-oriented technology makes it even easier.

That's not to claim that good programmers never make such careless errors. I'm occasionally guilty myself, but I'm then appropriately embarrassed and I do not blame my tools.
Nevin ".
nliber
Libertyville, IL
Post #: 73
beginners' gaffes as using the value of an uninitialized variable,

But what if C had picked as a default that a variable must be initialized unless you explicitly said otherwise? For instance, if there was an "uninitialized" keyword:

int i = 0; /* okay */
int j; /* compile time error */
int k = uninitialized; /* okay */

Wouldn't that have been a better language choice? (Note: I'm not blaming K&R for this, as I'm not sure we knew any better back in the 70s.)
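As a rough sketch, that rule can be approximated in today's C++ with a wrapper type whose default constructor is deleted (the name explicit_init is made up purely for illustration):

template <class T>
struct explicit_init
{
    T value;
    explicit_init() = delete;          // the "int j;" case becomes a compile time error
    explicit_init(T v) : value(v) {}   // a value must be supplied at the declaration
    operator T&() { return value; }    // use it like a plain T afterwards
};

explicit_init<int> i = 0; /* okay */
/* explicit_init<int> j; */ /* compile time error */
int k; /* a plain int still opts out, like "uninitialized" */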

Blaming the compiler or, worse, the language for permitting such easily avoided errors is childish.

Is it? C++ deliberately chooses undefined behavior for performance over safety. Other languages (such as Java) make other choices.
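One concrete instance of that tradeoff (a sketch, not tied to any particular compiler): signed integer overflow is undefined in C++, so an optimizer may assume it never happens, while Java defines the same arithmetic to wrap and therefore has to evaluate the comparison.

bool alwaysTrue(int x)
{ return x + 1 > x; } // UB if x == INT_MAX, so a C++ compiler may fold this to "return true"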

Competent programmers have known how to avoid them since the era of assembly-language programming, preferably in initial coding, but certainly after thorough testing.

Testing cannot tell you whether you have violated parts of the standard. You might not find out for a decade or more, or ever, depending on optimization technology.

For instance, take the following function:

int a[97];

bool inRange(int i)
{ return &a[0] <= &a[i] && &a[i] <= &a[97]; }

A conforming C or C++ compiler is allowed to *always* return true, because for any value of i outside of [0..97] undefined behavior is invoked. How do you test for that if you currently have a compiler which does not perform that optimization? Heck, inRange might be your test function.
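To make that concrete, here is a sketch of what a conforming optimizer is permitted to emit for inRange:

bool inRange(int /* i: assumed to be in [0..97] */)
{ return true; } // legal, since any other value of i would have invoked undefined behavior

One well-defined way to write the check is against the index itself: return 0 <= i && i <= 97;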

A more complicated example:

#include <cassert>
#include <iostream>

struct A { void b() { std::cout << "Hello\n"; } };

void LaunchTheMissiles(); // defined elsewhere

void C(A* a)
{
    a->b();      // undefined behavior if a is null
    assert(a);
    LaunchTheMissiles();
}

int main() { C(0); } // does the assert fire, or are the missiles launched?

The above illustrates the impact of a highly controversial optimization added to gcc a number of years ago, but ultimately, there is nothing in the standard forbidding that optimization. Undefined behavior means the compiler can do whatever it wants.
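Roughly, the transformation looks like this (a sketch of the effect, not the actual gcc pass):

void C_after_optimization(A* a)
{
    a->b();              // the dereference lets the compiler assume a is non-null...
    // assert(a);        // ...so the check is "provably" redundant and is removed
    LaunchTheMissiles(); // reached unconditionally, even for C(0)
}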

That's not to claim that good programmers never make such careless errors.

Are both of the above careless errors? Unless you've internalized the standard, how did you know either of the above invoked undefined behavior?

I do not blame my tools.

If many programmers are making the same error, the programmers may not be the problem.

I like C++ a lot, but it certainly isn't perfect, and some of its tradeoffs have a cost, not just for "competent" and "good" programmers, but for all programmers using the language. I don't want to be blind to those costs, nor "blame the programmer" for hitting the problem areas; I'd rather help shape the language so we can be better in the future.
Conrad W.
CWeisert
Chicago, IL
Post #: 93
For instance, take the following function:

int a[97];

bool inRange(int i)
{ return &a[0] <= &a && &a <= &a[97]; }

A conforming C or C++ compiler is allowed to *always* return true, because for any value of i outside of [0..97] undefined behavior is invoked. How do you test for that if you currently have a compiler which does not perform that optimization?


I was puzzled, and I stewed over that seemingly nonsense example for several minutes before figuring out that the subscript i was being interpreted as an italic tag for display!
Nevin ".
nliber
Libertyville, IL
Post #: 74
Computers... *sigh*
Conrad W.
CWeisert
Chicago, IL
Post #: 94
Computers... *sigh*

Although Meetup's formatting conventions differ from raw HTML, Meetup provides a preview option, so we needn't be surprised when the displayed text differs from what we intended.

bool inRange(int k)
{ return &a[0] <= &a[k] && &a[k] <= &a[97]; }
Conrad W.
CWeisert
Chicago, IL
Post #: 96

But what if C had picked as a default that a variable must be initialized unless you explicitly said otherwise? For instance, if there was an "uninitialized" keyword:

int i = 0; /* okay */
int j; /* compile time error */
int k = uninitialized; /* okay */

Wouldn't that have been a better language choice? (Note: I'm not blaming K&R for this, as I'm not sure we knew any better back in the 70s.)

I recall having almost the same discussion in 1964. An older operating system (FMS) for the IBM 7090 would always clear memory to all zeroes before loading a user's program. We switched to a different operating system that didn't do that. Several programs (mostly Fortran, some assembly language) got incorrect results because they depended on uninitialized memory.

The staff was split between:
(a) those who claimed "It serves them right. They should fix their programs," and
(b) those who wanted us to "fix" the new operating system to preset memory.
The argument was settled by providing a user-specified option.

Surprisingly, not all of the different results were incorrect. I knew of at least one program where the change called attention to a bug that had been causing incorrect (but undetected) results all along.
Nevin ".
nliber
Libertyville, IL
Post #: 76
An older operating system (FMS) for the IBM 7090 would always clear memory to all zeroes before loading a user's program.

I'm making a stronger statement in that the initial value should be specified, not just defaulted. (Of course, we'd have to come up with some solution for templates, but that wasn't a concern in the '70s.)

Presetting the memory (or more generally, a default value) both masks the problem of uninitialized variables and also makes it easier to debug once you discover it, as the behavior becomes more deterministic. A double-edged sword...
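A tiny sketch of that masking effect (hypothetical code, nothing from the FMS days):

int sumFirstN(const int* v, int n)
{
    int sum;        // never initialized
    for (int i = 0; i < n; ++i)
        sum += v[i];
    return sum;     // right answer only on a system that happened to preset memory to zero
}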
Conrad W.
CWeisert
Chicago, IL
Post #: 98
Presetting the memory (or more generally, a default value) both masks the problem of uninitialized variables and also makes it easier to debug once you discover it, as the behavior becomes more deterministic. A double-edged sword...

Good point!