C Pitfalls
C is a powerful language offering almost absolute control and maximum performance, which necessarily comes with responsibility and the danger of shooting oneself in the foot. Without knowledge of the pitfalls you may well find yourself falling into one of them.
Unless specified otherwise, this article assumes the C99 standard of the C language.
Undefined/Unspecified Behavior
Undefined (completely unpredictable), unspecified (safe but potentially differing) and implementation-defined (consistent within an implementation but potentially differing between implementations) behavior introduces unpredictability and sometimes non-intuitive, tricky behavior of certain operations that may differ between compilers, platforms or runs, because these operations are not exactly described by the language specification. This is mostly done on purpose: it allows implementation freedom, so that the language can be implemented in the way that is most efficient on a given platform. One has to be very careful about not letting such behavior break the program on platforms different from the one the program is developed on. Note that tools such as cppcheck can help find undefined behavior in code. Description of some such behavior follows.
Data type sizes, including int and char, may not be the same on each platform. Even though we almost take it for granted that char is 8 bits wide, in theory it can be wider (even though sizeof(char) is always 1). The int (and unsigned int) type width should reflect the architecture's native integer type, so nowadays it's mostly 32 or 64 bits. To deal with this we can use the standard library limits.h and stdint.h headers.
No specific endianness or even encoding of numbers is specified. Nowadays little endian and two's complement is what you'll encounter on most platforms, but e.g. PowerPC uses big endian ordering.
Order of evaluation of operands and function arguments is not specified. I.e. in an expression or function call it is not defined which operands or arguments will be evaluated first, the order may be completely random and the order may differ even when evaluating the same expression at another time. This is demonstrated by the following code:
#include <stdio.h>

int x = 0;

int a(void)
{
  x += 1;
  return x;
}

int main(void)
{
  printf("%d %d\n",x,a()); // may print 0 1 or 1 1
  return 0;
}
Overflow behavior of signed type operations is undefined. Sometimes we suppose that e.g. addition of two signed integers whose result is past the data type's limit will produce a two's complement overflow (wrap around), but in fact this operation's behavior is undefined: C99 doesn't even say what representation should be used for signed numbers. For portability, predictability and preventing bugs it is safer to use unsigned types (whose overflow is well defined as wrapping around), but this safety may come at the cost of performance, i.e. you prevent the compiler from performing some optimizations based on undefined behavior.
Bit shifts by the type's width or more are undefined, and so are bit shifts by negative values. So e.g. x >> 8 is undefined if the width of the data type of x is 8 bits or fewer.
Char data type signedness is not defined. The signedness can be explicitly "forced" by specifying signed char or unsigned char.
Floating point results are not precisely specified, no representation (such as IEEE 754) is mandated, and small differences in float operations may appear between different machines or e.g. different compiler optimization settings -- this may lead to nondeterminism.
Memory Unsafety
Besides being extra careful about writing memory safe code, one also needs to know that some functions of the standard library are memory unsafe. This regards mainly string functions such as strcpy or strlen which do not check string boundaries (i.e. they rely on being passed a zero terminated string and so can potentially touch memory anywhere beyond it); safer alternatives are available, they have an n added in the name (strncpy, strnlen, ...) and allow specifying a length limit.
Different Behavior Between C And C++ (And Different C Standards)
C is not a subset of C++, i.e. not every C program is a C++ program (for a simple example imagine a C program in which we use the word class as an identifier: it is a valid C program but not a C++ program). Furthermore a C program that is at the same time also a C++ program may behave differently when compiled as C vs C++, i.e. there may be a semantic difference. Of course, all of this may also apply between different standards of C, not just between C and C++.
For portability sake it is good to try to write C code that will also compile as C++ (and behave the same). For this we should know some basic differences in behavior between C and C++.
One difference lies for example in pointers to string literals. While in C it is possible to have non-const pointers such as
char *s = "abc";
C++ requires any such pointer to be const, i.e.:
const char *s = "abc";
TODO: more examples
Compiler Optimizations
C compilers perform automatic optimizations and other transformations of the code, especially when you tell them to optimize aggressively (-O3), which is a standard practice to make programs run faster. However this makes compilers perform a lot of magic and may lead to unexpected, unintuitive and undesired behavior such as bugs or even the "unoptimization" of code. { I've seen code I've written get bigger in size when I set the -Os flag (optimize for smaller size). ~drummyfish }
Aggressive optimization may firstly make tiny bugs in your code manifest in very weird ways: a line somewhere that triggers some tricky undefined behavior may cause your program to crash in a completely different place. Compilers exploit undefined behavior to make all kinds of big brain reasoning, and when they see code that MAY lead to undefined behavior, a lot of chained reasoning can produce very weird compiled results. Remember that undefined behavior, such as overflow when adding signed integers, doesn't mean the result is undefined, it means ANYTHING CAN HAPPEN: the program may just start printing nonsensical stuff on its own or your computer may explode. So it may happen that the line with undefined behavior behaves as you expect but somewhere later on the program just shits itself. For these reasons if you encounter a very weird bug, try disabling optimizations and see if it goes away -- if it does, you may be dealing with this kind of stuff. Also check your program with tools like cppcheck.
Automatic optimizations may also be dangerous when writing multithreaded or very low level code (e.g. a driver) in which the compiler may have wrong assumptions about the code such as that nothing outside your program can change your program's memory. Consider e.g. the following code:
while (x)
  puts("X is set!");
Normally the compiler could optimize this to:
if (x)
  while (1)
    puts("X is set!");
As in typical code this works the same and is faster. However if the variable x is part of shared memory and can be changed by an outside process during the execution of the loop, this optimization can no longer be done as it results in different behavior. This can be prevented with the volatile keyword which tells the compiler to not perform such optimizations.
Of course this applies to other languages as well, but C is especially known for having a lot of undefined behavior, so be careful.
Other
Watch out for operator precedence! Bracket expressions if unsure, or just to increase readability for others.
Also watch out for this one: != is not =! :) I.e. if (x != 4) and if (x =! 4) are two different things: the first means "not equal" and is usually what you want, the latter is two operations, = (assignment) and ! (logical not), i.e. it assigns !4 (which is 0) to x. The tricky thing is that it also compiles and may work as expected in some cases but fail in others, leading to a very nasty bug.