3/14 Which of the following is better?
#define intptr (int*)
typedef int * intptr;
I think the latter is better, but why?
\_ first, does #define intptr (int*) do what you want? afaik,
   (int*) x is not a legal declaration. if you get rid of the
   parentheses, consider how the following declaration differs
   depending on whether intptr is a typedef or a #define:
        intptr x, y;
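   a minimal sketch of the difference, assuming the macro is
   written without the parentheses (the two fragments are
   alternatives, not one file):
        #define intptr int *
        intptr x, y;    /* expands to: int * x, y;        */
                        /* x is int*, but y is plain int! */

        typedef int *intptr;
        intptr x, y;    /* now both x and y are int*      */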
\_ Correct answer moved to the top.
\_ The latter is better because the preprocessor will sometimes
\_ what's wrong with just using int* ?
\_ because it's not hungarian.
\_ because CIVILIZED people don't succumb to those primal pointer
urges, you sick monkey
\_ Use void *, it is used but never used up.
   substitute (int*) for intptr in bad places. For example, if you
   hook up two components that both define intptr with #define,
   the preprocessor will silently choose one, whereas the compiler
   will complain about conflicting typedefs.
\_ if you #define something twice, the preprocessor often will
give you a warning.
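      a sketch of both behaviors (header names invented):
           /* a.h */ typedef int  *intptr;
           /* b.h */ typedef char *intptr;
           /* a file including both: gcc reports something like
              "error: conflicting types for 'intptr'"          */

           /* a.h */ #define intptr int *
           /* b.h */ #define intptr char *
           /* a file including both: most preprocessors warn,
              e.g. "'intptr' redefined", and the later
              definition wins from that point on               */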
\_ #define's okay for smaller projects. However, imagine you have
   constants (#define CONST1 12345) compiled into your objects and
   later on you change the value: objects that weren't recompiled
   keep the stale 12345, and the compiler is certainly not going
   to catch that mistake for you. Preprocessing sucks; save it
   for porting issues.
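   a sketch of one alternative, keeping the value in a single
   object file so a relink picks up changes (file names are
   hypothetical):
        /* limits_local.h */  extern const int max_users;
        /* limits_local.c */  const int max_users = 12345;
   every user now refers to one copy instead of baking the
   literal into each object at compile time.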
\_ preprocessing, when used properly, can be quite useful. Macros
   can sometimes do things that inline functions cannot; see the
   sketch below.
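   a small self-contained example (macro names invented):
        #include <stdio.h>

        /* stringizes its argument with the # operator -- no
           function, inline or otherwise, can see the text of
           its argument */
        #define DUMP_INT(e) printf(#e " = %d\n", (e))

        /* works for any operand type with <, whereas a function
           would pin the argument types down */
        #define MIN(a, b) ((a) < (b) ? (a) : (b))

        int main(void) {
            int x = 3;
            DUMP_INT(x * 7);                /* x * 7 = 21 */
            printf("%g\n", MIN(1.5, 2.5));  /* 1.5        */
            return 0;
        }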
\_ The typedef is preferable for the following reasons:
   (1) This is EXACTLY what typedef is defined to do. Using
       #define is being unconventional. Since there is no
       good reason to buck the convention, you might as well
       follow it to make everyone's life easier.
For example, somebody reading your code who sees intptr will
probably assume it is a typedef (since that is the convention)
and do things that might break if it was a #define.
(2) Debuggers can figure out what intptr means if it is a typedef
but not if it is a define. For example, "p (intptr) x"
       should work properly in gdb with a typedef but not with a
       #define.
(3) Typedef is more likely to work properly in cases you haven't
considered since the compiler has more information about
what it's doing. Consequently, it is more likely that
compilers will generate proper warnings and error messages
       using typedef than with define. Similarly, chaining typedefs
       will work; can you guarantee the same with defines? (see the
       sketch below)
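   a sketch of one such unconsidered case, const qualification
   (type names are invented to keep the two variants apart):
        typedef int *intptr_td;
        #define  intptr_pp int *

        int x = 0;
        const intptr_td p = &x; /* const pointer to int: p itself
                                   can never be reassigned        */
        const intptr_pp q = &x; /* expands to const int * q:
                                   pointer to const int, so q CAN
                                   be reassigned but *q is
                                   read-only                      */
   chaining behaves the same way: typedef intptr_td *ptrptr;
   yields a genuine int ** type, while stacking #defines just
   splices text and re-opens the multiple-declarator trap.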