
There are lots of tutorials and questions addressing this. But I want to confirm my understanding in one specific case. The two below should not make a difference to the compiler, i.e. either one is correct. Right?

typedef _GridLayoutInputRepeater<_num-1,Figure,_types...> _base;

and

#define _base _GridLayoutInputRepeater<_num-1,Figure,_types...> 

Similarly, the two below should not make a difference?

#define INT_32 uint32_t

and

typedef uint32_t INT_32;

EDIT: Follow-up thread here

infoclogged
  • You should really be using `using`. – Rakete1111 Jun 23 '17 at 10:05
  • @Rakete1111 : means? can you please give an example? – infoclogged Jun 23 '17 at 10:13
  • 2
    Don't use "underscore uppercase" [ https://stackoverflow.com/questions/228783/what-are-the-rules-about-using-an-underscore-in-a-c-identifier ]. Personally I don't like to see any leading underscores as it makes user code look like system headers, but I'm seeing it more and more as C++ gets pythonized. – Paul Floyd Jun 23 '17 at 10:13
  • 1
    @infoclogged Here's a [question](https://stackoverflow.com/questions/10747810/what-is-the-difference-between-typedef-and-using-in-c11) about it. – Rakete1111 Jun 23 '17 at 10:14
  • @Rakete1111: thanks, `using` is indeed much more readable. Didn't know that it's an alternative to typedef. – infoclogged Jun 23 '17 at 10:35

2 Answers


Without seeing the use-cases, the two situations are both "equal", but what you should note is that #define is a whole different beast than typedef.

A typedef introduces an alias for another type. This alias is seen by the compiler, and thus follows compiler rules such as scoping.

A #define is a preprocessor macro. The preprocessor runs before the actual compiler and literally performs textual replacement; it does not care about scoping or any syntax rules, so it's quite "dumb".

Usually, typedefs are the way to go, as they are much less error-prone. You could use an alias-declaration (using =) as well, but that's personal preference since the two are equivalent:

using _base = _GridLayoutInputRepeater<_num-1,Figure,_types...>;
Hatted Rooster
  • In which uses cases would the above fail i.e one ( typedef ) works but the other ( #define ) does not? – infoclogged Jun 23 '17 at 10:10
  • @Rakete1111 very interesting, but why does `#define bar int` make `bool bar = true;` fail, while `typedef int bar;` followed by `bool bar = true;` is successful! – infoclogged Jun 23 '17 at 10:20
  • 1
    @infoclogged Because for the typedef, the compiler sees that `bar` already has a type attached. For the #define, the compiler doesn't see it because it's already been replaced by the preprocessor, it only sees `bool int = true;`. – Hatted Rooster Jun 23 '17 at 10:21
  • 1
    @infoclogged `foo` is declared to be a type and is not considered in `bool foo=true`. But `bar` is #defined to be `int`, i.e. the compiler sees `bool int=true` which makes no sense. – Walter Jun 23 '17 at 10:22
  • @Walter what do you mean by "not considered in `bool foo=true`"? Is the replacement of `foo` in `bool foo = true` ignored by the compiler because it already sees a type `bool`? – infoclogged Jun 23 '17 at 10:33
  • @infoclogged I don't know the exact rules, but it seems that `typedef` only introduces an alias, and not a type, and so the alias type is allowed to be shadowed by another name. – Rakete1111 Jun 23 '17 at 10:37
  • added a new thread : https://stackoverflow.com/questions/44719583/follow-up-on-understanding-of-defines-and-typedef – infoclogged Jun 23 '17 at 10:50
  • One important disadvantage of using the brutal text substitution of a `#define` for types is that you are missing the "regularization" that `typedef` does to types made of more than one token. Once you `typedef` a complex declaration such as `unsigned int`, `int (*)[4]`, `void (*)(int, double)` and similar, you can use functional cast notation, add qualifiers (`const`, `volatile`), create pointers and references, ... with the usual syntax; the `#define` doesn't act as a "syntax firewall", and thus such operations are either impossible or require complex syntax. – Matteo Italia Jun 23 '17 at 14:08

The problem with using #define rather than typedef or using is that, as has been pointed out, #define is a macro. Macros are evaluated and expanded by the preprocessor, so the compiler knows nothing about the data type you're trying to create: the #define directive is simply substituted with whatever comes after it.

The reason for using macros in languages such as C and C++ is to allow for things that aren't specifically to do with source code logic but are to do with source code structure.

The #include directive, for instance, quite literally includes the entire content of a file in place of the directive.

So, if myfile.h contains:

void func_1(int t);
void func_2(int t);

then

#include "myfile.h" 

would expand the content of myfile.h, replacing the #include preprocessor directive with

void func_1(int t);
void func_2(int t); 

The compiler then comes along and compiles the expanded file, complete with class definitions and other expanded macros.

This is why the directive

#pragma once 

or

#ifndef MYFILE_INCLUDE_H
#define MYFILE_INCLUDE_H 

is used at the start of header files (with a matching #endif at the end of the file) to prevent multiple definitions occurring. Note that identifiers containing a double underscore, such as __MYFILE_INCLUDE__, are reserved for the implementation, so a name like MYFILE_INCLUDE_H is safer.

When you use an expression like #define INT64 unsigned int, the preprocessor does exactly the same thing: it replaces every subsequent occurrence of INT64 with unsigned int.

When you use a typedef, on the other hand, the compiler itself performs the type substitution, which means it can warn you about incorrect use of your newly created type.

#define would simply have the compiler warn you about incorrect use of unsigned int, which can become confusing if you have a lot of type substitutions!

Toby Speight