TL;DR: define is letrec. This is what enables us to write recursive definitions in the first place.
Consider
let fact = fun (n => (n==0 -> 1 ; n * fact (n-1)))
To what entity does the name fact inside the body of this definition refer? With let foo = val, val is defined in terms of already known entities, so it cannot refer to foo, which is not yet defined. In terms of scope this is usually expressed by saying that the RHS of the let equation is evaluated in the outer scope.
The only way for the inner fact to actually point at the one being defined is to use letrec, where the entity being defined is allowed to refer to the scope in which it is being defined. So while forcing the evaluation of an entity while its definition is still in progress is an error, storing a reference to its (at this point in time, future) value is fine -- in the case of letrec, that is.
The define you refer to is just letrec under another name. In Scheme as well.
Without the ability of an entity being defined to refer to itself, i.e. in languages with only a non-recursive let, to have recursion one has to resort to arcane devices such as the Y combinator, which is cumbersome and usually inefficient. Another way is definitions like
let fact = (fun (f => f f)) (fun (r => n => (n==0 -> 1 ; n * r r (n-1))))
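That self-application trick can be transcribed directly into Python (a sketch; the surrounding names are mine):

```python
# Recursion without the name `fact` being visible inside its own body:
# the helper receives itself as `r`, and `r(r)` reconstructs the
# recursive call at each step.
fact = (lambda f: f(f))(
    lambda r: lambda n: 1 if n == 0 else n * r(r)(n - 1)
)

print(fact(5))  # 120
```

Note that no binding is referred to before it is fully defined, which is exactly why this works under a non-recursive let -- at the cost of threading the function through itself by hand.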
So letrec brings to the table both efficiency of implementation and convenience for the programmer.
The question then becomes: why expose the non-recursive let at all? Haskell indeed does not. Scheme has both letrec and let. One reason might be completeness. Another might be the simpler implementation of let, with fewer self-referential run-time structures in memory, making it easier on the garbage collector.
You ask for a motivational example. Consider defining Fibonacci numbers as a self-referential lazy list:
letrec fibs = {0} + {1} + add fibs (tail fibs)
With non-recursive let, another copy of the list fibs would have to be defined to serve as the input to the element-wise addition function add. That copy would in turn need yet another copy of fibs defined in its terms, and so on: accessing the nth Fibonacci number would cause a chain of n-1 lists to be created and maintained at run time! Not a pretty picture.
And that is assuming the same fibs was used for tail fibs as well. If not, all bets are off.
What is needed is for fibs to use itself, to refer to itself, so that only one copy of the list is maintained.
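The letrec-style sharing can be sketched in Python with a lazy stream whose tail is a memoized thunk; the class and helper names here are mine, chosen for illustration. The thunk closes over the name fibs, which is fully bound by the time any tail is forced, so every recursive reference hits the one and only copy of the list.

```python
class Stream:
    """A lazy list: a head plus a memoized thunk for the tail."""
    def __init__(self, head, tail_thunk):
        self.head = head
        self._tail_thunk = tail_thunk
        self._tail = None          # memo: forced at most once

    def tail(self):
        if self._tail is None:
            self._tail = self._tail_thunk()
        return self._tail

def add(s, t):
    # element-wise addition of two streams, itself lazy in the tail
    return Stream(s.head + t.head, lambda: add(s.tail(), t.tail()))

# letrec-style self-reference: the thunks refer to `fibs` itself,
# so only one copy of the list is ever built and shared.
fibs = Stream(0, lambda: Stream(1, lambda: add(fibs, fibs.tail())))

def take(s, n):
    out = []
    for _ in range(n):
        out.append(s.head)
        s = s.tail()
    return out

print(take(fibs, 8))  # [0, 1, 1, 2, 3, 5, 8, 13]
```

Because the tail is memoized, forcing fibs.tail() twice yields the same object, not a fresh copy -- which is precisely the sharing that the chain-of-copies scenario above lacks.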