Re: Local pattern declarations in Curry

From: Michael Hanus <hanus_at_informatik.rwth-aachen.de>
Date: Wed, 11 Nov 1998 17:16:32 +0100

Herbert Kuchen wrote:
> Of course, simple lambda lifting is no longer possible.
>
> One solution would be to preserve the nesting and compile
> the program directly to some abstract machine code.
> This is easy for us, since we are implementing Curry from
> scratch, but difficult if you are aiming for a
> translation to Prolog. (Moreover, the direct translation
> is more efficient!)

The point is that I am not thinking in terms of the implementation;
I want to define the meaning of local declarations
(without resorting to an implementation-dependent definition).
Of course, one could define a calculus for functional logic
programs with local declarations and then develop a complete
theory for it (as has been done in some approaches to modularity
in logic programming), but this takes a long time. Thus, I think
a better solution is to map local declarations into global ones
and reuse the existing results about such functional logic programs.
Therefore, we have to define a transformation scheme
(which need not be used in a real implementation like yours).
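
To make this concrete, here is a minimal sketch of such a mapping for a
simple, deterministic where-binding (the names double, double', and
doubleY are made up for illustration only and are not part of this
discussion):

-- a program with a local declaration:
double :: Int -> Int
double x = y + y  where y = x * 2

-- after mapping the local declaration to a global one: the locally
-- bound y becomes a global function doubleY whose argument is the
-- free variable x of its right-hand side
double' :: Int -> Int
double' x = doubleY x + doubleY x

doubleY :: Int -> Int
doubleY x = x * 2

For a deterministic right-hand side such as x * 2, both versions compute
the same results; the interesting cases are exactly the nondeterministic
and recursive local bindings discussed below.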

> Another solution would be to use a more sophisticated
> translation scheme. For non-recursive nullary functions,
> one could just use the scheme proposed for the treatment
> of patterns.
>
> In the following, I'll try to sketch a scheme for the recursive
> case. Let's first make the example a bit more complicated,
> since xs in the above example is deterministic:

I would be happier if you provided a set of general transformation
rules instead of explaining the transformation of individual examples.
For instance, how do you transform the following example:

coin = 0
coin = 1
f x = g xs where xs = (x,coin):xs
                 xs = []
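
To make the question concrete, one obvious (naive) lifting would turn the
local xs into a global operation taking x as an argument. This is only
meant to illustrate the problem, not the transformation you have in mind;
the name fXs and the placeholder definition of g (left unspecified above)
are made up for this sketch:

coin :: Int
coin = 0
coin = 1

-- placeholder for the unspecified g, just to make the sketch complete
g :: [(Int,Int)] -> Int
g = length

-- naive lifting: the two local rules for xs become two global rules
fXs :: Int -> [(Int,Int)]
fXs x = (x,coin) : fXs x
fXs x = []

f :: Int -> Int
f x = g (fXs x)

Under this reading, every occurrence and every recursive unfolding of
fXs x may choose its rule and its coin independently, whereas one might
equally well expect the xs of the where clause to denote one shared list;
this is exactly the kind of question the general rules should settle.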

If you define the meaning of local pattern declarations
with a few general rules covering all of these cases, I would
be more convinced by your proposal.

Let me mention a final point. A weakness of your proposal
is that syntactically identical global and local declarations
might have a different meaning. For instance, consider the
following definitions:

coin = 0
coin = 1
f = coin

Since f is a global function, the expression "f+f" reduces to 0|1|2.
Now consider the expression "let f=coin in f+f". Since the local
symbol f is a variable according to your proposal, this expression
reduces to 0|2.
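
Spelled out as a small self-contained program (the names globalSum and
localSum are added only for this sketch):

coin :: Int
coin = 0
coin = 1

f :: Int
f = coin

-- f is a global (0-ary) function, so each occurrence is evaluated
-- independently: globalSum has the values 0, 1, and 2
globalSum :: Int
globalSum = f + f

-- under the proposal, a let-bound f is a variable and hence shared,
-- so localSum has only the values 0 and 2
localSum :: Int
localSum = let f = coin in f + f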

Therefore, I think it is essential to explain the meaning of
local declarations as syntactic sugar for something else
(using some simple transformations); otherwise, they might be
too difficult to understand.

Best regards,

Michael
Received on Wed Nov 11 1998 - 17:20:00 CET