Re: Strictly typed lazy evaluated postfix languages with a lambda.

Message 1 of 12 , Sep 3, 2001
--- In concatenative@y..., John Carter <john.carter@t...> wrote:
> One of Manfred's posts got me musing on whether one could create a
> strictly typed postfix language with a lambda abstraction.

(I assume you mean concatenative (?))

> The prime problem is a lambda expression crunches n items (its
> arguments) off the stack and pushes exactly one item onto the
> stack (its return value).
>
> Most stack based languages are not so constrained. You cannot
> inspect the code and say what it will do to the stack without
> knowing what is currently _on_ the stack!

How about strongForth ? IIRC every word has to have a statically
determinable stack effect (in terms of which types the top elements
have before and after a word has been executed).
See e.g. http://home.t-online.de/home/s.becher/forth/intro.htm
about the replacement (sort of) of ?dup with ?if .

> It is trivial to write a function that would put 20 things on the
> stack if the thing on top of the stack is true else take 40 things
> off the stack.
>
> If one constrained the language so that every operator chewed off /
> placed on a predetermined number of items off the stack.
>
> ie. Constrain the language so that, you could look at any and every
> operator and tell a priori how many items it would chew off or
> place on the stack.

I.e. a sort of type system.

> Then for any "word" or function comprising of several operators one
> could determine apriori exactly how many items that function would
> chew off or place on the stack.
>
> The prime problem case would be an "if" operator. The "then" part
> may chew off more or less than the "else" part. However, if one
> constrained the "then" and the "else" parts to do exactly the same,
> then the problem is resolved.

Yes, same "problem" with if in typed functional languages, same
solution.

> Then one can talk about a function or operator being a n->m
> operator. It chews n items off the stack and pushes m items onto
> the stack.
>
> So if you have a n->m operator concatenated with a j->k operator
> you have a (n+j)->(m+k) operator.
>
> Strict typing.
> =============
>
> Now having got that far, you can say, for example a + operator is
> a 2->1 operator taking (int,int)->(int).
>
> Now lets extend our notation saying instead of merely chew n leave
> m n->m, we actually list the types.
>
> (T1,T2,...Tn)->(U1,U2,...Um) where Ti,Uj are types.

Now it looks even more like a type system.
(I.e. no dynamic typing/one-universal-type)

> So now we can statically analyze for type safety of a "word"
> without even running the program.

Exactly.

> For example
> "hello world" 1 +
> is clearly "type" wrong.
> "hello world" is ->(string)
> 1 is ->(int)
>
> So we know the stack is
> something string int
>
> + is (int,int)->int
>
> Type clash.

Yes.
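As a toy sketch of such a static check (entirely mine, not part of the
original proposal; the names Ty, Op, litOp, plusOp are invented, and
this is far cruder than a real type system), one can fold the
(T1,...,Tn)->(U1,...,Um) signatures over a typed stack and detect the
clash without running anything:

```haskell
-- a type for each stack element, and a signature per operator
data Ty = TInt | TString deriving (Eq, Show)

-- inputs the operator pops (top of stack first), outputs it pushes
data Op = Op { opInputs :: [Ty], opOutputs :: [Ty] }

-- walk the program over a typed stack; Nothing signals a type clash
-- (a too-short stack also fails the comparison, so underflow is caught)
check :: [Op] -> [Ty] -> Maybe [Ty]
check []       st = Just st
check (op:ops) st
  | take n st == opInputs op = check ops (opOutputs op ++ drop n st)
  | otherwise                = Nothing
  where n = length (opInputs op)

litOp :: Ty -> Op
litOp t = Op [] [t]               -- a literal is ->(T)

plusOp :: Op
plusOp = Op [TInt, TInt] [TInt]   -- + is (int,int)->(int)

-- "hello world" 1 +  is rejected:
--   check [litOp TString, litOp TInt, plusOp] []  gives  Nothing
-- 1 2 +  is accepted:
--   check [litOp TInt, litOp TInt, plusOp] []     gives  Just [TInt]
```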

> Now one can recover objects and polymorphic types as well.
>
> I suspect the author of a function need never declare the type
> "signature" of the function, as it should be trivial to
> automagically deduce the signature!

Yes, it's called type inference.

> Lazy evaluation
> ===============
> This bit is a tad more speculative...
>
> If one redefined the meaning of quoting slightly....
>
> A quoted code fragment would mean "bundle me up with my context,
> but don't evaluate me unless you have to".
>
> For example, a quoted n->m fragment would attempt to find n items
> on the stack and include it in the bundle. eg...

Do you mean like building a closure ?

> Think of
> f( a(x), b(y, z), u)
>
> That is
> u z y b x a f
>
> f is 3->1
> b is 2->1
> a is 1->1
> x,y,z,u are ->1
>
> So we can statically analyze this fragment bundling things into
> contexts...
> f is 3->1
> a is 1->1 so can create bundle
> [x a]
> fragment becomes
> u z y b [x a] f
>
> b is 2->1 so can bundle...
> [z y b]
> fragment becomes...
> u [z y b] [x a] f
>
> Now can bundle f
> [u [z y b] [x a] f]

You mean that because a takes 1 input and b takes 2 inputs, and
because we can locate these inputs syntactically, we can bundle them
together and say that [x a] gives one output because a gives one
output, and [z y b] gives one output because b gives one output ?

What about if b : 2->2 and f : 4->1 ?
Would this still be valid (I suppose so) ?
[u [z y b] [x a] f]

Could you write [y b] now and that would somehow remove the top
element of the stack and include it in the closure (= bundle ?)
to maybe (if required) evaluate later ?

> The point of the exercise being that we only need evaluate the
> bundles if and only if we are forced to by the flow of execution.
> ie. We have Functional Language style lazy evaluation again.

Very interesting.

> And the lambda?
> ===============
>
> Everything is a lambda.
>
> Type in anything, the interpretor would lazily refuse to evaluate
> it as long as it could. Wherever possible it would automagically
> rewrite your code into lambda'd bundles.

Like in other lazy languages like Haskell, Miranda, Clean.

How do you do I/O and side-effects, then ?
Monads, uniqueness typing or something else ?

(If a language is lazy, then we generally don't know exactly when, if,
and how many times a bundle passed to a function/word is going to be
evaluated
(Well it's only going to be evaluated at most once
(that's what lazy means), but if the bundle is in turn a word,
then that word may perform side effects when it gets executed with
parameters on the stack by the controlling (higher-order) word
)
)
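The parenthetical about "at most once" is just call-by-need; a tiny
haskell illustration (mine, not from the thread; Debug.Trace is used
only to make visible when the thunk fires):

```haskell
import Debug.Trace (trace)

-- under call-by-need, the traced expression is forced at most once,
-- no matter how many times `bundle` is mentioned: "evaluating bundle"
-- appears once on stderr, not twice
bundleDemo :: Int
bundleDemo = bundle + bundle
  where
    bundle = trace "evaluating bundle" (2 + 2)
```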

(long somewhat original-post-off-topic :)
I've recently found an interesting embedding for a typed forth-like
language in Haskell (which happens to be a lazy language, but I think
that one could very easily rewrite it in a strict (functional)
language, like SML,OCaml ..).

Normally (I think ?) forth-embeddings or interpreters in functional
languages use a stack ADT or linked list for the forth stack and
a forth word is a stack transformer.

The problem with this in a statically typed language is that, normally
the linked-list (or stack ADT) type is parametrized over the element
type (This is of course an improvement over (as in some statically
typed languages, e.g. Java) having to define one stack type for every
element type or make a "generic" stack type with elements of type
void *, Object or some similar universal-type), so that we can have a
list of integers or a list of doubles etc.. but no mixed types in
lists. Mixed-type lists are (in at least some problem domains) not so
often needed anyway.

But in this case we want to model forth programs which store objects
of different type on the stack.

Naïve/toy implementations (e.g. just for educational or
proof-of-concept purposes) can of course just provide a list of
integers and be content with that.

One other solution is to make a disjoint union type for all types that
we are going to need/use and make a list/stack of this universal type.
But then it begins to look more like an interpreter (which can be fun)
than an embedding (which I wanted here).

So if we are going to model/type the forth stack, then we are going to
need a way to incorporate the type of the top element and the second
top element and so on (As we saw, just one type for all elements isn't
enough) (Hopefully not all the way down, though).

One way (in a haskell framework) seems to be to make a type
constructor that takes the type of the top-element and the type of the
rest of the stack and constructs a type for this whole stack.

Like :
plus :: Stack Int (Stack Int st) -> Stack Int st

(program fragment :: type of program fragment)

st represents the type of the unaltered stack below the "arguments"
and "results" of the word.
This is of course an awkward syntax, but it's an embedding.
We can (in haskell) use the (built-in syntax for) pair constructors :
plus :: (Int,(Int,st)) -> (Int,st)
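A quick sketch of this pair encoding (the function names plusPair and
dupPair are mine, not from the post; the post just writes plus):

```haskell
-- the stack type spells out the top elements; st is the untouched
-- remainder of the stack below the word's arguments and results
plusPair :: (Int, (Int, st)) -> (Int, st)
plusPair (n, (m, st)) = (n + m, st)

dupPair :: (a, st) -> (a, (a, st))
dupPair (x, st) = (x, (x, st))

-- 3 4 + on a stack whose remainder is the empty tuple:
--   plusPair (3, (4, ()))  gives  (7, ())
```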

My approach, instead, uses currying (i.e. the ability of a function
which hasn't been supplied with all of its arguments to just return
a new function (closure) that waits for the missing arguments before
computing the result) in haskell as the implementation/embedding method.
(I.e. I'm not introducing currying in forth (not as yet anyway).)

In haskell there's a function called flip :
flip :: (b -> a -> c) -> a -> b -> c
or, equivalently
flip :: (b -> a -> c) -> (a -> b -> c)
which takes a function as first argument (say f) and two extra
arguments (say y and x) and applies f to x and y. Alternatively (and
equivalently) it takes a function f and gives a new function that does
the same work as f, but which takes its two arguments in the reversed
(or flipped or swapped) order.

This is very much like swap in forth !
I.e. flip takes a function that represents what to do afterwards (i.e.
continuation-passing style (almost)) and eats two arguments
and continues with f such that f thinks that the two arguments were
given to it in the reverse order.

Similarly, one can define a function dup that takes a
continuation-function and one argument and then continues with the
function after giving it the argument duplicated as two arguments.

Ok, implementation :

swap :: (b -> a -> stack) -> (a -> b -> stack)
swap c x y = c y x

dup :: (a -> a -> stack) -> (a -> stack)
dup c x = c x x

drop :: (stack) -> (a -> stack)
drop c x = c
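To see how these CPS words behave, feed them a continuation that just
reports what it was given (pairUp is an illustrative name of mine,
not from the post):

```haskell
import Prelude hiding (drop)   -- our drop shadows the Prelude one

swap :: (b -> a -> s) -> (a -> b -> s)
swap c x y = c y x

dup :: (a -> a -> s) -> (a -> s)
dup c x = c x x

drop :: s -> (a -> s)
drop c x = c

-- a continuation that just records the arguments it receives
pairUp :: Int -> Int -> (Int, Int)
pairUp x y = (x, y)

-- swap pairUp 1 2  ==  pairUp 2 1  ==  (2,1)
-- dup  pairUp 7    ==  pairUp 7 7  ==  (7,7)
```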

Now, in concatenative languages, juxtaposition of two program
fragments means (stack-transform-) function composition, while in many
applicative languages it means (unsurprisingly) function application.

So for an embedding we need an explicit function composition, like
the predefined haskell one :

(.) :: (stack1 -> stack0) -> (stack2 -> stack1) -> (stack2 -> stack0)
(f . g) c = f (g c)

I.e. g's continuation is c (which is the continuation of the whole
composition) and f's continuation is (g c).

This can be considered as syntax as opposed to a word.

Now we can define primitive (from the forth POV) words like
swap, dup, drop, rot, plus, times, ...

But the types of these words are still a little confusing because
they seem to be backwards; just look at dup again :
dup :: (a -> a -> stack) -> (a -> stack)
so I introduce a type synonym Word, such that
Word stack0 stack1
is equal to
stack1 -> stack0

type Word s0 s1 = s1 -> s0

Now dup has type :
dup :: Word (a -> s) (a -> a -> s)

(.) :: Word stack0 stack1 -> Word stack1 stack2 -> Word stack0 stack2

drop :: Word (a -> s) s

plus :: Word (Int -> Int -> s) (Int -> s)
plus c n m = c (n+m)

minus :: Word (Int -> Int -> s) (Int -> s)
minus c n m = c (n-m)

times :: Word (Int -> Int -> s) (Int -> s)
times c n m = c (n*m)

equal :: Word (Int -> Int -> s) (Bool -> s)
equal c n m = c (n==m)

int2float :: Word (Int -> s) (Float -> s)
int2float c i = c (fromInt i)

squareroot :: Word (Float -> s) (Float -> s)
squareroot c f = c (sqrt f)

donothing :: Word s s
donothing c = c

ifThenElse is not a word, but syntax, same for lit.

ifThenElse :: Word s0 s1 -> Word s0 s1 -> Word (Bool -> s0) s1
ifThenElse t e c True = t c
ifThenElse t e c False = e c

lit :: a -> Word s (a -> s)
lit x c = c x

lit is needed because literals in haskell don't denote a stack
transformer, but the value instead.

Example derived words :

square :: Word (Int -> s) (Int -> s)
square = dup
. times

pyth :: Word (Int -> Int -> s) (Float -> s)
pyth = square
. swap
. square
. plus
. int2float
. squareroot

fact :: Word (Int -> s) (Int -> s)
fact = dup
. lit 0
. equal
. ifThenElse
( drop
. lit 1
)
( dup
. lit 1
. swap
. minus
. fact
. times
)

Oh, I almost forgot :

eval1answer :: Word a (a -> a) -> a
eval1answer w = w (\o -> o)

eval1answer (lit 3 . lit 4 . pyth)
=>
5.0
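Pulling the fragments above into one file, here is a version that
should compile under a modern GHC; the only changes (my assumptions,
not in the post) are hiding Prelude's (.), drop and Word to avoid name
clashes, and replacing fromInt (an old Hugs/GHC name) with the
standard fromIntegral:

```haskell
import Prelude hiding ((.), drop, Word)

type Word s0 s1 = s1 -> s0

(.) :: Word s0 s1 -> Word s1 s2 -> Word s0 s2
(f . g) c = f (g c)

swap :: Word (a -> b -> s) (b -> a -> s)
swap c x y = c y x

dup :: Word (a -> s) (a -> a -> s)
dup c x = c x x

drop :: Word (a -> s) s
drop c x = c

plus, minus, times :: Word (Int -> Int -> s) (Int -> s)
plus  c n m = c (n + m)
minus c n m = c (n - m)
times c n m = c (n * m)

equal :: Word (Int -> Int -> s) (Bool -> s)
equal c n m = c (n == m)

int2float :: Word (Int -> s) (Float -> s)
int2float c i = c (fromIntegral i)

squareroot :: Word (Float -> s) (Float -> s)
squareroot c f = c (sqrt f)

ifThenElse :: Word s0 s1 -> Word s0 s1 -> Word (Bool -> s0) s1
ifThenElse t e c True  = t c
ifThenElse t e c False = e c

lit :: a -> Word s (a -> s)
lit x c = c x

square :: Word (Int -> s) (Int -> s)
square = dup . times

pyth :: Word (Int -> Int -> s) (Float -> s)
pyth = square . swap . square . plus . int2float . squareroot

fact :: Word (Int -> s) (Int -> s)
fact = dup . lit 0 . equal
     . ifThenElse (drop . lit 1)
                  (dup . lit 1 . swap . minus . fact . times)

eval1answer :: Word a (a -> a) -> a
eval1answer w = w (\o -> o)
```

If you delete the signatures of the derived words, GHC should still
infer them, which is the type inference mentioned earlier in the
thread.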

By the way, one can easily add first-class labels/continuation to this
and add a goto primitive if one wants to. This is possible because of
the use of closures.
(A real (efficient ?) compiler (and/or interpreter) might not so easily
provide continuations, though)

What do you think ?

> John Carter Phone : (64)(3) 358 6639
> Tait Electronics Fax : (64)(3) 359 4632
> PO Box 1645 Christchurch Email : john.carter@t...
> New Zealand
>
> I'm becoming less and less convinced of humans as rational beings.
> I suspect we are merely meme collectors, and the reason meme is only
> kept on to help count our change.

--
Stefan Lj

Message 1 of 12 , Sep 3, 2001
1. I have added a library for writing interactive tutorials to
the Joy page. It contains some operators for interactive
programs which use "get" and need some clown-proofing.
Also included are operators specifically for tutorials
that have a specific form (the content is in 2.).
2. As an example tutorial, there is one that teaches -- you
guessed it, Joy. Don't take it too seriously as a piece
of pedagogy though.
3. Also included is a (pseudo) input file from a clownish, partly
malicious and not too bright "student".
4. Finally, there is the output from that interaction.

I shall respond to the two very thoughtful postings from Stefan Lj
as soon as I get some time.

- Manfred

Message 1 of 12 , Sep 16, 2001
I'm sorry it took so long to look at your contribution.
It deserves more comments than I have written, but this is all
I can do at the moment.
- Manfred

On Mon, 3 Sep 2001 md9slj@... wrote:

> --- In concatenative@y..., John Carter <john.carter@t...> wrote:
> > One of Manfred's posts got me musing on whether one could create a
> > strictly typed postfix language with a lambda abstraction.
>
> (I assume you mean concatenative (?))
>
> > The prime problem is a lambda expression crunches n items (its
> > arguments) off the stack and pushes exactly one item onto the
> > stack (its return value).
> >
> > Most stack based languages are not so constrained. You cannot
> > inspect the code and say what it will do to the stack without
> > knowing what is currently _on_ the stack!
>
> IIRC every word has to have a statically determinable stack effect
> (in terms of which types the top elements have before and after a word
> has been executed)
> See e.g. http://home.t-online.de/home/s.becher/forth/intro.htm
> about replacement (sort of) of ?dup with ?if .

Yes, something like this could be useful for a Joy-like language.

> > It is trivial to write a function that would put 20 things on the
> > stack if the thing on top of the stack is true else take 40
> > things off the stack.
> >
> > If one constrained the language so that every operator chewed off /
> > placed on a predetermined number of items off the stack.
> >
> > ie. Constrain the language so that, you could look at any and
> > every operator and tell a priori how many items it would chew off
> > or place on the stack.
> I.e. a sort of type system.

I would call it an "arity system" - it merely checks the
number of parameters expected and the number of new values returned.
>
> > Then for any "word" or function comprising of several operators
> > one could determine apriori exactly how many items that function
> > would chew off or place on the stack.
> >
> > The prime problem case would be an "if" operator. The "then" part
> > may chew off more or less than the "else" part. However, if one
> > constrained the "then" and the "else" parts to do exactly the
> > same, then the problem is resolved.
>
> Yes, same "problem" with if in typed functional langages, same
> solution.

In lambda calculus languages this problem(?) does not arise,
as far as I can see.

> > Then one can talk about a function or operator being a n->m
> > operator. It chews n items off the stack and pushes m items onto
> > the stack.
> >
> > So if you have a n->m operator concatenated with a j->k operator
> > you have a (n+j)->(m+k) operator.

Exactly right.

> > Strict typing.
> > =============
> >
> > Now having got that far, you can say, for example a + operator is
> > a 2->1 operator taking (int,int)->(int).
> >
> > Now lets extend our notation saying instead of merely chew n leave
> > m n->m, we actually list the types.
> >
> > (T1,T2,...Tn)->(U1,U2,...Um) where Ti,Uj are types.
>
> Now it looks even more like a type system.
> (I.e. no dynamic typing/one-universal-type)
>
> > So now we can statically analyze for type safety of a "word"
> > without even running the program.
>
> Exactly.
>
> > For example
> > "hello world" 1 +
> > is clearly "type" wrong.
> > "hello word" is ->(string)
> > 1 is ->(int)
> >
> > So we know the stack is
> > something string int
> >
> > + is (int,int)->int
> >
> > Type clash.
>
> Yes.
>
> > Now one can recover objects and polymorphic types as well.
> >
> > I suspect the author of a function need never declare the type
> > "signature" of the function, as it should be trivial to
> > automagically deduce the signature!
>
> Yes, it's called type inference.

Yes, but I don't know to what extent the existing technology
would apply to Joy - if at all.

> > Lazy evaluation
> > ===============
> > This bit is a tad more speculative...
> >
> > If one redefined the meaning of quoting slightly....
> > A quoted code fragment would mean "bundle me up with my context,
> > but don't evaluate me unless you have to".
> >
> > For example, a quoted n->m fragment would attempt to find n items
> > on the stack and include it in the bundle. eg...
>
> Do you mean like building a closure ?
>
> > Think of
> > f( a(x), b(y, z), u)
> >
> > That is
> > u z y b x a f
> >
> > f is 3->1
> > b is 2->1
> > a is 1->1
> > x,y,z,u are ->1
> >
> > So we can statically analyze this fragment bundling things into
> > contexts...
> > f is 3->1
> > a is 1->1 so can create bundle
> > [x a]
> > fragment becomes
> > u z y b [x a] f
> >
> > b is 2->1 so can bundle...
> > [z y b]
> > fragment becomes...
> > u [z y b] [x a] f
> >
> > Now can bundle f
> > [u [z y b] [x a] f]
>
> You mean that because a craves 1 input respectively b craves 2 inputs
> and because we can locate these inputs syntactically we can bundle
> these together and say that [x a] gives one output because a gives one
> output respectively [z y b] gives one output because b gives one
> output ?
>
> What about if b : 2->2 and f : 4->1 ?
> Would this still be valid (I suppose so) ?
> [u [z y b] [x a] f]
>
> Could you write [y b] now and that would somehow remove the top
> element of the stack and include it in the closure (= bundle ?)
> to maybe (if required) evaluate later ?
>
> > The point of the exercise being that we only need evaluate the
> > bundles if and only if we are forced to by the flow of execution.
> > ie. We have Functional Language style lazy evaluation again.
>
> Very interesting.
>
> > And the lambda?
> > ===============
> >
> > Everything is a lambda.
> >
> > Type in anything, the interpretor would lazily refuse to evaluate
> > it as long as it could. Wherever possible it would automagically
> > rewrite your code into lambda'd bundles.
>
> Like in other lazy languages like Haskell, Miranda, Clean.

Except that the whole point of Joy at least is NOT to have named
formal parameters.

> How to do I/O and side-effects, then ?
> Monads, uniqueness typing or something else ?
>
> (If a language is lazy, then we generally don't know exactly when, if,
> and how many times a bundle passed to a function/word is going to be
> evaluated
> (Well it's only going to be evaluated at most once
> (that's what lazy means), but if the bundle is in turn a word,
> then that word may perform side effects when it gets executed with
> parameters on the stack by the controlling (higher-order) word

I have recently looked at streams in Scheme again,
Abelson and Sussman, Structure and Interpretation of Computer Programs.
I think one might be able to do something resembling lazy
evaluation in Joy. I say "resembling", because there is a huge
difference.

> (long somewhat original-post-off-topic :)
> I've recently found an interesting embedding for a typed forth-like
> language in Haskell (which happens to be a lazy language, but I think
> that one could very easily rewrite it in a strict (functional)
> language, like SML,OCaml ..).
>
> Normally (I think ?) forth-embeddings or interpreters in functional
> languages use a stack ADT or linked list for the forth stack and
> a forth word is a stack transformer.
>
> The problem with this in a statically typed language is that, normally
> the linked-list (or stack ADT) type is parametriced over the element
> type (This is of course an improvement over (as in some statically
> typed languagese.g. Java) having to define one stack type for every
> element type or make a "generic" stack type with elements of type
> void *,Object or some similar universal-type), so that we can have a
> list of integers or a list of doubles etc.. but no mixed types in
> lists. This is in (at least some problem domains) not so often needed.
>
> But in this case we want to model forth programs which store objects
> of different type on the stack.
>
> Na"ive/Toy implementations (e.g. just for educational or
> proof-of-concept) can of course just provide a list of integers and be
> content which that.
>
> One other solution is to make a disjoint union type for all types that
> we are going to need/use and make a list/stack of this universal type.
> But then it begins to look more like a interpreter (which can be fun)
> than an embedding (which I wanted here).
>
> So if we are going to model/type forth stack, then we are going to
> need a way to incorporate the type of the top element and the second
> top element and so on (As we saw, just one type for all elements isn't
> enough) (Hopefully not all the way down, though).

I fear it will have to be all the way down!

> One way (in a haskell framework) seems to be to make a type
> constructor that takes the type of the top-element and the type of the
> rest of the stack and constructs a type for this whole stack.
>
> Like :
> plus :: Stack Int (Stack Int st) -> Stack Int st
>
> (program fragment :: type of program fragment)
>
> st represents the type of the unaltered stack below the "arguments"
> and "results" of the word.
> This is of course an awkward syntax, but it's an embedding.
> We can (in haskell) use the (built-in syntax for) pair constructors :
> plus :: (Int,(Int,st)) -> (Int,st)
>
> My approach, instead, uses currying (i.e. the ability of a function
> which hasn't been supplied with all of it's arguments to just return
> a new function (closure) that waits for the missing arguments before
> computing the result) in haskell as implementation/embedding method.
> (I.e. i'm not introducing currying in forth (not as yet anyway).)
>
> In haskell there's a function called flip :
> flip :: (b -> a -> c) -> a -> b -> c
> or, equivalently
> flip :: (b -> a -> c) -> (a -> b -> c)
> which takes a function as first argument (say f) and two extra
> argument (say y and x) and applies f to x and y. Alternatively (and
> equivalently) it takes a function f and gives a new function that does
> the same work as f, but which takes it's two arguments in the reversed
> (or flipped or swapped) order.
>
> This is very much like swap in forth !
> I.e. flip takes a function that represent what to do after (i.e.
> continuation-passing-style (almost)) and eats two arguments
> and continues with f such that f think that the two arguments were
> given to it in the reverse order.
>
> Similary, one can define a function dup that takes a
> continuation-function and one argument and then continues with the
> function after giving it the argument duplicated as two arguments.
>
> Ok, implementation :
>
> swap :: (b -> a -> stack) -> (a -> b -> stack)
> swap c x y = c y x
>
> dup :: (a -> a -> stack) -> (a -> stack)
> dup c x = c x x
>
> drop :: (stack) -> (a -> stack)
> drop c x = c
>
> Now, in concatenative languages, juxtaposition of two program
> fragments mean (stack-transform-) function composition, while in many
> applicative languages it mean (unsurprisingly) function application.
>
> So for an embedding we need an explicit function composition, like
> the predefined haskell one :
>
> (.) :: (stack1 -> stack0) -> (stack2 -> stack1) -> (stack2 -> stack0)
> (f . g) c = f (g c)
>
> I.e. g's continuation is c (which is the continuation of the whole
> composition) and f's continuation is (g c).

Why do you consider these to be continuations?? Am I missing something?

> This can be considered as syntax as opposed to a word.
>
> Now we can define up primitive (from the forth POV) words like
> swap,dup,drop,rot,plus,times, ...
>
> But the types of these words is still a little confusing because
> it seems to be backwards, just look at dup again :
> dup :: (a -> a -> stack) -> (a -> stack)
> so I introduce a type synonym Word, such that
> Word stack0 stack1
> is equal to
> stack1 -> stack0
>
> type Word s0 s1 = s1 -> s0
>
> Now dup has type :
> dup :: Word (a -> s) (a -> a -> s)
>
> (.) :: Word stack0 stack1 -> Word stack1 stack2 -> Word stack0 stack2
>
> drop :: Word (a -> s) s
>
> plus :: Word (Int -> Int -> s) (Int -> s)
> plus c n m = c (n+m)
>
> minus :: Word (Int -> Int -> s) (Int -> s)
> minus c n m = c (n-m)
>
> times :: Word (Int -> Int -> s) (Int -> s)
> times c n m = c (n*m)
>
> equal :: Word (Int -> Int -> s) (Bool -> s)
> equal c n m = c (n==m)
>
> int2float :: Word (Int -> s) (Float -> s)
> int2float c i = c (fromInt i)
>
> squareroot :: Word (Float -> s) (Float -> s)
> squareroot c f = c (sqrt f)
>
> donothing :: Word s s
> donothing c = c
>
> ifThenElse is not a word, but syntax, same for lit.
>
> ifThenElse :: Word s0 s1 -> Word s0 s1 -> Word (Bool -> s0) s1
> ifThenElse t e c True = t c
> ifThenElse t e c False = e c
>
> lit :: a -> Word s (a -> s)
> lit x c = c x
>
> lit is needed because literals in haskell doesn't denote a stack
> transformer, but the value instead.
>
>
> Example derived words :
>
> square :: Word (Int -> s) (Int -> s)
> square = dup
> . times
>
>
> pyth :: Word (Int -> Int -> s) (Float -> s)
> pyth = square
> . swap
> . square
> . plus
> . int2float
> . squareroot
>
> fact :: Word (Int -> s) (Int -> s)
> fact = dup
> . lit 0
> . equal
> . ifThenElse
> ( drop
> . lit 1
> )
> ( dup
> . lit 1
> . swap
> . minus
> . fact
> . times
> )
>
> Oh, I almost forgot :
>
> eval1answer :: Word a (a -> a) -> a
> eval1answer w = w (\o -> o)
>
> eval1answer (lit 3 . lit 4 . pyth)
> =>
> 5.0
>
> By the way, one can easily add first-class labels/continuation to this
> and add a goto primitive if one wants to. This is possible because of
> the use of closures.
> (A real (efficient ?) compiler (and/or interpreter) might not so easily
> provide continuations, though)
>
>
> What do you think ?

> Stefan Lj

It looks as though you have in effect implemented a small
forth-like language in Haskell.
It would be interesting to see how you fare with combinators.
(You might want to have a look at the "Miscellaneous" page
on the Joy site, which has a miniature Joy in Common Lisp.
Or you might prefer not to, it is probably a terrible program.)

-Manfred

Message 1 of 12 , Sep 17, 2001
Isn't strong typing and type-inference a big problem because recursion in
Joy relies on the K-combinator which can't be typed in Hindley-Milner
type systems?

-Alex-

___________________________________________________________________
S. Alexander Jacobson Shop.Com
1-646-638-2300 voice The Easiest Way To Shop (sm)

Message 1 of 12 , Sep 17, 2001
--- In concatenative@y..., Manfred von Thun <phimvt@l...> wrote:
> I'm sorry it took so long to look at your contribution.
> It deserves more comments than I have written, but this is all
> I can do at the moment.
> - Manfred

Ok, no prob.

> On Mon, 3 Sep 2001 md9slj@m... wrote:
>
> > --- In concatenative@y..., John Carter <john.carter@t...> wrote:
> > > One of Manfred's posts got me musing on whether one could create
> > > a strictly typed postfix language with a lambda abstraction.
> >
> > (I assume you mean concatenative (?))

[snip]

> > > It is trivial to write a function that would put 20 things on
> > > the stack if the thing on top of the stack is true else take 40
> > > things off the stack.
> > >
> > > If one constrained the language so that every operator chewed
> > > off / placed on a predetermined number of items off the stack.
> > >
> > > ie. Constrain the language so that, you could look at any and
> > > every operator and tell a priori how many items it would chew
> > > off or place on the stack.
> > I.e. a sort of type system.
>
> I would call it an "arity system" - it merely checks the
> number of parameters expected and the number of new values returned.

Ok, I'll go for arity system. The important bit was that it checks
the consistency of some aspects of programs statically, Ok ?

> > > Then for any "word" or function comprising of several operators
> > > one could determine apriori exactly how many items that function
> > > would chew off or place on the stack.
> > >
> > > The prime problem case would be an "if" operator. The "then"
> > > part may chew off more or less than the "else" part. However, if
> > > one constrained the "then" and the "else" parts to do exactly
> > > the same, then the problem is resolved.
> >
> > Yes, same "problem" with if in typed functional langages, same
> > solution.
>
> In lambda calculus languages this problem(?) does not arise,
> as far as I can see.

Not in untyped lambda calculus, but in typed lambda calculus it does.
Look at this typing judgement for if in the typed lambda calculus :

Gamma |- e1 : Bool    Gamma |- e2 : a    Gamma |- e3 : a
--------------------------------------------------------
          Gamma |- if e1 then e2 else e3 : a

Both branches have the same type a (which is a type variable,
instantiable to any type)

> > > Then one can talk about a function or operator being a n->m
> > > operator. It chews n items off the stack and pushes m items onto
> > > the stack.
> > >
> > > So if you have a n->m operator concatenated with a j->k operator
> > > you have a (n+j)->(m+k) operator.
>
> Exactly right.

The reason I said "type system" above is because I think this looks
like some sort of "type". Compare with matrix multiplication :

A matrix of "type" [n x m] has n rows and m columns.

Matrix multiplication :
X : [n x o],[o x m] -> [n x m]

I.e. :

[[1,2]      [[1,2,3,4]       [[11,14,17,20]
,[3,4]   X  ,[5,6,7,8]]   =  ,[23,30,37,44]
,[5,6]]                      ,[35,46,57,68]]

"Types" :
[3 x 2]     [2 x 4]          [3 x 4]
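One can check the quoted product mechanically; a throwaway haskell
definition (mine, just to verify the analogy's example):

```haskell
import Data.List (transpose)

-- naive matrix product: entry (i,j) is row i of a dotted with column j of b,
-- so a [n x o] times a [o x m] gives a [n x m]
matMul :: [[Int]] -> [[Int]] -> [[Int]]
matMul a b = [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ]
```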

> > > Strict typing.
> > > =============

[snip]

> > > Now lets extend our notation saying instead of merely chew n
> > > leave m n->m, we actually list the types.
> > >
> > > (T1,T2,...Tn)->(U1,U2,...Um) where Ti,Uj are types.
> >
> > Now it looks even more like a type system.
> > (I.e. no dynamic typing/one-universal-type)

[snip]

> > > Now one can recover objects and polymorphic types as well.
> > >
> > > I suspect the author of a function need never declare the type
> > > "signature" of the function, as it should be trivial to
> > > automagically deduce the signature!
> >
> > Yes, it's called type inference.
>
> Yes, but I don't know to what extent the existing technology
> would apply to Joy - if at all.

I'm not sure what/how you mean.

> > > Lazy evaluation
> > > ===============
> > > This bit is a tad more speculative...

[snip]
(I wasn't particularly advocating lazy evaluation in a concatenative
language, but maybe one could do it ??)

> > > And the lambda?
> > > ===============
> > >
> > > Everything is a lambda.
> > >
> > > Type in anything, the interpretor would lazily refuse to
> > > evaluate it as long as it could. Wherever possible it would
> > > automagically rewrite your code into lambda'd bundles.
> >
> > Like in other lazy languages like Haskell, Miranda, Clean.
>
> Except that the whole point of Joy at least is NOT to have named
> formal parameters.

I think he (John Carter) meant only the delaying mechanism and not
the named parameter mechanism when he said lambda.

Of course one could experiment with a concatenative language,
which also has named parameters, but then it wouldn't be Forth or Joy.
(It wouldn't be "pure" concatenative, would it ?)

> > How to do I/O and side-effects, then ?
> > Monads, uniqueness typing or something else ?
> >
> > (If a language is lazy, then we generally don't know exactly
> > when, if, and how many times a bundle passed to a function/word
> > is going to be evaluated
> > (Well it's only going to be evaluated at most once
> > (that's what lazy means), but if the bundle is in turn a word,
> > then that word may perform side effects when it gets executed
> > with parameters on the stack by the controlling (higher-order) word
>
> I have recently looked at streams in Scheme again,
> Abelson and Sussman, Structure and Interpretation of Computer
> Programs.
> I think one might be able to do something resembling lazy
> evaluation in Joy. I say "resembling", because there is a huge
> difference.

Ok, care to elaborate ?
(Do you mean simulate (either full or partial laziness) with quoting,
resembling what can be done in Scheme ?)

> > (long somewhat original-post-off-topic :)
> > I've recently found an interesting embedding for a typed
> > forth-like language in Haskell (which happens to be a lazy
> > language, but I think that one could very easily rewrite it in a
> > strict (functional) language, like SML, OCaml ..).
> >
> > Normally (I think ?) forth-embeddings or interpreters in
> > functional languages use a stack ADT or linked list for the forth
> > stack and a forth word is a stack transformer.
> >
> > The problem with this in a statically typed language is that,
> > normally the linked-list (or stack ADT) type is parameterized
> > over the element type
[snip]

> > , so that we can have a list of integers or a list of doubles
> > etc.. but no mixed types in lists. This is (at least in some
> > problem domains) not so often needed.
> >
> > But in this case we want to model forth programs which store
> > objects of different type on the stack.
> >
> > Naïve/Toy implementations (e.g. just for educational or
> > proof-of-concept) can of course just provide a list of integers
> > and be content with that.
> >
> > One other solution is to make a disjoint union type for all types
> > that we are going to need/use and make a list/stack of this
> > universal type. But then it begins to look more like an
> > interpreter (which can be fun) than an embedding (which I wanted
> > here).
> >
> > So if we are going to model/type the forth stack, then we are
> > going to need a way to incorporate the type of the top element
> > and the second top element and so on (As we saw, just one type
> > for all elements isn't enough) (Hopefully not all the way down,
> > though).
>
> I fear it will have to be all the way down!

I think not..
(One can provide a type variable for the rest of the stack, "all the
way down", and say in the word's type that under the arguments and
results, it's the same (unknown, from the word's perspective) stack
(variable)
)

[snip]

> > [about flip which has type (b -> a -> c) -> (a -> b -> c)]
> >
> > This is very much like swap in forth !
> > I.e. flip takes a function that represents what to do after (i.e.
> > continuation-passing-style (almost)) and eats two arguments
> > and continues with f such that f think that the two arguments were
> > given to it in the reverse order.
> >
> > Similarly, one can define a function dup that takes a
> > continuation-function and one argument and then continues with the
> > function after giving it the argument duplicated as two arguments.
> >
> > Ok, implementation :
> >
> > swap :: (b -> a -> stack) -> (a -> b -> stack)
> > swap c x y = c y x
> >
> > dup :: (a -> a -> stack) -> (a -> stack)
> > dup c x = c x x
> >
> > drop :: (stack) -> (a -> stack)
> > drop c x = c
> >
> > Now, in concatenative languages, juxtaposition of two program
> > fragments means (stack-transform-) function composition, while in
> > many applicative languages it means (unsurprisingly) function
> > application.
> >
> > So for an embedding we need an explicit function composition, like
> > the predefined Haskell one :
> >
> > (.) :: (stack1 -> stack0) -> (stack2 -> stack1) -> (stack2 -> stack0)
> > (f . g) c = f (g c)
> >
> > I.e. g's continuation is c (which is the continuation of the whole
> > composition) and f's continuation is (g c).
>
> Why do you consider these to be continuations?? Am I missing
> something?

Look at e.g. dup above: it takes a function c, representing what to
do "afterwards", and calls it with x and x instead of just returning.
(Normally continuations only take one argument, though in Scheme one
can make multi-arg ones, but those are uncurried; mine are curried
and return (at least conceptually) between each argument and
possibly after the last also.
So it isn't really what is usually meant by continuations, but it
looks quite similar (just as composable continuations aren't proper
continuations).
)
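To see this concretely, here is a tiny runnable sketch of the embedding (my reconstruction from the signatures above; `lit` and `plus` are assumed helper words not shown in the quoted code):

```haskell
-- Words take their continuation first, then stack elements, top first.
swap :: (b -> a -> s) -> (a -> b -> s)
swap c x y = c y x

dup :: (a -> a -> s) -> (a -> s)
dup c x = c x x

-- Assumed helpers: push a literal; add the top two elements.
lit :: a -> (a -> s) -> s
lit x c = c x

plus :: Num a => (a -> s) -> (a -> a -> s)
plus c x y = c (x + y)

-- Juxtaposition is function composition; the left word runs first.
prog :: (Int -> s) -> s
prog = lit 3 . lit 4 . plus   -- push 3, push 4, add them
```

Running with `id` as the final continuation, `prog id` is 7, and `(lit 3 . dup . plus) id` is 6: dup copies the 3 and plus adds the copies.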

Here's the implementation of continuations/first-class labels
(same thing in concatenative langs.) in my embedding :
(Using the syntax described later)
(Beware : I've no idea if these are implementable (at all, or
efficiently) in a compiler implementation)
(-; I'm also not responsible for perverting innocent programmers
with this goto-on-steroids view
(Some people like continuations, many do not. In any case, they're
both powerful and hard to understand)
;-)

An object of type Label s, a label, represents a position in the
code that expects the stack to have type s; therefore, before jumping
through a label, one must make sure that the stack is of the
appropriate type.

type Label s = s

--Pushes a label on the top of the stack, that represents the
--position in the code just after executing the label word
label :: Word s (Label s -> s)
label c = c lbl
where lbl = c lbl

--This primitive syntax takes a word that expects to find a label,
--representing the position after the word, and pushes it on the
--stack, then executes the word (which may jump out) and then
--continues with the rest of the program as usual
forwardLabel :: Word (Label s1 -> s0) s1 -> Word s0 s1
forwardLabel w c = w c lbl
where lbl = c

--This jumps to the label on top of the stack (passing the stack)
--Because this word never terminates normally, we can think of it
--as leaving anything we can imagine on the stack; because it doesn't
--return, it doesn't matter. (That's why the "return" stack type is
--a different type variable from the "argument" stack (type))
jump :: Word (Label s0 -> s0) s1
jump c lbl = lbl

Iterative (using jump and friends) versions of triangular numbers
(1+2+3+...+n), and factorial

-- i (before tri is called)
tri = lit 0 -- s i
. swap -- i s
. label -- l i s (here is the jumped to point)
. rot -- i s l
. dup -- i i s l
. rot_ -- s i i l
. plus -- s i l (add i to the sum)
. swap -- i s l
. dup -- i i s l
. lit 0 -- 0 i i s l
. nequal -- b i s l (test if i is 0)
. ifThen
( -- i s l (comes here if the bool was true)
lit 1 -- 1 i s l
. minus -- i s l (decrement i)
. rot_ -- l i s
. jump -- (jumps to after the label was created)
-- i s l (what should be on the stack now,
-- had not we jumped away)
) -- i s l (after the if (i.e. the bool was false))
. drop -- s l
. swap -- l s
. drop -- s (now the sum is the answer)

tst = lit 4 -- 4
. tri -- 10

-- i (before fac is called)
fac = lit 1 -- p i
. swap -- i p
. forwardLabel -- f i p (forward-label now on stack)
( rot -- i p f
. label -- l i p f (backward-label on stack)
. rot -- i p l f
. dup -- i i p l f
. lit 0 -- 0 i i p l f
. equal -- b i p l f (test if i is 0)
. ifThen
( -- i p l f (comes here if the bool was true)
rot4_ -- f i p l
. jump -- (jumps forward out of the loop)
) -- i p l f (after the if (the bool was false))
. dup -- i i p l f
. rot_ -- p i i l f
. times -- p i l f (multiply the product by i)
. swap -- i p l f
. lit 1 -- 1 i p l f
. minus -- i p l f (decrement i)
. rot_ -- l i p f (make ready for jump)
. jump -- (jump back to beginning of loop)
-- i p l (what *should* be on stack here,
) -- i p l (here comes the forward jump)
. drop -- p l
. swap -- l p
. drop -- p (the product is the result)

tst2 = lit 5 -- 5
. fac -- 120

(Beware : Above code is untested)
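For the record, here is one set of definitions for the auxiliary words (my own reconstruction, chosen to match the stack comments above, which list the top element first) under which the tri example actually runs and returns 10. The fac example seems to need recursive types as written (the forward label's stack type mentions the backward label's and vice versa), so only tri is reproduced:

```haskell
import Prelude hiding (Word, drop)

type Word s0 s1 = s1 -> s0  -- a word maps its continuation to a stack consumer

lit :: a -> Word s (a -> s)
lit x c = c x

swap :: Word (a -> b -> s) (b -> a -> s)
swap c x y = c y x

dup :: Word (a -> s) (a -> a -> s)
dup c x = c x x

drop :: Word (a -> s) s
drop c x = c

rot :: Word (a -> b -> c' -> s) (b -> c' -> a -> s)   -- l i s -> i s l
rot c x y z = c y z x

rot_ :: Word (a -> b -> c' -> s) (c' -> a -> b -> s)  -- i i s -> s i i
rot_ c x y z = c z x y

plus :: Num a => Word (a -> a -> s) (a -> s)
plus c x y = c (x + y)

minus :: Num a => Word (a -> a -> s) (a -> s)  -- subtracts top from second
minus c x y = c (y - x)

nequal :: Eq a => Word (a -> a -> s) (Bool -> s)
nequal c x y = c (x /= y)

ifThen :: Word s s -> Word (Bool -> s) s  -- run w only when the top Bool is True
ifThen w c b = if b then w c else c

type Label s = s

label :: Word s (Label s -> s)
label c = c lbl where lbl = c lbl

jump :: Word (Label s0 -> s0) s1
jump c lbl = lbl

tri :: Word (Integer -> s) (Integer -> s)
tri = lit 0 . swap . label . rot . dup . rot_ . plus
    . swap . dup . lit 0 . nequal
    . ifThen (lit 1 . minus . rot_ . jump)
    . drop . swap . drop
```

With these definitions, `(lit 4 . tri) id` evaluates to 10.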

The above are just standard goto examples.

Note : I'm not directly encouraging anyone to seriously use
continuations in their code. I just demonstrated a
very simple use for them (which one can do better without
continuations, I think).

These labels don't retain any context of the labelled point
in the program (such as stack content), except the actual code
position. When jumping to a label, the stack before the jump is
retained (untouched) after the jump (except the label is dropped
from the stack (if not a forward-label, the label is pushed anew
after the jump, of course)).

(My earlier minus should be changed to take its parameters in the
reverse order)

Phew !

[snip]

> > By the way, one can easily add first-class labels/continuations
> > to this and add a goto primitive if one wants to. This is
> > possible because of the use of closures.
> > (A real (efficient ?) compiler (and/or interpreter) might not so
> > easily provide continuations, though)
> >
> >
> > What do you think ?
>
> > Stefan Lj
>
> It looks as though you have in effect implemented a small
> version of Joy in Haskell.

At least a (somewhat Forth-like) concatenative language;
I haven't added any quotations or bundles/delays/thunks/suspensions/
closures, and the literal values aren't the same as their respective
stack words.

I had long before thought about something forth-like in Haskell,
after I had seen how flip worked, and now I seem to have a nice
(for an embedding) implementation of a concatenative language in
Haskell. Of course it's not exactly the same as a "real"
implementation, but I'm amazed at how far I've come while staying
within the embedding language (Haskell).

> It would be interesting to see how you fare with combinators.

I think it's interesting to express things with combinators.
I often define short combinators in Haskell and use them to combine
the things I want, instead of writing all the code in one lump (with
parts of the lump unreusable).

Did you know that there are *applicative* languages without named
parameters ? I think one of the original (?) functional languages, FP
by Backus
("Can Programming be Liberated from the von Neumann Style"
by John Backus, 1978
,http://www.cp.eng.chula.ac.th/faculty/pjw/teaching/fp/index-fp.htm
,ftp://gatekeeper.dec.com/pub/usenet/comp.sources.unix/volume13/funcproglang/
,ftp://gatekeeper.dec.com/pub/usenet/comp.sources.unix/volume20/fpc),
was such a one. I know for sure that at least Unlambda
doesn't have any variables at all; SKI calculus is also variable-free.

> (You might want to have a look at the "Miscellaneous" page
> on the Joy site, which has a miniature Joy in Common Lisp.

I probably will, when I have time and motivation, i.e.
not today^H^H^Hnight, tired..

(Hmm, maybe I'll put together a whole implementation/embedding so
you can put it on the page, hmm ? (I'll see how much of Joy specifics
can be retained when adapting to static typing))

> Or you might prefer not to, it is probably a terrible program.)

(Maybe, but I might not notice that because I'm almost a novice in
Lisp (and have not written many large programs at all).)

> -Manfred

Phew and Phew.

I'm going home now.

--
Stefan Lj
md9slj
• Message 1 of 12 , Sep 17, 2001
--- In concatenative@y..., "S. Alexander Jacobson" <shop2it@y...>
wrote:
> Isn't strong typing and type-inference a big problem because
> recursion in Joy relies on the K-combinator, which can't be typed
> in Hindley-Milner type systems?

Are you sure that recursion in Joy depends on a K-combinator ?
Shouldn't it be a Y-combinator or something similar ?

It's true that a direct translation of Y from untyped lambda
calculus can't be typed in most Hindley-Milner type systems,
but that's only because circular type expressions are disallowed.
If they are allowed, as in O'Caml (with the -rectypes flag), then Y is
directly typeable. (I think the explanation for normally disallowing
circular type expressions is that, if you have them, some (type
checking) errors are not as clearly explained as without them.
(Or in some cases found later and harder to localize))

But I think I saw a definition of Y without circular type expressions,
that used a recursive (algebraic) datatype (found in most statically
typed functional languages).
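That definition goes roughly like this (my sketch of the well-known trick; a strict language would use an eta-expanded variant):

```haskell
-- Hide the circularity inside a recursive datatype, so plain
-- Hindley-Milner accepts the definition without -rectypes.
newtype Rec a = Rec (Rec a -> a)

y :: (a -> a) -> a
y f = unroll (Rec unroll)
  where unroll (Rec x) = f (x (Rec x))

-- The usual test: factorial via an explicit fixed point.
fact :: Integer -> Integer
fact = y (\rec n -> if n == 0 then 1 else n * rec (n - 1))
```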

In combinatory logic (SK(I) calculus),
K has type forall a . forall b . a -> (b -> a)
or shorthand just a -> b -> a
Y has type ((a -> b) -> a -> b) -> a -> b (eager/strict evaluation)

Joy k : k == [pop] dip i
A version of this in my haskell embedding might have type :
k :: Word (Word s0 s1 -> a -> s0) s1
where
i :: Word (Word s0 s1 -> s0) s1
i c w = w c
dip :: Word (Word s0 s1 -> a -> s0) (a -> s1)
dip c w x = w (c x)
(though my current representation of Word does not allow introspection
as Joy quotations do)

> -Alex-
>
> ___________________________________________________________________
> S. Alexander Jacobson Shop.Com
> 1-646-638-2300 voice The Easiest Way To Shop (sm)

--
Stefan Lj
md9slj
• Message 1 of 12 , Sep 18, 2001
On Mon, 17 Sep 2001 md9slj@... wrote:

> --- In concatenative@y..., Manfred von Thun <phimvt@l...> wrote:
>
> > On Mon, 3 Sep 2001 md9slj@m... wrote:

> > > (Well it's only going to be evaluated at most once
> > > (that's what lazy means), but if the bundle is in turn a word,
> > > then that word may perform side effects when it gets executed
> > > with parameters on the stack by the controlling (higher-order) word
> >
> > I have recently looked at streams in Scheme again,
> > Abelson and Sussman, Structure and Interpretation of Computer
> > Programs.
>
> Ok, care to elaborate ?
> (Do you mean simulate (either full or partial laziness) with quoting,
> resembling what can be done in Scheme ?)

I wish I could elaborate. My remark was prompted by a discussion
on the group a short while ago, where I wrote a few unconsidered
remarks. That caused me to look at that book again, and I think
it would be a good starting point for implementing lazy (perhaps
infinite) lists in Joy. I'm not sure whether it would really need
some counterpart of set! in Scheme. Maybe it can be done with
Joy as-is. Dunno. Nice topic for somebody to investigate.

> > > So if we are going to model/type the forth stack, then we are
> > > going to need a way to incorporate the type of the top element
> > > and the second top element and so on (As we saw, just one type
> > > for all elements isn't enough) (Hopefully not all the way down,
> > > though).
> >
> > I fear it will have to be all the way down!
>
> I think not..
> (One can provide a type variable for the rest of the stack, "all the
> way down", and say in the word's type that under the arguments and
> results, it's the same (unknown, from the word's perspective) stack
> (variable)

My "fear" was prompted by the presence of the "stack" and "unstack"
operators. (Actually I see less and less use for them, but I do not
want to remove them yet.)

> I had long before thought about something forth-like in Haskell,
> after I had seen how flip worked, and now I seem to have a nice
> (for an embedding) implementation of a concatenative language in
> Haskell. Of course it's not exactly the same as a "real"
> implementation, but I'm amazed at how far I've come while staying
> within the embedding language (Haskell).
>
> > It would be interesting to see how you fare with combinators.
>
> I think it's interesting to express things with combinators.
> I often define short combinators in Haskell and use them to combine
> the things I want, instead of writing all the code in one lump (with
> parts of the lump unreusable).

I think that the "machine language" that you were developing
might be a very useful starting point for a Joy compiler to
some sequential code. In SICP the Scheme compiler is being developed
in much the same way.

> Did you know that there are *applicative* languages without named
> parameters ? I think one of the original (?) functional languages, FP
> by Backus

Oh yes, see the "Rationale" and the "Comparison" papers on the
main Joy page. Schönfinkel/Curry combinatory calculus, and then
Backus. But I must read Backus again. There is also the
(compositional, non-applicative) language of Category theory,
but it is strongly typed.

> ("Can Programming be Liberated from the von Neumann Style"
> by John Backus, 1978
> ,http://www.cp.eng.chula.ac.th/faculty/pjw/teaching/fp/index-fp.htm
> ,ftp://gatekeeper.dec.com/pub/usenet/comp.sources.unix/volume13/funcproglang/
> ,ftp://gatekeeper.dec.com/pub/usenet/comp.sources.unix/volume20/fpc),
> was such a one. I know for sure that at least Unlambda
> doesn't have any variables at all; SKI calculus is also variable-free.

Yes. I have seen UnLambda. It is the SK "machine language",
not intended for human programmers. Joy is intended for people.

> (Hmm, maybe I'll put together a whole implementation/embedding so
> you can put it on the page, hmm ? (I'll see how much of Joy specifics
> can be retained when adapting to static typing))

A very worthwhile project. I will be glad to put it on the Joy page.

- Manfred
• Message 1 of 12 , Sep 19, 2001
In re: ". . . implementing lazy (perhaps
infinite) lists in Joy. I'm not sure whether it would really need
some counterpart of set! in Scheme. Maybe it can be done with
Joy as-is. Dunno. Nice topic for somebody to investigate.":

I thought I already presented infinite lists in unmodified Joy (no
set!), in
http://groups.yahoo.com/group/concatenative/message/774
Your message has been successfully submitted and would be delivered to recipients shortly.
• Changes have not been saved
Press OK to abandon changes or Cancel to continue editing
• Your browser is not supported
Kindly note that Groups does not support 7.0 or earlier versions of Internet Explorer. We recommend upgrading to the latest Internet Explorer, Google Chrome, or Firefox. If you are using IE 9 or later, make sure you turn off Compatibility View.