Parallel concatenation #174
Comments
This seems like it would be useful, but I don’t see a good way to make it work. It makes infix desugaring more complicated; in particular, it would depend on type information, which in turn would lock us in to requiring type signatures for definitions. One thing I would consider is something like #166 for ordinary expressions.
The Haskell approach is very ad hoc; it becomes complicated when functions take more than one argument. Concatenative languages have a huge advantage here: they don't require things like
So for
It requires only arity; everything else should be straightforward. So the type signature can be just
Btw, a somewhat more realistic example:
(here This is not a full point-free, non-von-Neumann-style definition, but it doesn't have to be. I'm not a fan of APL/J-style one-liners; I prefer simple expressions with lots of

PS: suddenly,

UPDATE: it can be

UPDATE 2: the full point-free definition is actually nice too:
I agree with you there. I just don’t think lifting is necessarily awkward. As long as there’s a concise notation for it, it’s nice to have some indication that operators and functions are being used in lifted form. We could make some progress with traits—for example, defining an instance of
(Although in reality I’d probably factor out But the many
However, your
It is necessarily ad-hoc.
Again ad-hoc.
The reason I prefer the latter is that

Meanwhile, I realized that
I like where you’re going with that article. One thing to note is that defining

Your reference to the visual “generalised composition” notation makes me realise that ordinary composition and parallel composition form two distinct monoids on dataflow graphs: vertical and horizontal concatenation, respectively. These strongly remind me of both Functional Geometry and the

As for the theory side, it seems you’ve reinvented arrows: parallel composition
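For concreteness: in Haskell’s `Control.Arrow`, parallel composition is `(***)` and the duplicating fork is `(&&&)`. A minimal Python analogue (the names `par` and `fork` are my own) mirrors them for ordinary functions on pairs:

```python
def par(f, g):
    """Parallel composition, like Haskell's (***): run f and g
    side by side on the two halves of a pair."""
    return lambda pair: (f(pair[0]), g(pair[1]))

def fork(f, g):
    """Fork, like Haskell's (&&&): duplicate the input, then
    compose in parallel."""
    return lambda x: (f(x), g(x))

# Ordinary composition chains vertically; par composes horizontally.
print(par(lambda n: n + 1, str)((41, True)))  # (42, 'True')
print(fork(sum, len)([1, 2, 3]))              # (6, 3)
```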
Parallel composition can also be implemented in terms of a retain stack. For example, if f : m → n means that f consumes m inputs and produces n outputs, then:
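A sketch of the retain-stack idea, modelling the data stack as a Python list (the helper names here are my own, not Kitten’s): to run `f ; g` where `g` consumes `p` values, move `g`’s inputs onto a retain stack, run `f` on what’s left, restore, then run `g`.

```python
def par(f, g, p):
    """Run `f ; g` on one data stack, where g consumes p values:
    set g's inputs aside, run f on its own inputs, restore, run g."""
    def run(stack):
        retain = [stack.pop() for _ in range(p)]  # g's inputs wait here
        f(stack)                                  # f sees only its inputs
        while retain:
            stack.append(retain.pop())            # restore original order
        g(stack)
    return run

def add(stack):                                   # add : 2 -> 1
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def neg(stack):                                   # neg : 1 -> 1
    stack.append(-stack.pop())

s = [1, 2, 10]
par(add, neg, 1)(s)   # add consumes 1 and 2; neg gets the retained 10
print(s)              # [3, -10]
```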
That's exactly what I had in mind while talking about composition tree. So, it is the same for graphs?
I'm not even surprised: arrows look like a poor man's version of a concatenative language. But concatenative languages keep everything amazingly simple. I'm still impressed by how there's almost nothing you have to do to convert concatenative code into ANF. It's as if ANF is actually concatenative programming.
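The ANF observation can be made concrete with a toy translator (entirely my own sketch; the word names and arities are illustrative): the compile-time stack holds variable names instead of values, and every word call emits exactly one `let` binding.

```python
def to_anf(tokens, arity):
    """Translate postfix (concatenative) code to ANF let-bindings.
    The stack holds variable names; each word call becomes a `let`."""
    stack, lets, n = [], [], 0
    for tok in tokens:
        if tok not in arity:
            stack.append(tok)  # literal: push its own text
        else:
            args = [stack.pop() for _ in range(arity[tok])][::-1]
            name = f"t{n}"; n += 1
            lets.append(f"let {name} = {tok}({', '.join(args)})")
            stack.append(name)
    return lets, stack

lets, out = to_anf(["2", "3", "add", "4", "mul"], {"add": 2, "mul": 2})
print(lets)   # ['let t0 = add(2, 3)', 'let t1 = mul(t0, 4)']
print(out)    # ['t1']
```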
I implemented a toy language with this feature: https://github.com/suhr/esobsc
With cancellation, you could write something like this:
or
I'll write more about it later. I'm thinking about implementing a more useful language, so I want to understand recursion better. How does Kitten handle recursion, mutual recursion, and TCO? Do you have any advice about the implementation?
Yeah, pattern matching as function inversion seems to play nicely with this notation.

Kitten’s implementation is pretty simple and obvious. It uses two stacks, one for data (the Kitten stack) and one for return addresses & locals (the C stack). Calling a function pushes its return address and jumps (x86

As a debugging aid, I’ve also considered keeping a “stack log” (like a stack trace) to recover information lost (optimised out) by TCO. Also, destructors don’t work very well with TCO (#156): a call that’s syntactically in tail position might not be semantically in tail position if there’s a value in scope that has a destructor.

I believe it’s possible to get away with a single stack, and thus further optimisation opportunities such as converting to SSA form, if you impose one restriction: higher-order stack-polymorphic functions such as this cannot be recursive.
This allows you to specialise them for a certain number of stack values, or always inline them, which means that the arity of every call is known statically and you don’t need to reify the stack: it can live in registers and spill to the same stack as return addresses and locals. Unfortunately this means that you can’t have something like a
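As a hedged illustration of the two-stack scheme described above (my own toy machine, not Kitten’s actual code generator): a call pushes a return address onto the return stack, while a tail call only replaces the current code pointer, so tail-recursive loops run in constant return-stack space.

```python
def run(program, entry):
    """Toy two-stack machine: `data` for values, `rets` for return
    addresses. Ops: push, prim, jz (pop; jump if zero),
    call (push return address), tail (jump only), ret."""
    data, rets, deepest = [], [], 0
    code, ip = program[entry], 0
    while True:
        op, *args = code[ip]
        if op == "push":
            data.append(args[0]); ip += 1
        elif op == "prim":
            args[0](data); ip += 1
        elif op == "jz":
            ip = args[0] if data.pop() == 0 else ip + 1
        elif op == "call":
            rets.append((code, ip + 1))
            deepest = max(deepest, len(rets))
            code, ip = program[args[0]], 0
        elif op == "tail":                 # TCO: jump, push nothing
            code, ip = program[args[0]], 0
        else:                              # ret
            if not rets:
                return data, deepest
            code, ip = rets.pop()

# Count down from 100,000 by tail-calling `loop`; the return stack
# never grows past the single non-tail `call` to `dec`.
program = {
    "main": [("push", 100_000), ("tail", "loop")],
    "dec":  [("prim", lambda d: d.append(d.pop() - 1)), ("ret",)],
    "loop": [("prim", lambda d: d.append(d[-1])),  # dup the counter
             ("jz", 4),                            # zero? fall out to ret
             ("call", "dec"),
             ("tail", "loop"),
             ("ret",)],
}
print(run(program, "main"))   # ([0], 1)
```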
Let `foo: a b -> x`, `bar: c d e -> y`. Then `foo ; bar` is the function of type `a b c d e -> x y`, such that `x = a b foo`, `y = c d e bar`.

Infix notation ``foo `F` bar`` can be defined as `(foo ; bar) F`. This allows you, for example, to define the mean value function as `dup (sum) / (len)`. For now, infix notation just reorders ``f `op` g`` into `f g op`.

This is basically a more sane version of forks and hooks in J.

Unresolved question: can we avoid using `ndup` when the number of arguments divides the arity of `(f ; g ; ...)`?

PS: I want to create my own little concatenative array language, so discussions are highly welcome.
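A runnable sketch of the mean example, modelling the data stack as a Python list (`par`, `word`, and the word names are my own, chosen to mirror `dup (sum) / (len)`):

```python
def par(f, g, p):
    """`f ; g` on one stack: retain g's p inputs, run f, restore, run g."""
    def run(stack):
        retain = [stack.pop() for _ in range(p)]
        f(stack)
        while retain:
            stack.append(retain.pop())
        g(stack)
    return run

def word(fn, n):
    """Lift an n-argument Python function to a stack effect n -> 1."""
    def run(stack):
        args = [stack.pop() for _ in range(n)][::-1]
        stack.append(fn(*args))
    return run

def dup(stack):
    stack.append(stack[-1])

# mean = dup (sum ; len) /   -- i.e. the issue's  dup (sum) / (len)
def mean(stack):
    dup(stack)
    par(word(sum, 1), word(len, 1), 1)(stack)
    word(lambda a, b: a / b, 2)(stack)

s = [[1.0, 2.0, 3.0]]
mean(s)
print(s)   # [2.0]
```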