Tail-recursion optimization in Oz

In the chapter about functions in the Oz tutorial, it says:

similar to lazy functional languages Oz allows certain forms of tail-recursion optimizations that are not found in certain strict functional languages including Standard ML, Scheme, and the concurrent functional language Erlang. However, standard function definitions in Oz are not lazy.

It then goes on to show the following function which is tail-recursive in Oz:

fun {Map Xs F}
   case Xs
   of nil then nil
   [] X|Xr then {F X}|{Map Xr F}
   end 
end 

What this does is map the empty list to the empty list, and a non-empty list to the result of applying the function F to its head, prepended to the result of calling Map on the tail. In other languages this would not be tail-recursive, because the last operation is the prepend, not the recursive call to Map.

So my question is: If "standard function definitions in Oz are not lazy", what does Oz do that languages like Scheme or Erlang can't (or won't?) to be able to perform tail-recursion optimization for this function? And exactly when is a function tail-recursive in Oz?

Fiftyfifty answered 3/10, 2009 at 11:9 Comment(1)
more info at stackoverflow.com/tags/tailrecursion-modulo-cons/info.Squilgee

This is called Tail Recursion Modulo Cons (TRMC). Basically, prepending to the list directly after the recursive call is the same as appending to the list directly before the recursive call (and thus building the list as a "side-effect" of the purely functional "loop"). This is a generalization of tail recursion that works not just on cons lists but on any data constructor whose fields can be filled in with constant-time operations.
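As an illustrative sketch (in Python rather than Oz, with hypothetical names), this is roughly the loop a TRMC compiler derives: instead of prepending after the recursive call returns, each iteration allocates a cons cell whose tail is still a "hole" and fills the previous hole before moving on.

```python
# Hypothetical sketch of what TRMC compiles to: the "prepend after the
# recursive call" becomes "fill a hole before the next iteration",
# using cons cells with a mutable tail. Names here are illustrative.

class Cons:
    def __init__(self, head, tail=None):
        self.head = head
        self.tail = tail  # None doubles as both nil and an unfilled hole

def trmc_map(f, xs):
    """Map over a cons list iteratively, the way a TRMC compiler would."""
    result = None   # nil
    last = None     # last cell whose tail is still a hole
    while xs is not None:
        cell = Cons(f(xs.head))   # tail is left as a hole for now
        if last is None:
            result = cell         # first cell becomes the result
        else:
            last.tail = cell      # fill the previous hole
        last = cell
        xs = xs.tail
    return result

def from_list(lst):
    out = None
    for x in reversed(lst):
        out = Cons(x, out)
    return out

def to_list(xs):
    out = []
    while xs is not None:
        out.append(xs.head)
        xs = xs.tail
    return out

doubled = trmc_map(lambda x: x * 2, from_list([1, 2, 3]))
```

Note that the loop runs in constant stack space: the only state carried between iterations is the pointer to the last unfilled hole.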

It was first described (but not named) as a LISP compilation technique in 1974 by Daniel P. Friedman and David S. Wise in Technical Report TR19: Unwinding Structured Recursions into Iterations and it was formally named and introduced by David H. D. Warren in 1980 in the context of writing the first-ever Prolog compiler.

The interesting thing about Oz, though, is that TRMC is neither a language feature nor an explicit compiler optimization; it is just a side-effect of the language's execution semantics. Specifically, Oz is a declarative concurrent constraint language, which means that every variable is a dataflow variable (or "everything is a promise", including every storage location). Since everything is a promise, we can model returning from a function as first setting up the return value as a promise, and then later on fulfilling it.

Peter Van Roy, co-author (with Seif Haridi) of the book Concepts, Techniques, and Models of Computer Programming, one of the designers of Oz, and one of its implementers, explains exactly how TRMC works in a comment thread on Lambda the Ultimate: Tail-recursive map and declarative agents:

The above example of bad Scheme code turns into good tail-recursive Oz code when translated directly into Oz syntax. This gives:

fun {Map F Xs}
   if Xs==nil then nil
   else {F Xs.1}|{Map F Xs.2} end
end

This is because Oz has single-assignment variables. To understand the execution, we translate this example into the Oz kernel language (I give just a partial translation for clarity):

proc {Map F Xs Ys}
   if Xs==nil then Ys=nil
   else local Y Yr in
      Ys=Y|Yr
      {F Xs.1 Y}
      {Map F Xs.2 Yr}
   end end
end

That is, Map is tail-recursive because Yr is initially unbound. This is not just a clever trick; it is profound because it allows declarative concurrency and declarative multi-agent systems.
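To make the role of the initially unbound Yr concrete, here is a hypothetical Python transliteration of the kernel-language Map, using a minimal single-assignment variable class (illustrative names only; real Oz dataflow variables also block readers, which this sketch omits). The recursive tail call is written as a loop because Python does not perform tail-call optimization.

```python
# Hypothetical model of an Oz single-assignment variable (no blocking).
class SAVar:
    _UNBOUND = object()

    def __init__(self):
        self._value = SAVar._UNBOUND

    def bind(self, value):
        if self._value is not SAVar._UNBOUND:
            raise RuntimeError("single-assignment variable already bound")
        self._value = value

    def read(self):
        if self._value is SAVar._UNBOUND:
            raise RuntimeError("unbound (a real dataflow read would block)")
        return self._value

def map_proc(f, xs, ys):
    """Transliteration of the kernel-language Map; the tail call is a loop."""
    while True:
        if not xs:                   # Xs == nil
            ys.bind(None)            # Ys = nil
            return
        y, yr = SAVar(), SAVar()
        ys.bind((y, yr))             # Ys = Y|Yr  (tail Yr still unbound)
        y.bind(f(xs[0]))             # {F Xs.1 Y}
        xs, ys = xs[1:], yr          # tail call {Map F Xs.2 Yr}

def read_list(ys):
    out = []
    while ys.read() is not None:
        y, yr = ys.read()
        out.append(y.read())
        ys = yr
    return out

ys = SAVar()
map_proc(lambda x: x * x, [1, 2, 3], ys)
```

The key step is that Ys is bound to a cons cell whose tail Yr is still unbound before the recursive call, so the call needs no pending work on the stack.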

Diatonic answered 5/8, 2014 at 11:34 Comment(3)
"Since everything is a promise, we can model returning from a function as first setting up the return value as a promise, and then later on fulfilling it." That would make it fully lazy, wouldn't it? Instead, the quoted Oz code seems to show (as expected from my Prolog-based experience with the matter) that the structure of the returned value is fully set up right away, filled with variables which happen to be not-yet-set (analogous to the uninstantiated logical variables of Prolog). If I understand correctly, in Oz any attempt to read from a not-yet-set variable blocks 1/Squilgee
2/ until some other thread sets that var's value; I guess that's what makes such vars "dataflow vars". So then Ys=Y|Yr sets up the "cons" structure with two holes; the {F Xs.1 Y} procedure call sets Y's value, and then the {Map F Xs.2 Yr} procedure call is made (obviously, in tail position), and no blocking/resuming need even come into play as this same thread continues with the tail call and sets all the resulting list's cells. So the magic seems to be in these Oz "procedures", analogous to Prolog predicates (again, I guess). /2Squilgee
OK, so, the more precise wording could be "everything is expected to be a promise" and every not-yet-set variable is a promise which has not yet been set up. An "empty promise" so to speak.Squilgee

I am not too familiar with lazy functional languages, but if you think about the function Map in your question, it is easy to translate into a tail-recursive implementation if temporarily incomplete values in the heap are allowed (mutated into more complete values one call at a time).

I have to assume that this is the transformation they are talking about in Oz. Lispers used to do this optimization by hand -- all values were mutable, and in this case a function called setcdr would be used -- but you had to know what you were doing. Computers did not always have gigabytes of memory; doing this by hand was justified then, but it arguably no longer is.

Back to your question: other modern languages probably do not do it automatically because it would be possible to observe the incomplete value while it is being built, and Oz must have found a solution to that. What other differences are there in Oz, compared to other languages, that would explain it?
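One way the observation problem can be avoided is if reading a not-yet-filled value blocks the reader until another thread binds it, which is exactly what the comments below describe for Oz dataflow variables. Here is a hypothetical Python sketch of such a blocking single-assignment variable (illustrative names built on a condition variable; this is not an Oz API):

```python
import threading

# Hypothetical blocking single-assignment variable: reading an unbound
# variable suspends the reader until another thread binds it, so an
# incomplete value can never be observed.

class BlockingVar:
    def __init__(self):
        self._cond = threading.Condition()
        self._bound = False
        self._value = None

    def bind(self, value):
        with self._cond:
            if self._bound:
                raise RuntimeError("single-assignment: already bound")
            self._value, self._bound = value, True
            self._cond.notify_all()  # wake any blocked readers

    def read(self):
        with self._cond:
            self._cond.wait_for(lambda: self._bound)
            return self._value

v = BlockingVar()
t = threading.Thread(target=lambda: v.bind(42))
t.start()
result = v.read()   # blocks until the binder thread runs
t.join()
```

In the single-threaded tail-recursive Map, the same thread fills every hole before anyone reads it, so no blocking ever actually occurs.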

Midden answered 3/10, 2009 at 11:46 Comment(4)
I don't actually know too much about Oz. As far as I can tell, it's not too different from Lisp. I just played around with it a bit in the last couple of days and stumbled upon the fact that it optimizes recursions, which I didn't expect it to.Fiftyfifty
I only know Oz from reading Peter Van Roy's Concepts, Techniques, and Models of Computer Programming, but it does in fact have incomplete values -- they are used extensively in concurrent programming, because reading an incomplete value causes the current thread to block. So Pascal's guess is probably how it works. (Here is the book link: info.ucl.ac.be/~pvr/book.html He used to have a draft version online, but has removed it. :( )Ep
As Oz is also a logic language, it might have the concept of a logical variable, in which case it would be tail-recursive in the same way as other logic languages like Prolog.Bursitis
See here for how exactly it works: lambda-the-ultimate.org/node/2273#comment-40235Misgiving

© 2022 - 2024 — McMap. All rights reserved.