Can every recursion be converted into iteration?

A reddit thread brought up an apparently interesting question:

Tail-recursive functions can trivially be converted into iterative functions. Others can be transformed by using an explicit stack. Can every recursion be transformed into iteration?

The (counter?)example in the post is the pair:

(define (num-ways x y)
  (cond ((= x 0) 1)
        ((= y 0) 1)
        (else (num-ways2 x y))))

(define (num-ways2 x y)
  (+ (num-ways (- x 1) y)
     (num-ways x (- y 1))))
Eliason answered 31/5, 2009 at 9:48 Comment(6)
I don't see how this is a counter-example. The stack technique will work. It won't be pretty, and I'm not going to write it, but it is doable. It appears akdas acknowledges that in your link.Delldella
Your (num-ways x y) is just (x+y) choose x = (x+y)!/(x!y!), which doesn't need recursion.Refund
Duplicate of: stackoverflow.com/questions/531668Currency
I would say that recursion is merely a convenience.Whirly
Possible duplicate of Which recursive functions cannot be rewritten using loops?Phraseology
The other thing I might ask is.. wouldn't the compiler be able to turn recursion into iteration?Ednaedny
219

Can you always turn a recursive function into an iterative one? Yes, absolutely, and the Church-Turing thesis proves it if memory serves. In lay terms, it states that what is computable by recursive functions is computable by an iterative model (such as the Turing machine) and vice versa. The thesis does not tell you precisely how to do the conversion, but it does say that it's definitely possible.

In many cases, converting a recursive function is easy. Knuth offers several techniques in "The Art of Computer Programming". And often, a thing computed recursively can be computed by a completely different approach in less time and space. The classic example of this is Fibonacci numbers or sequences thereof. You've surely met this problem in your degree plan.
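
For instance, a minimal sketch of that point (my own, not part of the original answer; the name fib is just illustrative): the naive recursive Fibonacci takes exponential time, while a simple loop computes the same values in linear time and constant space.

(define (fib n)
  ;; a and b walk along the sequence while k counts down to 0
  (let loop ((a 0) (b 1) (k n))
    (if (= k 0)
        a
        (loop b (+ a b) (- k 1)))))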

On the flip side of this coin, we can certainly imagine a programming system so advanced as to treat a recursive definition of a formula as an invitation to memoize prior results, thus offering the speed benefit without the hassle of telling the computer exactly which steps to follow in the computation of a formula with a recursive definition. Dijkstra almost certainly did imagine such a system. He spent a long time trying to separate the implementation from the semantics of a programming language. Then again, his non-deterministic and multiprocessing programming languages are in a league above the practicing professional programmer.
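
As a hedged sketch of that idea (my own code, not the answer's; the wrapper name is made up): keep the recursive definition of num-ways from the question, but cache each (x, y) result so no subproblem is computed twice. A plain association list is used to stay portable across Scheme implementations.

(define (make-memoized-num-ways)
  (let ((cache '()))                            ; alist mapping (x . y) -> result
    (define (num-ways x y)
      (cond ((or (= x 0) (= y 0)) 1)
            ((assoc (cons x y) cache) => cdr)   ; reuse a previously computed result
            (else
             (let ((result (+ (num-ways (- x 1) y)
                              (num-ways x (- y 1)))))
               (set! cache (cons (cons (cons x y) result) cache))
               result))))
    num-ways))

Calling ((make-memoized-num-ways) 20 20) now does polynomial rather than exponential work, without the caller spelling out any iteration.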

In the final analysis, many functions are just plain easier to understand, read, and write in recursive form. Unless there's a compelling reason, you probably shouldn't (manually) convert these functions to an explicitly iterative algorithm. Your computer will handle that job correctly.

I can see one compelling reason. Suppose you've a prototype system in a super-high level language like [donning asbestos underwear] Scheme, Lisp, Haskell, OCaml, Perl, or Pascal. Suppose conditions are such that you need an implementation in C or Java. (Perhaps it's politics.) Then you could certainly have some functions written recursively but which, translated literally, would explode your runtime system. For example, infinite tail recursion is possible in Scheme, but the same idiom causes a problem for existing C environments. Another example is the use of lexically nested functions and static scope, which Pascal supports but C doesn't.

In these circumstances, you might try to overcome political resistance to the original language. You might find yourself reimplementing Lisp badly, as in Greenspun's (tongue-in-cheek) tenth law. Or you might just find a completely different approach to solution. But in any event, there is surely a way.

Adlai answered 1/6, 2009 at 8:32 Comment(19)
Isn't Church-Turing yet to be proven?Kristelkristen
Here's a really short outline: pick two models of computation A and B. Prove that A is at least as powerful as B by writing an interpreter of B using A. Do this in both directions, and you have shown that A and B have equivalent power. Consider that machine code is almost the Turing-machine model, and that lisp interpreters/compilers exist. The debate should be over. But for more information, see: alanturing.net/turing_archive/pages/Reference%20Articles/…Adlai
Ian, I'm not sure that proves each has equivalent power, only that each uses equivalent power. Use demonstrates certain capabilities, but not necessarily the extent of them.Unpolitic
@eyelidlessness: If you can implement A in B, it means B has at least as much power as A. If you cannot execute some statement of A in the A-implementation-of-B, then it's not an implementation. If A can be implemented in B and B can be implemented in A, power(A) >= power(B), and power(B) >= power(A). The only solution is power(A) == power(B).Eliason
Would this answer depend on how restrictive the (iterative) language is? What if the language has very limited use of GOTO statements? How is it possible for the program to "remember" the instruction from which the recursion/iteration was last called without some kind of return address?Aromaticity
@T.Webster You can't get much more limited than a Turing Machine - gotos? ha waaay too advanced feature. You can look at the informal definition to see what you need to get the same power as any modern iterative programming language.Anesthetic
re: 1st paragraph: You are speaking about equivalence of models of computation, not the Church-Turing thesis. The equivalence was AFAIR proved by Church and/or Turing, but it is not the thesis. The thesis is an experimental fact that everything intuitively computable is computable in strict mathematical sense (by Turing machines/recursive functions etc.). It could be disproven if using laws of physics we could build some nonclassical computers computing something Turing machines cannot do (e.g. halting problem). Whereas the equivalence is a mathematical theorem, and it will not be disproven.Gearwheel
How the heck did this answer get any positive votes? First it mixes up Turing completeness with the Church-Turing thesis, then it makes a bunch of incorrect handwaving, mentioning "advanced" systems and dropping lazy infinite tail recursion (which you can do in C or any Turing complete language because.. uh.. does anyone know what Turing complete means?). Then a hopeful hand-holding conclusion, like this was a question on Oprah and all you need is to be positive and uplifting? Horrid answer!Alary
And the bs about semantics??? Really? This is a question on syntactic transformations, and somehow it's become a great way to name-drop Dijkstra and imply you know something about the pi-calculus? Let me make this clear: whether one looks at the denotational semantics of a language or some other model will have no bearing on the answer to this question. Whether the language is assembly or a generative domain modeling language means nothing. It's only about Turing completeness and transforming "stack variables" to "a stack of variables".Alary
If any recursive function can be transformed into a non-recursive version, then why don't compilers do this? It is common advice to write non-recursive methods if possible because of performance overhead and SO issues with a recursive method.Patmos
@morpheus, in fact compilers for functional languages (and many lisps) routinely apply several techniques to automatically convert a large category of recursive-looking definitions into object code that doesn't overuse stack space. But in any event, object code can be considered as iterative by considering the CPU to run a fetch-execute iteration.Adlai
Some other practical reasons for converting recursive functions to other implementations: (1) You want to write an iterator class over a complex data structure, but you don't have the equivalent of C#'s yield. (2) You have many executing co-processes, and you do not want to (or cannot) implement them as separate threads. This is seen in algorithms for executing data-flow graphs, where assigning a thread to each vertex may be impractical.Tripitaka
@Patmos Iteration isn't magic. If a recursive function calls itself more than once during a single call (e.g. merge sort), then it requires unbounded memory space, and implementing it via iteration won't change that, you'll just be manually managing a stack. But, many naturally recursive algorithms only call themselves once per call and can be written as a tail call. Tail calls can work in fixed memory space, but require special treatment from the compiler. Telling users to manually convert to iteration means compiler writers can do other stuff.Candiot
How about the inverse? Can you always turn an iterative function into a recursive one?Schmaltzy
@Schmaltzy, yes, you can. By one definition, iterative functions are primitive-recursive to begin with. But if your goal is to write it without explicit looping constructs, then a correspondingly-designed recursive call stands in for the construct. The details are part of any course on functional programming.Adlai
[donning asbestos underwear] deserves 50 points in itself. :-)Discriminate
@Schmaltzy an iterative function is already a (primitive-)recursive function. The operation is vacuous.Adlai
@Adlai Knuth offers several techniques in "The Art of Computer Programming". Could you please tell which volume of the book you are talking about? Also, could you please point to any other source that describes general techniques for avoiding recursion?Munday
@Adlai I'd also like to know which TAOCP volume you were referring to.Appetizing
57

Is it always possible to write a non-recursive form for every recursive function?

Yes. A simple formal proof is to show that µ-recursion and a non-recursive calculus such as GOTO are both Turing complete. Since all Turing-complete calculi are strictly equivalent in their expressive power, all recursive functions can be implemented by the non-recursive Turing-complete calculus.

Unfortunately, I’m unable to find a good, formal definition of GOTO online so here’s one:

A GOTO program is a sequence of commands P executed on a register machine, where each command in P is one of the following:

  • HALT, which halts execution
  • r = r + 1 where r is any register
  • r = r – 1 where r is any register
  • GOTO x where x is a label
  • IF r ≠ 0 GOTO x where r is any register and x is a label
  • A label, followed by any of the above commands.
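
For example (my own illustration, not part of the cited definition), here is a GOTO program that adds the contents of register r2 into register r1, i.e. the kind of loop a compiler might emit for an iterative sum:

loop: IF r2 ≠ 0 GOTO body
      GOTO done
body: r1 = r1 + 1
      r2 = r2 – 1
      GOTO loop
done: HALT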

However, the conversion between recursive and non-recursive functions isn't always trivial (except by mindless manual re-implementation of the call stack).

For further information see this answer.

Spinthariscope answered 2/11, 2009 at 17:8 Comment(3)
Great answer! However, in practice I have great difficulty turning recursive algos into iterative ones. For example, I have so far been unable to turn the monomorphic typer presented here community.topcoder.com/… into an iterative algorithmNobility
"strictly equivalent" - Does that mean something different than just "equivalent"?Maryammaryann
@KellyBundy “equivalent” on its own can be a vague term. I could have instead written “mathematically equivalent” or “equivalent under formal rules of computation”.Spinthariscope
32

Recursion is implemented as stacks or similar constructs in the actual interpreters or compilers. So you certainly can convert a recursive function to an iterative counterpart, because that's how it's always done (albeit automatically). You'll just be duplicating the compiler's work in an ad hoc, and probably very ugly and inefficient, manner.

Douro answered 31/5, 2009 at 10:10 Comment(2)
why ugly and inefficient? I thought it was cleaner (and easier to debug) to use loops, in particular to avoid a stack overflow when the recursion depth is large.Tymon
No, the compiler uses stack memory, you'd be using heap memory. There is a huge difference: for example, preventing stack overflows; ironic considering the name of the site we're on.Carborundum
14

Basically, yes: in essence what you end up having to do is replace method calls (which implicitly push state onto the stack) with explicit stack pushes that remember where the 'previous call' had gotten up to, and then execute the 'called method' instead.

I'd imagine that the combination of a loop, a stack and a state-machine could be used for all scenarios by basically simulating the method calls. Whether or not this is going to be 'better' (either faster, or more efficient in some sense) is not really possible to say in general.
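
To make that concrete, here is a rough sketch of the loop-plus-stack-plus-state-machine idea (my own code, in the question's Scheme rather than anything from this answer; fact-machine is a made-up name), simulating the non-tail-recursive factorial n * fact(n - 1):

(define (fact-machine n)
  ;; Each stack frame is a (state . k) pair: 'call means "fact(k) still has
  ;; to be computed", 'return means "multiply the result computed so far by k".
  (let loop ((stack (list (cons 'call n))) (result 1))
    (if (null? stack)
        result
        (let ((frame (car stack)) (rest (cdr stack)))
          (case (car frame)
            ((call)
             (let ((k (cdr frame)))
               (if (= k 0)
                   (loop rest 1)                       ; base case "returns" 1
                   (loop (cons (cons 'call (- k 1))    ; compute fact(k - 1) first,
                               (cons (cons 'return k)  ; then come back and multiply
                                     rest))
                         result))))
            ((return)                                  ; one pending multiplication
             (loop rest (* (cdr frame) result))))))))

(fact-machine 5) evaluates to 120. The same frame-plus-state trick generalises to functions with several recursive calls, at the cost of more states per frame.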

Kinsella answered 31/5, 2009 at 10:1 Comment(0)
14
  • Recursive function execution flow can be represented as a tree.

  • The same logic can be done by a loop, which uses a data-structure to traverse that tree.

  • Depth-first traversal can be done using a stack, breadth-first traversal can be done using a queue.

So, the answer is: yes. Why: https://mcmap.net/q/82465/-which-recursive-functions-cannot-be-rewritten-using-loops-duplicate.

Can any recursion be done in a single loop? Yes, because

a Turing machine does everything it does by executing a single loop:

  1. fetch an instruction,
  2. evaluate it,
  3. goto 1.
Ednaedny answered 23/3, 2013 at 16:17 Comment(0)
11

Yes, using explicitly a stack (but recursion is far more pleasant to read, IMHO).

Indraft answered 31/5, 2009 at 9:52 Comment(1)
I wouldn't say it's always more pleasant to read. Both iteration and recursion have their place.Delldella
7

Yes, it's always possible to write a non-recursive version. The trivial solution is to use a stack data structure and simulate the recursive execution.
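
For the num-ways function from the question, a minimal sketch of that simulation (my own code; the name num-ways-iter is made up) keeps an explicit stack of pending (x . y) subproblems and adds one to a running total for every base case it pops:

(define (num-ways-iter x y)
  ;; the explicit stack plays the role of the call stack: each entry is a
  ;; subproblem that a recursive call would otherwise have handled
  (let loop ((stack (list (cons x y))) (total 0))
    (if (null? stack)
        total
        (let* ((top (car stack))
               (rest (cdr stack))
               (a (car top))
               (b (cdr top)))
          (if (or (= a 0) (= b 0))
              (loop rest (+ total 1))                 ; base case contributes 1
              (loop (cons (cons (- a 1) b)            ; push both subproblems
                          (cons (cons a (- b 1)) rest))
                    total))))))

Note that this does exactly as much (exponential) work as the recursive version; it only replaces the implicit call stack with an explicit one.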

Abroach answered 2/11, 2009 at 16:52 Comment(2)
Which either defeats the purpose if your stack data structure is allocated on the stack, or takes way longer if it's allocated on the heap, no? That sounds trivial but inefficient to me.Apparently
@conradk In some cases, it's the practical thing to do if you must perform some tree-recursive operation on a problem that is sufficiently large to exhaust the call stack; heap memory is typically much more plentiful.Taciturnity
4

In principle it is always possible to remove recursion and replace it with iteration in a language that has infinite state both for data structures and for the call stack. This is a basic consequence of the Church-Turing thesis.

Given an actual programming language, the answer is not as obvious. The problem is that it is quite possible to have a language where the amount of memory that can be allocated in the program is limited but where the amount of call stack that can be used is unbounded (32-bit C where the address of stack variables is not accessible). In this case, recursion is more powerful simply because it has more memory it can use; there is not enough explicitly allocatable memory to emulate the call stack. For a detailed discussion on this, see this discussion.

Jamesy answered 8/7, 2009 at 7:47 Comment(0)
3

All computable functions can be computed by Turing machines, and hence recursive systems and Turing machines (iterative systems) are equivalent.

Shrivel answered 21/5, 2013 at 11:10 Comment(0)
1

Sometimes replacing recursion is much easier than that. Recursion used to be the fashionable thing taught in CS in the 1990s, and so a lot of average developers from that time figured that if you solved something with recursion, it was a better solution. So they would use recursion instead of looping backwards to reverse order, or silly things like that. So sometimes removing recursion is a simple "duh, that was obvious" type of exercise.

This is less of a problem now, as the fashion has shifted towards other technologies.

Swaim answered 31/5, 2009 at 11:23 Comment(0)
1

Recursion is nothing but calling the same function on the stack, and once the function returns it is removed from the stack. So one can always use an explicit stack to manage this repetition of the same operation using iteration. So yes, all recursive code can be converted to iteration.

Gabriel answered 29/7, 2021 at 19:49 Comment(0)
0

Removing recursion is a complex problem and is feasible under well-defined circumstances.

The cases below are among the easy ones:

Lamination answered 31/5, 2009 at 10:1 Comment(0)
0

Apart from the explicit stack, another pattern for converting recursion into iteration is the use of a trampoline.

Here, the functions either return the final result or a closure of the function call that they would otherwise have performed. Then, the initiating (trampolining) function keeps invoking the closures returned until the final result is reached.

This approach works for mutually recursive functions, but I'm afraid it only works for tail calls.

http://en.wikipedia.org/wiki/Trampoline_(computers)
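
A rough sketch of the pattern (my own code; Scheme already guarantees proper tail calls, so this is purely illustrative, and the names trampoline, my-even? and my-odd? are made up):

(define (trampoline thunk)
  ;; keep invoking returned thunks until something that is not a procedure
  ;; (the final result) comes back
  (let loop ((result (thunk)))
    (if (procedure? result)
        (loop (result))
        result)))

(define (my-even? n)
  (if (= n 0) #t (lambda () (my-odd? (- n 1)))))

(define (my-odd? n)
  (if (= n 0) #f (lambda () (my-even? (- n 1)))))

;; (trampoline (lambda () (my-even? 100000)))  => #t, with bounded stack use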

Imperative answered 31/5, 2009 at 10:17 Comment(0)
0

I'd say yes - a function call is nothing but a goto and a stack operation (roughly speaking). All you need to do is imitate the stack that's built while invoking functions and do something similar to a goto (you can imitate gotos even in languages that don't explicitly have this keyword).

Prevost answered 2/11, 2009 at 16:52 Comment(1)
I think the OP is looking for a proof or something else substantiveParboil
0

Have a look at the following entries on Wikipedia; you can use them as a starting point to find a complete answer to your question.

Here is a paragraph that may give you a hint on where to start:

Solving a recurrence relation means obtaining a closed-form solution: a non-recursive function of n.

Also have a look at the last paragraph of this entry.

Dumfries answered 2/11, 2009 at 17:5 Comment(0)
0

It is possible to convert any recursive algorithm to a non-recursive one, but often the logic is much more complex and doing so requires the use of a stack. In fact, recursion itself uses a stack: the function stack.

More Details: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Functions

Addia answered 25/1, 2016 at 14:41 Comment(0)
-2

tazzego, recursion means that a function will call itself whether you like it or not. When people are talking about whether or not things can be done without recursion, they mean this and you cannot say "no, that is not true, because I do not agree with the definition of recursion" as a valid statement.

With that in mind, just about everything else you say is nonsense. The only other thing that you say that is not nonsense is the idea that you cannot imagine programming without a call stack. That is something that had been done for decades until using a call stack became popular. Old versions of FORTRAN lacked a call stack and they worked just fine.

By the way, there exist Turing-complete languages that only implement recursion (e.g. SML) as a means of looping. There also exist Turing-complete languages that only implement iteration as a means of looping (e.g. FORTRAN IV). The Church-Turing thesis proves that anything possible in a recursion-only language can be done in a non-recursive language and vice versa, by the fact that they both have the property of Turing-completeness.

Corazoncorban answered 6/5, 2010 at 10:59 Comment(1)
The Church-Turing thesis is an informal hypothesis that anything that can be computed by any kind of algorithm, including kinds that haven't been discovered or invented yet, can be computed by a Turing machine. Since there is no formal definition of "any kind of algorithm", the C-T thesis is not a mathematical theorem. What is a theorem (proven by Church and Turing) is the equivalence between Turing machines and Church's lambda calculus.Catalogue
-3

Here is an iterative algorithm:

def howmany(x,y)
  # Fill in a table bottom-up (dynamic programming) instead of recursing:
  # a[[m,k]] holds num-ways(m,k), computed in order of increasing m+k,
  # so both values needed on the right-hand side already exist.
  a = {}
  for n in (0..x+y)
    for m in (0..n)
      a[[m,n-m]] = if m==0 or n-m==0 then 1 else a[[m-1,n-m]] + a[[m,n-m-1]] end
    end
  end
  return a[[x,y]]
end
Syntactics answered 31/5, 2009 at 10:43 Comment(0)
