Is (pure) functional programming antagonistic with "algorithm classics"?
A

4

31

The classic algorithm books (TAOCP, CLR) (and not-so-classic ones, such as the fxtbook) are full of imperative algorithms. This is most obvious with algorithms whose implementations rely heavily on arrays, such as combinatorial generation (where both the array index and the array value are used in the algorithm) or the union-find algorithm.

The worst-case complexity analysis of these algorithms depends on array accesses being O(1). If you replace arrays with array-ish persistent structures, as Clojure does, array accesses are no longer O(1), and the complexity analysis of those algorithms is no longer valid.
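To make the concern concrete (my own illustration, not from any of the books): in Haskell, `Data.Sequence` from the containers library is such a persistent array-like structure; `index` and `update` are O(log n) rather than O(1), though the pre-update version of the sequence remains valid afterwards. The helper `bump` below is a hypothetical name for the sketch:

```haskell
import qualified Data.Sequence as Seq
import Data.Sequence (Seq, index, update)

-- Increment the element at position i. Both index and update on a
-- finger-tree-backed Seq are O(log n), not O(1) as for a raw array,
-- but the original sequence s is still usable after the "update".
bump :: Int -> Seq Int -> Seq Int
bump i s = update i (index s i + 1) s
```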

Which brings me to the following question: is pure functional programming incompatible with the classical algorithms literature?

Arianna answered 1/8, 2011 at 12:23 Comment(3)
You may be interested in this related question: #1990964Sower
The only algorithms that are really annoying to do purely are the array and graph ones, imo.Shontashoo
@Matthias: write your comment as an answer, and I will mark it as such. Even though it is just a link, it actually does answer my question and gives useful references.Arianna
S
2

You may be interested in this related question: Efficiency of purely functional programming.

is there any problem for which the best known non-destructive algorithm is asymptotically worse than the best known destructive algorithm, and if so by how much?

Sower answered 2/8, 2011 at 9:11 Comment(0)
Z
8

With respect to data structures, Chris Okasaki has done substantial research into adapting classic data structures to a purely functional setting, since many of the standard structures no longer work once destructive updates are ruled out. His book "Purely Functional Data Structures" shows how some structures, like binomial heaps and red-black trees, can be implemented quite well functionally, while other basic structures like arrays and queues must be implemented with more elaborate concepts.
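As one small taste of the techniques in that book (a sketch from memory, not verbatim from Okasaki), here is the classic two-list "batched" queue: enqueue conses onto the rear list, and the rear is reversed onto the front only when the front runs out, giving amortized O(1) operations without mutation:

```haskell
-- A purely functional FIFO queue as a pair of lists (front, rear).
data Queue a = Queue [a] [a]

emptyQ :: Queue a
emptyQ = Queue [] []

-- Enqueue is a cons onto the rear list.
enqueue :: a -> Queue a -> Queue a
enqueue x (Queue front rear) = check front (x : rear)

-- Dequeue pops the head of the front list.
dequeue :: Queue a -> Maybe (a, Queue a)
dequeue (Queue [] _)          = Nothing
dequeue (Queue (x:front) rear) = Just (x, check front rear)

-- Invariant: the front list is empty only if the whole queue is.
-- Restoring it with reverse is what makes the amortized cost O(1).
check :: [a] -> [a] -> Queue a
check [] rear    = Queue (reverse rear) []
check front rear = Queue front rear
```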

If you're interested in pursuing this branch of core algorithms, his book is an excellent starting point.

Zora answered 2/8, 2011 at 5:29 Comment(0)
S
8

The short answer is that, so long as an algorithm has no effects that can be observed after it finishes (other than what it returns), it is pure. This holds even when it internally does things like destructive array updates or mutation.

Suppose you had an algorithm like, say:

function zero(array):
    ix <- 0
    while (ix < length(array)):
        array[ix] <- 0
        ix <- ix + 1
    return array

Assuming our pseudocode above is lexically scoped, then so long as the array parameter is first copied and the returned array is a wholly new object, this algorithm represents a pure function (in this case, the Haskell function fmap (const 0) would do the same job). Most "imperative" algorithms shown in books are really pure functions, and it is perfectly fine to write them that way in a purely functional setting using something like the ST monad.
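As a sketch of that last point (my illustration; the function name `zeroAll` is made up), the pseudocode above can be written in Haskell with Data.Array.ST: the destructive writes are sealed inside runSTUArray, so the overall function is observably pure:

```haskell
import Data.Array.ST (runSTUArray, newListArray, writeArray, getBounds)
import Data.Array.Unboxed (UArray, elems)

-- A pure function that internally mutates: copy the input into a fresh
-- mutable array, destructively zero every slot, then freeze the result.
-- No mutation is observable from outside runSTUArray.
zeroArr :: [Int] -> UArray Int Int
zeroArr xs = runSTUArray $ do
  arr <- newListArray (0, length xs - 1) xs
  (lo, hi) <- getBounds arr
  mapM_ (\i -> writeArray arr i 0) [lo .. hi]
  return arr

zeroAll :: [Int] -> [Int]
zeroAll = elems . zeroArr
```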

I would recommend looking at Mercury or the Disciplined Disciple Compiler to see pure languages that still thrive on destructive update.

Spectacles answered 2/8, 2011 at 7:28 Comment(0)
M
0

It is not. But it is true that many books present algorithms in a form that looks usable only in imperative languages. The main reason is that pure functional programming was confined to academic use for a long time, so the authors of these algorithms relied heavily on imperative features to stay in the mainstream. Now, consider two widespread algorithms: quicksort and merge sort. Quicksort is more "imperative" than merge sort; one of its advantages is that it works in place. Merge sort is more "pure" than quicksort (in some sense), since it needs to copy its data and keep it persistent. In fact, many algorithms can be implemented in pure functional programming without losing too much efficiency. This is true of many algorithms in the famous Dragon Book, for example.
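For instance, a purely functional merge sort is just a few lines and keeps the O(n log n) bound, at the cost of allocating fresh lists at each merge (a standard textbook sketch, not taken from any particular book):

```haskell
-- Purely functional merge sort: no mutation, O(n log n) comparisons.
mergeSort :: Ord a => [a] -> [a]
mergeSort []  = []
mergeSort [x] = [x]
mergeSort xs  = merge (mergeSort left) (mergeSort right)
  where
    (left, right) = splitAt (length xs `div` 2) xs

-- Merge two sorted lists into one sorted list, stably.
merge :: Ord a => [a] -> [a] -> [a]
merge [] ys = ys
merge xs [] = xs
merge (x:xs) (y:ys)
  | x <= y    = x : merge xs (y:ys)
  | otherwise = y : merge (x:xs) ys
```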

Mendie answered 2/8, 2011 at 0:31 Comment(2)
IMHO, instead of a simple "It is not" (and then explaining why people believe otherwise), you should actually support your reason and give an alternative implementation of, for instance, a hashtable in functional programming. It's easy to claim that FP can do whatever imperative programming can do, but I can't think of any way off the top of my head for achieving O(1) lookup tables without mutation.Duck
I did not claim that all algorithms can be translated from imperative form to pure functional form... Actually I said that some are more "imperative" than "purely functional". What I said is that the algorithms in these books are not all imperative by nature, which means that some may be. So I completely agree with you: some are hard to make as efficient as their imperative versions. That is why I said: many algorithms can be implemented in pure functional programming without losing too much efficiency.Kilmarx
