Replacing functions with Table Lookups
I've been watching this MSDN video with Brian Beckman and I'd like to better understand something he says:

Every imperative programmer goes through this phase of learning that functions can be replaced with table lookups

Now, I'm a C# programmer who never went to university, so perhaps somewhere along the line I missed out on something everyone else learned to understand.

What does Brian mean by:

functions can be replaced with table lookups

Are there practical examples of this being done and does it apply to all functions? He gives the example of the sin function, which I can make sense of, but how do I make sense of this in more general terms?

Quarterage answered 7/12, 2012 at 15:51 Comment(1)
look at Java's tableswitch and lookupswitch bytecode instructions, and at how Clojure hashmaps are functions of their keysSerra

Brian just showed that functions are data too. A function in general is just a mapping from one set to another: y = f(x) maps the set X to the set Y, written f: X -> Y. Tables are mappings as well: [x1, x2, ..., xn] -> [y1, y2, ..., yn].

If a function operates on a finite set (which is the case in programming), then it can be replaced with a table that represents that mapping. As Brian mentioned, every imperative programmer goes through the phase of realizing that functions can be replaced with table lookups, if only for performance reasons.

But that doesn't mean every function can or should easily be replaced with a table; it only means that, in theory, you can do it for any function. So the conclusion is that functions are data, because tables are (in the context of programming, of course).
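To make this concrete, here is a hypothetical JavaScript sketch (not from the answer) of a classic instance of the trick: a bit-counting function over the finite domain 0..255, replaced by a precomputed table.

```javascript
// countBits(n): number of 1-bits in an 8-bit value, computed by looping.
function countBits(n) {
  let c = 0;
  for (; n; n >>= 1) c += n & 1;
  return c;
}

// The same function as data: precompute all 256 answers once...
const bitCountTable = Array.from({ length: 256 }, (_, n) => countBits(n));

// ...so the "function" becomes a single array index.
const countBitsFast = n => bitCountTable[n];
```

Both forms represent the same mapping; the table just stores the mapping explicitly instead of recomputing it on every call.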

Metage answered 7/12, 2012 at 17:33 Comment(4)
Thanks @Metage - That makes things a lot clearer. As always, what's being pointed to is much simpler than my mind would allow me to see :)Quarterage
It should be noted that only finite functions can be replaced by a table, since tables are necessarily finite, whereas functions tend to be infinite, i.e. have an infinite domain. However a common technique for computationally expensive functions is to back the function with an ever-growing table of remembered values, so that a result only has to be computed once for each input--this is called memoization.Clip
@pelotom it's possible to build finite interpolation functions-in-tables that approximate the original victim function well. An example of this is co-locating NURBs (Non-Uniform Rational BSplines). I've had some success at moderately generalized auto-interpolation schemes that optimize continuous functions on-the-fly.Maggs
"Every imperative programmer goes through this phase of learning that functions can be replaced with table lookups". Is this something these learning programmers do as a purely mental exercise, or can you actually write code to perform such a lookup, and thereby see the increased performance? If so, there must be tons of examples in various programming or computer science textbooks, no?Selfesteem

There is a lovely trick in Mathematica that creates a table as a side-effect of evaluating function-calls-as-rewrite-rules. Consider the classic slow Fibonacci:

fib[1] = 1
fib[2] = 1
fib[n_] := fib[n-1] + fib[n-2]

The first two lines create table entries for the inputs 1 and 2. This is exactly the same as saying

var fibTable = {};
fibTable[1] = 1;
fibTable[2] = 1;

in JavaScript. The third line of Mathematica says "please install a rewrite rule that will replace any occurrence of fib[n_], after substituting the pattern variable n_ with the actual argument of the occurrence, with fib[n-1] + fib[n-2]." The rewriter will iterate this procedure, and eventually produce the value of fib[n] after an exponential number of rewrites. This is just like the recursive function-call form that we get in JavaScript with

function fib(n) {
  var result = fibTable[n] || ( fib(n-1) + fib(n-2) );
  return result;
}

Notice it checks the table first for the two values we have explicitly stored before making the recursive calls. The Mathematica evaluator does this check automatically, because the order of presentation of the rules is important -- Mathematica checks the more specific rules first and the more general rules later. That's why Mathematica has two assignment forms, = and :=. The former is for specific rules whose right-hand sides can be evaluated at the time the rule is defined; the latter is for general rules whose right-hand sides must be evaluated when the rule is applied.

Now, in Mathematica, if we say

fib[4]

it gets rewritten to

fib[3] + fib[2]

then to

fib[2] + fib[1] + 1

then to

1 + 1 + 1

and finally to 3, which does not change on the next rewrite. You can imagine that if we say fib[35], we will generate enormous expressions, fill up memory, and melt the CPU. But the trick is to replace the final rewrite rule with the following:

fib[n_] := fib[n] = fib[n-1] + fib[n-2]

This says "please replace every occurrence of fib[n_] with an expression that will install a new specific rule for the value of fib[n] and also produce the value." This one runs much faster because it expands the rule-base -- the table of values! -- at run time.

We can do likewise in JavaScript

function fib(n) {
  var result = fibTable[n] || ( fib(n-1) + fib(n-2) );
  fibTable[n] = result;
  return result;
}

This runs MUCH faster than the prior definition of fib.

This is called "automemoization" [sic -- not "memorization" but "memoization" as in creating a memo for yourself].
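The trick generalizes beyond fib. As a sketch (a hypothetical helper, not part of the original answer), any one-argument function can be backed by a table of remembered results with a wrapper:

```javascript
// Automemoization as a reusable wrapper: back any one-argument
// function with a table of values that grows at run time.
function memoize(f) {
  const table = {};                 // the run-time "rule base" of stored results
  return function (n) {
    if (!(n in table)) table[n] = f(n);   // compute once per distinct input...
    return table[n];                      // ...then every later call is a lookup
  };
}

// Memoized Fibonacci: each fib(k) is computed at most once.
const fib = memoize(n => (n <= 2 ? 1 : fib(n - 1) + fib(n - 2)));

console.log(fib(35)); // 9227465 -- about 35 table entries instead of ~2^35 calls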

Of course, in the real world, you must manage the sizes of the tables that get created. To inspect the tables in Mathematica, do

DownValues[fib]

To inspect them in JavaScript, do just

fibTable

in a REPL such as the one provided by Node.js.
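One simple way to "manage the sizes of the tables" is to cap the number of entries. Here is a hypothetical sketch (not from the answer) using a crude evict-the-oldest policy; a production cache would more likely use LRU:

```javascript
// Hypothetical sketch: a memo table with a size cap. A JavaScript Map
// iterates keys in insertion order, so deleting the first key evicts
// the oldest entry (a simple FIFO policy, not true LRU).
function memoizeBounded(f, maxEntries) {
  const table = new Map();
  return function (n) {
    if (table.has(n)) return table.get(n);
    const result = f(n);
    if (table.size >= maxEntries) {
      table.delete(table.keys().next().value); // evict the oldest entry
    }
    table.set(n, result);
    return result;
  };
}

const square = memoizeBounded(n => n * n, 100); // at most 100 cached results
```

The cap trades repeated computation for bounded memory: evicted inputs are simply recomputed on their next call.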

Maggs answered 7/12, 2012 at 23:11 Comment(2)
BTW, this is Brian Beckman: "rebcabin" is an anagram of "brianbec" and is my usual nom-de-internet.Maggs
For me it seems like what you're talking about is cachingAssimilable

In the context of functional programming, there is the concept of referential transparency. A function that is referentially transparent can be replaced with its value for any given argument (or set of arguments), without changing the behaviour of the program.

Referential Transparency

For example, consider a function F that takes 1 argument, n. F is referentially transparent, so F(n) can be replaced with the value of F evaluated at n. It makes no difference to the program.

In C#, this would look like:

public class Square
{
    public static int apply(int n)
    {
        return n * n;
    }

    public static void Main()
    {
        //Should print 4
        Console.WriteLine(Square.apply(2));
    }
}

(I'm not very familiar with C#, coming from a Java background, so you'll have to forgive me if this example isn't quite syntactically correct).

It's obvious here that the function apply cannot return any value other than 4 when called with an argument of 2, since it just returns the square of its argument. The value of the function depends only on its argument, n; in other words, it is referentially transparent.

I ask you, then, what the difference is between Console.WriteLine(Square.apply(2)) and Console.WriteLine(4). The answer is, there's no difference at all, for all intents and purposes. We could go through the entire program, replacing every instance of Square.apply(n) with the value returned by Square.apply(n), and the results would be exactly the same.

So what did Brian Beckman mean with his statement about replacing function calls with a table lookup? He was referring to this property of referentially transparent functions. If Square.apply(2) can be replaced with 4 with no impact on program behaviour, then why not cache the value the first time the call is made, and put it in a table indexed by the function's arguments? A lookup table for values of Square.apply(n) would look somewhat like this:

              n: 0 1 2 3 4  5  ...
Square.apply(n): 0 1 4 9 16 25 ...

And for any call to Square.apply(n), instead of calling the function, we can simply find the cached value for n in the table, and replace the function call with this value. For functions that are expensive to compute, this can bring a large speed increase.
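The table above can be built directly in code. A minimal sketch (in JavaScript rather than C#, for brevity; the names are illustrative, not from the answer):

```javascript
// Precompute the lookup table for squares of 0..5 up front, exactly
// mirroring the two-row table shown above.
const squareTable = [];
for (let n = 0; n <= 5; n++) {
  squareTable[n] = n * n;
}

// squareTable is now [0, 1, 4, 9, 16, 25]: any call to the squaring
// function with n in this range can be replaced by squareTable[n].
```

Referential transparency is what makes the substitution safe: since the function's result depends only on n, the stored value and a fresh call are interchangeable everywhere.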

Intercolumniation answered 7/12, 2012 at 17:49 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.