Reduce the number of operations on a simple expression
Let's say I have a computation that involves only addition and multiplication:

(a+b)*(c+d)

which can be done in many other ways, e.g.

a*(c+d) + b*(c+d)
a*c + a*d + b*c + b*d

In terms of (additions, multiplications), the three examples above require (2, 1), (3, 2), and (3, 4) operations respectively. Clearly, if the goal is to reduce the total number of operations, the first is superior. Given an arbitrary expression, is there a way to find the computation order that requires the fewest operations?
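For what it's worth, these counts can be checked mechanically. Here is a minimal sketch in Python (the nested-tuple encoding and the count_ops helper are my own, purely illustrative):

```python
def count_ops(expr):
    """Return (additions, multiplications) needed to evaluate an
    expression tree given as nested tuples ('op', left, right)."""
    if not isinstance(expr, tuple):
        return (0, 0)  # a bare variable costs nothing
    op, left, right = expr
    la, lm = count_ops(left)
    ra, rm = count_ops(right)
    return (la + ra + (op == '+'), lm + rm + (op == '*'))

# (a+b)*(c+d)
factored = ('*', ('+', 'a', 'b'), ('+', 'c', 'd'))
# a*c + a*d + b*c + b*d
expanded = ('+', ('+', ('*', 'a', 'c'), ('*', 'a', 'd')),
                 ('+', ('*', 'b', 'c'), ('*', 'b', 'd')))

print(count_ops(factored))   # (2, 1)
print(count_ops(expanded))   # (3, 4)
```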

Note: This question is being re-asked from SE.math for the insight and perspective of the CS crowd.

Mcarthur answered 7/12, 2010 at 21:12 Comment(2)
Given the difficulties found in optimizing matrix multiplication, I strongly suspect that this is an unsolved problem.Continue
If the goal is really to reduce the total number of operations then why would you use a computer to figure it out? I'm just sayin'.Chrisman
What you want is to effectively generate all possible equivalent algebraic expressions and choose the least costly one (adding X three times is cheaper on most machines than multiplying X by 3).

It is impractical to do this, as the number of "equivalent" formulas is infinite.

However, Pelegrí-Llopart figured out a scheme to generate optimal code given a fixed number of algebraic rewrite rules, called "BURS" (bottom-up rewrite system). This has been implemented in some code generators.

In essence, he constructs, offline, a large automaton whose states track the sets of possible applied rewrites. Each state knows what code to emit when it occurs, so the online cost of code generation is low.
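As a rough illustration of the bottom-up idea only (a toy with made-up operation costs, not Pelegrí-Llopart's actual tables or algorithm): each node of the expression tree is labelled with the cheapest way to compute it, considering a rewrite rule that turns multiplication by a small constant into repeated addition.

```python
ADD_COST, MUL_COST = 1, 4  # illustrative machine costs, not real ones

def best_cost(expr):
    """Bottom-up labelling: return the cheapest cost of computing expr,
    where expr is a variable name (str), an int, or ('op', left, right)."""
    if not isinstance(expr, tuple):
        return 0  # leaves (variables, constants) are free
    op, left, right = expr
    if op == '+':
        return best_cost(left) + best_cost(right) + ADD_COST
    # op == '*': a toy rewrite rule -- x * k can become x + x + ... + x,
    # which costs (k - 1) additions; pick whichever is cheaper
    if isinstance(right, int) and right >= 2:
        return best_cost(left) + min(MUL_COST, (right - 1) * ADD_COST)
    if isinstance(left, int) and left >= 2:
        return best_cost(right) + min(MUL_COST, (left - 1) * ADD_COST)
    return best_cost(left) + best_cost(right) + MUL_COST

print(best_cost(('*', 'x', 3)))  # 2: x+x+x beats one multiply
print(best_cost(('*', 'x', 9)))  # 4: here a single multiply wins
```

The real BURS construction precomputes all of this pattern matching into automaton states, so none of the min-cost search happens at code-generation time.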

Nannana answered 22/12, 2010 at 10:53 Comment(1)
This is what I was looking for! I guess it's unsurprising that the answer came compiler optimization tricks.Mcarthur
Ignoring powers of variables and integer coefficients, this reduces to a Karnaugh Map problem.

K-Maps can be expressed in sum-of-products form or product-of-sums form, each of which corresponds to a binary circuit. A "fewest operations" form would correspond to a minimized binary circuit, right?
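To make the correspondence concrete, a quick exhaustive check (a sketch; the min(1, ·) clamp is needed because arithmetic + can yield 2, while boolean OR cannot):

```python
from itertools import product

# Over {0,1}, * behaves like AND, and + behaves like OR once clamped to 1.
# Exhaustively compare (a+b)*(c+d) with (a OR b) AND (c OR d).
matches = all(
    min(1, a + b) * min(1, c + d) == int((a or b) and (c or d))
    for a, b, c, d in product((0, 1), repeat=4)
)
print(matches)  # True
```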

Goldstein answered 22/12, 2010 at 0:22 Comment(5)
I don't see the connection. Could you add some details?Heinrich
My point is that since circuit minimization problems are believed to be intractable [Wikipedia] that no procedural program will find an optimal solution in polynomial time... The binary expression (a+b)*(c+d) evaluates to TRUE if (either a OR b is TRUE) AND (either c OR d is TRUE). In this sense arithmetic which only involves + (OR) and * (AND) where each variable is either 0 or 1 represents a binary circuit.Goldstein
But, you can precompute optimal for specific sets offline, and apply that answer online. See my answer.Nannana
Wow, I'm surprised that such optimizations have already been implemented given the computation power and memory space required for this approach.Goldstein
Here intractable is a technical term meaning that if you come up with a solution that scales better than exponentially, all mathematicians on the planet will be more surprised than if God looked down from the sky and spoke to them. (Or, as mathematicians like to put it, while grinning, "it's still an open question whether this is possible or not".) But small instances can be solved simply, no problem.Doubletree
Horner's rule gives an efficient way to evaluate polynomials in monomial form. Given a polynomial of degree n

p(x) = a_n x^n + a_(n-1) x^(n-1) + ... + a_1 x + a_0

only n multiplications are needed (not n + (n-1) + (n-2) + ... + 1 = n(n+1)/2, as it may seem at first glance). That is because the polynomial can be rewritten as

p(x) = (...((a_n x + a_(n-1)) x + a_(n-2)) x + ... + a_1) x + a_0
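A sketch of that evaluation order as a loop (coefficients listed from a_n down to a_0):

```python
def horner(coeffs, x):
    """Evaluate a polynomial via Horner's rule.
    coeffs = [a_n, a_(n-1), ..., a_1, a_0]; uses n multiplications
    and n additions for a degree-n polynomial."""
    result = 0
    for a in coeffs:
        result = result * x + a
    return result

# p(x) = 2x^2 + 3x + 4 at x = 5: (2*5 + 3)*5 + 4 = 69
print(horner([2, 3, 4], 5))  # 69
```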

Heinrich answered 20/12, 2010 at 11:8 Comment(0)
One idea: treat the variables as boolean values and write the expression in maxterm (product-of-sums) form.

Ancel answered 21/12, 2010 at 9:32 Comment(1)
Do both the SumOfProduct and ProductOfSum and see which is shorter.Karame
Not sure about the general case, but factoring polynomials does seem to improve performance. An example from a distant comp sci course:

a * x^2 + b * x + c

is improved by factoring out x:

x * (a * x + b) + c
Trimorphism answered 7/12, 2010 at 21:33 Comment(2)
Clearly factoring out a polynomial is a good heuristic, however I am interested in the best possible case, making this a well-defined (if hard?) minimization problem.Mcarthur
Yes, I see. Your example and mine both give the best possible case by factoring to death. Another example: abc+bcd+ade; factors three ways: bc(a+d)+ade, a(bc+de)+bcd, d(bc+ae)+abc. Clearly more is needed. Assuming this is the right direction, I would brute force the optimization. Perhaps another will propose a more elegant solution.Trimorphism
