The complexity of the multiplication of two lower triangular matrices
I know that the lower bound for the multiplication of two full matrices is Ω(n^2) (see: Matrix multiplication).

I have been trying to prove a lower bound for the multiplication of two lower triangular matrices using the problem-transformation (reduction) method.

My initial plan is to (1) transform the lower triangular matrix into a full matrix, and (2) estimate the time complexity of that transformation.

T(lower_triangular_matrix_multiplication(n)) + O(lower_triangular_matrix_transformation(n)) > Ω(full_matrix_multiplication(n)) = Ω(n^2)

Now I only have to bound O(lower_triangular_matrix_transformation(n)). I need to turn the triangular matrix into a full matrix, so for simplicity I multiply the triangular matrix by a variation of itself, say its transpose.

The reason is that the square of a lower triangular matrix is still a lower triangular matrix, whereas a lower triangular matrix multiplied by its transpose is a "full" matrix.

So I only need to analyze the complexity of multiplying a triangular matrix by its transpose.
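A quick numerical check of this claim (a sketch using NumPy with an arbitrary 3×3 example, not part of the original question):

```python
import numpy as np

# A lower triangular matrix (example values).
A = np.tril(np.arange(1.0, 10.0).reshape(3, 3))

# The square of a lower triangular matrix is still lower triangular...
sq = A @ A
assert np.allclose(sq, np.tril(sq))

# ...but A times its transpose is, in general, a "full" (dense) matrix.
full = A @ A.T
print(full)
```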

Could anyone indicate if my thinking is "reasonable"?

Caliginous answered 11/3, 2016 at 9:46 Comment(1)
Just a thought, is this not better suited for the Mathematics StackExchange? This really is a math question, and not a coding/programming question, although it is algorithmic. – Ictinus
I'm not convinced that constructing an algorithm is sufficient proof of a lower bound, since it does not rule out the existence of a different algorithm with lower complexity.

It is clear that the lower bound will not be higher than O(n^2), as general matrix multiplication is always applicable to lower triangular matrices (ltm).

Now, since any transformation of an arbitrary matrix into one or more ltm is itself an operation of O(n^2) complexity, we may not deduce that no such algorithm exists.

Your outline of reasoning seems to assume that complexities add following "normal" arithmetic rules. This is not the case!
The resulting complexity is always (at least) the maximum of the complexities of the algorithm's parts.
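As a toy illustration of this rule (my own operation count, not from the answer): an O(n^2) transformation followed by an O(n^3) naive multiplication costs n^2 + n^3 operations in total, which is Θ(n^3), i.e. the maximum of the two parts dominates the sum:

```python
def total_ops(n):
    """Operation count for a hypothetical pipeline: an O(n^2)
    transformation followed by an O(n^3) schoolbook multiplication."""
    transform_ops = n * n        # e.g. transposing/copying every entry
    multiply_ops = n * n * n     # e.g. naive matrix multiplication
    return transform_ops + multiply_ops

# The sum stays within a constant factor of the larger term alone,
# and the ratio tends to 1 as n grows.
for n in (10, 100, 1000):
    print(n, total_ops(n) / n**3)
```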

So your reasoning does not seem to be sound.

You could try one of the following:

  1. Construct an algorithm with lower complexity (proof by existence),
    when the target is O(ltm) < O(n^2).
  2. Find a transformation that has complexity < O(n^2) and that yields ltm. Then any algorithm for ltm multiplication with lower complexity would provide an algorithm for arbitrary matrices with complexity < O(n^2). This would contradict your initial proposition.
    This, however, requires a transformation of sufficiently low complexity. If none is known, the argument is not usable.
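For comparison, there is a classic embedding in the spirit of option 2 (added here for illustration; it is not part of the question or answer): pack two arbitrary n×n matrices A and B into one 3n×3n lower triangular matrix T whose square contains the product BA. Building T only copies O(n^2) entries, so any fast routine for squaring lower triangular matrices would yield a comparably fast routine for arbitrary matrix products:

```python
import numpy as np

def product_via_triangular(A, B):
    """Embed arbitrary n x n matrices A and B into a 3n x 3n lower
    triangular matrix T; the bottom-left block of T @ T equals B @ A."""
    n = A.shape[0]
    T = np.zeros((3 * n, 3 * n))
    T[n:2*n, 0:n] = A             # block (2, 1)
    T[2*n:3*n, n:2*n] = B         # block (3, 2)
    assert np.allclose(T, np.tril(T))  # T really is lower triangular
    T2 = T @ T                    # squaring a lower triangular matrix
    return T2[2*n:3*n, 0:n]       # block (3, 1) of T^2 is B @ A

rng = np.random.default_rng(0)
A, B = rng.random((4, 4)), rng.random((4, 4))
print(np.allclose(product_via_triangular(A, B), B @ A))  # True
```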

To me it looks as if the step T() + O() > O(n^2) is not well grounded, and from this the conclusion that you only have to prove O(ltm) breaks down.

Summertree answered 21/3, 2016 at 13:37 Comment(0)
I was thinking that the solution might be to transform A into A + A'. Both the transpose and the addition are O(n^2) operations, so O(lower_triangular_matrix_transformation(n)) = n^2. Because the lower bounds of AA and of (A + A')(A + A') are also Ω(n^2), we get T(lower_triangular_matrix_multiplication(n)) > Ω(full_matrix_multiplication(n)) − O(lower_triangular_matrix_transformation(n)), which means that the lower bound for triangular matrix multiplication and the one for full matrix multiplication are identical.

Q.E.D.
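The transformation in this answer can be sketched as follows (a NumPy illustration of the claim that forming A + A' touches each of the n^2 entries a constant number of times and yields a full symmetric matrix; the example values are mine):

```python
import numpy as np

# A lower triangular matrix with nonzero entries (example values).
A = np.tril(np.arange(1.0, 10.0).reshape(3, 3))

# Transpose plus addition: each step visits every one of the n^2
# entries once, so the whole transformation is O(n^2).
S = A + A.T

print(S)                    # a full, symmetric matrix
assert np.allclose(S, S.T)  # symmetry check
```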

Caliginous answered 15/3, 2016 at 8:10 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.