Complexity of Matrix Multiplication


Today’s post is about the following question and related problems.

Given an undirected graph G = (V, E), how fast can we detect whether G is triangle-free?

Cubic time is obvious. A faster algorithm computes the square of the adjacency matrix of G and runs in time O(n^{\omega}), where \omega is the matrix multiplication exponent. This simple algorithm is the best known!
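The squaring algorithm above can be sketched as follows: G contains a triangle iff some edge (i, j) has a common neighbor, i.e. iff A[i][j] = 1 and (A^2)[i][j] > 0. A minimal sketch (the function name and example graph are illustrative, not from the original post):

```python
# Triangle detection via matrix squaring: a minimal sketch.
import numpy as np

def has_triangle(adj: np.ndarray) -> bool:
    """adj: symmetric 0/1 adjacency matrix of an undirected graph."""
    sq = adj @ adj  # (A^2)[i][j] = number of common neighbors of i and j
    # A triangle exists iff some edge's endpoints share a neighbor.
    return bool(np.any((adj == 1) & (sq > 0)))

# A triangle on vertices 0, 1, 2, plus a pendant vertex 3.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
])
print(has_triangle(A))  # True
```

Equivalently, the number of triangles is trace(A^3)/6; either way, the running time is dominated by one matrix product.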

Is testing triangle-freeness as hard as the Boolean multiplication of two |V| × |V| matrices?

See [AKKR’06] for recent results. A related problem is all-pairs shortest paths (APSP), a fundamental graph problem whose exact complexity is still open.

Is APSP (for undirected graphs) as difficult as the Boolean multiplication of two |V| × |V| matrices?
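The connection between APSP and matrix multiplication runs through the min-plus (tropical) product: if D is the weighted adjacency matrix (0 on the diagonal, infinity for non-edges), then the (n−1)-th min-plus power of D holds all shortest-path distances, so APSP reduces to O(log n) min-plus products. A minimal sketch (function names are illustrative):

```python
# APSP via repeated min-plus squaring: a minimal sketch.
INF = float("inf")

def min_plus(X, Y):
    """Min-plus (tropical) matrix product: Z[i][j] = min_k X[i][k] + Y[k][j]."""
    n = len(X)
    return [[min(X[i][k] + Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def apsp(D):
    """All-pairs shortest paths by squaring D in the min-plus semiring."""
    n = len(D)
    R, power = D, 1
    while power < n - 1:
        R = min_plus(R, R)  # paths of length <= 2 * power
        power *= 2
    return R

# Path 0 - 1 - 2 with unit edge weights.
D = [[0, 1, INF],
     [1, 0, 1],
     [INF, 1, 0]]
print(apsp(D))  # [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
```

No truly subcubic algorithm is known for the min-plus product itself, which is exactly why the complexity of APSP remains open.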

These problems are tightly related to matrix multiplication, whose exact complexity is also open. The best known upper bound on \omega is 2.376, due to Coppersmith and Winograd. At the Barriers Workshop, Chris Umans presented an exciting group-theoretic approach [CU’03, CKSU’05] to improving \omega. The best value of \omega achieved with their techniques so far is 2.41. In their paper they made two conjectures, one combinatorial and one algebraic; either would imply that the exponent of matrix multiplication is 2.

Is \omega = 2?

That would be a surprise in mathematics and theory. Many well-known problems reduce to matrix multiplication, e.g., computing the determinant, LUP decomposition, matrix inversion, and many other problems in linear algebra.

The techniques that led to 2.376 are primarily based on divide and conquer. The approach of [CU’03, CKSU’05] is different and worth pursuing. Have any other techniques been explored? If you know of any, please leave a comment, even if they resulted in a modest bound on \omega.
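The prototype of the divide-and-conquer line is Strassen's algorithm, which multiplies 2×2 blocks with 7 recursive products instead of 8, giving exponent log2(7) ≈ 2.807. A minimal sketch for n a power of 2 (the cutoff to the classical product is an illustrative implementation choice):

```python
# Strassen's divide-and-conquer matrix multiplication: a minimal sketch.
import numpy as np

def strassen(A, B, cutoff=32):
    """Multiply square matrices of size n = 2^k using 7 recursive products."""
    n = A.shape[0]
    if n <= cutoff:
        return A @ B  # fall back to the classical product on small blocks
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Strassen's 7 products of half-size matrices.
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

rng = np.random.default_rng(0)
A = rng.integers(0, 10, (64, 64))
B = rng.integers(0, 10, (64, 64))
print(np.array_equal(strassen(A, B), A @ B))  # True
```

The later improvements down to 2.376 refine this idea with far more intricate bilinear constructions, while the group-theoretic approach embeds matrix multiplication into group algebras instead.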


References:

  • [AKKR’06] N. Alon, T. Kaufman, M. Krivelevich, and D. Ron. Testing triangle-freeness in general graphs. Proceedings of the 17th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 279–288, 2006.
  • [CKSU’05] Henry Cohn, Robert D. Kleinberg, Balázs Szegedy, Christopher Umans. Group-theoretic Algorithms for Matrix Multiplication. FOCS 2005: 379-388
  • [CU’03] Henry Cohn, Christopher Umans: A Group-Theoretic Approach to Fast Matrix Multiplication. FOCS 2003: 438-449

Comments:

  1. [AKKR’06] deals with testing triangle-freeness in the property-testing model, that is, distinguishing triangle-free graphs from those far from triangle-free. I don’t believe any connection to matrix multiplication is known in this setting.
