My Mathematical Journey: The Alternating Sign Matrix Theorem

By: David Bressoud @dbressoud


David Bressoud is DeWitt Wallace Professor Emeritus at Macalester College and former Director of the Conference Board of the Mathematical Sciences

Problem: Prove the Vandermonde determinant formula, which says that the determinant of the matrix whose entry in row i, column j is (x_j)^{n–i} is the product of (x_i – x_j), taken over all pairs i < j (see Figure 1).


By late fall of 1978 I was well into the second year of my two-year appointment at Penn State. I applied for the tenure-track position there, but also applied for other positions including visiting positions at the Institute for Advanced Study (IAS) and the University of Wisconsin where I could work with Dick Askey. Early in 1979 both applications for visiting positions came through. Aside from Penn State, the one application for a tenure-track position that I most cared about was at Bryn Mawr. In addition to its reputation as a premiere liberal arts college and its close ties to Swarthmore and Haverford, I valued its doctoral program and very close connection to Penn. A tenure-track offer from Penn State came through. I contacted Bryn Mawr and was told that I was on their list of finalists, but they were not yet prepared to choose whom to invite onto campus. I accepted the Penn State offer. I’ve often wondered how my career would have been different if I had been offered and had accepted a position at Bryn Mawr that year.

Penn State agreed to let me go for the next two years. I spent 1979–80 at IAS and 1980–81 at Wisconsin. The next year I received a Sloan Foundation Fellowship, which enabled me to spend fall 1982 back in Wisconsin and fall 1983 in Minnesota to work with Dennis Stanton, a student of Dick Askey who had received his PhD the same year that I did. Being relieved of many of my teaching responsibilities put me on a fast track for early tenure, and by 1986 I was a full Professor.

Figure 1: The Vandermonde determinant formula.
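For readers who want to experiment, here is a short Python sketch (pure standard library; the helper and variable names are my own) that checks the formula in Figure 1 at sample integer points, computing the determinant directly from the permutation-sum definition:

```python
from itertools import permutations
from math import prod

def det(M):
    """Determinant via the Leibniz sum over permutations (exact for integers)."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        # The sign of p is (-1) raised to the number of inversions.
        sign = (-1) ** sum(p[a] > p[b] for a in range(n) for b in range(a + 1, n))
        total += sign * prod(M[i][p[i]] for i in range(n))
    return total

x = [2, 3, 5, 7]                      # sample points x_1, ..., x_n
n = len(x)
# Entry in row i, column j is x_j^(n-i), with i and j running from 1 to n.
V = [[x[j] ** (n - i) for j in range(n)] for i in range(1, n + 1)]

lhs = det(V)
rhs = prod(x[i] - x[j] for i in range(n) for j in range(i + 1, n))
print(lhs, rhs)  # 240 240
```

Because everything is integer arithmetic, the agreement is exact rather than a floating-point coincidence.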

These were heady years in which I published many papers on partition theory, combinatorics, and special functions, but I made little contribution to the most important evolving mathematics that I have encountered, the Alternating Sign Matrix (ASM) Conjecture. It is a fascinating, even romantic story that I would eventually write up in Proofs and Confirmations: The Story of the Alternating Sign Matrix Conjecture. That book won the MAA’s Beckenbach Book Prize in 1999. MathSciNet records 257 papers that reference it. Google Scholar lists 679 citations. This complex story says a great deal about the nature of mathematics. I learned that the best mathematics research is less about proving theorems than making insightful conjectures and establishing connections.

David Robbins, in collaboration with William Mills and Howard Rumsey at the Institute for Defense Analyses in Princeton, began with a conjecture that arose from work in linear algebra. With help from Richard Stanley at MIT, they connected their problem to work done by George Andrews in partition theory, which in turn led them to a conjecture of Ian Macdonald in representation theory. While they were unable to solve their own conjecture, this connection gave Mills, Robbins, and Rumsey insight into Macdonald’s problem, enabling them to prove his conjecture. This was a very big deal, given that earlier that same year Stanley had described the Macdonald conjecture as “the most interesting open problem in all of enumerative combinatorics.”

Figure 2: A 5 by 5 alternating sign matrix.

An alternating sign matrix is a square matrix with entries 0, +1, or –1 with the condition that the entries in each row and column sum to +1 and non-zero entries alternate in sign across each row and each column, as shown in Figure 2. The Mills, Robbins, and Rumsey conjecture was that the number of n by n alternating sign matrices is given by the product in Figure 3. The formula itself gives no hint of the beautiful structures that they discovered and which led to this conjecture. The interested reader will need to go to my book or the article that Jim Propp and I published in the Notices of the AMS (see references).
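The definition can be turned into a brute-force experiment. The following Python sketch (the names are my own) tests every 3 by 3 matrix with entries –1, 0, +1 against the two conditions and counts the survivors; the answer, 7, is the third term of the sequence 1, 2, 7, 42, 429, … that the conjecture counts:

```python
from itertools import product

def is_asm(M):
    """Each row and column sums to 1, and its non-zero entries alternate in sign."""
    lines = [list(row) for row in M] + [list(col) for col in zip(*M)]
    for line in lines:
        if sum(line) != 1:
            return False
        nonzero = [e for e in line if e != 0]
        # Alternation: no two consecutive non-zero entries agree in sign,
        # and (forced by the sum condition) the first one is +1.
        if nonzero[0] != 1 or any(a == b for a, b in zip(nonzero, nonzero[1:])):
            return False
    return True

n = 3
count = sum(
    is_asm([entries[i * n:(i + 1) * n] for i in range(n)])
    for entries in product((-1, 0, 1), repeat=n * n)
)
print(count)  # 7
```

The search space is only 3^9 = 19,683 matrices, so even this naive enumeration is instantaneous; it becomes hopeless well before the product formula does.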

Figure 3. The Alternating Sign Matrix Theorem, conjectured by Mills, Robbins, and Rumsey.
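In the notation I am assuming here, the product in Figure 3 is A(n) = (3k+1)!/(n+k)! multiplied over k from 0 to n–1, the form in which the theorem is usually stated. A few lines of Python confirm that it produces the familiar sequence:

```python
from fractions import Fraction
from math import factorial

def asm_count(n):
    """The ASM counting product: prod over k = 0..n-1 of (3k+1)! / (n+k)!."""
    result = Fraction(1)
    for k in range(n):
        result *= Fraction(factorial(3 * k + 1), factorial(n + k))
    return int(result)  # the product of these fractions is always an integer

print([asm_count(n) for n in range(1, 7)])  # [1, 2, 7, 42, 429, 7436]
```

Each individual factor is a fraction, yet the full product always collapses to an integer, which is part of the formula’s charm.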

A host of mathematicians and physicists contributed to the process of solving the Alternating Sign Matrix Conjecture, with final proofs discovered by Doron Zeilberger and Greg Kuperberg. The whole story draws on rich veins in mathematics including determinant formulas, lattice paths, symmetric functions, hypergeometric series, and statistical mechanics. That these were all involved should not be surprising since they all, in one way or another, draw on arguments from symmetry. More important than the actual proof were the many connections that were established and the multitude of conjectures that have emerged, still to this day providing fertile ground for new discoveries.

Karl Popper (1902–1994) and Imre Lakatos (1922–1974)

The title of my book is a play on Imre Lakatos’s Proofs and Refutations, which itself was a play on Karl Popper’s Conjectures and Refutations. Popper, a philosopher of science, demarcated scientific statements as those that can be falsified. For Popper, science consists of the formulation of theories and assertions that are sufficiently specific and testable that one can imagine results that would disprove them. Conjectures and Refutations is a collection of essays that elaborate on this dynamic. Popper argues that it is precisely when a scientific theory is shown to be in error that science makes its greatest advances.

Proofs and Confirmations.

Popper spent his later years on the faculty of the London School of Economics. Lakatos was one of his doctoral students. For his thesis, which would become Proofs and Refutations, Lakatos explored the extent to which this is true of mathematics. At first blush, Popper’s reliance on falsifiability seems irrelevant to mathematics. Mathematics deals in proofs that are not subject to refutation. But building on communications with George Pólya, Lakatos developed the theory that at least some of the greatest advances in mathematics occur when a theorem is proven and then a counter-example is discovered.

Lakatos illustrates his point with Euler’s formula for polyhedra, that v – e + f = 2, where v is the number of vertices, e is the number of edges, and f is the number of faces. Over the years, mathematicians found counterexamples that were, in reality, less about the veracity of this result than the precise definition of a polyhedron. Another example appears in an appendix to his book in which he discusses Cauchy’s proof that every convergent infinite sum of continuous functions is continuous, a result that Fourier noted “suffers exceptions,” Fourier series being a prime example. The problem here is once again an insufficiently precise definition. Several historians of mathematics have argued that Cauchy really did mean uniform convergence, but that is exactly Lakatos’s point. It was the recognition of such counterexamples to a proven theorem that forced mathematicians to refine their definitions. In doing so, they made real progress.

It should not be assumed that recognizing the need to refine a definition is simple. It was not until twenty years after Cauchy’s proof that the concept of uniform convergence was clearly articulated, and several more decades before it became common parlance. I will have much more to say about this when I discuss the creation of my real analysis text, A Radical Approach to Real Analysis.

The story of the ASM Conjecture takes an understanding of the role of proof in a very different direction. The ASM Conjecture, like any truly great conjecture, is one that had to be true. Our belief in the beauty and regularities of mathematics made it inconceivable that it could be false. The search for proof is not about establishing the validity of the insight so much as seeking to understand why it is true. One can have confirmation that a mathematical statement is correct without a proof. The role of proof for any great conjecture is to take us to new and deeper questions, to make fresh connections and to open new territory for exploration.


The Vandermonde determinant formula invites an account of its backstory. Matrix determinants as a tool for solving systems of linear equations had been known and used in low-dimensional cases by Leibniz, Maclaurin, Cramer, Euler, and Lagrange. Vandermonde did make a significant contribution to our understanding of determinants, but the determinant that carries his name is not something he considered. The first person to consider this determinant in its full generality was Augustin-Louis Cauchy. It appears in his “Memoir on functions that can only take on two equal values of opposite sign when they are acted upon by the transposition of two of their variables,” what today we refer to as alternating functions, functions of several variables that change sign when any two of the variables are interchanged.

This paper, written in 1812 while he was a second lieutenant working on the harbor expansion at Cherbourg in preparation for Napoleon’s planned invasion of England, gave general rules for evaluating determinants, created the term determinant for this operation, and proved for the first time that the determinant of a product of matrices is the product of the respective determinants.

Ironically, he defined the determinant by asserting that the matrix with entries (x_j)^{n–i} has determinant equal to the product over i < j of (x_i – x_j). In other words, for Cauchy the equation shown in Figure 1 was not a theorem but a definition. He then proved that this product is equal to a sum over permutations, the definition of the determinant that is commonly used today. See Figure 4 for Cauchy’s proof that this product and this sum are equal, still the simplest proof of this fact.

Figure 4: Cauchy’s proof of the Vandermonde determinant formula.

Figure 5: A plane partition of 75 (75 cubes stacked in a corner).

If we divide an alternating function by the Vandermonde product, we get a symmetric function that is unchanged when any two of the variables are interchanged. There is a strong connection between symmetric functions, which play a major role in representation theory, and partitions. It is not at all surprising that Schur, who was noted for his work in representation theory, also contributed to partition theory.
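This fact is easy to experiment with. In the sketch below (the helper names are my own), I divide the alternating determinant det((x_j)^{λ_i + n – i}) by the Vandermonde determinant at integer points; the ratio, which is in fact the Schur polynomial s_λ, is unchanged when the variables are permuted:

```python
from fractions import Fraction
from itertools import permutations
from math import prod

def det(M):
    """Determinant via the Leibniz sum over permutations (exact for integers)."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        sign = (-1) ** sum(p[a] > p[b] for a in range(n) for b in range(a + 1, n))
        total += sign * prod(M[i][p[i]] for i in range(n))
    return total

def bialternant(x, lam):
    """det(x_j^(lam_i + n - i)) divided by the Vandermonde determinant."""
    n = len(x)
    num = det([[x[j] ** (lam[i] + n - 1 - i) for j in range(n)] for i in range(n)])
    den = det([[x[j] ** (n - 1 - i) for j in range(n)] for i in range(n)])
    return Fraction(num, den)

lam = (2, 1, 0)
# Swapping any two of the variables leaves the ratio unchanged: it is symmetric.
print(bialternant([2, 3, 5], lam) == bialternant([3, 2, 5], lam)
      == bialternant([5, 3, 2], lam))  # True
```

Both determinants flip sign when two variables are swapped, so their ratio is symmetric; the ratio is also always a polynomial with integer coefficients, which is why the Fraction above reduces to a whole number.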

There is a strong connection between the semistandard tableaux that arise in the study of Schur functions and plane partitions, which can be thought of as three-dimensional Ferrers graphs or cubes stacked into a corner (Figure 5). In 1912, MacMahon proved that the generating function for plane partitions that fit inside an r by s by t box is given by the rational product shown in Figure 6. This can be proven using Schur functions, but another very appealing approach by Ira Gessel and X.G. Viennot (1985) uses lattice paths and a determinant evaluation that was given an elegant proof by Christian Krattenthaler in 1990 (Figure 7).
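Setting q = 1 in MacMahon’s generating function counts the plane partitions themselves: the number fitting in an r by s by t box is the product of (i + j + k – 1)/(i + j + k – 2) over 1 ≤ i ≤ r, 1 ≤ j ≤ s, 1 ≤ k ≤ t. Here is a Python sketch (my own names) that checks this count against a brute-force enumeration for a small box, modeling a plane partition as an r by s array, weakly decreasing along rows and columns, with entries from 0 to t:

```python
from fractions import Fraction
from itertools import product

def macmahon(r, s, t):
    """MacMahon's box count: prod of (i+j+k-1)/(i+j+k-2) over the box."""
    f = Fraction(1)
    for i in range(1, r + 1):
        for j in range(1, s + 1):
            for k in range(1, t + 1):
                f *= Fraction(i + j + k - 1, i + j + k - 2)
    return int(f)

def brute_force(r, s, t):
    """Count r x s arrays with entries 0..t, weakly decreasing in rows and columns."""
    count = 0
    for cells in product(range(t + 1), repeat=r * s):
        pi = [cells[i * s:(i + 1) * s] for i in range(r)]
        rows_ok = all(pi[i][j] >= pi[i][j + 1] for i in range(r) for j in range(s - 1))
        cols_ok = all(pi[i][j] >= pi[i + 1][j] for i in range(r - 1) for j in range(s))
        count += rows_ok and cols_ok
    return count

print(macmahon(2, 2, 2), brute_force(2, 2, 2))  # 20 20
```

The brute force is only feasible for tiny boxes, which is precisely why MacMahon’s closed form is so valuable.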

All this has been by way of whetting the reader’s appetite for some of the beautiful arguments contained in Proofs and Confirmations, my account of the proof of the Alternating Sign Matrix Conjecture.

Figure 6: MacMahon’s Theorem

Figure 7: Krattenthaler’s Theorem.

References

Bressoud, D.M. (1999). Proofs and Confirmations: The Story of the Alternating Sign Matrix Conjecture. MAA Spectrum Series and Cambridge University Press.

Bressoud, D. and Propp, J. (1999). How the alternating sign matrix conjecture was solved. Notices of the AMS. 46 (6), 637–646.

Gessel, I. and Viennot, X.G. (1985). Binomial determinants, paths, and hook length formulae. Advances in Mathematics. 58 (3), 300–321.

Krattenthaler, C. (1990). Generating functions for plane partitions of a given shape. Manuscripta Mathematica. 69, 173–201.


