In this chapter, we introduce the adjacency matrix of a graph, which can be used to obtain structural properties of a graph. In particular, the eigenvalues and eigenvectors of the adjacency matrix can be used to infer properties such as bipartiteness, degree of connectivity, structure of the automorphism group, and many others. This approach to graph theory is therefore called spectral graph theory. Before we begin, we introduce some notation. The identity matrix will be denoted by $I$ and the matrix whose entries are all ones will be denoted by $J$. For example, the $3\times 3$ identity matrix and the $3\times 3$ all ones matrix are
\[
I = \begin{pmatrix} 1&0&0\\0&1&0\\0&0&1\end{pmatrix}, \qquad J = \begin{pmatrix} 1&1&1\\1&1&1\\1&1&1\end{pmatrix}.
\]
The transpose of a matrix $A$ will be denoted by $A^T$. Recall that a matrix $A$ is symmetric if $A^T = A$. The $(i,j)$ entry of a matrix $A$ will be denoted by $A_{i,j}$.

The Adjacency Matrix

Let $G$ be a graph with vertex set $V = \{v_1, v_2, \ldots, v_n\}$. The adjacency matrix of $G$ is the $n\times n$ matrix $A = A(G)$ whose $(i,j)$ entry is
\[
A_{i,j} = \begin{cases} 1, & \text{if } \{v_i, v_j\} \in E(G),\\ 0, & \text{otherwise.}\end{cases}
\]
Since $\{v_i,v_j\}\in E(G)$ if and only if $\{v_j,v_i\}\in E(G)$, it follows that $A_{i,j} = A_{j,i}$, and therefore $A$ is a symmetric matrix, that is, $A^T = A$. By definition, the indices of the non-zero entries of the $i$th row of $A$ correspond to the neighbors of vertex $v_i$. Similarly, the non-zero indices of the $i$th column of $A$ are the neighbors of vertex $v_i$. It follows that the degree of $v_i$ is the sum of the $i$th row (or $i$th column) of $A$, that is,
\[
\deg(v_i) = \sum_{j=1}^n A_{i,j}.
\]
If we denote the column vector of all ones by $\mathbf{1}$, then
\[
A\mathbf{1} = (\deg(v_1), \deg(v_2), \ldots, \deg(v_n))^T = \mathbf{d}.
\]
We will call $\mathbf{d}$ the degree vector of $G$. We note that, after a possible permutation of its entries, $\mathbf{d}$ is equal to the degree sequence of $G$.
Consider, for example, the graph $G$ with $V = \{v_1, \ldots, v_5\}$ and edge set $E = \{\{v_1,v_2\}, \{v_1,v_3\}, \{v_2,v_3\}, \{v_3,v_4\}, \{v_4,v_5\}\}$. The adjacency matrix of $G$ is
\[
A = \begin{pmatrix}
0&1&1&0&0\\
1&0&1&0&0\\
1&1&0&1&0\\
0&0&1&0&1\\
0&0&0&1&0
\end{pmatrix}.
\]
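The construction above is easy to carry out numerically. The following sketch (using numpy; the edge set here is an arbitrary illustration, not necessarily the book's example) builds an adjacency matrix from an edge list and reads off the degree vector as row sums:

```python
import numpy as np

# A hypothetical 5-vertex graph; vertices v1..v5 are indexed 0..4.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]

n = 5
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # adjacency matrices of simple graphs are symmetric

degrees = A.sum(axis=1)  # row sums give the degree vector
print(degrees)           # [2 2 3 2 1]
```

Note that the all-zero diagonal and the symmetry of `A` reflect that the graph is simple and undirected.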
One of the first applications of the adjacency matrix of a graph $G$ is to count walks in $G$. A walk from vertex $u$ to vertex $v$ is a sequence of vertices $u = u_0, u_1, \ldots, u_k = v$, not necessarily distinct, such that $\{u_{i-1}, u_i\} \in E(G)$ for $i = 1, \ldots, k$. In this case, the walk is of length $k$. In the case that $u = v$, the walk is said to be a closed walk. A walk where all the vertices are distinct is a path, and a cycle is a closed walk where the only repeated vertices are the end-vertices of the walk. A closed walk of length three in a graph $G$ implies that $G$ contains $K_3$ as a subgraph. For obvious reasons, $K_3$ is called a triangle.
For any graph $G$ with vertex set $\{v_1, \ldots, v_n\}$ and adjacency matrix $A$, the $(i,j)$ entry of $A^k$ is the number of walks from $v_i$ to $v_j$ of length $k$.
The proof is by induction on $k$. For $k = 1$, $A_{i,j} = 1$ implies that $\{v_i, v_j\} \in E(G)$ and then clearly there is a walk of length $1$ from $v_i$ to $v_j$. If on the other hand $A_{i,j} = 0$ then $v_i$ and $v_j$ are not adjacent and then clearly there is no walk of length $1$ from $v_i$ to $v_j$. Now assume that the claim is true for some $k \geq 1$ and consider the number of walks of length $k+1$ from $v_i$ to $v_j$. Any walk of length $k+1$ from $v_i$ to $v_j$ consists of a walk of length $k$ from $v_i$ to a neighbor $v_\ell$ of $v_j$, followed by the edge $\{v_\ell, v_j\}$. If $v_\ell \in N(v_j)$ then by induction the number of walks of length $k$ from $v_i$ to $v_\ell$ is $(A^k)_{i,\ell}$. Hence, the total number of walks of length $k+1$ from $v_i$ to $v_j$ is
\[
\sum_{\ell=1}^n (A^k)_{i,\ell}\, A_{\ell, j} = (A^{k+1})_{i,j}.
\]
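As a sanity check of the theorem, the entries of $A^3$ can be compared against a brute-force enumeration of walks. A minimal numpy sketch, on an arbitrary 5-vertex example graph (not necessarily the book's):

```python
import numpy as np
from itertools import product

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
n = 5
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1

def count_walks(i, j, k):
    """Brute-force count of walks of length k from vertex i to vertex j."""
    total = 0
    for mid in product(range(n), repeat=k - 1):
        seq = (i,) + mid + (j,)
        # every consecutive pair must be an edge
        if all(A[a, b] for a, b in zip(seq, seq[1:])):
            total += 1
    return total

A3 = np.linalg.matrix_power(A, 3)
```

Every entry of `A3` should agree with the brute-force count.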
The trace of a matrix $A$ is the sum of its diagonal entries and will be denoted by $\operatorname{tr}(A)$:
\[
\operatorname{tr}(A) = \sum_{i=1}^n A_{i,i}.
\]
Since the diagonal entries of an adjacency matrix are all zero, we have $\operatorname{tr}(A) = 0$.
Let $G$ be a graph with adjacency matrix $A$. Let $m$ be the number of edges in $G$, let $t$ be the number of triangles in $G$, and let $q$ be the number of 4-cycles in $G$. Then
\[
m = \tfrac{1}{2}\operatorname{tr}(A^2), \qquad t = \tfrac{1}{6}\operatorname{tr}(A^3), \qquad q = \tfrac{1}{8}\Big(\operatorname{tr}(A^4) + 2m - 2\sum_{i=1}^n \deg(v_i)^2\Big).
\]
The entry $(A^2)_{i,i}$ is the number of closed walks from $v_i$ of length $2$. A closed walk of length $2$ traverses one edge, and the number of such walks from $v_i$ is $\deg(v_i)$. Hence, $\operatorname{tr}(A^2) = \sum_i \deg(v_i) = 2m$, and therefore $m = \frac{1}{2}\operatorname{tr}(A^2)$. To prove the second statement, we begin by noting that a closed walk of length $3$ can be traversed in two different ways. Hence, for each vertex $v$ in a triangle, there are two walks of length $3$ that start at $v$ and traverse the triangle. And since each triangle contains three distinct vertices, each triangle in a graph accounts for six closed walks of length $3$. Since $\operatorname{tr}(A^3)$ counts all closed walks in $G$ of length three, we have $t = \frac{1}{6}\operatorname{tr}(A^3)$. Now consider $\operatorname{tr}(A^4)$. We count the number of closed walks of length $4$ from $v_i$. There are 3 types of such walks: (1) closed walks of the form $v_i, v_a, v_i, v_b, v_i$ where $v_a, v_b \in N(v_i)$. The number of such walks is $\deg(v_i)^2$ since we have $\deg(v_i)$ choices for $v_a$ and $\deg(v_i)$ choices for $v_b$; (2) closed walks of the form $v_i, v_a, v_b, v_a, v_i$ where $v_a \in N(v_i)$ and $v_b \in N(v_a)\setminus\{v_i\}$; the number of such walks is $\sum_{v_a \in N(v_i)} (\deg(v_a) - 1)$; (3) walks along 4-cycles through $v_i$, and there are 2 such walks for each 4-cycle that $v_i$ is contained in, say $2q_i$ walks in total. Hence,
\[
(A^4)_{i,i} = \deg(v_i)^2 + \sum_{v_a \in N(v_i)} (\deg(v_a) - 1) + 2 q_i.
\]
Summing over $i$, and noting that $\sum_i \sum_{v_a \in N(v_i)} (\deg(v_a) - 1) = \sum_a \deg(v_a)(\deg(v_a) - 1) = \sum_a \deg(v_a)^2 - 2m$ and that $\sum_i q_i = 4q$ (each 4-cycle contains four vertices), we obtain
\[
\operatorname{tr}(A^4) = 2\sum_{i=1}^n \deg(v_i)^2 - 2m + 8q.
\]
Therefore, $q = \frac{1}{8}\big(\operatorname{tr}(A^4) + 2m - 2\sum_i \deg(v_i)^2\big)$.
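The trace formulas can be verified numerically. Here is a sketch on $C_4$, which has $m = 4$ edges, $t = 0$ triangles, and exactly $q = 1$ four-cycle:

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
deg = A.sum(axis=1)
A2, A3, A4 = (np.linalg.matrix_power(A, k) for k in (2, 3, 4))

m = np.trace(A2) // 2                                      # edges
t = np.trace(A3) // 6                                      # triangles
q = (np.trace(A4) + 2 * m - 2 * (deg**2).sum()) // 8       # 4-cycles
```

For $C_4$ this yields $m = 4$, $t = 0$, $q = 1$, as expected.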
Show that the total number of walks of length $k$ in a graph $G$ with adjacency matrix $A$ is $\mathbf{1}^T A^k \mathbf{1}$.
We also obtain the following as a corollary.
A graph $G$ with $n$ vertices is connected if and only if the off-diagonal entries of the matrix $(I + A)^{n-1}$ are all positive. In fact,
\[
(I + A)^{n-1} = \sum_{k=0}^{n-1} \binom{n-1}{k} A^k.
\]
We first note that for any $k \geq 0$, all the entries of $A^k$ are non-negative, and therefore if $(A^k)_{i,j} > 0$ for some $k \leq n-1$ then $\big((I+A)^{n-1}\big)_{i,j} > 0$. Assume first that $G$ is connected. Then for distinct vertices $v_i, v_j$ we have that $(A^k)_{i,j} > 0$ for some $k \leq n-1$, since there is a path from $v_i$ to $v_j$ of length at most $n-1$. Hence, all off-diagonal entries of $(I+A)^{n-1}$ are positive. Now assume that all off-diagonal entries of $(I+A)^{n-1}$ are positive. Let $v_i$ and $v_j$ be arbitrary distinct vertices. Since $\big((I+A)^{n-1}\big)_{i,j} > 0$, there is a minimum $k \geq 1$ such that $(A^k)_{i,j} > 0$. Therefore, there is a walk of length $k$ from $v_i$ to $v_j$. We claim that every such walk is a path. Assume that $W$ is a walk (but not a path) from $v_i$ to $v_j$ of length $k$. If $u$ is a repeated vertex in $W$, say $u = u_s$ and $u = u_r$ for $s < r$, then we may delete the vertices $u_{s+1}, \ldots, u_r$ from $W$ and still obtain a walk from $v_i$ to $v_j$. We can continue this process of deleting vertices from $W$ to obtain a walk with no repeated vertices, that is, a path. This path has length less than $k$, which contradicts the minimality of $k$. This proves the claim and hence all walks of length $k$ are paths from $v_i$ to $v_j$. This proves that $G$ is connected.
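The connectivity criterion translates directly into code. A small numpy sketch (the two test graphs here are arbitrary examples, not taken from the text):

```python
import numpy as np

def is_connected(A):
    """A graph is connected iff all off-diagonal entries of (I+A)^(n-1) are positive."""
    n = len(A)
    M = np.linalg.matrix_power(np.eye(n, dtype=int) + A, n - 1)
    return all(M[i, j] > 0 for i in range(n) for j in range(n) if i != j)

path = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 0]])   # P3: connected
split = np.array([[0, 1, 0],
                  [1, 0, 0],
                  [0, 0, 0]])  # K2 plus an isolated vertex: disconnected
```

Calling `is_connected(path)` returns `True`, while `is_connected(split)` returns `False`.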
In the proof of the previous corollary we proved the following.
Every walk from $u$ to $v$ contains a path from $u$ to $v$.
Let . How do you obtain the adjacency matrix of given the adjacency matrix of ?
Recall that for a graph $G$ we denote its complement by $\overline{G}$. Below we give a relationship between the adjacency matrices of $G$ and $\overline{G}$.
For any graph $G$ it holds that
\[
A(\overline{G}) = J - I - A(G).
\]
Let $A = A(G)$ and let $\overline{A} = A(\overline{G})$. For $i \neq j$, if $A_{i,j} = 1$ then $\overline{A}_{i,j} = 0$, and vice-versa. Therefore, $A_{i,j} + \overline{A}_{i,j} = 1 = (J - I)_{i,j}$ for all $i \neq j$. On the other hand, $A_{i,i} + \overline{A}_{i,i} = 0 = (J - I)_{i,i}$ for all $i$. Thus $A + \overline{A} = J - I$, that is, $\overline{A} = J - I - A$, as claimed.
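The identity $A(\overline{G}) = J - I - A(G)$ is a one-liner to check numerically. A sketch using $P_3$ as an arbitrary example:

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])          # P3 on vertices 0-1-2
n = len(A)
J = np.ones((n, n), dtype=int)
I = np.eye(n, dtype=int)

A_comp = J - I - A                  # adjacency matrix of the complement
```

For $P_3$ the complement has the single edge $\{v_1, v_3\}$ (indices 0 and 2).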

Exercises

Provide the adjacency matrix for each of the following graphs.
  1. The path graph $P_n$ where the vertices are labelled in increasing order from one end to the other along the path.
  2. The cycle graph $C_n$ where the vertices are labelled around the cycle in increasing order.
  3. The complete graph $K_n$ with vertices labelled in any way. (Do this for small $n$ and then write the general form of $A(K_n)$.)
  4. The graph .
Consider the complete bipartite graph $K_{n,m}$ where $X$ and $Y$ are the parts of the bipartition. Suppose that $X = \{v_1, \ldots, v_n\}$ and $Y = \{v_{n+1}, \ldots, v_{n+m}\}$. What is the form of the adjacency matrix of $K_{n,m}$? Try this for small $n$ and $m$, and then generalize.
Consider the following recursively defined sequence of graphs: and in general if is odd and if is even.
  1. Consider the graph . Label the vertices of using and so that . Using this labelling of the vertices, write out the adjacency matrix of .
  2. Do the same as in part (a) with .
  3. Do the same as in part (a) with .
  4. For general even , what is the general form of the adjacency matrix of ?
For each case, draw the graph with the given adjacency matrix.
Consider the cycle graph with vertices and so that and . Prove that if is even then . (Hint: is bipartite.)
Let $A_1$ and $A_2$ be the adjacency matrices of $G_1$ and $G_2$, respectively.
  1. What is the adjacency matrix of the disjoint union $G_1 \cup G_2$ in terms of $A_1$ and $A_2$?
  2. What is the adjacency matrix of the join $G_1 \vee G_2$ in terms of $A_1$ and $A_2$?
For each case, assume that if $V(G_1) = \{u_1, \ldots, u_n\}$ and $V(G_2) = \{w_1, \ldots, w_m\}$ then the order of the vertices of the resulting graph is $u_1, \ldots, u_n, w_1, \ldots, w_m$.
Let $B_k = (I + A)^k$ for $k \geq 1$, where $A$ is the adjacency matrix of $G$. Recall that the diameter of a graph $G$, denoted by $\operatorname{diam}(G)$, is the maximum distance among all pairs of vertices in $G$. Prove that if $G$ is connected then the smallest integer $k$ such that all the off-diagonal entries of $B_k$ are positive is the diameter of $G$.
Let $G$ be a $k$-regular graph with $n$ vertices and adjacency matrix $A$. Prove that the total number of walks of length $\ell$ in $G$ is $n k^\ell$.

The coefficients and roots of a polynomial

As mentioned at the beginning of this chapter, the eigenvalues of the adjacency matrix of a graph contain valuable information about the structure of the graph, and we will soon see examples of this. Recall that the eigenvalues of a matrix are the roots of its characteristic polynomial; the coefficients of a polynomial in turn depend in a polynomial way on its roots. For example, expanding the polynomial $p(x) = (x - r_1)(x - r_2)$ we obtain
\[
p(x) = x^2 - (r_1 + r_2)x + r_1 r_2,
\]
from which we see that the coefficient of $x$ and the constant term of $p(x)$ are polynomial expressions in the roots $r_1$ and $r_2$. If we define the polynomials $e_1 = r_1 + r_2$ and $e_2 = r_1 r_2$, then
\[
p(x) = x^2 - e_1 x + e_2.
\]
How about a cubic polynomial? Consider $p(x) = (x - r_1)(x - r_2)(x - r_3)$ and expand:
\[
p(x) = x^3 - (r_1 + r_2 + r_3)x^2 + (r_1 r_2 + r_1 r_3 + r_2 r_3)x - r_1 r_2 r_3.
\]
Thus, if we define the polynomials $e_1 = r_1 + r_2 + r_3$, $e_2 = r_1 r_2 + r_1 r_3 + r_2 r_3$, and $e_3 = r_1 r_2 r_3$, then
\[
p(x) = x^3 - e_1 x^2 + e_2 x - e_3,
\]
and again we see that the coefficients of $p(x)$ are polynomial expressions in the roots $r_1, r_2, r_3$. To deal with the general $n$th order polynomial, let us introduce some terminology and notation. A multivariate polynomial is a polynomial in more than one variable. Examples of polynomials in the variables $t_1, t_2$ are
\[
t_1^2 + t_2^2, \qquad t_1 t_2 - t_1 - t_2,
\]
and examples of polynomials in the variables $t_1, t_2, t_3$ are
\[
t_1 + t_2 + t_3, \qquad t_1 t_2 t_3 - t_1^2 t_3.
\]
We will only consider polynomials with rational coefficients, and we will use $\mathbb{Q}[t_1, \ldots, t_n]$ to denote the set of all polynomials in the variables $t_1, \ldots, t_n$ with rational coefficients. For example, the polynomial $t_1 t_2 - \frac{1}{2}t_3$ is an element of $\mathbb{Q}[t_1, t_2, t_3]$. We now introduce a set of particularly important polynomials.
Let $[n] = \{1, 2, \ldots, n\}$ and for $1 \leq k \leq n$ let $\binom{[n]}{k}$ denote the set of all $k$-element subsets of $[n]$. The $k$th elementary symmetric polynomial in the variables $t_1, \ldots, t_n$ is defined by
\[
e_k(t_1, \ldots, t_n) = \sum_{S \in \binom{[n]}{k}} \prod_{i \in S} t_i.
\]
In Definition 2.2.1, the notation $S \in \binom{[n]}{k}$ means that the sum is over all $k$-element subsets $S$ of $[n]$. The number of elements in $\binom{[n]}{k}$ is $\binom{n}{k}$, and thus $e_k$ is the sum of $\binom{n}{k}$ monomials of the form $t_{i_1} t_{i_2} \cdots t_{i_k}$. We call $e_k$ the $k$th elementary symmetric polynomial in $n$ variables. A few examples are
\[
e_1 = t_1 + t_2 + \cdots + t_n, \qquad e_n = t_1 t_2 \cdots t_n.
\]
If it is necessary to emphasize that $e_k$ is the $k$th elementary symmetric polynomial in $n$ variables then we use the notation $e_k^{(n)}$, but note that the superscript is not an exponent and is there only to indicate the number of variables.
For $n = 2$, the elementary symmetric polynomials are
\[
e_1 = t_1 + t_2, \qquad e_2 = t_1 t_2,
\]
and for $n = 3$ the elementary symmetric polynomials are
\[
e_1 = t_1 + t_2 + t_3, \qquad e_2 = t_1 t_2 + t_1 t_3 + t_2 t_3, \qquad e_3 = t_1 t_2 t_3.
\]
For $n = 6$, there are $\binom{6}{5} = 6$ five-element subsets of $[6]$, and thus $e_5^{(6)}$ is the sum of $6$ monomials:
\[
e_5^{(6)} = t_1 t_2 t_3 t_4 t_5 + t_1 t_2 t_3 t_4 t_6 + t_1 t_2 t_3 t_5 t_6 + t_1 t_2 t_4 t_5 t_6 + t_1 t_3 t_4 t_5 t_6 + t_2 t_3 t_4 t_5 t_6.
\]
We now describe a natural way in which the elementary symmetric polynomials arise. Introduce a new variable $x$ and consider the polynomial
\[
p(x) = (x - t_1)(x - t_2)\cdots(x - t_n).
\]
Hence, $t_1, \ldots, t_n$ are the roots of the polynomial $p(x)$, since $p(t_i) = 0$ for all $i$. Expanding the right hand side, we now show by induction that
\[
p(x) = x^n - e_1 x^{n-1} + e_2 x^{n-2} - \cdots + (-1)^n e_n,
\]
where the $e_k$ appearing in the coefficients of $p(x)$ is the $k$th elementary symmetric polynomial evaluated at $t_1, \ldots, t_n$. We first begin with the following lemma.
Let $e_k^{(n)}$ denote the $k$th elementary symmetric polynomial in the variables $t_1, \ldots, t_n$ and let $e_k^{(n-1)}$ denote the $k$th elementary symmetric polynomial in the variables $t_1, \ldots, t_{n-1}$. Then
\[
e_k^{(n)} = e_k^{(n-1)} + t_n\, e_{k-1}^{(n-1)}.
\]
By definition,
\[
e_k^{(n)} = \sum_{S \in \binom{[n]}{k}} \prod_{i \in S} t_i.
\]
A $k$-element subset of $[n]$ that does not contain $n$ is an element of $\binom{[n-1]}{k}$, and a $k$-element subset of $[n]$ that does contain $n$ is the union of $\{n\}$ and a $(k-1)$-element subset of $[n-1]$. Therefore,
\[
e_k^{(n)} = \sum_{S \in \binom{[n-1]}{k}} \prod_{i \in S} t_i \; + \; t_n \sum_{S \in \binom{[n-1]}{k-1}} \prod_{i \in S} t_i = e_k^{(n-1)} + t_n\, e_{k-1}^{(n-1)},
\]
as claimed.
If $p(x) = (x - t_1)(x - t_2)\cdots(x - t_n)$ then
\[
p(x) = x^n + \sum_{k=1}^n (-1)^k e_k\, x^{n-k},
\]
where $e_k = e_k(t_1, \ldots, t_n)$ for $k = 1, \ldots, n$.
The proof is by induction on the order $n$ of the polynomial $p(x)$. The case $n = 1$ is trivial. Assume that the claim is true for all polynomials of order $n-1$ and let $p(x) = (x - t_1)\cdots(x - t_n)$. Then $p(x) = q(x)(x - t_n)$, where $q(x) = (x - t_1)\cdots(x - t_{n-1})$. Applying the induction hypothesis to $q(x)$, we have that
\[
q(x) = x^{n-1} + \sum_{k=1}^{n-1} (-1)^k e_k^{(n-1)} x^{n-1-k},
\]
and then expanding $q(x)(x - t_n)$ and collecting like terms we obtain
\[
p(x) = x^n + \sum_{k=1}^{n} (-1)^k \big(e_k^{(n-1)} + t_n\, e_{k-1}^{(n-1)}\big) x^{n-k}.
\]
We can now apply Lemma 2.2.2 to the coefficients of $p(x)$ and obtain
\[
p(x) = x^n + \sum_{k=1}^n (-1)^k e_k^{(n)} x^{n-k},
\]
as claimed.
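The expansion just proved can be spot-checked numerically: the coefficient of $x^{n-k}$ in $\prod_i (x - t_i)$ should equal $(-1)^k e_k$. A sketch (numpy's `poly` computes the coefficients of a monic polynomial from its roots; the roots chosen here are arbitrary):

```python
import numpy as np
from itertools import combinations
from math import prod

roots = [1, 2, 3, 4]
n = len(roots)

def e(k):
    """k-th elementary symmetric polynomial evaluated at the roots."""
    return sum(prod(S) for S in combinations(roots, k))

# Coefficients of (x-1)(x-2)(x-3)(x-4), highest degree first.
coeffs = np.poly(roots)   # [1, -e1, e2, -e3, e4]
```

Here `coeffs[k]` should equal `(-1)**k * e(k)` for every `k`.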
Consider the polynomial $p(x) = (x - 1)(x + 1)(x - 2)$, whose roots are $t_1 = 1$, $t_2 = -1$, and $t_3 = 2$. Expanding, we obtain
\[
p(x) = x^3 - 2x^2 - x + 2.
\]
On the other hand, using the expressions for $e_1, e_2, e_3$, we have $e_1 = 1 - 1 + 2 = 2$, $e_2 = (1)(-1) + (1)(2) + (-1)(2) = -1$, and $e_3 = (1)(-1)(2) = -2$, so that
\[
p(x) = x^3 - e_1 x^2 + e_2 x - e_3 = x^3 - 2x^2 - x + 2.
\]
Let us record the following for future reference.
Consider the polynomial
\[
p(x) = x^n - e_1 x^{n-1} + e_2 x^{n-2} - \cdots + (-1)^n e_n = (x - t_1)(x - t_2)\cdots(x - t_n).
\]
Then $e_1 = t_1 + \cdots + t_n$ is the sum of the roots of $p(x)$ and $e_n = t_1 t_2 \cdots t_n$ is the product of the roots of $p(x)$.
There is another important set of multivariate polynomials that we will encounter in the next section, called the power sums polynomials, which are closely related to the elementary symmetric polynomials. The power sums polynomials in the variables $t_1, \ldots, t_n$ are the polynomials $p_1, p_2, p_3, \ldots$ given by
\[
p_k = t_1^k + t_2^k + \cdots + t_n^k.
\]
The relationship between the elementary symmetric and the power sums polynomials is the following. First of all, it is clear that $p_1 = e_1$. Now,
\[
p_1^2 = \Big(\sum_i t_i\Big)^2 = \sum_i t_i^2 + 2\sum_{i<j} t_i t_j = p_2 + 2e_2,
\]
and therefore
\[
e_2 = \tfrac{1}{2}\big(p_1^2 - p_2\big).
\]
A similar computation yields that
\[
e_3 = \tfrac{1}{6}\big(p_1^3 - 3p_1 p_2 + 2p_3\big).
\]
The general relationship between the elementary symmetric and power sums polynomials is known as Newton's identities:
\[
k\, e_k = \sum_{i=1}^{k} (-1)^{i-1} e_{k-i}\, p_i.
\]
From Newton's identities we obtain that
\[
e_k = \tfrac{1}{k}\big(e_{k-1} p_1 - e_{k-2} p_2 + \cdots + (-1)^{k-1} e_0\, p_k\big).
\]
Now since $e_1 = p_1$, it is straightforward to show by induction that each elementary symmetric polynomial can be written only in terms of the power sums polynomials. To summarize, we obtain the following.
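Newton's identities are easy to verify numerically for small $n$. A sketch with arbitrarily chosen values of the $t_i$:

```python
from itertools import combinations
from math import prod

ts = [2.0, -1.0, 3.0, 0.5]   # arbitrary sample values
n = len(ts)

def e(k):
    """Elementary symmetric polynomial e_k evaluated at ts (e_0 = 1)."""
    return sum(prod(S) for S in combinations(ts, k))

def p(k):
    """Power sum p_k evaluated at ts."""
    return sum(t**k for t in ts)

# Newton's identities: k*e_k = sum_{i=1..k} (-1)^(i-1) e_{k-i} p_i
for k in range(1, n + 1):
    lhs = k * e(k)
    rhs = sum((-1)**(i - 1) * e(k - i) * p(i) for i in range(1, k + 1))
    assert abs(lhs - rhs) < 1e-9
```

The loop raises no assertion error, confirming the identity for $k = 1, \ldots, 4$ at these values.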
Consider the polynomial $p(x) = (x - t_1)\cdots(x - t_n)$ written in expanded form
\[
p(x) = x^n + c_1 x^{n-1} + c_2 x^{n-2} + \cdots + c_n.
\]
The coefficients $c_1, \ldots, c_n$ can be expressed in terms of the power sums polynomials evaluated at the roots $t_1, \ldots, t_n$, that is, there are polynomial functions $f_1, \ldots, f_n$ such that
\[
c_k = f_k(p_1, p_2, \ldots, p_k),
\]
where the $p_i$ are evaluated at $t_1, \ldots, t_n$.

Exercises

Expand the polynomial $p(x) = (x - t_1)(x - t_2)(x - t_3)(x - t_4)$ and use the expressions for $e_1, e_2, e_3, e_4$ to verify Theorem 2.2.3 for $n = 4$.
Use Newton's identities to express $e_4$ in terms of $p_1, p_2, p_3, p_4$.
The polynomial has as a root. Find the other roots of and then find .

The characteristic polynomial and spectrum of a graph

In this section, we introduce the characteristic polynomial and spectrum of a graph and prove some of their basic properties. Before we begin, we recall some basic facts from linear algebra. Recall that $\lambda$ is an eigenvalue of the matrix $A$ if there exists a non-zero vector $\mathbf{x}$ such that
\[
A\mathbf{x} = \lambda\mathbf{x}.
\]
In this case, $\mathbf{x}$ is called an eigenvector of $A$ corresponding to the eigenvalue $\lambda$. To find the eigenvalues of $A$, we find the zeros of the characteristic polynomial of $A$:
\[
p_A(x) = \det(xI - A).
\]
If $A$ is an $n \times n$ matrix, then the characteristic polynomial $p_A(x)$ is an $n$th order polynomial, and $p_A(\lambda) = 0$ if and only if $\lambda$ is an eigenvalue of $A$. From the Fundamental Theorem of Algebra, $A$ has $n$ eigenvalues, possibly repeated and complex. However, if $A$ is a symmetric matrix, then an important result in linear algebra is that the eigenvalues of $A$ are all real numbers, and we may therefore order them, say from smallest to largest:
\[
\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n.
\]
Also, if $A$ is symmetric and $\mathbf{x}$ and $\mathbf{y}$ are eigenvectors of $A$ corresponding to distinct eigenvalues, then $\mathbf{x}$ and $\mathbf{y}$ are orthogonal, that is, $\mathbf{x}^T\mathbf{y} = 0$. Moreover, if $A$ is symmetric, there exists an orthonormal basis $\mathcal{B} = \{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$ of $\mathbb{R}^n$ consisting of eigenvectors of $A$. Recall that $\mathcal{B}$ is an orthonormal basis of $\mathbb{R}^n$ if $\mathbf{x}_i^T\mathbf{x}_i = 1$ and $\mathbf{x}_i^T\mathbf{x}_j = 0$ if $i \neq j$, that is, the vectors in $\mathcal{B}$ are unit vectors and are mutually orthogonal.
The characteristic polynomial of a graph $G$ with adjacency matrix $A$ is
\[
p_G(x) = \det(xI - A).
\]
The spectrum of $G$, denoted by $\operatorname{spec}(G)$, is the list of the eigenvalues of $A$ in increasing order $\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n$:
\[
\operatorname{spec}(G) = (\lambda_1, \lambda_2, \ldots, \lambda_n).
\]
Show by direct computation that the characteristic polynomial of is and find the eigenvalues of .
The adjacency matrix of the empty graph on $n$ vertices is the zero matrix, and therefore the characteristic polynomial of the empty graph is $p(x) = x^n$. Hence, the empty graph has spectrum $(0, 0, \ldots, 0)$.
The adjacency matrix of $K_4$ is
\[
A = \begin{pmatrix} 0&1&1&1\\1&0&1&1\\1&1&0&1\\1&1&1&0 \end{pmatrix}.
\]
Consider the vectors $\mathbf{x}_1 = (1,-1,0,0)^T$, $\mathbf{x}_2 = (1,0,-1,0)^T$, and $\mathbf{x}_3 = (1,0,0,-1)^T$. It is not hard to see that $\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3$ are linearly independent. A direct computation yields $A\mathbf{x}_1 = -\mathbf{x}_1$, and therefore $\lambda = -1$ is an eigenvalue of $K_4$. Similarly, a direct computation yields that $A\mathbf{x}_2 = -\mathbf{x}_2$ and $A\mathbf{x}_3 = -\mathbf{x}_3$. Hence, $-1$ is an eigenvalue of multiplicity at least three. Finally, we have that $A\mathbf{1} = 3\cdot\mathbf{1}$, and therefore $\lambda = 3$ is an eigenvalue of $K_4$. Therefore, the spectrum of $K_4$ is $(-1, -1, -1, 3)$, and therefore the characteristic polynomial of $K_4$ is $p(x) = (x+1)^3(x-3)$. In general, one can show that $\operatorname{spec}(K_n) = (-1, \ldots, -1, n-1)$, and therefore the characteristic polynomial of $K_n$ is $p(x) = (x+1)^{n-1}\big(x - (n-1)\big)$.
The following result, and the previous example, show that the maximum degree $\Delta(G)$ is a sharp bound for the magnitude of the eigenvalues of $G$.
For any eigenvalue $\lambda$ of $G$ it holds that $|\lambda| \leq \Delta(G)$.
Suppose that $\lambda$ is an eigenvalue of $G$ with eigenvector $\mathbf{x}$. Suppose that the $i$th entry of $\mathbf{x}$ has maximum absolute value, that is, $|x_j| \leq |x_i|$ for all $j$. Since $A\mathbf{x} = \lambda\mathbf{x}$, it follows that
\[
\lambda x_i = \sum_{j=1}^n A_{i,j} x_j = \sum_{v_j \in N(v_i)} x_j,
\]
and therefore using the triangle inequality we obtain
\[
|\lambda|\,|x_i| = \Big|\sum_{v_j \in N(v_i)} x_j\Big| \leq \sum_{v_j \in N(v_i)} |x_j| \leq \deg(v_i)\,|x_i| \leq \Delta(G)\,|x_i|.
\]
Therefore $|\lambda|\,|x_i| \leq \Delta(G)\,|x_i|$, and the claim follows by dividing both sides of the inequality by $|x_i| > 0$.
Let $G$ be a graph on $n$ vertices and let $d_{\mathrm{avg}} = \frac{1}{n}\sum_{i=1}^n \deg(v_i)$ denote the average degree of $G$. Then
\[
d_{\mathrm{avg}} \leq \lambda_n \leq \Delta(G).
\]
Let $\{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$ be an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$ with corresponding eigenvalues $\lambda_1 \leq \cdots \leq \lambda_n$. Then there exist numbers $c_1, \ldots, c_n$ such that $\mathbf{1} = \sum_{i=1}^n c_i \mathbf{x}_i$. Let $\mathbf{d}$ denote the degree vector of $G$. Now
\[
\mathbf{1}^T A \mathbf{1} = \mathbf{1}^T \mathbf{d} = \sum_{i=1}^n \deg(v_i) = n\, d_{\mathrm{avg}},
\]
while on the other hand, since $\{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$ is an orthonormal basis, we have
\[
\mathbf{1}^T A \mathbf{1} = \Big(\sum_i c_i \mathbf{x}_i\Big)^T A \Big(\sum_j c_j \mathbf{x}_j\Big) = \sum_{i=1}^n \lambda_i c_i^2.
\]
Therefore,
\[
n\, d_{\mathrm{avg}} = \sum_{i=1}^n \lambda_i c_i^2 \leq \lambda_n \sum_{i=1}^n c_i^2 = \lambda_n\, \mathbf{1}^T\mathbf{1} = \lambda_n\, n,
\]
where we have used the fact that $\sum_i c_i^2 = \|\mathbf{1}\|^2 = n$ and $\lambda_i \leq \lambda_n$ for all $i$. Therefore, $d_{\mathrm{avg}} \leq \lambda_n$, and this proves the first inequality. The second inequality follows from Proposition 2.3.2.
A graph $G$ is $k$-regular if and only if $\mathbf{1}$ is an eigenvector of $A(G)$ with eigenvalue $k$.
Recall that
\[
A\mathbf{1} = (\deg(v_1), \ldots, \deg(v_n))^T.
\]
If $G$ is $k$-regular then $\deg(v_i) = k$ for all $i$, and therefore
\[
A\mathbf{1} = k\,\mathbf{1}.
\]
Thus, $k$ is an eigenvalue of $A$ with corresponding eigenvector $\mathbf{1}$. On the other hand, if $\mathbf{1}$ is an eigenvector of $A$ with eigenvalue $k$ then $A\mathbf{1} = k\mathbf{1}$, and thus $\deg(v_i) = k$ for all $i$, and then $G$ is $k$-regular.
Let $G$ be a $k$-regular graph on $n$ vertices with eigenvalues $\lambda_1 \leq \cdots \leq \lambda_{n-1} \leq \lambda_n = k$. Then the complement graph $\overline{G}$ has eigenvalues $n - k - 1$ and $-1 - \lambda_i$ for $i = 1, \ldots, n-1$.
Let $\{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$ be an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A = A(G)$ with corresponding eigenvalues $\lambda_1 \leq \cdots \leq \lambda_n$. By Proposition 2.3.4, $k$ is an eigenvalue of $A$ with corresponding eigenvector $\mathbf{1}$, and moreover by Exercise 2.16, $k$ is the maximum eigenvalue of $A$. We may therefore take $\mathbf{x}_n = \frac{1}{\sqrt{n}}\mathbf{1}$. Let $\overline{A} = A(\overline{G})$. From Lemma 2.1.5 we have that
\[
\overline{A} = J - I - A.
\]
Now since $\mathbf{x}_i$ is orthogonal to $\mathbf{1}$ for $i \neq n$, we have $J\mathbf{x}_i = \mathbf{0}$ for $i \neq n$. Therefore, for $i \neq n$ we have
\[
\overline{A}\mathbf{x}_i = J\mathbf{x}_i - \mathbf{x}_i - A\mathbf{x}_i = -(1 + \lambda_i)\mathbf{x}_i.
\]
Since $\overline{G}$ is a regular graph with degree $n - k - 1$, by Proposition 2.3.4, $n - k - 1$ is an eigenvalue of $\overline{A}$ with corresponding eigenvector $\mathbf{1}$, and this ends the proof.
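The complement-spectrum formula can be checked numerically. A sketch using $C_5$, which is 2-regular on 5 vertices (and whose complement happens to be another 5-cycle):

```python
import numpy as np

n, k = 5, 2
A = np.zeros((n, n), dtype=int)
for i in range(n):                      # build the cycle C5
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

J = np.ones((n, n), dtype=int)
I = np.eye(n, dtype=int)
A_comp = J - I - A                      # adjacency matrix of the complement

eig = np.sort(np.linalg.eigvalsh(A))          # largest eigenvalue is k
eig_comp = np.sort(np.linalg.eigvalsh(A_comp))

# Predicted complement spectrum: n-k-1 together with -1-lambda_i (lambda_i != k)
predicted = np.sort(np.concatenate(([n - k - 1], -1 - eig[:-1])))
```

`eig_comp` and `predicted` agree up to floating-point error.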
We now consider bipartite graphs.
Suppose that $G$ is a bipartite graph on $n$ vertices. Then if $\lambda$ is an eigenvalue of $G$ then $-\lambda$ is an eigenvalue of $G$. Consequently, if $\pm\lambda_1, \ldots, \pm\lambda_k$ are the non-zero eigenvalues of $G$, then the characteristic polynomial of $G$ takes the form
\[
p(x) = x^{n-2k}(x^2 - \lambda_1^2)(x^2 - \lambda_2^2)\cdots(x^2 - \lambda_k^2),
\]
where $n - 2k \geq 0$. In particular, if $n$ is odd then $n - 2k \geq 1$, that is, $0$ is an eigenvalue of $G$ with multiplicity $n - 2k \geq 1$.
Since $G$ is bipartite, there is a partition of the vertex set $V = X \cup Y$ such that each edge of $G$ has one vertex in $X$ and the other in $Y$. Let $|X| = s$ and $|Y| = n - s$. By a relabelling of the vertices of $G$, we may assume that $X = \{v_1, \ldots, v_s\}$ and $Y = \{v_{s+1}, \ldots, v_n\}$. Therefore, the adjacency matrix of $G$ takes the form
\[
A = \begin{pmatrix} 0 & B \\ B^T & 0 \end{pmatrix},
\]
where $B$ is an $s \times (n-s)$ matrix. Suppose that $\mathbf{z} = (\mathbf{x}, \mathbf{y})$ is an eigenvector of $A$ with eigenvalue $\lambda$, where $\mathbf{x} \in \mathbb{R}^s$ and $\mathbf{y} \in \mathbb{R}^{n-s}$. Thus,
\[
B\mathbf{y} = \lambda\mathbf{x}, \qquad B^T\mathbf{x} = \lambda\mathbf{y}.
\]
Then, for $\mathbf{w} = (\mathbf{x}, -\mathbf{y})$,
\[
A\mathbf{w} = \begin{pmatrix} -B\mathbf{y} \\ B^T\mathbf{x} \end{pmatrix} = \begin{pmatrix} -\lambda\mathbf{x} \\ \lambda\mathbf{y} \end{pmatrix} = -\lambda\mathbf{w}.
\]
Therefore, $\mathbf{w}$ is an eigenvector of $A$ with eigenvalue $-\lambda$. Hence, $(x^2 - \lambda^2)$ is a factor of $p(x)$. If $k$ denotes the number of non-zero eigenvalue pairs $\pm\lambda_i$, then $n - 2k$ is the multiplicity of the eigenvalue $0$, and if $n$ is odd then $n - 2k \geq 1$.
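The symmetry of bipartite spectra is easy to see numerically. A sketch for the complete bipartite graph $K_{2,3}$, whose non-zero eigenvalues are $\pm\sqrt{6}$:

```python
import numpy as np

# K_{2,3}: parts of size 2 and 3, with B the all-ones block.
B = np.ones((2, 3), dtype=int)
A = np.block([[np.zeros((2, 2), dtype=int), B],
              [B.T, np.zeros((3, 3), dtype=int)]])

eig = np.sort(np.linalg.eigvalsh(A))
# The spectrum is symmetric about 0: lambda is an eigenvalue iff -lambda is.
```

Here `eig` is approximately $(-\sqrt{6}, 0, 0, 0, \sqrt{6})$.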
The eigenvalues of the cycle $C_n$ are $2\cos\!\big(\tfrac{2\pi j}{n}\big)$ for $j = 0, 1, \ldots, n-1$.
Under what condition will a $k$-regular graph have both $k$ and $-k$ as eigenvalues?
Consider the complete bipartite graph $K_{n,m}$ where $n, m \geq 1$. Show that $0$ is an eigenvalue of $K_{n,m}$ of multiplicity $n + m - 2$. In Exercise 2.18 you will show that $\pm\sqrt{nm}$ are the other two eigenvalues of $K_{n,m}$.
Here is a generalization of the previous example.
Suppose that $G_1$ is a $k_1$-regular graph with $n_1$ vertices and $G_2$ is a $k_2$-regular graph with $n_2$ vertices. Let $\operatorname{spec}(G_1) = (\lambda_1, \ldots, \lambda_{n_1})$, where $\lambda_{n_1} = k_1$, and let $\operatorname{spec}(G_2) = (\mu_1, \ldots, \mu_{n_2})$, where $\mu_{n_2} = k_2$. Let $G = G_1 \vee G_2$ be the join of $G_1$ and $G_2$.
  1. Let $A_1 = A(G_1)$ and $A_2 = A(G_2)$. Write down the adjacency matrix $A$ of $G$ if we order the vertices of $G$ as $u_1, \ldots, u_{n_1}, w_1, \ldots, w_{n_2}$.
  2. If $\mathbf{x}$ is an eigenvector of $A_1$ with eigenvalue $\lambda$, with $\mathbf{x} \perp \mathbf{1}$, then show that $(\mathbf{x}, \mathbf{0})$ is an eigenvector of $A$ with eigenvalue $\lambda$.
  3. If $\mathbf{y}$ is an eigenvector of $A_2$ with eigenvalue $\mu$, with $\mathbf{y} \perp \mathbf{1}$, then show that $(\mathbf{0}, \mathbf{y})$ is an eigenvector of $A$ with eigenvalue $\mu$.
  4. Parts (b) and (c) determine $n_1 + n_2 - 2$ eigenvalues of $G$. Here we find the remaining two eigenvalues. Consider the vector $\mathbf{z} = (\mathbf{1}, \alpha\mathbf{1})$, where $\mathbf{1}$ denotes the all ones vector of the appropriate size and $\alpha$ is to be determined. Apply the eigenvector-eigenvalue condition $A\mathbf{z} = \lambda\mathbf{z}$ and show that the remaining two eigenvalues of $G$ satisfy $(\lambda - k_1)(\lambda - k_2) = n_1 n_2$, that is,
\[
\lambda = \frac{k_1 + k_2 \pm \sqrt{(k_1 - k_2)^2 + 4 n_1 n_2}}{2}.
\]
  5. Conclude that if $p_1(x)$ and $p_2(x)$ are the characteristic polynomials of $G_1$ and $G_2$, respectively, then the characteristic polynomial of $G = G_1 \vee G_2$ is
\[
p(x) = \frac{p_1(x)\, p_2(x)}{(x - k_1)(x - k_2)}\Big((x - k_1)(x - k_2) - n_1 n_2\Big).
\]
Let $d_1 \geq d_2 \geq \cdots \geq d_n$ be the degree sequence of $G$ and let $\lambda_n$ be the largest eigenvalue of $G$. Prove that
\[
\lambda_n \geq \sqrt{d_1}.
\]
Hint: Use Rayleigh quotients and Perron-Frobenius. \cite{AH:60}

Exercises

Let $A$ be an $n \times n$ matrix and let $p(x) = \det(xI - A)$ be the characteristic polynomial of $A$. Find $p(0)$ in two ways:
  1. Using the expansion $p(x) = x^n - e_1 x^{n-1} + e_2 x^{n-2} - \cdots + (-1)^n e_n$, where as usual $e_1, \ldots, e_n$ are the elementary symmetric polynomials evaluated at the roots of $p(x)$.
  2. Using the definition of $p(x)$, namely, $p(x) = \det(xI - A)$. Hint: Recall that $\det(cB) = c^n \det(B)$ for any $n \times n$ matrix $B$ and scalar $c$.
Conclude that $\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$, that is, that $\det(A)$ is the product of the eigenvalues of $A$.
By direct hand computation, find the characteristic polynomial and spectrum of the graph .
Let $G_1 = K_{1,4}$ and let $G_2 = C_4 \cup K_1$.
  1. Draw the graphs $G_1$ and $G_2$. Explain why $G_1$ and $G_2$ are not isomorphic.
  2. Find the characteristic polynomials and eigenvalues of $G_1$ and $G_2$.
  3. What can you conclude based on parts (a) and (b)?
Prove that if $G$ has at least one edge then $G$ has at least one negative and one positive eigenvalue. (Hint: Use Proposition 2.3.3 and the fact that $\lambda_1 + \lambda_2 + \cdots + \lambda_n = \operatorname{tr}(A) = 0$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $G$.)
Let $G$ be a $k$-regular graph. Prove that $|\lambda| \leq k$ for all eigenvalues $\lambda$ of $G$.
Recall that $u, v \in V(G)$ are twin vertices if $N(u)\setminus\{v\} = N(v)\setminus\{u\}$, that is, $u$ and $v$ have the same neighbors (other than themselves). Let $G$ be a graph with $V(G) = \{v_1, \ldots, v_n\}$. Prove that if $v_i$ and $v_j$ are twin vertices then $\mathbf{e}_i - \mathbf{e}_j$ is an eigenvector of $A(G)$ with eigenvalue
  1. $\lambda = 0$ if $v_i$ and $v_j$ are not adjacent, and
  2. $\lambda = -1$ if $v_i$ and $v_j$ are adjacent.
Consider the complete bipartite graph $K_{n,m}$ where $n, m \geq 1$.
  1. Show that the vector $\mathbf{z} = (\sqrt{m}\,\mathbf{1}_n, \sqrt{n}\,\mathbf{1}_m)$ is an eigenvector of $K_{n,m}$ with eigenvalue $\sqrt{nm}$.
  2. Find an eigenvector for the eigenvalue $-\sqrt{nm}$.
Let $G_1$ and $G_2$ be graphs with characteristic polynomials $p_1(x)$ and $p_2(x)$, respectively. What is the characteristic polynomial of $G_1 \cup G_2$? Show all your work. Hint: See Exercise 2.6 and use the fact that
\[
\det\begin{pmatrix} M_1 & 0 \\ 0 & M_2 \end{pmatrix} = \det(M_1)\det(M_2).
\]
Prove that if $\lambda$ is a rational eigenvalue of a graph $G$ then in fact $\lambda$ is an integer, that is, $\lambda \in \mathbb{Z}$. (Hint: Rational root theorem.)

Cospectral graphs

In this section, we consider the question of whether it is possible to uniquely determine a graph from its spectrum. To that end, we say that two graphs $G_1$ and $G_2$ are cospectral if they have the same (adjacency) eigenvalues. Our first task is to show that isomorphic graphs are cospectral. It turns out, however, that there are non-isomorphic graphs with the same eigenvalues, and we will supply some examples. We will then use our results from Section 2.2 to identify structural properties shared by cospectral graphs. To show that two isomorphic graphs are cospectral, we use the fact that isomorphic graphs have similar adjacency matrices. Recall that two matrices $A$ and $B$ are similar if there exists an invertible matrix $P$ such that
\[
B = P^{-1} A P.
\]
Similar matrices share many properties. For example:
If $A$ and $B$ are similar then the eigenvalues of $A$ and $B$ are equal.
By definition, there exists an invertible matrix $P$ such that $B = P^{-1}AP$. Let $p_A(x) = \det(xI - A)$ and let $p_B(x) = \det(xI - B)$, that is, $p_A(x)$ and $p_B(x)$ are the characteristic polynomials of $A$ and $B$, respectively. Then
\[
p_B(x) = \det(xI - P^{-1}AP) = \det\big(P^{-1}(xI - A)P\big) = \det(P^{-1})\det(xI - A)\det(P) = \det(xI - A),
\]
where we used the fact that $\det(P^{-1})\det(P) = 1$. Hence, $p_A(x) = p_B(x)$, and therefore $A$ and $B$ have the same eigenvalues.
Hence, if we can show that the adjacency matrices of isomorphic graphs $G_1$ and $G_2$ are similar, then $G_1$ and $G_2$ are cospectral. To do this, we study how permutations can be represented by matrices. For a permutation $\sigma : \{1, \ldots, n\} \to \{1, \ldots, n\}$, define the matrix $P_\sigma$ as follows. Let $\mathbf{e}_1, \ldots, \mathbf{e}_n$ denote the standard basis vectors in $\mathbb{R}^n$, thought of as column vectors. Define the matrix $P_\sigma$ by
\[
P_\sigma \mathbf{e}_i = \mathbf{e}_{\sigma(i)}.
\]
For example, for the permutation $\sigma$ with $\sigma(1) = 2$, $\sigma(2) = 3$, $\sigma(3) = 1$, the matrix $P_\sigma$ is
\[
P_\sigma = \begin{pmatrix} 0&0&1\\1&0&0\\0&1&0 \end{pmatrix}.
\]
Notice that $P_\sigma$ can also be obtained by starting with the identity matrix $I$ and replacing column $i$ of $I$ with $\mathbf{e}_{\sigma(i)}$. Therefore, written out in column form, we have
\[
P_\sigma = \begin{pmatrix} \mathbf{e}_{\sigma(1)} & \mathbf{e}_{\sigma(2)} & \cdots & \mathbf{e}_{\sigma(n)} \end{pmatrix}.
\]
The matrix $P_\sigma$ is called the permutation matrix associated to $\sigma$. The columns of any permutation matrix form an orthonormal basis of $\mathbb{R}^n$, since the columns of $P_\sigma$ are just the standard basis vectors of $\mathbb{R}^n$ (of course in a rearranged order). Hence, permutation matrices are orthogonal matrices, in other words $P_\sigma^T P_\sigma = I$. Hence, $P_\sigma^{-1} = P_\sigma^T$, and this implies that $\det(P_\sigma) = \pm 1$ for any permutation matrix $P_\sigma$. We now present the linear-algebraic version of the notion of isomorphic graphs. In the following proof, we use the fact that for any matrix $A$, the $(i,j)$ entry of $A$ can be determined by the computation
\[
A_{i,j} = \mathbf{e}_i^T A\, \mathbf{e}_j.
\]
Let $G$ be a graph with adjacency matrix $A$. If $P$ is a permutation matrix then $PAP^T$ is the adjacency matrix of some graph that is isomorphic to $G$. Conversely, for any graph $H$ that is isomorphic to $G$ there exists a permutation matrix $P$ such that $PAP^T$ is the adjacency matrix of $H$.
Let $\sigma$ be a permutation with permutation matrix $P$. Recall that we can write $P = (\mathbf{e}_{\sigma(1)}\; \mathbf{e}_{\sigma(2)}\; \cdots\; \mathbf{e}_{\sigma(n)})$ and then $P\mathbf{e}_i = \mathbf{e}_{\sigma(i)}$ for any standard basis vector $\mathbf{e}_i$. Put $B = PAP^T$ and note that $B$ is symmetric because $B^T = (PAP^T)^T = PA^TP^T = PAP^T = B$. Let $i' = \sigma(i)$ and let $j' = \sigma(j)$. Consider the entry $B_{i',j'}$:
\[
B_{i',j'} = \mathbf{e}_{i'}^T PAP^T \mathbf{e}_{j'} = (P^T\mathbf{e}_{\sigma(i)})^T A\, (P^T\mathbf{e}_{\sigma(j)}) = \mathbf{e}_i^T A\, \mathbf{e}_j = A_{i,j}.
\]
We have proved that $B_{\sigma(i),\sigma(j)} = A_{i,j}$ for all $i, j$. This proves that all the entries of $B$ are either $0$ or $1$, and the diagonal entries of $B$ are zero since they are zero for $A$. Hence, $B$ is the adjacency matrix of a graph, say $H$. Now, since $B_{\sigma(i),\sigma(j)} = A_{i,j}$, $\{v_{\sigma(i)}, v_{\sigma(j)}\}$ is an edge in $H$ if and only if $\{v_i, v_j\}$ is an edge in $G$. Hence, $G \cong H$ with isomorphism $\sigma$. Conversely, let $H$ be a graph isomorphic to $G$. Then there exists a permutation $\sigma$ such that $\{v_{\sigma(i)}, v_{\sigma(j)}\}$ is an edge in $H$ if and only if $\{v_i, v_j\}$ is an edge in $G$. Let $P$ be the permutation matrix of $\sigma$. Our computation above shows that $(PAP^T)_{\sigma(i),\sigma(j)} = A_{i,j}$. Hence, the matrix $PAP^T$ has a non-zero entry at $(\sigma(i), \sigma(j))$ if and only if $A$ has a non-zero entry at $(i,j)$. Hence, $PAP^T$ is the adjacency matrix of $H$ and the proof is complete.
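The relation $(PAP^T)_{\sigma(i),\sigma(j)} = A_{i,j}$, and the cospectrality of isomorphic graphs, can be checked directly. A sketch with $P_3$ and an arbitrary permutation:

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])          # P3
sigma = [1, 2, 0]                   # sigma(i) is the new label of vertex i

n = len(A)
P = np.zeros((n, n), dtype=int)
for i in range(n):
    P[sigma[i], i] = 1              # column i of P is e_{sigma(i)}

B = P @ A @ P.T                     # adjacency matrix of the relabelled graph
```

Then `B[sigma[i], sigma[j]] == A[i, j]` for all `i, j`, and `A` and `B` have the same eigenvalues.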
Here is a rephrasing of the previous theorem.
Let $A_1$ and $A_2$ be the adjacency matrices of two graphs $G_1$ and $G_2$ on the vertex set $\{v_1, \ldots, v_n\}$, respectively. Then $G_1$ and $G_2$ are isomorphic if and only if there exists a permutation matrix $P$ such that $A_2 = P A_1 P^T$.
Using Theorem 2.4.2 and the definition of an automorphism, the following is immediate.
Let $G$ be a graph with adjacency matrix $A$ and let $\sigma$ be a permutation with matrix representation $P$. Then $\sigma$ is an automorphism of $G$ if and only if $A = PAP^T$, or equivalently, $AP = PA$.
Combining Corollary 2.4.3 and Proposition 2.4.1 we obtain the following.
If $G_1$ and $G_2$ are isomorphic then $\operatorname{spec}(G_1) = \operatorname{spec}(G_2)$.
It is now natural to ask whether non-isomorphic graphs can have the same eigenvalues. The answer turns out to be yes, and in fact it is not too difficult to find non-isomorphic graphs that have the same eigenvalues. For example, one can verify that the graphs $K_{1,4}$ and $C_4 \cup K_1$ have the same eigenvalues but are not isomorphic, since $C_4 \cup K_1$ is disconnected and $K_{1,4}$ is connected. These two graphs are the smallest non-isomorphic cospectral graphs. The smallest connected non-isomorphic cospectral graphs are shown in Figure 2.1.
figures/smallest-cospectral-graphs.svg
Smallest connected non-isomorphic cospectral graphs
We now investigate what graph properties can be deduced from the eigenvalues of a graph; in particular, we will focus on the coefficients of the characteristic polynomial of a graph and some of the properties they reveal about the graph. Recall from Section 2.2 that for any monic polynomial $p(x)$ with roots $\lambda_1, \ldots, \lambda_n$ it holds that
\[
p(x) = x^n - e_1 x^{n-1} + e_2 x^{n-2} - \cdots + (-1)^n e_n,
\]
where $e_1, \ldots, e_n$ are the elementary symmetric polynomials evaluated at $\lambda_1, \ldots, \lambda_n$. From Section 2.2, the elementary symmetric polynomials can be written in terms of the power sums $p_1, \ldots, p_n$, and it turns out that if $p(x)$ is the characteristic polynomial of a matrix $A$ then there is a very simple relationship between the power sums and the entries of $A$. Consider first a $3 \times 3$ matrix $A$. Expanding the determinant along the first row,
\[
\det(xI - A) = (x - A_{1,1})(x - A_{2,2})(x - A_{3,3}) + r(x),
\]
where $r(x)$ is a polynomial whose degree is at most one. Expanding, we obtain
\[
\det(xI - A) = x^3 - (A_{1,1} + A_{2,2} + A_{3,3})x^2 + s(x) = x^3 - \operatorname{tr}(A)\,x^2 + s(x),
\]
where $s(x)$ is a polynomial whose degree is at most one. This shows that the coefficient of $x^2$ is $-\operatorname{tr}(A)$. In the general case, a similar argument shows that the coefficient of $x^{n-1}$ in $\det(xI - A)$ is $-\operatorname{tr}(A)$. On the other hand, if the roots of $p(x)$ are $\lambda_1, \ldots, \lambda_n$ then the coefficient of $x^{n-1}$ is $-e_1 = -(\lambda_1 + \cdots + \lambda_n)$. To summarize:
Suppose that the $n \times n$ matrix $A$ has eigenvalues $\lambda_1, \ldots, \lambda_n$. Then
\[
\operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n.
\]
In other words, the trace of $A$ is the sum of the eigenvalues of $A$.
Alternatively, we have shown that $p_1(\lambda_1, \ldots, \lambda_n) = \operatorname{tr}(A)$. We now want to relate the higher power sums $p_k$ with the entries of $A$, but first we need the following.
If $\lambda$ is an eigenvalue of $A$ then $\lambda^k$ is an eigenvalue of $A^k$.
If $A\mathbf{x} = \lambda\mathbf{x}$ then
\[
A^2\mathbf{x} = A(A\mathbf{x}) = A(\lambda\mathbf{x}) = \lambda A\mathbf{x} = \lambda^2\mathbf{x}.
\]
By induction, $A^k\mathbf{x} = \lambda^k\mathbf{x}$. Therefore, if $A$ has eigenvalues $\lambda_1, \ldots, \lambda_n$ then $A^k$ has eigenvalues $\lambda_1^k, \ldots, \lambda_n^k$.
As a consequence we obtain the following.
If $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$ then
\[
p_k(\lambda_1, \ldots, \lambda_n) = \lambda_1^k + \cdots + \lambda_n^k = \operatorname{tr}(A^k).
\]
By Lemma 2.4.7, the eigenvalues of $A^k$ are $\lambda_1^k, \ldots, \lambda_n^k$. By Proposition 2.4.6 applied to $A^k$, it holds that
\[
\operatorname{tr}(A^k) = \lambda_1^k + \cdots + \lambda_n^k = p_k(\lambda_1, \ldots, \lambda_n).
\]
Proposition 2.4.8 is important because it relates the power sums evaluated at the eigenvalues of $A$ with the entries of $A$ via the numbers $\operatorname{tr}(A^k)$. From this we can, for example, prove the following.
Let $A$ and $B$ be $n \times n$ matrices. Then $A$ and $B$ have the same eigenvalues if and only if $\operatorname{tr}(A^k) = \operatorname{tr}(B^k)$ for $k = 1, \ldots, n$.
Suppose that $A$ and $B$ have the same eigenvalues $\lambda_1, \ldots, \lambda_n$. Then by Proposition 2.4.8 we have $\operatorname{tr}(A^k) = \sum_i \lambda_i^k$ and $\operatorname{tr}(B^k) = \sum_i \lambda_i^k$, and therefore $\operatorname{tr}(A^k) = \operatorname{tr}(B^k)$ for all $k$. Conversely, if $\operatorname{tr}(A^k) = \operatorname{tr}(B^k)$ for $k = 1, \ldots, n$, then by Proposition 2.2.5 the coefficients of the characteristic polynomials of $A$ and $B$ are equal, since the coefficients of the characteristic polynomials can be written in terms of $p_1, \ldots, p_n$. Hence, $A$ and $B$ have the same characteristic polynomial and therefore the same eigenvalues.
For our purposes, Proposition 2.4.8 is the main tool to relate the spectrum of a graph with its structural properties. For example, we obtain the following.
Let $G_1$ and $G_2$ be graphs, each with $n$ vertices. Then $G_1$ and $G_2$ are cospectral if and only if, for each $k \in \{1, \ldots, n\}$, the total number of closed walks in $G_1$ of length $k$ equals the total number of closed walks in $G_2$ of length $k$.
By Theorem 2.4.9, $G_1$ and $G_2$ have the same eigenvalues if and only if $\operatorname{tr}(A_1^k) = \operatorname{tr}(A_2^k)$ for $k = 1, \ldots, n$. By Theorem 2.1.1, the number $\operatorname{tr}(A^k)$ is the total number of closed walks of length $k$, and the claim follows.
Let . Suppose that . Find the number of edges of .
In the next theorem, we relate the first few coefficients of the characteristic polynomial of a graph to some of the structural properties of the graph. The main tool used here is Proposition 2.4.8.
Let $G$ be a graph with characteristic polynomial
\[
p(x) = x^n + c_1 x^{n-1} + c_2 x^{n-2} + c_3 x^{n-3} + c_4 x^{n-4} + \cdots + c_n.
\]
If $G$ has $m$ edges, $t$ triangles, and $q$ cycles of length four, then
\[
c_1 = 0, \qquad c_2 = -m, \qquad c_3 = -2t, \qquad c_4 = \tfrac{1}{2}\Big(m(m+1) - \sum_{i=1}^n \deg(v_i)^2\Big) - 2q.
\]
Since $A$ has zeros on the diagonal, $c_1 = -e_1 = -\operatorname{tr}(A) = 0$. From the Newton identities, $2e_2 = e_1 p_1 - p_2$, and using $p_1 = e_1 = 0$, we have $e_2 = -\frac{1}{2}p_2$. Now $p_2 = \operatorname{tr}(A^2)$, and since $\operatorname{tr}(A^2) = 2m$ (Corollary 2.1.2) we conclude that $e_2 = -m$. Therefore, $c_2 = e_2 = -m$. Now consider $c_3 = -e_3$. We have from the Newton identities that
\[
3e_3 = e_2 p_1 - e_1 p_2 + p_3 = p_3.
\]
From Corollary 2.1.2, we have $p_3 = \operatorname{tr}(A^3) = 6t$, and therefore $c_3 = -e_3 = -2t$, as claimed. Finally, from Newton's identities we have
\[
4e_4 = e_3 p_1 - e_2 p_2 + e_1 p_3 - p_4 = -e_2 p_2 - p_4.
\]
Now from Corollary 2.1.2, we have $p_2 = 2m$ and $p_4 = \operatorname{tr}(A^4) = 2\sum_i \deg(v_i)^2 - 2m + 8q$, and therefore
\[
c_4 = e_4 = \tfrac{1}{4}\Big(2m^2 - 2\sum_i \deg(v_i)^2 + 2m - 8q\Big) = \tfrac{1}{2}\Big(m(m+1) - \sum_i \deg(v_i)^2\Big) - 2q,
\]
as claimed.
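These coefficient formulas can be spot-checked on $K_4$, which has $m = 6$ edges, $t = 4$ triangles, $q = 3$ four-cycles, all degrees equal to $3$, and characteristic polynomial $x^4 - 6x^2 - 8x - 3$:

```python
import numpy as np

n = 4
A = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)   # K4
deg = A.sum(axis=1)

# Coefficients of the characteristic polynomial, highest degree first.
coeffs = np.poly(np.linalg.eigvalsh(A))                 # [1, c1, c2, c3, c4]

m, t, q = 6, 4, 3
c4_pred = (m * (m + 1) - (deg**2).sum()) / 2 - 2 * q
```

Here `coeffs` is approximately `[1, 0, -6, -8, -3]`, matching $c_1 = 0$, $c_2 = -m$, $c_3 = -2t$, and $c_4 =$ `c4_pred` $= -3$.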
From Theorem 2.4.11 we now obtain a few graph characteristics that are shared by cospectral graphs. For example, if $G_1$ and $G_2$ are cospectral then they have the same characteristic polynomial. Theorem 2.4.11 then implies that $G_1$ and $G_2$ have the same number of edges and the same number of triangles.
If $G_1$ and $G_2$ are cospectral then they have the same number of edges and the same number of triangles.
A natural question to ask is whether the degree sequence is a property that must be shared by cospectral graphs. The answer in general is no. For example, in Figure 2.2 we display two non-isomorphic cospectral trees with distinct degree sequences. For trees, however, the following example shows that the sum of the squares of the degrees is equal for cospectral trees.
figures/cospectral-trees.svg
Two non-isomorphic cospectral trees with distinct degree sequences; there are many others.
Suppose that $G_1$ and $G_2$ are graphs on $n$ vertices and suppose that they have the same number of 4-cycles. Let $d_1, \ldots, d_n$ and $d_1', \ldots, d_n'$ be their respective degree sequences. If $G_1$ and $G_2$ are cospectral, show that
\[
\sum_{i=1}^n d_i^2 = \sum_{i=1}^n (d_i')^2.
\]
Solution: If $G_1$ and $G_2$ are cospectral then their characteristic polynomials are equal, and in particular the coefficients of $x^{n-4}$ in their characteristic polynomials are equal. Also, they must have the same number of edges. Since $G_1$ and $G_2$ have the same number of $4$-cycles, Theorem 2.4.11 implies that $\sum_i d_i^2 = \sum_i (d_i')^2$. This example is applicable to trees since trees have no cycles.
Suppose that $G$ is a bipartite graph with spectrum $(\lambda_1, \ldots, \lambda_n)$. Prove that if $k$ is odd then
\[
\lambda_1^k + \lambda_2^k + \cdots + \lambda_n^k = 0.
\]
Solution: Let $V = X \cup Y$ be a bipartition of $G$. Suppose that $u_0, u_1, \ldots, u_k = u_0$ is a closed walk in $G$ and without loss of generality suppose that $u_0 \in X$. Then $u_i \in Y$ for odd $i$ and $u_i \in X$ for even $i$. Since $u_k = u_0 \in X$, it follows that $k$ is necessarily even. Hence, all closed walks in $G$ have even length. By Proposition 2.4.8, $\sum_i \lambda_i^k = \operatorname{tr}(A^k)$ for all $k$, and $\operatorname{tr}(A^k)$ is the total number of closed walks of length $k$. Hence, $\sum_i \lambda_i^k = 0$ for odd $k$ and the claim is proved.
The graph has spectrum and degree sequence . Find the number of edges, triangles, and -cycles in .
Use Theorem 2.4.11 to find the characteristic polynomial of . (Hint: is bipartite.)

Exercises

Prove that if is a -regular graph with vertices and then
A -regular graph with vertices has characteristic polynomial Find the number of edges , number of triangles , and number of -cycles of .
Two trees and have degree sequence and . Could and be cospectral? Explain.
A graph has spectrum . How many closed walks of length 4 are in ? What about closed walks of length 5?
A tree has degree sequence . Find the characteristic polynomial of . (Hint: Recall that a tree is bipartite, and recall Theorem 2.3.6.)

Bipartite Graphs

The following is a spectral characterization of bipartite graphs.
Let $G$ be a graph on $n$ vertices with adjacency matrix $A$ and eigenvalues $\lambda_1, \ldots, \lambda_n$. The following are equivalent.
  1. $G$ is bipartite.
  2. $\lambda$ is a non-zero eigenvalue of $G$ if and only if $-\lambda$ is an eigenvalue of $G$.
  3. $\lambda_1^k + \lambda_2^k + \cdots + \lambda_n^k = 0$ for odd $k$.
  4. $\operatorname{tr}(A^k) = 0$ for odd $k$.
Suppose that $G$ is bipartite and let $A$ be its adjacency matrix. The vertex set can be partitioned into two disjoint parts $X$ and $Y$ such that any edge has one vertex in $X$ and one in $Y$. Therefore, by an appropriate labelling of the vertices, the adjacency matrix of $G$ takes the form
\[
A = \begin{pmatrix} 0 & B \\ B^T & 0 \end{pmatrix},
\]
where $B$ is an $|X| \times |Y|$ matrix. Suppose that $\mathbf{z} = (\mathbf{x}, \mathbf{y})$ is an eigenvector of $A$ with eigenvalue $\lambda$, where $\mathbf{x} \in \mathbb{R}^{|X|}$ and $\mathbf{y} \in \mathbb{R}^{|Y|}$. Then $A\mathbf{z} = \lambda\mathbf{z}$ implies that
\[
B\mathbf{y} = \lambda\mathbf{x}, \qquad B^T\mathbf{x} = \lambda\mathbf{y}.
\]
Consider the vector $\mathbf{w} = (\mathbf{x}, -\mathbf{y})$. Then
\[
A\mathbf{w} = \begin{pmatrix} -B\mathbf{y} \\ B^T\mathbf{x} \end{pmatrix} = \begin{pmatrix} -\lambda\mathbf{x} \\ \lambda\mathbf{y} \end{pmatrix} = -\lambda\mathbf{w}.
\]
Hence, $\mathbf{w}$ is an eigenvector of $A$ with eigenvalue $-\lambda$. This proves (i) $\Rightarrow$ (ii). Now assume that (ii) holds. Then there are an even number of non-zero eigenvalues, say $\pm\lambda_1, \ldots, \pm\lambda_s$, where $2s \leq n$, and $n - 2s$ is the number of zero eigenvalues. If $k$ is odd then
\[
\sum_{i=1}^n \lambda_i^k = \sum_{i=1}^s \big(\lambda_i^k + (-\lambda_i)^k\big) = 0.
\]
This proves (ii) $\Rightarrow$ (iii). Now assume that (iii) holds. Since $\operatorname{tr}(A^k) = \sum_{i=1}^n \lambda_i^k$, it follows that $\operatorname{tr}(A^k) = 0$ for odd $k$. This proves (iii) $\Rightarrow$ (iv). Now assume (iv) holds. It is known that the diagonal entries of $A^k$ count the walks of length $k$ starting and ending at each vertex. Hence, the total number of cycles of length $k$ in $G$ is bounded by $\operatorname{tr}(A^k)$. By assumption, if $k$ is odd then $\operatorname{tr}(A^k) = 0$, and thus there are no cycles of odd length in $G$. This implies that $G$ is bipartite and proves (iv) $\Rightarrow$ (i). This ends the proof.
From the previous theorem, it follows that if $\pm\lambda_1, \ldots, \pm\lambda_s$ are the non-zero eigenvalues of a bipartite graph $G$ and $G$ has the zero eigenvalue with multiplicity $n - 2s$, then the characteristic polynomial of $G$ is
\[
p(x) = x^{n-2s}(x^2 - \lambda_1^2)\cdots(x^2 - \lambda_s^2).
\]
From this it follows that at least half of the coefficients in the expansion of $p(x)$ are zero, namely all the odd coefficients. For example, if $n = 5$ and the non-zero eigenvalues are $\pm 1$ and $\pm 2$, then
\[
p(x) = x(x^2 - 1)(x^2 - 4) = x^5 - 5x^3 + 4x,
\]
so that $c_1 = c_3 = c_5 = 0$. Another way to see this is using the Newton identities.
Let $G$ be a bipartite graph on $n$ vertices and let $p(x)$ be the characteristic polynomial of $G$:
\[
p(x) = x^n + c_1 x^{n-1} + c_2 x^{n-2} + \cdots + c_n.
\]
Then $c_k = 0$ for odd $k$.
Using the Newton identities, we have
\[
k\, e_k = \sum_{i=1}^{k} (-1)^{i-1} e_{k-i}\, p_i
\]
for $k = 1, \ldots, n$. If $G$ is bipartite then $p_i = 0$ for all odd $i$. Let $k$ be odd and assume by induction that $e_1, e_3, \ldots, e_{k-2}$ are all zero. Then the only terms $e_{k-i}\, p_i$ that survive in the expression for $k e_k$ are those where $i$ is even. If $i$ is even then necessarily $k - i$ is odd, and thus $e_{k-i} = 0$. Hence $e_k = 0$, and since $c_k = (-1)^k e_k$, we obtain $c_k = 0$, as claimed.
As a consequence of Corollary 2.5.2, if $G$ is a bipartite graph on $n$ vertices and $n$ is odd, then $c_n = 0$, and thus $p(0) = 0$ and $0$ is an eigenvalue of $G$.