
5. Bruhat order

I was planning to include this as part of an earlier post, but I figured this was important enough to warrant a new post. Bruhat order is a partial ordering on a Weyl group that arises from Schubert calculus. The main result required here is Theorem 5.3, but since this is its own post I added some additional material that we may explore in further detail later. We have two main goals here: one is to develop the theory sufficiently to state and prove the main result in my dissertation, and the other is to present some of my work that is unpublished and probably will only be published here. It's going to take a few more posts to get to my dissertation, after which we will see where it goes. There is an unpublished result of mine based on the root systems that has already been cited by Sara Billey in a talk and that I hope to get to here.

Definition 1: Let $(R, B)$ be a root system. If $u, v \in W(R)$, define $u \lessdot v$ if there exists $y \in I(v)$ such that $u s_y = v$; that is, $u^{-1} v$ is a reflection and $\ell(u) < \ell(v)$. Then define a partial ordering $\leq$, called Bruhat order, to be the reflexive, transitive closure of $\lessdot$.
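
This is easy to experiment with in small examples. Here is a short Python sketch of mine (the permutation model of type $A$ and names such as ell and lessdot are assumptions of this illustration, not anything from earlier posts): it realizes the Weyl group of type $A_2$ as the symmetric group $S_3$, with transpositions as the reflections and the inversion count as the length, and prints every pair with $u \lessdot v$ using the equivalent description above.

    # Type A sketch (assumptions of this illustration): W = S_3 acting on {0,1,2},
    # reflections are transpositions, ell(w) = number of inversions.
    from itertools import permutations

    def ell(w): return sum(w[i] > w[j] for i in range(len(w)) for j in range(i + 1, len(w)))
    def compose(u, v): return tuple(u[v[i]] for i in range(len(u)))
    def inverse(u): return tuple(sorted(range(len(u)), key=lambda i: u[i]))
    def is_reflection(w): return sum(w[i] != i for i in range(len(w))) == 2

    def lessdot(u, v):
        # u lessdot v: u^{-1} v is a reflection and ell(u) < ell(v)
        return is_reflection(compose(inverse(u), v)) and ell(u) < ell(v)

    S3 = list(permutations(range(3)))
    for u in S3:
        for v in S3:
            if lessdot(u, v):
                print(u, "<.", v)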

This definition of Bruhat order is not always the easiest to work with. The main theorem in this post gives another characterization of Bruhat order in terms of subwords.

Definition 2: Given a word $(s_{x_1}, \ldots, s_{x_n})$, we say that a sequence $(e_1, \ldots, e_n)$ is a subword of this word if $e_i \in \{1, s_{x_i}\}$ for all $i$. A subword has an associated word obtained by deleting all of the $1$'s; we say the subword is reduced if the corresponding word is reduced, and we say that the subword is for the element $e_1 e_2 \cdots e_n$. We can iterate this process; if we have two subwords $u' = (e_1, \ldots, e_n)$ and $u'' = (f_1, \ldots, f_n)$ of a word $u$, we say that $u'$ is a subword of $u''$ if $e_i \in \{1, f_i\}$ for all $i$.
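
For example, write $s_1$ and $s_2$ for the two simple reflections of a root system of type $A_2$, so that $s_1 s_2 s_1 = s_2 s_1 s_2$ (this notation is just for the illustration). The word $(s_1, s_2, s_1)$ has $(s_1, 1, s_1)$ as a subword; its associated word $(s_1, s_1)$ is not reduced, and this subword is for the identity. On the other hand, $(1, s_2, s_1)$ is a reduced subword for $s_2 s_1$, and $(1, 1, s_1)$ is in turn a reduced subword of $(1, s_2, s_1)$ for $s_1$.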

We will need the following lemma (or rather the corollary afterwards).

Lemma 5.1 (The deletion property): Let $(R, B)$ be a root system, let $w \in W(R)$, and let $(s_{x_1}, \ldots, s_{x_n})$ be a word for $w$. If this word is not reduced, then there exist indices $i < j$ such that $(s_{x_1}, \ldots, \widehat{s_{x_i}}, \ldots, \widehat{s_{x_j}}, \ldots, s_{x_n})$ is a word for $w$.

Proof: Let $j$ be the minimal index such that $(s_{x_1}, \ldots, s_{x_j})$ is not a reduced word. Let $v = s_{x_1} \cdots s_{x_{j-1}}$. Then $\ell(v s_{x_j}) < \ell(v)$, so there exists an index $i$ such that $(s_{x_1}, \ldots, \widehat{s_{x_i}}, \ldots, s_{x_{j-1}})$ is a word for $v s_{x_j}$. Deleting both $s_{x_i}$ and $s_{x_j}$ from the original word therefore leaves a word for $(v s_{x_j}) s_{x_{j+1}} \cdots s_{x_n} = w$, so these are the $i$ and $j$ we seek in the statement of the result.
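
As a concrete illustration (again a sketch of mine in a type $A$ permutation model, with the same ad hoc conventions as the sketch above), the following brute-force search takes a non-reduced word in the simple reflections of $S_3$ and finds a pair of indices whose deletion leaves a word for the same element, as the lemma promises.

    # Deletion property in S_3 (type A sketch: permutations of {0,1,2},
    # ell = inversion count, word products taken left to right).
    from itertools import combinations

    def ell(w): return sum(w[i] > w[j] for i in range(len(w)) for j in range(i + 1, len(w)))
    def compose(u, v): return tuple(u[v[i]] for i in range(len(u)))

    def product(word):
        w = (0, 1, 2)
        for s in word:
            w = compose(w, s)
        return w

    s0, s1 = (1, 0, 2), (0, 2, 1)        # the two simple reflections of S_3

    word = [s0, s1, s1, s0, s1]          # not reduced: its element has length 1 < 5
    w = product(word)
    assert ell(w) < len(word)

    for i, j in combinations(range(len(word)), 2):
        shorter = [s for k, s in enumerate(word) if k not in (i, j)]
        if product(shorter) == w:
            print("deleting positions", i, "and", j, "leaves a word for", w)
            break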

The proof of the following is easy and we leave it to the reader.

Corollary 5.2: Let $(R, B)$ be a root system and let $w \in W(R)$. Given any word $(s_{x_1}, \ldots, s_{x_n})$ for $w$, there exists a reduced subword for $w$.

Theorem 5.3: Let $(R, B)$ be a root system and let $u, v \in W(R)$. Then $u \leq v$ if and only if for any word $(s_{x_1}, \ldots, s_{x_n})$ for $v$ there exists a reduced subword for $u$.

Proof: We prove that if $u \leq v$ then there exists such a reduced subword by induction on $\ell(v) - \ell(u)$, the result being true by the corollary when $\ell(u) = \ell(v)$ (in which case $u = v$). Otherwise, by definition there exists a $u'$ with $u \leq u' \lessdot v$. By Theorem 2.3 we may delete one element from any word for $v$ to obtain a word for $u'$, or equivalently set the element at some single index to $1$ to obtain a subword for $u'$. By the induction hypothesis, this subword for $u'$ contains a reduced subword for $u$. Thus there is a reduced subword for $u$ of the word for $v$, and one implication follows.

Suppose now that there exists such a subword for $u$ in any word for $v$. Let $\mathbf{v}$ be a reduced word for $v$ containing a reduced subword $\mathbf{u}$ for $u$; we prove by induction on $\ell(v)$ that $u \leq v$. When $\ell(v) = 0$ we must have that $\ell(u) = 0$, so the result holds. Write $\mathbf{v} = (s_{x_1}, \ldots, s_{x_n})$, so that $s_{x_n}$ is the last simple reflection in $\mathbf{v}$ and, since $\mathbf{v}$ is reduced, $\ell(v) = n$. Let $\hat{\mathbf{v}}$ be the subword of $\mathbf{v}$ such that $\hat{v}_n = 1$ and $\hat{v}_i = v_i$ for $i < n$; note that this is a subword for $v s_{x_n}$. Similarly, let $\hat{\mathbf{u}}$ be the subword of $\mathbf{u}$ such that $\hat{u}_n = 1$ and $\hat{u}_i = u_i$ for $i < n$. Suppose first that $u_n \neq s_{x_n}$. Then $\mathbf{u} = \hat{\mathbf{u}}$ is a subword of $\hat{\mathbf{v}}$, so $u \leq v s_{x_n} \leq v$ by the induction hypothesis and the result follows.

If $u_n = s_{x_n}$, then since $\hat{\mathbf{u}}$ is a subword for $u s_{x_n}$ and is a subword of $\hat{\mathbf{v}}$, we have by the induction hypothesis that $u s_{x_n} \leq v s_{x_n}$. Thus there exists some $u'$ such that $u s_{x_n} \leq u' \leq v s_{x_n}$ and such that there is a reduced subword $\mathbf{u}'$ of $\hat{\mathbf{v}}$ for $u'$; assume without loss of generality that $\hat{\mathbf{u}}$ is a subword of $\mathbf{u}'$. We split into two cases, depending on whether $x_n \in I(u')$.

If $x_n \notin I(u')$, then $\ell(u' s_{x_n}) > \ell(u')$. Let $\mathbf{u}''$ be the subword of $\mathbf{v}$ for $u' s_{x_n}$ such that $u''_i = u'_i$ for $i < n$ and $u''_n = s_{x_n}$. Then $\mathbf{u}$ is a subword of $\mathbf{u}''$, and $u' s_{x_n} \leq v$ (since if $u' s_y = v s_{x_n}$ then $(u' s_{x_n}) s_{s_{x_n}(y)} = v$), so by the induction hypothesis $u \leq u' s_{x_n} \leq v$ and we are done.

If $x_n \in I(u')$, then we may assume that $u'_n = s_{x_n}$: if this were not the case, then since $\ell(u' s_{x_n}) < \ell(u')$ we could delete an element of $\mathbf{u}'$, setting say $u'_i = 1$, to obtain a word for $u' s_{x_n}$, and then set $u'_n = s_{x_n}$ to obtain another subword for $u'$ with the required last entry. Now $\mathbf{u}'$ contains a reduced subword for $u s_{x_n}$ (by the implication already proved, since $u s_{x_n} \leq u'$) whose last element is not $s_{x_n}$, because no reduced word for $u s_{x_n}$ ends in $s_{x_n}$; changing its $n$-th entry from $1$ to $u'_n = s_{x_n}$ then gives a reduced subword $\tilde{\mathbf{u}}$ of $\mathbf{u}'$ for $u$, so that $u \leq u' \leq v s_{x_n} \leq v$ by the induction hypothesis and we are done. The result follows by induction.
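
The theorem is easy to confirm by brute force in a small case. The sketch below (mine again, with the same hypothetical type $A$ conventions as before) computes Bruhat order on $S_3$ as the reflexive, transitive closure of the relation of Definition 1 and checks, for every pair $u, v$, that it agrees with the existence of a reduced subword for $u$ inside one chosen reduced word for $v$.

    # Checking Theorem 5.3 on S_3 (type A sketch: permutations, transpositions
    # as reflections, inversions as length).
    from itertools import permutations, product as cartesian

    N = 3
    def ell(w): return sum(w[i] > w[j] for i in range(N) for j in range(i + 1, N))
    def compose(u, v): return tuple(u[v[i]] for i in range(N))

    identity = tuple(range(N))
    simples = [identity[:i] + (i + 1, i) + identity[i + 2:] for i in range(N - 1)]
    reflections = [p for p in permutations(range(N)) if sum(p[i] != i for i in range(N)) == 2]

    # one reduced word for each element, built by a breadth-first search
    reduced_word = {identity: []}
    frontier = [identity]
    while frontier:
        new = []
        for w in frontier:
            for s in simples:
                ws = compose(w, s)
                if ell(ws) > ell(w) and ws not in reduced_word:
                    reduced_word[ws] = reduced_word[w] + [s]
                    new.append(ws)
        frontier = new

    def bruhat_leq(u, v):
        # u <= v iff u == v, or u <= v s_y for some reflection s_y with ell(v s_y) < ell(v)
        if u == v:
            return True
        for t in reflections:
            vt = compose(v, t)
            if ell(vt) < ell(v) and bruhat_leq(u, vt):
                return True
        return False

    def has_reduced_subword(u, v):
        word = reduced_word[v]
        for keep in cartesian([0, 1], repeat=len(word)):
            letters = [s for s, k in zip(word, keep) if k]
            w = identity
            for s in letters:
                w = compose(w, s)
            if w == u and len(letters) == ell(u):    # the subword is reduced and is for u
                return True
        return False

    elements = list(reduced_word)
    assert all(bruhat_leq(u, v) == has_reduced_subword(u, v)
               for u in elements for v in elements)
    print("the subword criterion agrees with Definition 1 on S_3")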

Now for the additional material, where we relate Bruhat order to subsystems. The following theorem, which is an easy consequence of the theory built so far in this blog, is highly nontrivial in the setting of Coxeter systems. It was the main result of (Dyer, Matthew. On the "Bruhat graph" of a Coxeter system, Compositio Mathematica 78.2 (1991): 185-191). To start we will need a lemma, followed by a definition, then we may move on to the theorem. Recall that for a root system $(R, B)$ and a subsystem $U$, we defined in Theorem-Definition 4.5 a function $\phi_U \colon W(R) \to W(U)$ that sends an element $w \in W(R)$ to the element of $W(U)$ whose inversion set in $U$ is the intersection $I(w) \cap U$.

Lemma 5.4: Let $(R, B)$ be a root system, let $U \subseteq R$ be a subsystem, and let $w \in W(R)$. Then $w \in W(U)$ if and only if $\phi_U(w) = w$.
Proof: By definition, the codomain of $\phi_U$ is $W(U)$, hence if $\phi_U(w) = w$ then $w \in W(U)$. Conversely, suppose $w \in W(U)$ and let $\phi_U(w) = w' \in W(U)$. Then $I_U(w) = I_U(w')$, and since $w, w' \in W(U)$ it follows by Proposition 3.5 that $w = w'$. The result follows.
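
Here is a small mechanical check of the lemma (a sketch of mine, not code from an earlier post). It uses a type $A$ permutation model of $R^A_3$ with $W(R) = S_4$, the rank-two subsystem $U = \{\pm(t_2 - t_3), \pm(t_1 - t_4)\}$ that also appears in the example at the end of this post, and the convention, which is an assumption of the sketch, that $t_i - t_j \in I(w)$ for $i < j$ exactly when $w(i) > w(j)$; the names phi_U and inversions_in_U are ad hoc.

    # phi_U and Lemma 5.4 on S_4 with U = {±(t2 - t3), ±(t1 - t4)}.
    # Convention (an assumption): t_i - t_j lies in I(w), for i < j, iff w(i) > w(j).
    # Coordinates are 0-indexed in the code.
    from itertools import permutations

    N = 4
    def compose(u, v): return tuple(u[v[i]] for i in range(N))

    x, y = (1, 2), (0, 3)                # the positive roots t2 - t3 and t1 - t4 of U
    s_x = (0, 2, 1, 3)                   # the reflection s_x swaps the coordinates of x
    s_y = (3, 1, 2, 0)                   # the reflection s_y swaps the coordinates of y
    W_U = {(0, 1, 2, 3), s_x, s_y, compose(s_x, s_y)}

    def inversions_in_U(w):
        # I(w) intersected with U, recorded as the set of positive roots of U inverted by w
        return frozenset(r for r in (x, y) if w[r[0]] > w[r[1]])

    # the element of W(U) with a prescribed inversion set in U
    by_inversions = {inversions_in_U(g): g for g in W_U}

    def phi_U(w):
        return by_inversions[inversions_in_U(w)]

    for w in permutations(range(N)):
        assert (phi_U(w) == w) == (w in W_U)     # Lemma 5.4
    print("phi_U(w) = w exactly when w lies in W(U)")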

Definition 3: Let $(R, B)$ be a root system and let $U$ be a subsystem. If $u, v \in W(R)$ are such that $\ell(u) < \ell(v)$ and there exists a root $y \in U$ such that $u s_y = v$, then we say that $u \lessdot_U v$. Define $\leq_U$ to be the reflexive, transitive closure of $\lessdot_U$. Note that Bruhat order on $W(U)$ is the restriction of the relation $\leq_U$.

The Bruhat graph of a Weyl group $W(R)$ is the directed graph on $W(R)$ with an edge $u \to v$ whenever $u \lessdot v$, where we may consider the edge $u \to v$ as being labeled with the reflection $u^{-1} v$. The theorem below says that, for any coset $wW(U) \subseteq W(R)$ with $w \in W(R)$, the Bruhat graph of $wW(U)$ (that is, the subgraph of the Bruhat graph of $W(R)$ spanned by $wW(U)$) is isomorphic as a labeled graph to the Bruhat graph of $W(U)$ and is an induced subgraph of the Bruhat graph of $W(R)$ (so there are no extra edges that do not come from reflections in $W(U)$).

Theorem 5.5: Let $(R, B)$ be a root system, let $U$ be a subsystem, and let $w \in W(R)$. If $u, v$ are both contained in the coset $wW(U)$, then $u \lessdot v$ if and only if $u \lessdot_U v$.

Proof: If $u \lessdot_U v$, then clearly $u \lessdot v$. Conversely, if $u \lessdot v$, then there exists a reflection $t \in W(R)$ such that $ut = v$. Since $u, v$ are in the same left coset of $W(U)$ we must have that $t \in W(U)$. We therefore need to prove that if $t \in W(R)$ is a reflection and $t \in W(U)$, then there is a root $y \in U$ such that $t = s_y$. The contrapositive is that if $t = s_y$ for some $y \in R \setminus U$, then $t \notin W(U)$. To see this, let $g \in W(R)$ be an element such that $g(y) \in B$. Then $s_y \in W(U)$ if and only if $s_{g(y)} \in W(g(U))$. For all $z \in g(U) \cap R^+(B)$ we have $z \neq g(y)$ (since $y \notin U$), so $s_{g(y)}(z) \in R^+(B)$; hence $\phi_{g(U)}(s_{g(y)}) = 1 \neq s_{g(y)}$, and so $s_{g(y)} \notin W(g(U))$ by Lemma 5.4 and we are done.
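
In the same spirit as the earlier sketches, the theorem can be checked mechanically in a small case. The following sketch of mine uses the permutation model of $R^A_3$ with $W(R) = S_4$ and the subsystem $U = \{\pm(t_2 - t_3), \pm(t_1 - t_4)\}$ from the example below, and verifies that within every coset $wW(U)$ the relation of Definition 1 coincides with the relation of Definition 3.

    # Checking Theorem 5.5 on S_4 for U = {±(t2 - t3), ±(t1 - t4)}
    # (type A sketch: permutations of {0,1,2,3}, ell = inversion count).
    from itertools import permutations

    N = 4
    def ell(w): return sum(w[i] > w[j] for i in range(N) for j in range(i + 1, N))
    def compose(u, v): return tuple(u[v[i]] for i in range(N))
    def inverse(u): return tuple(sorted(range(N), key=lambda i: u[i]))
    def is_reflection(w): return sum(w[i] != i for i in range(N)) == 2

    s_x = (0, 2, 1, 3)                   # the reflection s_x, x = t2 - t3
    s_y = (3, 1, 2, 0)                   # the reflection s_y, y = t1 - t4
    W_U = {(0, 1, 2, 3), s_x, s_y, compose(s_x, s_y)}

    def lessdot(u, v):
        # the relation of Definition 1 in W(R) = S_4
        return is_reflection(compose(inverse(u), v)) and ell(u) < ell(v)

    def lessdot_U(u, v):
        # the relation of Definition 3 for this U
        return compose(inverse(u), v) in (s_x, s_y) and ell(u) < ell(v)

    for w in permutations(range(N)):
        coset = [compose(w, g) for g in W_U]
        for u in coset:
            for v in coset:
                assert lessdot(u, v) == lessdot_U(u, v)
    print("within every coset wW(U), the two relations agree")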

This implies that if $u, v \in wW(U)$ and $\phi_U(u) \leq_U \phi_U(v)$, then $u \leq v$. However, this does not in general imply that if $u, v \in wW(U)$ and $u \leq v$, then $\phi_U(u) \leq_U \phi_U(v)$. For an example, consider the root system $R^A_3$ defined in post 2. Let $x = t_2 - t_3$ and let $y = t_1 - t_4$. Then $U = \{\pm x, \pm y\}$ is a subsystem of $R^A_3$ since $s_x(y) = y$ and $s_y(x) = x$, and we have $s_x \leq s_y$. However, it is not the case that $s_x \leq_U s_y$, because $\ell_U(s_x) = \ell_U(s_y)$ and distinct elements of equal length are incomparable in Bruhat order on $W(U)$.
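
A quick computation confirms this example in the same permutation model (again a sketch of mine; here $s_x$ and $s_y$ are the transpositions exchanging the coordinates of $x$ and of $y$): $s_x \leq s_y$ in Bruhat order on $S_4$, but no chain as in Definition 3 leads from $s_x$ to $s_y$ when only the reflections $s_x$ and $s_y$ are allowed.

    # s_x <= s_y in Bruhat order on S_4, but not s_x <=_U s_y.
    from itertools import permutations

    N = 4
    def ell(w): return sum(w[i] > w[j] for i in range(N) for j in range(i + 1, N))
    def compose(u, v): return tuple(u[v[i]] for i in range(N))
    def is_reflection(w): return sum(w[i] != i for i in range(N)) == 2

    reflections = [p for p in permutations(range(N)) if is_reflection(p)]
    s_x = (0, 2, 1, 3)                   # reflection in x = t2 - t3
    s_y = (3, 1, 2, 0)                   # reflection in y = t1 - t4

    def leq(u, v, allowed):
        # reflexive, transitive closure of: u -> u*t for t in `allowed`, with ell increasing
        if u == v:
            return True
        for t in allowed:
            ut = compose(u, t)
            if ell(ut) > ell(u) and leq(ut, v, allowed):
                return True
        return False

    print("s_x <= s_y   :", leq(s_x, s_y, reflections))    # True
    print("s_x <=_U s_y :", leq(s_x, s_y, [s_x, s_y]))     # False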
