
3. TENSOR NOTATION AND OPERATIONS

Summation Convention As you may have noticed in the last section, tensor equations use a lot of summation signs—it would be a simplification if we could get along without them. Using the summation convention (or Einstein summation convention), we omit the summation signs in equations like (2.11) to (2.14), and (2.16) to (2.18), and simply understand a summation over any index which appears exactly twice in one term. Here are some examples using summation convention (in three dimensions).

Examples.

(3.1)
$a_{ii}$ or $a_{jj}$ or $a_{\beta\beta}$, etc., means $a_{11}+a_{22}+a_{33}$;

$x_i x_i$ or $x_\alpha x_\alpha$, etc., means $x_1^2+x_2^2+x_3^2$;

$a_{ij}b_{jk}$ means $a_{i1}b_{1k}+a_{i2}b_{2k}+a_{i3}b_{3k}$;

$\dfrac{\partial u}{\partial x_j}\dfrac{\partial x_j}{\partial x_i'}$ means $\displaystyle\sum_{j=1}^{3}\dfrac{\partial u}{\partial x_j}\dfrac{\partial x_j}{\partial x_i'}$;

$T_{ijkl}S_{ij}V_kU_l$ means $\displaystyle\sum_{i}\sum_{j}\sum_{k}\sum_{l}T_{ijkl}S_{ij}V_kU_l$;

and so on. The repeated index (which is summed over) is called a dummy index; like an integration variable in a definite integral, it does not matter what letter is used for it. An index which is not repeated is called a free index.

When summation convention is being used, we are not warned by a summation sign what letters to sum over; we just have to inspect the indices and see which ones appear twice. In writing terms using the summation convention, we must be careful not to re-use an index. For example, if we already have two i subscripts indicating a sum over i, and we want another sum in the same term, we must use a different dummy index, say j or m or α, etc. In the following discussion we will use summation convention; watch carefully for the repeated dummy indices.
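
If it helps to see the convention in action numerically, NumPy's einsum follows the same rule: a repeated letter in its subscript string is summed over. The short sketch below (the array values are arbitrary samples, chosen only for illustration) evaluates the first three expressions of (3.1).

```python
# A minimal numerical sketch of the summation convention using NumPy's einsum,
# whose subscript strings follow the same rule: a repeated index is summed.
import numpy as np

a = np.arange(9.0).reshape(3, 3)          # a_ij
b = np.arange(9.0, 18.0).reshape(3, 3)    # b_jk
x = np.array([1.0, 2.0, 3.0])             # x_i

trace_a = np.einsum('ii->', a)            # a_ii = a_11 + a_22 + a_33
norm_sq = np.einsum('i,i->', x, x)        # x_i x_i = x_1^2 + x_2^2 + x_3^2
ab      = np.einsum('ij,jk->ik', a, b)    # a_ij b_jk, summed over the dummy index j

print(trace_a, np.trace(a))               # same value
print(norm_sq, x @ x)                     # same value
print(np.allclose(ab, a @ b))             # True
```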

Contraction   The transformation equations for a 4th-rank tensor are [see (2.18)]

(3.2)   $T'_{\alpha\beta\gamma\delta} = a_{\alpha i}\,a_{\beta j}\,a_{\gamma k}\,a_{\delta l}\,T_{ijkl}.$

(Note the sums over $i$, $j$, $k$, and $l$.)


Example 1. Now suppose we put $\delta = \beta$ which, by summation convention, means a further sum over $\beta$. Then we have

(3.3)   $T'_{\alpha\beta\gamma\beta} = a_{\alpha i}\,a_{\beta j}\,a_{\gamma k}\,a_{\beta l}\,T_{ijkl}.$

Now $a_{\beta j}a_{\beta l}$ (summed over $\beta$) is the dot product of columns $j$ and $l$ of the rotation matrix A [see Problem 2.3]. This dot product is 1 if $j = l$, and 0 otherwise. In other words, $a_{\beta j}a_{\beta l} = \delta_{jl}$ [see Chapter 3, equation (9.4)]. Then $\delta_{jl}T_{ijkl}$ becomes $T_{ijkj}$ since $\delta_{jl}$ is zero unless $j$ and $l$ are equal. (The repeated dummy index could be either $j$ or $l$ or anything else except the dummy indices $i$ and $k$, which are already used, and the free indices $\alpha$ and $\gamma$.) Thus we have

(3.4)   $T'_{\alpha\beta\gamma\beta} = a_{\alpha i}\,a_{\gamma k}\,\delta_{jl}\,T_{ijkl} = a_{\alpha i}\,a_{\gamma k}\,T_{ijkj}.$

Now (3.4) says that $T_{ijkj}$ are the components of a 2nd-rank tensor since there are two free indices and two $a$ factors are required [compare equation (2.14)]. This process of setting two indices of a tensor equal to each other and then summing is called contraction. Contraction reduces the rank of a tensor by 2. Note that in (3.2) we started with a 4th-rank tensor and after contracting we have a tensor of rank 2 in (3.4).
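
As a rough numerical check of Example 1 (using a sample rotation about the z axis and a randomly generated 4th-rank array, both purely illustrative), one can verify that $a_{\beta j}a_{\beta l} = \delta_{jl}$ and that contracting (3.2) with $\delta = \beta$ reproduces (3.4).

```python
# Numerical check of the contraction in Example 1, with illustrative arrays:
# the columns of a rotation matrix are orthonormal, so a_bj a_bl = delta_jl,
# and setting delta = beta in (3.2) reduces T' to a_ai a_gk T_ijkj as in (3.4).
import numpy as np

theta = 0.5
a = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])   # rotation matrix a_ij

print(np.allclose(np.einsum('bj,bl->jl', a, a), np.eye(3)))  # a_bj a_bl = delta_jl

rng = np.random.default_rng(5)
T = rng.standard_normal((3, 3, 3, 3))                  # T_ijkl

left  = np.einsum('ai,bj,gk,bl,ijkl->ag', a, a, a, a, T)  # T'_abgb, (3.2) with delta = beta
right = np.einsum('ai,gk,ijkj->ag', a, a, T)              # a_ai a_gk T_ijkj, equation (3.4)
print(np.allclose(left, right))                           # True
```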

It is interesting to observe that the dot (or scalar or inner) product of two vectors in elementary vector analysis is an example of contraction. In Section 2 we showed that the direct product of two vectors [see (2.17)] is a 2nd-rank tensor. If we contract $U_iV_j$ to get $U_iV_i$, we have the dot product of the vectors U and V, which is a scalar. Again note that contraction has reduced the rank of a tensor by 2 (a scalar is a tensor of rank zero).
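
A minimal sketch of contraction in code, again with illustrative random arrays: contracting two indices of a 4th-rank tensor leaves a 2nd-rank tensor, and contracting the direct product $U_iV_j$ gives the ordinary dot product.

```python
# Contraction reduces rank by 2: rank 4 -> rank 2, and rank 2 -> rank 0 (scalar).
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3, 3, 3))     # T_ijkl
U = rng.standard_normal(3)
V = rng.standard_normal(3)

T_contracted = np.einsum('ijkj->ik', T)   # T_ijkj: set l = j and sum
outer = np.einsum('i,j->ij', U, V)        # direct product U_i V_j, a 2nd-rank tensor
dot   = np.einsum('ii->', outer)          # contraction U_i V_i, a scalar

print(T_contracted.shape)                 # (3, 3)
print(np.isclose(dot, U @ V))             # True
```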

Tensors and Matrices   The components of first or second rank tensors can be displayed as matrices and this is often useful. We have frequently (see Chapter 3) written the components of a vector (1st-rank tensor) as a column or row matrix. The components $T_{ij}$ of a 2nd-rank tensor can be written as the elements of a square matrix (see the inertia matrix, Section 4). Then note that in the tensor equation $U_i = T_{ij}V_j$, the contraction (sum on $j$) corresponds exactly to row times column multiplication for matrices.
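
The correspondence between the contraction $U_i = T_{ij}V_j$ and row-times-column multiplication can be checked directly; the sketch below uses illustrative random arrays.

```python
# The index form U_i = T_ij V_j and the matrix form U = T V give the same result.
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))           # 2nd-rank tensor as a square matrix
V = rng.standard_normal(3)                # vector as a column

U_index  = np.einsum('ij,j->i', T, V)     # tensor form, sum on the dummy index j
U_matrix = T @ V                          # matrix form, row times column

print(np.allclose(U_index, U_matrix))     # True
```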

Symmetric and Antisymmetric Tensors   A 2nd-rank tensor $T_{ij}$ is called symmetric if $T_{ij} = T_{ji}$, and antisymmetric (or skew symmetric) if $T_{ij} = -T_{ji}$. Note that these agree with the corresponding definitions for matrices [Chapter 3, (9.2)]. Any 2nd-rank tensor can be written as a sum of a symmetric tensor and an antisymmetric tensor as in (3.5) (Problem 13).

(3.5)   $T_{ij} = \tfrac{1}{2}(T_{ij}+T_{ji}) + \tfrac{1}{2}(T_{ij}-T_{ji}).$
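
A short sketch of the decomposition (3.5) with an illustrative random matrix: the two parentheses give the symmetric and antisymmetric parts, and they add back to $T_{ij}$.

```python
# Decomposition (3.5): symmetric part plus antisymmetric part recovers T.
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 3))

S = 0.5 * (T + T.T)                       # symmetric part:  S_ij =  S_ji
A = 0.5 * (T - T.T)                       # antisymmetric part: A_ij = -A_ji

print(np.allclose(S, S.T), np.allclose(A, -A.T), np.allclose(S + A, T))
```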

For tensors of higher rank, similar terminology is used. If an exchange of two indices leaves the tensor component unchanged, we say that the tensor is symmetric with respect to those two indices. If an exchange of two indices changes the sign of the tensor component, we say that the tensor is antisymmetric with respect to those two indices.

Combining Tensors   The sum or difference (in fact any linear combination) of two tensors of rank $n$ is a tensor of rank $n$ (Problems 6 and 7). For example, $T_{ij} + R_{ijk}V_k$ is a tensor of rank 2. Note the summation convention and the contraction which makes $R_{ijk}V_k$ also a tensor of rank 2 so that we can add it to $T_{ij}$. (Addition is not defined for tensors of different ranks.)
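
For instance (with illustrative random arrays), contracting $R_{ijk}$ with $V_k$ produces a rank-2 array of the same shape as $T_{ij}$, so the two can be added.

```python
# Contracting R_ijk with V_k leaves a 2nd-rank tensor, which can be added to T_ij.
import numpy as np

rng = np.random.default_rng(3)
T = rng.standard_normal((3, 3))               # T_ij
R = rng.standard_normal((3, 3, 3))            # R_ijk
V = rng.standard_normal(3)                    # V_k

combined = T + np.einsum('ijk,k->ij', R, V)   # T_ij + R_ijk V_k, rank 2
print(combined.shape)                         # (3, 3)
```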

Quotient Rule   Let us suppose we know that, for every vector $V_j$, the quantities $U_i = T_{ij}V_j$ are the components of a non-zero vector, and that this holds true in all rotated coordinate systems. Then we can prove that the quantities $T_{ij}$ are the components of a 2nd-rank tensor. This is an example of the quotient rule.

Example 2. To prove this, we need the following equations:

(3.6)
$T'_{\alpha\beta}V'_\beta = U'_\alpha$,   given equation in rotated system;
$U'_\alpha = a_{\alpha i}U_i$,   $U$ is a vector;
$U_i = T_{ij}V_j$,   given equation;
$V_j = a_{\beta j}V'_\beta$,   $V$ is a vector; see equation (2.13).

Now, putting this all together, we have

(3.7)   $T'_{\alpha\beta}V'_\beta = U'_\alpha = a_{\alpha i}U_i = a_{\alpha i}T_{ij}V_j = a_{\alpha i}T_{ij}a_{\beta j}V'_\beta.$

Factoring out $V'_\beta$ from the first and last steps, we have

(3.8)   $(T'_{\alpha\beta} - a_{\alpha i}a_{\beta j}T_{ij})V'_\beta = 0$ for all vectors $V'$.

Since $V'$ is arbitrary, the parenthesis in (3.8) is equal to zero (Problem 8). Thus we have

(3.9)   $T'_{\alpha\beta} = a_{\alpha i}a_{\beta j}T_{ij}.$

Now (3.9) is the transformation equation for a 2nd-rank tensor [compare (2.14)], so, as claimed, the quantities $T_{ij}$ are the components of a 2nd-rank tensor.

The quotient rule is useful in determining whether some given quantities are the components of a tensor. [As an example of this, see (4.1).] Suppose X is a set of $3^n$ components (the right number for a tensor of rank $n$ in 3 dimensions). The quotient rule says that if the product of X and an arbitrary tensor is a non-zero tensor, then X is a tensor. The product may be either a direct product or a direct product combined with one or more contractions. We have proved the quotient rule for one case, but the proof of any case follows this same pattern. Given XA = B, where A is an arbitrary tensor and B is a non-zero tensor, we use the transformation equations for A and B, and the fact that A is arbitrary, to find the transformation equations for X (see Problems 9 to 12).
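
A numerical sketch of the chain (3.6) to (3.9), assuming an illustrative rotation about the z axis and a random sample tensor: defining $T'_{\alpha\beta} = a_{\alpha i}a_{\beta j}T_{ij}$ as in (3.9) makes $U'_\alpha = T'_{\alpha\beta}V'_\beta$ hold for every vector $V$, which is exactly the consistency the quotient rule argument turns around to deduce (3.9).

```python
# Check of the chain of equalities (3.6)-(3.9): with T' defined by the
# 2nd-rank transformation law, U' = T' V' agrees with transforming U = T V.
import numpy as np

theta = 0.3
a = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])   # rotation matrix a_ij

rng = np.random.default_rng(4)
T = rng.standard_normal((3, 3))
V = rng.standard_normal(3)

T_prime = np.einsum('ai,bj,ij->ab', a, a, T)   # 2nd-rank transformation (3.9)
V_prime = a @ V                                # vector transformation of V
U_prime = a @ (T @ V)                          # vector transformation of U_i = T_ij V_j

print(np.allclose(T_prime @ V_prime, U_prime))  # True for every V
```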


PROBLEMS, SECTION 3

1. Write equations (2.11), (2.12), (2.13), (2.14), (2.16), (2.17), and (2.18) using summation convention.

2. Show that the fourth expression in (3.1) is equal to $\partial u/\partial x_i'$. By equations (2.6) and (2.10), show that $\partial x_j/\partial x_i' = a_{ij}$, so
$\dfrac{\partial u}{\partial x_i'} = a_{ij}\dfrac{\partial u}{\partial x_j}.$
Compare this with equation (2.12) to show that $\nabla u$ is a Cartesian vector. Hint: Watch the summation indices carefully and, if it helps, put back the summation signs or write the sums out in detail as in (3.1) until you get used to summation convention.

3. As we did in (3.3), show that the contracted tensor $T_{iij}$ is a first-rank tensor, that is, a vector.

4. Show that the contracted tensor $T_{ijk}V_k$ is a 2nd-rank tensor.

5. Show that $T_{ijklm}S_{lm}$ is a tensor and find its rank (assuming that T and S are tensors of the rank indicated by the indices).

6. Show that the sum of two 3rd-rank tensors is a 3rd-rank tensor. Hint: Write the transformation law for each tensor and then add your two equations. Divide out the $a$ factors to leave the result $T'_{\alpha\beta\gamma} + S'_{\alpha\beta\gamma} = a_{\alpha i}a_{\beta j}a_{\gamma k}(T_{ijk} + S_{ijk})$ using summation convention.

7. As in Problem 6, show that the sum of two 2nd-rank tensors is a 2nd-rank tensor; that the sum of two 4th-rank tensors is a 4th-rank tensor.

8. Show that (3.9) follows from (3.8). Hint: Give a proof by contradiction. Let $S_{\alpha\beta}$ be the parenthesis in (3.8); you may find it useful to think of the components written as a matrix. You want to prove that all 9 components of $S_{\alpha\beta}$ are zero. Suppose it is claimed that $S_{12}$ is not zero. Since $V'_\beta$ is an arbitrary vector, take it to be the vector (0, 1, 0), and observe that $S_{\alpha\beta}V'_\beta$ is then not zero, in contradiction to (3.8). Similarly show that all components of $S_{\alpha\beta}$ are zero, as (3.9) claims.

Prove the quotient rule in each of the following problems, that is, given XA = B where A is an arbitrary tensor and B is a non-zero tensor, show that X is a tensor. Hints: Follow the general method in (3.6) to (3.9). See the last sentence of the section.

9. $X_iA_{ij} = B_j$

10. $X_iA_j = B_{ij}$

11. $X_{ij}A_k = B_{ijk}$

12. $X_{ijkl}A_{kl} = B_{ij}$

13. Show that the first parenthesis in (3.5) is a symmetric tensor and the second parenthesis is antisymmetric.