**§ 4 Tensor Calculus**

1. The Concept of a Tensor

[ General definition of tensor ]  Suppose a quantity has $n^N$ components, each of which is a function of the coordinates

$$x^i \qquad (i = 1, \cdots, n)$$

and suppose that under the coordinate transformation $\bar x^i = \bar x^i(x^1, \cdots, x^n)$ the components change according to the rule

$$\bar T^{\,i_1 \cdots i_l}_{\,j_1 \cdots j_m} = \frac{\partial \bar x^{i_1}}{\partial x^{p_1}} \cdots \frac{\partial \bar x^{i_l}}{\partial x^{p_l}}\, \frac{\partial x^{q_1}}{\partial \bar x^{j_1}} \cdots \frac{\partial x^{q_m}}{\partial \bar x^{j_m}}\, T^{\,p_1 \cdots p_l}_{\,q_1 \cdots q_m}$$

where each $T^{\,p_1 \cdots p_l}_{\,q_1 \cdots q_m}$ is a function of the $x^i$. Then this quantity (which has $n^{l+m}$ components, $N = l + m$) is called a tensor of order $N$, contravariant of order $l$ and covariant of order $m$.

The concept of a tensor generalizes the concepts of vector and matrix: a scalar is a zero-order tensor, a vector is a first-order tensor, a (square) matrix is a second-order tensor, and a third-order tensor, for example, resembles a "solid matrix" (Figure 8.18, right). Tensors of higher order cannot be pictured graphically. The figure shows a schematic diagram of tensors for $n = 2$.

[ Examples of tensors ]

1° Multiplicative tensors.  If **a** and **b** are two vectors with contravariant components $a^i$, $b^j$ and covariant components $a_i$, $b_j$, then the $n^2$ products

$$a^i b^j, \qquad a_i b_j, \qquad a^i b_j$$

are all second-order tensors, called multiplicative tensors.
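As a small numeric illustration (not part of the original text): under a linear change of coordinates with Jacobian $J$, contravariant components transform as $\bar a^i = J^i_{\ j} a^j$, and the $n^2$ products $a^i b^j$ then transform exactly as a second-order contravariant tensor, $\bar T = J\,T\,J^{\mathsf T}$. The matrix `J` below is a hypothetical generic Jacobian.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
a, b = rng.normal(size=n), rng.normal(size=n)
J = rng.normal(size=(n, n))           # a generic (invertible) Jacobian

T = np.outer(a, b)                    # product tensor a^i b^j
T_new = np.outer(J @ a, J @ b)        # built from the transformed vectors

# same result as applying the second-order tensor transformation law
assert np.allclose(T_new, J @ T @ J.T)
```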

2° The Kronecker symbol.  The Kronecker symbol $\delta^i_j$ is a second-order mixed tensor, contravariant of order one and covariant of order one: since

$$\delta^i_j = \frac{\partial x^i}{\partial x^j}$$

we have

$$\bar\delta^i_j = \frac{\partial \bar x^i}{\partial x^k}\,\frac{\partial x^l}{\partial \bar x^j}\,\delta^k_l$$
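A quick numeric check (an illustration added here, not from the text): for a linear change of coordinates with Jacobian $J$, the transformation law of a mixed tensor reads $J\,\delta\,J^{-1}$, and the Kronecker symbol comes out unchanged in every coordinate system.

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(3, 3))           # a generic (invertible) Jacobian
delta = np.eye(3)                     # Kronecker symbol as the identity matrix

delta_new = J @ delta @ np.linalg.inv(J)   # mixed-tensor transformation law
assert np.allclose(delta_new, delta)       # same components in the new system
```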

[ Second-order symmetric and antisymmetric tensors ]  Tensors satisfying

$$a_{ij} = a_{ji}, \qquad a^{ij} = a^{ji}, \qquad a^i_j = a^j_i$$

are called second-order symmetric covariant, symmetric contravariant, and symmetric mixed tensors, respectively. Tensors satisfying

$$a_{ij} = -a_{ji}, \qquad a^{ij} = -a^{ji}, \qquad a^i_j = -a^j_i$$

are called second-order antisymmetric covariant, antisymmetric contravariant, and antisymmetric mixed tensors, respectively.

The symmetry properties of the contravariant (covariant) indices of a tensor are invariant under coordinate transformations.

In three-dimensional space, a second-order antisymmetric tensor is equivalent to a vector .

2. Tensor Algebra

[ Permutation of indices ]  Permutation of indices is the simplest operation of tensor algebra; it produces new tensors from a given one. For example, by permuting its indices, the tensor $a_{ij}$ yields a new tensor $a_{ji}$, whose matrix is the transpose of the matrix of the original tensor.

[ Addition ( subtraction ) ]  The corresponding components of several tensors of the same type are added (or subtracted), giving the components of a new tensor of the same type. This operation is called tensor addition (or subtraction).

Any second-order tensor can be decomposed into the sum of a symmetric tensor and an antisymmetric tensor. For example,

$$a_{ij} = \frac{1}{2}\,(a_{ij} + a_{ji}) + \frac{1}{2}\,(a_{ij} - a_{ji})$$
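This decomposition can be sketched numerically (an illustration added here, not from the text): split an arbitrary matrix into its symmetric and antisymmetric parts and verify the three defining properties.

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.normal(size=(4, 4))          # an arbitrary second-order tensor

S = 0.5 * (T + T.T)                  # symmetric part      (T_ij + T_ji)/2
A = 0.5 * (T - T.T)                  # antisymmetric part  (T_ij - T_ji)/2

assert np.allclose(S, S.T)           # S_ij =  S_ji
assert np.allclose(A, -A.T)          # A_ij = -A_ji
assert np.allclose(S + A, T)         # the decomposition recovers T
```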

[ Multiplication of tensors ]  Multiplying the components of two tensors in every possible combination gives the components of a new tensor. Its contravariant and covariant orders are equal to the sums of the contravariant and covariant orders, respectively, of the two factors. This operation is called tensor multiplication. For example,

$$C^{\,i_1 \cdots i_l\, p_1 \cdots p_k}_{\,j_1 \cdots j_m\, q_1 \cdots q_h} = A^{\,i_1 \cdots i_l}_{\,j_1 \cdots j_m}\, B^{\,p_1 \cdots p_k}_{\,q_1 \cdots q_h}$$

is a mixed tensor of order $l + m + k + h$, contravariant of order $l + k$ and covariant of order $m + h$.

Note that tensor multiplication is not commutative.
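A brief numeric sketch of both points (added here as an illustration): the product of a second-order tensor with a vector is a third-order tensor, and reversing the order of the factors gives a component array that agrees with the first only after permuting the indices.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(2, 2))        # second-order tensor A_ij
v = rng.normal(size=2)             # first-order tensor (vector) v_k

C = np.einsum('ij,k->ijk', A, v)   # product A_ij v_k : order 2 + 1 = 3
D = np.einsum('k,ij->kij', v, A)   # product v_k A_ij in the other order

assert C.shape == (2, 2, 2)
assert not np.allclose(C, D)                  # the factor order matters
assert np.allclose(C, np.moveaxis(D, 0, 2))   # they differ only by an index permutation
```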

[ Contraction of tensors ]  In a given mixed tensor, set one contravariant index equal to one covariant index and sum over it; the result is a tensor of lower order (the contravariant and covariant orders each drop by one). This operation is called contraction of tensors. For example,

$$T^{\,i_1 \cdots i_{l-1}\, s}_{\,j_1 \cdots j_{m-1}\, s}$$

is a mixed tensor contravariant of order $l - 1$ and covariant of order $m - 1$.
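Contraction can be sketched with `numpy.einsum` (an illustration added here, not from the text): contracting the contravariant index of a mixed third-order tensor $T^i_{\,jk}$ with its first covariant index leaves a first-order (covariant) tensor.

```python
import numpy as np

rng = np.random.default_rng(3)
T = rng.normal(size=(3, 3, 3))       # mixed tensor T^i_{jk} (first axis contravariant)

v = np.einsum('iik->k', T)           # contract i with j: one index of each kind is used up
assert v.shape == (3,)               # order dropped from 3 to 1
assert np.allclose(v, sum(T[i, i, :] for i in range(3)))   # explicit summation
```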

[ Raising and lowering of indices ]  In applications, multiplication by a second-order contravariant tensor followed by contraction is often used to "raise" a covariant index of a tensor, and multiplication by a second-order covariant tensor followed by contraction to "lower" a contravariant index. This operation is called the raising and lowering of indices. For example, the indices of $T_{ijk}$ can be raised by means of a second-order contravariant tensor $a^{ij}$:

$$T^{p}_{\ jk} = a^{pi}\, T_{ijk}, \qquad T^{pq}_{\ \ k} = a^{pi}\, a^{qj}\, T_{ijk}$$
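A minimal numeric sketch (added here; in Riemannian geometry the second-order tensor used is the metric $g_{ij}$ and its inverse $g^{ij}$, which is the assumption below): raising an index with $g^{ij}$ and then lowering it again with $g_{ij}$ restores the original tensor.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
M = rng.normal(size=(n, n))
g = M @ M.T + n * np.eye(n)            # g_ij: symmetric, positive definite, invertible
g_inv = np.linalg.inv(g)               # g^ij

T = rng.normal(size=(n, n, n))         # T_ijk, all indices covariant

T_raised = np.einsum('pi,ijk->pjk', g_inv, T)    # T^p_{jk} = g^{pi} T_{ijk}
T_back = np.einsum('qp,pjk->qjk', g, T_raised)   # lowering the index again

assert np.allclose(T_back, T)          # raising then lowering is the identity
```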

[ Quotient law of tensors ]  Let $A(i_1, \cdots, i_l;\, j_1, \cdots, j_m)$ be a set of functions of the coordinates $x^i$. If for every contravariant vector $\lambda^i$, the quantities

$$A(i_1, \cdots, i_l;\, j_1, \cdots, j_m)\, \lambda^{j_m}$$

form a tensor, then $A$ itself is a tensor. This criterion for recognizing tensors is called the quotient law of tensors.

For example, suppose that for every contravariant vector $\lambda^k$ the quantities $A(i, j, k)\,\lambda^k$ form a tensor $T^i_j$, so that

$$A(i, j, k)\, \lambda^k = T^i_j$$

In the new coordinates,

$$\bar A(p, q, r)\, \bar\lambda^r = \bar T^p_q = \frac{\partial \bar x^p}{\partial x^i}\, \frac{\partial x^j}{\partial \bar x^q}\, A(i, j, k)\, \lambda^k$$

and since $\lambda^k = \dfrac{\partial x^k}{\partial \bar x^r}\, \bar\lambda^r$, it follows that

$$\left[\, \bar A(p, q, r) - \frac{\partial \bar x^p}{\partial x^i}\, \frac{\partial x^j}{\partial \bar x^q}\, \frac{\partial x^k}{\partial \bar x^r}\, A(i, j, k) \right] \bar\lambda^r = 0$$

This holds for all $\bar\lambda^r$, so the expression in brackets equals zero, and $A(i, j, k)$ is therefore a tensor.

Substituting any covariant vector for the contravariant vector yields similar results .

[ Tensor density ]  A quantity whose components change according to the rule

$$\bar T^{\,i \cdots}_{\,j \cdots} = \left| \frac{\partial x}{\partial \bar x} \right|^{w}\, \frac{\partial \bar x^{i}}{\partial x^{p}} \cdots \frac{\partial x^{q}}{\partial \bar x^{j}} \cdots\, T^{\,p \cdots}_{\,q \cdots}$$

is called a tensor density, where $\left| \dfrac{\partial x}{\partial \bar x} \right|$ is the Jacobian determinant of the transformation and the constant $w$ is called the weight of the tensor density. An ordinary tensor is a tensor density of weight zero. Scalar densities and vector densities are defined analogously, according to the order of the tensor.

The sum of two tensor densities with the same indices and the same weight is a tensor density of the same type. When two tensor densities are multiplied, their weights add.

3. Tensor Analysis

The tensors considered above were attached to a single point; their components are functions of a point $M(x^i)$ of the space:

$$T^{\,i_1 \cdots i_l}_{\,j_1 \cdots j_m} = T^{\,i_1 \cdots i_l}_{\,j_1 \cdots j_m}(x^1, \cdots, x^n)$$

When the point $M(x^i)$ varies over some region, these quantities define a tensor field there.

For tensor fields there is a further invariant operation, absolute differentiation (also known as covariant differentiation); it is the subject of tensor analysis.

The ordinary derivatives of a scalar field are the components of a covariant vector field (the gradient field). In general, however, the ordinary derivatives of a tensor field do not form a new tensor field.

[ Affine connection space ]  Suppose that for each coordinate system $(x^i)$ of the space $R^n$ a set of $n^3$ numbers

$$\Gamma^{i}_{\,jk} \qquad (i, j, k = 1, \cdots, n)$$

is given at a point $M$, and that under a coordinate transformation they change according to the rule

$$\bar\Gamma^{i}_{\,jk} = \frac{\partial \bar x^{i}}{\partial x^{p}}\, \frac{\partial x^{q}}{\partial \bar x^{j}}\, \frac{\partial x^{r}}{\partial \bar x^{k}}\, \Gamma^{p}_{\,qr} + \frac{\partial \bar x^{i}}{\partial x^{p}}\, \frac{\partial^{2} x^{p}}{\partial \bar x^{j}\, \partial \bar x^{k}} \tag{1}$$

Then one says that a connection object (connection coefficients) is given at the point $M$; the partial derivatives take their values at the point $M$.

Suppose a field of connection objects

$$\Gamma^{i}_{\,jk} = \Gamma^{i}_{\,jk}(x^1, \cdots, x^n)$$

is given throughout the space $R^n$, and that these functions are continuously differentiable. Then $R^n$ is called an affine connection space, denoted by

$$L_n$$

[ Torsion tensor ]  The transformation law (1) consists of two terms. The first term has exactly the same form as the transformation law of a tensor; the second term does not depend on $\Gamma^{p}_{\,qr}$ and is symmetric in the two subscripts $j$ and $k$. Since the second term is in general not equal to zero, $\Gamma^{i}_{\,jk}$ is not a tensor. However, the quantities

$$S^{i}_{\,jk} = \Gamma^{i}_{\,jk} - \Gamma^{i}_{\,kj}$$

do constitute a tensor, called the torsion tensor of the affine connection space $L_n$. If the torsion tensor is equal to zero, that is,

$$\Gamma^{i}_{\,jk} = \Gamma^{i}_{\,kj}$$

then the space is called an affine connection space without torsion, denoted by $L_n^0$.
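The torsion tensor can be sketched numerically (an illustration added here, not from the text): antisymmetrize an arbitrary array of connection coefficients in its two lower indices, and check that a symmetric connection has vanishing torsion.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3
Gamma = rng.normal(size=(n, n, n))        # connection coefficients Γ^i_{jk}

S = Gamma - np.swapaxes(Gamma, 1, 2)      # torsion S^i_{jk} = Γ^i_{jk} - Γ^i_{kj}
assert np.allclose(S, -np.swapaxes(S, 1, 2))   # antisymmetric in j and k

# a symmetric connection has zero torsion:
Gamma_sym = 0.5 * (Gamma + np.swapaxes(Gamma, 1, 2))
assert np.allclose(Gamma_sym - np.swapaxes(Gamma_sym, 1, 2), 0)
```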

[ Absolute differentiation and parallel displacement of vectors ]  Suppose a contravariant vector $a^i$ is given at a point $M$ of the space $L_n$. Under a coordinate transformation,

$$\bar a^{i} = \frac{\partial \bar x^{i}}{\partial x^{j}}\, a^{j} \tag{2}$$

which is the transformation law of the vector at the point $M$. If we pass from the point $M(x^i)$ to the neighbouring point $N(x^i + \mathrm{d}x^i)$, then

$$\bar a^{i} + \mathrm{d}\bar a^{i} = \left( \frac{\partial \bar x^{i}}{\partial x^{j}} + \frac{\partial^{2} \bar x^{i}}{\partial x^{j}\, \partial x^{k}}\, \mathrm{d}x^{k} \right) (a^{j} + \mathrm{d}a^{j})$$

where $\mathrm{d}a^i$ denotes the change of the components when the vector is carried from $M$ to $N$. Keeping only the first-order terms, we get

$$\mathrm{d}\bar a^{i} = \frac{\partial \bar x^{i}}{\partial x^{j}}\, \mathrm{d}a^{j} + \frac{\partial^{2} \bar x^{i}}{\partial x^{j}\, \partial x^{k}}\, a^{j}\, \mathrm{d}x^{k} \tag{3}$$

If the second partial derivatives of the transformation are not zero at $M$, the changes $\mathrm{d}a^i$ are not the components of a vector.

If $R^n$ is an affine connection space, then from equations (1), (2) and (3) one obtains

$$\mathrm{d}\bar a^{i} + \bar\Gamma^{i}_{\,jk}\, \bar a^{j}\, \mathrm{d}\bar x^{k} = \frac{\partial \bar x^{i}}{\partial x^{p}} \left( \mathrm{d}a^{p} + \Gamma^{p}_{\,qr}\, a^{q}\, \mathrm{d}x^{r} \right)$$

This shows that

$$\mathrm{D}a^{i} = \mathrm{d}a^{i} + \Gamma^{i}_{\,jk}\, a^{j}\, \mathrm{d}x^{k}$$

is a contravariant infinitesimal vector. $\mathrm{D}a^i$ is called the absolute differential of the vector $a^i$ at the point $M$ with respect to the displacement $MN$ with components $\mathrm{d}x^i$. If the absolute differential of the vector is equal to zero, that is,

$$\mathrm{D}a^{i} = \mathrm{d}a^{i} + \Gamma^{i}_{\,jk}\, a^{j}\, \mathrm{d}x^{k} = 0$$

then the vector is said to be displaced parallelly from the point $M$ to the point $N$ with respect to the connection $\Gamma^{i}_{\,jk}$. When $\Gamma^{i}_{\,jk} \equiv 0$, parallel displacement means that the components $a^i$ remain unchanged ($\mathrm{d}a^i = 0$).

If a curve $C$

$$x^{i} = x^{i}(t)$$

is given, together with a contravariant vector $a^i(t)$ defined along it, one can form along the curve $C$ the accompanying vector

$$\frac{\mathrm{D}a^{i}}{\mathrm{d}t} = \frac{\mathrm{d}a^{i}}{\mathrm{d}t} + \Gamma^{i}_{\,jk}\, a^{j}\, \frac{\mathrm{d}x^{k}}{\mathrm{d}t}$$

called the absolute derivative of $a^i$ along the curve $C$. If this derivative is zero, that is,

$$\frac{\mathrm{d}a^{i}}{\mathrm{d}t} + \Gamma^{i}_{\,jk}\, a^{j}\, \frac{\mathrm{d}x^{k}}{\mathrm{d}t} = 0 \tag{4}$$

then the vector $a^i$ is displaced parallelly along the curve $C$.
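Equation (4) can be integrated numerically; the following sketch (added here, not from the text) uses the Euclidean plane in polar coordinates $(x^1 = r,\ x^2 = \theta)$, whose nonzero connection coefficients are the standard $\Gamma^r_{\theta\theta} = -r$, $\Gamma^\theta_{r\theta} = \Gamma^\theta_{\theta r} = 1/r$. A vector is transported along the circle $r = R$, $\theta = t$ with Euler steps; the exact answer is a vector with constant Cartesian components.

```python
import numpy as np

R = 2.0
N = 20000
h = (np.pi / 2) / N
a = np.array([1.0, 0.0])      # (a^r, a^θ) at θ = 0, i.e. the Cartesian vector (1, 0)
for _ in range(N):
    da_r = R * a[1]           # -Γ^r_{θθ} a^θ dθ/dt, with dθ/dt = 1 and r = R
    da_th = -a[0] / R         # -Γ^θ_{rθ} a^r dθ/dt
    a = a + h * np.array([da_r, da_th])

# exact components after transport to θ = π/2: a^r = cos θ, a^θ = -sin θ / R
assert np.allclose(a, [0.0, -1.0 / R], atol=1e-3)
```

The Cartesian vector never changes; only its polar components do, which is exactly what equation (4) encodes.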

Absolute differentiation and parallel displacement of covariant vectors can be treated similarly:

$$\mathrm{D}a_{i} = \mathrm{d}a_{i} - \Gamma^{j}_{\,ik}\, a_{j}\, \mathrm{d}x^{k}$$

is the absolute differential of the covariant vector $a_i$ with respect to the displacement $\mathrm{d}x^i$. The condition for parallel displacement is

$$\mathrm{d}a_{i} - \Gamma^{j}_{\,ik}\, a_{j}\, \mathrm{d}x^{k} = 0$$

and the condition for parallel displacement along the curve $C$ is

$$\frac{\mathrm{d}a_{i}}{\mathrm{d}t} - \Gamma^{j}_{\,ik}\, a_{j}\, \frac{\mathrm{d}x^{k}}{\mathrm{d}t} = 0$$

[ Covariant derivative ]  From the defining formulas of the absolute differentials of contravariant and covariant vectors one obtains the quantities

$$\nabla_{k} a^{i} = \frac{\partial a^{i}}{\partial x^{k}} + \Gamma^{i}_{\,jk}\, a^{j} \qquad \text{and} \qquad \nabla_{k} a_{i} = \frac{\partial a_{i}}{\partial x^{k}} - \Gamma^{j}_{\,ik}\, a_{j}$$

These are second-order tensors, covariant with respect to the index $k$; they are called the covariant derivatives of the vectors $a^i$ and $a_i$, and are denoted by $\nabla_k a^i$ and $\nabla_k a_i$, respectively.
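A symbolic sketch of the covariant derivative (added here as an illustration): in the Euclidean plane in polar coordinates, the vector field whose Cartesian components are the constant $(1, 0)$ has $\nabla_k a^i = 0$ identically, even though its polar components are not constant. The connection coefficients used are the standard polar-coordinate values.

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
# nonzero connection coefficients of the Euclidean plane in polar coordinates
Gamma = [[[0, 0], [0, -r]],          # Γ^r_{jk}
         [[0, 1/r], [1/r, 0]]]       # Γ^θ_{jk}

# the vector field whose Cartesian components are the constant (1, 0)
a = [sp.cos(th), -sp.sin(th)/r]

# covariant derivative  ∇_k a^i = ∂a^i/∂x^k + Γ^i_{jk} a^j
nabla = [[sp.simplify(sp.diff(a[i], coords[k])
                      + sum(Gamma[i][j][k]*a[j] for j in range(2)))
          for k in range(2)] for i in range(2)]

assert all(e == 0 for row in nabla for e in row)   # vanishes identically
```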

[ Absolute differentiation and parallel displacement of tensors; covariant differentiation ]

From the differentiation formula for products and the definition of a tensor, the law of parallel displacement of a tensor can be deduced.

For example, the law of parallel displacement of a third-order tensor $T^{\,i}_{\,jk}$ is

$$\mathrm{d}T^{\,i}_{\,jk} = -\Gamma^{i}_{\,pq}\, T^{\,p}_{\,jk}\, \mathrm{d}x^{q} + \Gamma^{p}_{\,jq}\, T^{\,i}_{\,pk}\, \mathrm{d}x^{q} + \Gamma^{p}_{\,kq}\, T^{\,i}_{\,jp}\, \mathrm{d}x^{q}$$

and the law of parallel displacement of a fourth-order tensor $T^{\,ij}_{\,kl}$ is

$$\mathrm{d}T^{\,ij}_{\,kl} = -\Gamma^{i}_{\,pq}\, T^{\,pj}_{\,kl}\, \mathrm{d}x^{q} - \Gamma^{j}_{\,pq}\, T^{\,ip}_{\,kl}\, \mathrm{d}x^{q} + \Gamma^{p}_{\,kq}\, T^{\,ij}_{\,pl}\, \mathrm{d}x^{q} + \Gamma^{p}_{\,lq}\, T^{\,ij}_{\,kp}\, \mathrm{d}x^{q}$$

Evidently the number of terms in the law of parallel displacement of a tensor equals the order of the tensor: each contravariant index contributes a term analogous to the parallel-displacement law of a contravariant vector, and each covariant index a term analogous to that of a covariant vector.

The quantity

$$\mathrm{D}T^{\,i}_{\,jk} = \mathrm{d}T^{\,i}_{\,jk} + \Gamma^{i}_{\,pq}\, T^{\,p}_{\,jk}\, \mathrm{d}x^{q} - \Gamma^{p}_{\,jq}\, T^{\,i}_{\,pk}\, \mathrm{d}x^{q} - \Gamma^{p}_{\,kq}\, T^{\,i}_{\,jp}\, \mathrm{d}x^{q}$$

is called the absolute differential of the tensor $T^{\,i}_{\,jk}$.

[ Covariant derivatives of tensors and their rules ]

$$\nabla_{q} T^{\,ij}_{\,kl} = \frac{\partial T^{\,ij}_{\,kl}}{\partial x^{q}} + \Gamma^{i}_{\,pq}\, T^{\,pj}_{\,kl} + \Gamma^{j}_{\,pq}\, T^{\,ip}_{\,kl} - \Gamma^{p}_{\,kq}\, T^{\,ij}_{\,pl} - \Gamma^{p}_{\,lq}\, T^{\,ij}_{\,kp}$$

is called the covariant derivative of the tensor $T^{\,ij}_{\,kl}$; it is a component of a fifth-order tensor.

The covariant derivative of any tensor is formed by adding to its ordinary derivative one term for each index of the differentiated tensor. For a contravariant index the term has the form

$$+\,\Gamma^{i}_{\,pq}\, T^{\,\cdots p \cdots}_{\ \cdots}$$

and for a covariant index

$$-\,\Gamma^{p}_{\,jq}\, T^{\,\cdots}_{\ \cdots p \cdots}$$

The rules for covariant differentiation are as follows:

1° The covariant derivative of the sum of several tensors of the same structure equals the sum of the covariant derivatives of the tensors, that is,

$$\nabla_{k}\,(A^{\,i}_{\,j} + B^{\,i}_{\,j}) = \nabla_{k} A^{\,i}_{\,j} + \nabla_{k} B^{\,i}_{\,j}$$

2° Covariant differentiation satisfies the product rule, that is,

$$\nabla_{k}\,(A^{\,i}_{\,j}\, B^{\,l}_{\,m}) = (\nabla_{k} A^{\,i}_{\,j})\, B^{\,l}_{\,m} + A^{\,i}_{\,j}\, (\nabla_{k} B^{\,l}_{\,m})$$

[ Self-parallel curve ]  In an affine connection space, if every vector tangent to a curve at a point $M_0$ remains tangent to the curve when displaced parallelly along it, the curve is called a self-parallel curve.

Let the equation of the curve be $x^i = x^i(t)$. Then its tangent vector $\dfrac{\mathrm{d}x^i}{\mathrm{d}t}$ satisfies, with a suitable choice of the parameter $t$,

$$\frac{\mathrm{d}^{2} x^{i}}{\mathrm{d}t^{2}} + \Gamma^{i}_{\,jk}\, \frac{\mathrm{d}x^{j}}{\mathrm{d}t}\, \frac{\mathrm{d}x^{k}}{\mathrm{d}t} = 0$$

This is the differential equation of the self-parallel curves of the connection $\Gamma^{i}_{\,jk}$. Setting

$$\Gamma^{i}_{\,(jk)} = \frac{1}{2} \left( \Gamma^{i}_{\,jk} + \Gamma^{i}_{\,kj} \right)$$

the differential equation above can be written as

$$\frac{\mathrm{d}^{2} x^{i}}{\mathrm{d}t^{2}} + \Gamma^{i}_{\,(jk)}\, \frac{\mathrm{d}x^{j}}{\mathrm{d}t}\, \frac{\mathrm{d}x^{k}}{\mathrm{d}t} = 0$$

The coefficients $\Gamma^{i}_{\,(jk)}$ are obviously symmetric in $j$ and $k$, and they constitute an affine connection, said to be the symmetric affine connection associated with $\Gamma^{i}_{\,jk}$. If $\Gamma^{i}_{\,jk}$ is itself symmetric in $j$ and $k$, it coincides with $\Gamma^{i}_{\,(jk)}$.
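The differential equation of self-parallel curves can also be integrated numerically. The following sketch (added here, not from the text) again uses the Euclidean plane in polar coordinates, where the connection is symmetric ($\Gamma^r_{\theta\theta} = -r$, $\Gamma^\theta_{r\theta} = \Gamma^\theta_{\theta r} = 1/r$) and the self-parallel curves are the ordinary straight lines; the line $x = 1$, traversed as $(1, t)$, has the exact polar form $r = \sqrt{1 + t^2}$, $\theta = \arctan t$.

```python
import numpy as np

# Euler integration of  d²x^i/dt² + Γ^i_{jk} (dx^j/dt)(dx^k/dt) = 0
N = 20000
h = 1.0 / N
r, th = 1.0, 0.0          # start at (r, θ) = (1, 0)
vr, vth = 0.0, 1.0        # initial velocity: the Cartesian vector (0, 1)
for _ in range(N):
    ar = r * vth * vth            # -Γ^r_{θθ} θ'²  =  r θ'²
    ath = -2.0 * vr * vth / r     # -2 Γ^θ_{rθ} r' θ'
    r, th = r + h * vr, th + h * vth
    vr, vth = vr + h * ar, vth + h * ath

# exact values at t = 1:  r = √2,  θ = π/4
assert abs(r - np.sqrt(2.0)) < 1e-3
assert abs(th - np.pi / 4) < 1e-3
```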
