
1. Divergence of a product: Given that 𝜑 is a scalar field and 𝐯 a vector field, show that
div(𝜑𝐯) = (grad 𝜑) ⋅ 𝐯 + 𝜑 div 𝐯
grad(𝜑𝐯) = (𝜑𝑣 𝑖 ),𝑗 𝐠 𝑖 ⊗ 𝐠𝑗
= 𝜑,𝑗 𝑣 𝑖 𝐠 𝑖 ⊗ 𝐠𝑗 + 𝜑𝑣 𝑖 ,𝑗 𝐠 𝑖 ⊗ 𝐠𝑗
= 𝐯 ⊗ (grad 𝜑) + 𝜑 grad 𝐯
Now, div(𝜑𝐯) = tr(grad(𝜑𝐯)). Taking the trace of the above, we have:
div(𝜑𝐯) = 𝐯 ⋅ (grad 𝜑) + 𝜑 div 𝐯
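
Because covariant derivatives reduce to partial derivatives in Cartesian coordinates, identities like this one are easy to spot-check symbolically. The sketch below assumes sympy is available; the particular fields 𝜑 and 𝐯 are arbitrary illustrative choices.

import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

# Arbitrary smooth illustrative fields (any choice works)
phi = x*y + sp.sin(z)
v = [x**2, y*z, sp.exp(x)]

lhs = sum(sp.diff(phi*v[i], X[i]) for i in range(3))            # div(phi v)
rhs = sum(sp.diff(phi, X[i])*v[i] for i in range(3)) \
      + phi*sum(sp.diff(v[i], X[i]) for i in range(3))          # (grad phi).v + phi div v

print(sp.simplify(lhs - rhs))  # prints 0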

2. Show that grad(𝐮 · 𝐯) = (grad 𝐮)T 𝐯 + (grad 𝐯)T 𝐮


𝐮 · 𝐯 = 𝑢𝑖 𝑣𝑖 is a scalar sum of components.
grad(𝐮 · 𝐯) = (𝑢𝑖 𝑣𝑖 ),𝑗 𝐠𝑗
= 𝑢𝑖 ,𝑗 𝑣𝑖 𝐠𝑗 + 𝑢𝑖 𝑣𝑖 ,𝑗 𝐠𝑗
Now grad 𝐮 = 𝑢𝑖 ,𝑗 𝐠 𝑖 ⊗ 𝐠𝑗 . Swapping the bases, we have that,
(grad 𝐮)T = 𝑢𝑖 ,𝑗 (𝐠𝑗 ⊗ 𝐠 𝑖 ).
Writing 𝐯 = 𝑣𝑘 𝐠 𝑘 , we have that, (grad 𝐮)T 𝐯 = 𝑢𝑖 ,𝑗 𝑣𝑘 (𝐠𝑗 ⊗ 𝐠 𝑖 )𝐠 𝑘 =
𝑢𝑖 ,𝑗 𝑣𝑘 𝐠𝑗 𝛿𝑖𝑘 = 𝑢𝑖 ,𝑗 𝑣𝑖 𝐠𝑗
It is easy to similarly show that 𝑢𝑖 𝑣𝑖 ,𝑗 𝐠𝑗 = (grad 𝐯)T 𝐮. Clearly,
grad(𝐮 · 𝐯) = (𝑢𝑖 𝑣𝑖 ),𝑗 𝐠𝑗 = 𝑢𝑖 ,𝑗 𝑣𝑖 𝐠𝑗 + 𝑢𝑖 𝑣𝑖 ,𝑗 𝐠𝑗
= (grad 𝐮)T 𝐯 + (grad 𝐯)T 𝐮
As required.

3. Show that grad(𝐮 × 𝐯) = (𝐮 ×)grad 𝐯 − (𝐯 ×)grad 𝐮


𝐮 × 𝐯 = 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 𝐠 𝑖
Recall that the gradient of this vector is the tensor,
grad(𝐮 × 𝐯) = (𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ),𝑙 𝐠 𝑖 ⊗ 𝐠 𝑙
= 𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑙 𝑣𝑘 𝐠 𝑖 ⊗ 𝐠 𝑙 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑙 𝐠 𝑖 ⊗ 𝐠 𝑙
= −𝜖 𝑖𝑘𝑗 𝑢𝑗 ,𝑙 𝑣𝑘 𝐠 𝑖 ⊗ 𝐠 𝑙 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑙 𝐠 𝑖 ⊗ 𝐠 𝑙
= − (𝐯 ×)grad 𝐮 + (𝐮 ×)grad 𝐯

4. Show that div (𝐮 × 𝐯) = 𝐯 ⋅ curl 𝐮 − 𝐮 ⋅ curl 𝐯


We already have the expression for grad(𝐮 × 𝐯) above; remember that
div (𝐮 × 𝐯) = tr[grad(𝐮 × 𝐯)]
= −𝜖 𝑖𝑘𝑗 𝑢𝑗 ,𝑙 𝑣𝑘 𝐠 𝑖 ⋅ 𝐠 𝑙 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑙 𝐠 𝑖 ⋅ 𝐠 𝑙
= −𝜖 𝑖𝑘𝑗 𝑢𝑗 ,𝑙 𝑣𝑘 𝛿𝑖𝑙 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑙 𝛿𝑖𝑙
= −𝜖 𝑖𝑘𝑗 𝑢𝑗 ,𝑖 𝑣𝑘 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑖 = 𝐯 ⋅ curl 𝐮 − 𝐮 ⋅ curl 𝐯
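
The same kind of Cartesian spot-check works here, assuming sympy; the fields 𝐮 and 𝐯 below are arbitrary.

import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

u = sp.Matrix([y*z, x**2, sp.sin(x*y)])
v = sp.Matrix([sp.exp(y), x*z, y**2])

def curl(w):
    return sp.Matrix([sp.diff(w[2], y) - sp.diff(w[1], z),
                      sp.diff(w[0], z) - sp.diff(w[2], x),
                      sp.diff(w[1], x) - sp.diff(w[0], y)])

def div(w):
    return sum(sp.diff(w[i], X[i]) for i in range(3))

lhs = div(u.cross(v))
rhs = v.dot(curl(u)) - u.dot(curl(v))
print(sp.simplify(lhs - rhs))  # 0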

5. Given a scalar point function 𝜙 and a vector field 𝐯, show that curl (𝜙𝐯) =
𝜙 curl 𝐯 + (grad 𝜙) × 𝐯.
curl (𝜙𝐯) = 𝜖 𝑖𝑗𝑘 (𝜙𝑣𝑘 ),𝑗 𝐠 𝑖
= 𝜖 𝑖𝑗𝑘 (𝜙,𝑗 𝑣𝑘 + 𝜙𝑣𝑘 ,𝑗 )𝐠 𝑖
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝑣𝑘 𝐠 𝑖 + 𝜖 𝑖𝑗𝑘 𝜙𝑣𝑘 ,𝑗 𝐠 𝑖
= (∇𝜙) × 𝐯 + 𝜙 curl 𝐯

6. Show that div (𝐮 ⊗ 𝐯) = (div 𝐯)𝐮 + (grad 𝐮)𝐯


𝐮 ⊗ 𝐯 is the tensor, 𝑢𝑖 𝑣 𝑗 𝐠 𝑖 ⊗ 𝐠 𝑗 . The gradient of this is the third order tensor,
grad (𝐮 ⊗ 𝐯) = (𝑢𝑖 𝑣 𝑗 ),𝑘 𝐠 𝑖 ⊗ 𝐠 𝑗 ⊗ 𝐠 𝑘
And by divergence, we mean the contraction of the last basis vector:
div (𝐮 ⊗ 𝐯) = (𝑢𝑖 𝑣 𝑗 ),𝑘 (𝐠 𝑖 ⊗ 𝐠 𝑗 )𝐠 𝑘
= (𝑢𝑖 𝑣 𝑗 ),𝑘 𝐠 𝑖 𝛿𝑗𝑘 = (𝑢𝑖 𝑣 𝑗 ),𝑗 𝐠 𝑖
= 𝑢 𝑖 ,𝑗 𝑣 𝑗 𝐠 𝑖 + 𝑢 𝑖 𝑣 𝑗 ,𝑗 𝐠 𝑖
= (grad 𝐮)𝐯 + (div 𝐯)𝐮
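
A quick Cartesian check of this identity, with the divergence of the dyad taken over its last index as above (sympy assumed; the fields are arbitrary):

import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

u = sp.Matrix([x*y, z**2, sp.cos(y)])
v = sp.Matrix([y + z, x*z, x**2])

T = u * v.T                                              # the dyad u (x) v
div_T = sp.Matrix([sum(sp.diff(T[i, j], X[j]) for j in range(3)) for i in range(3)])
grad_u = sp.Matrix(3, 3, lambda i, j: sp.diff(u[i], X[j]))
rhs = sum(sp.diff(v[j], X[j]) for j in range(3)) * u + grad_u * v

print((div_T - rhs).applyfunc(sp.simplify))  # zero vector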

7. For a scalar field 𝜙 and a tensor field 𝐓 show that grad (𝜙𝐓) = 𝜙grad 𝐓 + 𝐓 ⊗
grad𝜙. Also show that div (𝜙𝐓) = 𝜙 div 𝐓 + 𝐓grad𝜙
grad(𝜙𝐓) = (𝜙𝑇 𝑖𝑗 ),𝑘 𝐠 𝑖 ⊗ 𝐠 𝑗 ⊗ 𝐠 𝑘
= (𝜙,𝑘 𝑇 𝑖𝑗 + 𝜙𝑇 𝑖𝑗 ,𝑘 )𝐠 𝑖 ⊗ 𝐠 𝑗 ⊗ 𝐠 𝑘
= 𝐓 ⊗ grad𝜙 + 𝜙grad 𝐓
Furthermore, we can contract the last two bases and obtain,
div(𝜙𝐓) = (𝜙,𝑘 𝑇 𝑖𝑗 + 𝜙𝑇 𝑖𝑗 ,𝑘 )𝐠 𝑖 ⊗ 𝐠 𝑗 ⋅ 𝐠 𝑘
= (𝜙,𝑘 𝑇 𝑖𝑗 + 𝜙𝑇 𝑖𝑗 ,𝑘 )𝐠 𝑖 𝛿𝑗𝑘
= 𝑇 𝑖𝑘 𝜙,𝑘 𝐠 𝑖 + 𝜙𝑇 𝑖𝑘 ,𝑘 𝐠 𝑖
= 𝐓grad𝜙 + 𝜙 div 𝐓

8. For two arbitrary vectors, 𝐮 and 𝐯, show that grad(𝐮 × 𝐯) = (𝐮 ×)grad 𝐯 − (𝐯 ×)grad 𝐮
grad(𝐮 × 𝐯) = (𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ),𝑙 𝐠 𝑖 ⊗ 𝐠 𝑙
= (𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑙 𝑣𝑘 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑙 )𝐠 𝑖 ⊗ 𝐠 𝑙
= (𝑢𝑗 ,𝑙 𝜖 𝑖𝑗𝑘 𝑣𝑘 + 𝑣𝑘 ,𝑙 𝜖 𝑖𝑗𝑘 𝑢𝑗 )𝐠 𝑖 ⊗ 𝐠 𝑙
= −(𝐯 ×)grad𝐮 + (𝐮 ×)grad𝐯

9. For a vector field 𝐮, show that grad(𝐮 ×) is a third ranked tensor. Hence or
otherwise show that div(𝐮 ×) = −curl 𝐮.
The second–order tensor (𝐮 ×) is defined as 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝐠 𝑖 ⊗ 𝐠 𝑘 . Taking the covariant
derivative with an independent base, we have
grad(𝐮 ×) = 𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑙 𝐠 𝑖 ⊗ 𝐠 𝑘 ⊗ 𝐠 𝑙
This gives a third order tensor as we have seen. Contracting on the last two bases,
div(𝐮 ×) = 𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑙 𝐠 𝑖 ⊗ 𝐠 𝑘 ⋅ 𝐠 𝑙
= 𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑙 𝐠 𝑖 𝛿𝑘𝑙
= 𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑘 𝐠 𝑖
= −curl 𝐮

10. Show that div (𝜙𝟏) = grad 𝜙


Note that 𝜙𝟏 = (𝜙𝑔𝛼𝛽 )𝐠 𝛼 ⊗ 𝐠 𝛽 . Also note that
grad 𝜙𝟏 = (𝜙𝑔𝛼𝛽 ),𝑖 𝐠 𝛼 ⊗ 𝐠 𝛽 ⊗ 𝐠 𝑖
The divergence of this third order tensor is the contraction of the last two bases:
div (𝜙𝟏) = tr(grad 𝜙𝟏) = (𝜙𝑔𝛼𝛽 ),𝑖 (𝐠 𝛼 ⊗ 𝐠 𝛽 )𝐠 𝑖 = (𝜙𝑔𝛼𝛽 ),𝑖 𝐠 𝛼 𝑔𝛽𝑖
= 𝜙,𝑖 𝑔𝛼𝛽 𝑔𝛽𝑖 𝐠 𝛼
= 𝜙,𝑖 𝛿𝛼𝑖 𝐠 𝛼 = 𝜙,𝑖 𝐠 𝑖 = grad 𝜙

11. Show that curl (𝜙𝟏) = ( grad 𝜙) ×


Note that 𝜙𝟏 = (𝜙𝑔𝛼𝛽 )𝐠 𝛼 ⊗ 𝐠 𝛽 , and that curl 𝑻 = 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 so that,
curl (𝜙𝟏) = 𝜖 𝑖𝑗𝑘 (𝜙𝑔𝛼𝑘 ),𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= 𝜖 𝑖𝑗𝑘 (𝜙,𝑗 𝑔𝛼𝑘 )𝐠 𝑖 ⊗ 𝐠 𝛼 = 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝐠 𝑖 ⊗ 𝐠 𝑘
= ( grad 𝜙) ×

12. Show that curl (𝐯 ×) = (div 𝐯)𝟏 − grad 𝐯


(𝐯 ×) = 𝜖 𝛼𝛽𝑘 𝑣𝛽 𝐠 𝛼 ⊗ 𝐠 𝑘
curl 𝑻 = 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
so that
curl (𝐯 ×) = 𝜖 𝑖𝑗𝑘 𝜖 𝛼𝛽𝑘 𝑣𝛽 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= (𝑔𝑖𝛼 𝑔 𝑗𝛽 − 𝑔𝑖𝛽 𝑔 𝑗𝛼 ) 𝑣𝛽 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= 𝑣 𝑗 ,𝑗 𝐠 𝛼 ⊗ 𝐠 𝛼 − 𝑣 𝑖 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝑗
= (div 𝐯)𝟏 − grad 𝐯
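
A Cartesian spot-check, using the component forms given above for (𝐯 ×) and for the curl of a second-order tensor (sympy assumed; the field 𝐯 is an arbitrary choice):

import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

def eps(i, j, k):                       # Levi-Civita symbol
    return (i - j) * (j - k) * (k - i) // 2

v = [x*y*z, sp.sin(x) + z, y**2]

# (v x) as a matrix: (v x)_{ak} = eps_{abk} v_b, so that (v x) w = v x w
Vx = sp.Matrix(3, 3, lambda a, k: sum(eps(a, b, k)*v[b] for b in range(3)))

# curl of a tensor, per the definition used here: (curl T)_{ia} = eps_{ijk} T_{ak,j}
def curl_tensor(T):
    return sp.Matrix(3, 3, lambda i, a: sum(eps(i, j, k)*sp.diff(T[a, k], X[j])
                                            for j in range(3) for k in range(3)))

div_v = sum(sp.diff(v[i], X[i]) for i in range(3))
grad_v = sp.Matrix(3, 3, lambda i, j: sp.diff(v[i], X[j]))

print((curl_tensor(Vx) - (div_v*sp.eye(3) - grad_v)).applyfunc(sp.simplify))  # zero matrix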

13. Show that div (𝐮 × 𝐯) = 𝐯 ⋅ curl 𝐮 − 𝐮 ⋅ curl 𝐯


div (𝐮 × 𝐯) = (𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ),𝑖
Noting that the tensor 𝜖 𝑖𝑗𝑘 behaves as a constant under a covariant
differentiation, we can write,
div (𝐮 × 𝐯) = (𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ),𝑖
= 𝜖 𝑖𝑗𝑘 𝑢𝑗 ,𝑖 𝑣𝑘 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ,𝑖
= 𝐯 ⋅ curl 𝐮 − 𝐮 ⋅ curl 𝐯

14. Given a scalar point function 𝜙 and a vector field 𝐯, show that curl (𝜙𝐯) =
𝜙 curl 𝐯 + (∇𝜙) × 𝐯.
curl (𝜙𝐯) = 𝜖 𝑖𝑗𝑘 (𝜙𝑣𝑘 ),𝑗 𝐠 𝑖
= 𝜖 𝑖𝑗𝑘 (𝜙,𝑗 𝑣𝑘 + 𝜙𝑣𝑘 ,𝑗 )𝐠 𝑖
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝑣𝑘 𝐠 𝑖 + 𝜖 𝑖𝑗𝑘 𝜙𝑣𝑘 ,𝑗 𝐠 𝑖
= (∇𝜙) × 𝐯 + 𝜙 curl 𝐯

15. Show that curl (grad 𝜙) = 𝐨


For any vector 𝐯 = 𝑣𝛼 𝐠 𝛼 ,
curl 𝐯 = 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 𝐠 𝑖
Let 𝐯 = grad 𝜙. Clearly, in this case, 𝑣𝑘 = 𝜙,𝑘 so that 𝑣𝑘 ,𝑗 = 𝜙,𝑘𝑗 . It therefore
follows that,
curl (grad 𝜙) = 𝜖 𝑖𝑗𝑘 𝜙,𝑘𝑗 𝐠 𝑖 = 𝟎.
This conclusion follows from the contraction of a symmetric object (the second derivatives) with an antisymmetric one (the alternating tensor). Note that it presupposes that the order of differentiation of the scalar field is immaterial, which holds when the scalar field is twice continuously differentiable – a smoothness we have assumed in the above.

16. Show that curl (grad 𝐯) = 𝟎


For any tensor 𝐓 = T𝛼𝛽 𝐠 𝛼 ⊗ 𝐠 𝛽
curl 𝐓 = 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
Let 𝐓 = grad 𝐯. Clearly, in this case, T𝛼𝛽 = 𝑣𝛼 ,𝛽 so that 𝑇𝛼𝑘 ,𝑗 = 𝑣𝛼 ,𝑘𝑗 . It
therefore follows that,
curl (grad 𝐯) = 𝜖 𝑖𝑗𝑘 𝑣𝛼 ,𝑘𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 = 𝟎.
This conclusion follows from the contraction of a symmetric object (the second derivatives) with an antisymmetric one (the alternating tensor). Note that it presupposes that the order of differentiation of the vector field is immaterial, which holds when the vector field is twice continuously differentiable – a smoothness we have assumed in the above.

17. Show that curl (grad 𝐯)T = grad(curl 𝐯)


From previous derivation, we can see that, curl 𝐓 = 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 . Clearly,
curl 𝐓 T = 𝜖 𝑖𝑗𝑘 𝑇𝑘𝛼 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
so that curl (grad 𝐯)T = 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝛼𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 . But curl 𝐯 = 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 𝐠 𝑖 . The
gradient of this is,
grad(curl 𝐯) = (𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 ),𝛼 𝐠 𝑖 ⊗ 𝐠 𝛼 = 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗𝛼 𝐠 𝑖 ⊗ 𝐠 𝛼 = curl (grad 𝐯)T

18. Show that div (grad 𝜙 × grad θ) = 0


grad 𝜙 × grad θ = 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝜃,𝑘 𝐠 𝑖
The gradient of this vector is the tensor,
grad(grad 𝜙 × grad 𝜃) = (𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝜃,𝑘 ),𝑙 𝐠 𝑖 ⊗ 𝐠 𝑙
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗𝑙 𝜃,𝑘 𝐠 𝑖 ⊗ 𝐠 𝑙 + 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝜃,𝑘𝑙 𝐠 𝑖 ⊗ 𝐠 𝑙
The trace of the above result is the divergence we are seeking:
div (grad 𝜙 × grad θ) = tr[grad(grad 𝜙 × grad 𝜃)]
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗𝑙 𝜃,𝑘 𝐠 𝑖 ⋅ 𝐠 𝑙 + 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝜃,𝑘𝑙 𝐠 𝑖 ⋅ 𝐠 𝑙
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗𝑙 𝜃,𝑘 𝛿𝑖𝑙 + 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝜃,𝑘𝑙 𝛿𝑖𝑙
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗𝑖 𝜃,𝑘 + 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝜃,𝑘𝑖 = 0
Each term vanishes on account of the contraction of a symmetric tensor (the second derivatives) with an antisymmetric one (the alternating tensor).

19. Show that curl curl 𝐯 = grad(div 𝐯) − grad2 𝐯


Let 𝐰 = curl 𝐯 ≡ 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 𝐠 𝑖 . But curl 𝐰 ≡ 𝜖 𝛼𝛽𝛾 𝑤𝛾 ,𝛽 𝐠 𝛼 . Upon inspection, we
find that 𝑤𝛾 = 𝑔𝛾𝑖 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 so that
curl 𝐰 ≡ 𝜖 𝛼𝛽𝛾 (𝑔𝛾𝑖 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 ),𝛽 𝐠 𝛼 = 𝑔𝛾𝑖 𝜖 𝛼𝛽𝛾 𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗𝛽 𝐠 𝛼
Now, it can be shown (see below) that 𝑔𝛾𝑖 𝜖 𝛼𝛽𝛾 𝜖 𝑖𝑗𝑘 = 𝑔𝛼𝑗 𝑔𝛽𝑘 − 𝑔𝛼𝑘 𝑔𝛽𝑗 so that,
curl 𝐰 = (𝑔𝛼𝑗 𝑔𝛽𝑘 − 𝑔𝛼𝑘 𝑔𝛽𝑗 )𝑣𝑘 ,𝑗𝛽 𝐠 𝛼
= 𝑣 𝛽 ,𝑗𝛽 𝐠𝑗 − 𝑣 𝛼 ,𝑗𝑗 𝐠 𝛼 = grad(div 𝐯) − grad2 𝐯
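
In Cartesian coordinates this is the familiar identity curl curl 𝐯 = grad(div 𝐯) − ∇²𝐯, which can be spot-checked symbolically (sympy assumed; 𝐯 arbitrary):

import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

v = sp.Matrix([x**2*y, sp.sin(y*z), z*sp.exp(x)])

def curl(w):
    return sp.Matrix([sp.diff(w[2], y) - sp.diff(w[1], z),
                      sp.diff(w[0], z) - sp.diff(w[2], x),
                      sp.diff(w[1], x) - sp.diff(w[0], y)])

div_v = sum(sp.diff(v[i], X[i]) for i in range(3))
grad_div_v = sp.Matrix([sp.diff(div_v, s) for s in X])
lap_v = sp.Matrix([sum(sp.diff(v[i], s, 2) for s in X) for i in range(3)])

print((curl(curl(v)) - (grad_div_v - lap_v)).applyfunc(sp.simplify))  # zero vector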

20. Show that 𝑔𝛾𝑖 𝜖 𝛼𝛽𝛾 𝜖 𝑖𝑗𝑘 = 𝑔𝛼𝑗 𝑔𝛽𝑘 − 𝑔𝛼𝑘 𝑔𝛽𝑗
Note that
\[
g_{\gamma i}\,\epsilon^{\alpha\beta\gamma}\epsilon^{ijk}
= g_{\gamma i}
\begin{vmatrix}
g^{i\alpha} & g^{i\beta} & g^{i\gamma}\\
g^{j\alpha} & g^{j\beta} & g^{j\gamma}\\
g^{k\alpha} & g^{k\beta} & g^{k\gamma}
\end{vmatrix}
=
\begin{vmatrix}
g_{\gamma i}g^{i\alpha} & g_{\gamma i}g^{i\beta} & g_{\gamma i}g^{i\gamma}\\
g^{j\alpha} & g^{j\beta} & g^{j\gamma}\\
g^{k\alpha} & g^{k\beta} & g^{k\gamma}
\end{vmatrix}
=
\begin{vmatrix}
\delta^{\alpha}_{\gamma} & \delta^{\beta}_{\gamma} & \delta^{\gamma}_{\gamma}\\
g^{j\alpha} & g^{j\beta} & g^{j\gamma}\\
g^{k\alpha} & g^{k\beta} & g^{k\gamma}
\end{vmatrix}
\]
Expanding along the first row (the index \(\gamma\) remains summed),
\[
= \delta^{\alpha}_{\gamma}
\begin{vmatrix} g^{j\beta} & g^{j\gamma}\\ g^{k\beta} & g^{k\gamma}\end{vmatrix}
- \delta^{\beta}_{\gamma}
\begin{vmatrix} g^{j\alpha} & g^{j\gamma}\\ g^{k\alpha} & g^{k\gamma}\end{vmatrix}
+ \delta^{\gamma}_{\gamma}
\begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix}
=
\begin{vmatrix} g^{j\beta} & g^{j\alpha}\\ g^{k\beta} & g^{k\alpha}\end{vmatrix}
-
\begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix}
+ 3
\begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix}
=
\begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix}
= g^{\alpha j}g^{\beta k} - g^{\alpha k}g^{\beta j}
\]
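
In an orthonormal Cartesian frame the metric components reduce to Kronecker deltas and the identity becomes \(\epsilon_{\alpha\beta c}\epsilon_{cjk} = \delta_{\alpha j}\delta_{\beta k} - \delta_{\alpha k}\delta_{\beta j}\), which can be checked exhaustively (numpy assumed):

import numpy as np
from itertools import product

def eps(i, j, k):                     # Levi-Civita symbol
    return (i - j) * (j - k) * (k - i) // 2

d = np.eye(3)
ok = all(
    sum(eps(a, b, c) * eps(c, j, k) for c in range(3))
    == d[a, j]*d[b, k] - d[a, k]*d[b, j]
    for a, b, j, k in product(range(3), repeat=4)
)
print(ok)  # True
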
21. Given that 𝜑(𝑡) = |𝐀(𝑡)|, show that \(\dot{\varphi}(t) = \dfrac{\mathbf{A}}{|\mathbf{A}(t)|}:\dot{\mathbf{A}}\)
\(\varphi^{2} \equiv \mathbf{A}:\mathbf{A}\)
Now,
\[
\frac{d}{dt}\bigl(\varphi^{2}\bigr) = 2\varphi\frac{d\varphi}{dt}
= \frac{d\mathbf{A}}{dt}:\mathbf{A} + \mathbf{A}:\frac{d\mathbf{A}}{dt}
= 2\mathbf{A}:\frac{d\mathbf{A}}{dt}
\]
as the inner product is commutative. We can therefore write that
\[
\frac{d\varphi}{dt} = \frac{\mathbf{A}}{\varphi}:\frac{d\mathbf{A}}{dt} = \frac{\mathbf{A}}{|\mathbf{A}(t)|}:\dot{\mathbf{A}}
\]
as required.
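
A symbolic spot-check of this result on an arbitrary parameter-dependent matrix (sympy assumed):

import sympy as sp

t = sp.symbols('t')

# An arbitrary illustrative time-dependent tensor
A = sp.Matrix([[sp.sin(t), t**2, 1],
               [t, sp.exp(t), 0],
               [2, t**3, sp.cos(t)]])

Adot = A.diff(t)
norm = sp.sqrt(sum(A[i, j]**2 for i in range(3) for j in range(3)))   # |A| = sqrt(A:A)

lhs = norm.diff(t)
rhs = sum(A[i, j]*Adot[i, j] for i in range(3) for j in range(3)) / norm   # (A/|A|) : Adot

print(sp.simplify(lhs - rhs))  # 0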

22. Given a tensor field 𝐓, obtain the vector 𝐰 ≡ 𝐓 T 𝒗 and show that its divergence is
𝐓: (∇𝐯) + 𝐯 ⋅ div 𝐓
The divergence of 𝐰 is the scalar (𝑇𝑗𝑖 𝑣 𝑗 ),𝑖 . Expanding this product under the covariant derivative, we obtain,
div (𝐓 T 𝐯) = (𝑇𝑗𝑖 𝑣 𝑗 ),𝒊 = 𝑇𝑗𝑖 ,𝑖 𝑣 𝑗 + 𝑇𝑗𝑖 𝑣 𝑗 ,𝑖
= (div 𝐓) ⋅ 𝐯 + tr(𝐓 T grad 𝐯)
= (div 𝐓) ⋅ 𝐯 + 𝐓: (grad 𝐯)
Recall that scalar product of two vectors is commutative so that
div (𝐓 T 𝐯) = 𝐓: (grad 𝐯) + 𝐯 ⋅ div 𝐓
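
A Cartesian spot-check, with the divergence of 𝐓 taken over its last index as elsewhere in these solutions (sympy assumed; the fields are arbitrary):

import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

v = sp.Matrix([x*z, y**2, sp.sin(x)])
T = sp.Matrix([[x*y, z, 1], [y*z, x**2, y], [z**2, x + y, sp.exp(z)]])

w = T.T * v                                              # w = T^T v
lhs = sum(sp.diff(w[i], X[i]) for i in range(3))         # div w

grad_v = sp.Matrix(3, 3, lambda i, j: sp.diff(v[i], X[j]))
div_T = sp.Matrix([sum(sp.diff(T[i, j], X[j]) for j in range(3)) for i in range(3)])
rhs = sum(T[i, j]*grad_v[i, j] for i in range(3) for j in range(3)) + v.dot(div_T)

print(sp.simplify(lhs - rhs))  # 0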

23. For a second-order tensor 𝐓 define curl 𝐓 ≡ 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 show that for
any constant vector 𝒂, (curl 𝐓) 𝒂 = curl (𝐓 T 𝒂)
Express vector 𝒂 in the invariant form with covariant components as 𝒂 = 𝑎𝛽 𝐠 𝛽 .
It follows that
(curl 𝐓) 𝒂 = 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 (𝐠 𝑖 ⊗ 𝐠 𝛼 )𝒂
= 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝑎𝛽 (𝐠 𝑖 ⊗ 𝐠 𝛼 )𝐠 𝛽
= 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝑎𝛽 𝐠 𝑖 𝛿𝛽𝛼
= 𝜖 𝑖𝑗𝑘 (𝑇𝛼𝑘 ),𝑗 𝐠 𝑖 𝑎𝛼
= 𝜖 𝑖𝑗𝑘 (𝑇𝛼𝑘 𝑎𝛼 ),𝑗 𝐠 𝑖
The last equality resulting from the fact that vector 𝒂 is a constant vector. Clearly,
(curl 𝐓) 𝒂 = curl (𝐓 T 𝒂)
24. For any two vectors 𝐮 and 𝐯, show that curl (𝐮 ⊗ 𝐯) = [(grad 𝐮)𝐯 ×]𝑇 +
(curl 𝐯) ⊗ 𝒖 where 𝐯 × is the skew tensor 𝜖 𝑖𝑘𝑗 𝑣𝑘 𝐠 𝑖 ⊗ 𝐠 𝑗 .
Recall that the curl of a tensor 𝑻 is defined by curl 𝑻 ≡ 𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 .
Clearly therefore,
curl (𝒖 ⊗ 𝒗) = 𝜖 𝑖𝑗𝑘 (𝑢𝛼 𝑣𝑘 ),𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼 = 𝜖 𝑖𝑗𝑘 (𝑢𝛼 ,𝑗 𝑣𝑘 + 𝑢𝛼 𝑣𝑘 ,𝑗 ) 𝐠 𝑖 ⊗ 𝐠 𝛼
= 𝜖 𝑖𝑗𝑘 𝑢𝛼 ,𝑗 𝑣𝑘 𝐠 𝑖 ⊗ 𝐠 𝛼 + 𝜖 𝑖𝑗𝑘 𝑢𝛼 𝑣𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= (𝜖 𝑖𝑗𝑘 𝑣𝑘 𝐠 𝑖 ) ⊗ (𝑢𝛼 ,𝑗 𝐠 𝛼 ) + (𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 𝐠 𝑖 ) ⊗ (𝑢𝛼 𝐠 𝛼 )
= (𝜖 𝑖𝑗𝑘 𝑣𝑘 𝐠 𝑖 ⊗ 𝐠 𝑗 )(𝑢𝛼 ,𝛽 𝐠 𝛽 ⊗ 𝐠 𝛼 ) + (𝜖 𝑖𝑗𝑘 𝑣𝑘 ,𝑗 𝐠 𝑖 ) ⊗ (𝑢𝛼 𝐠 𝛼 )
= −(𝐯 ×)(grad 𝐮)𝑻 + (curl 𝐯) ⊗ 𝐮 = [(grad 𝐮)𝐯 ×]𝑻 + (curl 𝐯) ⊗ 𝐮
upon noting that the vector cross is a skew tensor.

25. Show that curl (𝐮 × 𝐯) = div(𝐮 ⊗ 𝐯 − 𝐯 ⊗ 𝐮)


The vector \(\mathbf{w} \equiv \mathbf{u}\times\mathbf{v} = w_k\,\mathbf{g}^k = \epsilon_{k\alpha\beta}u^{\alpha}v^{\beta}\,\mathbf{g}^k\), and \(\operatorname{curl}\mathbf{w} = \epsilon^{ijk}w_{k,j}\,\mathbf{g}_i\).
Therefore,
\[
\begin{aligned}
\operatorname{curl}(\mathbf{u}\times\mathbf{v})
&= \epsilon^{ijk}w_{k,j}\,\mathbf{g}_i
= \epsilon^{ijk}\epsilon_{k\alpha\beta}\,\bigl(u^{\alpha}v^{\beta}\bigr)_{,j}\,\mathbf{g}_i\\
&= \bigl(\delta^{i}_{\alpha}\delta^{j}_{\beta} - \delta^{i}_{\beta}\delta^{j}_{\alpha}\bigr)\bigl(u^{\alpha}v^{\beta}\bigr)_{,j}\,\mathbf{g}_i
= \bigl(\delta^{i}_{\alpha}\delta^{j}_{\beta} - \delta^{i}_{\beta}\delta^{j}_{\alpha}\bigr)\bigl(u^{\alpha}{}_{,j}v^{\beta} + u^{\alpha}v^{\beta}{}_{,j}\bigr)\mathbf{g}_i\\
&= \bigl[u^{i}{}_{,j}v^{j} + u^{i}v^{j}{}_{,j} - \bigl(u^{j}{}_{,j}v^{i} + u^{j}v^{i}{}_{,j}\bigr)\bigr]\mathbf{g}_i
= \bigl[\bigl(u^{i}v^{j}\bigr)_{,j} - \bigl(u^{j}v^{i}\bigr)_{,j}\bigr]\mathbf{g}_i\\
&= \operatorname{div}(\mathbf{u}\otimes\mathbf{v} - \mathbf{v}\otimes\mathbf{u})
\end{aligned}
\]
since \(\operatorname{div}(\mathbf{u}\otimes\mathbf{v}) = \bigl(u^{i}v^{j}\bigr)_{,\alpha}\,(\mathbf{g}_i\otimes\mathbf{g}_j)\mathbf{g}^{\alpha} = \bigl(u^{i}v^{j}\bigr)_{,j}\,\mathbf{g}_i\).

26. Given a scalar point function 𝜙 and a second-order tensor field 𝐓, show that
curl (𝜙𝐓) = 𝜙 curl 𝐓 + ((∇𝜙) ×)𝐓 T where [(∇𝜙) ×] is the skew tensor
𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝐠 𝑖 ⊗ 𝐠 𝑘
curl (𝜙𝑻) ≡ 𝜖 𝑖𝑗𝑘 (𝜙𝑇𝛼𝑘 ),𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= 𝜖 𝑖𝑗𝑘 (𝜙,𝑗 𝑇𝛼𝑘 + 𝜙𝑇𝛼𝑘 ,𝑗 ) 𝐠 𝑖 ⊗ 𝐠 𝛼
= 𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝑇𝛼𝑘 𝐠 𝑖 ⊗ 𝐠 𝛼 + 𝜙𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= (𝜖 𝑖𝑗𝑘 𝜙,𝑗 𝐠 𝑖 ⊗ 𝐠 𝑘 ) (𝑇𝛼𝛽 𝐠 𝛽 ⊗ 𝐠 𝛼 ) + 𝜙𝜖 𝑖𝑗𝑘 𝑇𝛼𝑘 ,𝑗 𝐠 𝑖 ⊗ 𝐠 𝛼
= 𝜙 curl 𝐓 + ((∇𝜙) ×)𝐓 T
27. For a second-order tensor field 𝑻, show that div(curl 𝐓) = curl(div 𝐓 T )
Define the second-order tensor 𝐒 by
\[
\operatorname{curl}\mathbf{T} \equiv \epsilon^{ijk}T_{\alpha k,j}\,\mathbf{g}_i\otimes\mathbf{g}^{\alpha} = S^{i}{}_{\alpha}\,\mathbf{g}_i\otimes\mathbf{g}^{\alpha}
\]
The gradient of 𝐒 is \(S^{i}{}_{\alpha,\beta}\,\mathbf{g}_i\otimes\mathbf{g}^{\alpha}\otimes\mathbf{g}^{\beta} = \epsilon^{ijk}T_{\alpha k,j\beta}\,\mathbf{g}_i\otimes\mathbf{g}^{\alpha}\otimes\mathbf{g}^{\beta}\).
Clearly,
\[
\operatorname{div}(\operatorname{curl}\mathbf{T})
= \epsilon^{ijk}T_{\alpha k,j\beta}\,\mathbf{g}_i\,\bigl(\mathbf{g}^{\alpha}\cdot\mathbf{g}^{\beta}\bigr)
= \epsilon^{ijk}T_{\alpha k,j\beta}\,g^{\alpha\beta}\,\mathbf{g}_i
= \epsilon^{ijk}T^{\beta}{}_{k,j\beta}\,\mathbf{g}_i
= \operatorname{curl}\bigl(\operatorname{div}\mathbf{T}^{\mathrm T}\bigr)
\]

28. Show that if 𝜑 is defined in the space spanned by orthogonal coordinates \(x^{i}\), then
\[
\nabla^{2}\bigl(x^{i}\varphi\bigr) = 2\frac{\partial\varphi}{\partial x^{i}} + x^{i}\nabla^{2}\varphi
\]
By definition, \(\nabla^{2}(x^{i}\varphi) = g^{jk}(x^{i}\varphi)_{,jk}\). Expanding, we have
\[
\begin{aligned}
g^{jk}\bigl(x^{i}\varphi\bigr)_{,jk}
&= g^{jk}\bigl(x^{i}{}_{,j}\varphi + x^{i}\varphi_{,j}\bigr)_{,k}
= g^{jk}\bigl(\delta^{i}_{j}\varphi + x^{i}\varphi_{,j}\bigr)_{,k}\\
&= g^{jk}\bigl(\delta^{i}_{j}\varphi_{,k} + x^{i}{}_{,k}\varphi_{,j} + x^{i}\varphi_{,jk}\bigr)
= g^{jk}\bigl(\delta^{i}_{j}\varphi_{,k} + \delta^{i}_{k}\varphi_{,j} + x^{i}\varphi_{,jk}\bigr)\\
&= g^{ik}\varphi_{,k} + g^{ij}\varphi_{,j} + x^{i}g^{jk}\varphi_{,jk}
\end{aligned}
\]
When the coordinates are orthogonal, this becomes
\[
\frac{2}{(h_i)^{2}}\frac{\partial\varphi}{\partial x^{i}} + x^{i}\nabla^{2}\varphi
\]
where the summation rule is suspended on \(i\) and \(h_i\) is the square root of the corresponding diagonal metric component.

29. In Cartesian coordinates, if the volume 𝑉 is enclosed by the surface 𝑆, the position vector \(\mathbf{r} = x^{i}\mathbf{g}_i\) and 𝒏 is the external unit normal to each surface element, show that
\[
\frac{1}{6}\int_S \nabla(\mathbf{r}\cdot\mathbf{r})\cdot\mathbf{n}\,dS
\]
equals the volume contained in 𝑉.
\[
\mathbf{r}\cdot\mathbf{r} = x^{i}x^{j}\,\mathbf{g}_i\cdot\mathbf{g}_j = x^{i}x^{j}g_{ij}
\]
By the Divergence Theorem,
\[
\begin{aligned}
\int_S \nabla(\mathbf{r}\cdot\mathbf{r})\cdot\mathbf{n}\,dS
&= \int_V \nabla\cdot\bigl[\nabla(\mathbf{r}\cdot\mathbf{r})\bigr]\,dV
= \int_V \partial_l\bigl[\partial_k\bigl(x^{i}x^{j}g_{ij}\bigr)\bigr]\,\mathbf{g}^{l}\cdot\mathbf{g}^{k}\,dV\\
&= \int_V \partial_l\bigl[g_{ij}\bigl(x^{i}{}_{,k}x^{j} + x^{i}x^{j}{}_{,k}\bigr)\bigr]\,\mathbf{g}^{l}\cdot\mathbf{g}^{k}\,dV
= \int_V g_{ij}g^{lk}\bigl(\delta^{i}_{k}x^{j} + x^{i}\delta^{j}_{k}\bigr)_{,l}\,dV\\
&= \int_V 2g_{ik}g^{lk}x^{i}{}_{,l}\,dV
= \int_V 2\delta^{l}_{i}\delta^{i}_{l}\,dV
= 6\int_V dV
\end{aligned}
\]
so that one sixth of the surface integral is the volume enclosed.

30. For any Euclidean coordinate system, show that div(𝐮 × 𝐯) = 𝐯 ⋅ curl 𝐮 − 𝐮 ⋅ curl 𝐯
Given the contravariant components \(u^{i}\) and \(v^{i}\) with their associated covariant components \(u_{i}\) and \(v_{i}\), the contravariant component of the above cross product is \(\epsilon^{ijk}u_{j}v_{k}\). The
required divergence is simply the contraction of the covariant 𝑥 𝑖 derivative of this
quantity:
(𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ),𝑖 = 𝜖 𝑖𝑗𝑘 𝑢𝑗,𝑖 𝑣𝑘 + 𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘,𝑖
where we have treated the tensor 𝜀 𝑖𝑗𝑘 as a constant under the covariant
derivative.
Cyclically rearranging the RHS we obtain,
(𝜖 𝑖𝑗𝑘 𝑢𝑗 𝑣𝑘 ),𝑖 = 𝑣𝑘 𝜖 𝑘𝑖𝑗 𝑢𝑗,𝑖 + 𝑢𝑗 𝜖 𝑗𝑘𝑖 𝑣𝑘,𝑖 = 𝑣𝑘 𝜖 𝑘𝑖𝑗 𝑢𝑗,𝑖 + 𝑢𝑗 𝜖 𝑗𝑖𝑘 𝑣𝑘,𝑖
where we have used the anti-symmetric property of the tensor 𝜖 𝑖𝑗𝑘 . The last
expression shows clearly that
div(𝐮 × 𝐯) = 𝐯 ⋅ curl 𝐮 − 𝐮 ⋅ curl 𝐯
as required.

31. For a general tensor field 𝑻 show that
\[
\operatorname{curl}(\operatorname{curl}\mathbf{T})
= \bigl[\nabla^{2}(\operatorname{tr}\mathbf{T}) - \operatorname{div}(\operatorname{div}\mathbf{T})\bigr]\mathbf{I}
+ \operatorname{grad}(\operatorname{div}\mathbf{T})
+ \bigl(\operatorname{grad}(\operatorname{div}\mathbf{T})\bigr)^{\mathrm T}
- \operatorname{grad}\bigl(\operatorname{grad}(\operatorname{tr}\mathbf{T})\bigr)
- \nabla^{2}\mathbf{T}^{\mathrm T}
\]
\[
\operatorname{curl}\mathbf{T} = \epsilon^{\alpha st}T_{\beta t,s}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}^{\beta}
= S^{\alpha}{}_{\beta}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}^{\beta},
\qquad
\operatorname{curl}\mathbf{S} = \epsilon^{ijk}S^{\alpha}{}_{k,j}\,\mathbf{g}_i\otimes\mathbf{g}_{\alpha}
\]
so that
\[
\begin{aligned}
\operatorname{curl}\mathbf{S} = \operatorname{curl}(\operatorname{curl}\mathbf{T})
&= \epsilon^{ijk}\epsilon^{\alpha st}\,T_{kt,sj}\,\mathbf{g}_i\otimes\mathbf{g}_{\alpha}
= \begin{vmatrix}
g^{i\alpha} & g^{is} & g^{it}\\
g^{j\alpha} & g^{js} & g^{jt}\\
g^{k\alpha} & g^{ks} & g^{kt}
\end{vmatrix}
T_{kt,sj}\,\mathbf{g}_i\otimes\mathbf{g}_{\alpha}\\
&= \bigl[g^{i\alpha}\bigl(g^{js}g^{kt} - g^{jt}g^{ks}\bigr)
+ g^{is}\bigl(g^{jt}g^{k\alpha} - g^{j\alpha}g^{kt}\bigr)
+ g^{it}\bigl(g^{j\alpha}g^{ks} - g^{js}g^{k\alpha}\bigr)\bigr]T_{kt,sj}\,\mathbf{g}_i\otimes\mathbf{g}_{\alpha}\\
&= \bigl[g^{js}T^{t}{}_{t,sj} - T^{sj}{}_{,sj}\bigr]\bigl(\mathbf{g}^{\alpha}\otimes\mathbf{g}_{\alpha}\bigr)
+ \bigl[T^{\alpha j}{}_{,sj} - g^{j\alpha}T^{t}{}_{t,sj}\bigr]\bigl(\mathbf{g}^{s}\otimes\mathbf{g}_{\alpha}\bigr)
+ \bigl[g^{j\alpha}T^{s}{}_{t,sj} - g^{js}T^{\alpha}{}_{t,sj}\bigr]\bigl(\mathbf{g}^{t}\otimes\mathbf{g}_{\alpha}\bigr)\\
&= \bigl[\nabla^{2}(\operatorname{tr}\mathbf{T}) - \operatorname{div}(\operatorname{div}\mathbf{T})\bigr]\mathbf{I}
+ \bigl(\operatorname{grad}(\operatorname{div}\mathbf{T})\bigr)^{\mathrm T}
- \operatorname{grad}\bigl(\operatorname{grad}(\operatorname{tr}\mathbf{T})\bigr)
+ \operatorname{grad}(\operatorname{div}\mathbf{T}) - \nabla^{2}\mathbf{T}^{\mathrm T}
\end{aligned}
\]

32. When 𝐓 is symmetric, show that tr(curl 𝐓) vanishes.


\[
\operatorname{curl}\mathbf{T} = \epsilon^{ijk}T_{\beta k,j}\,\mathbf{g}_i\otimes\mathbf{g}^{\beta}
\]
\[
\operatorname{tr}(\operatorname{curl}\mathbf{T}) = \epsilon^{ijk}T_{\beta k,j}\,\mathbf{g}_i\cdot\mathbf{g}^{\beta}
= \epsilon^{ijk}T_{\beta k,j}\,\delta^{\beta}_{i} = \epsilon^{ijk}T_{ik,j}
\]
which obviously vanishes on account of the symmetry of \(T_{ik}\) and the antisymmetry of \(\epsilon^{ijk}\) in \(i\) and \(k\). In this case,
\[
\operatorname{curl}(\operatorname{curl}\mathbf{T})
= \bigl[\nabla^{2}(\operatorname{tr}\mathbf{T}) - \operatorname{div}(\operatorname{div}\mathbf{T})\bigr]\mathbf{1}
- \operatorname{grad}\bigl(\operatorname{grad}(\operatorname{tr}\mathbf{T})\bigr)
+ 2\,\operatorname{grad}(\operatorname{div}\mathbf{T}) - \nabla^{2}\mathbf{T}
\]
as \(\bigl(\operatorname{grad}(\operatorname{div}\mathbf{T})\bigr)^{\mathrm T} = \operatorname{grad}(\operatorname{div}\mathbf{T})\) if the order of differentiation is immaterial and 𝐓 is symmetric.

33. For a scalar function Φ and a vector 𝑣 𝑖 , show that the divergence of the vector 𝑣 𝑖 Φ is equal to 𝐯 ⋅ ∇Φ + Φ div 𝐯
(𝑣 𝑖 Φ),𝑖 = Φ𝑣 𝑖 ,𝑖 + 𝑣 𝑖 Φ,i
Hence the result.

34. Show that curl(𝐮 × 𝐯) = (𝐯 ⋅ ∇)𝐮 + 𝐮 div 𝐯 − 𝐯 div 𝐮 − (𝐮 ⋅ ∇)𝐯


Taking the associated (covariant) vector of the expression for the cross product in the last example, it is straightforward to see that the LHS in indicial notation is
\[
\epsilon^{lmi}\bigl(\epsilon_{ijk}u^{j}v^{k}\bigr)_{,m}
\]
Expanding in the usual way, noting the relation between the alternating tensors and the Kronecker deltas,
\[
\begin{aligned}
\epsilon^{lmi}\bigl(\epsilon_{ijk}u^{j}v^{k}\bigr)_{,m}
&= \delta^{lmi}_{jki}\bigl(u^{j}{}_{,m}v^{k} + u^{j}v^{k}{}_{,m}\bigr)
= \begin{vmatrix}\delta^{l}_{j} & \delta^{m}_{j}\\ \delta^{l}_{k} & \delta^{m}_{k}\end{vmatrix}
\bigl(u^{j}{}_{,m}v^{k} + u^{j}v^{k}{}_{,m}\bigr)\\
&= \bigl(\delta^{l}_{j}\delta^{m}_{k} - \delta^{l}_{k}\delta^{m}_{j}\bigr)\bigl(u^{j}{}_{,m}v^{k} + u^{j}v^{k}{}_{,m}\bigr)\\
&= \delta^{l}_{j}\delta^{m}_{k}u^{j}{}_{,m}v^{k} + \delta^{l}_{j}\delta^{m}_{k}u^{j}v^{k}{}_{,m}
- \delta^{l}_{k}\delta^{m}_{j}u^{j}{}_{,m}v^{k} - \delta^{l}_{k}\delta^{m}_{j}u^{j}v^{k}{}_{,m}\\
&= u^{l}{}_{,m}v^{m} + u^{l}v^{m}{}_{,m} - u^{m}{}_{,m}v^{l} - u^{m}v^{l}{}_{,m}
\end{aligned}
\]
Which is the result we seek in indicial notation.

35. In Cartesian coordinates, let \(x\) denote the magnitude of the position vector \(\mathbf{r} = x_i\mathbf{e}_i\). Show that (a) \(x_{,j} = \dfrac{x_j}{x}\), (b) \(x_{,ij} = \dfrac{1}{x}\delta_{ij} - \dfrac{x_i x_j}{x^{3}}\), (c) \(x_{,ii} = \dfrac{2}{x}\), (d) if \(U = \dfrac{1}{x}\), then \(U_{,ij} = -\dfrac{\delta_{ij}}{x^{3}} + \dfrac{3x_i x_j}{x^{5}}\), \(U_{,ii} = 0\) and \(\operatorname{div}\left(\dfrac{\mathbf{r}}{x}\right) = \dfrac{2}{x}\).

(a) \(x = \sqrt{x_i x_i}\), so
\[
x_{,j} = \frac{\partial\sqrt{x_i x_i}}{\partial x_j}
= \frac{\partial\sqrt{x_i x_i}}{\partial(x_i x_i)}\times\frac{\partial(x_i x_i)}{\partial x_j}
= \frac{1}{2\sqrt{x_i x_i}}\bigl[x_i\delta_{ij} + x_i\delta_{ij}\bigr] = \frac{x_j}{x}.
\]
(b)
\[
x_{,ij} = \frac{\partial}{\partial x_j}\bigl(x_{,i}\bigr)
= \frac{\partial}{\partial x_j}\left(\frac{x_i}{x}\right)
= \frac{x\,\dfrac{\partial x_i}{\partial x_j} - x_i\,\dfrac{\partial x}{\partial x_j}}{x^{2}}
= \frac{x\,\delta_{ij} - \dfrac{x_i x_j}{x}}{x^{2}}
= \frac{1}{x}\delta_{ij} - \frac{x_i x_j}{x^{3}}
\]
(c)
\[
x_{,ii} = \frac{1}{x}\delta_{ii} - \frac{x_i x_i}{x^{3}} = \frac{3}{x} - \frac{x^{2}}{x^{3}} = \frac{2}{x}.
\]
(d) \(U = \dfrac{1}{x}\), so that
\[
U_{,j} = \frac{\partial(1/x)}{\partial x_j}
= \frac{\partial(1/x)}{\partial x}\times\frac{\partial x}{\partial x_j}
= -\frac{1}{x^{2}}\,\frac{x_j}{x} = -\frac{x_j}{x^{3}}
\]
Consequently,
\[
U_{,ij} = \frac{\partial}{\partial x_j}\left(-\frac{x_i}{x^{3}}\right)
= -\frac{x^{3}\,\dfrac{\partial x_i}{\partial x_j} - x_i\,\dfrac{\partial(x^{3})}{\partial x_j}}{x^{6}}
= \frac{-x^{3}\delta_{ij} + x_i\bigl(3x^{2}\,\tfrac{x_j}{x}\bigr)}{x^{6}}
= -\frac{\delta_{ij}}{x^{3}} + \frac{3x_i x_j}{x^{5}}
\]
\[
U_{,ii} = -\frac{\delta_{ii}}{x^{3}} + \frac{3x_i x_i}{x^{5}} = -\frac{3}{x^{3}} + \frac{3x^{2}}{x^{5}} = 0.
\]
\[
\operatorname{div}\left(\frac{\mathbf{r}}{x}\right) = \left(\frac{x_j}{x}\right)_{,j}
= \frac{1}{x}\,x_{j,j} + x_j\left(\frac{1}{x}\right)_{,j}
= \frac{3}{x} + x_j\left[-\frac{1}{x^{2}}\,\frac{x_j}{x}\right]
= \frac{3}{x} - \frac{x_j x_j}{x^{3}} = \frac{3}{x} - \frac{1}{x} = \frac{2}{x}
\]

36. For vectors 𝐮, 𝐯 and 𝐰, show that (𝐮 ×)(𝐯 ×)(𝐰 ×) = 𝐮 ⊗ (𝐯 × 𝐰) − (𝐮 ⋅ 𝐯)(𝐰 ×).


The tensor \((\mathbf{u}\times) = -\epsilon_{lmn}u^{n}\,\mathbf{g}^{l}\otimes\mathbf{g}^{m}\); similarly, \((\mathbf{v}\times) = -\epsilon^{\alpha\beta\gamma}v_{\gamma}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}\) and \((\mathbf{w}\times) = -\epsilon^{ijk}w_{k}\,\mathbf{g}_i\otimes\mathbf{g}_j\). Clearly,
\[
\begin{aligned}
(\mathbf{u}\times)(\mathbf{v}\times)(\mathbf{w}\times)
&= -\epsilon_{lmn}\epsilon^{\alpha\beta\gamma}\epsilon^{ijk}\,u^{n}v_{\gamma}w_{k}\,
(\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta})(\mathbf{g}^{l}\otimes\mathbf{g}^{m})(\mathbf{g}_i\otimes\mathbf{g}_j)\\
&= -\epsilon^{\alpha\beta\gamma}\epsilon_{lmn}\epsilon^{ijk}\,u^{n}v_{\gamma}w_{k}\,
(\mathbf{g}_{\alpha}\otimes\mathbf{g}_j)\,\delta^{l}_{\beta}\,\delta^{m}_{i}
= -\epsilon^{\alpha l\gamma}\epsilon_{lin}\epsilon^{ijk}\,u^{n}v_{\gamma}w_{k}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}_j)\\
&= -\epsilon^{l\alpha\gamma}\epsilon_{lni}\epsilon^{ijk}\,u^{n}v_{\gamma}w_{k}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}_j)
= -\bigl(\delta^{\alpha}_{n}\delta^{\gamma}_{i} - \delta^{\alpha}_{i}\delta^{\gamma}_{n}\bigr)\epsilon^{ijk}\,u^{n}v_{\gamma}w_{k}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}_j)\\
&= -\epsilon^{ijk}u^{\alpha}v_{i}w_{k}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}_j)
+ \epsilon^{ijk}u^{\gamma}v_{\gamma}w_{k}\,(\mathbf{g}_i\otimes\mathbf{g}_j)\\
&= \mathbf{u}\otimes(\mathbf{v}\times\mathbf{w}) - (\mathbf{u}\cdot\mathbf{v})(\mathbf{w}\times)
\end{aligned}
\]

37. Show that [𝐮, 𝐯, 𝐰] = tr[(𝐮 ×)(𝐯 ×)(𝐰 ×)]


In the above we have shown that (𝐮 ×)(𝐯 ×)(𝐰 ×) = [𝐮 ⊗ (𝐯 × 𝐰) −
(𝐮 ⋅ 𝐯)𝐰 ×]
Because the vector cross is traceless, the trace of [(𝐮 ⋅ 𝐯)𝐰 ×] = 0. The trace of
the first term, 𝐮 ⊗ (𝐯 × 𝐰) is obviously the same as [𝐮, 𝐯, 𝐰] which completes
the proof.
38. Show that (𝐮 ×)(𝐯 ×) = (𝐮 ⋅ 𝐯)𝟏 − 𝐮 ⊗ 𝐯 and that tr[(𝐮 ×)(𝐯 ×)] = 2(𝐮 ⋅ 𝐯)
\[
\begin{aligned}
(\mathbf{u}\times)(\mathbf{v}\times)
&= -\epsilon_{lmn}\epsilon^{\alpha\beta\gamma}\,u^{n}v_{\gamma}\,
(\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta})(\mathbf{g}^{l}\otimes\mathbf{g}^{m})\\
&= -\epsilon_{lmn}\epsilon^{\alpha\beta\gamma}\,u^{n}v_{\gamma}\,\delta^{l}_{\beta}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}^{m})
= -\epsilon_{\beta mn}\epsilon^{\beta\gamma\alpha}\,u^{n}v_{\gamma}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}^{m})\\
&= \bigl[\delta^{\gamma}_{n}\delta^{\alpha}_{m} - \delta^{\gamma}_{m}\delta^{\alpha}_{n}\bigr]u^{n}v_{\gamma}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}^{m})\\
&= u^{n}v_{n}\,(\mathbf{g}_{\alpha}\otimes\mathbf{g}^{\alpha}) - u^{n}v_{m}\,(\mathbf{g}_n\otimes\mathbf{g}^{m})
= (\mathbf{u}\cdot\mathbf{v})\mathbf{1} - \mathbf{u}\otimes\mathbf{v}
\end{aligned}
\]
Obviously, the trace of this tensor is 2(𝐮 ⋅ 𝐯)

39. The position vector in the above example is 𝒓 = 𝑥𝑖 𝒆𝑖 . Show that (a) grad 𝒓 = 𝟏, (b) div 𝒓 = 3, (c) div(𝒓 ⊗ 𝒓) = 4𝒓, and (d) curl(𝒓 ⊗ 𝒓) = −𝒓 ×.
\[
\operatorname{grad}\mathbf{r} = x_{i,j}\,\mathbf{e}_i\otimes\mathbf{e}_j = \delta_{ij}\,\mathbf{e}_i\otimes\mathbf{e}_j = \mathbf{1}
\]
\[
\operatorname{div}\mathbf{r} = x_{i,j}\,\mathbf{e}_i\cdot\mathbf{e}_j = \delta_{ij}\delta_{ij} = \delta_{jj} = 3.
\]
\(\mathbf{r}\otimes\mathbf{r} = x_i\mathbf{e}_i\otimes x_j\mathbf{e}_j = x_i x_j\,\mathbf{e}_i\otimes\mathbf{e}_j\), and \(\operatorname{grad}(\mathbf{r}\otimes\mathbf{r}) = (x_i x_j)_{,k}\,\mathbf{e}_i\otimes\mathbf{e}_j\otimes\mathbf{e}_k\). Contracting the last two bases,
\[
\operatorname{div}(\mathbf{r}\otimes\mathbf{r})
= \bigl(x_{i,k}x_j + x_i x_{j,k}\bigr)\,\mathbf{e}_i\,(\mathbf{e}_j\cdot\mathbf{e}_k)
= \bigl(\delta_{ik}x_j + x_i\delta_{jk}\bigr)\delta_{jk}\,\mathbf{e}_i
= \bigl(\delta_{ik}x_k + x_i\delta_{jj}\bigr)\mathbf{e}_i
= 4x_i\,\mathbf{e}_i = 4\mathbf{r}
\]
\[
\begin{aligned}
\operatorname{curl}(\mathbf{r}\otimes\mathbf{r})
&= \epsilon_{\alpha\beta\gamma}\bigl(x_i x_{\gamma}\bigr)_{,\beta}\,\mathbf{e}_{\alpha}\otimes\mathbf{e}_i
= \epsilon_{\alpha\beta\gamma}\bigl(x_{i,\beta}x_{\gamma} + x_i x_{\gamma,\beta}\bigr)\mathbf{e}_{\alpha}\otimes\mathbf{e}_i
= \epsilon_{\alpha\beta\gamma}\bigl(\delta_{i\beta}x_{\gamma} + x_i\delta_{\gamma\beta}\bigr)\mathbf{e}_{\alpha}\otimes\mathbf{e}_i\\
&= \epsilon_{\alpha i\gamma}x_{\gamma}\,\mathbf{e}_{\alpha}\otimes\mathbf{e}_i + \epsilon_{\alpha\beta\beta}x_i\,\mathbf{e}_{\alpha}\otimes\mathbf{e}_i
= -\epsilon_{\alpha\gamma i}x_{\gamma}\,\mathbf{e}_{\alpha}\otimes\mathbf{e}_i = -\mathbf{r}\times
\end{aligned}
\]
40. Define the magnitude of the tensor 𝐀 as \(|\mathbf{A}| = \sqrt{\operatorname{tr}(\mathbf{A}\mathbf{A}^{\mathrm T})}\). Show that \(\dfrac{\partial|\mathbf{A}|}{\partial\mathbf{A}} = \dfrac{\mathbf{A}}{|\mathbf{A}|}\)
By definition, given a scalar 𝛼, the derivative of a scalar function of a tensor 𝑓(𝐀) is
\[
\frac{\partial f(\mathbf{A})}{\partial\mathbf{A}}:\mathbf{B} = \lim_{\alpha\to 0}\frac{\partial}{\partial\alpha}f(\mathbf{A} + \alpha\mathbf{B})
\]
for any arbitrary tensor 𝐁.
In the case of \(f(\mathbf{A}) = |\mathbf{A}|\),
\[
\frac{\partial|\mathbf{A}|}{\partial\mathbf{A}}:\mathbf{B} = \lim_{\alpha\to 0}\frac{\partial}{\partial\alpha}|\mathbf{A} + \alpha\mathbf{B}|,
\qquad
|\mathbf{A} + \alpha\mathbf{B}| = \sqrt{\operatorname{tr}\bigl[(\mathbf{A} + \alpha\mathbf{B})(\mathbf{A} + \alpha\mathbf{B})^{\mathrm T}\bigr]}
= \sqrt{\operatorname{tr}\bigl(\mathbf{A}\mathbf{A}^{\mathrm T} + \alpha\mathbf{B}\mathbf{A}^{\mathrm T} + \alpha\mathbf{A}\mathbf{B}^{\mathrm T} + \alpha^{2}\mathbf{B}\mathbf{B}^{\mathrm T}\bigr)}
\]
Note that everything under the root sign here is a scalar and that the trace operation is linear. Consequently, we can write
\[
\lim_{\alpha\to 0}\frac{\partial}{\partial\alpha}|\mathbf{A} + \alpha\mathbf{B}|
= \lim_{\alpha\to 0}
\frac{\operatorname{tr}(\mathbf{B}\mathbf{A}^{\mathrm T}) + \operatorname{tr}(\mathbf{A}\mathbf{B}^{\mathrm T}) + 2\alpha\operatorname{tr}(\mathbf{B}\mathbf{B}^{\mathrm T})}
{2\sqrt{\operatorname{tr}\bigl(\mathbf{A}\mathbf{A}^{\mathrm T} + \alpha\mathbf{B}\mathbf{A}^{\mathrm T} + \alpha\mathbf{A}\mathbf{B}^{\mathrm T} + \alpha^{2}\mathbf{B}\mathbf{B}^{\mathrm T}\bigr)}}
= \frac{2\mathbf{A}:\mathbf{B}}{2\sqrt{\mathbf{A}:\mathbf{A}}}
= \frac{\mathbf{A}}{|\mathbf{A}|}:\mathbf{B}
\]
So that
\[
\frac{\partial|\mathbf{A}|}{\partial\mathbf{A}}:\mathbf{B} = \frac{\mathbf{A}}{|\mathbf{A}|}:\mathbf{B}
\quad\text{or}\quad
\frac{\partial|\mathbf{A}|}{\partial\mathbf{A}} = \frac{\mathbf{A}}{|\mathbf{A}|}
\]
as required, since 𝐁 is arbitrary.
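
A quick numerical check of this derivative via a central finite difference in an arbitrary direction 𝐁 (numpy assumed):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))          # arbitrary direction

norm = lambda M: np.sqrt(np.sum(M*M))     # |M| = sqrt(M:M)

h = 1e-6
directional = (norm(A + h*B) - norm(A - h*B)) / (2*h)   # d/da |A + a B| at a = 0
predicted = np.sum((A/norm(A)) * B)                      # (A/|A|) : B

print(np.isclose(directional, predicted))  # True
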
41. Show that \(\dfrac{\partial I_3(\mathbf{S})}{\partial\mathbf{S}} = \dfrac{\partial\det(\mathbf{S})}{\partial\mathbf{S}} = \mathbf{S}^{c}\), the cofactor of 𝑺.
Clearly \(\mathbf{S}^{c} = \det(\mathbf{S})\,\mathbf{S}^{-\mathrm T} = I_3(\mathbf{S})\,\mathbf{S}^{-\mathrm T}\). Details of this, for the contravariant components of a tensor, are presented below. Let
\[
\det(\mathbf{S}) \equiv |\mathbf{S}| \equiv S = \frac{1}{3!}\,\epsilon^{ijk}\epsilon^{rst}S_{ir}S_{js}S_{kt}
\]
Differentiating with respect to \(S_{\alpha\beta}\), we obtain
\[
\begin{aligned}
\frac{\partial S}{\partial S_{\alpha\beta}}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
&= \frac{1}{3!}\,\epsilon^{ijk}\epsilon^{rst}\left[\frac{\partial S_{ir}}{\partial S_{\alpha\beta}}S_{js}S_{kt}
+ S_{ir}\frac{\partial S_{js}}{\partial S_{\alpha\beta}}S_{kt}
+ S_{ir}S_{js}\frac{\partial S_{kt}}{\partial S_{\alpha\beta}}\right]\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}\\
&= \frac{1}{3!}\,\epsilon^{ijk}\epsilon^{rst}\bigl[\delta^{\alpha}_{i}\delta^{\beta}_{r}S_{js}S_{kt}
+ S_{ir}\delta^{\alpha}_{j}\delta^{\beta}_{s}S_{kt}
+ S_{ir}S_{js}\delta^{\alpha}_{k}\delta^{\beta}_{t}\bigr]\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}\\
&= \frac{1}{3!}\,\epsilon^{\alpha jk}\epsilon^{\beta st}\bigl[S_{js}S_{kt} + S_{js}S_{kt} + S_{js}S_{kt}\bigr]\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
= \frac{1}{2!}\,\epsilon^{\alpha jk}\epsilon^{\beta st}S_{js}S_{kt}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
\equiv [S^{c}]^{\alpha\beta}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
\end{aligned}
\]
which is the cofactor of \([S_{\alpha\beta}]\), that is, of 𝑺.
42. For a scalar variable 𝛼, if the tensor 𝐓 = 𝐓(𝛼) and \(\dot{\mathbf{T}} \equiv \dfrac{d\mathbf{T}}{d\alpha}\), show that
\[
\frac{d}{d\alpha}\det(\mathbf{T}) = \det(\mathbf{T})\,\operatorname{tr}\bigl(\dot{\mathbf{T}}\mathbf{T}^{-1}\bigr)
\]
Let \(\mathbf{A} \equiv \dot{\mathbf{T}}\mathbf{T}^{-1}\), so that \(\dot{\mathbf{T}} = \mathbf{A}\mathbf{T}\). In component form, \(\dot{T}^{i}_{\;j} = A^{i}_{\;m}T^{m}_{\;j}\). Therefore,
\[
\begin{aligned}
\frac{d}{d\alpha}\det(\mathbf{T})
&= \frac{d}{d\alpha}\bigl(\epsilon^{ijk}T^{1}_{\;i}T^{2}_{\;j}T^{3}_{\;k}\bigr)
= \epsilon^{ijk}\bigl(\dot{T}^{1}_{\;i}T^{2}_{\;j}T^{3}_{\;k} + T^{1}_{\;i}\dot{T}^{2}_{\;j}T^{3}_{\;k} + T^{1}_{\;i}T^{2}_{\;j}\dot{T}^{3}_{\;k}\bigr)\\
&= \epsilon^{ijk}\bigl(A^{1}_{\;l}T^{l}_{\;i}T^{2}_{\;j}T^{3}_{\;k} + T^{1}_{\;i}A^{2}_{\;m}T^{m}_{\;j}T^{3}_{\;k} + T^{1}_{\;i}T^{2}_{\;j}A^{3}_{\;n}T^{n}_{\;k}\bigr)\\
&= \epsilon^{ijk}\bigl[\bigl(A^{1}_{\;1}T^{1}_{\;i} + A^{1}_{\;2}T^{2}_{\;i} + A^{1}_{\;3}T^{3}_{\;i}\bigr)T^{2}_{\;j}T^{3}_{\;k}
+ T^{1}_{\;i}\bigl(A^{2}_{\;1}T^{1}_{\;j} + A^{2}_{\;2}T^{2}_{\;j} + A^{2}_{\;3}T^{3}_{\;j}\bigr)T^{3}_{\;k}
+ T^{1}_{\;i}T^{2}_{\;j}\bigl(A^{3}_{\;1}T^{1}_{\;k} + A^{3}_{\;2}T^{2}_{\;k} + A^{3}_{\;3}T^{3}_{\;k}\bigr)\bigr]
\end{aligned}
\]
All the cross terms (those in which the same row of 𝐓 appears twice) vanish on account of the contraction of a symmetric product with the antisymmetric alternating tensor. (For example, one such term is \(\epsilon^{ijk}A^{1}_{\;2}T^{2}_{\;i}T^{2}_{\;j}T^{3}_{\;k}\), which is symmetric as well as antisymmetric in \(i\) and \(j\); it therefore vanishes. The same is true for all other such terms.) Hence,
\[
\frac{d}{d\alpha}\det(\mathbf{T})
= \epsilon^{ijk}\bigl[\bigl(A^{1}_{\;1}T^{1}_{\;i}\bigr)T^{2}_{\;j}T^{3}_{\;k} + T^{1}_{\;i}\bigl(A^{2}_{\;2}T^{2}_{\;j}\bigr)T^{3}_{\;k} + T^{1}_{\;i}T^{2}_{\;j}\bigl(A^{3}_{\;3}T^{3}_{\;k}\bigr)\bigr]
= A^{m}_{\;m}\,\epsilon^{ijk}T^{1}_{\;i}T^{2}_{\;j}T^{3}_{\;k}
= \operatorname{tr}\bigl(\dot{\mathbf{T}}\mathbf{T}^{-1}\bigr)\det(\mathbf{T})
\]

as required.
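
This is Jacobi's (Liouville's) formula; a symbolic spot-check on an arbitrary parameter-dependent tensor (sympy assumed):

import sympy as sp

a = sp.symbols('a')

# An arbitrary invertible, parameter-dependent tensor
T = sp.Matrix([[1 + a**2, sp.sin(a), 0],
               [a, 2, sp.exp(a)],
               [0, a**3, 3 + a]])

lhs = sp.diff(T.det(), a)
rhs = T.det() * (T.diff(a) * T.inv()).trace()

print(sp.simplify(lhs - rhs))  # 0
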
43. Without breaking down into components, establish the fact that \(\dfrac{\partial\det(\mathbf{T})}{\partial\mathbf{T}} = \mathbf{T}^{c}\)

Start from Liouville's theorem: given a scalar parameter 𝛼 such that 𝐓 = 𝐓(𝛼),
\[
\frac{\partial}{\partial\alpha}\bigl(\det(\mathbf{T})\bigr)
= \det(\mathbf{T})\,\operatorname{tr}\!\left[\left(\frac{\partial\mathbf{T}}{\partial\alpha}\right)\mathbf{T}^{-1}\right]
= \bigl[\det(\mathbf{T})\,\mathbf{T}^{-\mathrm T}\bigr]:\left(\frac{\partial\mathbf{T}}{\partial\alpha}\right)
\]
By the chain rule,
\[
\frac{\partial}{\partial\alpha}\bigl(\det(\mathbf{T})\bigr)
= \left[\frac{\partial}{\partial\mathbf{T}}\bigl(\det(\mathbf{T})\bigr)\right]:\left(\frac{\partial\mathbf{T}}{\partial\alpha}\right)
\]
It therefore follows that
\[
\left[\frac{\partial}{\partial\mathbf{T}}\bigl(\det(\mathbf{T})\bigr) - \det(\mathbf{T})\,\mathbf{T}^{-\mathrm T}\right]:\left(\frac{\partial\mathbf{T}}{\partial\alpha}\right) = 0
\]
Hence
\[
\frac{\partial}{\partial\mathbf{T}}\bigl(\det(\mathbf{T})\bigr) = \det(\mathbf{T})\,\mathbf{T}^{-\mathrm T} = \mathbf{T}^{c}
\]

44. [Gurtin 3.4.2a] If 𝐓 is invertible, show that \(\dfrac{\partial}{\partial\mathbf{T}}\bigl(\log\det(\mathbf{T})\bigr) = \mathbf{T}^{-\mathrm T}\)
\[
\frac{\partial}{\partial\mathbf{T}}\bigl(\log\det(\mathbf{T})\bigr)
= \frac{\partial\bigl(\log\det(\mathbf{T})\bigr)}{\partial\det(\mathbf{T})}\,\frac{\partial\det(\mathbf{T})}{\partial\mathbf{T}}
= \frac{1}{\det(\mathbf{T})}\,\mathbf{T}^{c}
= \frac{1}{\det(\mathbf{T})}\,\det(\mathbf{T})\,\mathbf{T}^{-\mathrm T}
= \mathbf{T}^{-\mathrm T}
\]

45. [Gurtin 3.4.2a] If 𝐓 is invertible, show that \(\dfrac{\partial}{\partial\mathbf{T}}\bigl(\log\det(\mathbf{T}^{-1})\bigr) = -\mathbf{T}^{-\mathrm T}\)
\[
\frac{\partial}{\partial\mathbf{T}}\bigl(\log\det(\mathbf{T}^{-1})\bigr)
= \frac{\partial\bigl(\log\det(\mathbf{T}^{-1})\bigr)}{\partial\det(\mathbf{T}^{-1})}\,
\frac{\partial\det(\mathbf{T}^{-1})}{\partial\mathbf{T}^{-1}}\,
\frac{\partial\mathbf{T}^{-1}}{\partial\mathbf{T}}
= \frac{1}{\det(\mathbf{T}^{-1})}\,\mathbf{T}^{-c}\,\bigl(-\mathbf{T}^{-2}\bigr)
= \frac{1}{\det(\mathbf{T}^{-1})}\,\det(\mathbf{T}^{-1})\,\mathbf{T}^{\mathrm T}\,\bigl(-\mathbf{T}^{-2}\bigr)
= -\mathbf{T}^{-\mathrm T}
\]


46. Given that 𝐀 is a constant tensor, show that \(\dfrac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{A}\mathbf{S}) = \mathbf{A}^{\mathrm T}\)
In invariant component terms, let \(\mathbf{A} = A^{ij}\,\mathbf{g}_i\otimes\mathbf{g}_j\) and \(\mathbf{S} = S_{\alpha\beta}\,\mathbf{g}^{\alpha}\otimes\mathbf{g}^{\beta}\).
\[
\mathbf{A}\mathbf{S} = A^{ij}S_{\alpha\beta}\,(\mathbf{g}_i\otimes\mathbf{g}_j)(\mathbf{g}^{\alpha}\otimes\mathbf{g}^{\beta})
= A^{ij}S_{\alpha\beta}\,\delta^{\alpha}_{j}\,(\mathbf{g}_i\otimes\mathbf{g}^{\beta})
= A^{ij}S_{j\beta}\,(\mathbf{g}_i\otimes\mathbf{g}^{\beta})
\]
\[
\operatorname{tr}(\mathbf{A}\mathbf{S}) = A^{ij}S_{j\beta}\,\bigl(\mathbf{g}_i\cdot\mathbf{g}^{\beta}\bigr)
= A^{ij}S_{j\beta}\,\delta^{\beta}_{i} = A^{ij}S_{ji}
\]
\[
\frac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{A}\mathbf{S})
= \frac{\partial\operatorname{tr}(\mathbf{A}\mathbf{S})}{\partial S_{\alpha\beta}}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
= \frac{\partial\bigl(A^{ij}S_{ji}\bigr)}{\partial S_{\alpha\beta}}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
= A^{ij}\delta^{\alpha}_{j}\delta^{\beta}_{i}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
= A^{ij}\,\mathbf{g}_j\otimes\mathbf{g}_i = \mathbf{A}^{\mathrm T}
\]
(Equivalently, \(\operatorname{tr}(\mathbf{A}\mathbf{S}) = \mathbf{A}^{\mathrm T}:\mathbf{S}\).)
as required.

47. Given that 𝐀 and 𝐁 are constant tensors, show that \(\dfrac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{A}\mathbf{S}\mathbf{B}^{\mathrm T}) = \mathbf{A}^{\mathrm T}\mathbf{B}\)
First observe that \(\operatorname{tr}(\mathbf{A}\mathbf{S}\mathbf{B}^{\mathrm T}) = \operatorname{tr}(\mathbf{B}^{\mathrm T}\mathbf{A}\mathbf{S})\). If we write \(\mathbf{C} \equiv \mathbf{B}^{\mathrm T}\mathbf{A}\), it is obvious from the above that \(\dfrac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{C}\mathbf{S}) = \mathbf{C}^{\mathrm T}\). Therefore,
\[
\frac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{A}\mathbf{S}\mathbf{B}^{\mathrm T})
= \bigl(\mathbf{B}^{\mathrm T}\mathbf{A}\bigr)^{\mathrm T} = \mathbf{A}^{\mathrm T}\mathbf{B}
\]

48. Given that 𝐀 and 𝐁 are constant tensors, show that \(\dfrac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{A}\mathbf{S}^{\mathrm T}\mathbf{B}^{\mathrm T}) = \mathbf{B}^{\mathrm T}\mathbf{A}\)
Observe that \(\operatorname{tr}(\mathbf{A}\mathbf{S}^{\mathrm T}\mathbf{B}^{\mathrm T}) = \operatorname{tr}(\mathbf{B}^{\mathrm T}\mathbf{A}\mathbf{S}^{\mathrm T}) = \operatorname{tr}\bigl[\mathbf{S}(\mathbf{B}^{\mathrm T}\mathbf{A})^{\mathrm T}\bigr] = \operatorname{tr}\bigl[(\mathbf{B}^{\mathrm T}\mathbf{A})^{\mathrm T}\mathbf{S}\bigr]\)
[Transposition does not alter the trace; neither does a cyclic permutation. Ensure you understand why each equality here is true.] Consequently,
\[
\frac{\partial}{\partial\mathbf{S}}\operatorname{tr}(\mathbf{A}\mathbf{S}^{\mathrm T}\mathbf{B}^{\mathrm T})
= \frac{\partial}{\partial\mathbf{S}}\operatorname{tr}\bigl[(\mathbf{B}^{\mathrm T}\mathbf{A})^{\mathrm T}\mathbf{S}\bigr]
= \bigl[(\mathbf{B}^{\mathrm T}\mathbf{A})^{\mathrm T}\bigr]^{\mathrm T} = \mathbf{B}^{\mathrm T}\mathbf{A}
\]
49. Let 𝑺 be a symmetric and positive definite tensor and let \(I_1(\mathbf{S})\), \(I_2(\mathbf{S})\) and \(I_3(\mathbf{S})\) be the three principal invariants of 𝑺. Show that (a) \(\dfrac{\partial I_1(\mathbf{S})}{\partial\mathbf{S}} = \mathbf{1}\), the identity tensor, (b) \(\dfrac{\partial I_2(\mathbf{S})}{\partial\mathbf{S}} = I_1(\mathbf{S})\mathbf{1} - \mathbf{S}\) and (c) \(\dfrac{\partial I_3(\mathbf{S})}{\partial\mathbf{S}} = I_3(\mathbf{S})\,\mathbf{S}^{-1}\)
\(\dfrac{\partial I_1(\mathbf{S})}{\partial\mathbf{S}}\) can be written in the invariant component form
\[
\frac{\partial I_1(\mathbf{S})}{\partial\mathbf{S}} = \frac{\partial I_1(\mathbf{S})}{\partial S^{i}_{\;j}}\,\mathbf{g}_i\otimes\mathbf{g}^{j}
\]
Recall that \(I_1(\mathbf{S}) = \operatorname{tr}(\mathbf{S}) = S^{\alpha}_{\;\alpha}\), hence
\[
\frac{\partial I_1(\mathbf{S})}{\partial\mathbf{S}}
= \frac{\partial S^{\alpha}_{\;\alpha}}{\partial S^{i}_{\;j}}\,\mathbf{g}_i\otimes\mathbf{g}^{j}
= \delta^{\alpha}_{i}\delta^{j}_{\alpha}\,\mathbf{g}_i\otimes\mathbf{g}^{j}
= \delta^{j}_{i}\,\mathbf{g}_i\otimes\mathbf{g}^{j}
= \mathbf{1}
\]
which is the identity tensor as expected.
\(\dfrac{\partial I_2(\mathbf{S})}{\partial\mathbf{S}}\) can similarly be written in the invariant component form
\[
\frac{\partial I_2(\mathbf{S})}{\partial\mathbf{S}}
= \frac{1}{2}\frac{\partial}{\partial S^{i}_{\;j}}\bigl[S^{\alpha}_{\;\alpha}S^{\beta}_{\;\beta} - S^{\alpha}_{\;\beta}S^{\beta}_{\;\alpha}\bigr]\,\mathbf{g}_i\otimes\mathbf{g}^{j}
\]
where we have utilized the fact that \(I_2(\mathbf{S}) = \frac{1}{2}\bigl[\operatorname{tr}^{2}(\mathbf{S}) - \operatorname{tr}(\mathbf{S}^{2})\bigr]\). Consequently,
\[
\begin{aligned}
\frac{\partial I_2(\mathbf{S})}{\partial\mathbf{S}}
&= \frac{1}{2}\bigl[\delta^{\alpha}_{i}\delta^{j}_{\alpha}S^{\beta}_{\;\beta} + \delta^{\beta}_{i}\delta^{j}_{\beta}S^{\alpha}_{\;\alpha}
- \delta^{\alpha}_{i}\delta^{j}_{\beta}S^{\beta}_{\;\alpha} - \delta^{\beta}_{i}\delta^{j}_{\alpha}S^{\alpha}_{\;\beta}\bigr]\,\mathbf{g}_i\otimes\mathbf{g}^{j}\\
&= \frac{1}{2}\bigl[\delta^{j}_{i}S^{\beta}_{\;\beta} + \delta^{j}_{i}S^{\alpha}_{\;\alpha} - S^{j}_{\;i} - S^{j}_{\;i}\bigr]\,\mathbf{g}_i\otimes\mathbf{g}^{j}
= \bigl(\delta^{j}_{i}S^{\alpha}_{\;\alpha} - S^{j}_{\;i}\bigr)\,\mathbf{g}_i\otimes\mathbf{g}^{j}\\
&= I_1(\mathbf{S})\mathbf{1} - \mathbf{S}
\end{aligned}
\]
For (c), differentiate the determinant directly, as in Q41. Let
\[
\det(\mathbf{S}) \equiv |\mathbf{S}| \equiv S = \frac{1}{3!}\,\epsilon^{ijk}\epsilon^{rst}S_{ir}S_{js}S_{kt}
\]
Differentiating with respect to \(S_{\alpha\beta}\), we obtain
\[
\begin{aligned}
\frac{\partial S}{\partial S_{\alpha\beta}}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
&= \frac{1}{3!}\,\epsilon^{ijk}\epsilon^{rst}\left[\frac{\partial S_{ir}}{\partial S_{\alpha\beta}}S_{js}S_{kt}
+ S_{ir}\frac{\partial S_{js}}{\partial S_{\alpha\beta}}S_{kt}
+ S_{ir}S_{js}\frac{\partial S_{kt}}{\partial S_{\alpha\beta}}\right]\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}\\
&= \frac{1}{3!}\,\epsilon^{ijk}\epsilon^{rst}\bigl[\delta^{\alpha}_{i}\delta^{\beta}_{r}S_{js}S_{kt}
+ S_{ir}\delta^{\alpha}_{j}\delta^{\beta}_{s}S_{kt}
+ S_{ir}S_{js}\delta^{\alpha}_{k}\delta^{\beta}_{t}\bigr]\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}\\
&= \frac{1}{3!}\,\epsilon^{\alpha jk}\epsilon^{\beta st}\bigl[S_{js}S_{kt} + S_{js}S_{kt} + S_{js}S_{kt}\bigr]\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
= \frac{1}{2!}\,\epsilon^{\alpha jk}\epsilon^{\beta st}S_{js}S_{kt}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
\equiv [S^{c}]^{\alpha\beta}\,\mathbf{g}_{\alpha}\otimes\mathbf{g}_{\beta}
\end{aligned}
\]
which is the cofactor of \([S_{\alpha\beta}]\), that is, \(\mathbf{S}^{c} = \det(\mathbf{S})\,\mathbf{S}^{-\mathrm T}\). Since 𝑺 is symmetric, \(\mathbf{S}^{-\mathrm T} = \mathbf{S}^{-1}\), so that
\[
\frac{\partial I_3(\mathbf{S})}{\partial\mathbf{S}} = I_3(\mathbf{S})\,\mathbf{S}^{-1}
\]
50. For a tensor field 𝜩, the volume integral over the region Ω ⊂ ℰ satisfies \(\int_{\Omega}(\operatorname{grad}\boldsymbol{\Xi})\,dv = \int_{\partial\Omega}\boldsymbol{\Xi}\otimes\mathbf{n}\,ds\), where 𝒏 is the outward drawn normal to 𝜕Ω, the boundary of Ω. Show that for a vector field 𝒇,
\[
\int_{\Omega}(\operatorname{div}\mathbf{f})\,dv = \int_{\partial\Omega}\mathbf{f}\cdot\mathbf{n}\,ds
\]
Replacing 𝜩 by the vector field 𝒇, we have
\[
\int_{\Omega}(\operatorname{grad}\mathbf{f})\,dv = \int_{\partial\Omega}\mathbf{f}\otimes\mathbf{n}\,ds
\]
Taking the trace of both sides and noting that both the trace and the integral are linear operations, we have
\[
\int_{\Omega}(\operatorname{div}\mathbf{f})\,dv = \int_{\Omega}\operatorname{tr}(\operatorname{grad}\mathbf{f})\,dv
= \int_{\partial\Omega}\operatorname{tr}(\mathbf{f}\otimes\mathbf{n})\,ds
= \int_{\partial\Omega}\mathbf{f}\cdot\mathbf{n}\,ds
\]
51. Show that for a scalar function 𝜙 the divergence theorem becomes
\[
\int_{\Omega}(\operatorname{grad}\phi)\,dv = \int_{\partial\Omega}\phi\,\mathbf{n}\,ds
\]
Recall that for a vector field 𝒇,
\[
\int_{\Omega}(\operatorname{div}\mathbf{f})\,dv = \int_{\partial\Omega}\mathbf{f}\cdot\mathbf{n}\,ds
\]
If we write 𝐟 = 𝜙𝒂, where 𝒂 is an arbitrary constant vector, we have
\[
\int_{\Omega}\operatorname{div}[\phi\mathbf{a}]\,dv = \int_{\partial\Omega}\phi\,\mathbf{a}\cdot\mathbf{n}\,ds
= \mathbf{a}\cdot\int_{\partial\Omega}\phi\,\mathbf{n}\,ds
\]
For the LHS, note that \(\operatorname{div}[\phi\mathbf{a}] = \operatorname{tr}(\operatorname{grad}[\phi\mathbf{a}])\) and
\[
\operatorname{grad}[\phi\mathbf{a}] = \bigl(\phi a^{i}\bigr)_{,j}\,\mathbf{g}_i\otimes\mathbf{g}^{j} = a^{i}\phi_{,j}\,\mathbf{g}_i\otimes\mathbf{g}^{j}
\]
the trace of which is
\[
a^{i}\phi_{,j}\,\mathbf{g}_i\cdot\mathbf{g}^{j} = a^{i}\phi_{,j}\,\delta^{j}_{i} = a^{i}\phi_{,i} = \mathbf{a}\cdot\operatorname{grad}\phi
\]
For the arbitrary constant vector 𝒂, we therefore have
\[
\int_{\Omega}\operatorname{div}[\phi\mathbf{a}]\,dv = \mathbf{a}\cdot\int_{\Omega}\operatorname{grad}\phi\,dv = \mathbf{a}\cdot\int_{\partial\Omega}\phi\,\mathbf{n}\,ds
\quad\Longrightarrow\quad
\int_{\Omega}\operatorname{grad}\phi\,dv = \int_{\partial\Omega}\phi\,\mathbf{n}\,ds
\]
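
As a concrete check, the gradient form of the divergence theorem can be verified on the unit cube with an arbitrary smooth scalar field (sympy assumed):

import sympy as sp

x, y, z = sp.symbols('x y z')
phi = x**2*y + sp.sin(z)          # arbitrary smooth scalar field

# Volume integral of grad(phi) over the unit cube [0,1]^3
vol = sp.Matrix([sp.integrate(sp.diff(phi, s), (x, 0, 1), (y, 0, 1), (z, 0, 1))
                 for s in (x, y, z)])

# Surface integral of phi*n over the six faces (outward normals +/- e_i)
def face(coord, value, sign):
    others = [s for s in (x, y, z) if s != coord]
    f = sp.integrate(phi.subs(coord, value), (others[0], 0, 1), (others[1], 0, 1))
    n = sp.Matrix([sign if s == coord else 0 for s in (x, y, z)])
    return f * n

surf = sum((face(s, 1, 1) + face(s, 0, -1) for s in (x, y, z)), sp.zeros(3, 1))

print((vol - surf).applyfunc(sp.simplify))   # zero vector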
