Example 4 - Power of a matrix


How can one compute A^n?
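One common route, assuming A is diagonalizable, is the eigendecomposition A = S Λ S^(-1), which gives A^n = S Λ^n S^(-1), so only scalar powers of the eigenvalues are needed. The following is a minimal numpy sketch of that idea; the 2x2 matrix A and the exponent n are illustrative assumptions, not values from these notes.

import numpy as np

# Illustrative, assumed-diagonalizable matrix (not from the notes).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = 5

# Eigendecomposition A = S diag(lam) S^{-1}.
lam, S = np.linalg.eig(A)

# A^n = S diag(lam**n) S^{-1}: only scalar powers of the eigenvalues are needed.
A_pow = S @ np.diag(lam**n) @ np.linalg.inv(S)

print(A_pow)
print(np.linalg.matrix_power(A, n))   # cross-check by repeated multiplication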


Example 5 - Series of a matrix


The exponential function of a matrix, A, can be defined in a series format as

e^A = I + A + A^2/2! + A^3/3! + A^4/4! + …
(1)
where I is the identity matrix. Note that the exponential law holds for matrix exponentials, i.e. e^A e^B = e^(A+B), provided that A and B commute.
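The series (1) also suggests a direct numerical check: truncate it after a finite number of terms and compare with a library routine. A minimal numpy sketch follows; the test matrix A and the number of retained terms are illustrative assumptions.

import numpy as np

def expm_series(A, terms=30):
    """Truncated series I + A + A^2/2! + A^3/3! + ... from equation (1)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k          # builds A^k / k! incrementally
        result = result + term
    return result

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])          # illustrative matrix (not from the notes)
print(expm_series(A))
# scipy.linalg.expm(A) gives the same result; for this A it is the
# rotation matrix [[cos 1, sin 1], [-sin 1, cos 1]].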

Example 6 - Google and eigenvalues




Example

google1.jpg

The link matrix, l_ij, whose entry is 1 if site j links to site i and 0 otherwise, reads

l_ij = [ 0  0  1  0 ]
       [ 1  0  0  0 ]
       [ 1  1  0  1 ]
       [ 1  1  1  0 ] .
(2)
After normalization by n_j, the number of outgoing links of site j (the j-th column sum),

l_ij / n_j = [ 0    0    1/2  0 ]
             [ 1/3  0    0    0 ]
             [ 1/3  1/2  0    1 ]
             [ 1/3  1/2  1/2  0 ] .
(3)
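In code, this normalization is nothing more than dividing each column of the link matrix by its column sum. A minimal numpy sketch, using the link matrix from (2):

import numpy as np

# Link matrix from equation (2): L[i, j] = 1 if site j links to site i.
L = np.array([[0, 0, 1, 0],
              [1, 0, 0, 0],
              [1, 1, 0, 1],
              [1, 1, 1, 0]], dtype=float)

n = L.sum(axis=0)   # n_j: number of outgoing links of site j (column sums)
P = L / n           # divide each column by its column sum
print(P)            # reproduces the matrix in equations (3) and (4)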
The matrix, P, is

P = [ 0    0    1/2  0 ]
    [ 1/3  0    0    0 ]
    [ 1/3  1/2  0    1 ]
    [ 1/3  1/2  1/2  0 ] ,
(4)
and the ranking of the sites is given by the eigenvector, w, of P corresponding to the eigenvalue 1,

P w = 1 w.
(5)
For this matrix, the eigenvector for the eigenvalue of 1 is (try verifying it yourself)

w = (0.368577, 0.122859, 0.737154, 0.552866).
(6)
It can be concluded that Site 3 is the most important site.
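Equation (6) can be checked numerically by asking a linear-algebra routine for the eigenvector of P belonging to the eigenvalue 1 and scaling it to unit length; the sketch below uses numpy (the sign flip is needed only because the sign of a numerically computed eigenvector is arbitrary).

import numpy as np

# Normalized link matrix P from equation (4).
P = np.array([[0,   0,   1/2, 0],
              [1/3, 0,   0,   0],
              [1/3, 1/2, 0,   1],
              [1/3, 1/2, 1/2, 0]])

lam, vecs = np.linalg.eig(P)
k = np.argmin(np.abs(lam - 1.0))   # index of the eigenvalue closest to 1
w = np.real(vecs[:, k])
w = w / np.linalg.norm(w)          # scale to unit length
if w[0] < 0:
    w = -w                         # fix the arbitrary overall sign

print(np.round(w, 6))     # ~ (0.368577, 0.122859, 0.737154, 0.552866), cf. (6)
print(np.argmax(w) + 1)   # 3: Site 3 has the largest component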

Example 7 - Principal Component Analysis (PCA)

Another application of eigenvalues/eigenvectors is found in Principal Component Analysis (PCA), one of the tools used in data mining.
Consider a set of vectors, (x_1, y_1), (x_2, y_2), (x_3, y_3), …, (x_N, y_N), each of which consists of a pair of numbers, as in Table 1.
Table 1: Example data
#      x-value   y-value
u_1    x_1       y_1
u_2    x_2       y_2
⋮      ⋮         ⋮
u_α    x_α       y_α
⋮      ⋮         ⋮
u_N    x_N       y_N
pca_vectors.jpg
Figure 1: Data points as vectors.

Example

Consider the following data set for 30 people where the first column is the height and the second column is the weight.
Table 2: Raw and centered data for height and weight
#    Height (u)   Weight (v)   Centered height (x)   Centered weight (y)
1    171.9        58.5          13.59                 -0.693333
2    175.8        66.6          17.49                  7.40667
3    159.3        47.0           0.99                -12.1933
4    146.9        69.4         -11.41                 10.2067
5    143.4        58.4         -14.91                 -0.793333
6    151.4        54.5          -6.91                 -4.69333
7    159.9        55.3           1.59                 -3.89333
8    170.7        71.9          12.39                 12.7067
9    140.6        62.4         -17.71                  3.20667
10   154.5        52.6          -3.81                 -6.59333
11   154.0        60.4          -4.31                  1.20667
12   154.1        44.5          -4.21                -14.6933
13   150.6        58.8          -7.71                 -0.393333
14   150.5        74.6          -7.81                 15.4067
15   157.3        40.1          -1.01                -19.0933
16   142.4        40.0         -15.91                -19.1933
17   161.9        62.6           3.59                  3.40667
18   175.5        78.2          17.19                 19.0067
19   171.0        55.7          12.69                 -3.49333
20   172.6        71.1          14.29                 11.9067
21   144.0        61.9         -14.31                  2.70667
22   161.8        62.1           3.49                  2.90667
23   151.7        50.1          -6.61                 -9.09333
24   167.1        69.3           8.79                 10.1067
25   162.9        57.4           4.59                 -1.79333
26   156.1        57.5          -2.21                 -1.69333
27   141.2        49.8         -17.11                 -9.39333
28   165.9        51.9           7.59                 -7.29333
29   165.0        65.2           6.69                  6.00667
30   169.3        68.0          10.99                  8.80667
The mean values are

ū = (1/30) Σ_α u_α = 158.3,     v̄ = (1/30) Σ_α v_α = 59.2.

Shift the data so that the mean values are 0:

x_α = u_α - ū,     y_α = v_α - v̄.
pca_raw.jpg        pca_centered.jpg
Figure 2: Distribution of raw and centered data points.
The covariance matrix is obtained as

V = [ 111.343   41.4137 ]
    [  41.4137  91.254  ]
(7)
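Equation (7) can be reproduced from the raw data of Table 2 with a few lines of numpy; note that, to match the values in (7), the covariance is taken with the 1/N convention (bias=True) rather than 1/(N-1).

import numpy as np

# Raw data from Table 2: heights u and weights v of 30 people.
u = np.array([171.9, 175.8, 159.3, 146.9, 143.4, 151.4, 159.9, 170.7, 140.6, 154.5,
              154.0, 154.1, 150.6, 150.5, 157.3, 142.4, 161.9, 175.5, 171.0, 172.6,
              144.0, 161.8, 151.7, 167.1, 162.9, 156.1, 141.2, 165.9, 165.0, 169.3])
v = np.array([58.5, 66.6, 47.0, 69.4, 58.4, 54.5, 55.3, 71.9, 62.4, 52.6,
              60.4, 44.5, 58.8, 74.6, 40.1, 40.0, 62.6, 78.2, 55.7, 71.1,
              61.9, 62.1, 50.1, 69.3, 57.4, 57.5, 49.8, 51.9, 65.2, 68.0])

# Center the data: x = u - u_bar, y = v - v_bar.
x = u - u.mean()
y = v - v.mean()

# Covariance matrix with the 1/N convention, matching equation (7).
V = np.cov(np.vstack([x, y]), bias=True)
print(V)   # ~ [[111.343, 41.4137], [41.4137, 91.254]]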
The eigenvalues and the corresponding eigenvectors are

λ1 = 143.913,     λ2 = 58.684,

e1 = [ 0.786036 ] ,     e2 = [  0.61818  ] .
     [ 0.61818  ]            [ -0.786036 ]

Therefore,

P^T = [ 0.786036   0.61818  ]
      [ 0.61818   -0.786036 ]
and the transformation from (x, y) to (x̄, ȳ) is

x̄_α = 0.786036 x_α + 0.61818 y_α
(8)
ȳ_α = 0.61818 x_α - 0.786036 y_α
(9)
The new variable, x̄_α, represents the overall growth, and ȳ_α represents the difference between the height and the weight. If ȳ_α is positive and large, it may represent slenderness (#15), and if ȳ_α is negative and large, it may represent obesity (#4).
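The entries of Table 3 below can be reproduced by diagonalizing V and projecting each centered point onto the eigenvectors. A minimal numpy sketch follows (restating V from (7) so the block is self-contained); since the sign of a numerically computed eigenvector is arbitrary, the signs are fixed here to the convention used in (8)-(9) and Table 3.

import numpy as np

# Covariance matrix from equation (7).
V = np.array([[111.343, 41.4137],
              [41.4137, 91.254]])

lam, E = np.linalg.eigh(V)       # eigh: symmetric matrices, eigenvalues ascending
lam, E = lam[::-1], E[:, ::-1]   # largest eigenvalue first
print(lam)                       # ~ [143.913, 58.684]

E = E * np.sign(E[0, :])         # fix each eigenvector's sign: first component > 0
PT = E.T                         # rows of P^T are e1 and e2

# Project a centered point (x_a, y_a): (x_bar, y_bar) = PT @ (x_a, y_a),
# as in equations (8) and (9). Person #15 from Table 2:
point15 = np.array([-1.01, -19.0933])
print(PT @ point15)              # ~ [-12.597, 14.384]: large positive y_bar (slender)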
Table 3: Principal Components
#    x̄           ȳ
1     10.2536      8.94606
2     18.3264      4.99007
3     -6.7595     10.1964
4     -2.65911   -15.0762
5    -12.2102     -8.59348
6     -8.33284    -0.582497
7     -1.15698     4.04321
8     17.594      -2.32865
9    -11.9384    -13.4685
10    -7.07067     2.82733
11    -2.64188    -3.61284
12   -12.3923      8.94695
13    -6.30349    -4.457
14     3.38516   -16.9382
15   -12.597      14.3837
16   -24.3708      5.25141
17     4.9278     -0.458496
18    25.2615     -4.31341
19     7.81529    10.5906
20    18.5929     -0.525273
21    -9.57497   -10.9737
22     4.54011    -0.127296
23   -10.817       3.06152
24    13.157      -2.5104
25     2.4993      4.24707
26    -2.78393    -0.0351573
27   -19.2559     -3.19357
28     1.45742    10.4248
29     8.97179    -0.585831
30    14.0826     -0.128556
pca_final.jpg
Figure 3: Distribution of (x̄, ȳ).


