
(1)

Statistics on Riemannian Manifolds

Tom Fletcher

Scientific Computing and Imaging Institute, University of Utah

August 19, 2009

(2)

Manifold Data

“Learned” Manifolds

• Raw data lies in Euclidean space
• Manifold + Noise

“Known” Manifolds

• Raw data lies in a manifold
• Typically given by some constraints on the data

(5)

Manifold Data in Vision and Imaging

• Directional data
• Transformation groups (rotations, projective, affine)
• Shapes
• Diffusion tensors, structure tensors
• Diffeomorphisms (for deformable atlas building)

(10)

Manifold Statistics: Averages

(12)

Manifold Statistics: Variability

Shape priors in segmentation

(13)

Manifold Statistics: Regression

Application: Healthy Brain Aging

[Brain images at ages 35–53]

(14)

What is Shape?

Shape is the geometry of an object modulo position, orientation, and size.

(16)

Shape Representations

• Boundary models (points, curves, surfaces, level sets)
• Interior models (medial, solid mesh)
• Transformation models (splines, diffeomorphisms)

(17)

Shape Analysis

Shape Space

A shape is a point in a high-dimensional, nonlinear shape space.

(21)

Shape Analysis

Shape Space

A metric space structure provides a comparison between two shapes.

(22)

Kendall’s Shape Space

• Define an object with k points.
• Represent it as a vector in R^{2k}.
• Remove translation, rotation, and scale.
• End up with complex projective space, CP^{k−2}.

(23)

Quotient Spaces

What do we get when we “remove” scaling from R^2?

Notation: [x] ∈ R^2 / R^+

(28)

Constructing Kendall’s Shape Space

• Consider planar landmarks to be points in the complex plane.
• An object is then a point (z_1, z_2, ..., z_k) ∈ C^k.
• Removing translation leaves us with C^{k−1}.
• How to remove scaling and rotation?

(32)

Scaling and Rotation in the Complex Plane

Recall that a complex number can be written as z = re^{iφ}, with modulus r and argument φ.

Complex multiplication:

    se^{iθ} · re^{iφ} = (sr) e^{i(θ+φ)}

Multiplication by a complex number se^{iθ} is therefore equivalent to scaling by s and rotation by θ.

(34)

Removing Scale and Rotation

Multiplying a centered point set z = (z_1, z_2, ..., z_{k−1}) by a constant w ∈ C just rotates and scales it.

Thus the shape of z is an equivalence class:

    [z] = { (wz_1, wz_2, ..., wz_{k−1}) : w ∈ C }

This gives complex projective space CP^{k−2} – much like the sphere arises from equivalence classes under scalar multiplication in R^n.
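As an aside, the geodesic distance between two such equivalence classes in CP^{k−2} has a simple closed form. The sketch below computes it for two complex landmark vectors; the helper name kendall_distance is hypothetical and numpy is assumed.

    import numpy as np

    def kendall_distance(z, w):
        """Geodesic shape distance between landmark vectors z, w in C^k,
        viewed as points [z], [w] in CP^(k-2) (a sketch)."""
        z = z - z.mean()                  # remove translation (center the landmarks)
        w = w - w.mean()
        # |<z, w>| / (||z|| ||w||) is invariant to scaling and rotating either
        # argument, so it depends only on the shapes [z] and [w]
        c = np.abs(np.vdot(z, w)) / (np.linalg.norm(z) * np.linalg.norm(w))
        return np.arccos(np.clip(c, 0.0, 1.0))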

(37)

The M-rep Shape Space

Medial atom:

    m = {x, r, n_0, n_1} ∈ M(1),    M(1) = R^3 × R^+ × S^2 × S^2

M-rep model with n atoms:

    M ∈ M(n) = M(1)^n

Shape change in terms of local translation, bending, and widening.

(38)

The Exponential and Log Maps

• The exponential map Exp_p takes tangent vectors to points along geodesics.
• The length of the tangent vector equals the length of the geodesic segment.
• Its inverse is the log map; it gives the distance between points: d(p, q) = ‖Log_p(q)‖.
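To make these maps concrete, here is a minimal sketch for the unit sphere (the helper names sphere_exp and sphere_log are hypothetical; numpy assumed). On other manifolds, Exp and Log take different closed forms or must be computed numerically.

    import numpy as np

    def sphere_exp(p, X):
        """Exp map on the unit sphere: follow the geodesic from p in direction X."""
        theta = np.linalg.norm(X)
        if theta < 1e-12:
            return p
        return np.cos(theta) * p + np.sin(theta) * X / theta

    def sphere_log(p, q):
        """Log map on the unit sphere: tangent at p toward q, with ||Log_p(q)|| = d(p, q)."""
        cos_t = np.clip(np.dot(p, q), -1.0, 1.0)
        theta = np.arccos(cos_t)           # geodesic distance d(p, q)
        v = q - cos_t * p                  # component of q orthogonal to p
        norm_v = np.linalg.norm(v)
        if norm_v < 1e-12:
            return np.zeros_like(p)
        return theta * v / norm_v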

(39)

Intrinsic Means (Fréchet)

The intrinsic mean of a collection of points x_1, ..., x_N on a Riemannian manifold M is

    µ = arg min_{x ∈ M} Σ_{i=1}^N d(x, x_i)²,

where d(·, ·) denotes Riemannian distance on M.

(40)

Computing Means

Gradient descent algorithm:

    Input: x_1, ..., x_N ∈ M
    µ_0 = x_1
    Repeat:
        ∆µ = (1/N) Σ_{i=1}^N Log_{µ_k}(x_i)
        µ_{k+1} = Exp_{µ_k}(∆µ)
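A direct transcription of this iteration, assuming the sphere_exp/sphere_log helpers sketched earlier (any Exp/Log pair for the manifold at hand would do):

    import numpy as np

    def intrinsic_mean(points, n_iter=50):
        """Fréchet mean by Riemannian gradient descent (a sketch)."""
        mu = points[0]                                   # µ_0 = x_1
        for _ in range(n_iter):
            # ∆µ = (1/N) Σ Log_µ(x_i): descent direction for the Fréchet functional
            delta = np.mean([sphere_log(mu, x) for x in points], axis=0)
            mu = sphere_exp(mu, delta)                   # µ_{k+1} = Exp_µ(∆µ)
        return mu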

(50)

Principal Geodesic Analysis

[Figure: linear statistics (PCA) vs. curved statistics (PGA)]

(57)

Computing PGA

• Find nested linear subspaces V_k ⊂ T_µM such that Exp_µ(V_k) maximizes the variance of the projected data.
• First-order approximation: PCA in the tangent space, using the sample covariance matrix

    S = 1/(N − 1) Σ_{i=1}^N Log_µ(x_i) Log_µ(x_i)^T
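A sketch of this first-order approximation, reusing the intrinsic_mean and sphere_log helpers from the earlier sketches:

    import numpy as np

    def tangent_pca(points):
        """First-order PGA: PCA of the Log-mapped data in the tangent space at the mean."""
        mu = intrinsic_mean(points)
        U = np.array([sphere_log(mu, x) for x in points])   # N x n matrix of tangent vectors
        S = U.T @ U / (len(points) - 1)                      # sample covariance in T_µM
        evals, evecs = np.linalg.eigh(S)
        order = np.argsort(evals)[::-1]                      # sort by decreasing variance
        return mu, evals[order], evecs[:, order]             # principal geodesic directions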

(58)

PGA of Kidney

[Figure: PGA modes 1–3 of a kidney model]

(59)

Robust Statistics: Motivation

• The mean is overly influenced by outliers due to the sum-of-squares.
• Goal: a robust statistical description of shape or other manifold data.
• Deal with outliers due to imaging noise or data corruption.
• Handle misdiagnosis, segmentation error, or outliers in a population study.

(60)

Mean vs. Median in R^n

Mean: a least-squares problem,

    µ = arg min_{x ∈ R^n} Σ_i ‖x − x_i‖²

Closed-form solution (the arithmetic average).

Geometric median, or Fermat–Weber point:

    m = arg min_{x ∈ R^n} Σ_i ‖x − x_i‖

No closed-form solution.

(62)

Weiszfeld Algorithm in R^n

• Gradient descent on the sum of distances:

    m_{k+1} = m_k − α G_k,

    G_k = ( Σ_{i ∈ I_k} (m_k − x_i) / ‖x_i − m_k‖ ) ( Σ_{i ∈ I_k} ‖x_i − m_k‖^{−1} )^{−1}

• Step size: 0 < α ≤ 2
• Exclude singular points: I_k = { i : m_k ≠ x_i }
• Weiszfeld (1937), Ostresh (1978)
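A minimal sketch of the iteration (points assumed to be rows of a numpy array; with α = 1 it reduces to the classical Weiszfeld fixed-point update):

    import numpy as np

    def weiszfeld(points, alpha=1.0, n_iter=100, eps=1e-9):
        """Geometric median in R^n by the Weiszfeld iteration (a sketch)."""
        m = points.mean(axis=0)               # any starting point works
        for _ in range(n_iter):
            d = np.linalg.norm(points - m, axis=1)
            keep = d > eps                    # I_k = {i : m_k != x_i}
            w = 1.0 / d[keep]
            # G_k = (Σ w_i (m_k − x_i)) / (Σ w_i)
            G = ((m - points[keep]) * w[:, None]).sum(axis=0) / w.sum()
            m = m - alpha * G
        return m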

(63)

Geometric Median on a Manifold

The geometric median of data x_i ∈ M is the point that minimizes the sum of geodesic distances:

    m = arg min_{x ∈ M} Σ_{i=1}^N d(x, x_i)

Fletcher, et al., CVPR 2008 and NeuroImage 2009.

(64)

Weiszfeld Algorithm for Manifolds

Gradient descent:

    m_{k+1} = Exp_{m_k}(α v_k),

    v_k = ( Σ_{i ∈ I_k} Log_{m_k}(x_i) / d(m_k, x_i) ) ( Σ_{i ∈ I_k} d(m_k, x_i)^{−1} )^{−1}
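The manifold version is the Euclidean sketch with subtraction replaced by Log and the additive update replaced by Exp. Below is a sketch, again assuming the sphere helpers from earlier:

    import numpy as np

    def manifold_median(points, alpha=1.0, n_iter=100, eps=1e-9):
        """Riemannian geometric median by the manifold Weiszfeld iteration (a sketch)."""
        m = points[0]
        for _ in range(n_iter):
            logs = np.array([sphere_log(m, x) for x in points])
            d = np.linalg.norm(logs, axis=1)        # geodesic distances d(m_k, x_i)
            keep = d > eps                          # exclude points with x_i == m_k
            w = 1.0 / d[keep]
            v = (logs[keep] * w[:, None]).sum(axis=0) / w.sum()   # v_k in T_{m_k}M
            m = sphere_exp(m, alpha * v)            # m_{k+1} = Exp_{m_k}(α v_k)
        return m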

(65)

Example: Rotations

Input data: 20 random rotations

Outlier set: random, rotated 90°

(66)

Example: Rotations

[Figure: mean vs. median of the rotations with 0, 5, 10, and 15 outliers]

(67)

Application: Diffusion Tensor MRI


(68)

Space of Positive-Definite Tensors

• Positive-definite, symmetric matrices: PD(n) = GL^+(n)/SO(n)
• Riemannian manifold with nonpositive curvature
• Applications:
  • Diffusion tensor MRI: Fletcher (2004), Pennec (2004)
  • Structure tensors: Rathi (2007)
  • Bookstein’s simplex shape space (1986)
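For concreteness, here is a sketch of the affine-invariant geodesic distance on PD(n), d(A, B) = ‖log(A^{−1/2} B A^{−1/2})‖_F, using scipy's matrix square root and logarithm (the helper name spd_distance is hypothetical):

    import numpy as np
    from scipy.linalg import sqrtm, logm

    def spd_distance(A, B):
        """Affine-invariant geodesic distance between SPD matrices (a sketch)."""
        A_inv_sqrt = np.real(np.linalg.inv(sqrtm(A)))   # A^(-1/2); sqrtm may carry tiny imaginary noise
        M = A_inv_sqrt @ B @ A_inv_sqrt                 # whiten B by A; M is again SPD
        return np.linalg.norm(np.real(logm(M)), 'fro')  # Frobenius norm of the matrix log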

(69)

Example: PD(2)

A ∈ PD(2) is of the form

    A = ( a  b )
        ( b  c ),    with ac − b² > 0, a > 0.

The situation is similar for PD(3) (6-dimensional).

(70)

Example: Tensors

Input data: 20 random tensors

Outlier set: random, rotated 90°

(71)

Example: Tensors

[Figure: mean vs. median of the tensors with 0, 5, 10, and 15 outliers]

(72)

Kendall’s Shape Space (recap)

• Define an object with k points.
• Represent it as a vector in R^{2k}.
• Remove translation, rotation, and scale.
• End up with complex projective space, CP^{k−2}.

(73)

Example on Kendall Shape Spaces

[Figure: input hand shapes and outlier shapes]

(74)

Example on Kendall Shape Spaces

[Figure: mean (top) and median (bottom) hand shapes with 0, 2, 6, and 12 outliers]

(76)

Image Metamorphosis

• A metric between images
• Includes both deformation and intensity change:

    U(v_t, I_t) = ∫_0^1 ‖v_t‖²_V dt + (1/σ²) ∫_0^1 ‖ dI_t/dt + ⟨∇I_t, v_t⟩ ‖²_{L²} dt

Fig. 1. Metamorphosis geodesic between two 3D brain images. Mid-axial (top row) and mid-coronal (bottom row) slices are shown.

The theory of robust estimation has led to the development of numerous robust estimators, of which the L1-estimator, also known as the geometric median, is one of the best known. Given a set of points {x_i, i = 1, ..., n} ⊂ R^d, with the usual Euclidean norm ‖x‖, the L1-estimator is defined as the point m ∈ R^d minimizing Σ_{i=1}^n ‖m − x_i‖. It can be shown (Lopuhaä and Rousseeuw, 1991) that this estimator has a breakdown point of 0.5, which means that half of the data needs to be corrupted in order to corrupt this estimator. In Figure 2 we illustrate this by showing how the geometric median and the mean are displaced in the presence of a few outliers.

Fig. 2. The geometric median and mean for a collection of points in the plane. Notice how the few outliers at the top right of the picture have forced the mean away from the points, whereas the median remains centrally located.

The existence and uniqueness of the median in R^d follows directly from the convexity of the distance function. In one dimension, the geometric median is the point that divides the point set into equal halves on either side (if n is odd) and is any point on the line segment connecting the two middle points (if n is even). In general, however, computing the geometric median is difficult; Bajaj has shown that the solution cannot be expressed using radicals (arithmetic operations and kth roots) (Bajaj, 1988).

There are two main approaches to computing the geometric median of a collection of points in R^d. One way is to compute an approximate median m̃ such that Σ_{i=1}^n ‖m̃ − x_i‖ is at most a (1 + ε)-factor larger than the cost of the optimal median. This can be computed using the ellipsoid method (Chandrasekaran and Tamir, 1990). A more efficient algorithm achieving the same result is due to Bose et al. (2003).

These algorithms do not generalize beyond Euclidean spaces. A more general iterative algorithm due to Weiszfeld (1937), later improved by Kuhn and Kuenne (1962) and Ostresh (1978), converges to the optimal solution in Euclidean spaces (Kuhn, 1973) and was subsequently generalized to Banach spaces by Eckhardt (1980).

Several other robust estimators of centrality have been proposed in the statistics literature (Maronna et al., 2006). Winsorized means, where a percentage of extreme values are clamped, and trimmed means, where extreme values are removed, have been used for univariate data. The drawback of these methods is that they require a somewhat arbitrary selection of a threshold. M-estimators (Huber, 1981) are a generalization of maximum-likelihood methods in which some function of the data is minimized. The geometric median is a special case of an M-estimator with an L1 cost function.

3. The Riemannian Geometric Median

Let M be a Riemannian manifold. Given points x_1, ..., x_n ∈ M and corresponding positive real weights w_1, ..., w_n with Σ_i w_i = 1, define the weighted sum-of-distances function f(x) = Σ_i w_i d(x, x_i), where d is the Riemannian distance function on M. Throughout, we will assume that the x_i lie in a convex set U ⊂ M, i.e., any two points in U are connected by a unique shortest geodesic lying entirely in U. We define the weighted geometric median, ...

(78)

Example: Metamorphosis

[Figure: input data, with the mean atlas (ratio = 1.13) and the median atlas (ratio = 1.04)]

(79)

Example: Metamorphosis

Input Data

Mean Median

ratio = 1.13 ratio = 1.04

(80)

Example: Metamorphosis

Fig. 12. Midaxial slices from the four input 3D MR images (left). The resulting geometric median atlas (right).

References

Bajaj, C., 1988. The algebraic degree of geometric optimization problems. Discrete and Computational Geometry 3, 177–191.

Barmpoutis, A., Vemuri, B. C., Shepherd, T. M., Forder, J. R., 2007. Tensor splines for interpolation and approximation of DT-MRI with application to segmentation of isolated rat hippocampi. IEEE Transactions on Medical Imaging 26 (11), 1537–1546.

Basser, P. J., Mattiello, J., Bihan, D. L., 1994. MR diffusion tensor spectroscopy and imaging. Biophysics Journal 66, 259–267.

Batchelor, P., Moakher, M., Atkinson, D., Calamante, F., Connelly, A., 2005. A rigorous framework for diffusion tensor calculus. Magnetic Resonance in Medicine 53, 221–225.

Bigun, J., Granlund, G., Wiklund, J., 1991. Multidimensional orientation estimation with application to texture analysis and optical flow. IEEE Transactions on Pattern Analysis and Machine Intelligence 13 (8), 775–790.

Bookstein, F. L., 1986. Size and shape spaces for landmark data in two dimensions (with discussion). Statistical Science 1 (2), 181–242.

Bose, P., Maheshwari, A., Morin, P., 2003. Fast approximations for sums of distances, clustering and the Fermat–Weber problem. Comput. Geom. Theory Appl. 24 (3), 135–146.

Buss, S. R., Fillmore, J. P., 2001. Spherical averages and applications to spherical splines and interpolation. ACM Transactions on Graphics 20 (2), 95–126.

Chandrasekaran, R., Tamir, A., 1990. Algebraic optimization: the Fermat–Weber problem. Mathematical Programming 46, 219–224.

Cheeger, J., Ebin, D. G., 1975. Comparison Theorems in Riemannian Geometry. North-Holland.

Cootes, T. F., Taylor, C. J., Cooper, D. H., Graham, J., 1995. Active shape models – their training and application. Comp. Vision and Image Understanding 61 (1), 38–59.

Corouge, I., Fletcher, P. T., Joshi, S., Gouttard, S., Gerig, G., 2006. Fiber tract-oriented statistics for quantitative diffusion tensor MRI analysis. Medical Image Analysis 10 (5), 786–798.

Eckhardt, U., 1980. Weber’s problem and Weiszfeld’s algorithm in general spaces. Mathematical Programming 18, 186–196.

Fletcher, P. T., Joshi, S., 2004. Principal geodesic analysis on symmetric spaces: statistics of diffusion tensors. In: Proceedings of the ECCV Workshop on Computer Vision Approaches to Medical Image Analysis. pp. 87–98.

Fletcher, P. T., Lu, C., Joshi, S., 2003. Statistics of shape via principal geodesic analysis on Lie groups. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. pp. 95–101.

Fletcher, P. T., Tao, R., Jeong, W.-K., Whitaker, R. T., 2007. A volumetric approach to quantifying region-to-region white matter connectivity in diffusion tensor MRI. In: Proceedings of Information Processing in Medical Imaging (IPMI). pp. 346–358.

Fréchet, M., 1948. Les éléments aléatoires de nature quelconque dans un espace distancié. Ann. Inst. H. Poincaré 10 (3), 215–310.

Garcin, L., Younes, L., 2005. Geodesic image matching: a wavelet based energy minimization scheme. In: Workshop on Energy Minimization Methods in Computer Vision and Pattern Recognition.

(81)

Preliminaries

• x_i ∈ U ⊂ M, where U is a convex subset
• diam(U) = max_{x,y ∈ U} d(x, y)

(82)

Existence and Uniqueness

Theorem. The weighted geometric median exists and is unique if

1. the sectional curvatures of M are bounded above by ∆ > 0 and diam(U) < π/(2√∆), or
2. the sectional curvatures of M are nonpositive.

The proof is by showing the convexity of the geodesic distance. These are the same conditions that ensure existence and uniqueness of the mean (Karcher).

(84)

Robustness

• Breakdown point: the percentage of points that can be moved to infinity before the statistic goes to infinity
• Euclidean mean: 0%
• Euclidean geometric median: 50%
• The same result holds for noncompact manifolds
• Does not make sense for compact manifolds

(85)

Convergence Theorem for the Manifold Weiszfeld Algorithm

Theorem. If the sectional curvatures of M are nonnegative and the existence/uniqueness conditions are satisfied, then

    lim_{k→∞} m_k = m    for 0 < α ≤ 2.

(86)

Describing Shape Change

• How does shape change over time?
• Changes due to growth, aging, disease, etc.
• Example: 100 healthy subjects, 20–80 yrs. old
• We need regression of shape!

(87)

Regression Analysis

• Describe the relationship of a dependent random variable Y to an independent random variable T.
• Given observations (T_i, Y_i), find the regression function Y = f(T).
• Often phrased as a conditional expectation: E[Y | T = t] = f(t).
• Parametric (e.g., linear) or nonparametric (e.g., kernel).

(88)

Kernel Regression (Nadaraya–Watson)

Define the regression function through weighted averaging:

    f(t) = Σ_{i=1}^N w_i(t) Y_i,    w_i(t) = K_h(t − T_i) / Σ_{j=1}^N K_h(t − T_j)
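A minimal sketch of the estimator with a Gaussian kernel standing in for K_h (the helper name kernel_regression is hypothetical; T and Y are numpy arrays of the observations):

    import numpy as np

    def kernel_regression(t, T, Y, h=1.0):
        """Nadaraya–Watson estimate f(t) = Σ w_i(t) Y_i (a sketch)."""
        K = np.exp(-0.5 * ((t - T) / h) ** 2)   # Gaussian kernel K_h(t − T_i)
        w = K / K.sum()                          # normalized weights w_i(t)
        return np.dot(w, Y)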

(89)

Example: Gray Matter Volume

[Figure: kernel K_h(t − s) centered at t, weighting the observations (T_i, Y_i); gray matter volume regressed on age with the weights w_i(t) above]

(90)

Manifold Kernel Regression

Using the Fréchet weighted average:

    m̂_h(t) = arg min_{y ∈ M} Σ_{i=1}^N w_i(t) d(y, Y_i)²

Davis, et al., ICCV 2007
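This is the same weighted Fréchet mean as before, now with kernel weights. A sketch, reusing the Gaussian kernel above and the sphere Exp/Log helpers from the earlier sketches:

    import numpy as np

    def manifold_kernel_regression(t, T, Y, h=1.0, n_iter=50):
        """Fréchet weighted average of manifold-valued Y_i at parameter t (a sketch)."""
        K = np.exp(-0.5 * ((t - T) / h) ** 2)
        w = K / K.sum()                               # kernel weights w_i(t)
        y = Y[0]
        for _ in range(n_iter):
            # weighted Riemannian gradient step: Σ w_i Log_y(Y_i)
            delta = sum(wi * sphere_log(y, yi) for wi, yi in zip(w, Y))
            y = sphere_exp(y, delta)
        return y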

(91)

Brain Shape Regression

Application: Healthy Brain Aging

[Brain images at ages 35–53]

(92)

Acknowledgements

Collaborators:

University of Utah
• Sarang Joshi
• Ross Whitaker
• Josh Cates
• Suresh Venkatasubramanian

• Steve Pizer (UNC)
• Brad Davis (Kitware)

Funding:

• NA-MIC, NIH U54 EB005149
• NIH R01 EB007688-01A1

(93)

Books

Dryden and Mardia, Statistical Shape Analysis, Wiley, 1998.

Small, The Statistical Theory of Shape, Springer-Verlag, 1996.

Kendall, Barden and Carne, Shape and Shape Theory, Wiley, 1999.

Krim and Yezzi, Statistics and Analysis of Shapes, Birkhäuser, 2006.

Papers

Kendall, Shape manifolds, Procrustean metrics, and complex projective spaces. Bull. London Math. Soc., 16:81–121, 1984.

Fletcher, Joshi, Lu, Pizer, Principal geodesic analysis for the study of nonlinear statistics of shape, IEEE TMI, 23(8):995–1005, 2004.

Pennec, Intrinsic statistics on Riemannian manifolds: basic tools for geometric measurements. JMIV, 25(1):127–154, 2006.

Davis, Fletcher, Bullitt, Joshi, Population shape regression from random design data, ICCV 2007.

Fletcher, Venkatasubramanian, Joshi, The geometric median on Riemannian manifolds with application to robust atlas estimation. NeuroImage, 45:S143–S152, 2009.
