Statistics on Riemannian Manifolds
Tom Fletcher
Scientific Computing and Imaging Institute, University of Utah
August 19, 2009
Manifold Data
“Learned” Manifolds
- Raw data lies in Euclidean space
- Manifold + noise
"Known" Manifolds
- Raw data lies in a manifold
- Typically given by some constraints on the data
Manifold Data in Vision and Imaging
- Directional data
- Transformation groups (rotations, projective, affine)
- Shapes
- Diffusion tensors, structure tensors
- Diffeomorphisms (for deformable atlas building)
Manifold Statistics: Averages
Manifold Statistics: Variability
Shape priors in segmentation
Manifold Statistics: Regression
Application: Healthy Brain Aging
(Brain images regressed against age; ages 35–53 shown.)
What is Shape?
Shape is the geometry of an object modulo position, orientation, and size.
Shape Representations
- Boundary models (points, curves, surfaces, level sets)
- Interior models (medial, solid mesh)
- Transformation models (splines, diffeomorphisms)
Shape Analysis
Shape Space
A shape is a point in a high-dimensional, nonlinear shape space.
Shape Analysis
Shape Space
A metric space structure provides a comparison between two shapes.
Kendall’s Shape Space
- Define an object with k points.
- Represent it as a vector in R^2k.
- Remove translation, rotation, and scale.
- End up with complex projective space, CP^(k−2).
Quotient Spaces
What do we get when we “remove” scaling from R^2?

Notation: [x] ∈ R^2 / R^+
Constructing Kendall’s Shape Space
- Consider planar landmarks to be points in the complex plane.
- An object is then a point (z_1, z_2, ..., z_k) ∈ C^k.
- Removing translation leaves us with C^(k−1).
- How to remove scaling and rotation?
Scaling and Rotation in the Complex Plane
Recall that a complex number can be written as z = r·e^(iφ), with modulus r and argument φ.

Complex multiplication: s·e^(iθ) · r·e^(iφ) = (sr)·e^(i(θ+φ))

Multiplication by the complex number s·e^(iθ) is therefore equivalent to scaling by s and rotation by θ.
Removing Scale and Translation
Multiplying a centered point set z = (z_1, z_2, ..., z_(k−1)) by a constant w ∈ C just rotates and scales it. Thus the shape of z is an equivalence class:

  [z] = { (w·z_1, w·z_2, ..., w·z_(k−1)) : w ∈ C, w ≠ 0 }

This gives complex projective space CP^(k−2), much like the sphere arises from equivalence classes of scalar multiplication in R^n.
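As a concrete illustration (not from the slides), the resulting shape distance between two planar landmark sets can be computed in a few lines: center, scale to unit norm, then quotient out rotation via the modulus of the complex inner product. A minimal numpy sketch:

```python
import numpy as np

def preshape(z):
    """Map complex landmarks to the preshape sphere: remove translation and scale."""
    z = z - z.mean()
    return z / np.linalg.norm(z)

def shape_dist(z, w):
    """Geodesic distance in Kendall shape space CP^(k-2); rotation is
    removed by taking the modulus of the complex inner product."""
    a = np.vdot(preshape(z), preshape(w))   # vdot conjugates its first argument
    return np.arccos(np.clip(abs(a), 0.0, 1.0))

tri = np.array([0 + 0j, 1 + 0j, 0 + 1j])
# Translating, scaling, and rotating the triangle leaves its shape unchanged:
moved = 2.0 * np.exp(0.7j) * tri + (3 + 4j)
print(shape_dist(tri, moved))   # ~0.0
```

A degenerate (collinear) triangle, by contrast, is a genuinely different shape and sits at positive distance from `tri`.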
The M-rep Shape Space
Medial Atom: m = {x, r, n_0, n_1} ∈ M(1), where M(1) = R^3 × R^+ × S^2 × S^2.

M-rep model with n atoms: M ∈ M(n) = M(1)^n

Shape change is expressed in terms of local translation, bending, and widening.
The Exponential and Log Maps
- The exponential map Exp_p takes tangent vectors at p to points along geodesics.
- The length of the tangent vector equals the length of the geodesic segment.
- Its inverse is the log map; it gives the distance between points: d(p, q) = ||Log_p(q)||.
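On the unit sphere S^2, where geodesics are great circles, both maps have closed form. A minimal numpy sketch (illustrative, not from the slides):

```python
import numpy as np

def sphere_exp(p, v):
    """Exp_p(v): follow the geodesic from p in direction v for length ||v||."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return p.copy()
    return np.cos(theta) * p + np.sin(theta) * v / theta

def sphere_log(p, q):
    """Log_p(q): tangent vector at p whose norm is the geodesic distance to q."""
    w = q - np.dot(p, q) * p                  # project q into the tangent space at p
    nw = np.linalg.norm(w)
    if nw < 1e-12:
        return np.zeros_like(p)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    return theta * w / nw

p = np.array([0.0, 0.0, 1.0])
q = np.array([1.0, 0.0, 0.0])
v = sphere_log(p, q)
print(np.linalg.norm(v))      # d(p, q) = pi/2 ~ 1.5708
print(sphere_exp(p, v))       # recovers q
```

Note that Exp_p(Log_p(q)) = q within the injectivity radius, which is the round-trip checked above.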
Intrinsic Means (Fréchet)
The intrinsic mean of a collection of points x_1, ..., x_N on a Riemannian manifold M is

  μ = arg min_{x ∈ M} Σ_{i=1}^N d(x, x_i)^2,

where d(·,·) denotes Riemannian distance on M.
Computing Means
Gradient Descent Algorithm:
Input: x_1, ..., x_N ∈ M
μ_0 = x_1
Repeat:
  Δμ = (1/N) Σ_{i=1}^N Log_{μ_k}(x_i)
  μ_{k+1} = Exp_{μ_k}(Δμ)
Principal Geodesic Analysis
Linear Statistics (PCA) vs. Curved Statistics (PGA)
Computing PGA
- Find nested linear subspaces V_k ⊂ T_μ M such that Exp_μ(V_k) maximizes the variance of the projected data.
- First-order approximation: PCA in the tangent space, using the sample covariance matrix of the Log-mapped data,

  S = 1/(N−1) Σ_{i=1}^N Log_μ(x_i) Log_μ(x_i)^T
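This first-order approximation is ordinary PCA applied to the tangent vectors Log_μ(x_i). A minimal sketch on the sphere (the `sphere_exp`/`sphere_log` helpers are the standard closed-form sphere maps, assumed for illustration):

```python
import numpy as np

def sphere_exp(p, v):
    t = np.linalg.norm(v)
    return p.copy() if t < 1e-12 else np.cos(t) * p + np.sin(t) * v / t

def sphere_log(p, q):
    w = q - np.dot(p, q) * p
    nw = np.linalg.norm(w)
    if nw < 1e-12:
        return np.zeros_like(p)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * w / nw

def tangent_pca(xs, mu):
    """First-order PGA: PCA of the Log-mapped data in the tangent space at mu."""
    U = np.stack([sphere_log(mu, x) for x in xs])   # N x 3 tangent vectors
    S = U.T @ U / (len(xs) - 1)                     # sample covariance at mu
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1]                 # sort descending
    return evals[order], evecs[:, order]

pole = np.array([0.0, 0.0, 1.0])
# Data spread mostly along the x-direction of the tangent space at the pole:
xs = [sphere_exp(pole, np.array([t, 0.1 * s, 0.0]))
      for t, s in [(-0.4, 1), (-0.2, -1), (0.0, 1), (0.2, -1), (0.4, 1)]]
evals, evecs = tangent_pca(xs, pole)
print(evecs[:, 0])   # first principal geodesic direction, ~ +-[1, 0, 0]
```

The leading eigenvector, pushed through Exp_μ, traces the first principal geodesic.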
PGA of Kidney
(Figure: kidney shape variation along PGA modes 1–3.)
Robust Statistics: Motivation
- The mean is overly influenced by outliers, due to the sum-of-squares.
- Goal: a robust statistical description of shape or other manifold data.
- Deal with outliers due to imaging noise or data corruption.
- Examples: misdiagnosis, segmentation error, or an outlier in a population study.
Mean vs. Median in R^n
Mean: least-squares problem

  μ = arg min_{x ∈ R^n} Σ_i ||x − x_i||^2

Closed-form solution (the arithmetic average).

Geometric median, or Fermat-Weber point:

  m = arg min_{x ∈ R^n} Σ_i ||x − x_i||

No closed-form solution.
Weiszfeld Algorithm in R^n
- Gradient descent on the sum of distances:

  m_{k+1} = m_k − α G_k,
  G_k = [ Σ_{i∈I_k} (m_k − x_i)/||x_i − m_k|| ] · [ Σ_{i∈I_k} ||x_i − m_k||^{-1} ]^{-1}

- Step size: 0 < α ≤ 2
- Exclude singular points: I_k = { i : m_k ≠ x_i }
- Weiszfeld (1937), Ostresh (1978)
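The update above can be sketched directly in numpy; this is an illustrative implementation, not the talk's exact code:

```python
import numpy as np

def weiszfeld(xs, alpha=1.0, max_iter=500, tol=1e-12):
    """Weiszfeld iteration for the geometric median of points xs (N x d)."""
    m = xs.mean(axis=0)                       # initialize at the arithmetic mean
    for _ in range(max_iter):
        d = np.linalg.norm(xs - m, axis=1)
        keep = d > 1e-12                      # I_k: exclude singular points
        w = 1.0 / d[keep]
        G = ((m - xs[keep]) * w[:, None]).sum(axis=0) / w.sum()
        m_new = m - alpha * G
        if np.linalg.norm(m_new - m) < tol:
            return m_new
        m = m_new
    return m

pts = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 0.0]])
# The mean is pulled to 11/3 by the outlier at 10; the geometric median
# stays at the middle point (the 1-D median).
print(weiszfeld(pts))   # ~[1, 0]
```

With α = 1 this reduces to the classical fixed-point form m_{k+1} = (Σ w_i x_i)/(Σ w_i) with w_i = 1/||x_i − m_k||.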
Geometric Median on a Manifold
The geometric median of data x_i ∈ M is the point that minimizes the sum of geodesic distances:

  m = arg min_{x ∈ M} Σ_{i=1}^N d(x, x_i)
Fletcher, et al. CVPR 2008 and NeuroImage 2009.
Weiszfeld Algorithm for Manifolds
Gradient descent:

  m_{k+1} = Exp_{m_k}(α v_k),
  v_k = [ Σ_{i∈I_k} Log_{m_k}(x_i)/d(m_k, x_i) ] · [ Σ_{i∈I_k} d(m_k, x_i)^{-1} ]^{-1}
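On the sphere, with its closed-form Exp/Log maps (assumed here for illustration), the manifold Weiszfeld step looks like this:

```python
import numpy as np

def sphere_exp(p, v):
    t = np.linalg.norm(v)
    return p.copy() if t < 1e-12 else np.cos(t) * p + np.sin(t) * v / t

def sphere_log(p, q):
    w = q - np.dot(p, q) * p
    nw = np.linalg.norm(w)
    if nw < 1e-12:
        return np.zeros_like(p)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * w / nw

def manifold_median(xs, alpha=1.0, max_iter=1000, tol=1e-10):
    """Riemannian Weiszfeld iteration (sketch on the unit sphere)."""
    m = xs[0]
    for _ in range(max_iter):
        logs = np.stack([sphere_log(m, x) for x in xs])
        ds = np.linalg.norm(logs, axis=1)     # ||Log_m(x_i)|| = d(m, x_i)
        keep = ds > 1e-12                     # I_k: exclude singular points
        w = 1.0 / ds[keep]
        v = (logs[keep] * w[:, None]).sum(axis=0) / w.sum()
        if np.linalg.norm(v) < tol:
            break
        m = sphere_exp(m, alpha * v)
    return m

pole = np.array([0.0, 0.0, 1.0])
angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
xs = [sphere_exp(pole, 0.3 * np.array([np.cos(a), np.sin(a), 0.0])) for a in angles]
print(manifold_median(xs))   # ~[0, 0, 1] by symmetry
```

The only change from the Euclidean algorithm is that subtraction and addition are replaced by Log and Exp.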
Example: Rotations
Input data: 20 random rotations
Outlier set: random, rotated 90°
Example: Rotations
Mean
Median
0 outliers 5 outliers 10 outliers 15 outliers
Application: Diffusion Tensor MRI
Space of Positive-Definite Tensors
- Positive-definite, symmetric matrices: PD(n) = GL+(n)/SO(n)
- Riemannian manifold with nonpositive curvature
- Applications:
  - Diffusion tensor MRI: Fletcher (2004), Pennec (2004)
  - Structure tensor: Rathi (2007)
  - Bookstein's simplex shape space (1986)
Example: PD(2)
A ∈ PD(2) is of the form

  A = [ a  b ]
      [ b  c ],  with ac − b^2 > 0 and a > 0.

The situation is similar for PD(3), which is 6-dimensional.
Example: Tensors
Input data: 20 random tensors
Outlier set: random, rotated 90°
Example: Tensors
Mean
Median
0 outliers 5 outliers 10 outliers 15 outliers
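The slides do not spell out the metric here; a common choice for tensor data (Fletcher 2004, Pennec 2004) is the affine-invariant metric, under which d(A, B) = ||log(A^(−1/2) B A^(−1/2))||_F. A minimal sketch, offered as an illustration rather than the experiments' exact code:

```python
import numpy as np

def spd_inv_sqrt(A):
    """Inverse square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(w ** -0.5) @ V.T

def spd_dist(A, B):
    """Affine-invariant Riemannian distance on PD(n)."""
    S = spd_inv_sqrt(A)
    w, _ = np.linalg.eigh(S @ B @ S)          # eigenvalues of A^(-1/2) B A^(-1/2)
    return np.sqrt(np.sum(np.log(w) ** 2))

A = np.eye(2)
B = np.diag([np.e ** 2, 1.0])
print(spd_dist(A, B))   # sqrt(2^2 + 0^2) = 2.0
```

The name "affine-invariant" is literal: d(GAG^T, GBG^T) = d(A, B) for any invertible G, which is why the metric behaves well under changes of coordinates.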
Example on Kendall Shape Spaces
Hand shapes
Outliers
Example on Kendall Shape Spaces
Mean:
# Outliers: 0 2 6 12
Median:
# Outliers: 0 2 6 12
Image Metamorphosis
- Metric between images
- Includes both deformation and intensity change

  U(v_t, I_t) = ∫_0^1 ||v_t||_V^2 dt + (1/σ^2) ∫_0^1 || dI_t/dt + ⟨∇I_t, v_t⟩ ||_{L^2}^2 dt
Fig. 1. Metamorphosis geodesic between two 3D brain images. Mid-axial (top row) and mid-coronal (bottom row) slices are shown.
The theory of robust estimation has led to the development of numerous robust estimators, of which the L1-estimator, also known as the geometric median, is one of the best known. Given a set of points {x_i, i = 1, ..., n} ⊂ R^d, with the usual Euclidean norm ||x||, the L1-estimator is defined as the point m ∈ R^d minimizing Σ_{i=1}^n ||m − x_i||. It can be shown (Lopuhaä and Rousseeuw, 1991) that this estimator has a breakdown point of 0.5, which means that half of the data needs to be corrupted in order to corrupt this estimator. In Figure 2 we illustrate this by showing how the geometric median and the mean are displaced in the presence of a few outliers.

Fig. 2. The geometric median (marked with a •) and mean (marked with a ×) for a collection of points in the plane. Notice how the few outliers at the top right of the picture have forced the mean away from the points, whereas the median remains centrally located.

The existence and uniqueness of the median in R^d follows directly from the convexity of the distance function. In one dimension, the geometric median is the point that divides the point set into equal halves on either side (if n is odd) and is any point on the line segment connecting the two middle points (if n is even). In general, however, computing the geometric median is difficult; Bajaj has shown that the solution cannot be expressed using radicals (arithmetic operations and kth roots) (Bajaj, 1988).

There are two main approaches to computing the geometric median of a collection of points in R^d. One way is to compute an approximate median m̃ such that Σ_{i=1}^n ||m̃ − x_i|| is at most a (1 + ε)-factor larger than the cost of the optimal median. This can be computed using the ellipsoid method (Chandrasekaran and Tamir, 1990). A more efficient algorithm achieving the same result is due to Bose et al. (2003).

These algorithms do not generalize beyond Euclidean spaces. A more general iterative algorithm due to Weiszfeld (1937), later improved by Kuhn and Kuenne (1962) and Ostresh (1978), converges to the optimal solution in Euclidean spaces (Kuhn, 1973), and was subsequently generalized to Banach spaces by Eckhardt (1980).

Several other robust estimators of centrality have been proposed in the statistics literature (Maronna et al., 2006). Winsorized means, where a percentage of extreme values are clamped, and trimmed means, where extreme values are removed, have been used for univariate data. The drawback of these methods is that they require a somewhat arbitrary selection of a threshold. M-estimators (Huber, 1981) are a generalization of maximum likelihood methods in which some function of the data is minimized. The geometric median is a special case of an M-estimator with an L1 cost function.
3. The Riemannian Geometric Median
Let M be a Riemannian manifold. Given points x_1, ..., x_n ∈ M and corresponding positive real weights w_1, ..., w_n with Σ_i w_i = 1, define the weighted sum-of-distances function f(x) = Σ_i w_i d(x, x_i), where d is the Riemannian distance function on M. Throughout, we will assume that the x_i lie in a convex set U ⊂ M, i.e., any two points in U are connected by a unique shortest geodesic lying entirely in U. We define the weighted geometric median,
Example: Metamorphosis
Input Data
Mean Median
ratio = 1.13 ratio = 1.04
Example: Metamorphosis
Fig. 12. Midaxial slices from the four input 3D MR images (left). The resulting geometric median atlas (right).
References
Bajaj, C., 1988. The algebraic degree of geometric optimization problems. Discrete and Computational Geometry 3, 177–191.
Barmpoutis, A., Vemuri, B. C., Shepherd, T. M., Forder, J. R., 2007. Tensor splines for interpolation and approximation of DT-MRI with application to segmentation of isolated rat hippocampi. IEEE Transactions on Medical Imaging 26 (11), 1537–1546.
Basser, P. J., Mattiello, J., Bihan, D. L., 1994. MR diffusion tensor spectroscopy and imaging. Biophysics Journal 66, 259–267.
Batchelor, P., Moakher, M., Atkinson, D., Calamante, F., Connelly, A., 2005. A rigorous framework for diffusion tensor calculus. Magnetic Resonance in Medicine 53, 221–225.
Bigun, J., Granlund, G., Wiklund, J., 1991. Multidimensional orientation estimation with application to texture analysis and optical flow. IEEE Transactions on Pattern Analysis and Machine Intelligence 13 (8), 775–790.
Bookstein, F. L., 1986. Size and shape spaces for landmark data in two dimensions (with discussion). Statistical Science 1 (2), 181–242.
Bose, P., Maheshwari, A., Morin, P., 2003. Fast approximations for sums of distances, clustering and the Fermat-Weber problem. Comput. Geom. Theory Appl. 24 (3), 135–146.
Buss, S. R., Fillmore, J. P., 2001. Spherical averages and applications to spherical splines and interpolation. ACM Transactions on Graphics 20 (2), 95–126.
Chandrasekaran, R., Tamir, A., 1990. Algebraic optimization: The Fermat-Weber problem. Mathematical Programming 46, 219–224.
Cheeger, J., Ebin, D. G., 1975. Comparison Theorems in Riemannian Geometry. North-Holland.
Cootes, T. F., Taylor, C. J., Cooper, D. H., Graham, J., 1995. Active shape models – their training and application. Comp. Vision and Image Understanding 61 (1), 38–59.
Corouge, I., Fletcher, P. T., Joshi, S., Gouttard, S., Gerig, G., 2006. Fiber tract-oriented statistics for quantitative diffusion tensor MRI analysis. Medical Image Analysis 10 (5), 786–798.
Eckhardt, U., 1980. Weber's problem and Weiszfeld's algorithm in general spaces. Mathematical Programming 18, 186–196.
Fletcher, P. T., Joshi, S., 2004. Principal geodesic analysis on symmetric spaces: statistics of diffusion tensors. In: Proceedings of ECCV Workshop on Computer Vision Approaches to Medical Image Analysis. pp. 87–98.
Fletcher, P. T., Lu, C., Joshi, S., 2003. Statistics of shape via principal geodesic analysis on Lie groups. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. pp. 95–101.
Fletcher, P. T., Tao, R., Jeong, W.-K., Whitaker, R. T., 2007. A volumetric approach to quantifying region-to-region white matter connectivity in diffusion tensor MRI. In: Proceedings of Information Processing in Medical Imaging (IPMI). pp. 346–358.
Fréchet, M., 1948. Les éléments aléatoires de nature quelconque dans un espace distancié. Ann. Inst. H. Poincaré 10 (3), 215–310.
Garcin, L., Younes, L., 2005. Geodesic image matching: a wavelet based energy minimization scheme. In: Workshop on Energy Minimization Methods in Computer Vision and Pattern Recognition.
Input Data Median Atlas
Preliminaries
- x_i ∈ U ⊂ M, where U is a convex subset
- diam(U) = max_{x,y ∈ U} d(x, y)
Existence and Uniqueness
Theorem. The weighted geometric median exists and is unique if
1. the sectional curvatures of M are bounded above by Δ > 0 and diam(U) < π/(2√Δ), or
2. the sectional curvatures of M are nonpositive.

The proof is by showing the convexity of geodesic distance. These are the same conditions that ensure existence and uniqueness of the mean (Karcher).
Robustness
- Breakdown point: the percentage of points that can be moved to infinity before the statistic itself goes to infinity
- Euclidean mean: 0%
- Euclidean geometric median: 50%
- The same result holds for noncompact manifolds
- Does not make sense for compact manifolds
Convergence Theorem for Manifold Weiszfeld Algorithm
Theorem. If the sectional curvatures of M are nonnegative and the existence/uniqueness conditions are satisfied, then

  lim_{k→∞} m_k = m  for 0 < α ≤ 2.
Describing Shape Change
.Describing Shape Change
- How does shape change over time?
- Changes due to growth, aging, disease, etc.
- Example: 100 healthy subjects, 20–80 years old
- We need regression of shape!
Regression Analysis
- Describes the relationship between a dependent random variable Y and an independent random variable T.
- Given observations (T_i, Y_i), find the regression function Y = f(T).
- Often phrased as a conditional expectation: E[Y | T = t] = f(t).
- Parametric (e.g., linear) or nonparametric (e.g., kernel).
Kernel Regression (Nadaraya-Watson)
Define the regression function through weighted averaging:

  f(t) = Σ_{i=1}^N w_i(t) Y_i,  where  w_i(t) = K_h(t − T_i) / Σ_{j=1}^N K_h(t − T_j)
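The estimator is a few lines of numpy; the Gaussian kernel below is one common choice of K_h, assumed here for illustration:

```python
import numpy as np

def nadaraya_watson(t, T, Y, h):
    """Kernel-weighted average of the Y_i, with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((t - T) / h) ** 2)   # unnormalized K_h(t - T_i)
    w = w / w.sum()                          # normalize so the weights sum to 1
    return np.sum(w * Y)

T = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
Y = 3.0 * T                                  # data on a line through the origin
print(nadaraya_watson(0.0, T, Y, h=1.0))     # 0.0 (weights symmetric about t = 0)
```

The bandwidth h controls the bias-variance trade-off: small h tracks the data closely, large h smooths toward a global average.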
Example: Gray Matter Volume
(Figure: gray matter volume vs. age, with a kernel K_h(t − s) centered at t and the resulting kernel regression estimate f(t).)
Manifold Kernel Regression
Using the Fréchet weighted average:

  m̂_h(t) = arg min_{y ∈ M} Σ_{i=1}^N w_i(t) d(y, Y_i)^2

Davis, et al. ICCV 2007
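Combining the kernel weights with a weighted Fréchet mean gives a compact sketch; again the sphere maps are assumed closed-form stand-ins for a general manifold's Exp/Log:

```python
import numpy as np

def sphere_exp(p, v):
    t = np.linalg.norm(v)
    return p.copy() if t < 1e-12 else np.cos(t) * p + np.sin(t) * v / t

def sphere_log(p, q):
    w = q - np.dot(p, q) * p
    nw = np.linalg.norm(w)
    if nw < 1e-12:
        return np.zeros_like(p)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * w / nw

def weighted_frechet_mean(ys, w, max_iter=100, tol=1e-10):
    """Gradient descent for the weighted Frechet average (weights sum to 1)."""
    mu = ys[int(np.argmax(w))]
    for _ in range(max_iter):
        delta = sum(wi * sphere_log(mu, y) for wi, y in zip(w, ys))
        if np.linalg.norm(delta) < tol:
            break
        mu = sphere_exp(mu, delta)
    return mu

def manifold_kernel_regression(t, T, Ys, h):
    """Nadaraya-Watson weights combined with a weighted Frechet average."""
    w = np.exp(-0.5 * ((t - T) / h) ** 2)
    return weighted_frechet_mean(Ys, w / w.sum())

pole = np.array([0.0, 0.0, 1.0])
ys = [sphere_exp(pole, np.array([-0.3, 0.0, 0.0])),
      sphere_exp(pole, np.array([0.3, 0.0, 0.0]))]
# At t = 0 the two weights are equal, so the estimate is the geodesic midpoint:
print(manifold_kernel_regression(0.0, np.array([-1.0, 1.0]), ys, h=1.0))
```

In Euclidean space this reduces exactly to the Nadaraya-Watson estimator, since the weighted Fréchet mean of points in R^n is their weighted average.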
Brain Shape Regression
Application: Healthy Brain Aging
(Kernel-regressed brain images at ages 35–53.)
Acknowledgements
Collaborators:
University of Utah
- Sarang Joshi
- Ross Whitaker
- Josh Cates
- Suresh Venkatasubramanian
- Steve Pizer (UNC)
- Brad Davis (Kitware)
Funding:
- NA-MIC, NIH U54 EB005149
- NIH R01 EB007688-01A1
Books
Dryden and Mardia, Statistical Shape Analysis, Wiley, 1998.
Small, The Statistical Theory of Shape, Springer-Verlag, 1996.
Kendall, Barden and Carne, Shape and Shape Theory, Wiley, 1999.
Krim and Yezzi, Statistics and Analysis of Shapes, Birkhäuser, 2006.
Papers
Kendall, Shape manifolds, Procrustean metrics, and complex projective spaces. Bull. London Math. Soc., 16:81–121, 1984.
Fletcher, Joshi, Lu, Pizer, Principal geodesic analysis for the study of nonlinear statistics of shape, IEEE TMI, 23(8):995–1005, 2004.
Pennec, Intrinsic statistics on Riemannian manifolds: basic tools for geometric measurements. JMIV, 25(1):127–154, 2006.
Davis, Fletcher, Bullitt, Joshi, Population shape regression from random design data, ICCV 2007.
Fletcher, Venkatasubramanian, Joshi, The geometric median on Riemannian manifolds with application to robust atlas estimation. NeuroImage, 45:S143–S152, 2009.