Factor Analysis

1. Model

The factor model assumes that the observed random vector $X = (X_1, \dots, X_p)'$, with mean $\mu$, is linearly dependent upon a few common factors and specific factors:

$$X - \mu = L F + \epsilon,$$

where $L \in \mathbb{R}^{p \times m}$ is the factor loading matrix, $F = (F_1, \dots, F_m)'$ is the factor (common factor), and $\epsilon = (\epsilon_1, \dots, \epsilon_p)'$ is the noise (specific factor).

Note that the representation is not unique, since for any orthogonal matrix $T$ (so $TT' = T'T = I$)

$$X - \mu = L F + \epsilon = (LT)(T'F) + \epsilon = L^* F^* + \epsilon,$$

i.e. $L^* = LT$ and $F^* = T'F$ define the same model.

There are some assumptions: the factors have zero mean and identity covariance, the noise terms are mutually uncorrelated with specific variances $\psi_i$,

$$E(F) = 0, \qquad \operatorname{Cov}(F) = E(FF') = I_m,$$

$$E(\epsilon) = 0, \qquad \operatorname{Cov}(\epsilon) = E(\epsilon\epsilon') = \Psi = \operatorname{diag}(\psi_1, \dots, \psi_p),$$

and the factors are independent of the noise, so that $\operatorname{Cov}(\epsilon, F) = E(\epsilon F') = 0$.
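As a quick sanity check of the model, the following minimal simulation sketch (dimensions, seed, and parameter values are arbitrary illustrative choices, not taken from the notes) draws $F$ and $\epsilon$ under the assumptions above and confirms that the sample covariance of $X$ approaches $LL' + \Psi$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, m, n = 5, 2, 100_000               # observed dims, number of factors, sample size

L = rng.normal(size=(p, m))           # factor loadings (made-up values)
psi = rng.uniform(0.2, 1.0, size=p)   # specific variances, the diagonal of Psi
mu = np.zeros(p)

F = rng.normal(size=(n, m))                   # common factors, Cov(F) = I
eps = rng.normal(size=(n, p)) * np.sqrt(psi)  # specific factors, Cov(eps) = Psi
X = mu + F @ L.T + eps                        # X - mu = L F + eps

S = np.cov(X, rowvar=False)                   # sample covariance
Sigma = L @ L.T + np.diag(psi)                # implied covariance LL' + Psi
print(np.max(np.abs(S - Sigma)))              # small for large n
```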

2. Covariance Matrix

The total covariance matrix is

$$\Sigma = \operatorname{Cov}(X) = E\big[(X-\mu)(X-\mu)'\big] = L\operatorname{Cov}(F)L' + \operatorname{Cov}(\epsilon) = LL' + \Psi,$$

and the variance of each component can be decomposed into two parts, the communality and the specific variance:

$$\operatorname{Var}(X_i) = \sigma_{ii} = \underbrace{\ell_{i1}^2 + \cdots + \ell_{im}^2}_{h_i^2\ (\text{communality})} + \underbrace{\psi_i}_{\text{specific variance}}.$$

Notice that the communality is not affected by an orthogonal rotation $T$, since $L^*L^{*\prime} = (LT)(LT)' = LL'$.
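For a concrete made-up one-factor example with $p = 2$, take $L = (0.9,\ 0.7)'$ and $\Psi = \operatorname{diag}(0.19,\ 0.51)$; then

$$\Sigma = LL' + \Psi = \begin{pmatrix} 0.81 & 0.63 \\ 0.63 & 0.49 \end{pmatrix} + \begin{pmatrix} 0.19 & 0 \\ 0 & 0.51 \end{pmatrix} = \begin{pmatrix} 1 & 0.63 \\ 0.63 & 1 \end{pmatrix},$$

so the communalities are $h_1^2 = 0.81$ and $h_2^2 = 0.49$ and each variable has total variance $1$.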

3. Methods of Estimation

3.1. PCA

  1. Initialize the specific variances, e.g. $\hat\Psi^{(0)} = 0$ (the plain principal component solution) or, when working with the correlation matrix $R$, $\hat\psi_i^{(0)} = 1/r^{ii}$ with $r^{ii}$ the $i$-th diagonal element of $R^{-1}$.

  2. Find $\hat L = \big[\sqrt{\hat\lambda_1}\,\hat e_1, \dots, \sqrt{\hat\lambda_m}\,\hat e_m\big]$, built from the $m$ largest eigenvalue/eigenvector pairs of the eigen decomposition of $S - \hat\Psi$.

  3. Update $\hat\Psi = \operatorname{diag}\big(S - \hat L\hat L'\big)$.

  4. Repeat steps 2 and 3 until the estimates stabilize (a sketch of this iteration follows the list).
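A minimal NumPy sketch of this iteration (the function name, zero initialization, and tolerance are my own choices; `S` is assumed to be the $p \times p$ sample covariance or correlation matrix):

```python
import numpy as np

def principal_factor(S, m, n_iter=50, tol=1e-8):
    """Iterated principal-factor estimates (L_hat, psi_hat) with S ~ LL' + Psi."""
    p = S.shape[0]
    psi = np.zeros(p)                         # step 1: initial specific variances
    for _ in range(n_iter):
        vals, vecs = np.linalg.eigh(S - np.diag(psi))
        idx = np.argsort(vals)[::-1][:m]      # m largest eigenvalues
        lam = np.clip(vals[idx], 0.0, None)
        E = vecs[:, idx]
        L = E * np.sqrt(lam)                  # step 2: columns sqrt(lambda_j) * e_j
        psi_new = np.diag(S - L @ L.T)        # step 3: update specific variances
        if np.max(np.abs(psi_new - psi)) < tol:
            psi = psi_new
            break
        psi = psi_new                         # step 4: repeat
    return L, psi
```

The returned loadings are only determined up to an orthogonal rotation, which is exactly what the factor rotation step below addresses.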

3.2. Maximum Likelihood Method

Assumption: the common factors and the specific factors are jointly normally distributed.

The maximum likelihood estimate is not well defined because of the multiplicity of choices of $L$ (any rotation $LT$ yields the same likelihood), so we impose the computationally convenient uniqueness condition

$$L'\Psi^{-1}L = \Delta, \quad \text{a diagonal matrix},$$

and obtain $\hat L$ and $\hat\Psi$ by numerical maximization of the resulting likelihood.
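In practice, one convenient option is scikit-learn's `FactorAnalysis`, which fits the Gaussian factor model by maximum likelihood; this is a library choice of mine, not one named in the notes, and the toy data below are purely illustrative:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
p, m, n = 6, 2, 5000                              # made-up problem sizes
L_true = rng.normal(size=(p, m))                  # "true" loadings (illustrative)
psi_true = rng.uniform(0.3, 1.0, size=p)          # "true" specific variances
X = rng.normal(size=(n, m)) @ L_true.T + rng.normal(size=(n, p)) * np.sqrt(psi_true)

fa = FactorAnalysis(n_components=m).fit(X)
L_hat = fa.components_.T                          # estimated loadings, shape (p, m)
psi_hat = fa.noise_variance_                      # estimated specific variances

# the fitted LL' + Psi should be close to the sample covariance
print(np.round(L_hat @ L_hat.T + np.diag(psi_hat) - np.cov(X, rowvar=False), 2))
```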

4. Factor Rotation

We have different choices of loadings, since any rotated loading matrix $\hat L^* = \hat L T$ (with $T$ orthogonal) reproduces the covariance matrix equally well. Ideally, we should like to see a pattern of loadings such that each variable loads highly on a single factor and has small to moderate loadings on the remaining factors.

We here introduce the varimax criterion. Define the scaled rotated loadings $\tilde\ell_{ij}^* = \hat\ell_{ij}^*/\hat h_i$ (rotated loadings divided by the square root of the communalities). The varimax procedure selects the orthogonal transformation $T$ that maximizes

$$V = \frac{1}{p} \sum_{j=1}^{m} \left[ \sum_{i=1}^{p} \tilde\ell_{ij}^{*4} - \frac{1}{p}\left( \sum_{i=1}^{p} \tilde\ell_{ij}^{*2} \right)^{2} \right],$$

i.e. the sum over factors of the variances of the squared scaled loadings.
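Below is a minimal NumPy sketch of the standard SVD-based varimax iteration (the function name, iteration limit, and tolerance are my own choices; Kaiser normalization by the communalities, as in the scaled loadings $\tilde\ell_{ij}^*$ above, is omitted for brevity). It rotates a loading matrix without changing $\hat L \hat L'$:

```python
import numpy as np

def varimax(L, n_iter=100, tol=1e-8):
    """Return (L @ T, T) where T is an orthogonal varimax rotation."""
    p, m = L.shape
    T = np.eye(m)
    obj = 0.0
    for _ in range(n_iter):
        LT = L @ T
        # target matrix used in the standard SVD-based varimax update
        B = L.T @ (LT ** 3 - LT @ np.diag(np.sum(LT ** 2, axis=0)) / p)
        U, s, Vt = np.linalg.svd(B)
        T = U @ Vt                    # nearest orthogonal matrix to B
        obj_new = s.sum()
        if obj_new - obj < tol:       # stop when the criterion stops improving
            break
        obj = obj_new
    return L @ T, T
```

For example, `L_rot, T = varimax(L_hat)` rotates the estimated loadings toward the simple structure described above while leaving the communalities unchanged.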

5. References

  • Andrew Ng's lecture notes on Factor Analysis.

  • Johnson & Wichern, Applied Multivariate Statistical Analysis, 6th edition, Chapter 9.

  • Lin Hou's lecture notes on Factor Analysis (Tsinghua University, Multivariate Statistical Analysis).
