Variance of the Estimated Eigenvalue of Principal Component Analysis and Nonlinear Principal Component Analysis

Nonlinear Principal Component Analysis (PRINCALS) is an extension of (linear) Principal Component Analysis that can simultaneously reduce the variables of mixed-scale multivariable data (nominal, ordinal, interval, and ratio). This study investigated the variance of the estimated eigenvalue, Var(λ̂), of linear and nonlinear Principal Component Analysis.


Introduction
Principal Component Analysis, discussed among others by Joliffe, Bolton, and Vichi, is an analysis that reduces variables such that the variables formed do not correlate with each other [1]. [2] discusses Principal Component Analysis for ordinal-scale data, while Nonlinear Principal Component Analysis is a method for reducing variables from mixed-scale multivariable data simultaneously.
In Principal Component Analysis one must determine the eigenvalues (λ), which are the basis for determining the principal components. The eigenvalues obtained in practice are sample eigenvalues, because studying the whole population faces limitations such as time, cost, and effort.
Because the eigenvalue (λ̂) obtained is a sample eigenvalue that is expected to represent the population, it is only an estimator of the true eigenvalue (λ). An estimator is said to be good if it meets the criteria of being unbiased, efficient, and consistent.
This research determines the variance of the estimated eigenvalue, Var(λ̂), of linear and nonlinear Principal Component Analysis with the Delta Method.
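The role of the sample eigenvalues as estimators of the population eigenvalues can be sketched numerically. The sample size, dimension, and covariance matrix below are illustrative assumptions, not values from this study:

```python
# Minimal sketch: sample eigenvalues of the sample covariance matrix
# serve as estimators of the population eigenvalues.
import numpy as np

rng = np.random.default_rng(0)

# Simulate n observations of m correlated variables (illustrative values).
n, m = 500, 3
cov = np.array([[4.0, 1.0, 0.5],
                [1.0, 2.0, 0.3],
                [0.5, 0.3, 1.0]])
X = rng.multivariate_normal(np.zeros(m), cov, size=n)

S = np.cov(X, rowvar=False)            # sample variance-covariance matrix
eigvals = np.linalg.eigvalsh(S)[::-1]  # sample eigenvalues, decreasing order

print(eigvals)  # estimates of the population eigenvalues of cov
```

The sample eigenvalues sum to the trace of S, mirroring the population relationship between eigenvalues and total variance.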

Principal Component Analysis
Principal Component Analysis (linear Principal Component Analysis) is an analysis that reduces variables such that the variables formed do not correlate with each other ([1] and [3]).
For more details on the application of Principal Component Analysis, see [4] and [5].
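The defining property above, that the variables formed do not correlate with each other, can be verified in a minimal sketch on illustrative simulated data: the principal component scores obtained from the eigenvectors of the sample covariance matrix are mutually uncorrelated.

```python
# Sketch: principal component scores are mutually uncorrelated.
import numpy as np

rng = np.random.default_rng(1)
X = rng.multivariate_normal([0, 0, 0],
                            [[3, 1, 1], [1, 2, 0.5], [1, 0.5, 1]], size=1000)
Xc = X - X.mean(axis=0)                  # center the data
S = np.cov(Xc, rowvar=False)             # sample covariance matrix
vals, vecs = np.linalg.eigh(S)           # eigen-decomposition of S
scores = Xc @ vecs                       # principal component scores

C = np.cov(scores, rowvar=False)         # diagonal up to rounding error
off_diag = C - np.diag(np.diag(C))
print(np.abs(off_diag).max())            # near zero: components uncorrelated
```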

PRINCALS (Principal Component Analysis by means of Alternating Least Squares), or Principal Component Analysis using the alternating least squares approach, commonly called Nonlinear Principal Component Analysis, is a development of Principal Component Analysis [6].
PRINCALS is a method that analyzes mixed-scale data simultaneously by grouping variables whose linear correlations are aligned into one principal component [7].
If R is the sample correlation matrix of order m × m with pairs of estimated eigenvalues and eigenvectors (λ̂_j, ê_j), j = 1, ..., m, then λ̂_j is the estimated eigenvalue of the matrix R and has the unbiasedness property. PRINCALS itself is based on the meet-loss theory, with a homogeneity loss function. For multivariable data without missing data, the estimated eigenvalue can be obtained by PRINCALS from the matrix R(Q), the correlation matrix between the combined linear scores (the linear composite scores) of all sets of the matrix Q on all dimensions [6].
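The following is a simplified illustration only, not the full PRINCALS alternating-least-squares algorithm: ordinal categories are replaced by assumed integer quantifications, standardized, and the eigenvalues of the resulting correlation matrix (playing the role of R(Q)) are computed. The category labels and quantifications are hypothetical.

```python
# Simplified illustration (not the full alternating-least-squares algorithm):
# replace ordinal categories by integer quantifications, standardize, and
# take eigenvalues of the resulting correlation matrix, standing in for R(Q).
import numpy as np

ordinal = np.array([["low", "med", "high", "med", "low", "high"],
                    ["bad", "ok", "good", "good", "bad", "ok"]]).T
levels = [{"low": 1, "med": 2, "high": 3}, {"bad": 1, "ok": 2, "good": 3}]

Q = np.column_stack([[levels[j][v] for v in ordinal[:, j]]
                     for j in range(ordinal.shape[1])]).astype(float)
Q = (Q - Q.mean(axis=0)) / Q.std(axis=0)   # standardized quantified scores

R = np.corrcoef(Q, rowvar=False)           # correlation of quantified scores
eig = np.linalg.eigvalsh(R)[::-1]          # eigenvalues used by the reduction
print(eig)
```

In the real algorithm the quantifications are not fixed in advance but re-estimated in each alternating least squares step.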

Delta Method
The Delta Method is a method for estimating the expected value and variance of random variables [10]. In this study, the Delta Method is used to derive the variance of the estimated eigenvalue. The estimator B̂ of B is obtained by the Maximum Likelihood Estimation method, and vec(B̂) has a multivariate normal distribution [11].
As n tends to infinity, the asymptotic m-variate distribution is normal with mean vector zero and variance-covariance matrix nH′VH, where H is as mentioned above and V is the matrix obtained from the submatrix of dimension [m(m+1)/2] × [m(m+1)/2] at the bottom of the matrix V* by multiplying its elements by the corresponding factors 1/4 and 1/2, so that:
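A hedged numerical sketch of the Delta Method idea in the scalar case: for an estimator θ̂ of θ with variance v, Var(g(θ̂)) ≈ g′(θ)² · v. The choice g = log and the parameter values below are illustrative assumptions, not quantities from this study:

```python
# Sketch of the delta method: Var(g(theta_hat)) ≈ g'(theta)^2 * Var(theta_hat),
# checked by Monte Carlo simulation for g = log applied to a sample mean.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 5.0, 1.0, 200

# Delta-method approximation for g(x) = log(x): g'(mu)^2 * Var(mean)
delta_var = (1.0 / mu) ** 2 * sigma**2 / n

# Monte Carlo check: simulate many sample means, take log, compare variances
reps = 20000
means = rng.normal(mu, sigma / np.sqrt(n), size=reps)
mc_var = np.log(means).var()

print(delta_var, mc_var)   # the two values should be close
```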

Research Methods
Broadly speaking, this study was conducted by determining the sample variance-covariance matrix S, the estimated eigenvalues and eigenvectors of S, the matrix H, the matrix V*, the matrix s², and the confidence interval of the estimated eigenvalue (λ̂). From the results of these steps, the variance of the estimated eigenvalue, Var(λ̂), of (linear) Principal Component Analysis and Nonlinear Principal Component Analysis is obtained.
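As a numerical sanity check on such a variance, the sketch below compares the empirical variance of the largest sample eigenvalue with the classical asymptotic approximation Var(λ̂_j) ≈ 2λ_j²/n for multivariate normal data with distinct eigenvalues. The population covariance matrix is an illustrative assumption, and the formula is the classical normal-theory result, not necessarily the exact expression derived in this study:

```python
# Compare the empirical variance of the largest sample eigenvalue against the
# classical asymptotic approximation 2*lambda^2/n (normal data, distinct roots).
import numpy as np

rng = np.random.default_rng(3)
n = 400
pop_cov = np.diag([5.0, 2.0, 1.0])        # population eigenvalues: 5, 2, 1

reps = 3000
top = np.empty(reps)
for r in range(reps):
    X = rng.multivariate_normal(np.zeros(3), pop_cov, size=n)
    top[r] = np.linalg.eigvalsh(np.cov(X, rowvar=False))[-1]

print(top.var())          # empirical variance of the largest eigenvalue
print(2 * 5.0**2 / n)     # classical asymptotic approximation
```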

Results and Discussion
Research in everyday life often cannot be done on populations for various reasons, such as time problems, cost issues, and others. This is where the foresight of researchers is needed to determine a "good" parameter estimator that represents the population, namely one that is unbiased, consistent, and efficient [9]. Principal Component Analysis (linear) and Nonlinear Principal Component Analysis are analyses that reduce variables; their application can be seen in the research of [12]. This analysis can also be used for data derived from multivariate data (see [13]), especially data that have no outliers. In the case of data in multivariate linear models [9], it is necessary to detect whether the model has outliers, as in [14], [15], [16], and in the research of [17].
For example, suppose a population has eigenvalue λ. If a sample is taken from that population, then the estimated eigenvalue λ̂ can be obtained. To check that λ̂ is a "good" estimator for λ, the expectation, variance, and distribution of the estimated eigenvalue are derived next.
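The consistency criterion can be illustrated by simulation: as the sample size grows, the estimated eigenvalue approaches the population eigenvalue. The population covariance matrix below is an illustrative assumption:

```python
# Sketch of consistency: the largest sample eigenvalue approaches the
# population eigenvalue (here 4) as the sample size n grows.
import numpy as np

rng = np.random.default_rng(4)
pop_cov = np.diag([4.0, 1.0])             # population eigenvalues are 4 and 1

results = {}
for n in (50, 500, 5000):
    X = rng.multivariate_normal(np.zeros(2), pop_cov, size=n)
    lam = np.linalg.eigvalsh(np.cov(X, rowvar=False))[-1]
    results[n] = lam                      # largest sample eigenvalue
    print(n, lam)                         # tends toward 4 as n grows
```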

Variance of the Estimated Eigenvalue of Principal Component Analysis (Linear)
Consider a sample from a multivariate normal distribution population with n objects and m variables; it can be written in matrix form as shown in Table 1 (The Form of a Matrix). Here μ = the mean vector of the population, x̄ = the mean vector of the sample, and Σ = the variance-covariance matrix of the population.
From equation (13), it can be concluded that the variance of the estimated eigenvalue of linear and nonlinear Principal Component Analysis is:

Variance of the Estimated Eigenvalue of Nonlinear Principal Component Analysis
A sample coming from a multivariate normal distribution population with n objects and m variables is a random sample of size n and dimension m. This sample has mean vector x̄, variance-covariance matrix S, and correlation matrix R. If S is the variance-covariance matrix of order m × m with a pair of estimated eigenvalues and eigenvectors, namely:

If x̄ is the sample mean vector of the n objects and m variables, then the estimator of Σ is Sn, the sample variance-covariance matrix, whose elements are divided by n. Sn is a biased estimator of Σ. In order not to be biased, a correction is made with the factor C = n/(n − 1), so that the unbiased estimator of Σ is S = C·Sn, with elements s_ij = (1/(n − 1)) Σ_{k=1}^{n} (x_ki − x̄_i)(x_kj − x̄_j).
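A minimal numerical sketch of the correction, assuming one illustrative variable: the divide-by-n estimator Sn is biased for the population variance-covariance matrix, and multiplying by the factor C = n/(n − 1) reproduces the unbiased estimator S:

```python
# Sketch: Sn (divide by n) is biased for the covariance matrix; multiplying
# by the correction factor C = n/(n - 1) gives the unbiased estimator S.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(0.0, 2.0, size=(10, 1))    # n = 10 observations, one variable

n = X.shape[0]
Xc = X - X.mean(axis=0)
Sn = (Xc.T @ Xc) / n                      # biased: divides by n
S = Sn * (n / (n - 1))                    # unbiased: correction factor C

print(Sn[0, 0], S[0, 0], np.var(X, ddof=1))  # S matches the ddof=1 variance
```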
