Posts

Showing posts with the label PCA

Use PCA in Machine Learning

Matlab Code

% Source:
% http://www.dcs.gla.ac.uk/~srogers/firstcourseml/matlab/chapter7/pcaexample.html#2
clear all; close all;

% Generate the data
Y = [randn(20,2); randn(20,2)+5; randn(20,2)-5];

% Add 5 random dimensions
N = size(Y,1);
Y = [Y randn(N,5)];
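The Matlab snippet above only generates the data: three 2-D clusters padded with five pure-noise dimensions. As a hedged sketch of where the example goes next (my own continuation, not taken from the linked course page), the same idea in NumPy, followed by an eigen-decomposition of the covariance matrix and a projection back down to 2 components, could look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three 2-D clusters, as in the Matlab snippet
Y = np.vstack([rng.standard_normal((20, 2)),
               rng.standard_normal((20, 2)) + 5,
               rng.standard_normal((20, 2)) - 5])

# Add 5 pure-noise dimensions
N = Y.shape[0]
Y = np.hstack([Y, rng.standard_normal((N, 5))])

# PCA: centre the data, form the covariance matrix, eigen-decompose it
Yc = Y - Y.mean(axis=0)
C = np.cov(Yc, rowvar=False)
vals, vecs = np.linalg.eigh(C)        # eigenvalues in ascending order
order = np.argsort(vals)[::-1]        # largest variance first
vals, vecs = vals[order], vecs[:, order]

# Project onto the top 2 principal components
Z = Yc @ vecs[:, :2]
print(Z.shape)  # (60, 2)
```

Because the cluster structure lives in the first two original dimensions, the two leading eigenvalues dominate the five noise eigenvalues, and the 2-D projection recovers the three clusters.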

Using PCA on Three Dimensional Dataset

In this work, we apply PCA to a three-dimensional data set.

Matlab Code

% PCA Model
clear all, clc, close all
hold on
axis equal
axis([-2 2 -2 2 -2 2])

% Step 1: Get some data
X = [1 2 -1 -2 0; 0.2 0 0.1 0.2 -0.4; 1.2 0.3 -1 -0.1 -0.4]';

% Step 2: Subtract the mean
plot3(X(:,1),X(:,2),X(:,3),'ko');
XAdjust = X - repmat(mean(X),size(X,1),1);
plot3(XAdjust(:,1),XAdjust(:,2),XAdjust(:,3),'ro');

% Step 3: Calculate the covariance matrix
CM = cov(X);

% Step 4: Calculate the eigenvalues and eigenvectors
[V, D] = eig(CM);

% Step 5: Choose components
f1 = V(:,1)'; f2 = V(:,2)'; f3 = V(:,3)';
F = [f1; f2; f3];
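The Matlab code stops after forming the feature matrix in step 5. As a sketch of the remaining step (the projection itself is my addition, not in the original snippet), here is the same pipeline in NumPy on the same five 3-D points, ending with the centred data projected onto the leading component; variable names mirror the Matlab code:

```python
import numpy as np

# Step 1: Get some data (the same 5 points in 3-D as the Matlab code)
X = np.array([[1, 2, -1, -2, 0],
              [0.2, 0, 0.1, 0.2, -0.4],
              [1.2, 0.3, -1, -0.1, -0.4]]).T

# Step 2: Subtract the mean
XAdjust = X - X.mean(axis=0)

# Step 3: Calculate the covariance matrix (rows are observations)
CM = np.cov(X, rowvar=False)

# Step 4: Eigenvalues and eigenvectors (eigh returns ascending order)
D, V = np.linalg.eigh(CM)
order = np.argsort(D)[::-1]           # largest eigenvalue first
D, V = D[order], V[:, order]

# Step 5/6 (added): project onto the first principal component
Y = XAdjust @ V[:, :1]
print(Y.shape)  # (5, 1)
```

A useful sanity check on this construction: the sample variance of the projected data equals the largest eigenvalue of the covariance matrix.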

Principal Components Analysis

PCA is a useful statistical technique that has found application in fields such as face recognition and image compression, and it is a common technique for finding patterns in data of high dimension. PCA is a way of identifying patterns in data and expressing the data in such a way as to highlight their similarities and differences. Since patterns can be hard to find in data of high dimension, where the luxury of graphical representation is not available, PCA is a powerful tool for analysing data. The other main advantage of PCA is that once you have found these patterns, you can compress the data, i.e. reduce the number of dimensions, without much loss of information. This technique is used in image compression, as we will see in a later section. In this work, I will show PCA in six steps:

1. Get some data
2. Subtract the mean
3. Calculate the covariance matrix
4. Calculate the eigenvectors and eigenvalues of the covariance matrix
5. Choosing components and fo...
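The compression claim above can be made concrete with a small sketch (my own illustration, not from the original tutorial): on data that mostly varies along one direction, keeping a single principal component and reconstructing loses very little information.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data that mostly varies along one direction, plus a little noise
t = rng.standard_normal((100, 1))
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.standard_normal((100, 3))

# PCA via eigen-decomposition of the covariance matrix
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
vecs = vecs[:, np.argsort(vals)[::-1]]   # largest variance first

# Compress to k = 1 dimension, then reconstruct
k = 1
Z = Xc @ vecs[:, :k]                     # compressed representation
X_hat = Z @ vecs[:, :k].T + X.mean(axis=0)

err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.4f}")
```

Storing `Z` and one 3-vector per component instead of the full `X` is the essence of PCA-based compression: the reconstruction error stays small because the discarded components carry only noise.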