Using factor analysis for decomposition

Factor analysis is another technique we can use to reduce dimensionality. However, unlike PCA, factor analysis makes assumptions. The basic assumption is that there are implicit features responsible for the observed features of the dataset.

This recipe will boil down the explicit features from our samples in an attempt to understand the independent variables as much as the dependent variables.

Getting ready

To compare PCA and factor analysis, let's use the iris dataset again, but we'll first need to load the factor analysis class:

from sklearn.decomposition import FactorAnalysis

How to do it...

From a programming perspective, factor analysis isn't much different from PCA:

from sklearn.datasets import load_iris

iris = load_iris()  # the iris data used in the earlier recipes
fa = FactorAnalysis(n_components=2)  # two latent factors, matching the PCA plot
iris_two_dim = fa.fit_transform(iris.data)
iris_two_dim[:5]
array([[-1.33125848, 0.55846779],
       [-1.33914102, -0.00509715],
       [-1.40258715, -0.307983 ],
       [-1.29839497, -0.71854288],
       [-1.33587575, 0.36533259]])

Compare the following plot to the plot in the last section:
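The original figure is not reproduced in this text; a minimal sketch to regenerate a comparable scatter plot, assuming matplotlib is installed and reusing the iris and iris_two_dim objects from above:

import matplotlib.pyplot as plt

# Scatter the two latent factors, colored by species, to mirror the PCA plot
# from the previous recipe.
plt.scatter(iris_two_dim[:, 0], iris_two_dim[:, 1], c=iris.target)
plt.xlabel("Factor 1")
plt.ylabel("Factor 2")
plt.title("Factor analysis of the iris dataset")
plt.show()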

Since factor analysis is a probabilistic transform, we can examine different aspects such as the log likelihood of the observations under the model, and better still, compare the log likelihoods across models.
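For example, the fitted FactorAnalysis estimator exposes score and score_samples for exactly this; a quick sketch reusing the fa object fitted above:

# Average log likelihood of the observations under the fitted model; when two
# models are fit on the same data, the higher score indicates the better fit.
print(fa.score(iris.data))

# Per-observation log likelihoods, useful for spotting poorly explained samples.
print(fa.score_samples(iris.data)[:5])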

Factor analysis is not without flaws. The reason is that you're not fitting a model to predict an outcome, you're fitting a model as a preparation step. This isn't a bad thing per se, but errors here compound when training the actual model.

How it works...

Factor analysis is similar to PCA, which was covered previously. However, there is an important distinction to be made. PCA is a linear transformation of the data to a different space where the first component "explains" the variance of the data, and each subsequent component is orthogonal to the components before it.

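To make that orthogonality concrete, here is a short check, assuming numpy is available and reusing the iris data loaded above, that the first two principal axes found by PCA are perpendicular:

import numpy as np
from sklearn.decomposition import PCA

pca = PCA(n_components=2).fit(iris.data)

# The principal axes are orthogonal, so their dot product is (numerically) zero.
print(np.dot(pca.components_[0], pca.components_[1]))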

For example, you can think of PCA as taking a dataset of N dimensions and going down to some space of M dimensions, where M < N.

Factor analysis, on the other hand, works under the assumption that there are only M important features and a linear combination of these features (plus noise) creates the dataset in N dimensions. To put it another way, you don't do regression on an outcome variable, you do regression on the features to determine the latent factors of the dataset.

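In scikit-learn, those latent factors and the noise term are exposed as attributes of the fitted estimator; a brief look, reusing the fa object fitted above:

# Each row of components_ is one latent factor expressed as a linear
# combination of the original four iris features (the loadings).
print(fa.components_.shape)      # (2, 4)

# The per-feature noise variance estimated on top of the factors.
print(fa.noise_variance_.shape)  # (4,)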

Original link: http://www.packtpub.com

Original author: Trent Hauck
