I have several algorithms: rpart, kNN, logistic regression, randomForest, naive Bayes and support vector machines. I want to use forward/backward and genetic algorithm selection to find the best feature subset for a given algorithm.
How can I implement wrapper-style forward/backward and genetic feature selection in R?
Posted on 2017-11-30 18:05:15
I am testing wrappers myself at the moment, so I can give you a few R package names. (What is a wrapper? In short, a wrapper method scores candidate feature subsets by training and evaluating the target model on each subset.)

The approaches I am using right now are:
MASS package: choose a model by AIC in a stepwise algorithm
stepAIC(model, direction = "both", trace = FALSE)
stepAIC(model, direction = "backward", trace = FALSE)
stepAIC(model, direction = "forward", trace = FALSE)
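For context, here is a minimal sketch of how these calls are typically wired up; the lm models and the built-in mtcars data are illustrative assumptions of mine, not part of the original answer:

library(MASS)

# illustrative linear models on the built-in mtcars data (assumption for this sketch)
full_model <- lm(mpg ~ ., data = mtcars)
null_model <- lm(mpg ~ 1, data = mtcars)

# backward elimination from the full model
back_fit <- stepAIC(full_model, direction = "backward", trace = FALSE)

# forward selection: start from the null model and give stepAIC a scope
# so it knows which terms it is allowed to add
fwd_fit <- stepAIC(null_model, direction = "forward",
                   scope = list(lower = null_model, upper = full_model),
                   trace = FALSE)

# stepwise search in both directions
both_fit <- stepAIC(full_model, direction = "both", trace = FALSE)

formula(back_fit)  # inspect which predictors were kept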
caret package: backwards feature selection (recursive feature elimination)
control <- rfeControl(functions = lmFuncs, method = "repeatedcv", number = 5, verbose = TRUE)
rfe_results <- rfe(x, y, sizes = c(1:10), rfeControl = control)
or supervised feature selection with a genetic algorithm (note that gafs() needs its own control object built with gafsControl(), not the rfe control):
gafs_results <- gafs(x, y, gafsControl = gafsControl(functions = rfGA))
or simulated annealing feature selection (likewise with safsControl()):
safs_results <- safs(x, y, iters = 10, safsControl = safsControl(functions = rfSA))
I hope this gives you a good overview. There are more approaches...
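To make this runnable end to end, here is a minimal sketch on the Sonar data from mlbench; the dataset, the random-forest wrapper functions (rfFuncs, rfGA, rfSA) and the small resampling/iteration settings are my own illustrative choices rather than part of the original answer, and gafs()/safs() can still be slow at these settings.

library(caret)
library(mlbench)
data(Sonar)

x <- Sonar[, -61]   # 60 numeric predictors
y <- Sonar$Class    # two-class factor (M / R)

# backward selection via recursive feature elimination
rfe_ctrl <- rfeControl(functions = rfFuncs, method = "cv", number = 5)
rfe_results <- rfe(x, y, sizes = c(2, 5, 10, 20), rfeControl = rfe_ctrl)
predictors(rfe_results)        # names of the selected features

# genetic-algorithm wrapper
ga_ctrl <- gafsControl(functions = rfGA, method = "cv", number = 3)
gafs_results <- gafs(x, y, iters = 5, gafsControl = ga_ctrl)
gafs_results$optVariables

# simulated-annealing wrapper
sa_ctrl <- safsControl(functions = rfSA, method = "cv", number = 3)
safs_results <- safs(x, y, iters = 5, safsControl = sa_ctrl)
safs_results$optVariables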
Posted on 2016-04-20 22:39:57
The caret package in R is very rich in functionality and makes it easy to switch between the algorithms you mention, for example:
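A minimal sketch of that switching, assuming the Sonar data from mlbench and caret's default tuning grids (these choices are mine, not the answerer's):

library(caret)
library(mlbench)
data(Sonar)

ctrl <- trainControl(method = "cv", number = 5)

# the same train() interface is used for every learner; only the `method` string changes
# ("rf" needs randomForest, "svmRadial" needs kernlab)
fit_rpart <- train(Class ~ ., data = Sonar, method = "rpart", trControl = ctrl)
fit_knn   <- train(Class ~ ., data = Sonar, method = "knn", trControl = ctrl)
fit_rf    <- train(Class ~ ., data = Sonar, method = "rf", trControl = ctrl)
fit_svm   <- train(Class ~ ., data = Sonar, method = "svmRadial", trControl = ctrl)
# logistic regression ("glm") and naive Bayes ("nb", needs klaR) follow the same pattern

# compare resampled accuracy across the fitted models
summary(resamples(list(rpart = fit_rpart, knn = fit_knn, rf = fit_rf, svm = fit_svm)))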
There is also plenty of documentation on their website.
Hope this helps.
Posted on 2017-11-19 10:43:44
Here is some code for forward feature selection.
selectFeature <- function(train, test, cls.train, cls.test, features) {
  ## identify the single feature that, added to `features`, gives the best test accuracy
  current.best.accuracy <- -Inf  # negative infinity
  selected.i <- NULL
  for (i in 1:ncol(train)) {
    current.f <- colnames(train)[i]
    if (!current.f %in% features) {
      model <- knn(train = train[, c(features, current.f)],
                   test = test[, c(features, current.f)],
                   cl = cls.train, k = 3)
      test.acc <- sum(model == cls.test) / length(cls.test)
      if (test.acc > current.best.accuracy) {
        current.best.accuracy <- test.acc
        selected.i <- current.f
      }
    }
  }
  return(selected.i)
}
## prepare the Sonar data (mlbench) and a train/test split
library(class)    # knn()
library(caret)    # createDataPartition()
library(mlbench)
data(Sonar)
set.seed(1)
inTrain <- createDataPartition(Sonar$Class, p = .6)[[1]]
allFeatures <- colnames(Sonar)[-61]
train <- Sonar[ inTrain, -61]
test  <- Sonar[-inTrain, -61]
cls.train <- Sonar$Class[inTrain]
cls.test  <- Sonar$Class[-inTrain]
# use correlation with the class label to determine the first feature
# (the 0/1 vector must line up element-wise with the rows of train)
cls.train.numeric <- as.numeric(cls.train == "M")  # R = 0, M = 1
features <- c()
current.best.cor <- 0
for (i in 1:ncol(train)) {
  if (current.best.cor < abs(cor(train[, i], cls.train.numeric))) {
    current.best.cor <- abs(cor(train[, i], cls.train.numeric))
    features <- colnames(train)[i]
  }
}
print(features)
# select the 2nd to 10th best features using knn as a wrapper classifier
for (j in 2:10) {
  selected.i <- selectFeature(train, test, cls.train, cls.test, features)
  print(selected.i)
  # add the best feature from the current run
  features <- c(features, selected.i)
}
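As a small follow-up (my own addition, not part of the original answer), the final selected subset can be evaluated on the held-out split with the same kNN classifier. Note that because this same test split guided the selection, the accuracy is optimistically biased; a separate validation split would give an honest estimate.

final.pred <- knn(train = train[, features], test = test[, features], cl = cls.train, k = 3)
mean(final.pred == cls.test)  # accuracy of the selected feature subset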
Source: https://stackoverflow.com/questions/36746575