I am trying to fit a generalized linear model (GLM) on a very large dataset (several million rows), but R cannot get through the analysis: I keep hitting memory allocation errors ("cannot allocate vector of size ...").
The data fit in RAM, but they seem to be too large for estimating a complex model. As a workaround, I am exploring the ff package to replace R's in-memory storage mechanism with storage on disk.
I have (I think) successfully offloaded the data to my hard drive, but when I try to estimate the GLM (via the biglm package) I get the following error:
Error: $ operator is invalid for atomic vectors
I do not know why I get this particular error only when using the bigglm function. It never appeared when running glm on the full dataset, although R may simply have run out of memory before the "$ operator is invalid" error could be triggered.
Sample data and code are provided below. Note that standard glm runs fine on this sample data; the problem appears when using biglm.
Please let me know if you have any questions.
Thanks in advance!
#Load required packages
library(readr)
library(ff)
library(ffbase)
library(LaF)
library(biglm)
#Create sample data
df <- data.frame("id" = as.character(1:20), "group" = rep(seq(1:5), 4),
"x1" = as.character(rep(c("a", "b", "c", "d"), 5)),
"x2" = rnorm(20, 50, 1), y = sample(0:1, 20, replace=T),
stringsAsFactors = FALSE)
#Write data to file
write_csv(df, "df.csv")
#Create connection to sample data using laf
con <- laf_open_csv(filename = "df.csv",
                    column_types = c("string", "string", "string",
                                     "double", "string"),
                    column_names = c("id", "group", "x1", "x2", "y"),
                    skip = 1)
#Use LaF to import data into ffdf object
ff <- laf_to_ffdf(laf = con)
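#Optional sanity check (not part of the original post): confirm the
#disk-backed copy has the expected shape and class before modelling
dim(ff)    #20 rows, 5 columns
class(ff)  #"ffdf"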
#Fit glm on data stored in RAM (note this model runs fine)
fit.glm <- glm(y ~ factor(x1) + x2 + factor(group), data=df,
               family="binomial")
#Fit glm on data stored on hard-drive (note this model fails)
fit.big <- bigglm(y ~ factor(x1) + x2 + factor(group), data=ff,
                  family="binomial")

Answered 2019-05-09 10:49:25
You are using the family argument incorrectly: bigglm requires an actual family object such as binomial(link = "logit"), whereas glm also accepts the character string "binomial".
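To see where the error comes from (a minimal illustration of my own, not from the original answer): glm() converts a character string such as "binomial" into a family object internally, but bigglm() uses the argument as-is and accesses its components with $, which fails on an atomic (character) vector.

#A string has no components, so $ fails; a family object works
f <- "binomial"
f$family                          #Error: $ operator is invalid for atomic vectors
binomial(link = "logit")$family   #returns "binomial"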
library(ffbase)
library(biglm)
df <- data.frame("id" = factor(as.character(1:20)), "group" = factor(rep(seq(1:5), 4)),
"x1" = factor(as.character(rep(c("a", "b", "c", "d"), 5))),
"x2" = rnorm(20, 50, 1), y = sample(0:1, 20, replace=T),
stringsAsFactors = FALSE)
d <- as.ffdf(df)
fit.big <- bigglm.ffdf(y ~ x1 + x2, data = d,
                       family = binomial(link = "logit"), chunksize = 3)

Source: https://stackoverflow.com/questions/56049950
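The same fix carries over to the full model from the question: keep all the terms but pass a family object instead of a string. Note also that the laf_open_csv() call in the question imports y with column type "string"; for a binomial model the response should be read as an integer/double (or converted to numeric) before fitting. An untested sketch of my own, reusing the ffdf d built above:

#Hedged sketch (not from the original answer): full formula on the
#disk-backed data, with factor columns prepared as in the answer
fit.big <- bigglm.ffdf(y ~ x1 + x2 + group, data = d,
                       family = binomial(link = "logit"), chunksize = 3)
summary(fit.big)  #coefficients, standard errors, p-values
coef(fit.big)     #coefficient vector only

chunksize sets how many rows biglm processes per pass; on a real multi-million-row dataset a much larger value (e.g. 10000) reduces the number of passes over the disk-backed data at the cost of more memory per chunk.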