
Does groupBy in Scala Spark require LzoCodec?

Stack Overflow user
Asked 2018-01-31 09:09:47
1 answer · 288 views · 0 followers · 0 votes

I wrote a function in Scala Spark that looks like this:

def prepareSequences(data: RDD[String], splitChar: Char = '\t') = {
  val x = data.map { line =>
    val Array(id, se, offset, hour) = line.split(splitChar)
    (id + "-" + se,
     Step(offset = if (offset == "NULL") -5 else offset.toInt,
          hour = hour.toInt))
  }

  val y = x.groupBy(_._1)
}
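
For context, Step is not shown in the question. Assuming it is a plain case class, a minimal definition that would make the snippet compile might look like this:

// Hypothetical definition of Step, inferred from its usage above;
// the real class may carry more fields.
case class Step(offset: Int, hour: Int)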

I need the groupBy, but as soon as I add it I get an error asking for LzoCodec:

        Exception in thread "main" java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.spark.rdd.HadoopRDD.getInputFormat(HadoopRDD.scala:188)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:201)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
    at org.apache.spark.Partitioner$$anonfun$defaultPartitioner$2.apply(Partitioner.scala:66)
    at org.apache.spark.Partitioner$$anonfun$defaultPartitioner$2.apply(Partitioner.scala:66)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.immutable.List.map(List.scala:285)
    at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:66)
    at org.apache.spark.rdd.RDD$$anonfun$groupBy$1.apply(RDD.scala:687)
    at org.apache.spark.rdd.RDD$$anonfun$groupBy$1.apply(RDD.scala:687)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
    at org.apache.spark.rdd.RDD.groupBy(RDD.scala:686)
    at com.savagebeast.mypackage.DataPreprocessing$.prepareSequences(DataPreprocessing.scala:42)
    at com.savagebeast.mypackage.activity_mapper$.main(activity_mapper.scala:31)
    at com.savagebeast.mypackage.activity_mapper.main(activity_mapper.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
    ... 44 more
    Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:139)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:180)
    at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
    ... 49 more
    Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:132)
    ... 51 more

I installed LZO and the other required pieces by following "Class com.hadoop.compression.lzo.LzoCodec not found for Spark on CDH 5?".

Am I missing something?

Update: found a solution.

Partitioning the RDD like this solved the problem:

val y = x.groupByKey(50)

Here 50 is the number of partitions I want for the RDD; it can be any number.
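
Assuming the standard PairRDDFunctions API, passing a partition count this way is just shorthand for supplying a HashPartitioner explicitly, so the grouping itself is unchanged:

import org.apache.spark.HashPartitioner

// Equivalent to x.groupByKey(50): supply the HashPartitioner that
// groupByKey(numPartitions) constructs internally.
val y = x.groupByKey(new HashPartitioner(50))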

However, I am not sure why this works. If anyone can explain it, I would appreciate it.

Update 2: So far, the following works in a saner way and is stable.

I copied hadoop-lzo-0.4.21-SNAPSHOT.jar from /Users/<username>/hadoop-lzo/target to /usr/local/Cellar/apache-spark/2.1.0/libexec/jars. Essentially, this puts the jar on Spark's classpath.
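
To confirm the jar is actually visible to the driver, a quick sanity check can be run from spark-shell (the class name is taken verbatim from the stack trace above):

// Throws ClassNotFoundException if hadoop-lzo is still missing from the
// driver classpath; returns the Class object once the jar is picked up.
Class.forName("com.hadoop.compression.lzo.LzoCodec")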


1 answer

Stack Overflow user

Answered 2018-01-31 09:19:15

No, it is not required by groupBy. If you look at the stack trace (kudos for posting it), you will see that it fails somewhere inside the input format:

    at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)

This suggests that your input is compressed. It fails when you call groupBy because that is the point where Spark has to decide on the number of partitions and therefore touch the input.

In practice, though: yes, it seems you do need the LZO codec to run your job.
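
A sketch of why groupBy is merely the trigger, reasoning from the stack trace rather than the Spark source: the default partitioner sizes itself from rdd.partitions, and computing those partitions configures Hadoop's TextInputFormat, which instantiates every codec listed in the Hadoop configuration, LzoCodec included. Asking for the partitions of any text input (the path below is a placeholder) should therefore reproduce the error with no groupBy at all:

// Hypothetical reproduction: no shuffle involved, yet the same
// ClassNotFoundException should surface, because evaluating partitions
// alone configures the input format and its compression codecs.
val data = sc.textFile("hdfs:///some/input/path") // placeholder path
data.partitions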

1 vote
Original link: https://stackoverflow.com/questions/48532951